Frequently asked questions: preferred AI providers

A document covering common questions around using your own AI provider in Tines.

Written by Hannah Roy
Updated yesterday

Getting started

Before bringing your preferred AI provider into Tines, you should first understand your organization’s AI infrastructure.

Consider the following questions:

  • What AI models does your company support?

  • Are the models schema-compatible (input and output) with either Anthropic or OpenAI?

    • A model may not be schema-compatible if your company wrote its own API or hosts a model other than Claude, OpenAI, or Ollama.

  • Is your Tines tenant on the cloud or self-hosted?

Understanding the AI providers

What AI providers are available to deploy?

  • Note: While we support several providers directly, we’re also able to support any OpenAI- or Anthropic-spec-compatible provider or observability tool. Refer to the AI Admin docs for the most current list; as with all features, this list will continue to evolve.

  • Note: Each model running locally must expose an OpenAI-compatible API so that Tines can connect to it. Systems running on Ollama already provide this.
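To make “OpenAI-compatible” concrete, the sketch below builds a request body in the OpenAI chat-completions schema, which Ollama serves locally. The endpoint URL and model name are illustrative assumptions, not Tines requirements.

```python
import json

# Ollama exposes an OpenAI-compatible endpoint; the host, port, and
# model name below are illustrative assumptions.
OLLAMA_URL = "http://localhost:11434/v1/chat/completions"

def build_chat_request(model: str, prompt: str) -> dict:
    """Build a request body following the OpenAI chat-completions schema."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

body = build_chat_request("llama3", "Summarize this alert in one sentence.")
print(json.dumps(body))
```

Any locally hosted model that accepts this request shape (and returns the matching response shape) is schema-compatible in the sense described above.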

Is there a specific LLM a Tines builder should consider?

No - the right LLM for one business may be different from the right LLM for another. Customers can choose to use the one their organization already has in place.

Does using a different AI provider affect the security of using AI in Tines?

No. As is already the case at Tines, customers engage with third-party applications at their own risk. Regardless of which LLM is used, Tines handles customer data under the same security and privacy rules.

Does using a preferred AI provider bring in any extra costs?

No extra costs are incurred via Tines; however, your AI provider’s own usage-based pricing will apply.

Does using a preferred AI provider affect how the different AI in Tines features are used?

Different LLMs excel at different tasks. For AI features in Tines, we recommend using a model comparable to the ones Tines supports to get high-quality AI outputs. Tenant admins can switch to another available LLM or revert to the Tines default at any point.

How can tenant admins get set up with their own AI model?

  • Have the API key for your Anthropic or OpenAI provider handy.

  • Navigate to AI settings → Providers → Custom provider, then choose the provider and enter the API key.

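Before entering the key in Tines, it can help to sanity-check that it is sent with the right authentication header. A minimal sketch, assuming an OpenAI-style Bearer token (the header convention belongs to the provider, not Tines; Anthropic uses an `x-api-key` header instead):

```python
import urllib.request

def build_auth_request(base_url: str, api_key: str) -> urllib.request.Request:
    """Prepare (but do not send) an authenticated request to the provider's
    model-listing endpoint, using the standard Bearer authorization header."""
    return urllib.request.Request(
        f"{base_url}/v1/models",
        headers={"Authorization": f"Bearer {api_key}"},
    )

# "sk-example" is a placeholder; substitute your real key when testing.
req = build_auth_request("https://api.openai.com", "sk-example")
print(req.get_header("Authorization"))
```

Sending this request with a valid key should return the provider’s model list; an authentication error suggests the key should be rotated or re-copied before configuring it in Tines.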

Who can builders reach out to with questions or troubleshooting?

  • Tines builders should use the Tines in-product chat or reach out to their account teams with general questions.

  • For troubleshooting, builders can start with our Support team; however, they should note that an issue may lie with their AI provider rather than Tines, in which case they should reach out to that vendor.

  • For builders with self-hosted tenants, we recommend working with Professional Services for effective setup and understanding.

Bookmark our AI Admin docs and Use a preferred AI provider in Tines for all available resources on using a preferred AI provider.
