AI in Tines is available for all tenants, and can be configured by a tenant owner.
Learn more about AI in Tines.
Enable AI features
Control whether AI features are enabled across the whole tenant. AI features can be turned on or off via the AI settings in the settings center.
By default, AI is turned on for newly created tenants.
AI providers
By default, AI features are powered by Anthropic's Claude, hosted securely through AWS Bedrock.
It is possible to bring your own AI provider to power all of Tines' AI features. AI providers, observability tools, and custom proxies that are schema compatible with a supported provider can also be configured. This includes, but is not limited to:
OpenAI / Anthropic behind a proxy or service
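As an illustration of what "schema compatible" means in practice, the sketch below builds a request body in the OpenAI chat-completions shape. The proxy URL and model name are placeholders, not real Tines configuration values; any proxy that accepts this request shape at its chat-completions endpoint can sit in front of these features.

```python
import json

# Assumption: a hypothetical OpenAI-compatible proxy. Replace with your own.
PROXY_BASE_URL = "https://llm-proxy.example.com/v1"

def chat_completion_request(model: str, prompt: str) -> dict:
    """Build a request body in the OpenAI chat-completions schema."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

body = chat_completion_request("gpt-4o", "Summarize this alert.")
# A schema-compatible proxy must accept this body at POST {base}/chat/completions.
print(json.dumps(body))
```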
AI models
When configuring OpenAI-compatible APIs, it is possible to select which models are usable within Tines. Once configured, the fast model powers features such as automatic transform, while the smart model powers Workbench and other similar features.
All custom models enabled on an AI provider are also accessible in the AI action.
Custom AWS Bedrock support for cloud
With custom AWS Bedrock support for cloud, you can choose any models that are enabled in your AWS region and permitted by your Bedrock account.
Recommended models to enable by default
Anthropic Claude 4 Sonnet
Anthropic Claude 3.7 Sonnet
Anthropic Claude 3.5 Haiku
Anthropic Claude 3 Haiku
Model IDs can be obtained from the AWS Bedrock documentation.
Credentials for AWS Bedrock
To securely invoke AWS Bedrock APIs, Tines supports AWS authentication using assumed roles.
Required IAM permissions
To use Bedrock within Tines, the IAM role must include the following permissions:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BedrockModelAccessPermissions",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel*",
        "bedrock:GetInferenceProfile",
        "bedrock:ListInferenceProfiles"
      ],
      "Resource": "*"
    }
  ]
}
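As a quick sanity check before attaching the role, the illustrative sketch below parses the policy document above and confirms it grants each Bedrock action Tines needs. This is a local check of the policy JSON only, not a substitute for testing the role against AWS itself.

```python
import json

# The three Bedrock actions required by Tines, per the policy above.
REQUIRED_ACTIONS = {
    "bedrock:InvokeModel*",
    "bedrock:GetInferenceProfile",
    "bedrock:ListInferenceProfiles",
}

policy = json.loads("""
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "BedrockModelAccessPermissions",
      "Effect": "Allow",
      "Action": [
        "bedrock:InvokeModel*",
        "bedrock:GetInferenceProfile",
        "bedrock:ListInferenceProfiles"
      ],
      "Resource": "*"
    }
  ]
}
""")

# Collect every action granted by an Allow statement.
granted = {
    action
    for stmt in policy["Statement"]
    if stmt["Effect"] == "Allow"
    for action in stmt["Action"]
}

missing = REQUIRED_ACTIONS - granted
print("missing:", sorted(missing))  # prints "missing: []" if sufficient
```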
AWS Bedrock endpoints
When configuring your connection to AWS Bedrock, you’ll need to specify the correct Amazon Bedrock runtime API endpoint for your AWS region.
It will look something like: bedrock-runtime.<region>.amazonaws.com
View the list of region-specific Bedrock runtime endpoints.
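Since the endpoint follows the fixed pattern above, a small helper can derive it from the region name. This is a convenience sketch based on the pattern shown; always confirm the endpoint against the AWS documentation, as some regions may differ.

```python
def bedrock_runtime_endpoint(region: str) -> str:
    """Return the Amazon Bedrock runtime endpoint hostname for a region,
    following the bedrock-runtime.<region>.amazonaws.com pattern."""
    return f"bedrock-runtime.{region}.amazonaws.com"

print(bedrock_runtime_endpoint("us-east-1"))
# bedrock-runtime.us-east-1.amazonaws.com
```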

Azure OpenAI
When deploying OpenAI models through the Azure AI Foundry, additional configuration is required to enable use of those models in Tines. Each model is deployed to a unique URL and must be manually added to the model list.
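To illustrate why each model needs its own entry, the sketch below builds the per-deployment URL in the shape Azure OpenAI commonly uses (resource name, deployment name, and API version are placeholders; copy the actual values shown for your deployment in Azure AI Foundry).

```python
def azure_deployment_url(resource: str, deployment: str, api_version: str) -> str:
    """Build a per-deployment chat-completions URL for Azure OpenAI.
    All three parameters are placeholders; use your own deployment's values."""
    return (
        f"https://{resource}.openai.azure.com/openai/deployments/"
        f"{deployment}/chat/completions?api-version={api_version}"
    )

# Each deployed model gets a distinct URL like this, which is why each
# must be added to the model list individually.
print(azure_deployment_url("my-resource", "gpt-4o", "2024-06-01"))
```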
Allocation
Allocate your AI credits to different teams in your tenant. We will provide you with warnings when you’re nearing your credit limits.
Should you need to increase your AI credits, contact your Tines account team.
