Automation Cloud Dedicated admin guide
- Licensing: Flex Advanced Platform, Flex Standard Platform.
The LLM configurations tab allows you to integrate your existing AI subscriptions while maintaining the governance framework provided by UiPath. You can:
- Replace UiPath LLM subscription: Replace UiPath-managed subscriptions with your own, provided they match the same model family and version already supported by the UiPath product. This allows for seamless swapping of UiPath-managed models with your subscribed models.
- Add your own LLM: Use any LLM that meets the product's compatibility criteria. To ensure smooth integration, your chosen LLM must pass a series of tests initiated through a probe call before it can be used within the UiPath ecosystem.
Configuring LLMs preserves most of the governance benefits of the AI Trust Layer, including policy enforcement via Automation Ops and detailed audit logs. However, model governance policies are specifically designed for UiPath-managed LLMs. This means that if you disable a particular model through an AI Trust Layer policy, the restriction only applies to the UiPath-managed version of that model. Your own configured models of the same type remain unaffected.
When leveraging the option to use your own LLM or subscription, keep the following points in mind:
- Compatibility requirements: Your chosen LLM or subscription must align with the model family and version currently supported by the UiPath product.
- Setup: Make sure you properly configure and maintain all required LLMs in the custom setup. If any component is missing, outdated, or incorrectly configured, your custom setup may cease to function. In such cases, the system automatically reverts to a UiPath-managed LLM to ensure continuity of service, unless UiPath LLMs are turned off through an Automation Ops policy.
- Cost-saving: If your custom LLM setup is complete, correct, and meets all necessary requirements, you may be eligible for a Reduced Consumption Rate.
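The family-and-version rule above can be sketched in a few lines. This is an illustrative assumption, not UiPath's actual validation logic: the sample entries are taken from the supported-models table later on this page, and the exact-match check simply encodes the requirement that a replacement model align with a supported family and version.

```python
# Illustrative sketch only: UiPath's real compatibility check is not public.
# Sample entries transcribed from the supported-models table on this page.
SUPPORTED_MODELS = {
    # family -> exact versions supported by the product
    "gpt-4o": ["gpt-4o-2024-05-13", "gpt-4o-2024-08-06", "gpt-4o-2024-11-20"],
    "gemini-2.0-flash": ["gemini-2.0-flash-001"],
}

def can_replace(custom_model: str) -> bool:
    """A custom model may replace the UiPath-managed one only if it matches
    a supported family *and* an exact supported version."""
    return any(custom_model in versions for versions in SUPPORTED_MODELS.values())
```

Under this sketch, `can_replace("gpt-4o-2024-08-06")` passes, while a model outside the listed families and versions, such as `"gpt-4-turbo"`, is rejected.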
LLM connections rely on Integration Service to establish the connection to your own models. You can create connections to the following providers:
- Azure OpenAI
- OpenAI
- Amazon Bedrock
- Google Vertex
- OpenAI V1 Compliant LLM – Use this option to connect to any LLM provider whose API follows the OpenAI V1 standard. For details, refer to the OpenAI V1 Compliant LLM connector documentation.
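The probe call mentioned earlier can be pictured as a minimal chat-completion request in the OpenAI V1 shape, which is what an OpenAI V1 Compliant provider must accept. This is a hedged sketch, not UiPath's actual probe: the endpoint path and payload follow the public OpenAI API, and `llm.example.com`, the API key, and the timeout are placeholders.

```python
import json
import urllib.request

def build_probe_request(base_url: str, api_key: str, model: str) -> urllib.request.Request:
    """Build a minimal chat-completion call in the OpenAI V1 shape.
    Any provider exposing this API surface should accept it."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": "ping"}],
        "max_tokens": 1,
    }
    return urllib.request.Request(
        url=f"{base_url.rstrip('/')}/v1/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def probe(base_url: str, api_key: str, model: str, timeout: float = 10.0) -> bool:
    """Return True if the endpoint answers the probe with HTTP 200."""
    req = build_probe_request(base_url, api_key, model)
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status == 200
    except OSError:
        return False
```

A provider that rejects this request shape would not qualify as OpenAI V1 compliant, regardless of how capable its models are.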
To set up a new connection, follow these steps:
You can perform the following actions on your existing connections:
- Check status – Verify the status of your Integration Service connection. This action ensures that the connection is active and functioning correctly.
- Edit – Modify any parameters of your existing connection.
- Disable – Temporarily suspend the connection. When disabled, the connection remains visible in your list but doesn't route any calls. You can re-enable the connection when needed.
- Delete – Permanently remove the connection from your system. This action disables the connection and removes it from your list.
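The Disable/Delete semantics above can be modeled as a small state machine: a disabled connection stays listed but routes nothing, while deletion is permanent. This is purely illustrative and is not how Integration Service implements connections.

```python
from enum import Enum

class ConnState(Enum):
    ACTIVE = "active"      # routes LLM calls
    DISABLED = "disabled"  # still listed, routes nothing, can be re-enabled
    DELETED = "deleted"    # removed permanently

class Connection:
    """Illustrative model of the connection lifecycle described above."""

    def __init__(self, name: str):
        self.name = name
        self.state = ConnState.ACTIVE

    def routes_calls(self) -> bool:
        # Only an active connection routes LLM calls.
        return self.state is ConnState.ACTIVE

    def disable(self) -> None:
        self._require_not_deleted()
        self.state = ConnState.DISABLED

    def enable(self) -> None:
        self._require_not_deleted()
        self.state = ConnState.ACTIVE

    def delete(self) -> None:
        self.state = ConnState.DELETED

    def _require_not_deleted(self) -> None:
        if self.state is ConnState.DELETED:
            raise ValueError("deleted connections cannot be changed")
```

The key design point the sketch captures is that Disable is reversible while Delete is terminal: once deleted, a connection cannot be re-enabled.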
Each product supports specific large language models (LLMs) and versions. Use the table below to identify the supported models and versions for your product.
You can connect your own LLM using one of the providers listed in the previous section: Azure OpenAI, OpenAI, Amazon Bedrock, Google Vertex, or an OpenAI V1 Compliant endpoint. Follow the steps outlined in the previous section to create a connection.
| Product | Feature | LLM provider | Version |
|---|---|---|---|
| Autopilot for everyone | Chat | Anthropic | anthropic.claude-3.5-sonnet-20240620-v1:0<br>anthropic.claude-3.7-sonnet-20250219-v1:0 |
| | | OpenAI | gpt-4o-mini-2024-07-18 |
| Coded agents | Call LLM | Anthropic | anthropic.claude-3.5-sonnet-20240620-v1:0<br>anthropic.claude-3.5-sonnet-20241022-v2:0<br>anthropic.claude-3.7-sonnet-20250219-v1:0<br>anthropic.claude-3-haiku-20240307-v1:0 |
| | | Gemini | gemini-1.5-pro-001<br>gemini-2.0-flash-001 |
| | | OpenAI | gpt-4o-2024-05-13<br>gpt-4o-2024-08-06<br>gpt-4o-2024-11-20<br>gpt-4o-mini-2024-07-18<br>o3-mini-2025-01-31 |
| Context Grounding | Advanced Extractions | Gemini | gemini-2.5-flash |
| | Embeddings | OpenAI | text-embedding-3-large |
| | | Gemini | gemini-embedding-001 |
| GenAI Activities | Build, Test & Deploy | Anthropic | anthropic.claude-3.5-sonnet-20241022-v2:0<br>anthropic.claude-3.7-sonnet-20250219-v1:0 |
| | | Gemini | gemini-2.5-pro<br>gemini-2.5-flash |
| | | OpenAI | gpt-5-2025-08-07<br>gpt-5-mini-2025-08-07<br>gpt-5-nano-2025-08-07 |