Private Test Cloud admin guide
How-to guide: Connecting Mistral through AI Trust Layer
This guide explains how to connect a Mistral model deployed in Microsoft Azure AI Foundry to UiPath Agents through the AI Trust Layer LLM Configurations feature. You create a custom Integration Service connector based on an Azure OpenAI template and adapt its request hook for Mistral's API requirements.
Prerequisites
- BYO AI Gateway enabled in your cluster. For details, see Configuring AI Trust Layer.
- A Mistral model deployed in Azure AI Foundry (for example, mistral-small-2503)
- The Azure AI Foundry endpoint URL for your resource
- An API key for your Azure AI Foundry resource
- Organization administrator access in Automation Suite
- Access to Integration Service and Connector Builder
Create the custom connector
- Navigate to Admin > AI Trust Layer > LLM configurations and select Add configuration.
- Set the Tenant, Product, and Feature values.
- Under Model Configuration, enter a custom alias in the LLM Name field and set API Type to OpenAI.
- In the Connector field, select Create custom connector.
- Select the Azure OpenAI template, then select Create connector. Connector Builder opens with the Azure OpenAI template pre-populated.
Edit the connector for Mistral compatibility
Mistral models on Azure AI Foundry use strict schema validation and do not accept all fields that the Azure OpenAI template sends by default. The connector's preRequest hook must resolve these differences before each request reaches the Mistral endpoint.
| Azure OpenAI template sends | Mistral on Azure AI Foundry expects | Resolution applied by hook |
|---|---|---|
| tool_choice: "required" or object form | "none", "auto", or "any" | Translated to "any" when tools are present |
| parallel_tool_calls field | Field not supported (extra_forbidden) | Removed from the request body |
| max_completion_tokens | max_tokens | Renamed to max_tokens |
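To illustrate the table above, the following simplified sketch applies the same three normalizations to a hypothetical request body. The function name and the sample payload are illustrative only; the actual transformation is performed by the preRequest hook installed in a later step.

```javascript
// Simplified sketch of the three normalizations from the table above,
// applied to a hypothetical Azure OpenAI-style request body.
function normalizeForMistral(body) {
  const out = { ...body };
  // parallel_tool_calls is rejected by Mistral's strict schema (extra_forbidden)
  delete out.parallel_tool_calls;
  // Mistral accepts only 'none', 'auto', or 'any'; 'required' (or the object
  // form) maps to 'any'
  if (out.tool_choice === 'required' ||
      (out.tool_choice && typeof out.tool_choice === 'object')) {
    out.tool_choice = 'any';
  }
  // Azure OpenAI's max_completion_tokens becomes Mistral's max_tokens
  if ('max_completion_tokens' in out) {
    out.max_tokens = out.max_completion_tokens;
    delete out.max_completion_tokens;
  }
  return out;
}

const before = {
  model: 'mistral-small-2503',
  messages: [{ role: 'user', content: 'Hello' }],
  tools: [{ type: 'function', function: { name: 'lookup' } }],
  tool_choice: 'required',
  parallel_tool_calls: true,
  max_completion_tokens: 256,
};
console.log(normalizeForMistral(before));
// tool_choice is 'any', parallel_tool_calls is gone, max_tokens is 256
```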
- In Connector Builder, open the Hooks section and select the preRequest hook.
- Replace the entire hook body with the following script:

```javascript
// Normalize query params and payload for Mistral on Azure AI Foundry.
// Removes unsupported fields and adapts tool semantics for Mistral's strict schema.
const _reqPath = (typeof request_path !== 'undefined') ? request_path : '';
const _reqParams = (typeof request_parameters !== 'undefined') ? request_parameters : undefined;
const _cfg = (typeof configuration !== 'undefined') ? configuration : undefined;

if (['/query', '/v1/responses'].includes(_reqPath)) {
  return done();
}

let apiVersion = (_cfg && _cfg['api-version']) ? _cfg['api-version'] : "2023-05-15";
if (_reqParams && _reqParams["api-version"]) {
  apiVersion = _reqParams["api-version"];
}
if (['/listAllModels', '/auth_validation'].includes(_reqPath)) {
  apiVersion = (_cfg && _cfg['api-version']) ? _cfg['api-version'] : "2023-10-01-preview";
}

// Resolve body across different runtime variable names
let body = (typeof request_body !== 'undefined' && request_body)
  ? request_body
  : ((typeof request_vendor_body !== 'undefined' && request_vendor_body)
    ? request_vendor_body
    : ((typeof request !== 'undefined' && request && request.body) ? request.body : undefined));
if (body && typeof body === 'string') {
  try { body = JSON.parse(body); } catch (e) { /* leave as-is */ }
}

if (body && typeof body === 'object') {
  // Remove field rejected by Mistral's strict schema
  if (Object.prototype.hasOwnProperty.call(body, 'parallel_tool_calls')) {
    delete body.parallel_tool_calls;
  }

  // Normalize tool_choice: Mistral accepts only 'none', 'auto', or 'any'
  const _allowedToolChoice = new Set(['none', 'auto', 'any']);
  const _hasTools = Object.prototype.hasOwnProperty.call(body, 'tools')
    && Array.isArray(body.tools) && body.tools.length > 0;
  if (Object.prototype.hasOwnProperty.call(body, 'tool_choice')) {
    if (body.tool_choice === 'required') {
      body.tool_choice = 'any';
    } else if (body.tool_choice && typeof body.tool_choice === 'object') {
      body.tool_choice = 'any';
    } else if (typeof body.tool_choice === 'string' && !_allowedToolChoice.has(body.tool_choice)) {
      body.tool_choice = _hasTools ? 'any' : 'auto';
    }
  } else if (_hasTools) {
    // Force tool usage: agent runtime expects tool calls when tools are configured
    body.tool_choice = 'any';
  }

  // Rename max_completion_tokens to max_tokens
  if (Object.prototype.hasOwnProperty.call(body, 'max_completion_tokens')) {
    if (!Object.prototype.hasOwnProperty.call(body, 'max_tokens')) {
      body.max_tokens = body.max_completion_tokens;
    }
    delete body.max_completion_tokens;
  }
}

const out = { request_vendor_parameters: { "api-version": apiVersion } };
if (typeof request_body !== 'undefined') out.request_body = body;
if (typeof request_vendor_body !== 'undefined') out.request_vendor_body = body;
return done(out);
```

- In the connector settings, set Base URL to your Azure AI Foundry endpoint: https://{your-resource-name}.openai.azure.com/openai
- Set the Authentication Type to match your setup. This example uses API Key (customApiKey), but you can use any supported authentication type, including OAuth; update the connector settings accordingly.
- Select Save, then publish the connector.
Create a connection in Integration Service
- In Integration Service, navigate to Connections and select Add connection.
- Select the custom connector you published.
- In the API key field, enter your Azure AI Foundry API key.
- In the Azure OpenAI resource field, enter your resource name: the subdomain portion of your endpoint URL, without https:// or .openai.azure.com.
- Select Connect to provision the connection.
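The resource name is only the subdomain of the endpoint URL. As a quick sanity check, this hypothetical helper (not part of the connector) extracts it from a sample endpoint; the sample resource name my-foundry is an assumption for illustration.

```javascript
// Hypothetical helper: extract the "Azure OpenAI resource" value from a
// full Azure AI Foundry endpoint URL (subdomain only, no scheme or suffix).
function resourceNameFromEndpoint(endpoint) {
  const host = new URL(endpoint).hostname;          // e.g. my-foundry.openai.azure.com
  return host.replace(/\.openai\.azure\.com$/, ''); // keep only the subdomain
}

console.log(resourceNameFromEndpoint('https://my-foundry.openai.azure.com/openai'));
// my-foundry
```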
Complete the LLM configuration
- Return to Admin > AI Trust Layer > LLM configurations and open the configuration you started.
- Under Model Configuration, set Connector to your published connector and Connection to the connection you created.
- In the LLM identifier field, enter the deployment name exactly as it appears in Azure AI Foundry.
Note:
Trailing spaces in the LLM identifier field cause a DeploymentNotFound error. Verify there are no leading or trailing spaces before saving.
- Select Test configuration to run the AI Trust Layer probe.
- If the probe passes, select Save.
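Because stray whitespace in the LLM identifier triggers DeploymentNotFound, it can help to check the value before pasting it into the field. The following is a minimal sketch of such a check; the function name is hypothetical and not part of any UiPath API.

```javascript
// Hypothetical pre-save check: leading or trailing whitespace in the LLM
// identifier causes DeploymentNotFound, so flag it instead of silently trimming.
function validateLlmIdentifier(id) {
  if (id !== id.trim()) {
    throw new Error('LLM identifier has leading or trailing whitespace');
  }
  return id;
}

console.log(validateLlmIdentifier('mistral-small-2503')); // passes unchanged
```

Rejecting rather than silently trimming makes the mistake visible at configuration time instead of surfacing later as a probe failure.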
Result
The configuration is saved and the Mistral model is available to UiPath Agents for the product and feature you specified. Calls route through the AI Trust Layer and appear in the audit log under Source: Custom connection.
If you encounter issues while creating a custom connector, contact UiPath Support for assistance.