UiPath Documentation

Private Test Cloud admin guide

Last updated May 14, 2026

How-to guide: Connecting Mistral through AI Trust Layer

This guide explains how to connect a Mistral model deployed in Microsoft Azure AI Foundry to UiPath Agents through the AI Trust Layer LLM Configurations feature. You create a custom Integration Service connector based on an Azure OpenAI template and adapt its request hook for Mistral's API requirements.

Prerequisites

  • BYO AI Gateway enabled in your cluster. For details, see Configuring AI Trust Layer.
  • A Mistral model deployed in Azure AI Foundry (for example, mistral-small-2503)
  • The Azure AI Foundry endpoint URL for your resource
  • An API key for your Azure AI Foundry resource
  • Organization administrator access in Automation Suite
  • Access to Integration Service and Connector Builder

Create the custom connector

  1. Navigate to Admin > AI Trust Layer > LLM configurations and select Add configuration.
  2. Set the Tenant, Product, and Feature values.
  3. Under Model Configuration, enter a custom alias in the LLM Name field and set API Type to OpenAI.
  4. In the Connector field, select Create custom connector.
  5. Select the Azure OpenAI template, then select Create connector. Connector Builder opens with the Azure OpenAI template pre-populated.

Edit the connector for Mistral compatibility

Mistral models on Azure AI Foundry use strict schema validation and do not accept all fields that the Azure OpenAI template sends by default. The connector's preRequest hook must resolve these differences before each request reaches the Mistral endpoint.

  Azure OpenAI template sends            | Mistral on Azure AI Foundry expects   | Resolution applied by hook
  tool_choice: "required" or object form | "none", "auto", or "any"              | Translated to "any" when tools are present
  parallel_tool_calls field              | Field not supported (extra_forbidden) | Removed from the request body
  max_completion_tokens                  | max_tokens                            | Renamed to max_tokens

  1. In Connector Builder, open the Hooks section and select the preRequest hook.

  2. Replace the entire hook body with the following script:

    // Normalize query params and payload for Mistral on Azure AI Foundry.
    // Removes unsupported fields and adapts tool semantics for Mistral's strict schema.
    const _reqPath = (typeof request_path !== 'undefined') ? request_path : '';
    const _reqParams = (typeof request_parameters !== 'undefined') ? request_parameters : undefined;
    const _cfg = (typeof configuration !== 'undefined') ? configuration : undefined;
    
    if (['/query', '/v1/responses'].includes(_reqPath)) {
        return done();
    }
    
    let apiVersion = (_cfg && _cfg['api-version']) ? _cfg['api-version'] : "2023-05-15";
    if (_reqParams && _reqParams["api-version"]) {
        apiVersion = _reqParams["api-version"];
    }
    if (['/listAllModels', '/auth_validation'].includes(_reqPath)) {
        apiVersion = (_cfg && _cfg['api-version']) ? _cfg['api-version'] : "2023-10-01-preview";
    }
    
    // Resolve body across different runtime variable names
    let body = (typeof request_body !== 'undefined' && request_body) ? request_body :
               ((typeof request_vendor_body !== 'undefined' && request_vendor_body) ? request_vendor_body :
                ((typeof request !== 'undefined' && request && request.body) ? request.body : undefined));
    
    if (body && typeof body === 'string') {
        try { body = JSON.parse(body); } catch (e) { /* leave as-is */ }
    }
    
    if (body && typeof body === 'object') {
        // Remove field rejected by Mistral's strict schema
        if (Object.prototype.hasOwnProperty.call(body, 'parallel_tool_calls')) {
            delete body.parallel_tool_calls;
        }
    
        // Normalize tool_choice: Mistral accepts only 'none', 'auto', or 'any'
        const _allowedToolChoice = new Set(['none', 'auto', 'any']);
        const _hasTools = Object.prototype.hasOwnProperty.call(body, 'tools')
            && Array.isArray(body.tools) && body.tools.length > 0;
    
        if (Object.prototype.hasOwnProperty.call(body, 'tool_choice')) {
            if (body.tool_choice === 'required') {
                body.tool_choice = 'any';
            } else if (body.tool_choice && typeof body.tool_choice === 'object') {
                body.tool_choice = 'any';
            } else if (typeof body.tool_choice === 'string' && !_allowedToolChoice.has(body.tool_choice)) {
                body.tool_choice = _hasTools ? 'any' : 'auto';
            }
        } else if (_hasTools) {
            // Force tool usage: agent runtime expects tool calls when tools are configured
            body.tool_choice = 'any';
        }
    
        // Rename max_completion_tokens to max_tokens
        if (Object.prototype.hasOwnProperty.call(body, 'max_completion_tokens')) {
            if (!Object.prototype.hasOwnProperty.call(body, 'max_tokens')) {
                body.max_tokens = body.max_completion_tokens;
            }
            delete body.max_completion_tokens;
        }
    }
    
    const out = {
        request_vendor_parameters: { "api-version": apiVersion }
    };
    
    if (typeof request_body !== 'undefined') out.request_body = body;
    if (typeof request_vendor_body !== 'undefined') out.request_vendor_body = body;
    
    return done(out);
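The three normalization rules the hook applies can be exercised in isolation. The snippet below is a minimal sketch, not the hook runtime: it applies the same transformations to a sample payload shaped like what the Azure OpenAI template sends (the payload contents are illustrative only).

```javascript
// Minimal standalone sketch of the hook's three normalization rules.
// The payload below is illustrative, not a real template request.
const body = {
    model: "mistral-small-2503",
    messages: [{ role: "user", content: "ping" }],
    tools: [{ type: "function", function: { name: "lookup", parameters: {} } }],
    tool_choice: "required",          // Mistral rejects "required"
    parallel_tool_calls: false,       // Mistral rejects this field outright
    max_completion_tokens: 256        // Mistral expects max_tokens
};

// Rule 1: drop the field Mistral's strict schema forbids.
delete body.parallel_tool_calls;

// Rule 2: translate "required" (and object forms) to "any".
if (body.tool_choice === "required" || typeof body.tool_choice === "object") {
    body.tool_choice = "any";
}

// Rule 3: rename max_completion_tokens to max_tokens.
if ("max_completion_tokens" in body) {
    body.max_tokens = body.max_completion_tokens;
    delete body.max_completion_tokens;
}

console.log(body.tool_choice, body.max_tokens, "parallel_tool_calls" in body);
// -> any 256 false
```

Running a payload through these rules locally is a quick way to confirm what the connector will actually send to the Mistral endpoint.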
    
  3. In the connector settings, set Base URL to your Azure AI Foundry endpoint: https://{your-resource-name}.openai.azure.com/openai.

  4. Set the Authentication Type to match your setup. This example uses API Key (customApiKey), but you can use any supported authentication type, including OAuth — update the connector settings accordingly.

  5. Select Save, then publish the connector.

Create a connection in Integration Service

  1. In Integration Service, navigate to Connections and select Add connection.
  2. Select the custom connector you published.
  3. In the API key field, enter your Azure AI Foundry API key.
  4. In the Azure OpenAI resource field, enter your resource name — the subdomain portion of your endpoint URL, without https:// or .openai.azure.com.
  5. Select Connect to provision the connection.
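The resource-name value in step 4 can be derived mechanically from the endpoint URL. The helper below is hypothetical (not part of the connector or any UiPath API), and the resource name my-foundry is an example:

```javascript
// Hypothetical helper: extract the resource name the connection form expects
// from a full Azure AI Foundry endpoint URL.
function resourceNameFromEndpoint(endpoint) {
    const host = new URL(endpoint).hostname;           // e.g. my-foundry.openai.azure.com
    return host.replace(/\.openai\.azure\.com$/, "");  // e.g. my-foundry
}

console.log(resourceNameFromEndpoint("https://my-foundry.openai.azure.com/openai"));
// -> my-foundry
```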

Complete the LLM configuration

  1. Return to Admin > AI Trust Layer > LLM configurations and open the configuration you started.
  2. Under Model Configuration, set Connector to your published connector and Connection to the connection you created.
  3. In the LLM identifier field, enter the deployment name exactly as it appears in Azure AI Foundry.
    Note:

    Trailing spaces in the LLM identifier field cause a DeploymentNotFound error. Verify there are no leading or trailing spaces before saving.

  4. Select Test configuration to run the AI Trust Layer probe.
  5. If the probe passes, select Save.
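The trailing-space pitfall called out in the note above can be caught with a quick check before saving. validateLlmIdentifier is a hypothetical helper for local sanity-checking, not a UiPath API:

```javascript
// Hypothetical pre-save check for the LLM identifier field: a stray leading
// or trailing space produces a DeploymentNotFound error, so fail fast instead.
function validateLlmIdentifier(id) {
    if (id !== id.trim()) {
        throw new Error(`LLM identifier has leading/trailing whitespace: "${id}"`);
    }
    return id;
}

console.log(validateLlmIdentifier("mistral-small-2503"));
// -> mistral-small-2503
```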

Result

The configuration is saved and the Mistral model is available to UiPath Agents for the product and feature you specified. Calls route through the AI Trust Layer and appear in the audit log under Source: Custom connection.

Note:

If you encounter issues while creating a custom connector, contact UiPath Support for assistance.
