UiPath AI Fabric

About AI Fabric

AI Fabric is a service that allows you to deploy, manage, and continuously improve Machine Learning models and consume them within RPA workflows in Studio. The ML models can be built in a Python IDE or using an AutoML platform such as H2O Driverless AI. This chapter covers model deployment and management, which is done through the AI Fabric web application available in your Automation Cloud Portal.
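For orientation, below is a minimal sketch of what the entry point of a Python ML package might look like before being uploaded to AI Fabric. It assumes the usual main.py serving contract (a Main class whose predict method handles each request); the model file name and JSON input shape are illustrative assumptions, not part of this documentation.

```python
# main.py -- illustrative sketch of an ML package entry point.
# Assumes the package ships a pre-trained model saved as model.pkl (hypothetical name)
# and that requests arrive as JSON strings with a "features" field.
import json
import pickle


class Main(object):
    def __init__(self):
        # Load the trained model once, when the ML Skill starts serving.
        with open("model.pkl", "rb") as f:
            self.model = pickle.load(f)

    def predict(self, mlskill_input):
        # Called for every prediction request coming from a robot.
        features = json.loads(mlskill_input)["features"]
        prediction = self.model.predict([features])[0]
        return json.dumps({"prediction": str(prediction)})
```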

Accessing AI Fabric

🚧

Important!

The AI Fabric app is only available in Automation Cloud organizations and is licensed as a separate service.

To gain access to AI Fabric, change your licensing plan to Enterprise Trial or Enterprise and then allocate AI Robot licenses to your AI Fabric service within a tenant in your Automation Cloud organization.

The AI Fabric menu option is displayed in the Cloud Portal left navigation bar. You can access the app if your tenant has the necessary licenses and you have the corresponding permissions.

🚧

Important!

For your Robots to interact with AI Fabric on Cloud, the following URL must be accessible to them:

AI Fabric Licensing

AI Fabric uses the concepts of an AI Robot and an AI Robot Pro for licensing.

An AI Robot is the runtime for serving ML Skills (machine learning models available for robots to make prediction requests against) and running ML training jobs (training a new model version on new data).
One AI Robot can serve two ML Skills or run one ML training job concurrently. Any user connected to the cloud Orchestrator instance can access ML Skills.

An AI Robot Pro is the same concept as an AI Robot, with the addition that the runtime can use a GPU (for both serving and training a model). A GPU is a different processing unit that speeds up certain types of computation; the efficiency gain from a GPU runtime depends heavily on the code being run.
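To make the sizing rule concrete, here is a small sketch (not an official sizing tool) that estimates how many AI Robot licenses a tenant would need, based only on the capacity rule above: one AI Robot serves up to two ML Skills or runs one training job at a time.

```python
import math


def ai_robots_needed(deployed_ml_skills, concurrent_training_jobs):
    # One AI Robot can serve two ML Skills or run one training job at a time,
    # so serving capacity and training capacity are counted separately.
    serving_runtimes = math.ceil(deployed_ml_skills / 2)
    training_runtimes = concurrent_training_jobs
    return serving_runtimes + training_runtimes


# Example: 3 deployed ML Skills plus 1 training pipeline running at the
# same time requires ceil(3 / 2) + 1 = 3 AI Robots.
print(ai_robots_needed(3, 1))  # 3
```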

Relation to UiPath Document Understanding

AI Fabric is the infrastructure on top of which UiPath Document Understanding machine learning models run. These models can be deployed or instantiated for retraining with a few clicks (see this section).

Using a Document Understanding model involves these steps:

  • Collect document samples and gather the requirements for the data points that need to be extracted.
  • Label documents using Data Manager. Note that Data Manager itself connects to an OCR engine.
  • Export labeled documents as a Training data set and upload that exported folder to AI Fabric Storage.
  • Export labeled documents as a Testing data set and upload that exported folder to AI Fabric Storage.
  • Run a Training Pipeline on AI Fabric.
  • Evaluate the model performance with an Evaluation Pipeline on AI Fabric.
  • Deploy the trained model as an ML Skill in AI Fabric.
  • Query the ML Skill from an RPA workflow using the Document Understanding activity pack (see the sketch below).
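In Studio, the last step is handled by the Document Understanding activity pack, so no code is normally required. Purely for illustration, the sketch below shows what a raw prediction request against a deployed ML Skill could look like if it were exposed as an HTTP endpoint; the URL, token, and payload shape are hypothetical placeholders, not a documented AI Fabric API.

```python
# Hypothetical sketch only: the endpoint URL, token, and payload format are
# placeholders, not a documented AI Fabric API. In practice, robots call the
# ML Skill through the Document Understanding activity pack in Studio.
import requests

ML_SKILL_URL = "https://<your-ml-skill-endpoint>/predict"  # placeholder
ACCESS_TOKEN = "<access-token>"  # placeholder

with open("invoice.pdf", "rb") as f:
    response = requests.post(
        ML_SKILL_URL,
        headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
        files={"file": f},
    )

print(response.status_code)
print(response.json())
```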

🚧

Note

Remember that using Document Understanding models requires that the machine on which AI Fabric is installed can access https://du-metering.uipath.com

Permissions

To perform various actions on ML entities, you need the corresponding permissions:

  • Display ML logs and view log data - View on ML Logs
  • Display projects, datasets, ML packages, and pipelines and view their corresponding details - View on ML Packages
  • Display ML skills and view details of the corresponding ML package (available versions, parameters) - View on ML Skills
  • Create a new project, dataset, or pipeline, and upload a new ML package - Create on ML Packages
  • Deploy a new ML skill - Create on ML Skills
  • Update a project, dataset, or pipeline, upload a new ML package version, and view release notes for each version - Edit on ML Packages
  • Update an ML skill to a new package version or roll back to an older one - Edit on ML Skills
  • Remove projects, datasets, pipelines, and undeployed packages and package versions - Delete on ML Packages
  • Remove ML skills - Delete on ML Skills

📘

Note:

Permissions are managed in Orchestrator.

User Personas & Custom Roles

This section describes the user personas that typically work with AI Fabric in your company, and the roles recommended to be defined in Orchestrator for each persona.

Data Scientist

In charge of building ML models and uploading them to AI Fabric as ML packages. Data Scientists can perform this operation from the ML Packages page.

Permissions

  • View, Edit, Create, Delete on ML Packages
  • View on ML Skills
  • View on ML Logs

Process Controller

In charge of deploying models already uploaded by Data Scientists (ML packages), or provided by UiPath (OS packages), into ML skills. Process Controllers can perform this operation from the ML Skills page.

Permissions

  • View, Edit, Create, Delete on ML Skills
  • View on ML Packages
  • View on ML Logs

RPA Developer

In charge of developing and testing automation workflows; usually does not have access to Orchestrator or the AI Fabric app. RPA Developers consume the ML skills available on their Robot. These are retrieved from the Orchestrator tenant the connected Robot has been provisioned in.

Permissions

  • View on ML Skills.
  • View on ML Logs.

RPA Developers consume deployed ML skills within customized workflows in Studio using the ML Skill activity from the UiPath.MLServices.Activities package. This activity package is only available for Studio v2019.10+ and can only be used by Robots v2019.10+. Read more about it here.
