AI Fabric

This page details the hardware and software requirements, as well as the prerequisites, for installing AI Fabric.

🚧

Note

We strongly recommend installing AI Fabric on VM instances provided by popular cloud providers such as AWS, Azure, and GCP.

Hardware Requirements

The table below provides recommendations averaged across generic models (small and large).

| CPU | RAM (GB) | OS/Boot Disk (GB) | External Data Disk (GB) | Models Served | Concurrent Models Trained |
| --- | --- | --- | --- | --- | --- |
| 6 | 24 | 200 | 250 | 1-2 | |
| 8 | 52 | 200 | 500 | 3 | 1-2 |
| 12 | 64 | 200 | 1000 | 3-4 | 2 |
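
As a quick sanity check before installing, a short script like the sketch below can compare the machine's CPU count and memory against the smallest recommended configuration in the table above. This is only a sketch, not part of the installer: the thresholds come from the first row of the table, and it reads /proc/meminfo, so it only works on Linux hosts such as the supported operating systems listed further down.

```python
# Sketch: compare this machine against the smallest recommended configuration
# in the table above (6 CPUs, 24 GB RAM). Linux-only: reads /proc/meminfo.
import os

MIN_CPUS = 6     # smallest recommended CPU count
MIN_RAM_GB = 24  # smallest recommended RAM in GB

cpus = os.cpu_count() or 0
with open("/proc/meminfo") as f:
    mem_kb = next(int(line.split()[1]) for line in f if line.startswith("MemTotal"))
ram_gb = mem_kb / (1024 ** 2)

print(f"CPUs: {cpus} (recommended minimum {MIN_CPUS})")
print(f"RAM:  {ram_gb:.1f} GB (recommended minimum {MIN_RAM_GB} GB)")
if cpus < MIN_CPUS or ram_gb < MIN_RAM_GB:
    print("Warning: below the smallest recommended configuration in the table above")
```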

❗️

Airgapped installation

An airgapped installation requires a 500 GB OS/boot disk.

Hardware Requirements for UiPath Document Understanding Models

  • Serving one Document Understanding model uses ~1 CPU and ~4 GB RAM.
  • Training one Document Understanding model on 1,000 images uses ~2 CPU and ~24 GB RAM.
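
Based on the approximate figures above, a rough sizing estimate for a machine dedicated to Document Understanding might look like the sketch below. The per-model numbers are taken directly from the bullets above; treat the result as a starting point only, since it excludes the OS and AI Fabric core services.

```python
# Rough sizing sketch for Document Understanding workloads, using the
# approximate per-model figures above (~1 CPU / ~4 GB RAM per served model,
# ~2 CPU / ~24 GB RAM per concurrent training run on ~1,000 images).
def estimate_resources(models_served, concurrent_trainings):
    cpus = models_served * 1 + concurrent_trainings * 2
    ram_gb = models_served * 4 + concurrent_trainings * 24
    return cpus, ram_gb

cpus, ram_gb = estimate_resources(models_served=2, concurrent_trainings=1)
print(f"Estimated: ~{cpus} CPUs and ~{ram_gb} GB RAM "
      "(excluding OS and AI Fabric core services)")
```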

🚧

Before provisioning a machine, be sure to read the installation instructions. The external data disk attached to the machine must be unformatted and must be of type disk, not partition. See Step 1: Provision a Machine.
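
One way to confirm that the external data disk is unformatted and of type disk (not partition) is to inspect lsblk output, for example with a small wrapper like the sketch below. The device name /dev/sdb is an assumption; substitute the disk you actually attached.

```python
# Sketch: verify that the attached data disk is of type "disk" (not a
# partition) and carries no filesystem, using lsblk. The device name below
# is an example only; substitute the external data disk you attached.
import subprocess

DEVICE = "/dev/sdb"  # assumption: adjust to your external data disk

out = subprocess.run(
    ["lsblk", "-n", "-r", "-o", "NAME,TYPE,FSTYPE", DEVICE],
    capture_output=True, text=True, check=True,
).stdout
rows = [line.split() for line in out.strip().splitlines()]

# The first row describes the device itself; any further rows are partitions.
dev_type = rows[0][1] if len(rows[0]) > 1 else ""
fstype = rows[0][2] if len(rows[0]) > 2 else ""

assert dev_type == "disk", f"{DEVICE} is of type '{dev_type}', not 'disk'"
assert not fstype, f"{DEVICE} already has a '{fstype}' filesystem"
assert len(rows) == 1, f"{DEVICE} already has partitions"
print(f"{DEVICE} looks unformatted and unpartitioned")
```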

GPU Requirements

Only NVIDIA GPUs are currently supported. Most scenarios do not require training on a GPU, as very few model architectures can run on a GPU but not on a CPU. If you have constraints on model training time, we recommend adding a GPU with at least 8 GB of video RAM. GPU driver requirements only apply to airgapped installations; see Prerequisites for Installation below.
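
If you do add a GPU, you can confirm that an NVIDIA card with at least 8 GB of video RAM is visible using nvidia-smi, for example via a small wrapper like this sketch. It assumes the NVIDIA driver (and therefore nvidia-smi) is already installed on the machine.

```python
# Sketch: check that an NVIDIA GPU with at least 8 GB of video RAM is visible.
# Assumes the NVIDIA driver (and therefore nvidia-smi) is already installed.
import subprocess

out = subprocess.run(
    ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader,nounits"],
    capture_output=True, text=True, check=True,
).stdout

for line in out.strip().splitlines():
    name, mem_mib = (field.strip() for field in line.split(","))
    print(f"{name}: {int(mem_mib) / 1024:.1f} GB video RAM")
    if int(mem_mib) < 8 * 1024:
        print("Warning: less than the recommended 8 GB of video RAM")
```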

Software Requirements

Operating System

The following table lists the operating system(s) officially supported for the AI Fabric on-premises installation.

| OS | Version |
| --- | --- |
| Ubuntu | 18.04 LTS |
| RHEL | 7.4, 7.5, 7.6, 7.7, 7.8 |
| CentOS | 7.4, 7.5, 7.6, 7.7, 7.8 |
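
A quick way to confirm the machine runs one of the supported operating systems is to read /etc/os-release, as in this sketch. Note that CentOS (and some RHEL builds) report only the major version in VERSION_ID, so the check also accepts a matching major version.

```python
# Sketch: read /etc/os-release and compare it with the supported OS list above.
SUPPORTED = {
    "ubuntu": {"18.04"},
    "rhel": {"7.4", "7.5", "7.6", "7.7", "7.8"},
    "centos": {"7.4", "7.5", "7.6", "7.7", "7.8"},
}

os_release = {}
with open("/etc/os-release") as f:
    for line in f:
        key, sep, value = line.strip().partition("=")
        if sep:
            os_release[key] = value.strip('"')

distro = os_release.get("ID", "").lower()
version = os_release.get("VERSION_ID", "")
versions = SUPPORTED.get(distro, set())

# Accept an exact match, or a major-version-only VERSION_ID (e.g. "7").
ok = version in versions or any(v.startswith(version + ".") for v in versions)
print(f"Detected {distro} {version}: {'supported' if ok else 'not in the supported list'}")
```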

Browsers

The following table lists the browser(s) officially supported for the AI Fabric on-premises installation.

| Browser | Version |
| --- | --- |
| Google Chrome | 64 or above |
| Microsoft Edge | 80 or above |
| Mozilla Firefox | 66 or above |

Prerequisites for Installation

Before starting the UiPath AI Fabric installation, the following prerequisites are needed:

  • Orchestrator 20.4
    See the guide here for various ways to install Orchestrator.
  • SQL Server
    It is highly recommended that you use the same SQL Server that was used when installing Orchestrator, as detailed here. For the installation, you need the hostname, admin username, and password of this SQL Server (a connectivity check sketch follows the note below).
  • GPU Prerequisites for Airgapped Install
    These prerequisites apply only when the machine has a GPU and the installation is in an airgapped environment. In other words, if the environment allows outbound connections, this section can be ignored, because a provided script downloads the NVIDIA components from NVIDIA repositories and installs them.

For an AI Fabric airgapped installation (that is, one that avoids the provided GPU installer script), the node must have NVIDIA driver version 450.51.06 installed, as well as nvidia-container-runtime.
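
For the airgapped GPU case, a sketch like the following can confirm the documented driver version and the presence of nvidia-container-runtime before starting the install. It assumes nvidia-smi is already on the node.

```python
# Sketch: verify the NVIDIA driver version and that nvidia-container-runtime
# is on PATH, for an airgapped GPU node. Assumes nvidia-smi is installed.
import shutil
import subprocess

EXPECTED_DRIVER = "450.51.06"

driver = subprocess.run(
    ["nvidia-smi", "--query-gpu=driver_version", "--format=csv,noheader"],
    capture_output=True, text=True, check=True,
).stdout.strip()
print(f"NVIDIA driver: {driver} (expected {EXPECTED_DRIVER})")
if driver != EXPECTED_DRIVER:
    print("Warning: driver version differs from the documented requirement")

if shutil.which("nvidia-container-runtime") is None:
    print("Warning: nvidia-container-runtime not found on PATH")
else:
    print("nvidia-container-runtime found")
```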

🚧

Make sure that SQL Server Authentication mode is enabled.
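
To confirm that the SQL Server is reachable with SQL Server Authentication (that is, with the hostname, admin username, and password mentioned above), a connection test along these lines can help. The third-party pyodbc package and the Microsoft ODBC driver are assumptions about your client environment, and the hostname and credentials are placeholders.

```python
# Sketch: test SQL Server connectivity with SQL Server Authentication.
# Requires the third-party "pyodbc" package and a Microsoft ODBC driver;
# the hostname and credentials below are placeholders.
import pyodbc

conn_str = (
    "DRIVER={ODBC Driver 17 for SQL Server};"
    "SERVER=your-sql-server-hostname;"   # placeholder
    "UID=your-admin-username;"           # placeholder
    "PWD=your-password;"                 # placeholder
)

with pyodbc.connect(conn_str, timeout=10) as conn:
    version = conn.cursor().execute("SELECT @@VERSION").fetchone()[0]
    print("Connected with SQL Server Authentication")
    print(version.splitlines()[0])
```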

AI Fabric Architecture

AI Fabric runs on a Kubernetes cluster. All communication into and out of the cluster is secured with HTTPS (TLS). Tenant- and user-specific traffic uses modern protocols (OAuth 2.0 and OpenID Connect) supported by UiPath's Identity Server.

The diagram below shows the various components of AI Fabric in detail.

At a high level, AI Fabric core services manage the deployment and training of machine learning models.

A deployment of a machine learning model (called an ML Skill) is a container with the code and model artifacts. AI Fabric creates an endpoint from that container that is permissioned and replicated.
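
As an illustration of what an ML Skill endpoint looks like from a consumer's point of view, the sketch below sends a prediction request over HTTPS. The URL, request body, and authentication header are all placeholders and assumptions rather than the documented AI Fabric API; robots typically consume ML Skills through UiPath activities instead of raw HTTP. The third-party requests package is also an assumption.

```python
# Illustration only: calling an ML Skill endpoint over HTTPS. The URL, payload,
# and token below are placeholders, not the documented AI Fabric API; robots
# typically consume ML Skills through UiPath activities instead.
import requests

SKILL_URL = "https://aifabric.example.com/ml-skills/my-skill/predict"  # placeholder
TOKEN = "<access-token-from-identity-server>"                          # placeholder

response = requests.post(
    SKILL_URL,
    json={"text": "example input"},
    headers={"Authorization": f"Bearer {TOKEN}"},
    timeout=30,
)
response.raise_for_status()
print(response.json())
```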

A training or evaluation of a machine learning model creates a container image on the fly and executes code predefined by the AI Fabric user or by an out-of-the-box retrainable model.
