Depending on how they are deployed, UiPath Task Mining resources fall into the following hosting models:
This instance is installed locally, on the organization's own machines, and provides the following functionality:
The Client Application is the local application that must be installed on each user's machine to collect data from the users registered in Task Mining. It is linked to the Admin Console to facilitate recording process management and other user-specific settings. It consists of two parts: the Recorder and the Processing Queue.
- Recorder
The desktop application installed by each user after receiving the invitation email. When the Company Admin triggers the recording process, it captures the user's individual steps within whitelisted applications.
- Processing Queue
This is installed automatically together with the Recorder and is used to process the collected data and send it to the Local Data Store in real time. It runs locally in the background and has no UI or controls.
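The Recorder/Processing Queue split can be thought of as a producer-consumer pair. The following is a minimal illustrative sketch only; the class, method, and file names are hypothetical and not part of the actual UiPath client:

```python
import json
import queue
import threading
from pathlib import Path

class ProcessingQueue:
    """Hypothetical sketch: a background worker that drains recorded
    steps from an in-memory queue and persists them to the local data
    store as they arrive, mirroring the no-UI background role above."""

    def __init__(self, data_store: Path):
        self.data_store = data_store
        self.data_store.mkdir(parents=True, exist_ok=True)
        self._queue = queue.Queue()
        self._worker = threading.Thread(target=self._process, daemon=True)
        self._worker.start()

    def submit(self, step: dict) -> None:
        # Called by the recorder for every captured user step.
        self._queue.put(step)

    def _process(self) -> None:
        # Runs in the background; no UI or controls.
        while True:
            step = self._queue.get()
            if step is None:  # sentinel placed by close()
                break
            out = self.data_store / f"step_{step['id']}.json"
            out.write_text(json.dumps(step))
            self._queue.task_done()

    def close(self) -> None:
        self._queue.put(None)
        self._worker.join()
```

In this sketch, `submit()` returns immediately so recording is never blocked by disk I/O; the worker thread writes each step to the data store in arrival order.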
- Local Data Store
To aggregate the recorded data from multiple users, we recommend setting up an internal network drive accessible to all users, for example a separate machine with increased disk space. The Company Admin must define this location in the Configuration File uploaded to the Admin Console.
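Before recording starts, it is worth confirming that the shared location declared in the configuration is actually reachable and writable from a user machine. The real Configuration File format is defined by UiPath; the JSON key and helper below are assumptions used purely for illustration:

```python
import json
from pathlib import Path

def validate_data_store(config_path: Path) -> Path:
    """Hypothetical check: read a shared data-store path from a config
    file and verify it is reachable and writable (e.g. a mapped network
    drive). The 'data_store_path' key is illustrative, not the real
    UiPath schema."""
    config = json.loads(config_path.read_text())
    store = Path(config["data_store_path"])
    if not store.is_dir():
        raise FileNotFoundError(f"Data store not reachable: {store}")
    probe = store / ".write_probe"
    probe.write_text("ok")  # confirm users can actually write here
    probe.unlink()
    return store
```

Running such a check on each user's machine before rollout catches permission and drive-mapping problems before any recording data is lost.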
- Uploader
A command-line tool executed by a Company Admin after a sufficient amount of data has been collected. It securely uploads the aggregated recorded data from the Local Data Store to the Blob Storage, making it accessible to the Analyzer.
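Conceptually, the upload step gathers the data store into a single transferable unit whose integrity can be verified after transfer. This is a sketch of that idea only, not the actual UiPath Uploader or its interface:

```python
import hashlib
import zipfile
from pathlib import Path

def package_data_store(store: Path, archive: Path) -> str:
    """Illustrative only: bundle everything in the local data store into
    one archive and return a SHA-256 checksum so the transfer can be
    verified after upload. The real Uploader's behavior may differ."""
    with zipfile.ZipFile(archive, "w", zipfile.ZIP_DEFLATED) as zf:
        for f in sorted(store.rglob("*")):
            if f.is_file():
                # store paths relative to the data-store root
                zf.write(f, f.relative_to(store))
    return hashlib.sha256(archive.read_bytes()).hexdigest()
```

Recording a checksum at packaging time lets the receiving side detect a corrupted or truncated transfer before the analysis is triggered.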
This instance is hosted on UiPath servers, is accessed through a web browser, and provides the following functionality:
- Admin Console
The web application used by the Company Admin to manage the process discovery initiatives.
- Blob Storage
A UiPath-managed Microsoft Azure data store that receives the data uploaded by the Uploader. The data is then made available to the Analyzer.
- Analyzer
The Machine Learning (ML) model used to discover recommended automation candidates. The analysis starts after the Company Admin triggers the upload process. The output is a ZIP file describing the automatically generated process map, which is emailed as an attachment to the Company Admin.
The functionality that generates the process map diagram in the Admin Console after the ZIP file is uploaded.
The data is transformed in the client application before being transferred to the shared folder, and it is encrypted before the upload. It is then encrypted again as part of Azure Storage and remains double-encrypted until the Analyzer decrypts it on a new VM spun up for the analysis. The result is encrypted again before being transferred off the VM to the blob.
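The layered ("double") encryption described above can be pictured as two independent keys applied in sequence, both of which must be removed before the data is readable. The toy XOR stream below only illustrates the layering; a real deployment would use AES through a vetted cryptography library, and the key names are invented:

```python
import hashlib
from itertools import cycle

def xor_layer(data: bytes, key: str) -> bytes:
    """Toy stand-in for a real cipher: XOR with a key-derived stream.
    Applying the same key again removes that layer."""
    stream = hashlib.sha256(key.encode()).digest()
    return bytes(b ^ k for b, k in zip(data, cycle(stream)))

payload = b"recorded task data"
# Layer 1: encrypted on the client side before the upload.
client_side = xor_layer(payload, "client-key")
# Layer 2: encrypted again at rest in Azure storage.
at_rest = xor_layer(client_side, "azure-storage-key")

# The Analyzer's VM removes both layers to recover the data.
decrypted = xor_layer(xor_layer(at_rest, "azure-storage-key"), "client-key")
```

The point of the sketch is that a party holding only one of the two keys (the storage provider, for instance) cannot recover the payload.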