
Orchestrator User Guide

Automation Cloud | Automation Cloud Public Sector | Automation Suite | Standalone
Last updated Nov 22, 2024

About Logs

The Logs page displays logs generated by Robots in all folders the user has access to, including logs generated for jobs started through remote debugging sessions.

To access it, navigate to the Automations tab in Orchestrator from a folder context, and select Logs from the displayed options.

The fields displayed on the Logs page are described below.

Time

The timestamp at which the log was registered.

Note: You can sort and filter the log list by Time.

Level

The severity level at which the message was logged. The following options are available: Trace, Debug, Info, Warn, Error, and Fatal.

Note: You can sort and filter the log list by Level.

Process

The name of the process that generated a given log message.

Hostname

The name of the workstation used for the process execution.

Host identity

The identity under which the execution takes place. The following values are possible:

  • <Domain\Username> - jobs executed under that specific account. Displayed in the following cases:
      ◦ foreground jobs, regardless of the Robot version;
      ◦ all jobs executed on Robots earlier than 2021.10;
      ◦ attended jobs executed on robots connected using a machine key, without user sign-in.

    Note: For Robots earlier than 2021.10, the host identity is populated dynamically according to the account settings made in Orchestrator. Changing the domain\username for the account used to execute a job changes the host identity as well.
  • ROOT - background jobs executed on Linux robots.
  • NT AUTHORITY\LOCAL SERVICE - jobs executed under the Robot service identity. Displayed for background jobs executed on Robots 2021.10+ without credentials.
    Service mode robots run under NT AUTHORITY\LOCAL SERVICE. User mode robots run under a specific user identity.
  • N/A - jobs started from the Assistant by users connected using interactive sign-in. For robots connected using the machine key, without user sign-in, the <Domain\Username> is displayed.

Message

The logged message. This can also be a message logged through the Log Message activity in Studio.

Keep in mind that the content of this column is displayed in the language of the Robot regardless of what language was chosen for Orchestrator.

Note: Logs generated outside the context of a folder can be viewed only in the Orchestrator database and Elasticsearch.

Filtering

To view all logs generated by a Robot for a specific job, navigate to the Jobs page.

To filter logs by the name of the host machine they have been generated on, use the Machine filter on the Logs page.

The Machine filter works retroactively for logs stored in Elasticsearch, while for logs stored in the SQL database it only applies to new log entries.
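
If you need to apply similar filters programmatically, robot logs are also exposed through the Orchestrator OData API. The sketch below is a minimal Python example using the /odata/RobotLogs endpoint; the URL, folder ID, and the field names used in the $filter expression are assumptions to verify against the API reference for your Orchestrator version.

    import requests

    ORCHESTRATOR_URL = "https://orchestrator.example.com"  # hypothetical URL
    ACCESS_TOKEN = "<bearer token>"                         # obtained separately
    FOLDER_ID = "123456"                                    # hypothetical folder (organization unit) ID

    headers = {
        "Authorization": f"Bearer {ACCESS_TOKEN}",
        # Folder context for the request; logs are scoped to folders.
        "X-UIPATH-OrganizationUnitId": FOLDER_ID,
    }

    params = {
        # Field names (Level, MachineName, TimeStamp) are assumptions - check
        # the API reference for your Orchestrator version.
        "$filter": "Level eq 'Error' and MachineName eq 'HOST01'",
        "$orderby": "TimeStamp desc",
        "$top": 100,
    }

    response = requests.get(
        f"{ORCHESTRATOR_URL}/odata/RobotLogs", headers=headers, params=params
    )
    response.raise_for_status()

    for entry in response.json().get("value", []):
        print(entry.get("TimeStamp"), entry.get("Level"), entry.get("Message"))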

Log Storage

If Orchestrator is unavailable, logs are stored in a local database (C:\Windows\SysWOW64\config\systemprofile\AppData\Local\UiPath\Logs\execution_log_data), within the limits of the available disk space, until the connection is restored. When the connection is restored, the logs are sent in batches, in the order in which they were generated.
Note: The database is not deleted after the logs have been successfully sent to Orchestrator.

The status of a job is stored in the memory of the UiPath® Robot service. When Orchestrator becomes available, the information regarding the job status is synced between the two. However, if Orchestrator is not available and you restart the UiPath Robot service, the information is lost. This means that, whenever Orchestrator becomes available, the job is executed again.

Logs can be sent to Elasticsearch, a local SQL database, and/or UiPath Insights, thus enabling you to have non-repudiation logs. These targets are independent of each other, so an issue encountered in one does not affect the others.

Configure the location where logs are stored in UiPath.Orchestrator.dll.config by changing the value of the writeTo parameter.

The Logs page displays the entries that Robots send to Orchestrator if the logs are sent to Elasticsearch or a SQL database. If logs are sent to both Elasticsearch and SQL, then the Logs page displays only the entries sent to Elasticsearch.

Note:

The performance of the SQL database starts degrading when it reaches 2 million Robot logs, with a severe degradation once the 6 million threshold is reached. The degradation leads to slow log searches and impacts the performance of your automations.

These thresholds represent our observed averages. Depending on your database server hardware, the values can either be smaller or larger (by up to 2 - 3 times).

You need to clean up the database regularly to ensure that you stay within these limits.

If, however, your business needs require that you exceed these thresholds, use Elasticsearch to maintain performance.
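
As an illustration of a periodic cleanup, the Python sketch below deletes old entries in batches. The connection string, the 90-day retention window, and the dbo.Logs / TimeStamp names are assumptions about a typical deployment; verify them against your own Orchestrator database and take a backup before running anything similar.

    import pyodbc

    # Connection details are placeholders; the dbo.Logs table and TimeStamp
    # column names are assumptions to confirm in your own database.
    connection = pyodbc.connect(
        "DRIVER={ODBC Driver 17 for SQL Server};"
        "SERVER=sql-host;DATABASE=UiPath;Trusted_Connection=yes;"
    )
    connection.autocommit = True
    cursor = connection.cursor()

    # Delete Robot logs older than 90 days in small batches to keep
    # transactions short and avoid long-running locks on a busy table.
    while True:
        cursor.execute(
            "DELETE TOP (10000) FROM dbo.Logs "
            "WHERE TimeStamp < DATEADD(day, -90, GETUTCDATE())"
        )
        if cursor.rowcount == 0:
            break

    connection.close()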

If you use Elasticsearch to store your Robot logs, please note that, in certain circumstances, only 10,000 items can be queried.

Due to an Elasticsearch limitation, Cloud Platform's Orchestrator tenants are configured to ignore Robot logs larger than 50 KB. The vast majority of logs average around 2 KB.
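
If you query Elasticsearch directly, one way around the 10,000-item ceiling on a single paginated search is to stream results with the scroll-based helper from the official Python client (8.x-style connection shown). The host, credentials, index pattern, and field name below are assumptions about your deployment; a minimal sketch:

    from elasticsearch import Elasticsearch
    from elasticsearch.helpers import scan

    # Host and credentials are placeholders; adjust them to your cluster.
    es = Elasticsearch("https://elastic-host:9200", basic_auth=("user", "password"))

    # helpers.scan wraps the scroll API, so it can stream well past the
    # 10,000-hit ceiling of a single paginated search. The index pattern and
    # the field name in the query are assumptions - they depend on your NLog
    # target configuration.
    query = {"query": {"match": {"processName": "MyProcess"}}}

    for hit in scan(es, index="mytenant-*", query=query):
        print(hit["_source"])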

Logging Levels

Messages are logged on the following levels: Trace, Debug, Info, Warn, Error, and Fatal.

Custom messages can also be sent to this page from Studio, with the Log Message activity. The messages can be logged at all the levels described above and should be used for diagnostic purposes.

For example, you can log a custom message at the Fatal severity level.



See Logging Levels for more information.

Exporting Logs

All logs can be exported to a .csv file by clicking the Export button. The filters applied to this page are taken into account when this file is generated. For example, if you set the page to display only logs from the last 30 days with an Info severity level, only the entries that meet these criteria are downloaded.

To ensure the best performance, the exported entries are not in reverse chronological order.
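
If you need a specific order or additional filtering after exporting, any CSV tooling works; below is a minimal pandas sketch that assumes column headers such as Time and Level mirroring the grid (check them against your own export).

    import pandas as pd

    # The file name and the column headers ("Time", "Level") are assumptions -
    # check them against your own export.
    logs = pd.read_csv("logs-export.csv")
    logs["Time"] = pd.to_datetime(logs["Time"], errors="coerce")

    # Keep Info-level entries from the last 30 days (relative to the newest
    # entry), mirroring the filter example above, and restore reverse
    # chronological order.
    cutoff = logs["Time"].max() - pd.Timedelta(days=30)
    recent_info = logs[(logs["Level"] == "Info") & (logs["Time"] >= cutoff)]
    recent_info = recent_info.sort_values("Time", ascending=False)

    print(recent_info.head())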

Logs may be displayed out of order only in the following scenario:

  • There are two or more robot log entries with almost equal timestamps - they are equal up to the millisecond (the time expressed as yyyy-MM-dd HH:mm:ss.fff is the same), but differ in the millisecond's subunits (the last four digits in yyyy-MM-dd HH:mm:ss.fffffff are different).
  • The logs are viewed in Orchestrator with the default sort order in the grid (sorted by Time, descending).

However, this does not affect the database or the exported .csv file.

Note: Both server exceptions from Orchestrator, and the stack trace on the Job Details window, are logged in English, regardless of what language was chosen by the user.

Permissions

You need View permission on Logs in the folder context to have access to logs generated in that folder.

