Adding Custom Data to Insights
- If a tenant has more than 75 processes and/or queues with custom data added, you must either add your desired process and queue tables to an allowlist or enable Table Grouping.
- A configurable alert is displayed warning you when you are approaching this limit.
- Table grouping may also need to be enabled based upon the total number of such tables across all tenants. Review the Hardware and Software Requirements for additional details.
- The number of columns in process and queue tables is limited to 40.
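As a sketch, the limits above can be expressed as a simple pre-flight check. The threshold values come from this page; the helper itself is hypothetical and not part of any UiPath API:

```python
# Hypothetical pre-flight check against the documented Insights limits:
# at most 40 columns per process/queue table, and an allowlist or Table
# Grouping required beyond 75 custom tables per tenant.
MAX_COLUMNS_PER_TABLE = 40
TABLE_LIMIT_PER_TENANT = 75

def check_custom_tables(tables: dict) -> list:
    """Return a list of warnings for a tenant's custom tables.

    `tables` maps each table name to its list of column names.
    """
    warnings = []
    if len(tables) > TABLE_LIMIT_PER_TENANT:
        warnings.append(
            f"{len(tables)} custom tables exceed the {TABLE_LIMIT_PER_TENANT}-table "
            "limit: use an allowlist or enable Table Grouping."
        )
    for name, columns in tables.items():
        if len(columns) > MAX_COLUMNS_PER_TABLE:
            warnings.append(
                f"Table '{name}' has {len(columns)} columns "
                f"(limit is {MAX_COLUMNS_PER_TABLE})."
            )
    return warnings
```

Running this against a table with 41 columns yields a single warning for the column limit.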
Ensure that your custom variable names are unique regardless of case (e.g. `successfulTransaction` and `SuccessfulTransaction`). If your variables are not unique, you must set Insights to be case insensitive.
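A quick way to spot such name collisions before publishing a workflow is to compare variable names case-insensitively. This helper is purely illustrative, not part of Insights:

```python
from collections import defaultdict

def find_case_collisions(names: list) -> list:
    """Group variable names that differ only by letter case."""
    groups = defaultdict(set)
    for name in names:
        groups[name.lower()].add(name)
    # Only groups with more than one spelling are collisions.
    return [g for g in groups.values() if len(g) > 1]
```

For example, `find_case_collisions(["successfulTransaction", "SuccessfulTransaction", "total"])` returns one group containing both conflicting spellings.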
When designing automation workflows intended for consumption and analysis in Insights, there are changes to consider relative to the best-practice approach used to date.
You do not need to "tag" individual items and/or data points to ensure their availability for analysis in Insights. For all processes and/or queues with custom fields, a custom table is automatically generated with those fields available.
The following activities can be used in designing your automation workflows to capture the desired data:
- Add Log Fields - all added log fields are ingested as a custom table in Insights. See [here](doc:adding-custom-data-to-insights#logs) for more details.
- Add Queue Item - all data added via the `ItemInformation` property is ingested as a custom table in Insights. See here for more details.
- Set Transaction Status - all data added via the `Analytics` and `Output` properties is ingested as a custom table in Insights. See here for more details.
For complete details about all data ingested by Insights, see the Insights Data Model page.
For workflows that you want to analyze in Insights, it is recommended to log all desired data at the end of each workflow.
Previously, you might have used the Remove Log Fields activity at the end of an individual sequence to remove log fields not needed in the rest of the workflow. Removing all custom log fields in this way, however, prevents the process and its data from being ingested into the Insights database.
There are two methods to leverage logs for sending data to Insights:
- Log Message - enables you to add a custom string message that is sent to Insights, in the form of a log event, in case of an error in your workflow.
- Add Log Fields - enables you to add custom variables to the default Insights dashboards.
Custom log fields are ingested into a table named `Process-<your process name>`. These custom fields are extracted from the raw message via a custom JSON parser and added to the table in the `RawMessage_CustomFieldName` format. The supported data types are `Integer`, `String`, `Boolean`, `Double`, and `Float`.
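The extraction step can be sketched as follows. The parser shown is a simplified stand-in for the custom JSON parser Insights uses, and the field names (`invoiceTotal`, the list of standard attributes) are invented for illustration:

```python
import json

# Illustrative subset of built-in log attributes that are NOT custom fields.
STANDARD_FIELDS = {"message", "level", "timeStamp", "processName", "jobId"}

def extract_custom_fields(raw_message: str) -> dict:
    """Pull custom fields out of a raw JSON log event and rename them
    using the RawMessage_<CustomFieldName> convention."""
    event = json.loads(raw_message)
    return {
        f"RawMessage_{key}": value
        for key, value in event.items()
        if key not in STANDARD_FIELDS
    }

raw = '{"message": "done", "level": "Information", "invoiceTotal": 118.5}'
print(extract_custom_fields(raw))  # {'RawMessage_invoiceTotal': 118.5}
```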
For successful processes, Insights receives only the last log event. If you use the Remove Log Fields activity, Insights will not be able to capture the custom fields, as they are removed by the workflow before the process completes.
This also means that transactional processes that leverage logs for capturing data per transaction will not send the data for each transaction log to Insights. If you are using a process that iterates through transactions and wish to store custom data for each, you should follow UiPath best practices and use queues. Queues support sending custom variables for every queue item to Insights.
When adding items to a queue with the Add Queue Item activity, the `ItemInformation` property becomes available. There, you can store any desired custom variables. The supported data types for all queue data are the same as for custom logs: `Integer`, `String`, `Boolean`, `Double`, and `Float`.
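Before adding a queue item, you could verify that every custom value uses one of the supported types. This validator is illustrative only; Python's `int`, `str`, `bool`, and `float` stand in for the Integer, String, Boolean, Double, and Float types listed above:

```python
# Python stand-ins for the supported Insights data types:
# Integer, String, Boolean, Double, and Float.
SUPPORTED_TYPES = (bool, int, float, str)

def validate_item_information(item_information: dict) -> None:
    """Raise TypeError if any custom value has an unsupported type."""
    for key, value in item_information.items():
        if not isinstance(value, SUPPORTED_TYPES):
            raise TypeError(
                f"Field '{key}' has unsupported type {type(value).__name__}; "
                "use Integer, String, Boolean, Double, or Float."
            )
```

For example, a value of type `list` or `dict` would be rejected, while numbers, strings, and booleans pass.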
When completing a transaction with the Set Transaction Status activity, custom data can be added via its `Analytics` and `Output` properties.
Custom queue fields are ingested into a table named `Queue-<your queue name>`. The fields will be identified via the input they were stored in:
- Variables stored in `Analytics` will be named `AnalyticsData_<yourfieldname>`.
- Variables stored in `Output` will be named `Output_<yourfieldname>`.
- Variables stored in `ItemInformation` will be named `SpecificData_<yourfieldname>`.
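The naming scheme above amounts to a fixed prefix per input. A minimal sketch of that mapping, with invented field names (`duration`, `status`, `invoiceId`) standing in for real queue data:

```python
# Prefix applied to each queue input, per the naming rules above.
PREFIXES = {
    "Analytics": "AnalyticsData_",
    "Output": "Output_",
    "ItemInformation": "SpecificData_",
}

def insights_column_names(inputs: dict) -> dict:
    """Flatten queue inputs into column names for the
    Queue-<your queue name> table."""
    columns = {}
    for prop, fields in inputs.items():
        prefix = PREFIXES[prop]
        for field, value in fields.items():
            columns[f"{prefix}{field}"] = value
    return columns

example = insights_column_names({
    "Analytics": {"duration": 12.3},
    "Output": {"status": "Success"},
    "ItemInformation": {"invoiceId": "INV-42"},
})
# example == {'AnalyticsData_duration': 12.3,
#             'Output_status': 'Success',
#             'SpecificData_invoiceId': 'INV-42'}
```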