Test Data Queues
Make use of the test data queue in Studio by configuring it as a data source or importing it through activities. All imported entities are stored in the Project tab, under Test Data.
For more information on test data queues in Orchestrator, see Test Data Queues.
- Install or upgrade to Orchestrator version 2022.4 or higher.
- Make sure that you have uploaded a JSON schema and added a test data queue in Orchestrator.
- Newly created test data queues are empty, but you can upload queue items based on your defined JSON schema, either directly in Orchestrator, or through Studio.
- When you use a test data queue as the source, items are retrieved sequentially from the queue, regardless of whether some of them have already been consumed.
- To run and debug test cases with dynamic test data, use the Test Explorer. The values come from the test data queue and are populated by the Test Explorer at runtime.
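For reference, a test data queue schema is an ordinary JSON Schema document. A minimal sketch of what such a schema might look like (the givenName and email properties here are illustrative examples, not required by the product):

```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "type": "object",
  "properties": {
    "givenName": { "type": "string" },
    "email": { "type": "string" }
  },
  "required": ["givenName"]
}
```

Queue items you upload later must conform to whatever schema you defined here.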
You can add test data queue items to your workflows either through activities or as a data source to a test case.
To add items to your test data queue using an activity, you need to create a custom workflow and configure an activity to import data from a test data queue in Orchestrator.
- Open Studio.
- Open or create a new Test Automation project.
- Navigate to New > Test Case.
- Add an Add Test Data Queue Item activity to the Designer panel and, in the Properties panel, configure the QueueName to match your Orchestrator test data queue.
Note: Make sure that you are connected to an Orchestrator instance.
- In the Testing > Data activities list, add the activities of your choice to the Add Test Data Queue Item activity. For more information, see Test Data Activities.
- Configure your activities to match your synthetic user data scenario.
Important: A valid queue item must use the exact property names defined in your JSON schema. For example, if you have defined a givenName property in your JSON schema, the corresponding Given Name activity should match it (e.g. DisplayName set to givenName). For more information, see JSON Schema Definition.
- Open the Variables panel to define your variables.
- Save your test case.
- In the Project panel, right-click your test case and select Set as publishable.
- Click Publish and configure the package properties.
- Click Run File.
Note: Alternatively, you can create a custom input form to easily configure and generate synthetic user data. For more information, see Create Form.
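The exact-name requirement from the steps above can be illustrated outside Studio. The following is a stand-in sketch, not how Studio or Orchestrator validate internally: a minimal stdlib-only check that a queue item's property names and types match an illustrative givenName schema.

```python
# Minimal stand-in for schema validation, using only the standard library.
# The schema and the property name "givenName" are illustrative examples.
schema = {"required": ["givenName"], "properties": {"givenName": str}}

def is_valid_queue_item(item: dict) -> bool:
    """Check that required properties exist and have the expected type."""
    for name in schema["required"]:
        if name not in item:
            return False
    return all(
        isinstance(item[key], expected)
        for key, expected in schema["properties"].items()
        if key in item
    )

print(is_valid_queue_item({"givenName": "Ada"}))    # True
print(is_valid_queue_item({"given_name": "Ada"}))   # False: the name must match the schema exactly
```

Note how a near-miss property name (given_name instead of givenName) is rejected, which mirrors the Important note above.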
When you create test cases, you can configure the data source to point to a test data queue to make use of data-driven testing.
- Create a new test case with test data, or update an existing test case with test data.
- Click Source and select Test Data Queue from the dropdown list.
- Select a test data queue or use the search function to look for it.
- (Optional) Filter the test data queue to retrieve only specific items using the built-in Query Builder. Use the Range option to set the desired range of items: the first field is the starting index for the range, and the second field is the number of items to retrieve.
- Click Create to add the test data queue as test data to the test case.
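The Range option described above boils down to a start index plus a count. A quick sketch of that arithmetic, assuming a zero-based index purely for illustration (the item values are made up):

```python
# Range semantics: first field = starting index, second field = number of items.
queue_items = ["item0", "item1", "item2", "item3", "item4"]

start_index = 1   # first Range field
item_count = 3    # second Range field

# Retrieve item_count items beginning at start_index.
retrieved = queue_items[start_index:start_index + item_count]
print(retrieved)  # ['item1', 'item2', 'item3']
```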
An argument with the name of the test data queue is generated in the Arguments panel of your project. For example, for a queue named workforce, the generated argument is also named workforce.
Important: The name of the test data queue argument must stay the same. If you change the name of the argument, you can no longer access the data. For example, if you rename the argument from workforce to Work_Force_Queue, you won't be able to access the corresponding data.
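The renaming pitfall can be pictured as a key lookup: the test data is bound to the queue's original name, so a renamed argument no longer matches anything. This is only an analogy, not the product's internal mechanism; the names are taken from the example above.

```python
# Data rows are delivered keyed by the queue name ("workforce" here).
delivered = {"workforce": {"givenName": "Ada"}}

print(delivered.get("workforce"))         # original name resolves to the data
print(delivered.get("Work_Force_Queue"))  # renamed argument finds no data (None)
```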
You can also update existing test data that uses a test data queue as its source.
- Open Studio and navigate to your project.
- In the Project panel, right-click a test case with data variations and select Update test data.
- Click Source and select Test Data Queue from the dropdown list.
- Select a test data queue or use the search function to look for it.
- (Optional) Click Update all test cases using the same test data to update all test cases from the current project that use the test data that you are updating.
- (Optional) Filter the test data queue to retrieve only specific items using the built-in Query Builder.
Note: You can also perform this action in Test Explorer by right-clicking a file and choosing Update Test Data.