Working with manual test cases
Once you've established a connection with a Test Manager instance and a Studio project, open the Test Explorer panel. From there, you can transform the manual tests in the associated project into low-code or coded test cases, or create coded test cases using AI features powered by Autopilot™. If needed, you can review these manual tests in Test Manager by double-clicking a test case in the Test Explorer panel before automating it.
You can generate fully automated test cases using generative AI powered by Autopilot™.
Before you begin, visit AI-powered test automation best practices to make sure you generate coded test cases effectively.
- Connect Studio to a Test Manager project.
- Create elements in the Object Repository that mirror the UI elements from the manual tests.
Note: Autopilot uses Object Repository elements to automate UI elements in test steps. In the APIs, these elements are used as IElementDescriptors.
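For orientation, here is a minimal sketch of how such a descriptor typically surfaces in a coded test case. The project, application, screen, and element names (MyProject, BankApp, LoginScreen, LoginButton) are hypothetical placeholders, and the exact service and descriptor APIs depend on your Studio and UIAutomation package versions.

```csharp
using UiPath.CodedWorkflows;

namespace MyProject
{
    public class LoginSmokeTest : CodedWorkflow
    {
        [TestCase]
        public void Execute()
        {
            // Descriptors generated from the Object Repository are passed to the
            // UIAutomation APIs as IElementDescriptors. The screen and element
            // names below are hypothetical placeholders.
            var screen = uiAutomation.Open(ObjectRepository.Descriptors.BankApp.LoginScreen);
            screen.Click(ObjectRepository.Descriptors.BankApp.LoginScreen.LoginButton);
        }
    }
}
```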
You can generate coded test cases either from the Studio IDE or from the Test Explorer panel.
To generate a coded test case from the Studio IDE, create a new coded test case and use Autopilot to generate code for it. Visit Generating code for more information about ways to generate code.
To generate the coded test case from the Test Explorer panel, follow these steps:
- In the Test Explorer panel, select Manual Test Cases to display the manual test cases that you can convert into automations.
- Right-click a manual test case and select Generate Coded Test Case with Autopilot.
The result is a coded test case that uses UiPath APIs to automate the entire manual test. The manual steps are displayed as comments at the beginning of the test case.
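As an illustration only (not actual Autopilot output), a generated test case might take roughly the following shape, with the manual steps echoed as comments followed by API calls that automate them. All names, steps, and descriptor paths below are hypothetical, they assume the corresponding elements already exist in the Object Repository, and the exact method names may differ between package versions.

```csharp
using UiPath.CodedWorkflows;

namespace MyProject
{
    public class LoginManualTest : CodedWorkflow  // hypothetical name taken from the manual test
    {
        [TestCase]
        public void Execute()
        {
            // Manual step 1: Open the application and log in with a valid user.
            // Manual step 2: Check that the dashboard greeting is displayed.

            // Hypothetical automation generated for the steps above.
            var login = uiAutomation.Open(ObjectRepository.Descriptors.BankApp.LoginScreen);
            login.TypeInto(ObjectRepository.Descriptors.BankApp.LoginScreen.Username, "test_user");
            login.Click(ObjectRepository.Descriptors.BankApp.LoginScreen.LoginButton);

            var dashboard = uiAutomation.Open(ObjectRepository.Descriptors.BankApp.DashboardScreen);
            var greeting = dashboard.GetText(ObjectRepository.Descriptors.BankApp.DashboardScreen.GreetingLabel);
            testing.VerifyExpression(greeting.Contains("Welcome"));
        }
    }
}
```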
Before you begin, visit AI-powered test automation best practices to make sure you generate low-code test cases effectively.
- Connect Studio to a Test Manager project.
- Create elements in the Object Repository that mirror the UI elements from the manual tests.
Note: Autopilot uses Object Repository elements to automate UI elements in test steps.
You can generate low-code test cases either from the Designer panel or from the Test Explorer panel.
To generate low-code test cases from the Test Explorer panel, follow these steps:
- In the Test Explorer panel, select Manual Test Cases to display the manual test cases that you can convert into automations.
- Right-click a manual test case and select Generate Test Case with Autopilot.
The result is a low-code test case that uses activities to automate the entire manual test.
To generate low-code test cases from the Designer panel, follow these steps:
- Open your blank low-code test case and select Generate with Autopilot in the Designer panel.
A Type annotation text field appears, where you can enter your instructions.
- Enter the necessary test steps, then select Generate to trigger the generation of your test case.
If you want to return to the instructions later, select Save to save them.
Note: You can use these steps for creating activities in Do blocks or within the bodies of other activities.
Figure 1. Typing the instructions for Autopilot in the Type annotation text field
To generate a single activity, follow these steps:
- Select Add Activity or use the shortcut Ctrl + Shift + T to open the Studio search bar.
- In the Studio search bar, enter your instructions.
- Select Generate with Autopilot at the end of the search bar list.
This command allows you to use natural language to generate automation steps, including test steps, based on your input.
You can create a coded test case from a manual test. Studio generates a coded test case named after the manual test.
Prerequisites:
- Connect to Test Manager:
  - In the Studio header, click Test Manager.
  - Click Test Manager Settings.
  - Enter the base URL of your Test Manager instance and click Connect.
  - Under Default Project, select the project you want to work in. This project should include the manual tests you want to work with.
- Open the Test Explorer panel.
- Right-click a manual test case from the list of test cases in your default project.
- Click Create Coded Test Case.
Result: A new coded test case is created automatically, with the same name as the manual test. This coded test case displays the manual steps as comments.
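For illustration, a skeleton created this way might look roughly like the following, with the manual steps carried over as comments and no automation code yet. The class name and step text are hypothetical.

```csharp
using UiPath.CodedWorkflows;

namespace MyProject
{
    // Hypothetical example: created from a manual test named "Verify login".
    public class Verify_login : CodedWorkflow
    {
        [TestCase]
        public void Execute()
        {
            // Manual step 1: Open the application login page.
            // Manual step 2: Enter valid credentials and submit.
            // Manual step 3: Check that the dashboard is displayed.

            // Add automation here, for example with Autopilot or the UiPath APIs.
        }
    }
}
```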
You can generate a low-code test case from a manual test with the manual steps displayed as Comment activities.
- Connect Studio to a Test Manager project.
- In the Test Explorer panel, select Manual Test Cases to show the manual test cases that you can convert into low-code test cases.
- Right-click a manual test case and select Create Test Case.
The result is a low-code test case that places the manual steps as Comment activities within the XAML file.