Activity Coverage
You can use activity coverage to check how much of your workflow is covered by your test cases and to look for redundancies. The test execution results are shown in Test Explorer, in the Activity Coverage panel, and the coverage calculations are performed by the Robot.
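The exact calculation is not spelled out here, but conceptually the coverage rate can be read as the share of workflow activities that the test run actually traversed:

$$ \text{Activity coverage (\%)} = \frac{\text{activities traversed during execution}}{\text{total activities in the workflow}} \times 100 $$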
You can analyze the workflow's activity coverage, debug newly created test cases, and view covered and uncovered activities. To support debugging, you can see the traversed path by going to Test Explorer, opening the Activity Coverage panel, and double-clicking the test case: the executed path is highlighted in green and any skipped activities in red. You can also quickly see whether a test case has passed or failed and view its result.
Consider the partial coverage rate of the following workflow that automates a loan management process.
In this high-volume, short-term loan scenario, the activity coverage result shows that 53% of the workflow activities were covered during execution. Because the flowchart branches through a Flow Switch activity, you can create another test case that follows the execution of a different scenario, such as low-volume loans. Depending on your automation needs, create a separate test case for each scenario so that all branches are covered during execution, as sketched below.
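As a rough illustration only (not the Robot's actual computation, and with made-up activity names and counts), the combined coverage of several test cases can be thought of as the union of the activities each one traverses:

```python
# Rough illustration: combined coverage modeled as the union of the
# activities each test case traverses. Activity names and counts are made up.
all_activities = {f"Activity{i}" for i in range(1, 16)}    # 15 activities in the workflow
high_volume_path = {f"Activity{i}" for i in range(1, 9)}   # branch hit by the first test case
low_volume_path = {f"Activity{i}" for i in (1, 2)} | {f"Activity{i}" for i in range(9, 16)}

def coverage_rate(covered: set[str]) -> float:
    """Percentage of workflow activities traversed by the given test runs."""
    return 100 * len(covered & all_activities) / len(all_activities)

print(f"High-volume test case only: {coverage_rate(high_volume_path):.0f}%")                   # ~53%
print(f"Both test cases combined:   {coverage_rate(high_volume_path | low_volume_path):.0f}%") # 100%
```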
The following full coverage scenario tests all activities in a workflow that automates applying for a loan in the UiBank app. The workflow imports test data from an XLSX file as part of data-driven testing using a Test Data File.
The activity coverage results show a 100% coverage rate, meaning that the data set used in the test case, together with the added activities, exercised all possible scenarios.
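For orientation only, a data-driven run can be pictured as executing the same test logic once per data row. The Python sketch below mimics that idea with a hypothetical LoanApplicants.xlsx file and an Amount column (neither comes from the article, and this is not how Studio binds a Test Data File to a test case):

```python
# Conceptual sketch of data-driven testing: run the same test once per data row.
# "LoanApplicants.xlsx" and the "Amount" column are hypothetical stand-ins.
import pandas as pd  # assumes pandas and an Excel engine such as openpyxl are installed

def apply_for_loan(applicant: dict) -> bool:
    """Placeholder for the automated UiBank loan-application steps."""
    return applicant["Amount"] > 0  # stand-in assertion

rows = pd.read_excel("LoanApplicants.xlsx").to_dict(orient="records")
for i, row in enumerate(rows, start=1):
    outcome = "Passed" if apply_for_loan(row) else "Failed"
    print(f"Iteration {i}: {outcome}")
```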