Studio User Guide
Data-Driven Testing
Application testing can require working with large data sets to verify workflow execution and cover corner cases. Instead of creating a separate test case for each data set, you can import your data sets and use them in your project. The test data is imported into your project as a JSON file, which you can find under Project > Test Data.
The maximum size of the imported file is controlled by the Package.VariationFile.MaxSizeInKBytes setting.
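For orientation, the generated file conceptually stores one record per worksheet row. The sketch below is hypothetical: the file name, field names, and layout are assumptions for illustration, not the exact schema that Studio generates.

```
// Hypothetical sketch of an imported test data file; not the exact Studio schema
{
  "SourceFile": "LoginData.xlsx",
  "Data": [
    { "Username": "alice", "Password": "p@ss1", "Expected": "Success" },
    { "Username": "bob",   "Password": "p@ss2", "Expected": "Failure" }
  ]
}
```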
To add test data to a test case:

- Open your workflow in Studio.
- In the Project panel, right-click a test case and select Add Test Data. To add the same test data to several test cases, select them all first.
- Browse for your XLSX or CSV file and then select the Worksheet. If you have an updated JSON file, you can select that from the dropdown list.
- Select individual or all of the values from your data file. Each selected column is added to the test case as an InArgument together with its associated values (see the sketch after this list).
- You can also add test data at the moment you create a test case.
- To compare workflow values with the values imported from the Excel file, add a Verify Expression with Operator activity.
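As a sketch of the column-to-argument mapping (the worksheet and column names below are the same hypothetical ones used in the JSON example above):

```
Worksheet "LoginData"                 Resulting test case arguments
---------------------                 -----------------------------
Username | Password | Expected        Username  (In, String)
alice    | p@ss1    | Success         Password  (In, String)
bob      | p@ss2    | Failure         Expected  (In, String)
```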
You can run or debug your test case with some or all of your test data.

- In the Design ribbon, click Run file with data variations or Debug file with data variations.
- Select the data sets that you want to use and confirm your action.
If you use a Verify Expression with Operator activity for your test case, the execution is performed once for each selected data set.
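Continuing the hypothetical LoginData example, such an activity might be configured as follows. The property labels are paraphrased from the activity's properties panel, and ActualResult is an assumed workflow variable, so treat this as a sketch rather than an exact configuration:

```
' Sketch of a Verify Expression with Operator configuration (labels paraphrased)
First expression  : ActualResult    ' value produced by the workflow under test
Operator          : Equality
Second expression : Expected        ' argument imported from the Excel column
```

With both data sets from the example selected, the run would execute twice, once per data set, and report a verification result for each.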
The following table lists the actions that you can take for test cases that contain data variations.

Action | Description
---|---
Update test data | Update the imported test data, choosing whether to create a new file or overwrite the existing data. Use this after you have changed the Excel file. Any new columns are added as arguments to the test case.
Remove test data | Remove the test data from the test case.
Modify test data JSON file | Update the test data directly in the JSON file. The file is created when you add test data to the test case.
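For example, to add one more variation without re-importing the Excel file, you could append another record to the hypothetical JSON file sketched earlier:

```
// Hypothetical edit: a third data set appended by hand to the "Data" array
{ "Username": "carol", "Password": "p@ss3", "Expected": "Success" }
```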