Release Date: 4 May 2021
A version mismatch in a library dependency shared by the Testing and WebApi packages caused HTTP Request errors. This issue has been addressed by removing the library from the Testing package.
Release Date: 26 April 2021
The following Application Testing Workflow Analyzer Rules are available so you can maintain the same level of automation for projects that have multiple stakeholders:
Test Case Name Not Unique Within Project (TA-NMG-001) - Check whether a test case name is unique within the project, regardless of its folder or subfolder.
Test Case Naming Convention Broken (TA-NMG-002) - Check for test cases that match a specific RegEx.
Missing Verifications Within the Test Case (TA-DBP-001) - Check for a defined minimum and maximum number of verifications for your test cases.
Untested Workflows (TA-DBP-002) - Check for workflows that do not have at least one assigned test case.
Test Manager Connected (TA-DBP-003) - Check for RPA test cases that are not linked to a test case in Test Manager.
Unused Mocking (TA-DBP-004) - Check if the mock workflows have at least one mock activity.
Test Case Without Annotations (TA-DBP-005) - Check for RPA test cases without a single annotation.
Test Case/Workflow Too Complex (TA-DBP-006) - Check for test case and workflow files that exceed a specified number of steps (activities).
Test Case Includes Too Many Branches (TA-DBP-007) - Check whether a test case uses any If statements.
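At their core, rules like TA-NMG-002 and TA-DBP-006 match names against a configurable regular expression and compare a step count against a threshold. A minimal sketch of that kind of check, assuming a hypothetical naming pattern and threshold (not the analyzer's actual implementation):

```python
import re

# Hypothetical naming convention: "TC_" followed by digits and a description.
NAMING_PATTERN = re.compile(r"^TC_\d+_\w+$")
MAX_STEPS = 30  # hypothetical complexity threshold, in the spirit of TA-DBP-006

def check_naming(name: str) -> bool:
    """Return True if the test case name matches the convention (TA-NMG-002 style)."""
    return NAMING_PATTERN.match(name) is not None

def check_complexity(step_count: int) -> bool:
    """Return True if the test case stays within the step limit (TA-DBP-006 style)."""
    return step_count <= MAX_STEPS

print(check_naming("TC_001_Login"))   # follows the convention
print(check_naming("LoginTest"))      # violates the convention
print(check_complexity(12))           # within the step limit
```

In the Workflow Analyzer itself, the regular expression and thresholds are configured per rule in project settings rather than in code.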
- Now you can attach documents to your test execution results for a verifiable outcome, as part of your audit trail. For more information, see Attach Document.
- You can configure the format of your output message.
- You can set an alternative name for your verification activities to display in Orchestrator.
- You can create reports or notify stakeholders if a verification activity fails.
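Configuring the output message amounts to filling a template with the verification's name and result. A sketch of the idea, assuming hypothetical placeholder names (these are illustrative, not the activity's actual tokens):

```python
# Hypothetical output message template; placeholder names are illustrative only.
OUTPUT_FORMAT = "Verification {Name}: expected {Expected}, got {Actual} -> {Result}"

def format_output(name, expected, actual):
    """Render the configured message for a single verification."""
    result = "Passed" if expected == actual else "Failed"
    return OUTPUT_FORMAT.format(
        Name=name, Expected=expected, Actual=actual, Result=result
    )

print(format_output("CheckTotal", 100, 100))
print(format_output("CheckStatus", "Approved", "Rejected"))
```

The rendered message (and the alternative display name, when set) is what appears alongside the test result in Orchestrator.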
Release Date: 5 March 2021
An issue affected assertion posting to Orchestrator (v2020.4): if the Robot did not have permission to upload the assertion, the workflow continued to run without returning an error.
Release Date: 13 October 2020
You can now create synthetic test data to support your testing efforts. For example, this comes in handy when production data cannot be used for testing purposes due to data regulations such as GDPR. Creating artificial data provides high coverage and is easy to reproduce, as opposed to copying or anonymizing existing production data.
To learn how to create synthetic test data, see the following new activities:
- Add Test Data Queue Item
- Given Name
- Get Test Data Queue Item
- Last Name
- Random Date
- Random Number
- Random String
- Random Value
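The Random Date, Random Number, and Random String activities generate values of the kind sketched below. This is an illustrative stand-in, not the activities' implementation; seeding the generator shows why synthetic data is easy to reproduce:

```python
import random
import string
from datetime import date, timedelta

random.seed(42)  # a fixed seed makes the generated data set reproducible

def random_string(length=8):
    """Generate a random alphabetic string, in the spirit of the Random String activity."""
    return "".join(random.choices(string.ascii_letters, k=length))

def random_number(low=0, high=100):
    """Generate a random integer in [low, high], like the Random Number activity."""
    return random.randint(low, high)

def random_date(start=date(2020, 1, 1), end=date(2020, 12, 31)):
    """Generate a random date between start and end, like the Random Date activity."""
    span = (end - start).days
    return start + timedelta(days=random.randint(0, span))

# A synthetic record that could stand in for production data under GDPR constraints.
record = {"name": random_string(), "age": random_number(18, 65), "joined": random_date()}
print(record)
```

In the activities pack, such values are typically enqueued with Add Test Data Queue Item and consumed during test runs with Get Test Data Queue Item.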
Release Date: 6 May 2020
The first iteration of the Testing activities pack offers three activities designed to enable verification of logical expressions and control attributes in your testing workflows. The new activities are: