Using Data Labeling with Human-in-the-Loop
Data Labeling enables you to upload raw data, annotate text data in the labeling tool (for classification or entity recognition), and use the labeled data to train ML models. In addition, you can use Data Labeling for human validation of model outputs.
A common scenario is training an extractor or classifier model: when a model prediction falls below a set confidence threshold, that data can be sent to Action Center for human validation. The validated data can then be used to retrain the model, improving confidence in subsequent predictions.
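The confidence-threshold routing described above can be sketched as follows. This is an illustrative outline only, not a UiPath API: the threshold value, the `confidence` field name, and the return labels are assumptions for the sake of the example.

```python
# Illustrative sketch of confidence-threshold routing for model predictions.
# Predictions at or above the threshold proceed automatically; the rest are
# flagged for human validation (e.g. an Action Center task).
CONFIDENCE_THRESHOLD = 0.85  # assumed value; tune for your model


def route_prediction(prediction: dict) -> str:
    """Return 'auto' when the model is confident enough,
    otherwise 'human_validation' to flag the item for review."""
    if prediction["confidence"] >= CONFIDENCE_THRESHOLD:
        return "auto"
    return "human_validation"


print(route_prediction({"label": "invoice", "confidence": 0.95}))  # auto
print(route_prediction({"label": "invoice", "confidence": 0.60}))  # human_validation
```

Items routed to `human_validation` would then follow the workflow steps below.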
- Use the Wait for External Task and Resume activity to create a task in Action Center from Studio.
- Use the Create Labeling Task and Create External Task activities to convert the model output into a format compatible with Data Labeling. Data Labeling supports files in JSON format; the JSON file should contain a data object that in turn contains the structure configured in the previous step.
- Send the task to a human to review. Once the human has reviewed and completed the task, the Task Object is updated with the output of the human review.
- Convert the Task Object into a format that the models can use as training data.
- Send the validated data to an AI Center dataset as training data using the Upload File activity.
- Start a pipeline run using the uploaded dataset.
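A minimal sketch of the JSON file format mentioned above: a top-level data object wrapping the fields configured in your labeling template. The field names inside data (here, text) are illustrative assumptions; they depend on the structure you configured for your labeling session.

```json
{
  "data": {
    "text": "Invoice 1234 from Acme Corp, total amount 500.00 USD"
  }
}
```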