Process Mining

Create a job

Important:

The input data must meet the format required for the app template you are using to create your process app. See App Templates.

Follow these steps to create the extraction job.

  1. Create a new CData Sync job.
  2. Enter a descriptive name for the job in the Name field. For example, ServiceNow_to_AzureBlob or ServiceNow_to_SQLServer.
  3. Select the source connection created in Create a source connection from the Source list.
  4. Select the destination connection created in Create a destination connection from the Destination list.
  5. Make sure the option Standard is selected as the Replication Type and select Add Job.
  6. Go to the Task tab and select Add Tasks. Enable the Custom Query option.

  7. Create a Custom Query using the replication queries for your app template and select Add Tasks.
    Tip:

    You can copy the replication queries from the documentation for your app template, which can be accessed from App Templates. An illustrative example is shown after this list.

  8. Edit the job Settings on the Overview tab. Select the Parallel Processing checkbox and enter 8 in the Worker Pool field to improve loading speed.
    • Locate the Destination Schema entry and enter the schema retrieved in Create a destination connection.
  9. Configure the Incremental Replication settings so that the Replicate Interval and Replicate Interval Unit cover a period equal to or larger than the extraction period. For example, if you want to extract data for a period of one year, the Replicate Interval and Replicate Interval Unit must cover at least one year.
  10. Go to the Advanced tab in the Settings panel and edit the Replicate Options. Enable Drop Table to prevent data from being appended to the table.
  11. Save your changes.
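
As a reference for the custom query in step 7, replication queries follow the CData Sync custom query format. The sketch below is illustrative only and assumes a ServiceNow incident table; the table and column names are placeholders, so always copy the exact queries from the documentation for your app template.

    REPLICATE [incident] SELECT [sys_id], [number], [state], [opened_at], [closed_at] FROM [incident];
    REPLICATE [task] SELECT [sys_id], [number], [state], [opened_at], [closed_at] FROM [task];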

Call the Data Ingestion API

Follow these steps to edit the post-job event to call the data ingestion API.

  1. Go to the Events tab in the Job Settings panel.
  2. Edit the Post-Job Event section and add the ingestion API call from the code block below, making sure to replace the value of "http.url" with the End of upload API (the URL pattern is outlined after this list). See Retrieving the SQL Server database parameters.

    <api:set attr="http.url" value="https://my-uipath-server.com/default/defaulttenant/processMining_/api/v4.0/apps/98dfd1d5-9e42-4f0e-9a0a-8945629f01b3/transform/unauthenticated"/>
    <api:call op="httpPost" in="http"/>
  3. Save your changes.
  4. Select Jobs and locate the destination job.
  5. Run the job to verify that it runs correctly.
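
For reference, the End of upload API value used in the Post-Job Event is a URL; its general structure, as inferred from the example above, is shown below. The actual host, organization, tenant, and app ID come from your own environment, so treat the segment names as assumptions and take the real value from Retrieving the SQL Server database parameters.

    https://<server>/<organization>/<tenant>/processMining_/api/v4.0/apps/<app-id>/transform/unauthenticated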

Logging information

You can add extra statements to obtain logging information on the API call. Follow these steps.

  1. Add the following statements between the lines previously added in the Post-Job Event. The complete event is shown after these steps.

    <api:set attr="http.verbosity" value="5"/>

    <api:set attr="http.logfile" value="D:/mydir/cdata_log.txt"/>

  2. Edit the job Settings on the Overview tab and select Verbose from the Logfile Verbosity list box.

  3. Run the job and check the log file that is created.
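
Taken together, and using the example values shown earlier, a Post-Job Event with logging enabled would look like the sketch below. The URL and log file path are the placeholder values from this page; replace them with your own End of upload API and a path that exists on the CData Sync machine.

    <api:set attr="http.url" value="https://my-uipath-server.com/default/defaulttenant/processMining_/api/v4.0/apps/98dfd1d5-9e42-4f0e-9a0a-8945629f01b3/transform/unauthenticated"/>
    <api:set attr="http.verbosity" value="5"/>
    <api:set attr="http.logfile" value="D:/mydir/cdata_log.txt"/>
    <api:call op="httpPost" in="http"/>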
