Building an FTP Tracking Import Workflow
This lesson walks through every step of building a complete SFTP tracking import workflow — from scaffolding to scheduling. You will connect to a carrier's SFTP server, download CSV tracking files, parse each row, import the events into CXTMS, and move the processed files to an archive directory.
What We Are Building
A scheduled workflow that runs every 15 minutes and:
- Connects to the carrier's SFTP server
- Lists all `.csv` files in `/incoming/tracking/`
- For each file: downloads content, parses CSV rows, imports tracking events
- Moves each processed file to `/processed/`
- Disconnects from the SFTP server
Step 1 — Scaffold with the ftp-tracking Template
Never write SFTP workflow YAML from scratch. The CLI generates correct UUIDs, file structure, and all required sections:
```bash
npx cxtms create workflow carrier-tracking-import --template ftp-tracking
```
This creates workflows/carrier-tracking-import.yaml. Read the generated file to understand the initial structure, then customize each section as described in the steps below.
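The scaffolded file already contains the top-level sections you will edit in the remaining steps. A rough sketch of that shape is shown below; the placeholder comments are added here for orientation and the exact contents depend on the template version:

```yaml
workflow:
  name: "Carrier Tracking Import"
  executionMode: Async
  runAs: "system"

variables: []      # SFTP connection config reference (Step 2)

schedules: []      # cron trigger (Step 8)

activities:
  - name: Transfer
    steps: []      # Connect, ListFiles, foreach, Disconnect (Steps 2-7)
```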
Step 2 — Configure the SFTP Connection
In the `variables` section of the generated workflow, update the `sftpConfig` variable to point to your carrier's app configuration:
```yaml
variables:
  - name: sftpConfig
    fromConfig: "integrations.carrierName.sftp"
```
The app configuration named integrations.carrierName.sftp should contain:
```json
{
  "host": "sftp.carrier.com",
  "port": 22,
  "username": "tms_user",
  "password": "..."
}
```
In the Connect step, reference these values:
yaml1- task: "FileTransfer/Connect@1"2 name: Connect3 inputs:4 host: "{{ sftpConfig?.host? }}"5 port: "{{ sftpConfig?.port? }}"6 username: "{{ sftpConfig?.username? }}"7 password: "{{ sftpConfig?.password? }}"8 protocol: "sftp"9 outputs:10 - name: connection11 mapping: "connection"
Step 3 — List and Download CSV Files
Configure ListFiles to target the carrier's inbox directory. Use a file pattern to avoid picking up files that are still being written (many carriers write a .tmp file then rename to .csv when complete):
yaml1- task: "FileTransfer/ListFiles@1"2 name: ListFiles3 inputs:4 connection: "{{ Transfer.Connect.connection }}"5 path: "/incoming/tracking"6 outputs:7 - name: files8 mapping: "files"
Start the foreach loop over the returned file list:
```yaml
- task: foreach
  name: ProcessFiles
  collection: "Transfer.ListFiles.files"
  item: "file"
  continueOnError: true
  steps:
    - task: "FileTransfer/DownloadFile@1"
      name: Download
      inputs:
        connection: "{{ Transfer.Connect.connection }}"
        path: "{{ file.path }}"
      outputs:
        - name: content
          mapping: "content"
```
Setting continueOnError: true on the foreach ensures that a malformed file or a parse error on one record does not halt processing of the remaining files.
Step 4 — Parse the CSV with CsvParse
After downloading, pass the file to `Utilities/CsvParse@1`. The task reads from a URL, so it needs a path it can fetch: if the download saves the file to disk, reference that file's path; if you only have the raw content in memory, pass it through `Utilities/Import@1` instead. For SFTP downloads that return content directly as a string, map through a `file://` temp path written by the Download step.
A common pattern, passing the downloaded file's path as the URL:
yaml1 - task: "Utilities/CsvParse@1"2 name: ParseCsv3 inputs:4 url: "{{ Transfer.ProcessFiles.Download.filePath }}"5 hasHeader: true6 outputs:7 - name: rows8 mapping: "records?"9 - name: hasRows10 mapping: "hasRecords?"
If your carrier's CSV has non-standard column names, use `columns` to provide explicit names that match what `Order/ImportTrackingEvents` expects:
```yaml
      inputs:
        url: "{{ Transfer.ProcessFiles.Download.filePath }}"
        columns: ["trackingNumber", "eventDate", "statusCode", "location", "description"]
```
Step 5 — Loop Over Rows and Import Tracking Events
With the parsed rows available, iterate and import events using Order/ImportTrackingEvents:
```yaml
    - task: foreach
      name: ImportRows
      collection: "Transfer.ProcessFiles.ParseCsv.rows"
      item: "row"
      continueOnError: true
      steps:
        - task: "Order/ImportTrackingEvents@1"
          name: ImportEvents
          inputs:
            organizationId: "{{ organizationId }}"
            trackingNumber: "{{ row.trackingNumber }}"
            events:
              - eventDefinitionName: "{{ row.statusCode }}"
                eventDate: "{{ row.eventDate }}"
                location: "{{ row.location }}"
                notes: "{{ row.description }}"
            matchByFields: ["eventDefinitionName", "eventDate"]
            skipIfExists: true
            createEventDefinitions: true
```
The skipIfExists: true flag ensures that re-processing a file (for example after a workflow retry) does not create duplicate tracking events. createEventDefinitions: true automatically creates event definition records for new status codes encountered in the file.
Step 6 — Move Processed Files
After a file is fully processed, move it from the inbox to the archive directory:
yaml1 - task: "FileTransfer/MoveFile@1"2 name: Archive3 inputs:4 connection: "{{ Transfer.Connect.connection }}"5 sourcePath: "{{ file.path }}"6 destinationPath: "/processed/{{ file.name }}"
By placing the Archive step after the import steps inside the foreach loop, each file is moved only after all its rows have been successfully processed. If the import fails, the MoveFile step is skipped and the file stays in the inbox for the next run to retry.
Step 7 — Disconnect
Always close the connection at the end of the activity:
yaml1 - task: "FileTransfer/Disconnect@1"2 name: Disconnect3 inputs:4 connection: "{{ Transfer.Connect.connection }}"
Place Disconnect outside and after the foreach loop so it runs once after all files are processed, not once per file.
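Condensed, with names and inputs omitted, the activity's step list nests like this: Archive is the last step inside the loop and Disconnect is the last step of the activity.

```yaml
activities:
  - name: Transfer
    steps:
      - task: "FileTransfer/Connect@1"          # once, before the loop
      - task: "FileTransfer/ListFiles@1"
      - task: foreach                            # ProcessFiles: one iteration per file
        steps:
          - task: "FileTransfer/DownloadFile@1"
          - task: "Utilities/CsvParse@1"
          - task: foreach                        # ImportRows: one iteration per CSV row
            steps:
              - task: "Order/ImportTrackingEvents@1"
          - task: "FileTransfer/MoveFile@1"      # archive only after the file's rows are handled
      - task: "FileTransfer/Disconnect@1"        # once, after all files are processed
```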
Step 8 — Add a Scheduled Cron Trigger
The workflow runs on a schedule. In the schedules section, configure the polling interval. For tracking imports a 15-minute interval is typical:
```yaml
schedules:
  - cron: "*/15 * * * *"
    displayName: "Poll carrier SFTP every 15 minutes"
```
Common cron expressions for tracking imports:
| Interval | Cron Expression |
|---|---|
| Every 15 minutes | */15 * * * * |
| Every hour | 0 * * * * |
| Every 2 hours during business hours | 0 8-18/2 * * 1-5 |
| Daily at 6 AM | 0 6 * * * |
Set runAs: "system" at the workflow level so the scheduled trigger has the permissions needed to import data without requiring a logged-in user.
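At the top of the workflow file, this setting sits alongside the other workflow-level keys, as in the structure overview below:

```yaml
workflow:
  name: "Carrier Tracking Import"
  executionMode: Async
  runAs: "system"   # scheduled runs execute with system permissions; no logged-in user required
```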
Final Structure Overview
```
workflow:
  name: "Carrier Tracking Import"
  executionMode: Async
  runAs: "system"

variables:
  - sftpConfig (fromConfig)

schedules:
  - cron: "*/15 * * * *"

activities:
  - name: Transfer
    steps:
      - FileTransfer/Connect → connection handle
      - FileTransfer/ListFiles → files[]
      - foreach (ProcessFiles)
          - FileTransfer/DownloadFile → file content
          - Utilities/CsvParse → rows[]
          - foreach (ImportRows)
              - Order/ImportTrackingEvents
          - FileTransfer/MoveFile → archive
      - FileTransfer/Disconnect
```
Validate and Deploy
After customizing:
```bash
# Validate YAML structure
npx cxtms workflows/carrier-tracking-import.yaml

# Deploy to server
npx cxtms workflow deploy workflows/carrier-tracking-import.yaml --org <orgId>

# Test manually before enabling the schedule
npx cxtms workflow execute workflows/carrier-tracking-import.yaml --org <orgId>
```
Check execution logs to verify files were picked up, rows were parsed, and events were imported:
```bash
npx cxtms workflow logs workflows/carrier-tracking-import.yaml --org <orgId>
npx cxtms workflow log <executionId> --org <orgId> --console
```
Summary
- Scaffold with `--template ftp-tracking` and customize; never start from scratch.
- Store SFTP credentials in app configuration and reference with `fromConfig`.
- Set `continueOnError: true` on both `foreach` loops (files and rows) for resilience.
- Use `skipIfExists: true` on `Order/ImportTrackingEvents` to safely reprocess files without creating duplicates.
- Place `FileTransfer/Disconnect` outside the file loop so it runs once.
- Schedule with a cron expression and set `runAs: "system"` for unattended execution.