How to Automate File Transfer Pipelines — Xferity Guide

This guide explains how to set up automated file transfer pipelines using Xferity.

Before starting:

  • A host running Linux, Docker, or Windows
  • Access to at least one transfer endpoint (SFTP server, FTPS server, S3 bucket, or AS2 partner)
  • Credentials or key material for that endpoint
  • The xferity binary — see Deployment

Create a partner file for the endpoint you are connecting to.

```yaml
id: my-sftp-partner
type: sftp
host: sftp.example.com
port: 22
auth:
  method: password
  username: myuser
  password: env:SFTP_PASSWORD
host_key_fingerprint: "SHA256:abc123..."
```
  • `password` uses a secret reference (`env:SFTP_PASSWORD`), not a plaintext value
  • `host_key_fingerprint` is required so the SSH host key can be verified before any data is sent; with OpenSSH you can print a server's fingerprint using `ssh-keyscan sftp.example.com | ssh-keygen -lf -`
An S3 partner uses the same secret-reference style for credentials:

```yaml
id: my-s3-bucket
type: s3
bucket: my-transfer-bucket
region: us-east-1
credentials:
  access_key_id: env:AWS_ACCESS_KEY_ID
  secret_access_key: env:AWS_SECRET_ACCESS_KEY
```
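The `env:` prefix in both partner files tells Xferity to read the value from an environment variable at runtime instead of storing it in the config. As a rough sketch of that resolution logic (the `resolve_secret` helper is hypothetical, not part of Xferity):

```python
import os

def resolve_secret(value: str) -> str:
    """Resolve a config value: 'env:NAME' reads the NAME environment
    variable; anything else is returned as a literal."""
    if value.startswith("env:"):
        name = value[len("env:"):]
        secret = os.environ.get(name)
        if secret is None:
            raise KeyError(f"environment variable {name} is not set")
        return secret
    return value

os.environ["SFTP_PASSWORD"] = "s3cret"      # for demonstration only
print(resolve_secret("env:SFTP_PASSWORD"))  # -> s3cret
print(resolve_secret("22"))                 # literals pass through unchanged
```

The practical consequence: export the referenced variables in the shell (or inject them via your secret manager) before starting Xferity, or validation of the partner will fail.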

Create a flow file that references the partner and defines the transfer behavior.

```yaml
name: outgoing-upload
partner: my-sftp-partner
direction: upload
source:
  path: /data/outgoing
  pattern: "*.csv"
destination:
  path: /inbound/
idempotency_mode: hash
retry:
  max_attempts: 5
  base_delay_seconds: 10
  cap_seconds: 300
cleanup:
  delete_after_transfer: true
```
A download flow that decrypts and verifies PGP files on arrival:

```yaml
name: incoming-download
partner: my-sftp-partner
direction: download
source:
  path: /outbound/
  pattern: "*.pgp"
destination:
  path: /data/incoming/
pgp:
  provider: auto
  decrypt: true
  verify: true
idempotency_mode: hash
```
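`idempotency_mode: hash` means a file that was already transferred is skipped when it reappears with identical content, regardless of its name. A minimal sketch of the idea, assuming content is keyed by SHA-256 (the in-memory `seen` set stands in for Xferity's persistent state store):

```python
import hashlib

seen: set[str] = set()  # stand-in for a persistent state store

def should_transfer(content: bytes) -> bool:
    """Return True the first time a payload is seen; False for
    byte-identical duplicates, regardless of filename."""
    digest = hashlib.sha256(content).hexdigest()
    if digest in seen:
        return False
    seen.add(digest)
    return True

print(should_transfer(b"row1,row2\n"))  # True: first time
print(should_transfer(b"row1,row2\n"))  # False: duplicate content
print(should_transfer(b"row3\n"))       # True: new content
```

Keying on the content hash rather than the filename is what protects against a partner re-uploading the same file under a new name.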

```sh
xferity validate
xferity diag incoming-download
```

`validate` checks all YAML config files for unknown fields and missing required values. `diag` performs a preflight check on a flow: endpoint reachability, key availability, and TLS verification.


Run the flow once manually:

```sh
xferity run incoming-download
```

Review the output:

```sh
xferity flow history incoming-download
xferity trace <some-file.pgp>
```

To run the flow on a schedule, add a cron expression to the flow file:

```yaml
name: incoming-download
schedule_cron: "0 */5 * * * *" # every 5 minutes (six-field cron with seconds)
```
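Note the extra leading field: six-field cron puts seconds first, so `0 */5 * * * *` fires at second 0 of every fifth minute. A quick way to sanity-check the field order (the field names are the standard six-field cron positions, not an Xferity API):

```python
# Standard six-field cron positions, seconds first.
FIELDS = ["second", "minute", "hour", "day-of-month", "month", "day-of-week"]

def label_cron(expr: str) -> dict[str, str]:
    """Pair each field of a six-field cron expression with its meaning."""
    parts = expr.split()
    if len(parts) != len(FIELDS):
        raise ValueError(f"expected {len(FIELDS)} fields, got {len(parts)}")
    return dict(zip(FIELDS, parts))

print(label_cron("0 */5 * * * *"))
# {'second': '0', 'minute': '*/5', 'hour': '*', ...}
```

A classic five-field expression pasted here would be off by one position (its minute field would be read as seconds), which this check catches.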

Run the service:

```sh
xferity run-service incoming-download
```

To poll at a fixed interval instead of a cron schedule:

```sh
xferity run-service incoming-download --interval-seconds 300
```

Add to your flow:

```yaml
retry:
  max_attempts: 5
  base_delay_seconds: 10
  cap_seconds: 300
  jitter: true
dead_letter:
  path: /data/dead-letter/
```

Files that exhaust all retries move to the dead-letter path instead of being lost.
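The delays this config produces follow capped exponential backoff, and `jitter: true` randomizes each delay so parallel flows don't retry in lockstep. A sketch of that schedule, assuming the common `base * 2^attempt` growth capped at `cap_seconds` with full jitter (the exact formula Xferity uses is not specified here):

```python
import random

def retry_delay(attempt: int, base: float = 10.0, cap: float = 300.0,
                jitter: bool = True) -> float:
    """Delay before retry `attempt` (0-based): exponential growth from
    `base`, capped at `cap`, optionally randomized over [0, delay]."""
    delay = min(cap, base * (2 ** attempt))
    return random.uniform(0, delay) if jitter else delay

# Deterministic delays for max_attempts: 5 without jitter:
print([retry_delay(a, jitter=False) for a in range(5)])
# [10.0, 20.0, 40.0, 80.0, 160.0]
```

With the values above, a file is retried for roughly five minutes of cumulative backoff before landing in the dead-letter path.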


The `/metrics` endpoint exposes:

  • flow run counts and durations
  • retry counts
  • dead-letter accumulation
  • certificate expiry state
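A `/metrics` endpoint like this conventionally serves the Prometheus text exposition format, so any Prometheus-compatible scraper can consume it; assuming that format here, a minimal parser looks like this (the sample metric names are illustrative, not Xferity's actual names):

```python
def parse_metrics(text: str) -> dict[str, float]:
    """Parse Prometheus text-format lines into {series: value},
    skipping comment lines (# HELP / # TYPE) and blanks."""
    out: dict[str, float] = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue
        name, _, value = line.rpartition(" ")
        out[name] = float(value)
    return out

# Illustrative sample; real metric names will differ.
sample = """\
# TYPE flow_runs_total counter
flow_runs_total{flow="incoming-download"} 42
retry_count_total{flow="incoming-download"} 3
"""
print(parse_metrics(sample))
```

In practice you would point Prometheus (or any compatible agent) at the endpoint rather than parsing it by hand; the sketch only shows what the wire format contains.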
You can also inspect flows from the command line:

```sh
xferity flow status
xferity flow history incoming-download
xferity logs incoming-download
```

Configure Slack or Email alerts for flow failures:

```yaml
notifications:
  on_failure:
    - type: slack
      webhook_url: env:SLACK_WEBHOOK
```

```sh
docker compose up xferity
```

The included Docker Compose file mounts config, flows, state, logs, keys, and storage.

For the full feature set (durable jobs, UI, certificate inventory), use Postgres-backed mode:

See Deployment: Postgres and Workers.


| Capability | Script | Xferity flow |
| --- | --- | --- |
| SSH host key verification | Manual or disabled | Required by default |
| Duplicate file protection | None | SHA-256 idempotency |
| Retry on failure | Manual loops | Exponential backoff, jitter |
| Concurrent execution protection | Race conditions | Distributed flow lock |
| Audit trail | Log files | Structured JSONL with hash chain |
| File lifecycle trace | Not possible | `xferity trace <file>` |
| Recovery after crash | Re-run from scratch | `xferity resume` |
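The "structured JSONL with hash chain" row means each audit record carries a hash of the previous record, so editing any line breaks every link after it. A sketch of verifying such a chain (the record layout and the `prev` field name are assumptions for illustration, not Xferity's actual log schema):

```python
import hashlib
import json

def verify_chain(lines: list[str]) -> bool:
    """Check that each JSONL record's `prev` field equals the SHA-256
    of the previous raw line; the first record links to all zeros."""
    prev = "0" * 64
    for raw in lines:
        record = json.loads(raw)
        if record["prev"] != prev:
            return False
        prev = hashlib.sha256(raw.encode()).hexdigest()
    return True

# Build a tiny valid chain, then tamper with its first record.
lines = []
prev = "0" * 64
for event in ["transfer_start", "transfer_done"]:
    raw = json.dumps({"event": event, "prev": prev}, sort_keys=True)
    lines.append(raw)
    prev = hashlib.sha256(raw.encode()).hexdigest()

print(verify_chain(lines))                            # True
tampered = [lines[0].replace("start", "halt"), lines[1]]
print(verify_chain(tampered))                         # False
```

Because each link is over the raw bytes of the prior line, an attacker who alters one record would have to rewrite every subsequent record to keep the chain consistent.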