# CLI Reference

Install, configure, and use the `af` command-line tool.
## Installation

```shell
curl -fsSL https://cli.automatedfuture.co/cli/install.sh | sh
```

This installs the `af` binary to `~/.local/bin`. The script auto-detects your platform (Linux and macOS, x86_64 and aarch64). To update, re-run the same command.
Verify it works:

```shell
af --version
```

## Configuration
The CLI stores its configuration in `~/.config/af/config.json`. You can set values with the CLI or through environment variables.
### Environment Variables

| Variable | Description |
|---|---|
| `AF_API_KEY` | API key (loaded via `af config from-env`) |
| `AF_PROJECT_ID` | Default project ID (loaded via `af config from-env`, also used directly by commands) |
The CLI also loads a `.env` file from the current directory if one exists.
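For example, a minimal `.env` file in the project root (values here are placeholders) might look like:

```shell
# .env — picked up automatically from the current working directory
AF_API_KEY=your-key
AF_PROJECT_ID=proj_abc123
```
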
### Quick Setup
```shell
# From environment variables (recommended for CI)
export AF_API_KEY=your-key
export AF_PROJECT_ID=proj_abc123
af config from-env

# Or set values directly
af config set api-key YOUR_API_KEY
af config set project-id proj_abc123
```

## Global Options
```
-v, --verbose        Enable verbose debug output
-c, --config <PATH>  Configuration file path (env: AF_CONFIG)
-h, --help           Print help
```

## af config
Manage configuration settings.
Alias: cfg
### af config set

```shell
af config set <KEY> <VALUE>
```

Supported keys: `api_key` / `api-key`, `api_base` / `api-base`, `project_id` / `project-id`, `run_id` / `run-id`, `test_result_id` / `test-result-id`, `test_started_at` / `test-started-at`.
### af config get

```shell
af config get <KEY>
```

Retrieve a single configuration value. Sensitive values are masked.
### af config list

```shell
af config list
```

Display all configuration values along with the config file path.
### af config path

```shell
af config path
```

Show the path to the configuration file.
### af config from-env

```shell
af config from-env
```

Load `AF_API_KEY` and `AF_PROJECT_ID` from the environment and save them to the config file.
### af config reset

```shell
af config reset --yes
```

Reset all configuration to defaults. Requires the `--yes` / `-y` flag.
## af project
Manage projects.
Alias: p
### af project list

```shell
af project list
```

List all projects in your account.
## af test-run
Manage test runs. A test run is a container for one or more test results (e.g., a CI pipeline run).
Alias: tr
### af test-run start

```shell
af test-run start [OPTIONS]
```

| Option | Description |
|---|---|
| `--project-id <ID>` | Project ID (env: `AF_PROJECT_ID`) |
| `-l, --label <LABEL>` | Label for the test run (e.g., "CI #42") |

Starts a new test run and stores the run ID in config.
```shell
af test-run start --label "CI Pipeline #42"
```

### af test-run end

```shell
af test-run end [OPTIONS]
```

| Option | Description |
|---|---|
| `--run-id <ID>` | Test run ID (env: `AF_RUN_ID`; uses the config default if omitted) |

Marks the test run as completed.
## af test
Manage individual test results within a test run.
Alias: t
### af test start

```shell
af test start [OPTIONS]
```

| Option | Description |
|---|---|
| `-n, --name <NAME>` | Test name (required) |
| `--project-id <ID>` | Project ID (env: `AF_PROJECT_ID`) |
| `--run-id <ID>` | Test run ID (env: `AF_RUN_ID`) |

Creates a new test result with status "Running" and begins tracking execution time.
```shell
af test start --name "test_user_login"
```

### af test update

```shell
af test update [OPTIONS]
```

| Option | Description |
|---|---|
| `-n, --name <NAME>` | Rename the test |
| `-s, --status <STATUS>` | Test status: `passed`, `failed`, `skipped` |
| `-d, --duration <MS>` | Override duration in milliseconds |
| `-e, --error <MSG>` | Error message (for failed tests) |
| `--project-id <ID>` | Project ID (env: `AF_PROJECT_ID`) |
| `--run-id <ID>` | Test run ID (env: `AF_RUN_ID`) |

Duration is calculated automatically from when `af test start` was called unless overridden with `--duration`.

```shell
af test update --status passed
af test update --status failed --error "Assertion failed: expected 200, got 500"
```

### af test end

```shell
af test end
```

Sends the final duration update and clears the tracked test state.
### af test upload

```shell
af test upload [OPTIONS]
```

| Option | Description |
|---|---|
| `-f, --file <PATH>` | File to upload (required) |
| `-l, --label <LABEL>` | Artifact label (defaults to the filename) |
| `-k, --kind <KIND>` | Artifact kind: `log`, `image`, `video`, `other` (auto-detected) |
| `--content-type <TYPE>` | MIME type (auto-detected from the extension) |
| `--test-step <NAME>` | Test step name |
| `--test-result-id <ID>` | Test result ID (uses the current tracked test if omitted) |

```shell
af test upload --file screenshot.png
af test upload --file output.log --label "Server logs" --kind log
```

## af junit
Parse a JUnit XML file and upload test results to an existing test run.
```shell
af junit <FILE> [OPTIONS]
```

| Option | Description |
|---|---|
| `--project-id <ID>` | Project ID (env: `AF_PROJECT_ID`) |
| `--run-id <ID>` | Test run ID (env: `AF_RUN_ID`; uses the config default if omitted) |

The parser supports both `<testsuites>` and bare `<testsuite>` root elements. Test names are normalized from the suite name, classname, and test name attributes joined with `::`.
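As an illustration, the name normalization described above can be sketched in shell (the exact behavior for missing attributes is an assumption; the attribute values below are examples):

```shell
# Sketch of the assumed normalization: suite name, classname, and test
# name joined with "::"
suite="auth-suite"
classname="tests.test_login"
testname="test_user_login"
normalized="${suite}::${classname}::${testname}"
echo "$normalized"   # auth-suite::tests.test_login::test_user_login
```
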
```shell
# Start a run, upload JUnit results, then end the run
af test-run start --label "CI #99"
af junit path/to/junit.xml
af test-run end
```

Stdout and stderr captured in `<system-out>` / `<system-err>` elements are uploaded as log artifacts.
## af run
Execute a command, auto-detect JUnit XML output, create a test run, upload results, and exit with the command's exit code.
```shell
af run [OPTIONS] -- <COMMAND> [ARGS...]
```

| Option | Description |
|---|---|
| `-l, --label <LABEL>` | Label for the test run (defaults to the command string) |
| `--project-id <ID>` | Project ID (env: `AF_PROJECT_ID`) |
| `--junit <PATH>` | Explicit JUnit XML file path (repeatable; auto-detects if omitted) |

When no `--junit` paths are given, `af run` searches the working directory for common JUnit XML locations:

- `**/TEST-*.xml`
- `**/junit.xml`
- `**/test-results/**/*.xml`
- `**/surefire-reports/*.xml`
- `build/test-results/**/*.xml`
- `**/junit-reports/*.xml`
- `**/test-report*.xml`
Each candidate file is validated before parsing to avoid picking up unrelated XML.
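A rough shell approximation of that validation (the real check presumably parses the XML rather than pattern-matching; `is_junit_xml` is a hypothetical helper written for this sketch):

```shell
# Accept a candidate only if a <testsuites> or <testsuite> element appears —
# a crude stand-in for actual XML parsing
is_junit_xml() {
  grep -qE '<(testsuites|testsuite)[ >/]' "$1"
}

printf '<testsuites><testsuite name="s"/></testsuites>\n' > results.xml
printf '<project name="build"/>\n' > pom-like.xml

is_junit_xml results.xml && echo "results.xml looks like JUnit"
is_junit_xml pom-like.xml || echo "pom-like.xml skipped"
```
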
```shell
# Run pytest with JUnit output and upload automatically
af run -- pytest --junitxml=results.xml

# Run with an explicit label
af run --label "Nightly tests" -- ./gradlew test

# Point to specific JUnit files
af run --junit build/results.xml --junit build/results2.xml -- make test
```

The command's exit code is always propagated, regardless of whether the upload succeeds.
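The propagation guarantee can be pictured with a small shell sketch (an illustration of the behavior, not the actual implementation; `upload_results` is a hypothetical stand-in for the upload step):

```shell
# Run the command, remember its status, attempt the upload, and always
# exit with the command's own status
run_and_report() {
  "$@"
  status=$?
  upload_results || true   # an upload failure must not mask the test result
  return "$status"
}

upload_results() { false; }  # simulate a failed upload

run_and_report true   # still succeeds: the command's status wins
```
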
## af version
Display version information.
Alias: v
```shell
af version
```

## Typical Workflow
### Manual test reporting
```shell
# 1. Start a test run
af test-run start --label "Nightly tests"

# 2. Report individual tests
af test start --name "test_login"
# ... run your test ...
af test update --status passed
af test end

af test start --name "test_checkout"
# ... run your test ...
af test update --status failed --error "Timeout waiting for payment"
af test upload --file screenshot.png
af test end

# 3. End the test run
af test-run end
```

### JUnit XML upload
```shell
# Upload JUnit XML to an existing run
af test-run start --label "CI Pipeline"
af junit test-results/junit.xml
af test-run end
```

### One-command CI integration
```shell
# Run tests and upload results in one step
af run -- pytest --junitxml=results.xml

# With a custom label
af run --label "PR Check" -- npm test
```