Command-line interface
The eolas command ships with the Python package as an optional install
extra. Same install on Linux, macOS, and Windows.
The CLI is a thin layer over the Python client — same auth, same retry logic, same error mapping. Use it from a terminal, a cron job, a shell script, or an AI agent.
Quick examples
# Browse
eolas datasets list --source "Stats NZ"
eolas datasets list --search cpi --json | jq '.[].name'
# Fetch
eolas get nz_cpi --start 2020-01-01 --format csv > cpi.csv
eolas get nz_cpi --format json | jq '.[].value'
# Schedule (cron on POSIX, Task Scheduler on Windows)
eolas schedule add nz_cpi --daily --out ~/data/cpi.csv
eolas schedule list
# Generate connector configs (Enterprise plan)
eolas integrate meltano --datasets nz_cpi,nz_gdp --output ./my-pipeline/
Authentication
eolas resolves your API key in this order:
1. --api-key VALUE flag on the command
2. EOLAS_API_KEY environment variable
3. VS_API_KEY environment variable (legacy, still honoured)
4. ~/.eolas/config.json (written by eolas auth set-key, mode 0600)
Useful interactively and in CI: set the env var in scripts, use the config file on personal machines so cron jobs Just Work.
eolas auth set-key # prompt for the key, write config
eolas auth status # show which source is in use
eolas auth clear # delete the config file
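The precedence chain is plain to reproduce in a wrapper script. A minimal sketch, assuming a flat config layout with an api_key field — resolve_key is a hypothetical helper for illustration, not part of the CLI:

```shell
# Hypothetical sketch of the documented key-resolution order:
# --api-key flag > EOLAS_API_KEY > VS_API_KEY > ~/.eolas/config.json
resolve_key() {
  flag_key="$1"                      # value passed via --api-key, may be empty
  if [ -n "$flag_key" ]; then
    echo "$flag_key"
  elif [ -n "$EOLAS_API_KEY" ]; then
    echo "$EOLAS_API_KEY"
  elif [ -n "$VS_API_KEY" ]; then    # legacy variable, still honoured
    echo "$VS_API_KEY"
  else
    # assumed config shape: {"api_key": "..."}
    sed -n 's/.*"api_key"[ ]*:[ ]*"\([^"]*\)".*/\1/p' ~/.eolas/config.json 2>/dev/null
  fi
}
```

The first non-empty source wins, which is why a CI secret in EOLAS_API_KEY silently overrides a stale key left in the config file.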
Output: human or machine
The CLI auto-detects whether stdout is a terminal:
| Context | Default output |
|---|---|
| Interactive terminal | Rich coloured tables |
| Piped (\| jq, > file.csv, etc.) | NDJSON or CSV — whatever's most pipeable |
Force machine output explicitly with --json on any command that supports it.
# Same command, different output
eolas datasets list # → coloured table
eolas datasets list --json # → newline-delimited JSON
eolas datasets list | head # → newline-delimited JSON (auto-detected)
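The detection is ordinary POSIX machinery: stdout either is a terminal or it isn't, and [ -t 1 ] tests exactly that. A sketch of the same behaviour — emit is a stand-in function, not a real command:

```shell
# Stand-in for any record-producing command:
# table when interactive, NDJSON when piped
emit() {
  if [ -t 1 ]; then
    echo "pretty coloured table"    # fd 1 is a terminal
  else
    echo '{"name":"nz_cpi"}'       # fd 1 is a pipe or file: machine output
  fi
}

emit          # table if you run this in a terminal
emit | cat    # always NDJSON: the pipe means stdout is not a tty
```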
Commands
eolas get <name>
Fetch a dataset and write it to stdout or a file.
eolas get nz_cpi # CSV to stdout
eolas get nz_cpi --format json # JSON to stdout
eolas get nz_cpi --start 2020-01-01 --end 2024-12-31 # filtered
eolas get sa2_2023 --format parquet --out sa2.parquet # Parquet (must specify --out)
eolas get nz_cpi --limit 100 # cap rows
Formats: csv (default), json, parquet.
Parquet requires --out FILE (binary output isn't safe to stream to stdout).
eolas datasets ...
eolas datasets list # everything
eolas datasets list --source "Stats NZ" # filter by source
eolas datasets list --search cpi # substring on name/title
eolas datasets info nz_cpi # full metadata
eolas datasets preview nz_cpi --limit 5 # first N rows
eolas auth ...
eolas auth set-key # interactive prompt; writes ~/.eolas/config.json
eolas auth status # masked key + source
eolas auth clear # remove config file
eolas schedule ...
Recurring fetches via the OS-native scheduler. cron on Linux/macOS, Task Scheduler on Windows — same commands, same behaviour.
# Daily at 06:00 local time (the default)
eolas schedule add nz_cpi --daily --out ~/data/cpi.csv
# Other shortcuts
eolas schedule add nz_cpi --hourly --out ~/data/cpi.csv
eolas schedule add nz_cpi --weekly --out ~/data/cpi.csv
eolas schedule add nz_cpi --monthly --out ~/data/cpi.csv
# Custom cron expression (POSIX only)
eolas schedule add nzd_usd --cron "0 */6 * * *" --out ~/data/fx.csv
# Preview without installing
eolas schedule add nz_cpi --daily --out ~/data/cpi.csv --dry-run
eolas schedule list
eolas schedule remove nz_cpi
--out FILE is required — cron jobs run with no terminal, so stdout has
to go somewhere on disk.
On POSIX, managed entries are tagged with # eolas-schedule: <name>
sentinels, so eolas schedule list/remove only ever touch lines that
belong to eolas. Your other cron jobs aren't affected. On Windows the
equivalent is the eolas-<name> task name prefix.
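The sentinel reduces removal to a pure line filter. A sketch of the idea on POSIX — the crontab text below is made up for illustration; the real tool would read crontab -l:

```shell
# Two entries: one managed by eolas, one not
crontab_dump='0 6 * * * eolas get nz_cpi --format csv > /home/me/data/cpi.csv # eolas-schedule: nz_cpi
30 2 * * * /usr/local/bin/backup.sh'

# "schedule remove nz_cpi" boils down to dropping only the tagged line;
# every untagged line passes through untouched
printf '%s\n' "$crontab_dump" | grep -v '# eolas-schedule: nz_cpi'
# → 30 2 * * * /usr/local/bin/backup.sh
```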
eolas integrate <platform> (Enterprise plan)
Generate ready-to-deploy connector configs for popular data-pipeline tools.
eolas integrate meltano --datasets nz_cpi,nz_gdp --output ./tap-eolas/
eolas integrate fivetran --datasets nz_cpi
eolas integrate azure-data-factory --datasets nz_cpi,nz_gdp
| Platform | What you get |
|---|---|
| Meltano | meltano.yml (uses tap-rest-api-msdk) + README + .env.example — meltano install && meltano run tap-eolas target-jsonl and you're loading |
| Fivetran | Connector Builder YAML for paste-into-dashboard import + setup README |
| Azure Data Factory | Linked-service + per-dataset REST datasets + copy pipeline JSON — usable via az datafactory CLI or ADF Studio paste |
Output directory defaults to ./eolas-<platform>/. Existing files are
preserved unless you pass --force.
This is an Enterprise-plan feature. Non-Enterprise keys see a clear upgrade pointer with the pricing URL. The gating lives server-side so the capability is bypass-proof.
Other commands
eolas health # ping the API; useful smoke check in CI
eolas version # print installed eolas-data version
Exit codes
For shell scripts and AI agents that branch on outcome:
| Code | Meaning |
|---|---|
| 0 | Success |
| 1 | Generic error |
| 2 | Authentication (invalid key, Enterprise plan required, etc.) |
| 3 | Rate limit hit |
| 4 | Dataset / resource not found |
| 5 | Other API error (5xx, etc.) |
| 64 | Bad command-line usage (mirrors sysexits.h) |
eolas health || exit 1                   # smoke check first
eolas get nz_cpi --format csv > cpi.csv
case $? in
  2) echo "auth problem — run eolas auth set-key" ;;
  3) echo "rate limited — wait and retry" ;;
  4) echo "dataset not found" ;;
esac
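Longer scripts (or agents) can wrap the table once in a helper function — explain_exit is a hypothetical name; the codes are the documented ones:

```shell
# Map documented eolas exit codes to human-readable outcomes
explain_exit() {
  case "$1" in
    0)  echo "success" ;;
    2)  echo "auth problem" ;;
    3)  echo "rate limited" ;;
    4)  echo "not found" ;;
    5)  echo "api error" ;;
    64) echo "bad usage" ;;
    *)  echo "generic error" ;;   # 1 and anything undocumented
  esac
}

explain_exit 3    # → rate limited
```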
Tips
- Piping: every command that produces records auto-switches to NDJSON when stdout is piped. Combine with jq, csvkit, mlr, or anything else.
- CI usage: set EOLAS_API_KEY as a CI secret; the env var takes precedence over any on-disk config.
- Agent usage: --help is structured enough that an LLM can discover the commands without reading external docs. Stable exit codes mean agents can branch on outcome programmatically.
- Custom output dir for integrations: pass --output DIR. If DIR doesn't exist it's created.
Source
The CLI lives in eolas-data —
eolas_data/cli.py and eolas_data/schedule.py. PRs welcome.