The Inference CLI (inf) drives Catalyst from the terminal. It supports two main workflows:
  1. Instrument your codebase so every LLM call your app makes is captured in Observe — inf instrument hands the job to an AI coding agent (Claude Code, OpenCode, or Codex) and walks you through the diff.
  2. Operate the platform programmatically — browse models, manage rubrics and eval runs, upload and materialize datasets, queue and monitor training runs, and inspect captured inferences without opening the dashboard.
Sign up for an account at Inference.net to get started.
The CLI is currently in beta. Please report any issues you find.

Quick Start

1. Install: `npm install -g @inference/cli`
2. Sign in: `inf auth login`
3. Check status: `inf auth status`
Run inf --help at any time to see every command. Having trouble? Send us a message or tag us at x.com/@inference_net.

Global Options

These flags work on every command.
| Flag | Description |
| --- | --- |
| `--json` | Output as JSON (preserves full UUIDs for scripting) |
| `-v, --verbose` | Verbose debug output |
| `-p, --project <id>` | Override the active project for this invocation |
| `-V, --version` | Show CLI version |
| `-h, --help` | Show help |
Tables show UUIDs as readable 8-character prefixes. For scripting, always use `--json` — it preserves full UUIDs so you can round-trip values between commands (e.g. `inf dataset list --json | jq -r '.[0].id' | xargs inf dataset get`).
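As a sketch of that round-trip, the snippet below runs the same `jq` filter against sample JSON shaped like `inf dataset list --json` output. The field names and UUID values here are illustrative assumptions, not real API output — check your own `--json` output for the actual schema:

```shell
# Illustrative only: fake output shaped like `inf dataset list --json`.
# Field names (`id`, `name`) and UUIDs are assumptions for this sketch.
cat > /tmp/inf_datasets_sample.json <<'EOF'
[
  {"id": "3f9d2c1a-7b4e-4e2a-9c51-0a6b8d4e2f10", "name": "support-chats"},
  {"id": "a1b2c3d4-0000-4111-8222-333344445555", "name": "eval-holdout"}
]
EOF

# Pull the first dataset's FULL UUID (not the 8-char table prefix),
# suitable for passing to a follow-up command such as `inf dataset get`.
DATASET_ID=$(jq -r '.[0].id' /tmp/inf_datasets_sample.json)
echo "$DATASET_ID"
```

The key point: table output truncates IDs for readability, so anything you feed back into another command should come from `--json`, where IDs survive intact.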

Commands

| Command | Description |
| --- | --- |
| `inf instrument` | Instrument your codebase for Catalyst observability using an AI agent |
| `inf auth` | Sign in, sign out, and check authentication status |
| `inf project` | List, switch between, and inspect projects |
| `inf models` | Browse callable models with capabilities and pricing |
| `inf eval` | Manage rubrics, launch eval runs, inspect results |
| `inf dataset` | Upload JSONL data, create eval/training datasets, download |
| `inf training` | Queue training runs, monitor progress, view logs, and poll status |
| `inf inference` | View inference requests and responses captured by Observe |
| `inf dashboard` | Launch the interactive terminal dashboard |

Explore the CLI

Instrument your codebase

Hand your project to an AI coding agent that wires up Catalyst for you.

Authentication

Browser and headless authentication, env vars, and config.

Projects

Switch between projects and inspect the active project.

Models

Browse callable models, capabilities, and pricing.

Evals

Manage rubrics, launch eval runs, and inspect results.

Datasets

Upload JSONL files and materialize eval or training datasets.

Training

Queue training runs, monitor progress, view logs, and poll for completion.

Inferences

Inspect request and response payloads captured by Observe.

TUI Dashboard

Interactive terminal UI for training runs, evals, datasets, and inferences.

Usage telemetry

The CLI reports anonymized usage events so we can prioritize the commands our customers actually rely on.
  • What we capture: the command name, CLI version, OS and CPU architecture, the JavaScript runtime, and the flag names (never flag values) you used. When you run the CLI inside a git repository we also capture the repo owner/name and current branch from your origin remote — this is most useful for inf instrument, where we want to understand which codebases Catalyst gets wired into.
  • What we never capture: argument values, environment variables, file contents, API keys, or any data you pass to a command.
  • When we send nothing: events are sent only after you authenticate. Commands run before `inf auth login` / `inf auth set-key` emit no events.
  • How it works: events are sent fire-and-forget over tRPC with a short timeout, so telemetry never slows down or blocks your command. Failures are silent.