The Inference.net Catalyst CLI (inf) gives you command-line access to your observability data, training runs, evaluations, and datasets on Inference.net.

Quick Start

1. Install:
   npm install -g @inference/cli

2. Sign in:
   inf auth login

3. Instrument your codebase:
   inf install

What Is Catalyst?

Catalyst is an observability proxy that sits between your application and your LLM provider. Your app sends requests to Inference.net instead of directly to OpenAI or Anthropic. The proxy forwards each request to the upstream provider, records telemetry, and returns the response unchanged.
Your App  →  Inference.net Proxy  →  LLM Provider (OpenAI, Anthropic, etc.)

            Catalyst Dashboard
        (cost · latency · tokens · traces)
There’s no proprietary SDK — you keep using the official OpenAI or Anthropic SDKs. All features work as before (streaming, function calling, tool use, vision, structured outputs) with full observability added. The inf install command instruments your codebase automatically using an AI coding agent. Once instrumented, your Catalyst dashboard shows cost breakdowns, latency percentiles, token usage, error rates, and full request/response inspection for every LLM call.
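Because only the base URL changes, a raw request to the proxy looks exactly like a request to the upstream provider. The sketch below assumes an OpenAI-compatible chat completions endpoint; the URL is a placeholder, not a real endpoint — use the one shown in your Catalyst dashboard.

```shell
# Placeholder base URL: substitute the endpoint from your Catalyst dashboard.
BASE_URL="https://your-catalyst-endpoint.example"

# The request and response bodies are the provider's own format, unchanged.
curl "$BASE_URL/v1/chat/completions" \
  -H "Authorization: Bearer $INF_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"model": "gpt-4o-mini", "messages": [{"role": "user", "content": "Hello"}]}'
```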

Authentication

Browser login (recommended for local development):
inf auth login
API key (for CI / headless environments):
inf auth set-key sk-observability-your-key-here
Or via environment variable:
export INF_API_KEY=sk-observability-your-key-here
API keys must begin with sk-observability-. Generate one from your Inference.net dashboard.
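In CI, it can help to fail fast before any CLI call if the key is missing or malformed. This is a minimal sketch using only the documented sk-observability- prefix; the function name is illustrative, not part of the CLI.

```shell
# Validate the documented key format before running any inf command.
check_inf_key() {
  case "$1" in
    sk-observability-*) echo "ok" ;;
    "")                 echo "missing"; return 1 ;;
    *)                  echo "bad-prefix"; return 1 ;;
  esac
}

check_inf_key "sk-live-abc123"            # → bad-prefix
check_inf_key "sk-observability-abc123"   # → ok
```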
Check status:
inf auth status
Sign out:
inf auth logout

Auth Priority

When multiple credentials are present, the CLI resolves them in this order:
  1. INF_API_KEY environment variable
  2. API key stored via inf auth set-key
  3. Session token stored via inf auth login
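The order above means an exported INF_API_KEY always wins, even when stored credentials exist. A minimal sketch of that first-match-wins logic follows — the file paths are assumptions for illustration, not the CLI's actual storage layout.

```shell
# Sketch of the documented resolution order. The ~/.inf/* paths below are
# hypothetical stand-ins for wherever the CLI stores credentials.
resolve_credential() {
  if [ -n "$INF_API_KEY" ]; then
    echo "env:INF_API_KEY"
  elif [ -f "$HOME/.inf/api-key" ]; then    # hypothetical location
    echo "stored:api-key"
  elif [ -f "$HOME/.inf/session" ]; then    # hypothetical location
    echo "stored:session"
  else
    echo "none"
  fi
}

INF_API_KEY="sk-observability-demo"
resolve_credential    # env var wins: prints env:INF_API_KEY
```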

Configuration

The CLI stores configuration at ~/.inf/config.json, created automatically on first login.
Variable          Description                    Default
INF_API_KEY       API key for authentication     —
INF_API_URL       API base URL override          https://observability-api.inference.net
INF_PROJECT_ID    Override the active project    —

Global Options

These flags work on every command.
Flag                  Description
--json                Output as JSON
-v, --verbose         Verbose debug output
-p, --project <id>    Override the active project
--version             Show CLI version
--help                Show help

Commands

Command          Description
inf install      Instrument your codebase with Catalyst observability
inf project      Manage and switch between projects
inf training     Monitor training runs, view logs, and poll status
inf eval         Manage eval runs, definitions, and datasets
inf dataset      List, inspect, and download datasets
inf inference    View inference requests and responses
inf dashboard    Launch the interactive terminal dashboard
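Put together, a typical first session might look like the following — using only the commands documented above, in the order you would normally run them.

```shell
inf auth login    # browser login (or inf auth set-key in CI)
inf project       # confirm or switch the active project
inf install       # instrument the codebase with Catalyst
inf dashboard     # watch cost, latency, and token metrics live
```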