The `inf install` command adds Catalyst observability to your codebase. It uses an AI coding agent to automatically find your LLM client instances and route them through the Inference.net observability proxy.
## What It Does
The agent makes three changes to your existing LLM SDK clients:

- Redirects the base URL to `https://api.inference.net` so requests flow through the observability proxy
- Adds routing headers (`x-inference-provider`, `x-inference-observability-api-key`, `x-inference-environment`) so the proxy knows where to forward and what to record
- Adds task IDs (`x-inference-task-id`) to each call site so you can group requests by logical task in the dashboard
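As a sketch of the result, here is what a rewritten Python OpenAI client constructor might look like after those three changes. The key, environment, model, and task ID values below are illustrative assumptions, not output of the tool:

```python
# Hypothetical sketch of the rewrite the agent applies to a Python OpenAI
# client; the key and environment values are placeholders, not real settings.
proxy_kwargs = {
    # Change 1: redirect the base URL so requests flow through the proxy
    "base_url": "https://api.inference.net",
    # Change 2: routing headers so the proxy knows where to forward
    "default_headers": {
        "x-inference-provider": "openai",
        "x-inference-observability-api-key": "<your-observability-key>",
        "x-inference-environment": "production",  # assumed example value
    },
}

# With the openai package installed, a rewritten call site would look like:
# client = OpenAI(**proxy_kwargs)
# client.chat.completions.create(
#     model="gpt-4o",
#     messages=[...],
#     # Change 3: a per-call-site task ID for grouping in the dashboard
#     extra_headers={"x-inference-task-id": "summarize-ticket"},
# )
```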
## Options
| Flag | Description |
|---|---|
| `--dry-run` | Preview changes without modifying any files |
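For example, to see what the agent would change before letting it edit anything:

```shell
inf install --dry-run
```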
## Supported Agents
The CLI detects and launches one of these AI coding agents to apply the changes:

| Agent | Binary |
|---|---|
| Claude Code | `claude` |
| OpenCode | `opencode` |
| Codex | `codex` |
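The detection mechanism is not documented here; a plausible minimal sketch, assuming the CLI simply probes `PATH` for each known binary in order, would be:

```python
import shutil

# Hypothetical sketch of agent detection: probe PATH for each known binary.
# The probe order and fallback behavior are assumptions, not CLI internals.
AGENT_BINARIES = ["claude", "opencode", "codex"]

def detect_agent(which=shutil.which):
    """Return the first agent binary found on PATH, or None."""
    for binary in AGENT_BINARIES:
        if which(binary):
            return binary
    return None

# Example with a stubbed lookup where only `codex` is installed:
print(detect_agent(lambda b: "/usr/local/bin/codex" if b == "codex" else None))
# → codex
```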
## Supported Providers
**Built-in:** OpenAI, Anthropic

**Custom** (via the `x-inference-provider-url` header): Google Gemini, Together AI, Groq, Fireworks AI, Mistral AI, Cerebras, Perplexity, DeepSeek, OpenRouter, Azure OpenAI, and any OpenAI-compatible endpoint.
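For a custom provider, the same header mechanism applies, with `x-inference-provider-url` pointing at the upstream endpoint. A hypothetical header set routing through Groq's OpenAI-compatible endpoint (the URL and all values here are illustrative assumptions):

```python
# Hypothetical headers for a custom, OpenAI-compatible provider.
# The upstream URL and key/environment values are illustrative assumptions.
custom_headers = {
    "x-inference-provider-url": "https://api.groq.com/openai/v1",
    "x-inference-observability-api-key": "<your-observability-key>",
    "x-inference-environment": "development",
}
```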