Route your OpenAI requests through the Inference Catalyst gateway to get cost tracking, latency monitoring, and analytics. You keep your existing OpenAI API key; just point the SDK at the gateway and add a few headers.

## Documentation Index
Fetch the complete documentation index at: https://docs.inference.net/llms.txt
Use this file to discover all available pages before exploring further.
Prefer automatic setup? Run `inf instrument` to instrument your codebase in seconds.

## Setup
### Get your API keys
You need two keys:
- Inference Catalyst project API key — from your dashboard under API Keys
- OpenAI API key — from your OpenAI account
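With both keys in hand, pointing the SDK at the gateway is a matter of changing the base URL and attaching the project key as a header. The sketch below builds the client configuration without network calls; the gateway URL and header name are illustrative assumptions, so substitute the values shown in your Inference Catalyst dashboard.

```python
# Hypothetical gateway base URL and header name; check your dashboard
# for the real values. These are illustrative assumptions.
GATEWAY_BASE_URL = "https://gateway.inference.net/v1"


def gateway_client_config(openai_key: str, catalyst_key: str) -> dict:
    """Build the kwargs you would pass to openai.OpenAI(**config).

    The OpenAI SDK accepts `base_url` and `default_headers`, so routing
    through a gateway only changes where requests are sent and which
    extra headers ride along with each request.
    """
    return {
        "api_key": openai_key,         # your existing OpenAI key, unchanged
        "base_url": GATEWAY_BASE_URL,  # point the SDK at the gateway
        "default_headers": {
            # Assumed header name carrying the Catalyst project key
            "X-Catalyst-Api-Key": catalyst_key,
        },
    }


config = gateway_client_config("sk-openai-...", "cat-project-...")
print(config["base_url"])
```

With the OpenAI SDK installed, you would then create the client as `client = openai.OpenAI(**config)` and call it exactly as before; only the destination and headers differ.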