Route your OpenRouter requests through the Inference Catalyst gateway to access hundreds of models while getting full observability. OpenRouter is OpenAI-compatible, so you use the OpenAI SDK with the `x-inference-provider-url` header.
Prefer automatic setup? Run `inf instrument` to instrument your codebase in seconds.

Setup
Get your API keys
You need two keys:
- Inference Catalyst project API key — from your dashboard under API Keys
- OpenRouter API key — from your OpenRouter account
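Once you have both keys, it helps to fail fast when one is missing rather than hitting an auth error mid-request. A small sketch, assuming both keys are supplied via environment variables (the variable names here are hypothetical — use whatever convention your deployment follows):

```python
import os

def load_keys() -> tuple[str, str]:
    """Read both required keys, raising a clear error if either is absent.

    INFERENCE_API_KEY and OPENROUTER_API_KEY are assumed names for the
    Inference Catalyst project key and the OpenRouter key, respectively.
    """
    try:
        inference_key = os.environ["INFERENCE_API_KEY"]
        openrouter_key = os.environ["OPENROUTER_API_KEY"]
    except KeyError as missing:
        raise RuntimeError(f"Missing required API key: {missing.args[0]}") from None
    return inference_key, openrouter_key
```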