Route your Groq requests through the Inference Catalyst gateway to get cost tracking, latency monitoring, and analytics. Groq is OpenAI-compatible, so you use the OpenAI SDK with the `x-inference-provider-url` header to specify Groq's base URL.
Prefer automatic setup? Run `inf instrument` to instrument your codebase in seconds.

Setup
Get your API keys
You need two keys:
- Inference Catalyst project API key — from your dashboard under API Keys
- Groq API key — from your Groq console
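A common pattern is to keep both keys in environment variables so they stay out of source control. The variable names here are illustrative, not mandated by the gateway:

```shell
# Store both keys in environment variables (names are illustrative).
export INFERENCE_API_KEY="inf_xxx"   # Inference Catalyst project key from the dashboard
export GROQ_API_KEY="gsk_xxx"        # Groq API key from the Groq console
```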