Install a Catalyst tracing SDK, configure export, and capture your first trace.
This quickstart uses OpenAI because it produces the smallest end-to-end trace. The same
setup pattern applies to Anthropic, LangChain, LangGraph, LangSmith, OpenAI
Agents, Claude Agent SDK, Pydantic AI, and manual spans.
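A minimal setup might look like the following. The package name, API-key variable, and endpoint variable are assumptions for illustration; `CATALYST_SERVICE_NAME` is the variable the dashboard filters on. Check your Catalyst onboarding docs for the exact names.

```shell
# Hypothetical package name -- substitute the real Catalyst SDK package.
pip install catalyst-sdk openai

# The service name is how you will find this app in the dashboard.
export CATALYST_SERVICE_NAME=quickstart-demo

# Assumed variable names for export configuration; use your project's values.
export CATALYST_API_KEY=...
export CATALYST_ENDPOINT=...
```

With export configured, running your instrumented OpenAI call should emit a trace under the service name you chose.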
Open the Catalyst dashboard and filter for your CATALYST_SERVICE_NAME. The
trace should include an OpenAI LLM span with input messages, output messages,
model name, invocation parameters, finish reason, and token counts.

If the process is short-lived, always call shutdown() before exit so batched
spans are flushed.
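To see why the shutdown() call matters, here is an illustrative stand-in for a batching span exporter, the pattern tracing SDKs typically use. Every name in this sketch is hypothetical; it is not the Catalyst API, only a model of the batching behavior.

```python
class BatchExporter:
    """Toy exporter that buffers spans and sends them in batches."""

    def __init__(self, batch_size=10):
        self.batch_size = batch_size
        self.buffer = []     # spans waiting to be sent
        self.exported = []   # spans that have actually been sent

    def record(self, span):
        self.buffer.append(span)
        # Spans are only sent once a full batch accumulates.
        if len(self.buffer) >= self.batch_size:
            self._flush()

    def _flush(self):
        self.exported.extend(self.buffer)
        self.buffer.clear()

    def shutdown(self):
        # Send whatever is still buffered before the process exits.
        self._flush()


exporter = BatchExporter(batch_size=10)
for i in range(3):
    exporter.record({"name": f"llm-span-{i}"})

# A short-lived script exits here: the 3 buffered spans are below
# batch_size, so without shutdown() they would never be sent.
assert len(exporter.exported) == 0
exporter.shutdown()
assert len(exporter.exported) == 3
```

The same reasoning applies to any batching exporter: a process that exits before the batch fills silently drops its remaining spans unless shutdown (or an equivalent flush) runs first.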