Offline evaluation
Offline evaluation is the standard flow: run a dataset through models, score the outputs, and compare the results. You control what gets evaluated and when, and this is what is available today. The inputs typically come from captured production traffic, but the outputs are re-generated: you select which models to run against rather than judging the original production outputs.
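A minimal sketch of that flow, with hypothetical stand-ins for the dataset, the models, and the scorer (a real run would call actual model endpoints and likely use a richer metric or judge):

```python
# Captured production inputs, paired with reference answers for scoring.
dataset = [
    {"input": "2+2", "expected": "4"},
    {"input": "3*3", "expected": "9"},
]

# Stand-in "models": each maps an input string to an output string.
models = {
    "model-a": lambda s: str(eval(s)),  # answers correctly here
    "model-b": lambda s: "4",           # always answers "4"
}

def score(output: str, expected: str) -> float:
    """Exact-match scorer; real evals often use judges or task metrics."""
    return 1.0 if output == expected else 0.0

# Re-generate outputs for every selected model, then compare mean scores.
results = {}
for name, model in models.items():
    scores = [score(model(row["input"]), row["expected"]) for row in dataset]
    results[name] = sum(scores) / len(scores)

print(results)  # → {'model-a': 1.0, 'model-b': 0.5}
```

The key property of the offline flow is visible in the loop: every model re-answers the same captured inputs, so the comparison is apples-to-apples even though none of the outputs are the original production responses.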
Online evaluation
Online evaluation is coming soon. It will differ from offline evaluation in three ways:
- Sample-rate controls: evaluate a percentage of traffic to manage cost
- Real outputs: judge what your model actually produced, not a re-run
- Continuous: always running, not triggered manually
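One way a sample-rate control like the one above could work is a deterministic gate, sketched here with hypothetical names (`should_evaluate`, `request_id` are illustrative, not the product's API). Hashing the request id instead of rolling a random number means a given request is either always sampled or never sampled, which keeps retries and replays consistent:

```python
import hashlib

def should_evaluate(request_id: str, sample_rate: float) -> bool:
    """Return True for roughly `sample_rate` of request ids (0.0 to 1.0)."""
    digest = hashlib.sha256(request_id.encode()).digest()
    # Map the first 8 bytes of the hash to a float in [0, 1).
    bucket = int.from_bytes(digest[:8], "big") / 2**64
    return bucket < sample_rate

# At a 10% sample rate, roughly one in ten requests gets evaluated.
sampled = sum(should_evaluate(f"req-{i}", 0.10) for i in range(10_000))
print(f"{sampled} of 10,000 requests sampled")  # roughly 1,000
```

Because the decision depends only on the request id and the rate, raising the rate from 10% to 20% keeps every previously sampled request in the sample and only adds new ones.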