Welcome to the Inference.net documentation. Inference.net makes it easy to access leading open-source AI models with only a few lines of code. Our mission is to build the best AI-native platform for developers building their own AI applications. We currently offer:
Serverless LLM Inference - Use the API to access top open-source language models such as Llama-3.1-8B, paying only for the tokens you use. View the list of available models here.
LoRA Inference (Early access) - Upload LoRA adapters and access them via streaming or batch endpoints.
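To make the serverless option concrete, here is a minimal sketch of calling a chat-style completion endpoint. It assumes an OpenAI-compatible request shape; the base URL, model ID, and API key below are illustrative placeholders, not confirmed values, so check the API reference for the real ones.

```python
import json
from urllib import request

# Hypothetical endpoint URL -- replace with the value from the API reference.
BASE_URL = "https://api.inference.net/v1/chat/completions"

def build_chat_request(model: str, prompt: str, api_key: str) -> request.Request:
    """Build an HTTP POST request for an OpenAI-style chat completion call."""
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        BASE_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

# Illustrative model ID and key; sending the request is left to the caller,
# e.g. request.urlopen(req).
req = build_chat_request("llama-3.1-8b-instruct", "Hello!", "YOUR_API_KEY")
print(req.get_full_url())
```

Building the request separately from sending it keeps the example runnable offline and makes it easy to swap in your preferred HTTP client.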