Does Inference.net support streaming?
Yes, Inference.net supports streaming for all language models. Simply set the stream=true
parameter in your request.
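For example, assuming Inference.net exposes an OpenAI-compatible chat completions endpoint (the base URL and model name below are illustrative placeholders, not confirmed values), a minimal streaming request in Python might look like this:

```python
# Minimal streaming sketch using the OpenAI Python SDK.
# Assumptions: the base URL and model identifier are illustrative;
# substitute the real values from your Inference.net account.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.inference.net/v1",  # assumed OpenAI-compatible endpoint
    api_key="YOUR_INFERENCE_API_KEY",
)

# stream=True asks the server to return the response as incremental chunks
# instead of a single completed message.
stream = client.chat.completions.create(
    model="meta-llama/llama-3.1-8b-instruct",  # hypothetical model name
    messages=[{"role": "user", "content": "Write a haiku about rivers."}],
    stream=True,
)

# Each chunk carries a delta containing the next piece of generated text.
for chunk in stream:
    delta = chunk.choices[0].delta.content
    if delta:
        print(delta, end="", flush=True)
print()
```

Printing each delta as it arrives lets you display partial output immediately rather than waiting for the full completion.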