Call all LLM APIs using the OpenAI format (Anthropic, Hugging Face, Cohere, Azure OpenAI, etc.). LiteLLM also supports streaming the model response back: pass stream=True to get a streaming iterator in the response. Streaming is supported for OpenAI, Azure, Anthropic, and Hugging Face models.
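A minimal sketch of both call styles, assuming litellm is installed and provider API keys are set as environment variables (e.g. OPENAI_API_KEY, ANTHROPIC_API_KEY); the model names here are illustrative:

```python
from litellm import completion

messages = [{"role": "user", "content": "Hello, how are you?"}]

# Same OpenAI-style call for every provider; only the model name changes.
response = completion(model="gpt-3.5-turbo", messages=messages)  # OpenAI
response = completion(model="claude-2", messages=messages)       # Anthropic

# Pass stream=True to get a streaming iterator of chunks instead.
response = completion(model="gpt-3.5-turbo", messages=messages, stream=True)
for chunk in response:
    print(chunk["choices"][0]["delta"])  # OpenAI-style streaming delta
```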
Features
- Translating inputs to the provider's completion and embedding endpoints (see the first sketch after this list)
- Guarantees consistent output: text responses are always available at ['choices'][0]['message']['content']
- Exception mapping: common exceptions across providers are mapped to the OpenAI exception types (see the exception-handling sketch after this list)
- LiteLLM Client: debugging & one-click addition of new LLMs
- LiteLLM supports streaming the model response back (pass stream=True, as shown in the example above)
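A sketch of the first two features above. The model names are illustrative assumptions, and the response objects are assumed to follow the OpenAI layout the docs describe:

```python
from litellm import completion, embedding

# Completion: the text response sits at the same OpenAI-style path,
# regardless of which provider actually served the request.
response = completion(
    model="command-nightly",  # example Cohere model name
    messages=[{"role": "user", "content": "Summarize LiteLLM in one line."}],
)
print(response["choices"][0]["message"]["content"])

# Embedding: inputs are translated to the provider's embedding endpoint.
emb = embedding(model="text-embedding-ada-002", input=["Hello world"])
print(emb["data"][0]["embedding"][:5])  # first few dimensions
```

And a sketch of exception mapping, which lets one error-handling path cover every provider. The litellm.exceptions import path is an assumption; verify it against the docs for your installed version:

```python
from litellm import completion
from litellm.exceptions import AuthenticationError, RateLimitError

try:
    response = completion(
        model="claude-2",
        messages=[{"role": "user", "content": "Hello"}],
    )
except AuthenticationError:
    # Raised as the OpenAI exception type even though the provider is Anthropic.
    print("Bad or missing API key for this provider.")
except RateLimitError:
    print("Provider rate limit hit; back off and retry.")
```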
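Because every provider's errors surface through the same exception classes, retry and fallback logic can be written once rather than per provider.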
Categories: Large Language Models (LLM)
License: MIT License