Call all LLM APIs using the OpenAI format (Anthropic, Huggingface, Cohere, Azure OpenAI, etc.). liteLLM supports streaming the model response back: pass stream=True to get a streaming iterator in the response. Streaming is supported for OpenAI, Azure, Anthropic, and Huggingface models.
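A minimal sketch of that call shape, assuming `litellm` is installed and a provider key (e.g. OPENAI_API_KEY) is set in the environment; the chunk-parsing path mirrors the OpenAI streaming format, and `build_messages`/`stream_chat` are illustrative names, not liteLLM APIs:

```python
def build_messages(prompt: str) -> list[dict]:
    """OpenAI-format chat messages, accepted by every provider."""
    return [{"role": "user", "content": prompt}]

def stream_chat(model: str, prompt: str) -> str:
    """Stream a completion and return the concatenated text."""
    from litellm import completion  # deferred so the sketch imports cleanly

    pieces = []
    # stream=True returns an iterator of OpenAI-format chunks
    # instead of a single response object
    for chunk in completion(model=model,
                            messages=build_messages(prompt),
                            stream=True):
        delta = chunk["choices"][0]["delta"].get("content") or ""
        pieces.append(delta)
        print(delta, end="")
    return "".join(pieces)

# Usage — the same call works across providers by switching the
# model string, e.g.:
#   stream_chat("gpt-3.5-turbo", "Hello, how are you?")
#   stream_chat("claude-2", "Hello, how are you?")
```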

Features

  • Translating inputs to the provider's completion and embedding endpoints
  • Guaranteed consistent output: text responses are always available
  • Exception mapping: common exceptions across providers are mapped to the OpenAI exception types
  • LiteLLM Client: debugging & 1-click add new LLMs
  • Streaming of the model response back
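A sketch of what the exception mapping buys you: because provider failures surface as OpenAI exception types, one handler covers every backend. Assumes `litellm` is installed; the exact exception import path varies with the openai SDK version, so this sketch classifies by class name, and `is_retryable`/`complete_or_report` are hypothetical helpers, not liteLLM APIs:

```python
def is_retryable(exc: Exception) -> bool:
    # Hypothetical policy: retry on transient failures, identified by
    # the mapped OpenAI exception type's class name.
    return type(exc).__name__ in {"RateLimitError", "Timeout",
                                  "APIConnectionError"}

def complete_or_report(model: str, messages: list) -> dict:
    """Call any provider; handle failures via the mapped exception types."""
    from litellm import completion  # deferred so the sketch imports cleanly

    try:
        return completion(model=model, messages=messages)
    except Exception as exc:
        # liteLLM raises provider errors as OpenAI exception types, so
        # the same classifier works for OpenAI, Anthropic, Cohere, etc.
        action = "retrying" if is_retryable(exc) else "giving up"
        print(f"{type(exc).__name__}: {action}")
        raise
```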


License

MIT License



Additional Project Details

Programming Language

Python

Related Categories

Python Large Language Models (LLM), Python MCP Gateways

Registered

2023-08-25