| Name | Modified | Size |
|---|---|---|
| langroid-0.61.0-py3-none-any.whl | 2026-03-25 | 434.4 kB |
| langroid-0.61.0.tar.gz | 2026-03-25 | 384.4 kB |
| 0.61.0 source code.tar.gz | 2026-03-25 | 58.4 MB |
| 0.61.0 source code.zip | 2026-03-25 | 58.9 MB |
| README.md | 2026-03-25 | 1.1 kB |
| Totals: 5 items | | 118.2 MB |
## Add MiniMax LLM provider support
PR #1004 adds comprehensive support for MiniMax language models via their OpenAI-compatible API.
### What's New
- **7 models supported**: M2.7, M2.5, M2.1, M2, and their `-highspeed` variants, with context lengths up to 1M tokens
- **Standard provider pattern**: uses the `minimax/` prefix for model names (e.g. `minimax/M2.7`), consistent with existing providers like DeepSeek and Gemini
- **API key**: via the `MINIMAX_API_KEY` environment variable
- **Example script**: `examples/basic/chat-minimax.py`
- **Documentation**: updated tutorials and supported-models docs
### Usage
:::python
import langroid as lr
import langroid.language_models as lm

# The `minimax/` prefix routes the request to MiniMax's
# OpenAI-compatible API; requires MINIMAX_API_KEY to be set.
llm_config = lm.OpenAIGPTConfig(chat_model="minimax/M2.7")
agent = lr.ChatAgent(lr.ChatAgentConfig(llm=llm_config))
### Available Models
| Model | Context Length |
|---|---|
| M2.7 / M2.7-highspeed | 1M |
| M2.5 / M2.5-highspeed | 204K |
| M2.1 / M2.1-highspeed | 1M |
| M2 | 1M |
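The table above can be encoded as a simple lookup for pre-flight checks. This is a hedged sketch: the values come straight from the table (assuming 1M = 1,000,000 tokens and 204K = 204,000), and the helper name is hypothetical, not a langroid API:

```python
# Context lengths (tokens) per the model table; hypothetical lookup,
# assuming 1M = 1_000_000 and 204K = 204_000 tokens.
MINIMAX_CONTEXT_LENGTHS = {
    "M2.7": 1_000_000,
    "M2.7-highspeed": 1_000_000,
    "M2.5": 204_000,
    "M2.5-highspeed": 204_000,
    "M2.1": 1_000_000,
    "M2.1-highspeed": 1_000_000,
    "M2": 1_000_000,
}

def fits_in_context(model: str, n_tokens: int) -> bool:
    """Return True if a prompt of n_tokens fits the model's context window."""
    return n_tokens <= MINIMAX_CONTEXT_LENGTHS[model]
```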