| Name | Modified | Size | Downloads / Week |
|---|---|---|---|
| README.md | 2026-04-06 | 1.1 kB | |
| v0.8.0 - HuggingFace Embeddings Support source code.tar.gz | 2026-04-06 | 3.4 MB | |
| v0.8.0 - HuggingFace Embeddings Support source code.zip | 2026-04-06 | 3.5 MB | |
| Totals: 3 Items | | 6.9 MB | 1 |
## What's New
### New Features
- HuggingFace Embeddings Support (#77)
  - Use any HuggingFace embedding model locally via sentence-transformers
  - Set `EMBEDDINGS_PROVIDER=huggingface` in your `.env`
  - Separate configuration for LLM and embeddings providers
  - Supports popular models like `all-MiniLM-L6-v2`, `bge-large-en-v1.5`, etc.
  - See the HuggingFace Embeddings documentation
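As a quick illustration of the local workflow above — a minimal sketch assuming `sentence-transformers` is installed; the `cosine_sim` helper is our own example, not part of rag-web-ui:

```python
import math

def cosine_sim(a, b) -> float:
    """Similarity score of the kind used to rank retrieved chunks."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def embed(texts):
    """Embed texts locally with a HuggingFace model (downloads it on first use)."""
    # Imported lazily so cosine_sim works even without the package installed.
    from sentence_transformers import SentenceTransformer
    model = SentenceTransformer("sentence-transformers/all-MiniLM-L6-v2")
    return model.encode(texts)  # one vector per input text
```

For example, `vecs = embed(["query", "candidate passage"])` followed by `cosine_sim(vecs[0], vecs[1])` scores how related the two strings are, entirely on the local machine.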
### Bug Fixes
- Fix maximum update depth exceeded during LLM streaming (#69)
- Added debounce logic to prevent infinite re-render loops in Answer component
- Address MiniMax provider review feedback (#91)
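The debounce idea behind the streaming fix, sketched generically in Python for illustration (the actual fix lives in the React Answer component): calls arriving in rapid succession are collapsed into a single trailing invocation, so a flood of streaming updates cannot trigger an update loop.

```python
import threading

def debounce(wait: float):
    """Run the wrapped function only after `wait` seconds with no new calls."""
    def decorator(fn):
        timer = None
        lock = threading.Lock()
        def wrapper(*args, **kwargs):
            nonlocal timer
            with lock:
                if timer is not None:
                    timer.cancel()  # a newer call supersedes the pending one
                timer = threading.Timer(wait, fn, args, kwargs)
                timer.start()
        return wrapper
    return decorator
```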
### Dependencies
- Update frontend pnpm-lock.yaml
## Configuration Example
```bash
# Use HuggingFace for embeddings (separate from LLM provider)
EMBEDDINGS_PROVIDER=huggingface
HUGGINGFACE_EMBEDDINGS_MODEL=sentence-transformers/all-MiniLM-L6-v2

# Optional: only needed for gated models
HUGGINGFACE_API_KEY=your_token_here
```
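One way the split between LLM and embeddings providers might be wired up — a hypothetical helper, not rag-web-ui's actual code; the variable names follow the release notes, and the `openai` fallback default is our assumption:

```python
import os

def embeddings_config(env=None):
    """Resolve embeddings settings independently of the LLM provider."""
    env = os.environ if env is None else env
    provider = env.get("EMBEDDINGS_PROVIDER", "openai")  # assumed default
    if provider == "huggingface":
        return {
            "provider": "huggingface",
            "model": env.get("HUGGINGFACE_EMBEDDINGS_MODEL",
                             "sentence-transformers/all-MiniLM-L6-v2"),
            "api_key": env.get("HUGGINGFACE_API_KEY"),  # only for gated models
        }
    return {"provider": provider}
```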
**Full Changelog**: https://github.com/rag-web-ui/rag-web-ui/compare/v0.7.3...v0.8.0