Name | Modified | Size
AnythingLLM v1.8.3 Mobile support + RAG improvements source code.tar.gz | 2025-08-14 | 8.2 MB
AnythingLLM v1.8.3 Mobile support + RAG improvements source code.zip | 2025-08-14 | 9.0 MB
README.md | 2025-08-14 | 5.8 kB
Totals: 3 items, 17.1 MB, 15 downloads/week

AnythingLLM v1.8.3 is live

Notable Changes

Mobile support

Currently under Experimental features, you can connect the AnythingLLM Mobile App (Android Beta) to your instance to seamlessly blend the on-device and off-device experience. Leverage your instance's Agent Skills and flows, all within a single unified interface!

Chat with documents has been overhauled

https://github.com/user-attachments/assets/0e4ab18d-fb58-480a-94f5-907664dd8f3f

When we first built AnythingLLM, the average context window was 4K tokens - hardly enough to fit a full document. So we decided to always be RAG-first. This has its drawbacks, since RAG depends on the question being semantically close to content in the document. That leads to poor results for prompts like "Summarize this document," where retrieval finds nothing relevant and the answer amounts to "what are you talking about?"

Well, now we have the best of both worlds. Documents are scoped to a workspace thread and user, and we will use the full document text whenever your model's context window can support it. If the attached documents overflow that window, we will prompt you to embed them so you can unlock that long-term memory.
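To make the overflow behavior concrete, here is a minimal sketch in TypeScript of how such a decision could work. The countTokens estimate, the reservedTokens headroom, and the chooseStrategy function are hypothetical illustrations, not AnythingLLM's actual implementation; the idea is simply to compare the token count of the attached documents against the model's context window and fall back to embedding when it overflows.

```typescript
// Illustrative sketch only - not AnythingLLM's actual code.
interface AttachedDocument {
  name: string;
  text: string;
}

// Rough token estimate (~4 characters per token for English text).
function countTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

type AttachmentStrategy = "full-text" | "embed-and-rag";

// Decide whether attached documents can ride along as raw text in the
// prompt, or whether they overflow the context window and should be
// embedded for retrieval instead.
function chooseStrategy(
  docs: AttachedDocument[],
  contextWindow: number,   // model's maximum tokens
  reservedTokens = 1024    // headroom for system prompt, history, and reply
): AttachmentStrategy {
  const docTokens = docs.reduce((sum, d) => sum + countTokens(d.text), 0);
  return docTokens + reservedTokens <= contextWindow
    ? "full-text"
    : "embed-and-rag";
}

// Example: a 4K-context model cannot hold a ~60K-character report,
// so the user would be prompted to embed it instead.
const strategy = chooseStrategy(
  [{ name: "report.pdf", text: "x".repeat(60_000) }],
  4096
);
console.log(strategy); // "embed-and-rag"
```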

Screenshot: context window warning

You can also easily view and manage your context window, removing files that are no longer relevant while retaining the conversation history.

Screenshot: managing attached documents

You can still embed files directly in the workspace file manager, too :)

What's Changed

New Contributors

Full Changelog: https://github.com/Mintplex-Labs/anything-llm/compare/v1.8.4...v1.8.5

Source: README.md, updated 2025-08-14