| Name | Modified | Size | Downloads / Week |
|---|---|---|---|
| README.md | 2025-08-10 | 1.1 kB | |
| v2.0.1 source code.tar.gz | 2025-08-10 | 88.9 MB | |
| v2.0.1 source code.zip | 2025-08-10 | 88.9 MB | |
| Totals: 3 items | | 177.8 MB | 2 |
Release Notes
Bug Fixes
Multi-turn Conversation Issue (Issue #60)
Fixed a critical bug where the LLM would stop responding after the second turn in multi-turn conversations. This issue was introduced in a previous version and affected all conversation flows.
What was fixed:
- Removed an incorrect `currentTokenCount` reset that was breaking conversation continuity
- Fixed context positioning to properly maintain conversation history
- Enhanced embeddings isolation to prevent context contamination
New Features:
- Added `resetContext()` method for manual context clearing
- Added `clearConversation()` method to start fresh conversations
- Improved context management for better conversation control (see the usage sketch below)
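A minimal usage sketch of the new context-management calls follows. Only `resetContext()` and `clearConversation()` come from these notes; the `respond(to:)` call and the overall shape of the `LLM` type are assumptions based on typical LLM.swift usage and may need adjusting for your model setup.

```swift
import LLM

// Hedged sketch: respond(to:) is assumed from typical LLM.swift usage;
// resetContext() and clearConversation() are the calls added in v2.0.1.
func demoConversation(with bot: LLM) async {
    // Multi-turn: with the v2.0.1 fix, the second turn continues from the
    // context built up by the first instead of stalling.
    await bot.respond(to: "What is a context window?")
    await bot.respond(to: "Give a one-sentence example.")

    // Start a completely fresh conversation (new in v2.0.1).
    bot.clearConversation()
    await bot.respond(to: "New topic: summarize Swift concurrency.")

    // Or clear only the underlying context when a raw reset is enough.
    bot.resetContext()
}
```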
Technical Details:
- The issue was caused by resetting the token count on each turn, making the model "forget" previous conversation context
- Context now properly continues from the current position instead of resetting to zero
- Embeddings operations are now properly isolated from conversation context
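The toy class below is purely illustrative and not LLM.swift's actual internals; it only shows why resetting the token count on every turn loses history, while continuing from the current position preserves it.

```swift
// Illustrative sketch, assuming new tokens are decoded at position
// `currentTokenCount` in the context, so resetting that count overwrites
// everything the conversation has built up so far.
final class ToyContext {
    private(set) var currentTokenCount = 0

    // Buggy pre-2.0.1 behaviour: each turn restarted at position zero,
    // so earlier turns were effectively forgotten.
    func decodeTurnResetting(_ tokens: [Int]) {
        currentTokenCount = 0                 // incorrect reset
        currentTokenCount += tokens.count     // history lost
    }

    // v2.0.1 behaviour: continue from the current position,
    // keeping the conversation history intact.
    func decodeTurnContinuing(_ tokens: [Int]) {
        currentTokenCount += tokens.count     // history preserved
    }

    // An explicit reset remains available for starting fresh.
    func resetContext() {
        currentTokenCount = 0
    }
}
```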
Full Changelog: https://github.com/eastriverlee/LLM.swift/compare/v2.0.0...v2.0.1