Release files (modified 2026-05-08):

  • DeepTutor-v1.3.9 source code.tar.gz - 8.4 MB
  • DeepTutor-v1.3.9 source code.zip - 9.0 MB
  • README.md - 6.4 kB

Totals: 3 items, 17.5 MB

DeepTutor v1.3.9 Release Notes

Release Date: 2026-05-09

v1.3.9 builds on the v1.3.8 multi-user foundation with broader TutorBot deployment options, safer provider routing for thinking models, and a smoother web onboarding path. It adds Zulip and NVIDIA NIM support, improves startup ergonomics, and folds in the main issue fixes reported after the last release.

Highlights

TutorBot Channel and Provider Expansion

  • Zulip is now a TutorBot channel - bots can listen to private messages and stream topics, enforce allow_from, choose mention-only or open stream replies, and bridge Zulip's event queue into the async TutorBot bus.
  • Math and files work better in Zulip - LaTeX is converted to Zulip-friendly KaTeX markup, upload/download calls use configurable retry behavior, and attachment filenames include upload-path digests to avoid collisions.
  • Zulip topics keep conversations separated - stream topics now become part of the chat/session key, with a stable (no topic) fallback for empty topics.
  • TutorBot supports NVIDIA NIM - nvidia_nim is available in TutorBot provider config and registry detection, and streaming requests accommodate NIM by omitting the stream_options parameter it does not support.
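
The event-queue bridging mentioned above can be sketched as follows. This is an illustrative pattern, not DeepTutor's actual code: in a real bot the worker thread would call the blocking zulip Python client (e.g. its call-on-each-event loop), while here a plain list stands in for the event stream, and bridge_blocking_events, run_bridge, and the queue-as-bus shape are all hypothetical names.

```python
import asyncio
from typing import Iterable, List


def bridge_blocking_events(loop: asyncio.AbstractEventLoop,
                           bus: "asyncio.Queue",
                           events: Iterable[dict]) -> None:
    # Runs in a worker thread: each event from the blocking client is
    # handed to the async bus on the running event loop. .result() blocks
    # the worker (not the loop) until the put() coroutine completes.
    for event in events:
        asyncio.run_coroutine_threadsafe(bus.put(event), loop).result()


async def run_bridge(events: List[dict]) -> List[dict]:
    loop = asyncio.get_running_loop()
    bus: asyncio.Queue = asyncio.Queue()
    # A real deployment would run the Zulip long-poll loop here instead
    # of iterating a list; the bridging mechanics are the same.
    worker = loop.run_in_executor(None, bridge_blocking_events, loop, bus, events)
    received = [await bus.get() for _ in range(len(events))]
    await worker
    return received
```

The key point is that the Zulip client API is synchronous, so it must live in a thread and hand events across to the asyncio side rather than being awaited directly.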

Model and Runtime Reliability

  • Configured context windows are respected - the safety ceiling is raised to 1,000,000 tokens while the large-model fallback remains 65,536, so explicit 128K-style model settings are no longer silently clamped.
  • Qwen vision detection is fixed - Qwen VL models are treated as vision-capable across DashScope, OpenAI-compatible, and custom bindings.
  • Minimal thinking mode is provider-safe - DeepSeek, DashScope, VolcEngine, BytePlus, and MiniMax no longer receive a top-level reasoning_effort=minimal that they would reject; DeepTutor sends each provider's specific disable signal instead.
  • DeepSeek v4 costs are tracked - research token accounting includes deepseek-v4-flash and deepseek-v4-pro pricing entries.
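
The context-window rule above can be sketched like this. Only the 1,000,000-token ceiling, the 65,536-token fallback, and the "128K-style settings are honored" behavior come from the release notes; effective_context_window and the constant names are hypothetical.

```python
from typing import Optional

SAFETY_CEILING = 1_000_000      # raised hard cap (v1.3.9)
LARGE_MODEL_FALLBACK = 65_536   # used when no explicit window is configured


def effective_context_window(configured: Optional[int]) -> int:
    # Honor an explicit setting (e.g. 131_072 for a "128K" model) up to
    # the safety ceiling instead of silently clamping it to the fallback;
    # unconfigured models still get the conservative default.
    if configured is None:
        return LARGE_MODEL_FALLBACK
    return min(configured, SAFETY_CEILING)
```

Under the old behavior an explicit 131,072-token setting would have been clamped to 65,536; with the raised ceiling it passes through unchanged.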

Web and CLI Polish

  • deeptutor start launches the full web stack - the CLI now delegates to scripts/start_web.py so backend and frontend can be started from one command, and launcher failures propagate through the CLI exit code.
  • Sidebar onboarding is clearer - primary navigation icons now expose scoped, localized tooltips with descriptions and keyboard focus support.
  • Multi-line user messages stay readable - chat message rendering preserves Shift+Enter line breaks, fixing code blocks and structured prompts that were previously collapsed into one line.
  • Assigned resources are easier to understand - model-selection summaries and read-only knowledge-base actions now present clearer labels for non-admin, grant-scoped sessions.
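
The delegate-and-propagate pattern for deeptutor start can be sketched as follows. run_launcher and start_web_command are hypothetical helpers; only the scripts/start_web.py path comes from the notes.

```python
import subprocess
import sys
from pathlib import Path


def start_web_command(repo_root: Path) -> list:
    # Assumed layout from the release notes: the CLI delegates the full
    # web stack (backend + frontend) to scripts/start_web.py.
    return [sys.executable, str(repo_root / "scripts" / "start_web.py")]


def run_launcher(cmd: list) -> int:
    # Surface the launcher's exit code instead of swallowing it, so a
    # failed web-stack startup makes the CLI itself exit non-zero.
    return subprocess.run(cmd).returncode
```

A CLI entry point would then call sys.exit(run_launcher(start_web_command(root))), which is what lets launcher failures propagate to the shell.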

Multi-User and Session Store Parity

  • Assigned model options match the selector contract - non-admin LLM choices now return profile names, model names, labels, and active/default metadata in the same shape expected by the web model selector.
  • PocketBase sessions support more chat flows - message metadata can be persisted, last-message lookup is available, and message deletion works with PocketBase string IDs as well as SQLite integer IDs.
  • Regenerate remains storage-neutral - turn retry logic can remove the last assistant message without assuming the backing session store uses integer primary keys.
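
The storage-neutral retry logic can be illustrated as below. Message and drop_last_assistant are hypothetical stand-ins; the point from the notes is selecting the last assistant turn by position in the conversation rather than by assuming IDs are ordered integers, so PocketBase string IDs and SQLite integer IDs behave the same.

```python
from dataclasses import dataclass
from typing import List, Union


@dataclass
class Message:
    id: Union[str, int]   # PocketBase string ID or SQLite integer ID
    role: str
    content: str


def drop_last_assistant(messages: List[Message]) -> List[Message]:
    # Walk backward and remove the most recent assistant turn by position.
    # No max(id) tricks: nothing here requires integer primary keys.
    for i in range(len(messages) - 1, -1, -1):
        if messages[i].role == "assistant":
            return messages[:i] + messages[i + 1:]
    return messages
```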

Tests

  • Added Zulip channel coverage for config parsing, permission checks, duplicate filtering, mentions, stream topic scoping, attachment extraction, retry behavior, LaTeX conversion, typing status, sending, uploads, and startup failures.
  • Added TutorBot NVIDIA NIM provider tests for registry detection, schema acceptance, and streaming request compatibility.
  • Added LLM regression tests for Qwen vision capability, explicit context-window budgets, and minimal-thinking provider kwargs.
  • Added CLI coverage so deeptutor start propagates the launcher exit code.
  • Added research token-pricing coverage for the DeepSeek v4 model entries.

Upgrade Notes

  • Install or refresh the .[tutorbot] extra, or requirements/tutorbot.txt, to include the new zulip>=0.8.0,<1.0.0 dependency before enabling Zulip bots.
  • Configure Zulip bots with site, email, apiKey, allowFrom, and groupPolicy; use mention for safer stream deployments and open only when every stream message should reach the bot.
  • If you use LLM_REASONING_EFFORT=minimal with DeepSeek, DashScope, VolcEngine, BytePlus, or MiniMax, keep the setting as-is; v1.3.9 translates it to the correct provider-specific disable payload.
  • Large configured context windows may now be honored instead of capped at 65,536 tokens, so verify provider limits and expected prompt-cost behavior.
  • Optional PocketBase deployments should ensure the messages collection has a metadata_json JSON field before relying on regenerate/session metadata parity.
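
For orientation, a Zulip bot entry might look roughly like the following. The field names (site, email, apiKey, allowFrom, groupPolicy) come from the notes above, but the nesting and example values are illustrative, so check your deployment's actual configuration schema before copying this.

```yaml
# Hypothetical config shape - field names from the v1.3.9 release notes,
# structure and values are placeholders.
tutorbot:
  channels:
    zulip:
      site: https://your-org.zulipchat.com
      email: tutorbot-bot@your-org.zulipchat.com
      apiKey: ${ZULIP_API_KEY}
      allowFrom:
        - alice@example.com
      groupPolicy: mention   # "mention" is safer; "open" relays every stream message
```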

Full Changelog: https://github.com/HKUDS/DeepTutor/compare/v1.3.8...v1.3.9

Source: README.md, updated 2026-05-08