| Name | Modified | Size | Downloads / Week |
|---|---|---|---|
| README.md | 2026-02-17 | 926 Bytes | |
| Release v2.3 source code.tar.gz | 2026-02-17 | 14.8 MB | |
| Release v2.3 source code.zip | 2026-02-17 | 14.8 MB | |
| Totals: 3 Items | | 29.6 MB | 0 |
Add custom prompts, domain presets, and provider config check
- Add PRESET_PROMPTS dict to llm.py with 4 research domain variants: threat intel, ransomware/malware, personal identity, corporate espionage
- Update generate_summary() to accept preset and custom_instructions params; custom instructions append to the base prompt (never replace)
- Add Prompt Settings sidebar expander with domain preset selector, live system prompt viewer (read-only), and custom instructions textarea with domain-specific placeholder examples
- Add Provider Configuration sidebar section that auto-checks all 6 LLM providers (OpenAI, Anthropic, Google, OpenRouter, Ollama, llama.cpp) on page load with clear set/missing/optional status indicators
- Show a prominent error banner when no models are available due to missing env vars, surfacing the root cause of [#99]
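A minimal sketch of the prompt wiring described above. Only `PRESET_PROMPTS` and `generate_summary()` are named in this commit; the base prompt text, the preset keys, and the string-concatenation shape are assumptions for illustration.

```python
# Hypothetical sketch: PRESET_PROMPTS in llm.py plus the updated
# generate_summary() signature. Preset keys and prompt text are assumed.

PRESET_PROMPTS = {
    "threat_intel": "Focus on indicators of compromise, TTPs, and attribution.",
    "ransomware_malware": "Focus on malware families, payloads, and infection chains.",
    "personal_identity": "Focus on exposed personal data and identity linkage.",
    "corporate_espionage": "Focus on leaked internal documents and insider activity.",
}

BASE_PROMPT = "Summarize the following research findings."  # assumed base prompt


def generate_summary(text, preset=None, custom_instructions=None):
    """Build the system prompt for a summary request.

    Custom instructions are appended to the base prompt, never
    substituted for it, so the preset framing is always preserved.
    """
    prompt = BASE_PROMPT
    if preset in PRESET_PROMPTS:
        prompt += "\n" + PRESET_PROMPTS[preset]
    if custom_instructions:
        prompt += "\n" + custom_instructions  # append, never replace
    return prompt  # the real function would send prompt + text to the LLM
```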
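The provider auto-check could look roughly like the sketch below. The six provider names come from the commit; the env-var names, the treatment of Ollama and llama.cpp as optional local runtimes, and the status labels are assumptions, not the project's real config keys.

```python
import os

# Hypothetical sketch of the Provider Configuration check. Env-var
# names are assumed; local providers are treated as optional.
PROVIDERS = {
    "OpenAI": "OPENAI_API_KEY",
    "Anthropic": "ANTHROPIC_API_KEY",
    "Google": "GOOGLE_API_KEY",
    "OpenRouter": "OPENROUTER_API_KEY",
    "Ollama": "OLLAMA_HOST",        # local runtime, works without a key
    "llama.cpp": "LLAMACPP_HOST",   # local runtime, works without a key
}
OPTIONAL = {"Ollama", "llama.cpp"}


def provider_status(env=os.environ):
    """Return {provider: 'set' | 'missing' | 'optional'} for the sidebar."""
    status = {}
    for name, var in PROVIDERS.items():
        if env.get(var):
            status[name] = "set"
        else:
            status[name] = "optional" if name in OPTIONAL else "missing"
    return status


def any_available(status):
    """True if at least one provider is configured; drives the error banner."""
    return any(v == "set" for v in status.values())
```

Passing the environment in as a mapping keeps the check testable; on page load the sidebar would call `provider_status()` once and render one indicator per provider.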
Co-Authored-By: Claude Sonnet 4.5 <noreply@anthropic.com>