Local Deep Research is an open-source AI-powered research assistant designed to perform deep, iterative investigations by combining large language models with multi-source search capabilities. It runs locally, giving users full control over their data, privacy, and infrastructure while supporting both local and cloud-based LLMs. The system breaks down complex queries into smaller steps, performs parallel searches across web and academic sources, and generates structured, citation-backed reports.

It also supports personal document ingestion through vector search, enabling users to build a private, searchable knowledge base. The platform includes a web interface, Docker-based deployment, and flexible configuration options, making it accessible to both developers and researchers. Its architecture emphasizes transparency, customization, and reproducibility in AI-assisted research workflows.
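The fan-out described above (one query, several sources searched at once, results merged) can be sketched roughly as follows. This is an illustration only, not the project's actual code: the backend functions (`search_web`, `search_papers`) are hypothetical stand-ins for real web and academic search integrations.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical search backends; in the real system these would call
# web and academic search APIs and return structured hits.
def search_web(query):
    return [f"web result for {query!r}"]

def search_papers(query):
    return [f"paper result for {query!r}"]

def parallel_search(query, backends):
    """Fan a query out to several sources concurrently and merge the hits."""
    with ThreadPoolExecutor(max_workers=len(backends)) as pool:
        futures = [pool.submit(backend, query) for backend in backends]
        results = []
        for future in futures:
            # Collect each source's hits; a fuller version would also
            # deduplicate and attach citation metadata here.
            results.extend(future.result())
    return results

hits = parallel_search("local LLMs", [search_web, search_papers])
```

A thread pool fits here because the work is I/O-bound (waiting on remote search APIs), so sources are queried in parallel rather than one after another.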
Features
- Local-first execution with full data ownership
- Multi-source parallel search and synthesis
- Integration with local and cloud LLMs
- Vector-based document retrieval (RAG)
- Automated report generation with citations
- Web UI and Docker-based deployment
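The vector-based retrieval step behind the RAG feature can be illustrated with a minimal sketch. The "embedding" below is a toy bag-of-words vector so the example stays self-contained; a real deployment would use a sentence-embedding model and a vector store, and the documents shown are invented for illustration.

```python
import math
from collections import Counter

def embed(text):
    # Toy "embedding": a bag-of-words term-frequency vector.
    # A real system would use a learned sentence-embedding model.
    return Counter(text.lower().split())

def cosine(a, b):
    # Cosine similarity between two sparse term vectors.
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * \
           math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, documents, k=1):
    """Rank stored documents by similarity to the query; return the top k."""
    q = embed(query)
    ranked = sorted(documents, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

docs = [
    "Configuring Docker-based deployment",
    "Ingesting personal documents into the vector store",
]
top = retrieve("ingest my documents", docs)
```

Retrieved passages like these are what get handed to the LLM as context, which is what lets the generated reports cite the user's own documents.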