LLocalSearch is an open-source search engine framework designed to run entirely on local infrastructure, using large language model (LLM) agents to gather and synthesize information from the web. Users submit natural language questions, and a chain of LLM-driven agents recursively searches for relevant information and compiles a response. Unlike many AI search tools, LLocalSearch operates without external cloud APIs or proprietary services, making it suitable for privacy-focused or offline-oriented environments. The architecture combines local language models with external tools such as search engines, so the system can gather up-to-date information while keeping model execution on local hardware. It also exposes the internal reasoning process of its agents, letting users observe how queries are expanded and how results are retrieved during the search.

Features

  • Fully local search system powered by LLM agents
  • Recursive tool-based search workflow for answering complex queries
  • Integration with local language models such as those served by Ollama
  • Transparent visualization of the reasoning and tool-calling process
  • Self-hosted architecture that does not require external API keys
  • Containerized deployment options using Docker environments

License

Apache License 2.0


Additional Project Details

Programming Language: Go
Related Categories: Go Large Language Models (LLM)
Registered: 2026-03-04