Self-hosted AI Package is an open-source, Docker Compose-based starter kit that bootstraps a full local AI and low-code development environment from commonly used open tools, letting developers run LLMs and AI workflows entirely on their own infrastructure. The stack typically includes Ollama for running local large language models, n8n as a low-code workflow automation platform, Supabase for database and vector storage, Open WebUI for interacting with models, Flowise for agent building, and additional services such as SearXNG, Neo4j, and Langfuse for search, knowledge graphs, and observability.

This integrated setup lets users experiment with RAG pipelines, automated workflows, AI agents, and project data management without depending on externally hosted services, improving both privacy and flexibility. The repository ships with example workflows (such as a Local RAG AI Agent workflow) and environment configurations that streamline setup and encourage customization.
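To make the architecture concrete, here is a minimal sketch of how such a Compose file wires services together. The service names, image tags, ports, and environment variables below are illustrative assumptions, not the repository's actual configuration, which defines many more services and options:

```yaml
# Illustrative sketch only -- not the repository's actual docker-compose.yml.
services:
  ollama:
    image: ollama/ollama              # local LLM runtime
    ports:
      - "11434:11434"                 # Ollama's default API port
    volumes:
      - ollama_data:/root/.ollama     # persist downloaded models

  n8n:
    image: n8nio/n8n                  # low-code workflow automation
    ports:
      - "5678:5678"
    environment:
      # Assumed wiring: workflows reach Ollama over the Compose network.
      - OLLAMA_HOST=http://ollama:11434

  open-webui:
    image: ghcr.io/open-webui/open-webui:main
    ports:
      - "3000:8080"                   # chat UI for the local models

volumes:
  ollama_data:
```

Because all services share one Compose network, each container can address the others by service name (e.g. `http://ollama:11434`) without exposing anything beyond the mapped host ports.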
## Features
- Docker Compose template for local AI environments
- Local LLM execution with Ollama
- Supabase vector and database service integration
- n8n low-code workflow orchestration
- Open WebUI for model interaction
- Additional services (Flowise, Neo4j, SearXNG, Langfuse)
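Once the stack is up, workflows in n8n or any external script can call Ollama's HTTP API directly. A minimal Python sketch, assuming the default `localhost:11434` endpoint and a locally pulled model named `llama3` (both assumptions, not fixed by the kit):

```python
import json
import urllib.request

# Default Ollama endpoint when the port is mapped to the host (assumption).
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_generate_request(model: str, prompt: str) -> tuple[str, bytes]:
    """Build a non-streaming request body for Ollama's /api/generate endpoint."""
    payload = {"model": model, "prompt": prompt, "stream": False}
    return OLLAMA_URL, json.dumps(payload).encode("utf-8")

def generate(model: str, prompt: str) -> str:
    """Send a prompt to a locally running Ollama instance and return its reply."""
    url, body = build_generate_request(model, prompt)
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

# Usage (requires the stack running and the model pulled, e.g. `ollama pull llama3`):
#   reply = generate("llama3", "Summarize what a vector store is in one sentence.")
```

The same request shape is what an n8n HTTP node or Open WebUI uses under the hood, so the sketch doubles as a debugging tool for checking that the Ollama container is reachable.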