Bringing large-language models and chat to web browsers
AI assistant that supports knowledge bases and model APIs
Fast, local-first web content extraction for LLMs
ChatGLM3 series: Open Bilingual Chat LLMs
Web app for interacting with any LangGraph agent (PY & TS) via a chat interface
Fully private LLM chatbot that runs entirely in your browser
Fast, flexible LLM inference
AI search engine - self-host with local or cloud LLMs
WebAssembly binding for llama.cpp - Enabling on-browser LLM inference
High-speed Large Language Model Serving for Local Deployment
Quick illustration of how one can easily read books together with LLMs
Query anything (GitHub, Notion, +40 more) with SQL and let LLMs
Fast and efficient unstructured data extraction
The free, Open Source alternative to OpenAI, Claude and others
Run AI models locally on your machine with node.js bindings for llama
LLocalSearch is a completely locally running search aggregator
Masks sensitive data and secrets before they reach AI
A minimal LLM chat app that runs entirely in your browser
LLM-based automatic question answering for local knowledge bases
Universal LLM Deployment Engine with ML Compilation
Web-based tool that converts GitHub repository contents
Local-first semantic code search engine
LLM Council works together to answer your hardest questions
An elegant AI chat client. Full-featured, lightweight
Chinese and English multimodal conversational language model