Distribute and run LLMs with a single file
Drag & drop UI to build your customized LLM flow
Desktop app for prototyping and debugging LangGraph applications
Framework and no-code GUI for fine-tuning LLMs
Locally run an instruction-tuned, chat-style LLM
Swirl queries any number of data sources with APIs
Adding guardrails to large language models
File parser optimised for lossless LLM ingestion
INT4/INT5/INT8 and FP16 inference on CPU for RWKV language model
Low-code framework for building custom LLMs and other neural networks
AI-powered CLI git wrapper with boilerplate code generation and chat history
Experimental search engine for conversational AI such as parl.ai