Model Context Protocol tool support for LangChain
Simple, Pythonic building blocks to evaluate LLM applications
Python bindings for Transformer models implemented in C/C++ using the GGML library
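Assuming this entry refers to the ctransformers package, here is a minimal sketch of loading and prompting a GGML model from Python; the Hub repository name is illustrative, not a recommendation:

```python
from ctransformers import AutoModelForCausalLM

# Load a GGML-format GPT-2 model from the Hugging Face Hub
# (repository name is illustrative; any supported GGML/GGUF model works).
llm = AutoModelForCausalLM.from_pretrained("marella/gpt-2-ggml")

# The loaded model object is callable and returns the generated continuation as text.
print(llm("AI is going to"))
```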
Easy-to-use and high-performance NLP and LLM framework
Unified embedding model
Persian NLP Toolkit
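Assuming this is the Hazm toolkit, a minimal sketch of typical usage with its Normalizer class and word_tokenize helper; the sample sentence (meaning "We love the Persian language") is illustrative:

```python
from hazm import Normalizer, word_tokenize

# Normalize Persian text (unify characters, fix spacing) before tokenizing.
normalizer = Normalizer()
text = normalizer.normalize("زبان فارسی را دوست داریم.")

# Split the normalized text into word tokens.
print(word_tokenize(text))
```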
WikiChat is an improved RAG pipeline that curbs LLM hallucination by grounding responses in retrieved Wikipedia content
The no-nonsense RAG chunking library
ReFT: Representation Finetuning for Language Models
A central, open resource for data and tools
Serving LangChain LLM apps automagically with FastAPI
TextWorld is a sandbox learning environment for the training and evaluation of reinforcement learning agents on text-based games
Research-oriented chatbot framework
Standalone, small, language-neutral
Integrating LLMs into structured NLP pipelines
Operating LLMs in production
Efficient Triton Kernels for LLM Training
Run 100B+ language models at home, BitTorrent-style
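Assuming this entry refers to Petals, a hedged sketch of the distributed-generation idea using its AutoDistributedModelForCausalLM wrapper; the model name is illustrative and a public swarm must be reachable:

```python
from transformers import AutoTokenizer
from petals import AutoDistributedModelForCausalLM

# Illustrative model name; its transformer blocks are served by peers in the swarm.
model_name = "petals-team/StableBeluga2"

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoDistributedModelForCausalLM.from_pretrained(model_name)

# Embeddings and the LM head run locally; the heavy transformer layers run remotely.
inputs = tokenizer("A cat sat on", return_tensors="pt")["input_ids"]
outputs = model.generate(inputs, max_new_tokens=5)
print(tokenizer.decode(outputs[0]))
```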
An open-source framework for training large multimodal models
Stanford NLP Python library for many human languages
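Assuming this entry refers to the Stanza library, a minimal sketch of its pipeline API; the example sentence is illustrative and the English models must be downloaded first:

```python
import stanza

# One-time download of the English models (assumes network access).
stanza.download("en")

# Build a default English pipeline (tokenization, POS tagging, lemmatization, parsing).
nlp = stanza.Pipeline("en")

doc = nlp("Stanza ships neural pipelines for many human languages.")
for sentence in doc.sentences:
    for word in sentence.words:
        print(word.text, word.upos, word.lemma)
```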
A Model Context Protocol (MCP) server
INT4/INT5/INT8 and FP16 inference on CPU for RWKV language model
Label, clean and enrich text datasets with LLMs
LLM
Ongoing research training transformer models at scale