Simple, Pythonic building blocks to evaluate LLM applications
Python bindings for the Transformer models implemented in C/C++ (see the sketch after this list)
A central, open resource for data and tools
Serving LangChain LLM apps automagically with FastAPI
Research-oriented chatbot framework
Integrating LLMs into structured NLP pipelines
Operating LLMs in production
Run 100B+ language models at home, BitTorrent-style
An open-source framework for training large multimodal models
INT4/INT5/INT8 and FP16 inference on CPU for RWKV language model
Ongoing research training transformer models at scale
Label, clean and enrich text datasets with LLMs
LLM
Large-scale Self-supervised Pre-training Across Tasks, Languages, and Modalities
PandasAI is a Python library that integrates generative AI capabilities into pandas, making dataframes conversational
Train a 26M-parameter GPT from scratch in just 2h
Scalable data pre-processing and curation toolkit for LLMs
LLMs for your CLI
Implementation of "Tree of Thoughts: Deliberate Problem Solving with Large Language Models"
State-of-the-art Parameter-Efficient Fine-Tuning (PEFT) methods; see the LoRA sketch after this list
User toolkit for analyzing and interfacing with Large Language Models
One API for all LLMs, whether private or public
PyTorch library of curated Transformer models and their components
Run any Llama 2 locally with a Gradio UI, on GPU or CPU, from anywhere
Central interface to connect your LLMs with external data; see the indexing sketch after this list
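
The C/C++ bindings entry above matches the tagline of the `ctransformers` package; assuming that project, here is a minimal sketch of loading a GGML model and generating text on CPU (the Hub repo `marella/gpt-2-ggml` and the prompt are illustrative choices, not requirements):

```python
from ctransformers import AutoModelForCausalLM

# Load a GGML-format model from the Hugging Face Hub (repo name is only an example).
llm = AutoModelForCausalLM.from_pretrained("marella/gpt-2-ggml")

# Generation runs through the C/C++ backend on CPU; no GPU is required.
print(llm("AI is going to"))
```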
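
For the Parameter-Efficient Fine-Tuning entry, a minimal LoRA sketch using the Hugging Face `peft` and `transformers` libraries; the base model (`gpt2`), the target module name, and the LoRA hyperparameters are assumptions chosen for illustration:

```python
from transformers import AutoModelForCausalLM
from peft import LoraConfig, get_peft_model

# Load a small base model (gpt2 is only an example; any causal LM works).
model = AutoModelForCausalLM.from_pretrained("gpt2")

# LoRA configuration: rank, scaling, dropout, and which modules receive adapters.
# target_modules=["c_attn"] matches GPT-2's fused attention projection (assumption).
lora_config = LoraConfig(
    r=8,
    lora_alpha=16,
    lora_dropout=0.05,
    target_modules=["c_attn"],
    task_type="CAUSAL_LM",
)

# Wrap the base model; only the low-rank adapter weights are trainable.
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```

The wrapped model drops into an ordinary `transformers` training loop; only the adapter matrices receive gradients, which is what keeps the fine-tune parameter-efficient.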
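
The last entry reads like LlamaIndex's tagline; assuming that project (and its pre-0.10 import paths), a sketch of indexing local files and querying them. The `data/` folder and the query string are placeholders, and a configured LLM (for example an OpenAI API key) is required:

```python
from llama_index import SimpleDirectoryReader, VectorStoreIndex

# Load local documents (assumes a ./data directory containing text files).
documents = SimpleDirectoryReader("data").load_data()

# Build an in-memory vector index over the documents.
index = VectorStoreIndex.from_documents(documents)

# Query the index; the LLM synthesizes an answer from the retrieved chunks.
query_engine = index.as_query_engine()
response = query_engine.query("What do these documents say about evaluation?")
print(response)
```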