Aqueduct allows you to run LLM and ML workloads on any infrastructure
One API for all LLMs, either private or public
Serving LangChain LLM apps automagically with FastAPI
An elegant PyTorch implementation of transformers
Label, clean and enrich text datasets with LLMs
LLM
Adding guardrails to large language models
Operating LLMs in production
Large-scale Self-supervised Pre-training Across Tasks, Languages, etc.
Scalable data preprocessing and curation toolkit for LLMs
Database system for building simpler and faster AI-powered applications
User toolkit for analyzing and interfacing with Large Language Models
INT4/INT5/INT8 and FP16 inference on CPU for RWKV language model
Integrating LLMs into structured NLP pipelines
PyTorch library of curated Transformer models and their components
Lightweight package to simplify LLM API calls
A paper title generation model fine-tuned on the LLaMA model
LLMs for your CLI
Research on improving AI R&D efficiency: train your own LoRA
Community for applying LLMs to robotics and a robot simulator
Seamlessly integrate LLMs into scikit-learn
Open source libraries and APIs to build custom preprocessing pipelines
Framework to easily create LLM powered bots over any dataset
Sweep: AI-powered Junior Developer for small features and bug fixes
Revolutionizing Database Interactions with Private LLM Technology