Superduper: Integrate AI models and machine learning workflows
Neural Network Compression Framework for enhanced OpenVINO inference
OpenAI-style API for open large language models
Images to inference with no labeling
Multilingual Automatic Speech Recognition with word-level timestamps
MII makes low-latency and high-throughput inference possible
Operating LLMs in production
Integrate, train and manage any AI models and APIs with your database
PyTorch domain library for recommendation systems
INT4/INT5/INT8 and FP16 inference on CPU for RWKV language model
Libraries for applying sparsification recipes to neural networks
An easy-to-use LLM quantization package with user-friendly APIs
Phi-3.5 for Mac: Locally-run Vision and Language Models
Unified Model Serving Framework
Optimizing inference proxy for LLMs
Large Language Model Text Generation Inference
Replace OpenAI GPT with another LLM in your app
State-of-the-art diffusion models for image and audio generation
The easiest and laziest way to build multi-agent LLM applications
Efficient few-shot learning with Sentence Transformers
Trainable models and neural-network optimization tools
Probabilistic reasoning and statistical analysis in TensorFlow
Framework dedicated to making neural data processing pipelines simple and fast
Official inference library for Mistral models
PyTorch extensions for fast R&D prototyping and Kaggle farming