OpenAI-style API for open large language models
Sparsity-aware deep learning inference runtime for CPUs
Superduper: Integrate AI models and machine learning workflows
Images to inference with no labeling
Easy-to-use deep learning framework with 3 key features
Data manipulation and transformation for audio signal processing
A set of Docker images for training and serving models in TensorFlow
PyTorch domain library for recommendation systems
State-of-the-art diffusion models for image and audio generation
Multi-LoRA inference server that scales to 1000s of fine-tuned LLMs
Integrate, train and manage any AI models and APIs with your database
INT4/INT5/INT8 and FP16 inference on CPU for the RWKV language model
Libraries for applying sparsification recipes to neural networks
An easy-to-use LLM quantization package with user-friendly APIs
A Unified Library for Parameter-Efficient Learning
Multilingual Automatic Speech Recognition with word-level timestamps
Lightweight Python library for real-time multi-object tracking
Unified Model Serving Framework
Optimizing inference proxy for LLMs
Large Language Model Text Generation Inference
Framework dedicated to neural data processing
Easiest and laziest way to build multi-agent LLM applications
Efficient few-shot learning with Sentence Transformers
Standardized Serverless ML Inference Platform on Kubernetes
Trainable models and NN optimization tools