Run Local LLMs on Any Device. Open-source and available for commercial use
A unified framework for scalable computing
OpenMMLab Model Deployment Framework
Superduper: Integrate AI models and machine learning workflows
Official inference library for Mistral models
Neural Network Compression Framework for enhanced OpenVINO inference
Unified Model Serving Framework
A framework dedicated to making neural data processing pipelines simple and fast
A set of Docker images for training and serving models in TensorFlow
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models
Easy-to-use deep learning framework with 3 key features
An MLOps framework to package, deploy, monitor and manage models
Library for serving Transformers models on Amazon SageMaker
Standardized Serverless ML Inference Platform on Kubernetes
Powering Amazon's custom machine learning chips
A GPU-accelerated library containing highly optimized building blocks and an execution engine for data processing to accelerate deep learning applications
Deep learning optimization library that makes distributed training and inference easy, efficient, and effective
A computer vision framework to create and deploy apps in minutes
Guide to deploying deep-learning inference networks and deep vision primitives with TensorRT and NVIDIA Jetson
Toolkit for running inference and serving MXNet models on Amazon SageMaker
Deploy an ML inference service on a budget in 10 lines of code