High-performance neural network inference framework for mobile
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models
C++ library for high-performance inference on NVIDIA GPUs
OpenMMLab Model Deployment Framework
Neural Network Compression Framework for enhanced OpenVINO inference
Framework dedicated to making neural data processing pipelines simple and fast
Open standard for machine learning interoperability (see the export sketch after this list)
A general-purpose probabilistic programming system
A comprehensive set of computer vision & machine intelligence libraries
Multi-Modal Neural Networks for Semantic Search, based on Mid-Fusion
Libraries for applying sparsification recipes to neural networks
Deep Learning API and Server in C++14 with support for Caffe and PyTorch
Lightweight Python library for adding real-time multi-object tracking to any detector
A toolkit to optimize Keras & TensorFlow ML models for deployment
Sequence-to-sequence framework, focused on Neural Machine Translation
Guide to deploying deep-learning inference networks
Uniform deep learning inference framework for mobile
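
The interoperability entry above refers to exchanging models between frameworks and runtimes. A minimal sketch, assuming PyTorch and ONNX Runtime are installed (neither workflow is prescribed by the list itself): export a small model to ONNX from one framework, then execute the exported graph with a different runtime.

```python
import numpy as np
import torch
import onnxruntime as ort

# Toy model standing in for any trained network.
model = torch.nn.Sequential(torch.nn.Linear(8, 4), torch.nn.ReLU())
model.eval()

# Export to the ONNX format with a fixed example input.
dummy = torch.randn(1, 8)
torch.onnx.export(model, dummy, "model.onnx", input_names=["x"], output_names=["y"])

# Load and run the same model from a separate runtime (ONNX Runtime, CPU).
session = ort.InferenceSession("model.onnx", providers=["CPUExecutionProvider"])
(output,) = session.run(["y"], {"x": dummy.numpy().astype(np.float32)})
print(output.shape)  # (1, 4)
```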