C++ library for high-performance inference on NVIDIA GPUs
High-performance neural network inference framework for mobile
Set of comprehensive computer vision & machine intelligence libraries
ONNX Runtime: cross-platform, high-performance ML inferencing
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models
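The core operation behind post-training quantization libraries like AIMET is mapping floats to low-bit integers via a scale and zero-point. A minimal sketch of per-tensor affine 8-bit quantization in plain Python (names are illustrative, not AIMET's API):

```python
def quantize(values, num_bits=8):
    """Map floats to unsigned ints via a per-tensor scale and zero-point."""
    qmin, qmax = 0, 2 ** num_bits - 1
    lo, hi = min(values), max(values)
    scale = (hi - lo) / (qmax - qmin) or 1.0  # guard constant tensors
    zero_point = round(qmin - lo / scale)
    # Round to the nearest grid point, then clamp into the integer range.
    q = [max(qmin, min(qmax, round(v / scale) + zero_point)) for v in values]
    return q, scale, zero_point

def dequantize(q, scale, zero_point):
    """Recover approximate floats from the quantized representation."""
    return [(qi - zero_point) * scale for qi in q]

weights = [-1.0, -0.5, 0.0, 0.5, 1.0]
q, scale, zp = quantize(weights)
approx = dequantize(q, scale, zp)
```

Round-trip error is bounded by the scale, which is the basic accuracy/size trade-off such libraries tune.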
OpenVINO™ Toolkit repository
Library for OCR-related tasks powered by Deep Learning
A general-purpose probabilistic programming system
Official inference library for Mistral models
PArallel Distributed Deep LEarning: Machine Learning Framework
Phi-3.5 for Mac: Locally-run Vision and Language Models
Everything you need to build state-of-the-art foundation models
A lightweight vision library for performing large-scale object detection
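Such libraries typically detect small objects in large images by slicing the image into overlapping tiles, running the detector on each tile, and merging results. A hedged sketch of just the tiling step (assumed names, not the library's API):

```python
def starts(length, tile, step):
    """Start offsets so tiles of size `tile` cover [0, length) with stride `step`."""
    if length <= tile:
        return [0]
    offsets = list(range(0, length - tile, step))
    offsets.append(length - tile)  # final tile flush with the image edge
    return offsets

def slice_grid(width, height, tile=512, overlap=0.2):
    """Return (x0, y0, x1, y1) boxes covering the image with overlapping tiles."""
    step = max(1, int(tile * (1 - overlap)))
    return [(x, y, min(x + tile, width), min(y + tile, height))
            for y in starts(height, tile, step)
            for x in starts(width, tile, step)]

tiles = slice_grid(1280, 720, tile=512, overlap=0.25)
```

The overlap ensures objects straddling a tile boundary appear whole in at least one tile; detections are then deduplicated with non-maximum suppression.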
Library for serving Transformers models on Amazon SageMaker
A unified framework for scalable computing
Libraries for applying sparsification recipes to neural networks
A set of Docker images for training and serving models in TensorFlow
DoWhy is a Python library for causal inference
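One estimand such causal-inference libraries identify is backdoor adjustment: average the treated-group outcome within each confounder stratum, weighted by the stratum's frequency. A hand-rolled toy illustration (not DoWhy's API):

```python
from collections import defaultdict

def adjusted_mean(records, treatment_value):
    """E[Y | do(T=t)] = sum over z of P(Z=z) * mean(Y | T=t, Z=z)."""
    by_z = defaultdict(list)
    for z, t, y in records:
        by_z[z].append((t, y))
    n = len(records)
    total = 0.0
    for rows in by_z.values():
        ys = [y for t, y in rows if t == treatment_value]
        total += (len(rows) / n) * (sum(ys) / len(ys))
    return total

# (z, t, y) toy data: confounder z raises both treatment rate and outcome
data = [(0, 0, 1.0), (0, 0, 1.2), (0, 1, 2.0),
        (1, 1, 3.0), (1, 1, 3.2), (1, 0, 2.1)]
effect = adjusted_mean(data, 1) - adjusted_mean(data, 0)
```

Here the naive difference of group means (about 1.3) overstates the adjusted effect (0.95) because the confounder `z` inflates both treatment and outcome.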
A library for accelerating Transformer models on NVIDIA GPUs
An MLOps framework to package, deploy, monitor and manage models
Visual Instruction Tuning: Large Language-and-Vision Assistant
OpenFieldAI is an AI-based Open Field Test rodent tracker
Lightweight inference library for ONNX files, written in C++
OpenMMLab Model Deployment Framework
A computer vision framework to create and deploy apps in minutes