A set of Docker images for training and serving models in TensorFlow
ONNX Runtime: cross-platform, high performance ML inferencing
Open standard for machine learning interoperability
Swift async text-to-image library for SwiftUI apps using the OpenAI API
A unified framework for scalable computing
C++ library for high performance inference on NVIDIA GPUs
OpenVINO™ Toolkit repository
The Triton Inference Server provides an optimized cloud and edge inferencing solution
Training and deploying machine learning models on Amazon SageMaker
PArallel Distributed Deep LEarning: Machine Learning Framework
Powering Amazon custom machine learning chips
Python Package for ML-Based Heterogeneous Treatment Effects Estimation
The official Python client for the Huggingface Hub
An MLOps framework to package, deploy, monitor and manage models
A GPU-accelerated library containing highly optimized building blocks
A library for accelerating Transformer models on NVIDIA GPUs
Trainable models and NN optimization tools
Serving system for machine learning models
Deep Learning API and server in C++14 with support for Caffe and PyTorch
Adversarial Robustness Toolbox (ART) - Python Library for ML security
Uplift modeling and causal inference with machine learning algorithms
Library for OCR-related tasks powered by Deep Learning
Open-source AI camera: empower any camera/CCTV with state-of-the-art AI
Bayesian inference with probabilistic programming
DoWhy is a Python library for causal inference