ONNX Runtime: cross-platform, high-performance ML inferencing
Open standard for machine learning interoperability
Training and deploying machine learning models on Amazon SageMaker
Python Package for ML-Based Heterogeneous Treatment Effects Estimation
The official Python client for the Hugging Face Hub
Adversarial Robustness Toolbox (ART) - Python Library for ML security
A unified framework for scalable computing
OpenMLDB is an open-source machine learning database
Powering Amazon's custom machine learning chips
An MLOps framework to package, deploy, monitor and manage models
The Triton Inference Server provides an optimized cloud and edge inferencing solution
PArallel Distributed Deep LEarning: Machine Learning Framework
A set of Docker images for training and serving models in TensorFlow
Uplift modeling and causal inference with machine learning algorithms
Serving system for machine learning models
C++ library for high-performance inference on NVIDIA GPUs
Open-Source AI Camera. Empower any camera/CCTV with state-of-the-art AI
Everything you need to build state-of-the-art foundation models
Bayesian inference with probabilistic programming
OpenVINO™ Toolkit repository
A GPU-accelerated library containing highly optimized building blocks and an execution engine for data processing
DoWhy is a Python library for causal inference
Trainable models and NN optimization tools
Data manipulation and transformation for audio signal processing
Deep Learning API and Server in C++14, with support for Caffe and PyTorch