The Triton Inference Server provides an optimized cloud and edge inferencing solution
Visualizer for neural network, deep learning, and machine learning models
A Flexible and Powerful Parameter Server for large-scale ML
A community-supported supercharged version of paperless
Python binding to the Apache Tika™ REST services (see the usage sketch after this list)
Open-source simulator for autonomous driving research.
Elyra extends JupyterLab with an AI-centric approach
TensorFlow is an open source library for machine learning
Streamline your ML workflow
Deep Learning API and Server in C++14 with support for Caffe and PyTorch
Interactively analyze ML models to understand their behavior
Making Enterprise Data Intelligent and Responsive for AI
Helps scientists define testable, modular, self-documenting dataflows
Build cross-modal and multimodal applications on the cloud
Embed images and sentences into fixed-length vectors
Fast C++ library for GPU linear algebra & scientific computing
Serve machine learning models within a Docker container
Open-source large-language-model-based code completion engine
A fast and stable captcha auto-solving server with an API.
Leading free and open-source face recognition system
TensorFlowOnSpark brings TensorFlow programs to Apache Spark clusters
EZStacking is a Jupyter notebook generator for machine learning
CPU/GPU inference server for Hugging Face transformer models
C++ library based on TensorRT integration
Deploy an ML inference service on a budget in 10 lines of code
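As a quick illustration of the Apache Tika™ binding listed above, here is a minimal sketch that parses a local document with tika-python and prints the extracted metadata and text. It assumes the tika package is installed (pip install tika), that a Java runtime is available for the Tika server started on first use, and that example.pdf is a placeholder path.

    # Minimal tika-python sketch: parse a document via the Tika REST service.
    from tika import parser

    # from_file() forwards the document to the Tika server and returns a dict
    # with "metadata" and "content" keys.
    parsed = parser.from_file("example.pdf")  # placeholder path

    print(parsed["metadata"].get("Content-Type"))
    print((parsed.get("content") or "")[:500])  # first 500 characters of extracted text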