The Triton Inference Server provides an optimized cloud and edge inferencing solution
Visualizer for neural network, deep learning, and machine learning models
TensorFlow is an open source library for machine learning
AutoML toolkit for automating the machine learning lifecycle
A community-supported supercharged version of paperless
Open-source simulator for autonomous driving research.
Streamline your ML workflow
Embed images and sentences into fixed-length vectors
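To illustrate the fixed-length-vector idea behind the entry above, here is a minimal feature-hashing sketch in pure Python; the dimension, tokenizer, and function name are illustrative assumptions, not any project's actual API:

```python
import hashlib

def embed(sentence: str, dim: int = 16) -> list[float]:
    """Map a sentence to a fixed-length vector via feature hashing.

    Each token is hashed to a bucket in [0, dim); the vector counts
    tokens per bucket, so any input yields exactly `dim` numbers.
    """
    vec = [0.0] * dim
    for token in sentence.lower().split():
        h = int(hashlib.md5(token.encode()).hexdigest(), 16)
        vec[h % dim] += 1.0
    return vec

v1 = embed("embed sentences into vectors")
v2 = embed("a much longer sentence still maps to the same length")
```

Inputs of any length land in vectors of the same dimensionality, which is the property that makes downstream similarity search and indexing possible.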
Helps scientists define testable, modular, self-documenting dataflow
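The dataflow style named in the entry above can be sketched, under assumed semantics, as plain Python functions whose parameter names declare their upstream inputs; a tiny resolver then wires and runs the graph. All names here are illustrative, not the project's API:

```python
import inspect

# Each function names its inputs via its parameters, so the graph
# is self-documenting and each node is testable in isolation.
def doubled(raw: int) -> int:
    return raw * 2

def shifted(doubled: int) -> int:
    return doubled + 1

def run(funcs, **inputs):
    """Resolve nodes by matching parameter names to computed values."""
    values = dict(inputs)
    pending = list(funcs)
    while pending:
        for fn in list(pending):
            params = inspect.signature(fn).parameters
            if all(p in values for p in params):
                values[fn.__name__] = fn(**{p: values[p] for p in params})
                pending.remove(fn)
    return values

result = run([shifted, doubled], raw=3)
```

Because dependencies are inferred from names rather than hand-wired, execution order does not depend on the order the functions are listed in.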
Elyra extends JupyterLab with an AI-centric approach
A Python package for segmenting geospatial data with the Segment Anything Model (SAM)
Serve machine learning models within a Docker container
Python binding to the Apache Tika™ REST services
Deep Learning API and Server in C++14 with support for Caffe and PyTorch
Interactively analyze ML models to understand their behavior
Open-source large-language-model-based code completion engine
Open-source tool designed to enhance the efficiency of workloads
Build cross-modal and multimodal applications on the cloud
A simulator for drones, cars and more, built on Unreal Engine
A fast and stable captcha auto-solving server with an API.
Your gateway to GPT writing
TensorFlowOnSpark brings TensorFlow programs to Apache Spark clusters
Fast C++ library for GPU linear algebra & scientific computing
Leading free and open-source face recognition system
CPU/GPU inference server for Hugging Face transformer models