Training and deploying machine learning models on Amazon SageMaker
Run local LLMs on any device; open-source
Port of Facebook's LLaMA model in C/C++
Single-cell analysis in Python
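A minimal clustering sketch using scanpy's standard preprocessing functions and its downloadable pbmc3k demo dataset; the parameter values are illustrative, not a recommended pipeline:

```python
import scanpy as sc

adata = sc.datasets.pbmc3k()                 # AnnData object (cells x genes), downloads a small demo dataset
sc.pp.filter_cells(adata, min_genes=200)     # drop near-empty cells
sc.pp.filter_genes(adata, min_cells=3)       # drop rarely detected genes
sc.pp.normalize_total(adata, target_sum=1e4)
sc.pp.log1p(adata)
sc.tl.pca(adata)
sc.pp.neighbors(adata)
sc.tl.leiden(adata)                          # clustering; requires the optional leidenalg dependency
print(adata.obs["leiden"].value_counts())
```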
Ready-to-use OCR with 80+ supported languages
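A small usage sketch; the image path and language choice are placeholders:

```python
import easyocr

reader = easyocr.Reader(['en'])              # loads detection + recognition models for English
results = reader.readtext('receipt.jpg')     # list of (bounding box, text, confidence) tuples
for bbox, text, confidence in results:
    print(f"{confidence:.2f}  {text}")
```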
Everything you need to build state-of-the-art foundation models
A high-throughput and memory-efficient inference and serving engine
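A hedged offline-inference sketch with vLLM; the model identifier is an example and a CUDA-capable machine is assumed:

```python
from vllm import LLM, SamplingParams

llm = LLM(model="facebook/opt-125m")                       # any supported Hugging Face causal LM id
params = SamplingParams(temperature=0.8, max_tokens=64)
outputs = llm.generate(["The capital of France is"], params)
for output in outputs:
    print(output.outputs[0].text)                          # first completion for each prompt
```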
Operating LLMs in production
The official Python client for the Hugging Face Hub
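A minimal download sketch; the repo_id and filename are illustrative:

```python
from huggingface_hub import hf_hub_download

# Fetch one file from a Hub repository into the local cache and return its path.
path = hf_hub_download(repo_id="bert-base-uncased", filename="config.json")
print(path)
```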
DoWhy is a Python library for causal inference
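A hedged sketch of DoWhy's model / identify / estimate workflow on synthetic data; the column names and data-generating process are invented for illustration:

```python
import numpy as np
import pandas as pd
from dowhy import CausalModel

rng = np.random.default_rng(0)
n = 500
age = rng.normal(40, 10, n)                                  # confounder
treatment = (age + rng.normal(0, 5, n) > 40).astype(int)     # treatment depends on age
outcome = 2.0 * treatment + 0.1 * age + rng.normal(0, 1, n)  # true effect is 2.0
df = pd.DataFrame({"treatment": treatment, "outcome": outcome, "age": age})

model = CausalModel(data=df, treatment="treatment", outcome="outcome", common_causes=["age"])
estimand = model.identify_effect()
estimate = model.estimate_effect(estimand, method_name="backdoor.linear_regression")
print(estimate.value)  # should recover roughly 2.0
```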
FlashInfer: Kernel Library for LLM Serving
Gaussian processes in TensorFlow
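A toy GP regression sketch on synthetic data, assuming the GPflow 2.x API:

```python
import numpy as np
import gpflow

X = np.linspace(0, 10, 50).reshape(-1, 1)
Y = np.sin(X) + 0.1 * np.random.randn(50, 1)

# Exact GP regression with a squared-exponential kernel.
model = gpflow.models.GPR(data=(X, Y), kernel=gpflow.kernels.SquaredExponential())
gpflow.optimizers.Scipy().minimize(model.training_loss, model.trainable_variables)

mean, var = model.predict_f(np.array([[5.0]]))  # posterior mean and variance at a test point
print(mean.numpy(), var.numpy())
```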
LMDeploy is a toolkit for compressing, deploying, and serving LLMs
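A hedged sketch of LMDeploy's high-level pipeline API; the model name is an example and a GPU is assumed:

```python
from lmdeploy import pipeline

# Build an inference pipeline from a supported Hugging Face model id (example only).
pipe = pipeline("internlm/internlm2_5-7b-chat")
responses = pipe(["Hi, please introduce yourself", "Shanghai is"])
print(responses)
```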
Python Package for ML-Based Heterogeneous Treatment Effects Estimation
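A hedged sketch of heterogeneous treatment effect (CATE) estimation with EconML's LinearDML on synthetic data; the data-generating process is invented for illustration:

```python
import numpy as np
from econml.dml import LinearDML

rng = np.random.default_rng(0)
n = 1000
X = rng.normal(size=(n, 1))            # effect modifiers
W = rng.normal(size=(n, 3))            # confounders
T = rng.binomial(1, 0.5, size=n)       # binary treatment
Y = (1.0 + 2.0 * X[:, 0]) * T + W[:, 0] + rng.normal(size=n)

est = LinearDML(discrete_treatment=True)
est.fit(Y, T, X=X, W=W)
print(est.effect(X[:5]))               # per-row treatment effect estimates
```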
Adversarial Robustness Toolbox (ART) - Python Library for ML security
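A hedged sketch of crafting FGSM adversarial examples with ART against a scikit-learn classifier; the dataset and model are illustrative choices:

```python
import numpy as np
from sklearn.datasets import load_iris
from sklearn.linear_model import LogisticRegression
from art.estimators.classification import SklearnClassifier
from art.attacks.evasion import FastGradientMethod

X, y = load_iris(return_X_y=True)
model = LogisticRegression(max_iter=1000).fit(X, y)

classifier = SklearnClassifier(model=model)                 # wrap the fitted model for ART
attack = FastGradientMethod(estimator=classifier, eps=0.2)  # fast gradient sign method
X_adv = attack.generate(x=X.astype(np.float32))
print("accuracy on adversarial inputs:", model.score(X_adv, y))
```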
A library to communicate with ChatGPT, Claude, Copilot, and Gemini
An unofficial Python package that returns responses from Google Bard
AI interface for tinkerers (Ollama, Haystack RAG, Python)
Uplift modeling and causal inference with machine learning algorithms
An easy-to-use LLM quantization package with user-friendly APIs
Neural Network Compression Framework for enhanced OpenVINO inference
Official inference library for Mistral models
A high-performance ML model serving framework that offers dynamic batching
A unified framework for scalable computing
A Pythonic framework to simplify AI service building