Training and deploying machine learning models on Amazon SageMaker
Run Local LLMs on Any Device. Open-source
A high-throughput and memory-efficient inference and serving engine
The official Python client for the Huggingface Hub
Unified Model Serving Framework
Uncover insights, surface problems, monitor, and fine-tune your LLM
Efficient few-shot learning with Sentence Transformers
State-of-the-art diffusion models for image and audio generation
Single-cell analysis in Python
A library for accelerating Transformer models on NVIDIA GPUs
Powering Amazon's custom machine learning chips
Easiest and laziest way to build multi-agent LLM applications
Visual Instruction Tuning: Large Language-and-Vision Assistant
A Pythonic framework to simplify AI service building
Python Package for ML-Based Heterogeneous Treatment Effects Estimation
20+ high-performance LLMs with recipes to pretrain and finetune at scale
Trainable models and NN optimization tools
Standardized Serverless ML Inference Platform on Kubernetes
Replace OpenAI GPT with another LLM in your app
An MLOps framework to package, deploy, monitor and manage models
AIMET is a library that provides advanced quantization and compression techniques for trained neural network models
A lightweight vision library for performing large-scale object detection
Uplift modeling and causal inference with machine learning algorithms
Optimizing inference proxy for LLMs
Everything you need to build state-of-the-art foundation models