Run local LLMs on any device, open-source
Create HTML profiling reports from pandas DataFrame objects
The official Python client for the Hugging Face Hub
State-of-the-art diffusion models for image and audio generation
Standardized Serverless ML Inference Platform on Kubernetes
OpenAI-style API for open large language models
A unified framework for scalable computing
A Pythonic framework to simplify AI service building
DoWhy is a Python library for causal inference
The unofficial Python package that returns responses from Google Bard
Integrate, train and manage any AI models and APIs with your database
LLMFlows - Simple, Explicit and Transparent LLM Apps
Framework for Accelerating LLM Generation with Multiple Decoding Heads
Lightweight anchor-free object detection model
Training and implementation of chatbots leveraging a GPT-like architecture
CPU/GPU inference server for Hugging Face transformer models
Deploy an ML inference service on a budget in 10 lines of code
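To make the "minimal ML inference service" idea above concrete, here is a hedged sketch using only the Python standard library. It is not the API of any listed project: the `model` function is a hypothetical stand-in (a fixed weighted sum), and the HTTP layer is plain `http.server` accepting a JSON `{"features": [...]}` POST body.

```python
# Minimal sketch of a budget ML inference service (stdlib only).
# Assumption: "model" is a hypothetical linear scorer, not a real repo's API.
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer


def model(features):
    # Hypothetical model: weighted sum of the input features.
    weights = [0.5, -0.25, 1.0]
    return sum(w * x for w, x in zip(weights, features))


class InferenceHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the JSON request body and run the model on "features".
        length = int(self.headers.get("Content-Length", 0))
        payload = json.loads(self.rfile.read(length))
        body = json.dumps({"prediction": model(payload["features"])}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        # Keep the example quiet: suppress per-request logging.
        pass


# Bind to an OS-assigned free port and serve in a background thread.
server = HTTPServer(("127.0.0.1", 0), InferenceHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
```

A client then POSTs `{"features": [2.0, 4.0, 1.0]}` to `http://127.0.0.1:<port>/` and reads back `{"prediction": ...}`; real deployments would add batching, model loading, and health checks on top of this skeleton.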