Run local LLMs on any device; open-source
Create HTML profiling reports from pandas DataFrame objects
The official Python client for the Hugging Face Hub (see the example sketch after this list)
State-of-the-art diffusion models for image and audio generation
Standardized Serverless ML Inference Platform on Kubernetes
LLM.swift is a simple and readable library
OpenAI-style API for open large language models
A unified framework for scalable computing
A Pythonic framework to simplify AI service building
DoWhy is a Python library for causal inference
The unofficial Python package that returns responses from Google Bard
Integrate, train and manage any AI models and APIs with your database
Deep Learning API and Server in C++14 with support for Caffe and PyTorch
LLMFlows - Simple, Explicit and Transparent LLM Apps
Framework for Accelerating LLM Generation with Multiple Decoding Heads
High-level Deep Learning Framework written in Kotlin
Training & Implementation of chatbots leveraging GPT-like architecture
CPU/GPU inference server for Hugging Face transformer models
Deploy an ML inference service on a budget in 10 lines of code
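As a minimal usage sketch for the Hugging Face Hub client listed above: the code assumes the `huggingface_hub` package, and the `gpt2` repository and `config.json` filename are illustrative placeholders, not values taken from this list.

```python
# Minimal sketch of the huggingface_hub client: download one file from a
# public repository on the Hugging Face Hub. The repo_id and filename below
# are illustrative placeholders.
from huggingface_hub import hf_hub_download

# Downloads (and caches) the file locally, returning its filesystem path.
config_path = hf_hub_download(repo_id="gpt2", filename="config.json")
print(config_path)
```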