Run Local LLMs on Any Device. Open-source and available for commercial use.
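Assuming this entry refers to the GPT4All project, a minimal sketch of on-device generation with its Python bindings could look like the following; the model filename is illustrative and is downloaded on first use.

```python
from gpt4all import GPT4All

# Load a quantized model locally (downloaded on first use); runs on CPU or GPU.
model = GPT4All("Meta-Llama-3-8B-Instruct.Q4_0.gguf")

with model.chat_session():
    reply = model.generate("Why run an LLM locally?", max_tokens=256)
    print(reply)
```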
Open-Source AI Camera. Empower any camera/CCTV with state-of-the-art AI.
DoWhy is a Python library for causal inference that supports explicit modeling and testing of causal assumptions.
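A small sketch of DoWhy's model/identify/estimate/refute workflow on its built-in synthetic dataset; the column names come from the dataset generator and the estimator choice is just one of several supported methods.

```python
import dowhy.datasets
from dowhy import CausalModel

# Synthetic data with a known causal effect (beta).
data = dowhy.datasets.linear_dataset(
    beta=10, num_common_causes=3, num_samples=1000, treatment_is_binary=True
)

# 1) Model the causal assumptions as a graph.
model = CausalModel(
    data=data["df"],
    treatment=data["treatment_name"],
    outcome=data["outcome_name"],
    graph=data["gml_graph"],
)

# 2) Identify the estimand, 3) estimate the effect, 4) refute the estimate.
estimand = model.identify_effect()
estimate = model.estimate_effect(
    estimand, method_name="backdoor.propensity_score_matching"
)
refutation = model.refute_estimate(
    estimand, estimate, method_name="placebo_treatment_refuter"
)
print(estimate.value)
print(refutation)
```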
Superduper: Integrate AI models and machine learning workflows with your database to build custom AI applications, without moving your data.
The Triton Inference Server provides an optimized cloud and edge inferencing solution.
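A hedged sketch of querying a model that is already loaded in a running Triton server via its HTTP client; the model name ("my_model") and tensor names ("INPUT0", "OUTPUT0") are placeholders that depend on the model's config.pbtxt.

```python
import numpy as np
import tritonclient.http as httpclient

# Connect to a Triton server exposing the default HTTP port.
client = httpclient.InferenceServerClient(url="localhost:8000")

# Build the input tensor expected by the (hypothetical) model.
data = np.random.rand(1, 16).astype(np.float32)
infer_input = httpclient.InferInput("INPUT0", [1, 16], "FP32")
infer_input.set_data_from_numpy(data)

# Run inference and read back the named output tensor.
result = client.infer(model_name="my_model", inputs=[infer_input])
print(result.as_numpy("OUTPUT0"))
```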
Easy-to-use Speech Toolkit including Self-Supervised Learning models, SOTA/streaming ASR with punctuation, streaming TTS, speaker verification, speech translation and keyword spotting.
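Assuming this entry is PaddleSpeech, a minimal transcription sketch using its Python executor API; "input.wav" is a placeholder audio file.

```python
from paddlespeech.cli.asr.infer import ASRExecutor

asr = ASRExecutor()
# Transcribe a local audio file. The default models are Mandarin-centric;
# other languages and models can be selected via the executor's arguments.
text = asr(audio_file="input.wav")
print(text)
```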
Trainable models and NN optimization tools
Trainable, memory-efficient, and GPU-friendly PyTorch reproduction of AlphaFold 2.
Tensor search for humans
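Assuming this entry is the Marqo tensor search engine, a short sketch against a locally running instance (default Docker port 8882); the index name and documents are illustrative.

```python
import marqo

mq = marqo.Client(url="http://localhost:8882")

mq.create_index("movies")
mq.index("movies").add_documents(
    [{"Title": "Interstellar", "Description": "A crew travels through a wormhole."}],
    tensor_fields=["Description"],  # fields embedded for tensor search
)

# Semantic search over the embedded field.
results = mq.index("movies").search("space exploration film")
print(results["hits"][0]["Title"])
```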
Scripts for fine-tuning Meta Llama3 with composable FSDP & PEFT methods to cover single/multi-node GPUs
LLMFlows - Simple, Explicit and Transparent LLM Apps
The unofficial Python package that returns responses from Google Bard through a cookie value
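A hedged sketch of the bardapi package's basic usage; the cookie value (the __Secure-1PSID from an authenticated Google session) is assumed to be supplied via an environment variable.

```python
import os
from bardapi import Bard

# Authenticate with the session cookie rather than an official API key.
bard = Bard(token=os.environ["_BARD_API_KEY"])

answer = bard.get_answer("Summarize what tensor search is.")
print(answer["content"])
```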
Database system for building simpler and faster AI-powered applications
A toolkit to optimize ML models for deployment with Keras & TensorFlow, including quantization and pruning
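A brief sketch of the TensorFlow Model Optimization Toolkit's two main wrappers, quantization-aware training and magnitude pruning, applied to a small illustrative Keras model.

```python
import tensorflow as tf
import tensorflow_model_optimization as tfmot

base_model = tf.keras.Sequential([
    tf.keras.layers.Dense(64, activation="relu", input_shape=(20,)),
    tf.keras.layers.Dense(1),
])

# Quantization-aware training: emulate int8 inference during training.
qat_model = tfmot.quantization.keras.quantize_model(base_model)
qat_model.compile(optimizer="adam", loss="mse")

# Magnitude pruning: wrap layers so low-magnitude weights are zeroed during training.
pruned_model = tfmot.sparsity.keras.prune_low_magnitude(
    tf.keras.models.clone_model(base_model)
)
pruned_model.compile(optimizer="adam", loss="mse")
```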
MII makes low-latency and high-throughput inference possible, powered by DeepSpeed
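A hedged sketch of DeepSpeed-MII's non-persistent pipeline API; the Hugging Face model name is an example and a supported GPU setup is assumed.

```python
import mii

# Load the model with DeepSpeed inference optimizations.
pipe = mii.pipeline("mistralai/Mistral-7B-v0.1")

# Batched generation over multiple prompts.
responses = pipe(["DeepSpeed is", "Low-latency inference means"], max_new_tokens=64)
print(responses)
```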
A computer vision framework to create and deploy apps in minutes