User-friendly AI Interface
Run local LLMs on any device; open source
OpenVINO™ Toolkit repository
Standardized Serverless ML Inference Platform on Kubernetes
A Pythonic framework to simplify AI service building
Phi-3.5 for Mac: Locally-run Vision and Language Models
The free, open-source alternative to OpenAI, Claude, and others
Simplifies the local serving of AI models from any source
AI interface for tinkerers (Ollama, Haystack RAG, Python)
Data manipulation and transformation for audio signal processing
A scalable inference server for models optimized with OpenVINO
An RWKV management and startup tool; fully automated, only 8 MB
A Unified Library for Parameter-Efficient Learning
Run local LLMs such as Llama, DeepSeek, and Kokoro inside your browser
DoWhy is a Python library for causal inference
20+ high-performance LLMs with recipes to pretrain and finetune at scale
Libraries for applying sparsification recipes to neural networks
A library for accelerating Transformer models on NVIDIA GPUs
Lightweight, standalone C++ inference engine for Google's Gemma models
LLM training code for MosaicML foundation models
An MLOps framework to package, deploy, monitor, and manage models
An Open-Source Programming Framework for Agentic AI
Tensor search for humans
Powering Amazon custom machine learning chips
Build Production-ready Agentic Workflow with Natural Language