Run Local LLMs on Any Device. Open-source
Port of Facebook's LLaMA model in C/C++
User-friendly AI Interface
An Open-Source Programming Framework for Agentic AI
Official inference library for Mistral models
Deep learning optimization library that makes distributed training easy
Phi-3.5 for Mac: Locally-run Vision and Language Models
Run local LLMs such as Llama, DeepSeek, and Kokoro inside your browser
Superduper: Integrate AI models and machine learning workflows
A framework dedicated to neural data processing
A real-time inference engine for temporal logic specifications
A computer vision framework to create and deploy apps in minutes
High-level Deep Learning Framework written in Kotlin
Implementation of model-parallel autoregressive transformers on GPUs