AICI: Prompts as (Wasm) Programs
Port of OpenAI's Whisper model in C/C++
Run local LLMs on any device; open-source
High-performance neural network inference framework for mobile
ONNX Runtime: cross-platform, high-performance ML inferencing
Port of Facebook's LLaMA model in C/C++
Ready-to-use OCR with 80+ supported languages
User-friendly AI Interface
OpenVINO™ Toolkit repository
A high-throughput and memory-efficient inference and serving engine
C++ library for high-performance inference on NVIDIA GPUs
Self-hosted, community-driven, local OpenAI-compatible API
Protect and discover secrets using Gitleaks
C++ implementation of ChatGLM-6B & ChatGLM2-6B & ChatGLM3 & GLM4(V)
Open standard for machine learning interoperability
LMDeploy is a toolkit for compressing, deploying, and serving LLMs
Library for OCR-related tasks powered by Deep Learning
Deep learning optimization library: makes distributed training easy
Easy-to-use Speech Toolkit including self-supervised learning models
LLMs as Copilots for Theorem Proving in Lean
A general-purpose probabilistic programming system
Data manipulation and transformation for audio signal processing
Phi-3.5 for Mac: locally run Vision and Language Models
A Pythonic framework to simplify AI service building
Operating LLMs in production