User-friendly AI Interface
FlashInfer: Kernel Library for LLM Serving
Everything you need to build state-of-the-art foundation models
High-performance neural network inference framework for mobile
A high-performance inference system for large language models
PyTorch extensions for fast R&D prototyping and Kaggle farming
A comprehensive set of computer vision & machine intelligence libraries
A real-time inference engine for temporal logic specifications
OpenMMLab Video Perception Toolbox
Guide to deploying deep-learning inference networks
Deep learning inference framework optimized for mobile platforms