High-performance neural network inference framework for mobile
MNN is a blazing-fast, lightweight deep learning framework; a minimal usage sketch follows this list
Run Local LLMs on Any Device. Open-source.
C++ library for high-performance inference on NVIDIA GPUs
Deep learning inference framework optimized for mobile platforms
Uniform deep learning inference framework for mobile
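To make the "inference framework" descriptions above concrete, here is a minimal sketch of loading and running a model with MNN's C++ Session API, the only framework named in the list. The model path `model.mnn`, the dummy input data, and the CPU/thread settings are illustrative assumptions, not details taken from the descriptions above.

```cpp
#include <cstdio>
#include <memory>
#include <MNN/Interpreter.hpp>
#include <MNN/Tensor.hpp>

int main() {
    // Load a converted .mnn model; "model.mnn" is a placeholder path.
    std::shared_ptr<MNN::Interpreter> net(
        MNN::Interpreter::createFromFile("model.mnn"));
    if (!net) {
        std::printf("failed to load model\n");
        return 1;
    }

    // Schedule on CPU with 4 threads (backend choice is illustrative).
    MNN::ScheduleConfig config;
    config.type      = MNN_FORWARD_CPU;
    config.numThread = 4;
    MNN::Session* session = net->createSession(config);

    // Copy input data in through a host-side staging tensor.
    MNN::Tensor* input = net->getSessionInput(session, nullptr);
    MNN::Tensor inputHost(input, input->getDimensionType());
    for (int i = 0; i < inputHost.elementSize(); ++i) {
        inputHost.host<float>()[i] = 0.0f; // dummy data
    }
    input->copyFromHostTensor(&inputHost);

    // Run inference and read the output back out the same way.
    net->runSession(session);
    MNN::Tensor* output = net->getSessionOutput(session, nullptr);
    MNN::Tensor outputHost(output, output->getDimensionType());
    output->copyToHostTensor(&outputHost);
    std::printf("first output value: %f\n", outputHost.host<float>()[0]);

    net->releaseSession(session);
    return 0;
}
```

The other frameworks listed follow the same broad pattern of load model, create an execution context, bind input/output tensors, and run, though each has its own API for those steps.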