Mobile AI Compute Engine (MACE for short) is a deep learning inference framework optimized for mobile heterogeneous computing on Android, iOS, Linux and Windows devices. The runtime is optimized with NEON, OpenCL and Hexagon, and the Winograd algorithm is used to speed up convolution operations. Initialization is also optimized to be faster. Chip-dependent power options such as big.LITTLE scheduling and Adreno GPU hints are exposed as advanced APIs. Guaranteeing UI responsiveness is sometimes obligatory when running a model, so mechanisms such as automatically breaking OpenCL kernels into small units are introduced to allow better preemption by the UI rendering task. Graph-level memory allocation optimization and buffer reuse are supported. The core library keeps external dependencies to a minimum to keep the library footprint small.

Features

  • Model protection has been the highest priority since the beginning of the design
  • Various techniques are used to protect models, such as converting models to C++ code and literal obfuscation
  • Good coverage of recent Qualcomm, MediaTek, Pinecone and other ARM-based chips
  • CPU runtime supports Android, iOS and Linux
  • TensorFlow, Caffe and ONNX model formats are supported
  • Runtime is optimized with NEON, OpenCL and Hexagon


License

Apache License V2.0



Additional Project Details

Operating Systems

Android, Apple iPhone, Linux, Windows

Programming Language

C++

Related Categories

C++ Frameworks, C++ Machine Learning Software, C++ Deep Learning Frameworks, C++ LLM Inference Tool

Registered

2021-12-13