MiMo-V2-Flash

Xiaomi Technology


About LongLLaMA

This repository contains the research preview of LongLLaMA, a large language model capable of handling long contexts of 256k tokens or more. LongLLaMA is built upon the foundation of OpenLLaMA and fine-tuned using the Focused Transformer (FoT) method; the LongLLaMA code is built upon Code Llama. We release a smaller 3B base variant (not instruction tuned) of the LongLLaMA model under a permissive license (Apache 2.0), along with inference code supporting longer contexts, on Hugging Face. Our model weights can serve as a drop-in replacement for LLaMA in existing implementations (for short contexts up to 2048 tokens). Additionally, we provide evaluation results and comparisons against the original OpenLLaMA models.
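
The Focused Transformer idea behind LongLLaMA (extending effective context by letting attention also read a cache of key/value pairs from earlier segments) can be sketched in miniature. This is a toy single-head NumPy illustration, not the released implementation; all names are illustrative.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def fot_attention(q, local_k, local_v, mem_k, mem_v):
    """A query attends over the local context plus a memory cache of
    (key, value) pairs from earlier segments -- the core idea that lets
    the effective context exceed the training window."""
    k = np.concatenate([mem_k, local_k], axis=0)  # (M + L, d)
    v = np.concatenate([mem_v, local_v], axis=0)  # (M + L, d)
    scores = q @ k.T / np.sqrt(q.shape[-1])       # (1, M + L)
    return softmax(scores, axis=-1) @ v           # (1, d)

rng = np.random.default_rng(0)
d = 8
q = rng.normal(size=(1, d))
out = fot_attention(q,
                    rng.normal(size=(4, d)), rng.normal(size=(4, d)),    # local window
                    rng.normal(size=(16, d)), rng.normal(size=(16, d)))  # memory cache
print(out.shape)  # (1, 8)
```

Growing the memory cache enlarges the attended context without changing the model's local window, which is why the weights stay drop-in compatible for short contexts.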

About MiMo-V2-Flash

MiMo-V2-Flash is an open-weight large language model developed by Xiaomi on a Mixture-of-Experts (MoE) architecture that blends high performance with inference efficiency. It has 309 billion total parameters but activates only 15 billion per forward pass, letting it balance reasoning quality against computational cost while supporting extremely long contexts for tasks such as long-document understanding, code generation, and multi-step agent workflows. A hybrid attention mechanism interleaves sliding-window and global attention layers to reduce memory usage while preserving long-range comprehension, and a Multi-Token Prediction (MTP) design accelerates inference by predicting several tokens per decoding step. MiMo-V2-Flash delivers fast generation (up to roughly 150 tokens/second) and is optimized for agentic applications that require sustained reasoning and multi-turn interactions.
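
The interleaving of sliding-window and global attention layers can be illustrated with a small mask-building sketch in NumPy. The window size and the 3:1 layer pattern below are hypothetical placeholders, not Xiaomi's published configuration:

```python
import numpy as np

def attention_mask(seq_len, window=None):
    """Boolean causal mask; if `window` is set, each token may only see
    the previous `window` tokens (sliding-window attention)."""
    i = np.arange(seq_len)[:, None]  # query positions
    j = np.arange(seq_len)[None, :]  # key positions
    mask = j <= i                    # causal: no looking ahead
    if window is not None:
        mask &= j > i - window       # limit lookback to the window
    return mask

seq_len = 8
# Hypothetical 3:1 interleave: three sliding-window layers (window=4),
# then one global layer, repeated through the stack.
layers = [attention_mask(seq_len, window=4) if layer % 4 != 3
          else attention_mask(seq_len)
          for layer in range(8)]

# Windowed layers attend to far fewer key/value pairs than global ones.
print(int(layers[0].sum()), int(layers[3].sum()))  # 26 36
```

A windowed layer's key/value cache stays bounded by the window size regardless of sequence length, which is where the memory savings come from; the occasional global layers preserve long-range comprehension.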

Platforms Supported

Windows
Mac
Linux
Cloud
On-Premises
iPhone
iPad
Android
Chromebook


Audience

LongLLaMA: Users interested in a powerful Large Language Model solution

MiMo-V2-Flash: Developers and researchers building high-performance AI applications that involve long-context reasoning, coding, and agentic workflows

Support

Phone Support
24/7 Live Support
Online


API

Offers API



Pricing

Free
Free Version
Free Trial


Reviews/Ratings

Overall 0.0 / 5
ease 0.0 / 5
features 0.0 / 5
design 0.0 / 5
support 0.0 / 5

This software hasn't been reviewed yet. Be the first to provide a review:

Review this Software


Training

Documentation
Webinars
Live Online
In Person


Company Information

LongLLaMA
github.com/CStanKonrad/long_llama

Xiaomi Technology
Founded: 2010
China
mimo.xiaomi.com/blog/mimo-v2-flash

Alternatives

LongLLaMA: Llama 2 (Meta)

MiMo-V2-Flash: Kimi K2 Thinking (Moonshot AI), Olmo 3 (Ai2), Xiaomi MiMo (Xiaomi Technology), GLM-4.5 (Z.ai), DeepSeek-V2 (DeepSeek), Hermes 3 (Nous Research)

Integrations

Claude Code
Hugging Face
Xiaomi MiMo
Xiaomi MiMo Studio

Claim LongLLaMA and update features and information
Claim MiMo-V2-Flash and update features and information