Audience

AI developers interested in an LLM with a 100M token context window

About LTM-2-mini

LTM-2-mini is a model with a 100M token context window. 100M tokens equals roughly 10 million lines of code or 750 novels.

For each decoded token, LTM-2-mini’s sequence-dimension algorithm is roughly 1000x cheaper than the attention mechanism in Llama 3.1 405B for a 100M token context window.

The difference in memory requirements is even larger: running Llama 3.1 405B with a 100M token context requires 638 H100s per user just to store a single 100M token KV cache. LTM-2-mini, by contrast, requires a small fraction of a single H100’s HBM per user for the same context.
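The scale of that KV-cache figure can be sanity-checked with a back-of-the-envelope calculation. The sketch below assumes Llama 3.1 405B's published architecture (126 layers, 8 grouped-query KV heads, head dimension 128) and fp16 storage; these numbers are assumptions for illustration, not taken from this page.

```python
# Rough estimate of KV-cache memory for Llama 3.1 405B at a 100M-token
# context. Architecture constants below are assumptions based on the
# published model configuration.

CONTEXT_TOKENS = 100_000_000
LAYERS = 126           # transformer layers (assumed)
KV_HEADS = 8           # grouped-query attention KV heads (assumed)
HEAD_DIM = 128         # per-head dimension (assumed)
BYTES_PER_VALUE = 2    # fp16
H100_HBM_BYTES = 80e9  # 80 GB of HBM per H100

# Each token stores one key and one value vector per KV head, per layer.
bytes_per_token = 2 * KV_HEADS * HEAD_DIM * BYTES_PER_VALUE * LAYERS
total_bytes = bytes_per_token * CONTEXT_TOKENS
gpus_needed = total_bytes / H100_HBM_BYTES

print(f"{bytes_per_token} bytes/token, "
      f"{total_bytes / 1e12:.1f} TB total, "
      f"~{gpus_needed:.0f} H100s")
```

Under these assumptions the cache works out to about 0.5 MB per token and roughly 52 TB in total, i.e. on the order of 640 H100s' worth of HBM, which is in the same ballpark as the 638 figure quoted above; the exact count depends on precision and configuration details.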

Integrations

No integrations listed.

Ratings/Reviews

Overall 0.0 / 5
Ease 0.0 / 5
Features 0.0 / 5
Design 0.0 / 5
Support 0.0 / 5

This software hasn't been reviewed yet.

Company Information

Magic AI
Founded: 2022
United States
magic.dev/

Other Useful Business Software
Go From AI Idea to AI App Fast Icon
Go From AI Idea to AI App Fast

One platform to build, fine-tune, and deploy ML models. No MLOps team required.

Access Gemini 3 and 200+ models. Build chatbots, agents, or custom models with built-in monitoring and scaling.
Try Free

Product Details

Platforms Supported
Cloud

LTM-2-mini Frequently Asked Questions

Q: What kinds of users and organization types does LTM-2-mini work with?
Q: What languages does LTM-2-mini support in their product?
