Audience

AI and LLM developers and engineers

About MPT-7B

Introducing MPT-7B, the latest entry in our MosaicML Foundation Series. MPT-7B is a transformer trained from scratch on 1T tokens of text and code. It is open source, available for commercial use, and matches the quality of LLaMA-7B. MPT-7B was trained on the MosaicML platform in 9.5 days with zero human intervention at a cost of ~$200k.

Now you can train, finetune, and deploy your own private MPT models, either starting from one of our checkpoints or training from scratch. For inspiration, we are also releasing three finetuned models in addition to the base MPT-7B: MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which uses a context length of 65k tokens!
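As a quick way to try the released checkpoints, the models can be loaded with the Hugging Face `transformers` library. This is a minimal sketch, assuming `transformers` is installed and the Hub is reachable; because MPT uses a custom model architecture, loading requires `trust_remote_code=True`:

```python
# Sketch: loading an MPT-7B variant via Hugging Face Transformers.
# Checkpoint names below are the MosaicML releases on the Hugging Face Hub.
CHECKPOINTS = {
    "base": "mosaicml/mpt-7b",
    "instruct": "mosaicml/mpt-7b-instruct",
    "chat": "mosaicml/mpt-7b-chat",
    "storywriter": "mosaicml/mpt-7b-storywriter",
}

def load_mpt(variant: str = "base"):
    """Download and return (tokenizer, model) for an MPT-7B variant.

    MPT's architecture ships as custom code on the Hub, so
    trust_remote_code=True is required when loading the model.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    name = CHECKPOINTS[variant]
    tokenizer = AutoTokenizer.from_pretrained(name)
    model = AutoModelForCausalLM.from_pretrained(name, trust_remote_code=True)
    return tokenizer, model
```

Note that the 7B-parameter weights are large; for inference on a single GPU you would typically also pass a reduced-precision `torch_dtype` to `from_pretrained`.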

Pricing

Starting Price: Free
Pricing Details: Open source
Free Version: Available


Company Information

MosaicML
Founded: 2021
United States
www.mosaicml.com/blog/mpt-7b


Product Details

Platforms Supported
SaaS
Windows
Mac
Linux
On-Premises
Training
Documentation

MPT-7B Frequently Asked Questions

Q: What kinds of users and organization types does MPT-7B work with?
Q: What languages does MPT-7B support in its product?
Q: What type of training does MPT-7B provide?
Q: How much does MPT-7B cost?

MPT-7B Product Features