
MPT

Developer: MosaicML
Initial Release: 2023-05-05
Overview
MPT-7B and MPT-30B are models in MosaicML's Foundation Series. Both were trained on 1T tokens; the developers state that MPT-7B matches the quality of LLaMA-7B while remaining open source, and that MPT-30B outperforms the original GPT-3. In addition to the base models, the developers also offer MPT-Instruct, MPT-Chat, and MPT-7B-StoryWriter-65k+, the last of which is trained with a context length of 65K tokens.
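The checkpoints can be loaded with standard open-source tooling. The snippet below is a minimal sketch rather than an excerpt from MosaicML's documentation: it assumes the checkpoints are published on the Hugging Face Hub under the mosaicml organization (e.g. mosaicml/mpt-7b), that MPT's custom model class is enabled via trust_remote_code=True, and that the models reuse the EleutherAI GPT-NeoX tokenizer.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hub id; swap for mpt-30b, mpt-7b-instruct, mpt-7b-chat, etc.
model_name = "mosaicml/mpt-7b"

# MPT ships a custom model class, so remote code must be trusted when loading.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
)

# Simple greedy generation from a short prompt.
prompt = "MosaicML's MPT models are"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))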
Description
Open Source: Yes
Instruct Tuned: Yes
Model Sizes: 7B, 30B
Finetuning: Yes
License: Apache 2.0
Pricing: -

Unsure? Contact us with a brief description of your use case if you'd like us to make a snap assessment. Depending on your requirements, a smaller, custom language model may even be the best option.