
MPT vs. OPT

LLM Comparison


MPT

Overview

MPT-7B and MPT-30B are models in MosaicML's Foundation Series, each trained on 1T tokens. The developers state that MPT-7B matches the performance of LLaMA while also being open source, and that MPT-30B outperforms the original GPT-3. In addition to the base models, the developers offer MPT-Instruct, MPT-Chat, and MPT-7B-StoryWriter-65k+ variants, the last of which is trained with a context length of 65K tokens.



Initial release: 2023-05-05
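
The MPT checkpoints can be loaded through the Hugging Face transformers library. Below is a minimal sketch, assuming the mosaicml/mpt-7b checkpoint on the Hugging Face Hub; MPT ships a custom model class, so remote code must be trusted, and per the model card it reuses the EleutherAI gpt-neox-20b tokenizer.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# MPT defines a custom architecture, so trust_remote_code=True is required.
model = AutoModelForCausalLM.from_pretrained(
    "mosaicml/mpt-7b",
    trust_remote_code=True,
)
# MPT-7B reuses the EleutherAI gpt-neox-20b tokenizer.
tokenizer = AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")

# Generate a short continuation from a prompt.
inputs = tokenizer("MosaicML's MPT models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```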

OPT

Overview

Open Pre-trained Transformer Language Models (OPT) is a family of open-source models designed to replicate GPT-3, with a similar decoder-only architecture. It has since been superseded by models such as LLaMA, GPT-J, and Pythia.



Initial release: 2022-05-03
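
OPT is a natively supported architecture in the Hugging Face transformers library, so no custom code is needed. Below is a minimal sketch, assuming the facebook/opt-1.3b checkpoint, one of the smaller sizes listed in the comparison table below.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# OPT is supported out of the box; no trust_remote_code flag is needed.
model = AutoModelForCausalLM.from_pretrained("facebook/opt-1.3b")
tokenizer = AutoTokenizer.from_pretrained("facebook/opt-1.3b")

# Generate a short continuation from a prompt.
inputs = tokenizer("Open Pre-trained Transformer models were", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```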

Further Reading

More Comparisons

MPT

OPT

| Products & Features | MPT        | OPT |
|---------------------|------------|-----|
| Instruct Models     | Yes        |     |
| Coding Capability   |            |     |
| Customization       |            |     |
| Finetuning          |            |     |
| Open Source         | Yes        | Yes |
| License             | Apache 2.0 | NA  |
| Model Sizes         | 7B, 30B    | 1.3B, 2.7B, 6.7B, 13B, 30B, 66B, 175B |