MPT vs. Pythia

LLM Comparison


MPT

Overview

MPT-7B and MPT-30B are open-source models in MosaicML's Foundation Series, trained on 1T tokens. The developers state that MPT-7B matches the performance of LLaMA-7B, and that MPT-30B outperforms the original GPT-3. In addition to the base models, the developers offer MPT-Instruct, MPT-Chat, and MPT-7B-StoryWriter-65k+, the last of which is trained with a context length of 65K tokens.



Initial release: 2023-05-05
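
As a quick usage sketch, the base model and its variants can be loaded with Hugging Face transformers. The mosaicml/mpt-7b repo ID and the trust_remote_code requirement below reflect how the checkpoints were distributed at release; treat both as assumptions to verify against the current model cards.

```python
# Minimal sketch: load MPT-7B and generate text with Hugging Face transformers.
# Assumes the checkpoint is published on the Hub as "mosaicml/mpt-7b"; the
# instruct, chat, and StoryWriter variants follow the same repo-ID pattern.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mosaicml/mpt-7b"  # e.g. "mosaicml/mpt-7b-instruct" for the instruct variant

tokenizer = AutoTokenizer.from_pretrained(model_id)
# At release, MPT shipped custom modeling code, hence trust_remote_code=True.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

inputs = tokenizer("MosaicML's MPT models are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```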

Pythia

Overview

Pythia, the most recent (as of May 2023) effort from EleutherAI, is a suite of LLMs trained on The Pile. While it appears to outperform OPT and GPT-Neo, its performance relative to GPT-J is unclear. Versions of Pythia have also been instruct-tuned by the team at Together.



Initial release: 2023-02-13
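
A similar sketch applies to Pythia, assuming the suite is published on the Hugging Face Hub under repo IDs of the form EleutherAI/pythia-&lt;size&gt; (the sizes listed in the table below):

```python
# Minimal sketch: load a Pythia checkpoint and generate text.
# Assumes repo IDs of the form "EleutherAI/pythia-<size>".
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/pythia-2.8b"  # other sizes include 1b, 1.4b, 6.9b, 12b

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("The Pile is a dataset that", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```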

Further Reading

Looking for an LLM API/SDK that works out of the box, with no prompts or ad hoc guardrails required? Check out the Sapling API.
More Comparisons

MPT

Pythia

             MPT         Pythia
License      Apache 2.0  Apache 2.0
Model Sizes  7B, 30B     1B, 1.4B, 2.8B, 6.9B, 12B