
FLAN-T5 vs. MPT-7B

LLM Comparison

FLAN-T5

Overview

FLAN-T5 is an instruction-finetuned version of Google's popular T5 model. As stated in the model repository's introduction, compared to T5, FLAN-T5 is "just better at everything." With its permissive license, FLAN-T5 has become a popular choice as a starting instruct model.

Initial release: 2022-12-06
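
As a rough illustration of how FLAN-T5 is typically used, the sketch below loads a checkpoint through the Hugging Face transformers library and runs a single instruction. The google/flan-t5-xl checkpoint name, prompt, and generation settings are assumptions chosen for illustration; any of the released sizes can be substituted.

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumed checkpoint; swap in another FLAN-T5 size (e.g. google/flan-t5-xxl) as needed.
model_name = "google/flan-t5-xl"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

# FLAN-T5 is a seq2seq model, so the instruction goes in as the encoder input.
prompt = "Translate to German: How old are you?"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))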

MPT-7B

Overview

MPT-7B is a set of models that are part of MosaicML's Foundation Series. The developers state that MPT-7B, trained on 1T tokens, matches the quality of LLaMA-7B while also being open source. In addition to the base model, the developers also offer MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which is trained with a context length of 65K tokens.

Initial release: 2023-05-05
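
For comparison, a similar sketch for MPT-7B-Instruct through the transformers library is shown below. The mosaicml/mpt-7b-instruct checkpoint name and prompt are assumptions for illustration, and because MPT ships its own modeling code, loading it may require trust_remote_code=True.

import torch
from transformers import AutoTokenizer, AutoModelForCausalLM

# Assumed checkpoint; the base, chat, and storywriter variants load the same way.
model_name = "mosaicml/mpt-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    torch_dtype=torch.bfloat16,  # reduce memory use; requires a bf16-capable device
    trust_remote_code=True,      # MPT defines its own model class in the repo
)

# MPT is a decoder-only model, so the prompt is simply continued by the model.
prompt = "List three uses of open-source LLMs."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))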


                FLAN-T5         MPT-7B
License         Apache 2.0      Apache 2.0
Model Sizes     3B, 11B         7B