MPT-7B vs. RedPajama-INCITE

LLM Comparison

MPT-7B

Overview

MPT-7B is a family of models in MosaicML's Foundation Series. Trained on 1T tokens, the base model is stated by its developers to match the performance of LLaMA-7B while also being open source. In addition to the base model, MosaicML also offers MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which was fine-tuned with a context length of 65k tokens.

Initial release: 2023-05-05
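
As a minimal sketch, the MPT-7B checkpoints can be loaded with the Hugging Face transformers library (model IDs are MosaicML's published Hub names; the dtype and generation settings here are illustrative, not prescribed by MosaicML):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Base model; mosaicml/mpt-7b-instruct, -chat, and -storywriter are loaded the same way.
name = "mosaicml/mpt-7b"

tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(
    name,
    torch_dtype=torch.bfloat16,     # illustrative; pick a dtype for your hardware
    trust_remote_code=True,         # MPT ships a custom model class on the Hub
)

inputs = tokenizer("MosaicML's MPT-7B is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```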

RedPajama-INCITE

Overview

RedPajama-INCITE is the first family of models trained on the RedPajama base dataset. The goal of the RedPajama-INCITE models is to replicate the LLaMA recipe while making the models fully open source under the Apache 2.0 license. As of the initial release, the developers describe the 3B parameter model as the best in its class, with the 7B parameter model still in progress.

Initial release: 2023-05-05
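
A similar hedged sketch applies to the RedPajama-INCITE checkpoints (the model ID below is Together's published Hub name; since the models use the standard GPT-NeoX architecture, no custom remote code is needed, and the sampling settings are illustrative):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# 3B base model; instruct and chat variants follow the same naming scheme.
name = "togethercomputer/RedPajama-INCITE-Base-3B-v1"

tokenizer = AutoTokenizer.from_pretrained(name)
model = AutoModelForCausalLM.from_pretrained(name, torch_dtype=torch.bfloat16)

inputs = tokenizer("RedPajama is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```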


Products & Features

                      MPT-7B        RedPajama-INCITE
Instruct Models       Yes           Yes
Coding Capability     —             —
Customization:
  Finetuning          Yes           Yes
Open Source           Yes           Yes
License               Apache 2.0    Apache 2.0
Model Sizes           7B            3B, 7B