
GPT-J vs. MPT-7B

LLM Comparison

GPT-J

Overview

GPT-J is a model released by EleutherAI shortly after its release of GPT-Neo, with the aim of developing an open-source model with capabilities similar to OpenAI's GPT-3. Larger than GPT-Neo, GPT-J also performs better on various benchmarks.

Initial release: 2021-06-09
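
GPT-J's weights are published on the Hugging Face Hub, so it can be run locally with the `transformers` library. Below is a minimal, hedged sketch assuming the `EleutherAI/gpt-j-6b` checkpoint ID and a GPU with enough memory for the ~12 GB of float16 weights.

```python
# Minimal sketch: text generation with GPT-J via Hugging Face transformers.
# Assumes the "EleutherAI/gpt-j-6b" checkpoint ID and sufficient GPU memory.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "EleutherAI/gpt-j-6b"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # halve memory relative to float32
    device_map="auto",          # place weights on available GPU(s)
)

prompt = "EleutherAI released GPT-J in order to"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, temperature=0.8)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```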


MPT-7B

Overview

MPT-7B is a set of models that are part of MosaicML's Foundation Series. The developers state that MPT-7B, trained on 1T tokens, matches the performance of LLaMA while also being open source. In addition to the base model, the developers also offer MPT-7B-Instruct, MPT-7B-Chat, and MPT-7B-StoryWriter-65k+, the last of which is trained with a context length of 65K tokens.

Initial release: 2023-05-05
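
The MPT-7B checkpoints are likewise available on the Hugging Face Hub. The sketch below is a hedged example assuming the `mosaicml/mpt-7b-instruct` checkpoint ID; the MPT models ship custom modeling code on the Hub, so `trust_remote_code=True` is needed when loading them with `transformers`.

```python
# Minimal sketch: running MPT-7B-Instruct via Hugging Face transformers.
# Assumes the "mosaicml/mpt-7b-instruct" checkpoint ID; MPT uses custom model
# code hosted on the Hub, hence trust_remote_code=True.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mosaicml/mpt-7b-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

prompt = "Explain the difference between GPT-J and MPT-7B in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```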


Products & Features   GPT-J         MPT-7B
Instruct Models                     MPT-7B-Instruct
Coding Capability
Customization
Finetuning
Open Source           Yes           Yes
License               Apache 2.0    Apache 2.0
Model Sizes           6B            7B