
LLaMA vs. MPT

LLM Comparison


LLaMA


Overview

LLaMA was previously Meta AI's most performant LLM available for researchers and noncommercial use cases. It has since been succeeded by Llama 2.


The model that launched a frenzy of open-source instruction-finetuned models, LLaMA is Meta AI's parameter-efficient, open alternative to large commercial LLMs. Despite being smaller than many commercial models, LLaMA outperformed the gold-standard GPT-3 on many benchmarks; its primary drawback was that access was gated to researchers, with restrictions on commercial use.


Initial release: 2023-02-24

MPT


Overview

MPT-7B and MPT-30B are a pair of models in MosaicML's Foundation Series. Trained on 1T tokens, MPT-7B matches the performance of LLaMA, according to its developers, while also being open source; MPT-30B outperforms the original GPT-3. In addition to the base models, the developers offer MPT-Instruct, MPT-Chat, and MPT-7B-StoryWriter-65k+, the last of which was trained with a 65K-token context length.



Initial release: 2023-05-05


Products & Features    LLaMA                MPT
Instruct Models
Coding Capability
Customization
Finetuning
Open Source
License                Noncommercial        Apache 2.0
Model Sizes            7B, 13B, 33B, 65B    7B, 30B