
OPT vs. RedPajama-INCITE

LLM Comparison


OPT

Overview

Open Pre-trained Transformer Language Models (OPT) is a family of open-source models from Meta AI designed to replicate GPT-3, with a similar decoder-only architecture. It has since been superseded by models such as LLaMA, GPT-J, and Pythia.



Initial release: 2022-05-03
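
The smaller OPT checkpoints are distributed through the Hugging Face Hub. As a minimal sketch (assuming the Hugging Face transformers library and the facebook/opt-1.3b checkpoint, one of the listed sizes), the model loads and generates like any other decoder-only causal language model:

# Minimal sketch: load a released OPT checkpoint and sample from it.
# Assumes transformers and torch are installed; facebook/opt-1.3b is the 1.3B size,
# and the other sizes follow the same naming pattern on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "facebook/opt-1.3b"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("Open Pre-trained Transformers are", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))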


RedPajama-INCITE

Overview

RedPajama-INCITE is the first family of models trained on the RedPajama base dataset. The goal of the RedPajama-INCITE models is to replicate the LLaMA recipe while making the models fully open source under the Apache 2.0 license. At the initial release, the 3B-parameter model was best in class, with the 7B-parameter model still in training. Update as of June 6, 2023: the 7B-parameter model is now available and outperforms other models of the same size.



Initial release: 2023-05-05
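
The RedPajama-INCITE weights are likewise distributed through the Hugging Face Hub under Apache 2.0, and the same loading pattern applies. A brief sketch, assuming the togethercomputer/RedPajama-INCITE-Base-3B-v1 identifier for the 3B base model (verify the exact names of the 7B and instruct/chat variants on the Hub):

# Sketch: load the 3B RedPajama-INCITE base model with Hugging Face transformers.
# The checkpoint name is an assumption; confirm the exact identifier on the Hub.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "togethercomputer/RedPajama-INCITE-Base-3B-v1"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

inputs = tokenizer("The RedPajama dataset is", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))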


Products & Features     OPT                                     RedPajama-INCITE
Instruct Models
Coding Capability
Customization
Finetuning
Open Source
License                 NA                                      Apache 2.0
Model Sizes             1.3B, 2.7B, 6.7B, 13B, 30B, 66B, 175B   3B, 7B