
GPTNeo vs. OpenLLaMA

LLM Comparison


GPTNeo

Overview

GPTNeo is a model released by EleutherAI as an effort to provide an open-source counterpart to OpenAI's GPT-3. One of the earliest such models, GPTNeo was trained on The Pile, EleutherAI's curated text corpus.



Initial release: 2021-03-21
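Because the weights are openly available, GPTNeo can be loaded directly with the Hugging Face Transformers library. The sketch below is a minimal example, not a tuned setup; the checkpoint name "EleutherAI/gpt-neo-1.3B" is an assumption based on the 1.3B size listed in the comparison table below.

```python
# Minimal sketch: load a GPTNeo checkpoint and generate text with Hugging Face
# Transformers. The checkpoint name "EleutherAI/gpt-neo-1.3B" is an assumption
# based on the 1.3B model size listed on this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/gpt-neo-1.3B"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

prompt = "EleutherAI released GPTNeo as an open-source"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```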


OpenLLaMA

Overview

OpenLLaMA is an effort from OpenLM Research to offer a non-gated version of LLaMA that can be used for both research and commercial applications. As of June 2023, the models are still training, with 3B, 7B, and 13B parameter versions available.



Initial release: 2023-04-28
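Since OpenLLaMA follows the LLaMA architecture and its weights are permissively licensed, it can be loaded the same way through Transformers. The sketch below assumes the "openlm-research/open_llama_7b" repository name (matching the 7B size listed below); the slow tokenizer is used as a conservative choice for a LLaMA-style vocabulary.

```python
# Minimal sketch: load an OpenLLaMA checkpoint with Hugging Face Transformers.
# The repo name "openlm-research/open_llama_7b" is an assumption based on the
# 7B model size listed on this page.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "openlm-research/open_llama_7b"
tokenizer = AutoTokenizer.from_pretrained(model_name, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype=torch.float16)

prompt = "OpenLLaMA is a permissively licensed model that"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```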


Products & Features      GPTNeo          OpenLLaMA
Instruct Models
Coding Capability
Customization
Finetuning
Open Source
License                  Apache 2.0      Apache 2.0
Model Sizes              1.3B, 2.7B      3B, 7B, 13B