
Pythia

Developer: EleutherAI, Together
Initial Release: 2023-02-13
Overview
The most recent (as of May 2023) effort from EleutherAI, Pythia is a suite of LLMs trained on The Pile. It appears to outperform OPT and GPT-Neo, though its performance relative to GPT-J is less clear. Versions of Pythia have also been instruct-tuned by the team at Together.
Description
Open Source: Yes
Instruct Tuned: No
Model Sizes: 1B, 1.4B, 2.8B, 6.9B, 12B
Finetuning: Yes
License: Apache 2.0
Pricing: -
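For reference, the openly released Pythia checkpoints can be loaded with the Hugging Face transformers library. The sketch below is illustrative rather than taken from this page; the repo id "EleutherAI/pythia-1.4b", the prompt, and the generation settings are assumptions.

# Minimal sketch: load a Pythia checkpoint and generate a completion.
# Assumes the Hugging Face transformers library; repo id and prompt are illustrative.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "EleutherAI/pythia-1.4b"  # other listed sizes: 1B, 2.8B, 6.9B, 12B

tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Plain causal-LM completion; the base checkpoints are not instruct-tuned.
inputs = tokenizer("The Pile is a dataset that", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=40, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))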
Further Reading

Unsure? Contact us with a brief description of your use case if you'd like us to make a snap assessment. Depending on your requirements, a smaller, custom language model may even be the best option.