LLM: OPT

OPT

Developer
Meta
Initial Release
2022-05-03
Overview
Open Pre-trained Transformer (OPT) is a family of open decoder-only language models from Meta, designed to roughly replicate GPT-3. It has since been superseded by newer open models such as LLaMA and Pythia.
Description
Open Source
Yes
Instruct Tuned
No
Model Sizes
125M, 350M, 1.3B, 2.7B, 6.7B, 13B, 30B, 66B, 175B
Finetuning
Yes
License
Non-commercial research license
Pricing
-
Link
Visit
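The model sizes above translate directly into hardware requirements. As a rough rule of thumb, fp16 weights take 2 bytes per parameter, so the footprint in GB is about twice the parameter count in billions (excluding activations, KV cache, and optimizer state). A minimal sketch, with size names and the helper function as illustrative assumptions:

```python
# Approximate fp16 weight memory for each OPT checkpoint size.
# Assumption: 2 bytes/parameter, weights only (no activations,
# KV cache, or optimizer state, which add substantially more).
SIZES_BILLION = {
    "1.3B": 1.3, "2.7B": 2.7, "6.7B": 6.7,
    "13B": 13, "30B": 30, "66B": 66, "175B": 175,
}

def fp16_weight_gb(params_billion: float) -> float:
    """Rough weight footprint in GB at 2 bytes per parameter."""
    return params_billion * 1e9 * 2 / 1e9

for name, billions in SIZES_BILLION.items():
    print(f"{name}: ~{fp16_weight_gb(billions):.0f} GB")
```

By this estimate, OPT-175B needs roughly 350 GB just for fp16 weights, which is why the smaller checkpoints are the practical choice for single-GPU finetuning.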
Further Reading

Unsure? Contact us with a brief description of your use case if you'd like for us to make a snap assessment. Depending on your requirements, a smaller, custom language model may even be the best option.