FLAN-T5 is an instruction-finetuned version of Google's popular T5 model. As stated in the model repository's introduction, compared to T5, FLAN-T5 is "just better at everything." With its permissive license, FLAN-T5 has become a popular starting point for instruct models.
Initial release: 2022-12-06
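Because FLAN-T5 is instruction-finetuned, it can follow a plain-text instruction out of the box. A minimal sketch using the Hugging Face transformers library (the small `google/flan-t5-small` checkpoint is assumed here to keep the download light; larger checkpoints follow the same API):

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Load the smallest FLAN-T5 checkpoint; swap in flan-t5-xl or flan-t5-xxl
# (the 3B / 11B sizes discussed above) for better quality.
tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-small")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-small")

# FLAN-T5 takes the instruction directly as input text.
inputs = tokenizer(
    "Translate to German: The house is wonderful.",
    return_tensors="pt",
)
outputs = model.generate(**inputs, max_new_tokens=32)
text = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(text)
```

Note that T5-family models are encoder-decoder (seq2seq), which is why `AutoModelForSeq2SeqLM` is used rather than a causal-LM class.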
Open Pre-trained Transformer (OPT) is a family of open-source models from Meta AI designed to replicate GPT-3, with a similar decoder-only architecture. It has since been superseded by later open models such as LLaMA and Pythia.
Initial release: 2022-05-03
|Products & Features|FLAN-T5|OPT|
|---|---|---|
|Model Sizes|3B, 11B|1.3B, 2.7B, 6.7B, 13B, 30B, 66B, 175B|