FLAN-T5 is an instruction-finetuned version of Google's popular T5 model. As stated in the model repository's introduction, compared to T5, FLAN-T5 is "just better at everything." With its permissive license, FLAN-T5 has become a popular choice as a starting instruct model.
Initial release: 2022-12-06
StableVicuna is an RLHF finetune of Vicuna using datasets such as the OpenAssistant Conversations Dataset and the GPT4All Prompt Generations dataset.
Initial release: 2023-04-28
|Products & Features|FLAN-T5|StableVicuna|
|---|---|---|
|Model Sizes|3B, 11B|13B|