
Alpaca vs. FLAN-T5

LLM Comparison


Alpaca

Overview

Alpaca is an instruction-finetuned LLM based on LLaMA.


The first of many instruction-finetuned variants of LLaMA, Alpaca is an instruction-following model introduced by Stanford researchers. Impressively, for roughly $600 in training costs, the researchers showed that in qualitative evaluations Alpaca behaves similarly to OpenAI's text-davinci-003, a significantly larger model.


Initial release: 2023-03-13
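
As a rough sketch of how an Alpaca-style model is typically queried: Stanford released the training recipe and data rather than official weights, so the checkpoint path below is a placeholder for whichever Alpaca reproduction you have available. The instruction/response template is the one used in Alpaca's training data.

# Minimal sketch of prompting an Alpaca-style checkpoint with Hugging Face transformers.
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_PATH = "path/to/alpaca-7b"  # placeholder, not an official checkpoint

tokenizer = AutoTokenizer.from_pretrained(MODEL_PATH)
model = AutoModelForCausalLM.from_pretrained(MODEL_PATH)

# Alpaca was finetuned on prompts in this instruction/response template.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\nExplain what instruction finetuning is.\n\n"
    "### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))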


FLAN-T5

Overview

FLAN-T5 is an instruction-finetuned version of Google's popular T5 model. As the model repository's introduction puts it, compared to T5, FLAN-T5 is "just better at everything." With its permissive license, FLAN-T5 has become a popular choice as a starting instruct model.



Initial release: 2022-12-06
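
Because the checkpoints are published on the Hugging Face Hub under Apache 2.0, FLAN-T5 can be tried in a few lines with the transformers library. The sketch below assumes the google/flan-t5-base checkpoint and a simple translation instruction.

# Minimal sketch of running FLAN-T5 inference with Hugging Face transformers.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("google/flan-t5-base")
model = AutoModelForSeq2SeqLM.from_pretrained("google/flan-t5-base")

# FLAN-T5 is a text-to-text model, so the instruction is passed as plain input text.
inputs = tokenizer("Translate to German: The house is wonderful.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))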



              Alpaca          FLAN-T5
License       Noncommercial   Apache 2.0
Model Sizes   7B              3B, 11B