
FLAN-T5 vs. Koala

LLM Comparison


FLAN-T5

Overview

FLAN-T5 is an instruction-finetuned version of Google's popular T5 model. As the model repository's introduction puts it, compared to T5, FLAN-T5 is "just better at everything." With its permissive license, FLAN-T5 has become a popular choice as a starting instruct model.



Initial release: 2022-12-06
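
Because the weights are Apache 2.0 licensed and hosted on the Hugging Face Hub, FLAN-T5 can be tried in a few lines. Below is a minimal sketch using the Transformers library; the checkpoint id (google/flan-t5-xl, the 3B variant) and the prompt are illustrative rather than prescriptive.

    # Minimal sketch: run an instruction through FLAN-T5 with Hugging Face Transformers.
    # "google/flan-t5-xl" is the 3B checkpoint; "google/flan-t5-xxl" is the 11B one.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    model_id = "google/flan-t5-xl"
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

    # FLAN-T5 takes plain natural-language instructions; no chat template is required.
    prompt = "Translate to German: The house is wonderful."
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=32)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))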

Koala

Overview

Released alongside Vicuna, Koala is one of many descendants of Meta's LLaMA model, finetuned on dialogue data collected from the web. On its developers' benchmarks, Koala outperforms its sibling Alpaca, though it has seen far less adoption than its other sibling, Vicuna. Because it is built on LLaMA, only research use is permitted.



Initial release: 2023-04-03
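
As a rough counterpart, the sketch below runs a prompt through a Koala checkpoint with the same Transformers API. Koala's weights are officially released only as deltas on top of LLaMA, so the merged repository id shown here is an assumption (a community conversion), and the conversation-style prompt format should likewise be treated as an assumption; any use remains research-only.

    # Minimal sketch: run a prompt through a merged Koala checkpoint with Transformers.
    # NOTE: the repo id is an assumption -- substitute whichever merged, HF-format
    # Koala weights you have obtained by applying the official deltas to LLaMA.
    from transformers import AutoTokenizer, AutoModelForCausalLM

    model_id = "TheBloke/koala-7B-HF"  # assumption: community-merged Koala 7B
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    # Koala's dialogue format wraps a single user turn roughly like this.
    prompt = "BEGINNING OF CONVERSATION: USER: What do koalas eat? GPT:"
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=64)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))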

Products & Features

                        FLAN-T5         Koala
Instruct Models
Coding Capability
Customization
Finetuning
Open Source
License                 Apache 2.0      Noncommercial
Model Sizes             3B, 11B         7B, 13B