Released alongside Vicuna, Koala is one of several descendants of Meta's LLaMA model, fine-tuned on dialogue data collected from the web. On its developers' benchmarks, Koala outperforms its sibling Alpaca, though it has seen far less adoption than its other sibling, Vicuna. Because it is built on LLaMA, it is restricted to research use only.
Initial release: 2023-04-03
Pythia, EleutherAI's most recent (as of May 2023) effort, is a suite of LLMs trained on The Pile. It appears to outperform OPT and GPT-Neo, though its performance relative to GPT-J is unclear. Versions of Pythia have also been instruction-tuned by the team at Together.
Initial release: 2023-02-13
| Products & Features | Koala | Pythia |
|---|---|---|
| Model Sizes | 7B, 13B | 1B, 1.4B, 2.8B, 6.9B, 12B |