Released alongside Vicuna, Koala is one of many descendants of the Meta LLaMA model, fine-tuned on dialogue data collected from the web. On the developers' benchmarks, Koala outperforms its sibling Alpaca, though it has seen significantly less adoption than its other sibling, Vicuna. Because it is based on LLaMA, only research use is permitted.
Initial release: 2023-04-03
OpenLLaMA is an effort from OpenLM Research to offer a non-gated version of LLaMA that can be used for both research and commercial applications. As of June 2023, training is still in progress, with 3B, 7B, and 13B parameter models available.
Initial release: 2023-04-28
| Products & Features | Koala | OpenLLaMA |
|---|---|---|
| Model Sizes | 7B, 13B | 3B, 7B, 13B |