Released alongside Vicuna, Koala is one of many descendants of Meta's LLaMA model, trained on dialogue data collected from the web. On its developers' benchmarks, Koala outperforms its sibling Alpaca, though its adoption has been significantly lower than that of its other sibling, Vicuna. Because it is built on LLaMA, Koala is licensed for research use only.
Initial release: 2023-04-03
StableVicuna is an RLHF finetune of Vicuna using datasets such as the OpenAssistant Conversations Dataset and the GPT4All Prompt Generations dataset.
Initial release: 2023-04-28
| | Koala | StableVicuna |
|---|---|---|
| Model Sizes | 7B, 13B | 13B |