Dolly is an LLM trained using the Databricks machine learning platform. Dolly v1 was instruct-finetuned on the Stanford Alpaca dataset; Dolly v2 instead uses databricks-dolly-15k, an instruction-following dataset written by Databricks employees, which allows commercial use.
Initial release: 2023-03-24
Released shortly after Vicuna, Koala is one of many descendants of Meta's LLaMA model, fine-tuned on dialogue data collected from the web. On its developers' benchmarks, Koala outperforms its sibling Alpaca, though its adoption has been significantly lower than that of its other sibling, Vicuna. Because it is built on LLaMA, only research use is permitted.
Initial release: 2023-04-03
|Products & Features|Dolly|Koala|
|---|---|---|
|Model Sizes|3B, 12B|7B, 13B|