
Alpaca vs. Dolly

LLM Comparison

Alpaca

Overview

The first of many instruct-finetuned versions of LLaMA, Alpaca is an instruction-following model introduced by Stanford researchers. Impressively, for less than $600 in training costs (instruction data generated with the OpenAI API plus the finetuning run itself), the researchers demonstrated that in qualitative evaluations Alpaca performed similarly to OpenAI's text-davinci-003, a significantly larger model.

Initial release: 2023-03-13
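
Because Alpaca was finetuned on prompts written in a fixed template, inference works best when requests are wrapped in that same format. Below is a minimal sketch of the template from the Alpaca release; the `format_alpaca_prompt` helper name is purely illustrative and not part of the original codebase.

```python
def format_alpaca_prompt(instruction: str, input_text: str = "") -> str:
    """Wrap an instruction (and optional input) in the Alpaca finetuning template."""
    if input_text:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{input_text}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


print(format_alpaca_prompt("Give three tips for staying healthy."))
```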

Dolly

Overview

Dolly is an LLM trained using the Databricks machine learning platform. The original Dolly was a GPT-J model instruct-finetuned on the Stanford Alpaca dataset; Dolly v2 switched to EleutherAI's Pythia as the base model and was finetuned on databricks-dolly-15k, an instruction dataset written by Databricks employees, allowing it to be released for commercial use.

Initial release: 2023-03-24
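
Dolly v2 checkpoints are published on the Hugging Face Hub. A minimal sketch of loading the smallest one, `databricks/dolly-v2-3b`, with the `transformers` library, following the usage pattern from the model card; `trust_remote_code=True` pulls in the custom instruction pipeline shipped with the checkpoint, and `device_map="auto"` assumes the `accelerate` package is installed.

```python
import torch
from transformers import pipeline

# Load Dolly v2 (3B) together with its bundled instruction-following pipeline.
# bfloat16 and device_map="auto" keep memory use manageable on a single GPU.
generate_text = pipeline(
    model="databricks/dolly-v2-3b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

# The pipeline returns a list of dicts with a "generated_text" field.
result = generate_text("Explain the difference between nuclear fission and fusion.")
print(result[0]["generated_text"])
```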


Products & Features    Alpaca           Dolly
Instruct Models
Coding Capability
Customization
Finetuning
Open Source
License                Noncommercial    MIT
Model Sizes            7B               3B, 12B