
Phi vs. Vicuna

LLM Comparison


Phi

Overview

Phi is a series of compact language models developed by Microsoft, trained on textbook-quality and synthetic data.


Phi-1 and Phi-2 are 1.3B- and 2.7B-parameter language models, respectively, developed by Microsoft to demonstrate the capabilities of smaller language models trained on high-quality data. Despite its modest size and the lack of an instruction-finetuned counterpart, Phi-2 is well suited to research and experimentation thanks to its permissive MIT license.


Initial release: 2023-06-20

Vicuna

Overview

Released alongside Koala, Vicuna is one of many descendants of Meta's LLaMA model, finetuned on dialogue data collected from the ShareGPT website. According to the authors, Vicuna achieves more than 90% of ChatGPT's quality in user preference tests while vastly outperforming Alpaca. As of May 2023, Vicuna appears to be the heir apparent of the instruction-finetuned LLaMA family, though, like LLaMA, it is restricted from commercial use.



Initial release: 2023-03-30


Products & Features     Phi           Vicuna
Instruct Models
Coding Capability
Customization
Finetuning
Open Source
License                 MIT           Noncommercial
Model Sizes             1.3B, 2.7B    13B