
Orca vs. Phi

LLM Comparison


Orca

Overview

Orca is a descendant of LLaMA developed by Microsoft, finetuned on explanation traces obtained from GPT-4.


Orca-13B is an LLM developed by Microsoft. It is based on LLaMA and finetuned on complex explanation traces obtained from GPT-4. By learning from these rich signals, Orca surpasses the performance of models such as Vicuna-13B on complex tasks. However, given its model backbone and the data used for its finetuning, Orca is restricted to noncommercial use.
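
To make the notion of an explanation trace concrete, here is a minimal sketch of the kind of (system instruction, user query, detailed teacher response) record the Orca paper describes collecting from GPT-4 for finetuning. The field names and content are illustrative only, not Microsoft's actual data format.

# Illustrative sketch of an explanation-trace finetuning record in the style
# described by the Orca paper: a system instruction that elicits step-by-step
# reasoning, a task query, and the teacher model's (GPT-4) detailed answer.
# Field names and content are hypothetical, not Microsoft's actual schema.
example = {
    "system": "You are a helpful assistant. Think step by step and justify your answer.",
    "user": "A train travels 120 km in 2 hours. What is its average speed?",
    "assistant": (
        "Average speed is distance divided by time. "
        "The train covers 120 km in 2 hours, so 120 / 2 = 60. "
        "Its average speed is 60 km/h."
    ),
}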


Initial release: 2023-06-05

Phi

Overview

Phi is a series of compact language models developed by Microsoft using textbooks and synthetic data.


Phi-1 and Phi-2 are 1.3B and 2.7B parameter language models, respectively, developed by Microsoft to demonstrate what smaller language models can achieve when trained on high-quality data. Despite not having an instruction-finetuned counterpart, the Phi-2 model is well suited to research and experimentation given its small size and MIT license.
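
As a concrete illustration of that experimentation use case, the minimal sketch below loads Phi-2 with the Hugging Face transformers library and runs a short greedy generation. It assumes the publicly hosted microsoft/phi-2 checkpoint and a recent transformers release (older versions may require trust_remote_code=True); it is one possible setup, not an officially recommended one.

# Minimal sketch: running Phi-2 locally via Hugging Face transformers.
# Assumes the "microsoft/phi-2" checkpoint and a recent transformers release;
# device_map="auto" additionally requires the accelerate package.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "microsoft/phi-2"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # use torch.float32 on CPU
    device_map="auto",
)

prompt = "Explain in one sentence why the sky is blue."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))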


Initial release: 2023-06-20


Products & Features    Orca             Phi
Instruct Models
Coding Capability
Customization
Finetuning
Open Source
License                Noncommercial    MIT
Model Sizes            13B              1.3B, 2.7B