DeepSeek vs. Gemma 2

LLM Comparison


DeepSeek

Overview

DeepSeek currently offers the V3 and R1 models, both of which are highly efficient and performant. V3 is comparable to models such as Anthropic's Claude 3.5 Sonnet, while R1 is comparable to models such as OpenAI's o1.


DeepSeek is a Chinese startup that began releasing LLMs in 2023 with DeepSeek Coder. In rapid succession, DeepSeek has since released more powerful models, most notably DeepSeek V3 at the end of 2024 and DeepSeek R1 at the beginning of 2025. V3 and R1 set the efficiency frontier while maintaining high performance. Their release sent shockwaves through the US technology sector, especially given the low reported cost of training them (orders of magnitude less than the cost of training equivalent US models).


Initial release: 2023-11-29

Gemma 2

Overview

Gemma 2 succeeds the original Gemma family of lightweight open models from Google, built using the same research and technology as the larger Gemini models.


Gemma 2 is the successor to the Gemma family of open models, adding larger models (9B and 27B parameters) with outsized performance across benchmarks. Through a combination of techniques, including training on twice as much data, knowledge distillation, and architectural improvements such as sliding window attention, logit soft-capping, and model merging, Gemma 2 outperforms models of similar size (such as Llama 3), and the 27B parameter model is competitive with models more than twice its size (such as Llama 3 70B).
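Of the architectural tweaks listed above, logit soft-capping is easy to illustrate: logits are rescaled with a tanh so they can never exceed a fixed bound, which keeps values from blowing up while remaining differentiable. A minimal sketch (the cap value of 30.0 is illustrative; Gemma 2 applies separate caps to attention and final logits):

```python
import math

def soft_cap(logits, cap=30.0):
    # tanh squashes any real value into (-1, 1); multiplying by cap
    # bounds each logit to the open interval (-cap, cap).
    return [cap * math.tanh(x / cap) for x in logits]

# Small logits pass through nearly unchanged; extreme logits saturate
# smoothly near the cap instead of growing without bound.
print(soft_cap([0.0, 5.0, 1e6, -1e6]))
```

Because the squashing is smooth rather than a hard clip, gradients still flow for large logits during training, just with diminishing magnitude as the cap is approached.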


Initial release: 2024-06-27


Products & Features    DeepSeek       Gemma 2
Instruct Models
Coding Capability
Customization
Finetuning
Open Source
License                MIT            Custom
Model Sizes            67B, 671B      2.6B, 9B, 27B