
Gemma 2 9B

Overview

Gemma 2 9B IT is the instruction-tuned version of Google's Gemma 2 9B base model, trained on 8 trillion tokens of web documents, code, and math content. The architecture uses sliding window attention and logit soft-capping, and the model was trained with knowledge distillation. It is optimized for dialogue applications through supervised fine-tuning, distillation, RLHF, and model merging with WARP.

Gemma 2 9B was released on June 27, 2024.
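
To make the architectural terms above concrete, here is a minimal, illustrative PyTorch sketch of the two inference-time mechanisms named in the overview: logit soft-capping (squashing logits into a bounded range via cap * tanh(logits / cap)) and a sliding window attention mask. The function names, the cap value, the window size, and the toy tensor shapes are assumptions for illustration, not the official implementation.

```python
import torch

def soft_cap_logits(logits: torch.Tensor, cap: float = 30.0) -> torch.Tensor:
    # Squash logits smoothly into (-cap, cap): cap * tanh(logits / cap).
    # Gemma 2 applies soft-capping to attention logits and final output logits.
    return cap * torch.tanh(logits / cap)

def sliding_window_mask(seq_len: int, window: int = 4096) -> torch.Tensor:
    # Boolean causal mask where each position attends only to itself and the
    # previous (window - 1) positions, i.e. sliding window attention.
    idx = torch.arange(seq_len)
    causal = idx[None, :] <= idx[:, None]                # no attending to future tokens
    in_window = (idx[:, None] - idx[None, :]) < window   # restrict lookback to the window
    return causal & in_window

if __name__ == "__main__":
    logits = torch.randn(2, 8, 256_000) * 100            # exaggerated logits to show the capping effect
    print(soft_cap_logits(logits).abs().max())           # bounded below the cap (30.0)
    print(sliding_window_mask(6, window=3).int())        # 6x6 mask with a lookback of 3
```

Knowledge distillation and WARP model merging, by contrast, are training-time techniques, so they have no counterpart in an inference sketch like this one.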


Timeline

Released
June 27, 2024
Knowledge Cutoff
Unknown

Specifications

Parameters
9.2B
License
Gemma
Training Data
~8 trillion tokens (web documents, code, math)
Tags
tuning:instruct

Benchmarks

Gemma 2 9B Performance Across Datasets

Scores sourced from the model's scorecard, paper, or official blog posts


Pricing

Pricing, performance, and capabilities for Gemma 2 9B across different providers:

No pricing information available for this model.

API Access

API Access Coming Soon

API access for Gemma 2 9B will be available soon through our gateway.


FAQ

Common questions about Gemma 2 9B

When was Gemma 2 9B released?
Gemma 2 9B was released on June 27, 2024 by Google.

Who created Gemma 2 9B?
Gemma 2 9B was created by Google.

How many parameters does Gemma 2 9B have?
Gemma 2 9B has 9.2 billion parameters.

What license is Gemma 2 9B released under?
Gemma 2 9B is released under the Gemma license.