
Gemma 3 1B

Google · Mar 2025 · 1B params

Overview

Gemma 3 1B is a lightweight, 1-billion-parameter language model from Google, optimized for efficiency on resource-limited devices. The model weighs in at 529MB, processes text at 2,585 tokens/second, and supports a context window of 128,000 tokens. It handles 35+ languages but accepts text-only input, unlike the larger multimodal Gemma 3 models. This balance of speed and efficiency makes it well suited to fast text processing on mobile and low-power hardware.
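Because the checkpoint is small enough to run on consumer hardware, one common way to try it is the Hugging Face transformers text-generation pipeline. The sketch below is illustrative, not an official recipe; the checkpoint id google/gemma-3-1b-it and the device settings are assumptions you should verify against the model's Hugging Face page.

```python
# Minimal sketch: running Gemma 3 1B locally with Hugging Face transformers.
# Assumptions: the instruct-tuned checkpoint is published as
# "google/gemma-3-1b-it" and your transformers version supports Gemma 3.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="google/gemma-3-1b-it",  # assumed checkpoint id; verify on the Hub
    device_map="auto",             # uses a GPU if available, otherwise CPU
)

# Chat-style input; the pipeline applies the model's chat template internally.
messages = [
    {"role": "user", "content": "Explain in two sentences why small on-device models are useful."}
]
result = generator(messages, max_new_tokens=128)
print(result[0]["generated_text"])
```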

Gemma 3 1B was released on March 12, 2025.

Performance

Timeline

Released: March 12, 2025
Knowledge Cutoff: Unknown

Specifications

Parameters: 1.0B
License: Gemma
Training Data: Unknown
Tags: tuning:instruct (see the chat-template sketch below)
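The tuning:instruct tag indicates the listed weights expect prompts wrapped in the model's chat template rather than raw text. Below is a minimal sketch of applying that template explicitly with transformers; as above, the checkpoint id is an assumption to verify before use.

```python
# Sketch: explicit chat-template formatting for an instruct-tuned checkpoint.
# Assumption: the checkpoint id "google/gemma-3-1b-it"; verify on the Hub.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "google/gemma-3-1b-it"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

messages = [{"role": "user", "content": "List three uses for a 1B-parameter model."}]

# apply_chat_template wraps the turns in the control tokens the model was tuned on.
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
)
outputs = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```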

Related Models

Compare Gemma 3 1B to other models by quality (GPQA score) vs cost. Higher scores and lower costs represent better value.


Benchmarks

Gemma 3 1B Performance Across Datasets

Scores sourced from the model's scorecard, paper, or official blog posts


Pricing

Pricing, performance, and capabilities for Gemma 3 1B across different providers:

No pricing information available for this model.

API Access

API access for Gemma 3 1B will be available soon through our gateway.

FAQ

Common questions about Gemma 3 1B

When was Gemma 3 1B released?
Gemma 3 1B was released on March 12, 2025.

How many parameters does Gemma 3 1B have?
Gemma 3 1B has 1.0 billion parameters.