
Ministral 3 (3B Instruct 2512)

Overview

The smallest model in the Ministral 3 family, Ministral 3 3B is a powerful, efficient tiny language model with vision capabilities. This is the instruct post-trained version in FP8, fine-tuned for instruction following, which makes it well suited to chat and instruction-based use cases. The Ministral 3 family is designed for edge deployment and runs on a wide range of hardware. Ministral 3 3B can even be deployed locally, fitting in 16GB of VRAM in BF16 and in less than 8GB of RAM/VRAM when quantized.

Ministral 3 (3B Instruct 2512) was released on December 4, 2025.
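Since the family targets local and edge deployment, the sketch below shows one way such a model could be run locally with Hugging Face transformers. This is a hedged illustration, not an official recipe: the repo ID "mistralai/Ministral-3-3B-Instruct-2512" is a hypothetical placeholder, and compatibility with the standard text-generation pipeline is assumed rather than confirmed by this page.

```python
# Hedged sketch: running a small instruct model locally with transformers.
# Assumptions (not confirmed by this page): the weights are hosted on the
# Hugging Face Hub under a repo ID like the one below and work with the
# standard "text-generation" chat pipeline.
import torch
from transformers import pipeline

MODEL_ID = "mistralai/Ministral-3-3B-Instruct-2512"  # hypothetical repo ID

# In BF16, a 3B-parameter model fits within the ~16 GB of VRAM cited above;
# quantized builds (e.g. 4-bit) can drop that to under 8 GB.
pipe = pipeline(
    "text-generation",
    model=MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Explain why small models suit edge deployment."}
]
result = pipe(messages, max_new_tokens=128)
print(result[0]["generated_text"][-1]["content"])
```

Quantized variants (GGUF, 4-bit, etc.) would follow the same pattern with a different loader or dtype, trading some precision for the sub-8GB memory footprint mentioned above.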

Performance

Timeline

Released: December 4, 2025
Knowledge Cutoff: Unknown

Specifications

Parameters: 3.0B
License: Apache 2.0
Training Data: Unknown

Benchmarks

[Chart: Ministral 3 (3B Instruct 2512) performance across datasets]

Scores sourced from the model's scorecard, paper, or official blog posts


Pricing

Pricing, performance, and capabilities for Ministral 3 (3B Instruct 2512) across different providers:

No pricing information available for this model.

API Access

API Access Coming Soon

API access for Ministral 3 (3B Instruct 2512) will be available soon through our gateway.


FAQ

Common questions about Ministral 3 (3B Instruct 2512)

When was Ministral 3 (3B Instruct 2512) released?
Ministral 3 (3B Instruct 2512) was released on December 4, 2025 by Mistral.

Who created Ministral 3 (3B Instruct 2512)?
Ministral 3 (3B Instruct 2512) was created by Mistral.

How many parameters does Ministral 3 (3B Instruct 2512) have?
Ministral 3 (3B Instruct 2512) has 3.0 billion parameters.

What license is Ministral 3 (3B Instruct 2512) released under?
Ministral 3 (3B Instruct 2512) is released under the Apache 2.0 license. This is an open-source/open-weight license.

Is Ministral 3 (3B Instruct 2512) multimodal?
Yes, Ministral 3 (3B Instruct 2512) is a multimodal model that can process both text and images as input.
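As a rough illustration of the vision capability, the sketch below sends an image together with a text prompt. It is a hedged example built on assumptions not confirmed by this page: the same hypothetical Hub repo ID as above, support for the transformers "image-text-to-text" pipeline, and a placeholder image URL.

```python
# Hedged sketch: sending an image plus a text prompt to the model.
# Assumptions (not confirmed by this page): the model is compatible with the
# transformers "image-text-to-text" pipeline and hosted under the hypothetical
# repo ID below; the image URL is a placeholder.
import torch
from transformers import pipeline

MODEL_ID = "mistralai/Ministral-3-3B-Instruct-2512"  # hypothetical repo ID

pipe = pipeline(
    "image-text-to-text",
    model=MODEL_ID,
    torch_dtype=torch.bfloat16,
    device_map="auto",
)

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/sample.png"},  # placeholder image
            {"type": "text", "text": "Describe this image in one sentence."},
        ],
    }
]
result = pipe(text=messages, max_new_tokens=96)
print(result[0]["generated_text"][-1]["content"])
```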