
Ministral 3 (8B Instruct 2512)

Mistral
ministral-3-8b-instruct-2512

Overview

A balanced model in the Ministral 3 family, Ministral 3 8B is a powerful, efficient small language model with vision capabilities. This model is the instruct post-trained version in FP8, fine-tuned for instruction following, making it well suited to chat and instruction-based use cases. The Ministral 3 family is designed for edge deployment and runs on a wide range of hardware. Ministral 3 8B can even be deployed locally: it fits in 24GB of VRAM in BF16, and in less than 12GB of RAM/VRAM when quantized.
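As a rough sanity check on those footprint claims, weight memory scales with parameter count times bytes per parameter (2 bytes for BF16, 1 byte for FP8 or other 8-bit quantization), plus activation and KV-cache overhead on top. A minimal back-of-the-envelope sketch (the overhead figures are assumptions, not official numbers):

```python
# Rough weight-memory estimate for an 8.0B-parameter model.
# Real usage adds activations, KV cache, and framework overhead,
# so these are lower bounds rather than exact requirements.

PARAMS = 8.0e9  # Ministral 3 8B parameter count


def weight_gb(params: float, bytes_per_param: float) -> float:
    """Gigabytes needed just to hold the model weights."""
    return params * bytes_per_param / 1024**3


bf16 = weight_gb(PARAMS, 2.0)  # ~14.9 GB, leaving headroom within 24 GB VRAM
fp8 = weight_gb(PARAMS, 1.0)   # ~7.5 GB, under the quoted sub-12 GB quantized figure

print(f"BF16 weights: {bf16:.1f} GB")
print(f"FP8/8-bit weights: {fp8:.1f} GB")
```

This is consistent with the figures above: BF16 weights alone take roughly 15 GB, and an 8-bit format halves that, which is why the quantized model fits comfortably under 12 GB.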

Ministral 3 (8B Instruct 2512) was released on December 4, 2025.

Performance

Timeline

Release Date: December 4, 2025
Knowledge Cutoff: Unknown

Other Details

Parameters: 8.0B
License: Apache 2.0
Training Data: Unknown

Related Models

Compare Ministral 3 (8B Instruct 2512) to other models by quality (GPQA score) vs cost. Higher scores and lower costs represent better value.


Benchmarks

Ministral 3 (8B Instruct 2512) Performance Across Datasets

Scores are sourced from the model's scorecard, paper, or official blog posts.

llm-stats.com - Wed Dec 10 2025

Pricing

Pricing, performance, and capabilities for Ministral 3 (8B Instruct 2512) across different providers:

No pricing information available for this model.


API Access

API Access Coming Soon

API access for Ministral 3 (8B Instruct 2512) will be available soon through our gateway.

FAQ

Common questions about Ministral 3 (8B Instruct 2512)

When was Ministral 3 (8B Instruct 2512) released? It was released on December 4, 2025.
How many parameters does Ministral 3 (8B Instruct 2512) have? It has 8.0 billion parameters.