Ministral 3 (14B Reasoning 2512)
Overview
A balanced model in the Ministral 3 family, Ministral 3 14B is a powerful, efficient compact language model with vision capabilities. This is the reasoning post-trained version, making it well suited to math, coding, and STEM-related use cases. The Ministral 3 family is designed for edge deployment and runs on a wide range of hardware. Ministral 3 14B can even be deployed locally, fitting in 24 GB of VRAM in BF16 and in less than 12 GB of RAM/VRAM when quantized.
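For local deployment, a minimal sketch using Hugging Face Transformers with 4-bit quantization is shown below. The repository ID and generation settings are assumptions for illustration, not an official recipe.

```python
# Minimal local-inference sketch. Assumptions: the checkpoint ID below is
# hypothetical, and the text-only loading path is used for brevity.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer, BitsAndBytesConfig

MODEL_ID = "mistralai/Ministral-3-14B-Reasoning-2512"  # hypothetical repo ID

# 4-bit quantization keeps the weight footprint in line with the
# "less than 12 GB when quantized" figure cited above.
quant_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_compute_dtype=torch.bfloat16,
)

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    quantization_config=quant_config,
    device_map="auto",
)

prompt = "Prove that the sum of two even integers is even."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```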
Ministral 3 (14B Reasoning 2512) was released on December 4, 2025. API access is available through Mistral AI.
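For hosted access, the sketch below uses Mistral's official `mistralai` Python SDK. The model identifier string is an assumption, since the exact API name for this release is not listed here.

```python
# Sketch of a chat completion call against Mistral's API.
# The model identifier below is an assumed name; check Mistral's
# model list for the official one.
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="ministral-3-14b-reasoning-2512",  # assumed identifier
    messages=[
        {"role": "user", "content": "What is the derivative of x**3 * ln(x)?"},
    ],
)
print(response.choices[0].message.content)
```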
Performance
Compare Ministral 3 (14B Reasoning 2512) to other models by quality (GPQA score) vs cost. Higher scores and lower costs represent better value.
Benchmarks
Ministral 3 (14B Reasoning 2512) Performance Across Datasets
Scores sourced from the model's scorecard, paper, or official blog posts
Pricing
Pricing, performance, and capabilities for Ministral 3 (14B Reasoning 2512) across different providers:
| Provider | Input ($/M) | Output ($/M) | Max Input | Max Output | Latency (s) | Throughput (tok/s) | Quantization | Input Modalities | Output Modalities |
|---|---|---|---|---|---|---|---|---|---|
| Mistral AI | $0.20 | $0.20 | 262.1K | 262.1K | 0.23 | 128.6 | — | Text, Image, Audio, Video | Text, Image, Audio, Video |
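As a worked example of the pricing above, a request with 50K input tokens and 10K output tokens at $0.20 per million tokens in each direction costs $0.012. The helper below is illustrative, not provider code; it simply makes the arithmetic explicit.

```python
# Illustrative cost calculator for the per-million-token pricing listed above.
def request_cost(input_tokens: int, output_tokens: int,
                 input_price_per_m: float = 0.20,
                 output_price_per_m: float = 0.20) -> float:
    """Return the USD cost of a single request at $/million-token rates."""
    return (input_tokens * input_price_per_m
            + output_tokens * output_price_per_m) / 1_000_000

# Example: 50K prompt tokens and 10K completion tokens.
print(f"${request_cost(50_000, 10_000):.4f}")  # -> $0.0120
```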
API Access
API access for Ministral 3 (14B Reasoning 2512) will be available soon through our gateway.
