
Ministral 3 (14B Instruct 2512)

Mistral
Variant: ministral-3-14b-instruct-2512

Overview

A balanced model in the Ministral 3 family, Ministral 3 14B is a powerful, efficient small language model with vision capabilities. This is the instruct post-trained version in FP8, fine-tuned for instruction following, which makes it well suited to chat and instruction-based use cases. The Ministral 3 family is designed for edge deployment and runs on a wide range of hardware. Ministral 3 14B can even be deployed locally, fitting in 24GB of VRAM in BF16 and in less than 12GB of RAM/VRAM when quantized.
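For local experimentation, a minimal loading sketch along the following lines should work with Hugging Face transformers, assuming the weights are published on the Hugging Face Hub. The repository id below is a placeholder, and the example treats the model as a text-only chat model, skipping its vision inputs; depending on how Mistral packages the checkpoint, a different model class or the mistral-common tooling may be required.

```python
# Minimal sketch: load the instruct model locally and run one chat turn.
# The repo id is a placeholder/assumption; check Mistral's Hub organization
# for the actual weights. Vision inputs are not handled here.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "mistralai/Ministral-3-14B-Instruct-2512"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # full-precision-ish load; needs ~24GB-class VRAM
    device_map="auto",           # spread layers across available GPUs/CPU
)

messages = [{"role": "user", "content": "Summarize the Ministral 3 family in one sentence."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

To stay under the quoted 12GB figure, a quantized build (for example a 4-bit GGUF or a bitsandbytes load) would be used instead of the BF16 weights shown here.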

Ministral 3 (14B Instruct 2512) was released on December 4, 2025.

Performance

Timeline

Release Date: December 4, 2025
Knowledge Cutoff: Unknown

Other Details

Parameters: 14.0B
License: Apache 2.0
Training Data: Unknown
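As a rough, weight-only sanity check on the deployment figures quoted in the overview (this deliberately ignores activations, the KV cache, and any vision-encoder overhead, so real requirements are higher):

```python
# Weight-only memory estimate for a 14.0B-parameter model at two precisions.
# Activations, KV cache, and vision-encoder overhead are ignored, so actual
# memory requirements will be somewhat higher than these numbers.
PARAMS = 14.0e9

for label, bytes_per_param in [("FP8", 1.0), ("4-bit quantized", 0.5)]:
    gib = PARAMS * bytes_per_param / 1024**3
    print(f"{label}: ~{gib:.1f} GiB of weights")

# Prints roughly:
#   FP8: ~13.0 GiB of weights             -> fits comfortably in 24GB of VRAM
#   4-bit quantized: ~6.5 GiB of weights  -> consistent with the <12GB figure
```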

Related Models

Compare Ministral 3 (14B Instruct 2512) to other models by quality (GPQA score) versus cost. Higher scores and lower costs represent better value.


Benchmarks

Ministral 3 (14B Instruct 2512) Performance Across Datasets

Scores are sourced from the model's scorecard, paper, or official blog posts.


Pricing

Pricing, performance, and capabilities for Ministral 3 (14B Instruct 2512) across different providers:

No pricing information available for this model.


API Access

API Access Coming Soon

API access for Ministral 3 (14B Instruct 2512) will be available soon through our gateway.
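Once the gateway launches, access will presumably look like a standard OpenAI-compatible chat completion call. The sketch below is purely illustrative: the base URL, authentication scheme, and exact request shape are assumptions, not a documented API.

```python
# Hypothetical gateway call, assuming an OpenAI-compatible chat completions
# endpoint. The base URL and auth header are placeholders, not a real API.
import os
import requests

BASE_URL = "https://example-gateway.invalid/v1"   # placeholder gateway URL
MODEL_ID = "ministral-3-14b-instruct-2512"        # model slug from this page

response = requests.post(
    f"{BASE_URL}/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['GATEWAY_API_KEY']}"},
    json={
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": "Hello, Ministral!"}],
        "max_tokens": 64,
    },
    timeout=30,
)
response.raise_for_status()
print(response.json()["choices"][0]["message"]["content"])
```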

FAQ

Common questions about Ministral 3 (14B Instruct 2512)

When was Ministral 3 (14B Instruct 2512) released?
Ministral 3 (14B Instruct 2512) was released on December 4, 2025.

How many parameters does Ministral 3 (14B Instruct 2512) have?
Ministral 3 (14B Instruct 2512) has 14.0 billion parameters.