
Ministral 3 (14B Instruct 2512)

Overview

A balanced model in the Ministral 3 family, Ministral 3 14B is a powerful, efficient tiny language model with vision capabilities. This release is the instruct post-trained version in FP8, fine-tuned for instruction following, which makes it well suited to chat and instruction-based use cases. The Ministral 3 family is designed for edge deployment and runs on a wide range of hardware; Ministral 3 14B can even be deployed locally, fitting in 24 GB of VRAM in BF16 and in less than 12 GB of RAM/VRAM when quantized.
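As an illustration of the local-deployment claim above, the following is a minimal sketch of loading the model for text-only chat with Hugging Face Transformers. The repository ID mistralai/Ministral-3-14B-Instruct-2512 is an assumption (it may not match the actual published checkpoint name), and a vision-capable checkpoint may require a different model class than the one shown here.

```python
# Hypothetical sketch: text-only chat with a locally loaded Ministral 3 14B checkpoint.
# The repo ID below is a placeholder assumption, not a confirmed identifier.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "mistralai/Ministral-3-14B-Instruct-2512"  # hypothetical repo ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForCausalLM.from_pretrained(
    MODEL_ID,
    torch_dtype=torch.bfloat16,  # BF16 weights; per the overview, roughly 24 GB of VRAM
    device_map="auto",           # spread layers across available GPU/CPU memory
)

messages = [
    {"role": "user", "content": "Summarize the Ministral 3 family in one sentence."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(inputs, max_new_tokens=128)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```

For tighter memory budgets, the overview's sub-12 GB figure would correspond to loading a quantized variant (for example an FP8 or 4-bit checkpoint) instead of BF16; the exact quantization formats offered are not specified on this page.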

Ministral 3 (14B Instruct 2512) was released on December 4, 2025.

Performance

Timeline

Released: December 4, 2025
Knowledge Cutoff: Unknown

Specifications

Parameters: 14.0B
License: Apache 2.0
Training Data: Unknown

Benchmarks

Ministral 3 (14B Instruct 2512) Performance Across Datasets

Scores sourced from the model's scorecard, paper, or official blog posts


Pricing

Pricing, performance, and capabilities for Ministral 3 (14B Instruct 2512) across different providers:

No pricing information available for this model.

API Access

API Access Coming Soon

API access for Ministral 3 (14B Instruct 2512) will be available soon through our gateway.


FAQ

Common questions about Ministral 3 (14B Instruct 2512)

Ministral 3 (14B Instruct 2512) was released on December 4, 2025 by Mistral.
Ministral 3 (14B Instruct 2512) was created by Mistral.
Ministral 3 (14B Instruct 2512) has 14.0 billion parameters.
Ministral 3 (14B Instruct 2512) is released under the Apache 2.0 license. This is an open-source/open-weight license.
Yes, Ministral 3 (14B Instruct 2512) is a multimodal model that can process both text and images as input (see the sketch below).
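Since the FAQ notes vision support, here is a hedged sketch of a combined text-and-image prompt using the Transformers "image-text-to-text" pipeline. The repository ID and the image URL are placeholders, and the actual checkpoint may require a different pipeline or processor class than the one assumed here.

```python
# Hypothetical sketch: multimodal (text + image) prompting via the Transformers
# "image-text-to-text" pipeline. Repo ID and image URL are placeholders.
from transformers import pipeline

pipe = pipeline(
    "image-text-to-text",
    model="mistralai/Ministral-3-14B-Instruct-2512",  # hypothetical repo ID
    device_map="auto",
)

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/chart.png"},
            {"type": "text", "text": "What does this chart show?"},
        ],
    }
]

result = pipe(text=messages, max_new_tokens=64)
print(result[0]["generated_text"])
```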