
Mistral Small 3.1 24B Base

Overview


Pretrained base model version of Mistral Small 3.1. Features improved text performance, multimodal understanding, multilingual capabilities, and an expanded 128k token context window compared to Mistral Small 3. Designed for fine-tuning.

Mistral Small 3.1 24B Base was released on March 17, 2025. API access is available through Mistral AI.
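API access through Mistral AI goes through the platform's chat completions endpoint. The sketch below is a minimal example, assuming a placeholder model identifier (`mistral-small-latest`; the exact API name under which this base checkpoint is served is not given on this page) and a `MISTRAL_API_KEY` environment variable:

```python
import json
import os
import urllib.request

# Mistral AI's chat completions endpoint (see the platform docs).
API_URL = "https://api.mistral.ai/v1/chat/completions"


def build_payload(prompt: str, model: str = "mistral-small-latest") -> dict:
    """Build a chat completion request body.

    NOTE: the model id is a placeholder -- check Mistral's model list
    for the exact name under which this checkpoint is served.
    """
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": 128,
    }


def query(prompt: str) -> dict:
    """Send the request and return the parsed JSON response."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.load(resp)
```

`build_payload` is split out so the request shape can be inspected without making a network call.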

Performance

Timeline

Released: March 17, 2025
Knowledge Cutoff: Unknown

Specifications

Parameters: 24.0B
License: Apache 2.0
Training Data: Unknown
Tags: tuning:base

Benchmarks


Mistral Small 3.1 24B Base Performance Across Datasets

Scores sourced from the model's scorecard, paper, or official blog posts

Source: llm-stats.com, Sun Jan 25 2026

Pricing


Pricing, performance, and capabilities for Mistral Small 3.1 24B Base across different providers:

Provider: Mistral AI
Input ($/M): $0.10
Output ($/M): $0.30
Max Input: 128.0K
Max Output: 128.0K
Latency (s): 0.23
Throughput: 137.1 c/s
Quantization: Unknown
Input Modalities: Text, Image
Output Modalities: Text
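Given the per-million-token prices listed above for Mistral AI ($0.10 input, $0.30 output), a rough per-request cost can be estimated with simple arithmetic; this is a sketch with the prices hardcoded as defaults, so verify them against the provider before relying on the numbers:

```python
def estimate_cost_usd(input_tokens: int, output_tokens: int,
                      input_price_per_m: float = 0.10,
                      output_price_per_m: float = 0.30) -> float:
    """Estimate request cost in USD from per-million-token prices."""
    return (input_tokens * input_price_per_m
            + output_tokens * output_price_per_m) / 1_000_000


# Example: a 50,000-token prompt with a 2,000-token completion.
print(f"${estimate_cost_usd(50_000, 2_000):.4f}")  # → $0.0056
```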

API Access

API Access Coming Soon

API access for Mistral Small 3.1 24B Base will be available soon through our gateway.


FAQ

Common questions about Mistral Small 3.1 24B Base

When was Mistral Small 3.1 24B Base released?
Mistral Small 3.1 24B Base was released on March 17, 2025 by Mistral AI.

Who created Mistral Small 3.1 24B Base?
Mistral Small 3.1 24B Base was created by Mistral AI.

How many parameters does Mistral Small 3.1 24B Base have?
Mistral Small 3.1 24B Base has 24.0 billion parameters.

What license is Mistral Small 3.1 24B Base released under?
Mistral Small 3.1 24B Base is released under the Apache 2.0 license, an open-source/open-weight license.

Is Mistral Small 3.1 24B Base multimodal?
Yes, Mistral Small 3.1 24B Base is a multimodal model that can process both text and images as input.