
Mistral Large 3 (675B Base)

Mistral
Variant: mistral-large-3-675b-base-2512

Overview

Mistral Large 3 is a state-of-the-art, general-purpose multimodal granular Mixture-of-Experts model with 41B active parameters and 675B total parameters, trained from scratch on 3,000 H200 GPUs. This is the base pre-trained version, not fine-tuned for instruction following or reasoning, which makes it well suited to custom post-training. Designed for reliability and long-context comprehension, it is engineered for production-grade assistants, retrieval-augmented systems, scientific workloads, and complex enterprise workflows.
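As a quick sanity check on the Mixture-of-Experts figures above (41B active of 675B total parameters, both stated in the overview), the fraction of the model that runs per token works out to roughly 6%. A minimal sketch of that arithmetic:

```python
# Rough arithmetic on the stated Mistral Large 3 parameter counts.
# The 41B-active / 675B-total figures come from the overview above;
# the variable names are illustrative.

ACTIVE_PARAMS_B = 41.0   # parameters activated per token, in billions
TOTAL_PARAMS_B = 675.0   # total parameters across all experts, in billions

# Fraction of the full parameter count exercised for any single token.
active_fraction = ACTIVE_PARAMS_B / TOTAL_PARAMS_B

print(f"Active fraction per token: {active_fraction:.1%}")  # about 6.1%
```

This is the usual trade-off a sparse MoE makes: inference cost scales with the active parameters, while total capacity scales with the full expert pool.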

Mistral Large 3 (675B Base) was released on December 4, 2025.

Performance

Timeline

Release Date: December 4, 2025
Knowledge Cutoff: Unknown

Other Details

Parameters: 675B
License: Apache 2.0
Training Data: Unknown

Related Models

Compare Mistral Large 3 (675B Base) to other models by quality (GPQA score) vs cost. Higher scores and lower costs represent better value.


Benchmarks

Mistral Large 3 (675B Base) Performance Across Datasets

Scores sourced from the model's scorecard, paper, or official blog posts

llm-stats.com - Wed Dec 10 2025

Pricing

Pricing, performance, and capabilities for Mistral Large 3 (675B Base) across different providers:

No pricing information available for this model.


API Access

API Access Coming Soon

API access for Mistral Large 3 (675B Base) will be available soon through our gateway.

FAQ

Common questions about Mistral Large 3 (675B Base)

Q: When was Mistral Large 3 (675B Base) released?
A: Mistral Large 3 (675B Base) was released on December 4, 2025.

Q: How many parameters does Mistral Large 3 (675B Base) have?
A: It has 675 billion total parameters, with 41B active per token.