
Mistral Large 3 (675B Base)

Overview

Mistral Large 3 is a state-of-the-art general-purpose multimodal granular Mixture-of-Experts model with 41B active parameters and 675B total parameters, trained from scratch on 3,000 H200 GPUs. This is the base pre-trained version, not fine-tuned for instruction following or reasoning, which makes it well suited to custom post-training. Designed for reliability and long-context comprehension, it is engineered for production-grade assistants, retrieval-augmented systems, scientific workloads, and complex enterprise workflows.
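To illustrate why a Mixture-of-Experts model activates far fewer parameters than it stores, here is a minimal sketch of top-k expert routing. The expert count and k below are purely illustrative assumptions, not Mistral's actual configuration; only the 41B-active / 675B-total ratio comes from the description above.

```python
import numpy as np

def topk_route(gate_logits: np.ndarray, k: int):
    """Pick the k highest-scoring experts for one token and
    softmax-renormalize their gate weights."""
    topk_idx = np.argsort(gate_logits)[::-1][:k]  # indices of top-k logits
    weights = np.exp(gate_logits[topk_idx] - gate_logits[topk_idx].max())
    weights /= weights.sum()
    return topk_idx, weights

# Illustrative numbers (NOT Mistral's real config): many small
# "granular" experts, of which only k are active per token. This is
# how 675B stored parameters can yield only ~41B active per forward
# pass (roughly a 6% active fraction).
num_experts, k = 128, 8
rng = np.random.default_rng(0)
logits = rng.normal(size=num_experts)
idx, w = topk_route(logits, k)
print(idx)        # the k selected expert indices
print(w.sum())    # gate weights renormalize to 1.0
```

Each token's output is then the weight-averaged output of just those k experts, so compute scales with active, not total, parameters.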

Mistral Large 3 (675B Base) was released on December 4, 2025.

Timeline

Released: December 4, 2025
Knowledge Cutoff: Unknown

Specifications

Parameters: 675.0B
License: Apache 2.0
Training Data: Unknown

Benchmarks

Mistral Large 3 (675B Base) Performance Across Datasets

Scores sourced from the model's scorecard, paper, or official blog posts

llm-stats.com - Fri Feb 06 2026

Pricing

Pricing, performance, and capabilities for Mistral Large 3 (675B Base) across different providers:

No pricing information available for this model.

API Access

API Access Coming Soon

API access for Mistral Large 3 (675B Base) will be available soon through our gateway.


FAQ

Common questions about Mistral Large 3 (675B Base)

Q: When was Mistral Large 3 (675B Base) released?
A: It was released on December 4, 2025.

Q: Who created Mistral Large 3 (675B Base)?
A: It was created by Mistral.

Q: How many parameters does Mistral Large 3 (675B Base) have?
A: It has 675.0 billion parameters.

Q: What license is Mistral Large 3 (675B Base) released under?
A: It is released under the Apache 2.0 license, an open-source/open-weight license.

Q: Is Mistral Large 3 (675B Base) multimodal?
A: Yes, it is a multimodal model that can process both text and images as input.