Model Comparison

Jamba 1.5 Mini vs Mistral Large 2

Mistral Large 2 significantly outperforms on the shared benchmarks (GSM8k and MMLU), while Jamba 1.5 Mini is roughly 12x cheaper per token (blended at a 3:1 input:output ratio).

Performance Benchmarks

Comparative analysis across standard metrics

2 benchmarks

Of the two shared benchmarks, Mistral Large 2 leads in both (GSM8k and MMLU); Jamba 1.5 Mini leads in neither.

Thu Apr 16 2026 • llm-stats.com

Arena Performance

Human preference votes

Pricing Analysis

Price comparison per million tokens

Jamba 1.5 Mini costs less

For input processing, Jamba 1.5 Mini ($0.20/1M tokens) is 10.0x cheaper than Mistral Large 2 ($2.00/1M tokens).

For output processing, Jamba 1.5 Mini ($0.40/1M tokens) is 15.0x cheaper than Mistral Large 2 ($6.00/1M tokens).

In conclusion, Mistral Large 2 is roughly 12x more expensive overall than Jamba 1.5 Mini.*

* Using a 3:1 ratio of input to output tokens
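The blended figure can be reproduced with simple arithmetic; a minimal sketch, assuming the listed per-million-token prices and the 3:1 input-to-output ratio from the footnote:

```python
# Blended price per 1M tokens at a given input:output ratio,
# using the listed prices (input $/1M, output $/1M).
def blended_price(input_price, output_price, input_ratio=3, output_ratio=1):
    total = input_ratio + output_ratio
    return (input_ratio * input_price + output_ratio * output_price) / total

jamba = blended_price(0.20, 0.40)    # $0.25 per 1M blended tokens
mistral = blended_price(2.00, 6.00)  # $3.00 per 1M blended tokens
print(round(mistral / jamba, 1))     # 12.0 -> the "12x cheaper" figure
```

Note how the blended multiple (12x) lands between the input-only (10x) and output-only (15x) ratios, weighted toward input by the 3:1 assumption.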

Lowest available price from all providers
AI21 Labs
Jamba 1.5 Mini
Input tokens: $0.20
Output tokens: $0.40
Best provider: AWS Bedrock
Mistral AI
Mistral Large 2
Input tokens: $2.00
Output tokens: $6.00
Best provider: Google

Model Size

Parameter count comparison

71.0B diff

Mistral Large 2 has 71.0B more parameters than Jamba 1.5 Mini, making it 136.5% larger.

AI21 Labs
Jamba 1.5 Mini
52.0B parameters
Mistral AI
Mistral Large 2
123.0B parameters
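The 136.5% figure follows directly from the two parameter counts; a quick check, using the numbers from the listings above:

```python
# Relative size difference between the two parameter counts (in billions).
jamba_params = 52.0
mistral_params = 123.0

diff = mistral_params - jamba_params      # 71.0B more parameters
pct_larger = (diff / jamba_params) * 100  # ~136.5% larger
print(f"{diff:.1f}B diff, {pct_larger:.1f}% larger")
```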

Context Window

Maximum input and output token capacity

Jamba 1.5 Mini accepts up to 256,144 input tokens, roughly double Mistral Large 2's 128,000. Jamba 1.5 Mini can also generate responses of up to 256,144 tokens, while Mistral Large 2 is limited to 128,000 tokens.

AI21 Labs
Jamba 1.5 Mini
Input: 256,144 tokens
Output: 256,144 tokens
Mistral AI
Mistral Large 2
Input: 128,000 tokens
Output: 128,000 tokens
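A larger window matters when a document's token count approaches the limit. A rough fit check, assuming the common ~4-characters-per-token heuristic (an approximation only; neither model's actual tokenizer is used here):

```python
# Rough check of whether a document fits in each model's context window,
# using the coarse ~4 characters per token heuristic (real tokenizers vary).
def approx_tokens(text: str) -> int:
    return max(1, len(text) // 4)

limits = {"Jamba 1.5 Mini": 256_144, "Mistral Large 2": 128_000}

doc = "x" * 600_000  # a ~600k-character document, roughly 150k tokens
for model, limit in limits.items():
    tokens = approx_tokens(doc)
    print(f"{model}: ~{tokens:,} tokens, fits={tokens <= limit}")
```

Under this estimate a ~150k-token document fits Jamba 1.5 Mini's window but not Mistral Large 2's; for real workloads, count tokens with each provider's own tokenizer.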

License

Usage and distribution terms

Jamba 1.5 Mini is licensed under Jamba Open Model License, while Mistral Large 2 uses Mistral Research License.

License differences may affect how you can use these models in commercial or open-source projects.

Jamba 1.5 Mini

Jamba Open Model License

Open weights

Mistral Large 2

Mistral Research License

Open weights

Release Timeline

When each model was launched

Jamba 1.5 Mini was released on 2024-08-22, while Mistral Large 2 was released on 2024-07-24.

Jamba 1.5 Mini is 1 month newer than Mistral Large 2.

Jamba 1.5 Mini

Aug 22, 2024 (1.6 years ago) • 4 weeks newer

Mistral Large 2

Jul 24, 2024 (1.7 years ago)

Knowledge Cutoff

When training data ends

Jamba 1.5 Mini has a documented knowledge cutoff of 2024-03-05, while Mistral Large 2's cutoff date is not specified.

We can confirm Jamba 1.5 Mini's training data extends to 2024-03-05, but cannot make a direct comparison without Mistral Large 2's cutoff date.

Jamba 1.5 Mini

Mar 2024

Mistral Large 2

Not specified

Provider Availability

Jamba 1.5 Mini is available from AWS Bedrock and Google. Mistral Large 2 is available from Google and Mistral AI.

Jamba 1.5 Mini

AWS Bedrock
Input: $0.20/1M • Output: $0.40/1M
Google
Input: $0.20/1M • Output: $0.40/1M

Mistral Large 2

Google
Input: $2.00/1M • Output: $6.00/1M
Mistral
Input: $2.00/1M • Output: $6.00/1M
* Prices shown are per million tokens

Outputs Comparison

Key Takeaways

Jamba 1.5 Mini: larger context window (256,144 vs 128,000 tokens)
Jamba 1.5 Mini: less expensive input tokens ($0.20 vs $2.00 per 1M)
Jamba 1.5 Mini: less expensive output tokens ($0.40 vs $6.00 per 1M)
Mistral Large 2: higher GSM8k score (93.0% vs 75.8%)
Mistral Large 2: higher MMLU score (84.0% vs 69.7%)

Detailed Comparison

AI Model Comparison Table

Feature | Jamba 1.5 Mini (AI21 Labs) | Mistral Large 2 (Mistral AI)
Parameters | 52.0B | 123.0B
Context window | 256,144 tokens | 128,000 tokens
Input price | $0.20/1M tokens | $2.00/1M tokens
Output price | $0.40/1M tokens | $6.00/1M tokens
License | Jamba Open Model License | Mistral Research License
Released | Aug 22, 2024 | Jul 24, 2024

FAQ

Common questions about Jamba 1.5 Mini vs Mistral Large 2

Which model is better overall?
Mistral Large 2 significantly outperforms across most benchmarks. Jamba 1.5 Mini is made by AI21 Labs and Mistral Large 2 is made by Mistral AI. The best choice depends on your use case: compare their benchmark scores, pricing, and capabilities above.

How do their benchmark scores compare?
Jamba 1.5 Mini scores ARC-C: 85.7%, GSM8k: 75.8%, MMLU: 69.7%, TruthfulQA: 54.1%, Arena Hard: 46.1%. Mistral Large 2 scores GSM8k: 93.0%, HumanEval: 92.0%, MT-Bench: 86.3%, MMLU: 84.0%, MMLU French: 82.8%.

How does pricing compare?
Jamba 1.5 Mini is 10.0x cheaper for input tokens. Jamba 1.5 Mini costs $0.20/M input and $0.40/M output via AWS Bedrock. Mistral Large 2 costs $2.00/M input and $6.00/M output via Google.

Which has the larger context window?
Jamba 1.5 Mini supports 256K tokens and Mistral Large 2 supports 128K tokens. A larger context window lets you process longer documents, conversations, or codebases in a single request.

What are the key differences?
Key differences include context window (256K vs 128K), input pricing ($0.20 vs $2.00/M), and licensing (Jamba Open Model License vs Mistral Research License). See the full comparison above for benchmark-by-benchmark results.

Who develops these models?
Jamba 1.5 Mini is developed by AI21 Labs and Mistral Large 2 is developed by Mistral AI.