Model Comparison
Jamba 1.5 Large vs Qwen2.5-Coder 32B Instruct
Jamba 1.5 Large outperforms across most of the benchmarks compared here, while Qwen2.5-Coder 32B Instruct is roughly 38.9x cheaper per token (blended, assuming a 3:1 input-to-output ratio).
Performance Benchmarks
Comparative analysis across standard metrics
Jamba 1.5 Large leads on 4 benchmarks (ARC-C, MMLU, MMLU-Pro, TruthfulQA), while Qwen2.5-Coder 32B Instruct leads on 1 (GSM8k).
Arena Performance
Human preference votes
Pricing Analysis
Price comparison per million tokens
For input processing, Jamba 1.5 Large ($2.00/1M tokens) is 22.2x more expensive than Qwen2.5-Coder 32B Instruct ($0.09/1M tokens).
For output processing, Jamba 1.5 Large ($8.00/1M tokens) is 88.9x more expensive than Qwen2.5-Coder 32B Instruct ($0.09/1M tokens).
Overall, Jamba 1.5 Large works out to roughly 38.9x more expensive than Qwen2.5-Coder 32B Instruct on a blended basis.*
* Using a 3:1 ratio of input to output tokens
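As a quick check of the blended figure, here is a minimal Python sketch that reproduces the 38.9x ratio from the listed per-million-token prices, using the 3:1 input-to-output mix from the footnote:

```python
# Blended price per 1M tokens, assuming a 3:1 input-to-output token mix.
def blended_price(input_per_m: float, output_per_m: float, input_ratio: float = 3.0) -> float:
    return (input_ratio * input_per_m + output_per_m) / (input_ratio + 1)

jamba = blended_price(2.00, 8.00)  # $3.50 per 1M tokens
qwen = blended_price(0.09, 0.09)   # $0.09 per 1M tokens
print(f"Jamba blended: ${jamba:.2f}/1M, Qwen blended: ${qwen:.2f}/1M")
print(f"Price ratio: {jamba / qwen:.1f}x")  # ~38.9x
```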
Model Size
Parameter count comparison
Jamba 1.5 Large has 366.0B more parameters than Qwen2.5-Coder 32B Instruct (398B total versus 32B), making it roughly 1143.8% larger. Note that Jamba 1.5 Large is a mixture-of-experts model, so only about 94B of those parameters are active for any given token.
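The percentage follows directly from the two parameter counts; a minimal sketch using the totals above:

```python
# Relative size from total parameter counts (398B = 32B + the stated 366B gap).
jamba_params = 398e9
qwen_params = 32e9
diff = jamba_params - qwen_params
print(f"Difference: {diff / 1e9:.1f}B")            # 366.0B
print(f"Relative size: {diff / qwen_params:.1%}")  # 1143.8% larger
```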
Context Window
Maximum input and output token capacity
Jamba 1.5 Large accepts up to 256,000 input tokens, double Qwen2.5-Coder 32B Instruct's 128,000. Jamba 1.5 Large can also generate responses of up to 256,000 tokens, while Qwen2.5-Coder 32B Instruct is limited to 128,000.
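To illustrate what the 256K-versus-128K gap means in practice, here is a sketch that estimates whether a document fits in each window. It uses the common but rough heuristic of ~4 characters per token; for real workloads you would use each model's tokenizer.

```python
# Rough context-fit check using the ~4 characters-per-token heuristic.
CONTEXT_LIMITS = {
    "jamba-1.5-large": 256_000,
    "qwen2.5-coder-32b-instruct": 128_000,
}

def fits_in_context(text: str, model: str, reserve_for_output: int = 4_000) -> bool:
    est_tokens = len(text) // 4  # crude estimate; use a real tokenizer for accuracy
    return est_tokens + reserve_for_output <= CONTEXT_LIMITS[model]

doc = "x" * 600_000  # roughly 150K tokens of input
for model in CONTEXT_LIMITS:
    print(model, fits_in_context(doc, model))  # True for Jamba, False for Qwen
```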
License
Usage and distribution terms
Jamba 1.5 Large is licensed under Jamba Open Model License, while Qwen2.5-Coder 32B Instruct uses Apache 2.0.
License differences may affect how you can use these models in commercial or open-source projects.
Jamba 1.5 Large: Jamba Open Model License (open weights)
Qwen2.5-Coder 32B Instruct: Apache 2.0 (open weights)
Release Timeline
When each model was launched
Jamba 1.5 Large was released on 2024-08-22, while Qwen2.5-Coder 32B Instruct was released on 2024-09-19.
Qwen2.5-Coder 32B Instruct is 1 month newer than Jamba 1.5 Large.
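The gap works out to exactly four weeks, as a quick datetime check confirms:

```python
from datetime import date

jamba_release = date(2024, 8, 22)
qwen_release = date(2024, 9, 19)
gap = qwen_release - jamba_release
print(gap.days, "days")  # 28 days, i.e. 4 weeks
```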
Knowledge Cutoff
When training data ends
Jamba 1.5 Large has a documented knowledge cutoff of 2024-03-05, while Qwen2.5-Coder 32B Instruct's cutoff date is not specified.
We can confirm Jamba 1.5 Large's training data extends to 2024-03-05, but cannot make a direct comparison without Qwen2.5-Coder 32B Instruct's cutoff date.
Provider Availability
Jamba 1.5 Large is available from Amazon Bedrock and Google Cloud (Vertex AI). Qwen2.5-Coder 32B Instruct is available from Lambda, DeepInfra, Hyperbolic, and Fireworks.
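Most of the hosts above expose OpenAI-compatible endpoints. As an illustration, the sketch below calls Qwen2.5-Coder 32B Instruct through DeepInfra; the base URL and model identifier follow DeepInfra's usual conventions (OpenAI-compatible endpoint, Hugging Face-style model ids) but are assumptions here, so check the provider's docs before relying on them.

```python
# Hypothetical call via an OpenAI-compatible endpoint; base_url and model id
# follow DeepInfra's usual conventions but are not verified in this document.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.deepinfra.com/v1/openai",  # assumed endpoint
    api_key="YOUR_DEEPINFRA_API_KEY",
)

response = client.chat.completions.create(
    model="Qwen/Qwen2.5-Coder-32B-Instruct",  # assumed Hugging Face-style id
    messages=[{"role": "user", "content": "Write a binary search in Python."}],
    max_tokens=512,
)
print(response.choices[0].message.content)
```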
Key Takeaways
Jamba 1.5 Large (AI21 Labs)
Qwen2.5-Coder 32B Instruct (Alibaba Cloud / Qwen Team)
Detailed Comparison
| Feature | Jamba 1.5 Large | Qwen2.5-Coder 32B Instruct |
|---|---|---|
| Developer | AI21 Labs | Alibaba Cloud / Qwen Team |
| Parameters | 398B total (94B active) | 32B |
| Context window | 256,000 tokens | 128,000 tokens |
| Input price | $2.00 / 1M tokens | $0.09 / 1M tokens |
| Output price | $8.00 / 1M tokens | $0.09 / 1M tokens |
| License | Jamba Open Model License (open weights) | Apache 2.0 (open weights) |
| Release date | 2024-08-22 | 2024-09-19 |
| Knowledge cutoff | 2024-03-05 | Not specified |
FAQ
Common questions about Jamba 1.5 Large vs Qwen2.5-Coder 32B Instruct