Model Comparison
Jamba 1.5 Large vs Phi-3.5-mini-instruct
Jamba 1.5 Large significantly outperforms across most benchmarks. Phi-3.5-mini-instruct is 35.0x cheaper per token.
Performance Benchmarks
Comparative analysis across standard metrics
Jamba 1.5 Large leads on six benchmarks (ARC-C, Arena Hard, GPQA, GSM8k, MMLU, MMLU-Pro), while Phi-3.5-mini-instruct leads on one (TruthfulQA).
Arena Performance
Human preference votes
Pricing Analysis
Price comparison per million tokens
For input processing, Jamba 1.5 Large ($2.00/1M tokens) is 20.0x more expensive than Phi-3.5-mini-instruct ($0.10/1M tokens).
For output processing, Jamba 1.5 Large ($8.00/1M tokens) is 80.0x more expensive than Phi-3.5-mini-instruct ($0.10/1M tokens).
In conclusion, Jamba 1.5 Large is roughly 35.0x more expensive than Phi-3.5-mini-instruct.*
* Using a 3:1 ratio of input to output tokens
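The blended figures above can be reproduced from the per-million-token prices. A minimal sketch in Python (the 3:1 input:output ratio is the footnote's assumption; `blended_price` is an illustrative helper, not a vendor API):

```python
def blended_price(input_price: float, output_price: float,
                  input_ratio: int = 3, output_ratio: int = 1) -> float:
    """Blended price per 1M tokens, weighting input and output by the given ratio."""
    total = input_ratio + output_ratio
    return (input_price * input_ratio + output_price * output_ratio) / total

jamba = blended_price(2.00, 8.00)  # (3 * $2.00 + 1 * $8.00) / 4 = $3.50 per 1M tokens
phi = blended_price(0.10, 0.10)    # $0.10 per 1M tokens
print(f"Jamba 1.5 Large is {jamba / phi:.1f}x more expensive")  # 35.0x
```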
Model Size
Parameter count comparison
Jamba 1.5 Large has 394.2B more parameters than Phi-3.5-mini-instruct, making it 10373.7% larger.
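The percentage figure follows from the raw parameter counts. A quick sketch, assuming totals of roughly 398B for Jamba 1.5 Large (Mixture-of-Experts total) and 3.8B for Phi-3.5-mini-instruct, which is consistent with the 394.2B difference stated above:

```python
# Assumed totals, in billions of parameters (not from the comparison table itself).
jamba_params = 398.0
phi_params = 3.8

diff = jamba_params - phi_params       # difference in parameter count
pct_larger = diff / phi_params * 100   # size difference as a percentage
print(f"{diff:.1f}B more parameters, {pct_larger:.1f}% larger")
# 394.2B more parameters, 10373.7% larger
```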
Context Window
Maximum input and output token capacity
Jamba 1.5 Large accepts 256,000 input tokens compared to Phi-3.5-mini-instruct's 128,000 tokens. Jamba 1.5 Large also supports longer outputs, up to 256,000 tokens, while Phi-3.5-mini-instruct is limited to 128,000 tokens.
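In practice, the context window determines whether a request fits at all. A minimal sketch, assuming prompt and generated tokens share a single window (`fits` is an illustrative helper; real token counts come from each model's tokenizer):

```python
# Context window sizes from the comparison above, in tokens.
CONTEXT_WINDOWS = {
    "Jamba 1.5 Large": 256_000,
    "Phi-3.5-mini-instruct": 128_000,
}

def fits(model: str, prompt_tokens: int, max_output_tokens: int) -> bool:
    """Check whether prompt plus requested output fits in the model's window."""
    return prompt_tokens + max_output_tokens <= CONTEXT_WINDOWS[model]

print(fits("Phi-3.5-mini-instruct", 120_000, 10_000))  # False: exceeds 128k
print(fits("Jamba 1.5 Large", 120_000, 10_000))        # True
```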
License
Usage and distribution terms
Jamba 1.5 Large is licensed under Jamba Open Model License, while Phi-3.5-mini-instruct uses MIT.
License differences may affect how you can use these models in commercial or open-source projects.
Jamba 1.5 Large: Jamba Open Model License (open weights)
Phi-3.5-mini-instruct: MIT (open weights)
Release Timeline
When each model was launched
Jamba 1.5 Large was released on 2024-08-22, while Phi-3.5-mini-instruct was released on 2024-08-23.
Phi-3.5-mini-instruct is one day newer than Jamba 1.5 Large.
Knowledge Cutoff
When training data ends
Jamba 1.5 Large has a documented knowledge cutoff of 2024-03-05, while Phi-3.5-mini-instruct's cutoff date is not specified.
We can confirm Jamba 1.5 Large's training data extends to 2024-03-05, but cannot make a direct comparison without Phi-3.5-mini-instruct's cutoff date.
Provider Availability
Jamba 1.5 Large is available from Amazon Bedrock and Google Cloud (Vertex AI). Phi-3.5-mini-instruct is available from Microsoft Azure.
Outputs Comparison
Key Takeaways
Jamba 1.5 Large (AI21 Labs)
Phi-3.5-mini-instruct (Microsoft)
Detailed Comparison
FAQ
Common questions about Jamba 1.5 Large vs Phi-3.5-mini-instruct