Model Comparison
Jamba 1.5 Large vs Llama 3.1 70B Instruct
Llama 3.1 70B Instruct significantly outperforms across most benchmarks and is roughly 17.5x cheaper per token on a blended 3:1 input:output basis.
Performance Benchmarks
Comparative analysis across standard metrics
Jamba 1.5 Large does not lead on any of the benchmarks compared here, while Llama 3.1 70B Instruct scores higher on all four (ARC-C, GPQA, MMLU, MMLU-Pro).
Arena Performance
Human preference votes
Pricing Analysis
Price comparison per million tokens
For input processing, Jamba 1.5 Large ($2.00/1M tokens) is 10.0x more expensive than Llama 3.1 70B Instruct ($0.20/1M tokens).
For output processing, Jamba 1.5 Large ($8.00/1M tokens) is 40.0x more expensive than Llama 3.1 70B Instruct ($0.20/1M tokens).
In conclusion, Jamba 1.5 Large is roughly 17.5x more expensive than Llama 3.1 70B Instruct overall.*
* Using a 3:1 ratio of input to output tokens
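To make the blended figure concrete, here is a minimal sketch (in Python, written for this comparison) of the arithmetic behind the 17.5x multiple, using the per-million-token rates quoted above and the footnoted 3:1 input:output ratio:

```python
def blended_price(input_price: float, output_price: float,
                  input_ratio: float = 3.0, output_ratio: float = 1.0) -> float:
    """Weighted average price per 1M tokens for a given input:output mix."""
    total = input_ratio + output_ratio
    return (input_price * input_ratio + output_price * output_ratio) / total

# Rates quoted in this comparison, in $ per 1M tokens.
jamba = blended_price(2.00, 8.00)   # (2.00 * 3 + 8.00 * 1) / 4 = 3.50
llama = blended_price(0.20, 0.20)   # (0.20 * 3 + 0.20 * 1) / 4 = 0.20

print(f"Jamba 1.5 Large blended: ${jamba:.2f}/1M tokens")
print(f"Llama 3.1 70B blended:   ${llama:.2f}/1M tokens")
print(f"Cost multiple:           {jamba / llama:.1f}x")  # 17.5x
```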
Model Size
Parameter count comparison
Jamba 1.5 Large has 328.0B more parameters than Llama 3.1 70B Instruct (398B total vs 70B), making it 468.6% larger.
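The percentage follows directly from the parameter counts; a quick sketch of that arithmetic (the 398B total is implied by 70B + 328B):

```python
jamba_params = 398.0  # billions of parameters, as implied above (70 + 328)
llama_params = 70.0

diff = jamba_params - llama_params  # 328.0B more parameters
pct = diff / llama_params * 100     # 468.6% larger
print(f"{diff:.1f}B more parameters, {pct:.1f}% larger")
```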
Context Window
Maximum input and output token capacity
Jamba 1.5 Large accepts 256,000 input tokens, double Llama 3.1 70B Instruct's 128,000. Jamba 1.5 Large can also generate longer responses, up to 256,000 tokens, while Llama 3.1 70B Instruct is limited to 128,000 tokens.
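One practical consequence: a long prompt plus its completion must fit within a single window. The sketch below is a generic pre-flight check, assuming illustrative model identifiers (not official API names) and pre-computed token counts; real applications should count tokens with each model's own tokenizer:

```python
# Context windows from this comparison, in tokens.
CONTEXT_WINDOWS = {
    "jamba-1.5-large": 256_000,         # hypothetical identifier
    "llama-3.1-70b-instruct": 128_000,  # hypothetical identifier
}

def fits_context(model: str, prompt_tokens: int, max_output_tokens: int) -> bool:
    """Return True if the prompt plus requested completion fit the model's window."""
    return prompt_tokens + max_output_tokens <= CONTEXT_WINDOWS[model]

# A 200k-token prompt with a 4k-token completion fits only the larger window.
print(fits_context("jamba-1.5-large", 200_000, 4_000))         # True
print(fits_context("llama-3.1-70b-instruct", 200_000, 4_000))  # False
```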
License
Usage and distribution terms
Jamba 1.5 Large is licensed under Jamba Open Model License, while Llama 3.1 70B Instruct uses Llama 3.1 Community License.
License differences may affect how you can use these models in commercial or open-source projects.
Both licenses provide open weights.
Release Timeline
When each model was launched
Jamba 1.5 Large was released on 2024-08-22, while Llama 3.1 70B Instruct was released on 2024-07-23.
Jamba 1.5 Large is 1 month newer than Llama 3.1 70B Instruct.
Knowledge Cutoff
When training data ends
Jamba 1.5 Large has a documented knowledge cutoff of 2024-03-05, while Llama 3.1 70B Instruct's cutoff date is not specified. We can confirm Jamba 1.5 Large's training data extends to March 2024, but without a published cutoff for Llama 3.1 70B Instruct a direct comparison isn't possible.
Provider Availability
Jamba 1.5 Large is available from two providers: Bedrock and Google. Llama 3.1 70B Instruct is available from nine: Lambda, DeepInfra, Hyperbolic, Groq, Cerebras, Together, Fireworks, Bedrock, and Sambanova.