
Jamba 1.5 Large

AI21 Labs
jamba-1.5-large

Overview

State-of-the-art hybrid SSM-Transformer instruction-following foundation model, offering superior long-context handling, speed, and quality.

Jamba 1.5 Large was released on August 22, 2024. API access is available through Amazon Bedrock and Google Cloud.
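As a sketch of what a Bedrock request might look like, the snippet below builds a chat-style request body. The model ID and body shape are assumptions based on Bedrock's published conventions for AI21 models, not confirmed by this page; verify against the current Bedrock documentation before use.

```python
import json

# Assumed model ID following Bedrock's naming convention for AI21 models.
MODEL_ID = "ai21.jamba-1-5-large-v1:0"

def build_request(prompt: str, max_tokens: int = 256) -> str:
    """Serialize a chat-style request body for Jamba 1.5 Large."""
    body = {
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
        "temperature": 0.4,
    }
    return json.dumps(body)

# An actual call would go through boto3's Bedrock runtime client, e.g.:
#   client = boto3.client("bedrock-runtime")
#   resp = client.invoke_model(modelId=MODEL_ID, body=build_request("Hello"))
print(build_request("Summarize the Jamba architecture in one sentence."))
```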

Performance

Timeline

Release Date: August 22, 2024
Knowledge Cutoff: Unknown

Other Details

Parameters: 398.0B
License: Jamba Open Model License
Training Data: Unknown
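The 398B parameter count above implies a substantial memory footprint just for the weights. A back-of-the-envelope estimate at common precisions (fp16 at 2 bytes per parameter, and so on) is sketched below; real deployments also need KV-cache and activation memory on top.

```python
# Rough weight-only memory footprint for a 398B-parameter model.
PARAMS = 398e9  # parameter count from the model card above

for name, bytes_per_param in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
    gb = PARAMS * bytes_per_param / 1e9
    print(f"{name:>9}: ~{gb:,.0f} GB")  # fp16/bf16 works out to ~796 GB
```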

Related Models

Compare Jamba 1.5 Large to other models by quality (GPQA score) vs cost. Higher scores and lower costs represent better value.


Benchmarks

Jamba 1.5 Large Performance Across Datasets

Scores are sourced from the model's scorecard, paper, or official blog posts.

llm-stats.com - Fri Dec 05 2025

Pricing

Pricing, performance, and capabilities for Jamba 1.5 Large across different providers:

| Provider | Input ($/M) | Output ($/M) | Max Input | Max Output | Latency (s) | Throughput (tok/s) | Quantization | Input Modalities | Output Modalities |
|---|---|---|---|---|---|---|---|---|---|
| Bedrock | $2.00 | $8.00 | 256.0K | 256.0K | 0.5 | 100.0 | Unknown | Text | Text |
| Google | $2.00 | $8.00 | 256.0K | 256.0K | 0.3 | 42.0 | Unknown | Text | Text |
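At the listed rates of $2.00 per million input tokens and $8.00 per million output tokens (identical on both providers), the cost of a single request is straightforward to estimate:

```python
# Per-request cost at the rates listed in the pricing table above.
INPUT_PER_M = 2.00   # USD per 1M input tokens
OUTPUT_PER_M = 8.00  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    return input_tokens / 1e6 * INPUT_PER_M + output_tokens / 1e6 * OUTPUT_PER_M

# Example: a long-context call with 200K input tokens and 2K output tokens.
print(f"${request_cost(200_000, 2_000):.4f}")  # prints $0.4160
```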

Price Comparison for Jamba 1.5 Large

Price per 1M input tokens (USD), lower is better


Throughput Comparison for Jamba 1.5 Large

Tokens per second, higher is better


Latency Comparison for Jamba 1.5 Large

Time to first token (s), lower is better


Jamba 1.5 Large API Providers: Price vs Throughput
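The latency and throughput figures above can be combined into a rough time-to-last-token estimate: total time is approximately the time to first token plus output tokens divided by throughput. This is a simplification, assuming steady decode throughput after the first token.

```python
# Rough end-to-end response-time estimate from the per-provider figures above.
def total_time(ttft_s: float, throughput_tps: float, output_tokens: int) -> float:
    """time-to-first-token + decode time at steady throughput."""
    return ttft_s + output_tokens / throughput_tps

providers = {"Bedrock": (0.5, 100.0), "Google": (0.3, 42.0)}
for name, (ttft, tps) in providers.items():
    print(f"{name}: ~{total_time(ttft, tps, 500):.1f} s for 500 output tokens")
```

Note how the ranking flips relative to latency alone: Google starts faster (0.3 s vs 0.5 s), but Bedrock's higher throughput wins for longer outputs.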


API Access

API Access Coming Soon

API access for Jamba 1.5 Large will be available soon through our gateway.

FAQ

Common questions about Jamba 1.5 Large

When was Jamba 1.5 Large released? Jamba 1.5 Large was released on August 22, 2024.

How many parameters does Jamba 1.5 Large have? Jamba 1.5 Large has 398 billion parameters.