FAQ

Common questions about Bedrock.

What is Bedrock?

Bedrock is an API provider that hosts large language models. It currently lists 1 active model, with input pricing from $15.00 per 1M tokens, average throughput of 120 tokens/s, average latency of 0.50 s, and a maximum context window of 200K tokens.

How many models does Bedrock offer?

Bedrock currently serves 1 active model out of 22 historical offerings on LLM Stats.

What is Bedrock's API pricing?

Bedrock input pricing is $15.00 per 1M tokens; with a single active model, this is both the cheapest and the most expensive offering. See the Pricing tab above for the full per-model breakdown.
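At a flat per-token rate, per-request input cost is simple arithmetic. A minimal sketch (the 4,000-token prompt size is an illustrative assumption):

```python
def input_cost_usd(input_tokens: int, price_per_million: float = 15.00) -> float:
    """Input cost in USD at a per-1M-token rate."""
    return input_tokens / 1_000_000 * price_per_million

# e.g. a 4,000-token prompt at $15.00 / 1M input tokens costs about $0.06
print(input_cost_usd(4_000))
```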

How fast is Bedrock?

Bedrock averages 120 output tokens per second across its catalog, with average latency of 0.50s. Per-model performance is shown in the Performance tab.

Does Bedrock support multimodal models?

Yes. Bedrock's catalog includes 1 vision-capable model. See the Models and Capabilities tabs for the full per-model breakdown.

Whose models does Bedrock host?

Bedrock hosts models from Anthropic, AI21 Labs, Amazon, Cohere, and Meta. See the Models tab for the full catalog grouped by creator.

How do I start using Bedrock?

Sign up at https://aws.amazon.com/bedrock/ to get an API key, then call Bedrock's API directly from your application. Use the Pricing and Performance tabs above to pick the right model for your latency, cost, and context-window requirements.
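Once your AWS credentials are configured, a call can be made with the AWS SDK for Python (boto3) via the Converse API. A minimal sketch; the model ID below is an assumption, so substitute one enabled in your account:

```python
def build_request(prompt: str, max_tokens: int = 512) -> dict:
    """Assemble Converse API arguments for a single-turn prompt."""
    # Hypothetical model ID -- replace with one enabled in your AWS account.
    return {
        "modelId": "anthropic.claude-3-5-sonnet-20240620-v1:0",
        "messages": [{"role": "user", "content": [{"text": prompt}]}],
        "inferenceConfig": {"maxTokens": max_tokens},
    }

def ask(prompt: str) -> str:
    """Send the prompt to Bedrock and return the first text block of the reply."""
    import boto3  # imported here so the helper above works without the SDK
    client = boto3.client("bedrock-runtime")  # uses your AWS credentials/region
    response = client.converse(**build_request(prompt))
    return response["output"]["message"]["content"][0]["text"]
```

The Converse API gives one request shape across model families, which avoids rewriting per-model request bodies when you switch models later.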