
Kimi K2 Base

MoonshotAI
Variant: kimi-k2-base

Overview

Kimi K2 Base is a state-of-the-art mixture-of-experts (MoE) language model with 1 trillion total parameters, of which 32 billion are activated per token. Trained on 15.5 trillion tokens with the MuonClip optimizer, it is the foundation model released before instruction tuning. It demonstrates strong performance on knowledge, reasoning, and coding benchmarks while being optimized for agentic capabilities.
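MuonClip pairs the Muon optimizer with a QK-clip step that tames exploding attention logits during pre-training. The sketch below is a minimal illustration of the QK-clip idea as described in the Kimi K2 technical report, not Moonshot's implementation: the threshold value, the per-head max-logit input, and the helper name are assumptions for illustration.

```python
import torch

# Minimal QK-clip sketch (hypothetical helper, not MoonshotAI's code).
# Idea per the Kimi K2 report: after each optimizer step, if a head's max
# attention logit exceeds a threshold tau, rescale that head's query/key
# projection weights so the logit is pulled back toward tau.

TAU = 100.0  # logit threshold; the value here is an assumption

def qk_clip(w_q: torch.Tensor, w_k: torch.Tensor, max_logit: float) -> None:
    """Rescale one head's W_q and W_k in place if its max logit is too large."""
    if max_logit > TAU:
        gamma = TAU / max_logit
        # Split the correction evenly between queries and keys.
        w_q.mul_(gamma ** 0.5)
        w_k.mul_(gamma ** 0.5)

# Toy usage: a single 64-dim head whose logits have drifted too high.
w_q = torch.randn(64, 512)
w_k = torch.randn(64, 512)
qk_clip(w_q, w_k, max_logit=250.0)  # weights rescaled so logits approach TAU
```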

Kimi K2 Base was released on July 11, 2025.
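To see how 32 billion activated parameters coexist with 1 trillion total, it helps to tally the expert weights. The back-of-envelope script below uses configuration figures reported publicly for Kimi K2 (384 routed experts with 8 active per token, plus one shared expert, hidden size 7168, expert intermediate size 2048, roughly 60 MoE layers); treat these numbers as assumptions and the totals as rough estimates, not an official count.

```python
# Back-of-envelope MoE parameter tally for Kimi K2 (figures are assumptions
# taken from public reports; the result is an estimate).

hidden = 7168            # model hidden size
expert_inter = 2048      # per-expert FFN intermediate size
n_experts = 384          # routed experts per MoE layer
top_k = 8                # routed experts activated per token
n_shared = 1             # shared expert, always active
moe_layers = 60          # layers with MoE FFNs

# A SwiGLU-style expert has three weight matrices: gate, up, and down.
per_expert = 3 * hidden * expert_inter

total_expert_params = moe_layers * (n_experts + n_shared) * per_expert
active_expert_params = moe_layers * (top_k + n_shared) * per_expert

print(f"total expert params:  {total_expert_params / 1e12:.2f}T")  # ~1.0T
print(f"active expert params: {active_expert_params / 1e9:.1f}B")  # ~24B
# Attention, embeddings, and the dense layers add the remaining few billion
# activated parameters, landing near the reported 32B-of-1T split.
```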

Performance

Timeline

Release Date: July 11, 2025
Knowledge Cutoff: Unknown

Other Details

Parameters: 1000.0B
License: MIT
Training Data: Unknown
Tags: tuning:base
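The tuning:base tag marks this as a raw completion model: it continues text rather than following chat instructions, so you prompt it with a prefix instead of a message list. Below is a minimal sketch using Hugging Face transformers, assuming the checkpoint is published under the repo id moonshotai/Kimi-K2-Base; serving a 1T-parameter MoE realistically requires multi-GPU or hosted infrastructure, so treat this as illustrative rather than a deployment recipe.

```python
# Minimal base-model completion sketch (the repo id below is an assumption;
# a 1T-parameter MoE needs multi-GPU or hosted serving in practice).
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "moonshotai/Kimi-K2-Base"
tokenizer = AutoTokenizer.from_pretrained(repo, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    repo,
    trust_remote_code=True,   # architecture code ships with the repo
    device_map="auto",        # shard across available devices
    torch_dtype="auto",
)

# Base models do plain continuation: no chat template, just a text prefix.
prompt = "The mixture-of-experts architecture works by"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```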

Related Models

Compare Kimi K2 Base to other models by quality (GPQA score) versus cost. Higher scores and lower costs represent better value.


Benchmarks

Kimi K2 Base Performance Across Datasets

Scores are sourced from the model's scorecard, paper, or official blog posts.


Pricing

Pricing, performance, and capabilities for Kimi K2 Base across different providers:

No pricing information available for this model.

API Access

API access for Kimi K2 Base will be available soon through our gateway.

FAQ

Common questions about Kimi K2 Base

When was Kimi K2 Base released?
Kimi K2 Base was released on July 11, 2025.

How many parameters does Kimi K2 Base have?
Kimi K2 Base has 1 trillion (1000.0 billion) total parameters, of which about 32 billion are activated per token.