Kimi K2 Base

Overview

Kimi K2 Base is a state-of-the-art mixture-of-experts (MoE) language model with 1 trillion total parameters, of which 32 billion are activated per token. Trained on 15.5 trillion tokens with the MuonClip optimizer, it is the foundation model before instruction tuning. It demonstrates strong performance on knowledge, reasoning, and coding benchmarks while being optimized for agentic capabilities.

Kimi K2 Base was released on July 11, 2025.
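To put the MoE sparsity in perspective, a quick back-of-the-envelope calculation shows what fraction of the model's weights participate in any single forward pass (the 1T and 32B figures come from the overview above):

```python
# Kimi K2 Base: 1 trillion total parameters, 32 billion activated per token.
total_params = 1_000e9   # total parameters across all experts
active_params = 32e9     # parameters activated for a single token

# Fraction of the model touched on each forward pass.
active_fraction = active_params / total_params
print(f"Activated fraction per token: {active_fraction:.1%}")  # 3.2%
```

In other words, each token is processed by roughly 3.2% of the network, which is what lets a 1T-parameter model run with the per-token compute cost of a 32B dense model.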

Timeline

Released: July 11, 2025
Knowledge Cutoff: Unknown

Specifications

Parameters: 1000.0B
License: MIT
Training Data: Unknown
Tags: tuning:base

Benchmarks

Kimi K2 Base Performance Across Datasets

Scores sourced from the model's scorecard, paper, or official blog posts

llm-stats.com - Wed Jan 21 2026

Pricing

Pricing, performance, and capabilities for Kimi K2 Base across different providers:

No pricing information available for this model.

API Access

API Access Coming Soon

API access for Kimi K2 Base will be available soon through our gateway.
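Once the gateway is live, access would most likely follow the OpenAI-compatible completions convention common to hosted LLM APIs. The endpoint shape, model identifier, and field names below are assumptions for illustration, not a documented API:

```python
import json

# Hypothetical request body for an OpenAI-style text-completions
# endpoint. "kimi-k2-base" is an assumed model identifier; the
# actual gateway may use a different name and schema.
payload = {
    "model": "kimi-k2-base",          # assumed identifier
    "prompt": "The capital of France is",
    "max_tokens": 16,
    "temperature": 0.7,
}

print(json.dumps(payload, indent=2))
```

Note that as a base (non-instruction-tuned) model, Kimi K2 Base would be served through a plain completions interface rather than a chat interface, so the prompt is raw text to be continued rather than a list of chat messages.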


FAQ

Common questions about Kimi K2 Base

When was Kimi K2 Base released? July 11, 2025.
Who created Kimi K2 Base? MoonshotAI.
How many parameters does Kimi K2 Base have? 1 trillion total parameters, with 32 billion activated per token.
What license does Kimi K2 Base use? The MIT license, an open-source/open-weight license.