
Kimi K2 0905

Overview

Kimi K2 0905 is the September update of Kimi K2 0711. It is a large-scale Mixture-of-Experts (MoE) language model developed by Moonshot AI, featuring 1 trillion total parameters with 32 billion active per forward pass. It supports long-context inference up to 256k tokens, extended from the previous 128k. This update improves agentic coding with higher accuracy and better generalization across scaffolds, and enhances frontend coding with more aesthetic and functional outputs for web, 3D, and related tasks. The model is trained with a novel stack incorporating the MuonClip optimizer for stable large-scale MoE training.

Kimi K2 0905 was released on September 5, 2025. API access is available through Novita.

Performance

Timeline

Released: September 5, 2025
Knowledge Cutoff: Unknown

Specifications

Parameters: 1T total (32B active per forward pass)
License: Proprietary
Training Data: Unknown

Benchmarks

Kimi K2 0905 Performance Across Datasets

Scores sourced from the model's scorecard, paper, or official blog posts

llm-stats.com - Sat Feb 21 2026

Pricing

Pricing, performance, and capabilities for Kimi K2 0905 across different providers:

Provider: Novita
Input ($/M): $0.60
Output ($/M): $2.50
Max Input: 262.1K tokens
Max Output: 262.1K tokens
Latency (s): Unknown
Throughput: Unknown
Quantization: fp8
Input: Text
Output: Text
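At Novita's listed rates ($0.60 per million input tokens, $2.50 per million output tokens), the cost of a request is simple arithmetic. A minimal sketch; the function and constant names are illustrative, not part of any provider SDK:

```python
# Cost calculator for Kimi K2 0905 at Novita's listed rates (from the table above).
INPUT_PER_M = 0.60   # USD per 1M input tokens
OUTPUT_PER_M = 2.50  # USD per 1M output tokens

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request at the listed per-million rates."""
    return input_tokens / 1e6 * INPUT_PER_M + output_tokens / 1e6 * OUTPUT_PER_M

# Example: a 200k-token context with a 2k-token completion
print(round(request_cost(200_000, 2_000), 4))  # 0.125
```

Note that a single request near the 262.1K-token input limit costs well under a dollar at these rates, which is what makes long-context agentic workflows economical.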

API Access

API Access Coming Soon

API access for Kimi K2 0905 will be available soon through our gateway.
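In the meantime, Novita serves the model through an OpenAI-compatible chat completions API. A minimal sketch of the request payload; the base URL and model identifier below are assumptions and should be verified against Novita's documentation:

```python
import json

# Assumed values; verify both against Novita's API docs before use.
BASE_URL = "https://api.novita.ai/v3/openai"  # assumed endpoint
MODEL_ID = "moonshotai/kimi-k2-0905"          # assumed model identifier

def build_chat_request(prompt: str, max_tokens: int = 1024) -> dict:
    """Build an OpenAI-compatible chat completions payload (no request is sent)."""
    return {
        "model": MODEL_ID,
        "messages": [{"role": "user", "content": prompt}],
        "max_tokens": max_tokens,
    }

payload = build_chat_request("Summarize the MuonClip optimizer in one sentence.")
print(json.dumps(payload, indent=2))
```

The payload would be POSTed to `{BASE_URL}/chat/completions` with a bearer token; because the format follows the OpenAI chat schema, existing OpenAI client libraries can typically be pointed at the provider's base URL unchanged.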


FAQ

Common questions about Kimi K2 0905

Q: When was Kimi K2 0905 released?
A: September 5, 2025, by MoonshotAI.

Q: Who created Kimi K2 0905?
A: MoonshotAI (Moonshot AI).

Q: How many parameters does Kimi K2 0905 have?
A: 1 trillion total parameters, with 32 billion active per forward pass.

Q: What license is Kimi K2 0905 released under?
A: A proprietary license.