Kimi K2 0905
MoonshotAI
Overview
Kimi K2 0905 is the September update of Kimi K2 0711. It is a large-scale Mixture-of-Experts (MoE) language model developed by Moonshot AI, featuring 1 trillion total parameters with 32 billion active per forward pass. It supports long-context inference up to 256k tokens, extended from the previous 128k. This update improves agentic coding with higher accuracy and better generalization across scaffolds, and enhances frontend coding with more aesthetic and functional outputs for web, 3D, and related tasks. The model is trained with a novel stack incorporating the MuonClip optimizer for stable large-scale MoE training.
Kimi K2 0905 was released on September 5, 2025. API access is available through Novita.
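For reference, the sketch below shows one way to call the model programmatically. It assumes Novita exposes an OpenAI-compatible chat completions endpoint; the base URL and model identifier are illustrative assumptions, so check Novita's documentation for the exact values.

```python
# Minimal sketch of calling Kimi K2 0905 via an OpenAI-compatible API.
# Assumptions (not confirmed by this page): the base_url and model name below.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.novita.ai/v3/openai",  # assumed Novita gateway URL
    api_key="YOUR_NOVITA_API_KEY",               # replace with a real key
)

response = client.chat.completions.create(
    model="moonshotai/kimi-k2-0905",  # assumed model identifier
    messages=[
        {"role": "user", "content": "Outline a plan for refactoring a legacy web frontend."}
    ],
    max_tokens=512,
)

print(response.choices[0].message.content)
```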
Performance
Compare Kimi K2 0905 to other models by quality (GPQA score) vs cost. Higher scores and lower costs represent better value.
Benchmarks
Kimi K2 0905 Performance Across Datasets
Scores sourced from the model's scorecard, paper, or official blog posts
Pricing
Pricing, performance, and capabilities for Kimi K2 0905 across different providers:
| Provider | Input ($/M) | Output ($/M) | Max Input | Max Output | Latency (s) | Throughput (tokens/s) | Quantization | Input Modalities | Output Modalities |
|---|---|---|---|---|---|---|---|---|---|
| Novita | $0.60 | $2.50 | 262.1K | 262.1K | — | — | fp8 | Text, Image, Audio, Video | Text, Image, Audio, Video |
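As a quick sanity check on these rates, the snippet below estimates the cost of a single request at the listed Novita prices ($0.60 per million input tokens, $2.50 per million output tokens). The token counts in the example are hypothetical.

```python
# Back-of-the-envelope cost estimate using the rates in the table above.
INPUT_RATE_PER_M = 0.60   # USD per million input tokens (Novita, fp8)
OUTPUT_RATE_PER_M = 2.50  # USD per million output tokens (Novita, fp8)

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of one request at the listed per-token rates."""
    return (input_tokens / 1_000_000) * INPUT_RATE_PER_M \
         + (output_tokens / 1_000_000) * OUTPUT_RATE_PER_M

# Hypothetical request: 50,000 prompt tokens, 2,000 completion tokens
print(f"${request_cost(50_000, 2_000):.4f}")  # -> $0.0350
```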
API Access
API Access Coming Soon
API access for Kimi K2 0905 will be available soon through our gateway.
FAQ
Common questions about Kimi K2 0905
