
Kimi K2 0905

MoonshotAI
kimi-k2-0905

Overview

Kimi K2 0905 is the September update of Kimi K2 0711. It is a large-scale Mixture-of-Experts (MoE) language model developed by Moonshot AI, featuring 1 trillion total parameters with 32 billion active per forward pass. It supports long-context inference up to 256k tokens, extended from the previous 128k. This update improves agentic coding with higher accuracy and better generalization across scaffolds, and enhances frontend coding with more aesthetic and functional outputs for web, 3D, and related tasks. The model is trained with a novel stack incorporating the MuonClip optimizer for stable large-scale MoE training.
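To make the sparse-activation idea concrete, the sketch below shows top-k expert routing in a Mixture-of-Experts layer: each token is dispatched to only a few experts, so only a fraction of the total parameters participates in any forward pass. The layer sizes, expert count, and top-k value are illustrative placeholders, not Kimi K2's actual configuration or routing code.

```python
# Minimal MoE routing sketch: only top_k of n_experts run per token,
# which is why active parameters are much smaller than total parameters.
# All sizes are placeholders, not Kimi K2's real configuration.
import torch
import torch.nn as nn

class TinyMoELayer(nn.Module):
    def __init__(self, d_model=64, d_ff=256, n_experts=16, top_k=2):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)   # scores each expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )
        self.top_k = top_k

    def forward(self, x):                              # x: (tokens, d_model)
        gate_logits = self.router(x)
        weights, chosen = gate_logits.topk(self.top_k, dim=-1)
        weights = weights.softmax(dim=-1)              # normalize over the selected experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, slot] == e            # tokens routed to expert e in this slot
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

x = torch.randn(8, 64)
layer = TinyMoELayer()
print(layer(x).shape)  # (8, 64); each token touched only top_k of 16 experts
```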

Kimi K2 0905 was released on September 5, 2025. API access is available through Novita.
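Since the model is served behind an OpenAI-compatible interface, a request can be sketched as follows. The base URL and model identifier are assumptions about Novita's naming; confirm the exact values in the provider's documentation before use.

```python
# Sketch of calling Kimi K2 0905 through an OpenAI-compatible endpoint.
# The base_url and model slug are assumptions, not confirmed values.
from openai import OpenAI

client = OpenAI(
    base_url="https://api.novita.ai/v3/openai",  # assumed Novita endpoint
    api_key="YOUR_NOVITA_API_KEY",
)

response = client.chat.completions.create(
    model="moonshotai/kimi-k2-0905",  # assumed model identifier
    messages=[{"role": "user", "content": "Summarize what a Mixture-of-Experts model is."}],
    max_tokens=256,
)
print(response.choices[0].message.content)
```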

Performance

Timeline

Release Date: September 5, 2025
Knowledge Cutoff: Unknown

Other Details

Parameters: 1,000B (1T total; 32B active per forward pass)
License: Modified MIT
Training Data: Unknown

Related Models

Compare Kimi K2 0905 to other models by quality (GPQA score) vs cost. Higher scores and lower costs represent better value.


Benchmarks

Kimi K2 0905 Performance Across Datasets

Scores sourced from the model's scorecard, paper, or official blog posts


Pricing

Pricing, performance, and capabilities for Kimi K2 0905 across different providers:

Provider: Novita
Input price: $0.60 per 1M tokens
Output price: $2.50 per 1M tokens
Max input: 262.1K tokens
Max output: 262.1K tokens
Latency (s): —
Throughput: —
Quantization: fp8
Input modalities: Text
Output modalities: Text
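At the listed rates, per-request cost is straightforward arithmetic: token count times price per million tokens. The sketch below uses Novita's listed prices; the example token counts are arbitrary.

```python
# Back-of-the-envelope request cost at Novita's listed rates
# ($0.60 per million input tokens, $2.50 per million output tokens).
INPUT_PRICE_PER_M = 0.60
OUTPUT_PRICE_PER_M = 2.50

def request_cost(input_tokens: int, output_tokens: int) -> float:
    """Return the USD cost of a single request."""
    return (input_tokens * INPUT_PRICE_PER_M + output_tokens * OUTPUT_PRICE_PER_M) / 1_000_000

# e.g. a 20k-token prompt with a 2k-token completion:
print(f"${request_cost(20_000, 2_000):.4f}")  # $0.0170
```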

Example Outputs

Recent Posts

Recent Reviews

API Access

API Access Coming Soon

API access for Kimi K2 0905 will be available soon through our gateway.

FAQ

Common questions about Kimi K2 0905

When was Kimi K2 0905 released?
Kimi K2 0905 was released on September 5, 2025.

How many parameters does Kimi K2 0905 have?
Kimi K2 0905 has 1 trillion total parameters (1,000B), with 32 billion active per forward pass.