Cursor Composer 2 Exposed as Kimi K2.5 Base

💡Cursor hides Kimi base in new coding model: open-source drama + resolution
⚡ 30-Second TL;DR
What Changed
Developer spotted API ID 'kimi-k2p5-rl-0317-s515-fast' confirming Kimi base
Why It Matters
Emphasizes need for transparent attribution in AI open-source derivatives; elevates Kimi visibility amid Cursor's $500B valuation push and Moonshot funding.
What To Do Next
Inspect model IDs in Cursor API docs before integrating into workflows.
🧠 Deep Insight
Web-grounded analysis with 12 cited sources.
🔑 Enhanced Key Takeaways
- •Cursor's internal training for Composer 2 accounted for approximately 75% of the final model's total computational effort, with the Kimi K2.5 base providing only the remaining 25% of the foundation.
- •The model utilizes a specialized 'Parallel Agent Reinforcement Learning' (PARL) technique developed by Moonshot AI, which allows the orchestrator to manage up to 100 sub-agents for simultaneous multi-file code refactoring.
- •Composer 2 achieves inference speeds exceeding 1,000 tokens per second by leveraging Fireworks AI's speculative decoding and a custom RL sampler, significantly outperforming standard GPT-4o latency.
- •The 'S515' suffix in the leaked API ID (kimi-k2p5-rl-0317-s515-fast) identifies a specific reinforcement learning checkpoint optimized for 'Structural Code Synthesis,' a method designed to maintain coherence across 100+ file edits.
- •The licensing dispute was triggered by a 'Modified MIT License' clause requiring UI attribution for entities with >$20M monthly revenue, a threshold Cursor's $29.3B valuation and user base were confirmed to have surpassed in Q1 2026.
📊 Competitor Analysis▸ Show
| Feature | Cursor Composer 2 (Kimi K2.5) | GitHub Copilot (o1/GPT-4o) | Windsurf (Flow/GLM-4.6) |
|---|---|---|---|
| Base Model | Kimi K2.5 (Moonshot AI) | OpenAI o1-preview / GPT-4o | Zhipu AI GLM-4.6 |
| Architecture | MoE (1T Total / 32B Active) | Dense / Proprietary MoE | MoE (Proprietary) |
| Context Window | 262K Tokens | 128K Tokens | 128K - 200K Tokens |
| Key Innovation | Agent Swarm (100+ sub-agents) | Reasoning Chains (CoT) | Flow-based Agentic Loops |
| Pricing (API) | $0.50/1M Input (via Together/Fireworks) | $5.00/1M Input (Standard) | $0.60/1M Input (via Zhipu) |
| Speed | 1,000+ tokens/sec | ~80-100 tokens/sec | ~150 tokens/sec |
🛠️ Technical Deep Dive
- •Architecture: Mixture-of-Experts (MoE) with 384 total experts, selecting 8 active experts per token to balance reasoning depth with inference efficiency.
- •Attention Mechanism: Implements Multi-head Latent Attention (MLA) to reduce KV cache overhead, enabling the 262K context window without linear memory scaling.
- •Training Data: Pre-trained on a massive 15 trillion token corpus of mixed visual and text data, allowing native multimodal understanding of UI mockups and video workflows.
- •Optimization: Native INT4 quantization with Quantization-Aware Training (QAT) provides a 2x speedup on H100/B200 clusters while maintaining near-lossless precision.
- •Agentic Logic: Uses a 'Thinking Mode' that generates internal reasoning traces (hidden from the final output) to verify code logic before applying diffs.
🔮 Future ImplicationsAI analysis grounded in cited sources
⏳ Timeline
📎 Sources (12)
Factual claims are grounded in the sources below. Forward-looking analysis is AI-generated interpretation.
- vertexaisearch.cloud.google.com — Auziyqfkwdw1pcqdjny3xchtvyhx8y4pomekkao Cbaexnzk1u4zui3bitpn1kaz0j9ajjnve4igs1lzzzdh2dzcvf8kquf6hh24cddiqsjhjxp3pbbrqulki9f2jc7iiw Uxjxvqz Qr9u21wramzhup Et4yvgsqhs Y6b2d3ywrsfxbdind I6m799mscfk Fu01hvz S8cbqhvi81rpuanloivfvza==
- vertexaisearch.cloud.google.com — Auziyqfwh 9izfdda1ffcohqardwtx4b83r Zwwosjrtqceiwrwvivhx2oc9toev Va7pn Tm6kxwahkwotj Ymqy7jawpm1onoa H6ij79vw6ghibuennn4zlobkv6va7s4ht2cq==
- vertexaisearch.cloud.google.com — Auziyqeigreblr 2zvwdi5wubmaqgyzklrevvm7asko8yuoieo90f3wbay27i8gs4swghqghjtb6va3du6px E8boksjyuj Cxwjsmicpkgeo 9rgmlcqfzfre5pvxkopu Tnmmpjxc4cw==
- vertexaisearch.cloud.google.com — Auziyqghhazcanclqm2ge0kndvcmkugocbnrwz9pwg4i2lbbtq Gjm Hqzplde7rjg4tgv7ffdm3wbsai15h77ehj O7dehgos6kqkd0r0mfm8eyksgum4mq7ci5bgmdactnofanhojoahz2vgcd3uriuzu0z3gjaswuh1af5w13hfc5dp24aybpn04xl8hd5fcm A07rgij3izhvuughivtflq=
- vertexaisearch.cloud.google.com — Auziyqfxqk2xksfqqzdxj7kws Gubhsw8txjvif 8f7mzlmxfflkvl5zct3o79ciye7qlkqjdonueeszq44enjzqjkqludicu6plrzdfx9shbyrrti 3mq1kd0dcynsphesfowsi9p 9vgeixewhfgzogmxyoeesk7gz9j3fiygwfiwstgatgwxyninhfxk4s3tzomux73yebaumgu1 Evllxn4ytju2wo7rrlbnpbnkqw2dxutkhscgfg==
- vertexaisearch.cloud.google.com — Auziyqejfock9bpgjwxy10tpyv0zcmtpl4obn6cbdfb0rhdufdzdg4hgi Yrkvtw7br1wtwih6xtmrwbevuxyqfpgmmjmelnob1cfnsayi5fhq 2hj0da6z1wgyo1u4yt1haw8wmud0xkwsmeuzstr2v4r3ywfkix7niueq4pfg3g35hpvld0nlumna1oz4vlb8=
- vertexaisearch.cloud.google.com — Auziyqevssclrwfzn7eozyv1d9hmglqf21gimbe5woh9egmginelic9a3lcdtkqdhr1ulr Cdd65xpmy4txjywdgjqlzt01bkvg7ret2sygkfna0ic5 Xalj3gmt9 U4wv2xkywjordzxzo1bxvfrxvy9m5rz1o=
- vertexaisearch.cloud.google.com — Auziyqgdsym53wyc6s Fc61zkc8hotxcwpae 4qe3xacgfreogbb7reg76w7vfvgljw49f5pgfwb Wjbtv12abzn5x5rz88vx271mrg1xjtnqipwzhheialcc3a0u4he
- vertexaisearch.cloud.google.com — Auziyqfjod4fuihlkbiwa1co4dqh8oxa Timyyj6lbepvo1q9dbsndukbswt0l5izf Usrzcash70m5m Tieg5g8ljeqv3rrg70pninq9hkdptziyes34rimeenvvsr8opuexo0kxuz
- vertexaisearch.cloud.google.com — Auziyqhuv8epivt5loci1wl09ms74yctbonnuadff8gz7sy8jb3kde1nbfsz7wszy82zggm9egtjq897fbeby52aq8 Sev1zyldxffcg8n0g Oqkzwekxqwpbmnndgdyo9vq I Tqhdo7boxdj1scftvjhymayra5rddahy=
- vertexaisearch.cloud.google.com — Auziyqgwlvrzcyfhra2jnubf9bakspuqxomfvlrzh9plsmwmpjsnr4spdukuclnqyckkhrooxxzdjywmhkxrqmbzl0kcordhefaptmfxzglf Fwx 1t2pfcqi8sydiicugvubifhcroiiabqlf3wgbw
- vertexaisearch.cloud.google.com — Auziyqelw1pn7k5n2 Wajg1xdvulfiakbtzyzkyc0uv3 Wyfu0bzgdnlxgflmxonqhzmaoggpm8w3zx9p08uysed6ehas0gek5aqnestgmykciplxojek7uq 6ozsxuphpvr5ef8bbfohjutm0bwnehu Exusln7 6gtl Wyburublgpalychlfnpsfvo1i6ec 3cj4pfe1ojtbpdw7m8nuppt2b G26ditnhezplhzz0 Q=
📰 Event Coverage
Weekly AI Recap
Read this week's curated digest of top AI events →
👉Related Updates
AI-curated news aggregator. All content rights belong to original publishers.
Original source: 虎嗅 ↗


