
Alibaba Cloud Launches Ultimate Coding Plan


💡 One low-cost subscription unlocks four top open-source coding model families, removing single-model limits (global exclusive)

⚡ 30-Second TL;DR

What Changed

Coding Plan enables seamless switching among Qwen3.5 (397B parameters, 170B active), GLM-5, MiniMax M2.5, and Kimi K2.5

Why It Matters

Lowers costs for high-frequency coding tasks, enabling developers to access top models without single-model limits. Boosts adoption of Chinese open-source LLMs in global tools.

What To Do Next

Subscribe to Aliyun Coding Plan Lite via https://www.aliyun.com/benefit/scene/codingplan to test Qwen3.5 on complex coding tasks.

Who should care: Developers & AI Engineers

🧠 Deep Insight

Web-grounded analysis with 8 cited sources.

🔑 Enhanced Key Takeaways

  • Alibaba Cloud's Coding Plan supports 8 total coding models including both Qwen proprietary models (qwen3.5-plus, qwen3-max-2026-01-23, qwen3-coder-next, qwen3-coder-plus) and third-party models (MiniMax-M2.5, glm-5, glm-4.7, kimi-k2.5), enabling developers to leverage specialized code-generation architectures alongside general-purpose LLMs[3].
  • The Qwen3-Coder-Next model is specifically engineered as an open-weight language model designed for coding agents and local development environments, representing a shift toward edge-deployable coding assistance beyond cloud-only solutions[8].
  • Coding Plan operates exclusively in the Singapore region with global deployment mode, meaning user input and output data may cross borders—a critical compliance consideration for enterprises handling sensitive code[3].
  • The qwen3-max-2026-01-23 flagship model integrates thinking mode with three built-in tools (web search, web extractor, code interpreter) to improve accuracy on complex coding problems through external tool reasoning during inference[5].
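The multi-model access described above lends itself to per-task model routing under a single subscription. A minimal sketch, assuming OpenAI-style chat payloads (the format Model Studio's compatible-mode endpoint accepts); the task-to-model table is an illustrative assumption, not an official recommendation:

```python
# Illustrative routing table: map coding tasks to Coding Plan model IDs.
# Model IDs are taken from the article's list; the mapping is hypothetical.
TASK_MODEL = {
    "generate": "qwen3-coder-plus",   # code generation
    "review":   "glm-5",              # code review
    "agent":    "qwen3-coder-next",   # local/agentic workflows
    "chat":     "kimi-k2.5",          # general Q&A
}

def build_request(task: str, prompt: str) -> dict:
    """Build an OpenAI-style chat payload for the model mapped to `task`."""
    model = TASK_MODEL.get(task, "qwen3.5-plus")  # fall back to the default tier
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

req = build_request("review", "Check this diff for data races: ...")
print(req["model"])  # glm-5
```

Because every model sits behind the same payload shape, switching models is a one-string change rather than a client rewrite, which is the practical meaning of "no single-model limits".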

🛠️ Technical Deep Dive

  • Qwen3.5-Plus Pricing Structure: Tiered token-based pricing with input at $0.4 per million tokens for the 0–256K context tier and $0.5 per million for the 256K–1M tier; output costs $2.4 and $3.0 per million respectively[5].
  • Model Context Windows: Qwen3-max-2026-01-23 supports 1,000,000 token context length with 65,536 max output tokens and 81,920 max chain-of-thought tokens[5].
  • Batch Processing Optimization: Qwen-plus models offer 50% discount on batch API calls, reducing costs for non-real-time coding workflows[5].
  • Token Quota System: Lite plan provides 18,000 monthly requests; Pro plan provides 90,000 monthly requests, with new user 80% first-month discount applied to both tiers[1][2].
  • Integration Ecosystem: Compatible with QwenCode, Claude Code, Cline, and OpenClaw (formerly Moltbot/Clawdbot) development tools[3].

🔮 Future Implications

AI analysis grounded in cited sources

  • Multi-model switching reduces vendor lock-in for enterprise coding workflows: developers can freely switch between Qwen, GLM, MiniMax, and Kimi models within a single subscription, enabling cost optimization and model selection based on task-specific performance rather than platform constraints.
  • Open-weight Qwen3-Coder-Next enables offline coding agent deployment: as an open-weight model designed for local development, it lets organizations deploy coding agents on-premises or on edge infrastructure, reducing latency and data-residency concerns compared to cloud-only competitors.

Timeline

  • 2025-09-23: Qwen3-max snapshot released; baseline for subsequent thinking-mode improvements
  • 2025-12-01: Qwen-plus-2025-12-01 model snapshot released as part of the Qwen3 series
  • 2026-01-23: Qwen3-max-2026-01-23 stable release with integrated thinking mode and tool-use capabilities
  • 2026-01-31: Coding Plan subscription service activated on Alibaba Cloud Model Studio
  • 2026-02-15: Qwen3.5-plus-2026-02-15 snapshot released with thinking mode enabled by default
  • 2026-02-24: Alibaba Cloud Model Studio documentation updated to reflect the full Coding Plan feature set and supported models

AI-curated news aggregator. All content rights belong to original publishers.
Original source: 机器之心