Micron, Samsung, and SK Hynix are massively expanding fab capacity to meet AI-driven memory demand. Micron's $200B plan centers on a large Boise campus targeting 150,000-200,000 wafers per month (WPM). Because HBM and AI memory modules take priority, consumer shortages are expected to continue.
Key Points
1. Micron invests $200B; its Boise fab adds 600,000 sq ft of cleanroom and boosts output by 40%.
2. SK Hynix accelerates trial production at its Yongin cluster to February/March.
3. Samsung pulls P4 fab completion forward to Q4 2026, with 100,000-120,000 WPM of capacity.
4. New capacity targets AI-focused HBM and SOCAMM, as LPDDR supply shifts from consumer devices to Nvidia AI systems.
5. Consumers face prolonged shortages despite the expansions.
Impact Analysis
The expansions ease supply constraints for AI training and inference but sustain high memory prices and delays for consumer GPUs. Enterprises gain improved access to high-bandwidth memory for scaling models.
Technical Details
The Boise fab targets 150,000-200,000 WPM; HBM addresses the bandwidth demands of agentic AI; together the expansions add more than 40% to global DRAM output, with the new capacity focused on rack-scale AI systems.
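To make the wafer-start figures concrete, the short Python sketch below converts the reported Boise capacity range into a percentage increase against an assumed global DRAM baseline. The baseline value is a placeholder assumption for illustration, not a figure from this article.

```python
# Minimal sketch: relate the reported Boise capacity (150,000-200,000 WPM)
# to an assumed global DRAM wafer-start baseline.
# The baseline below is a hypothetical placeholder, not a sourced number.
ASSUMED_GLOBAL_DRAM_BASELINE_WPM = 1_600_000  # placeholder assumption

def capacity_increase_pct(new_wpm: float, baseline_wpm: float) -> float:
    """Percentage increase in wafer starts from adding new_wpm to baseline_wpm."""
    return 100.0 * new_wpm / baseline_wpm

for boise_wpm in (150_000, 200_000):
    pct = capacity_increase_pct(boise_wpm, ASSUMED_GLOBAL_DRAM_BASELINE_WPM)
    print(f"Boise at {boise_wpm:,} WPM ~ +{pct:.1f}% vs. assumed baseline")
```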



