🏠Recentcollected in 6m

Goldman: Apple DRAM Buy Won't Hurt Margins

Goldman: Apple DRAM Buy Won't Hurt Margins
PostLinkedIn
🏠Read original on IT之家

💡Apple's 2.4EB LPDDR5 hoard signals mobile memory crunch for AI devs

⚡ 30-Second TL;DR

What Changed

Apple aggressively buying mobile DRAM, squeezing competitors.

Why It Matters

Signals tight LPDDR5 supply risks for AI mobile devices. Apple's strategy may drive up costs for others. Highlights memory as key growth driver for big tech.

What To Do Next

Model LPDDR5 demand forecasts for AI apps on mobile platforms like iOS.

Who should care:Founders & Product Leaders

🧠 Deep Insight

AI-generated analysis for this event.

🔑 Enhanced Key Takeaways

  • Apple's DRAM procurement strategy is heavily influenced by the integration of on-device generative AI models, which require significantly higher memory bandwidth and capacity compared to previous iPhone generations.
  • The 2.4 EB (Exabyte) figure represents a strategic shift toward standardizing 16GB and 24GB RAM configurations across the 2026 iPhone lineup to support complex local inference tasks.
  • Goldman Sachs' analysis suggests Apple is leveraging its massive cash reserves and long-term supply agreements to lock in favorable pricing, effectively insulating itself from the cyclical volatility of the broader memory market.
📊 Competitor Analysis▸ Show
FeatureApple (iPhone 2026)Samsung (Galaxy S26 Ultra)Google (Pixel 11 Pro)
DRAM TypeLPDDR5X (Custom)LPDDR5XLPDDR5X
Typical RAM16GB - 24GB16GB - 20GB16GB
AI StrategyOn-device focusHybrid Cloud/On-deviceCloud-heavy
Pricing StrategyPremium/StableCompetitive/PromotionalAggressive/Value

🛠️ Technical Deep Dive

  • Transition to LPDDR5X: The 2026 iPhone lineup utilizes LPDDR5X, offering higher data rates (up to 9600 Mbps) compared to standard LPDDR5 to handle the increased memory bandwidth requirements of local AI processing.
  • Memory Architecture: Apple is implementing a unified memory architecture that allows the Neural Engine and GPU to access the same high-speed memory pool, reducing latency for real-time AI inference.
  • Capacity Scaling: The move to 2.4 EB total capacity is driven by the need to keep large language models (LLMs) resident in memory to avoid the latency penalties of swapping to NAND flash storage.

🔮 Future ImplicationsAI analysis grounded in cited sources

Memory component costs will become a primary driver of iPhone retail price increases in late 2026.
As Apple consumes a larger share of the high-end LPDDR5X supply, the resulting market tightness will likely force manufacturers to pass higher component costs to consumers.
Android flagship manufacturers will face significant supply chain constraints in Q3 2026.
Apple's aggressive hoarding strategy limits the available volume of high-speed DRAM for competitors, potentially delaying product launches or forcing them to use lower-spec memory.

Timeline

2024-09
Apple introduces 8GB RAM as the baseline for the iPhone 16 series to support Apple Intelligence.
2025-09
Apple increases baseline RAM to 12GB for the iPhone 17 Pro models to accommodate more complex on-device AI models.
2026-02
Apple finalizes long-term supply contracts with major DRAM manufacturers to secure 2026 production capacity.
📰

Weekly AI Recap

Read this week's curated digest of top AI events →

👉Related Updates

AI-curated news aggregator. All content rights belong to original publishers.
Original source: IT之家