💰TechCrunch AI•Freshcollected in 3m
Google-Intel Deepen AI Chip Partnership

💡Google-Intel chip collab fights AI CPU shortage—vital for scaling infra
⚡ 30-Second TL;DR
What Changed
Google and Intel co-developing custom chips
Why It Matters
This partnership could ease CPU shortages critical for AI training and inference. It signals big tech's push for optimized AI hardware, potentially lowering costs for cloud-based AI deployments.
What To Do Next
Check Google Cloud's Intel CPU offerings for AI workloads to anticipate custom chip integrations.
Who should care:Enterprise & Security Teams
🧠 Deep Insight
AI-generated analysis for this event.
🔑 Enhanced Key Takeaways
- •The collaboration focuses on integrating Intel's 18A process node technology with Google's custom TPU (Tensor Processing Unit) architecture to improve power efficiency for large-scale model training.
- •This partnership marks a strategic shift for Google, which has historically relied heavily on TSMC for manufacturing its custom silicon, aiming to diversify its supply chain amid geopolitical supply risks.
- •Intel Foundry is positioning this deal as a cornerstone of its 'IDM 2.0' strategy, aiming to prove its manufacturing viability for hyperscale cloud providers against competitors like Samsung and TSMC.
📊 Competitor Analysis▸ Show
| Feature | Google/Intel (Custom) | NVIDIA (Blackwell/Rubin) | AWS (Trainium/Inferentia) |
|---|---|---|---|
| Primary Focus | Power-efficient inference/training | High-performance training | Cloud-native cost optimization |
| Manufacturing | Intel 18A | TSMC (CoWoS) | TSMC |
| Ecosystem | OpenXLA / JAX | CUDA (Proprietary) | AWS Neuron |
🛠️ Technical Deep Dive
- Utilization of Intel's 18A process node, featuring RibbonFET gate-all-around transistors for improved performance-per-watt.
- Integration of PowerVia backside power delivery to reduce voltage droop and improve signal integrity in high-density AI chip designs.
- Optimization for Google's proprietary JAX and TensorFlow frameworks to ensure hardware-software co-design efficiency.
🔮 Future ImplicationsAI analysis grounded in cited sources
Intel Foundry will capture at least 10% of Google's total AI chip manufacturing volume by 2027.
The deepening partnership suggests a multi-year commitment to transition significant portions of Google's silicon production to Intel's domestic US facilities.
Google will reduce its reliance on TSMC for AI hardware by at least 15% within three years.
Diversifying manufacturing partners is a stated goal of Google's hardware division to mitigate risks associated with reliance on a single foundry.
⏳ Timeline
2021-07
Intel announces IDM 2.0 strategy, opening its foundry services to external customers.
2023-02
Google and Intel announce initial collaboration on cloud infrastructure and open-source software optimization.
2024-02
Intel Foundry Services (IFS) hosts 'Direct Connect' event, highlighting Google as a key partner for advanced process nodes.
2025-06
Google begins pilot testing of TPU-based silicon prototypes manufactured on Intel's 18A process.
📰
Weekly AI Recap
Read this week's curated digest of top AI events →
👉Related Updates
AI-curated news aggregator. All content rights belong to original publishers.
Original source: TechCrunch AI ↗

