Future of Computing Newsletter
🦾 Emerging Challenges in Building Hardware for Physical AI
A Newsletter for Entrepreneurs, Investors, and Computing Geeks
Happy Monday! Here’s what’s inside this week’s newsletter:
Deep dive: The emerging challenges of building hardware for Physical AI, and how bringing intelligence to the edge is redefining compute design, power efficiency, sensor integration, and real-world verification.
Spotlights: NVIDIA and TSMC celebrate the first U.S.-made Blackwell wafer, and a global AWS outage exposes the fragility of cloud infrastructure and the geopolitics of digital resilience.
Headlines: Major semiconductor moves from Tesla, Intel, and Arm, new quantum breakthroughs from IonQ and IBM, advances in photonic and neuromorphic chips, and major cloud partnerships led by Google and Anthropic.
Readings: Advances in memory and cooling, scalable photonics for AI, quantum state preparation and storage, neuromorphic processing for edge and optical AI, and evolving strategies in data centers and cloud infrastructure.
Funding news: A wave of smaller rounds below $10M across semiconductors, AI, and quantum, alongside a few larger deals including a $98M round in photonics and a $1.4B round in data centers.
Bonus: Google’s new claim of quantum advantage triggers both excitement and scientific caution, as its Willow processor achieves a 13,000× speed-up while debate continues over practical impact.
Deep Dive: Emerging Challenges in Building Hardware for Physical AI
Physical AI brings intelligence to the edge, from robots and drones to industrial systems, where machines must sense, interpret, and act in the physical world. These systems combine local inference with cloud connectivity and must make autonomous, real-time decisions using uncertain data and limited power.
This shift reshapes how computing hardware is built and verified, introducing five core challenges that are redefining hardware design for intelligence at the edge:
Challenge 1: Designing for Local Decision-Making and Low Power
Physical AI systems must make rapid, independent decisions under strict power limits without relying on the cloud for real-time processing.
Technical implication: Edge processors must handle inference, control, and communication locally with predictable latency.
Design consequence: Compute, memory hierarchy, and power management are co-optimized to achieve sub-millisecond response times within strict thermal and energy limits.
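To make the latency/power trade-off concrete, here is a minimal sketch of a deadline-checked sense-infer-act loop. The `run_inference` stub, the 1 ms budget, and the fallback command are all hypothetical, chosen only to illustrate the pattern of acting safely when the compute budget is blown rather than acting late:

```python
import time

DEADLINE_S = 0.001  # hypothetical 1 ms response budget

def run_inference(sensor_frame):
    # Stand-in for an on-device NPU call; returns a control command.
    return {"steer": 0.0, "throttle": 0.1}

def control_step(sensor_frame):
    """One sense -> infer -> act cycle with an explicit latency budget."""
    start = time.perf_counter()
    command = run_inference(sensor_frame)
    elapsed = time.perf_counter() - start
    if elapsed > DEADLINE_S:
        # Deadline miss: fall back to a safe default rather than act late.
        return {"steer": 0.0, "throttle": 0.0}, elapsed
    return command, elapsed

cmd, latency = control_step(sensor_frame={"lidar": [1.2, 3.4]})
```

In real edge silicon this budget is enforced in hardware schedulers and power states rather than application code, but the design question is the same: what does the system do when inference cannot finish in time?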
Challenge 2: Integrating AI Algorithms Into Hardware Design
Embedding evolving AI models into hardware requires balancing performance with adaptability as algorithms change over time.
Technical implication: Chips must support pipelining, data preprocessing, and on-device learning while managing limited resources.
Design consequence: Systems are built with configurable compute and memory blocks that allow new or larger AI models to run without reworking the entire chip.
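One way to picture "new models without reworking the chip" is an explicit resource envelope that candidate models are checked against before deployment. This is an illustrative sketch; the limit names and numbers are invented, not taken from any real accelerator:

```python
# Hypothetical resource envelope for a configurable edge accelerator.
CHIP_LIMITS = {"sram_kib": 2048, "mac_units": 4096}

def fits_on_chip(model_profile, limits=CHIP_LIMITS):
    """Check whether a (possibly newer, larger) model fits the
    reconfigurable compute and memory blocks without a silicon respin."""
    return (model_profile["sram_kib"] <= limits["sram_kib"]
            and model_profile["mac_units"] <= limits["mac_units"])

v1 = {"name": "detector-v1", "sram_kib": 1024, "mac_units": 2048}
v2 = {"name": "detector-v2", "sram_kib": 1800, "mac_units": 4096}

assert fits_on_chip(v1) and fits_on_chip(v2)  # both generations fit
```

The design consequence in the text is the hardware-side version of this check: blocks are sized and made configurable so that the envelope, not the silicon layout, is what constrains future models.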
Challenge 3: Managing Complex System Interactions
Physical AI devices operate as distributed systems that sense, process, and communicate continuously across multiple edge nodes.
Technical implication: Synchronization, bandwidth, and latency must be managed across heterogeneous SoCs (System-on-a-Chip) that include CPUs, NPUs, and DSPs.
Design consequence: Control and dataflow architectures are designed for deterministic operation to ensure coordination and reliability in safety-critical environments.
Challenge 4: Bridging Analog, Digital, and Mixed-Signal Worlds
Integrating sensors such as lidar, radar, and MEMS with digital compute pipelines introduces complexity across analog and digital domains.
Technical implication: Mixed-signal integration makes it difficult to model and validate real-time behavior across sensing and computation layers.
Design consequence: Orchestration frameworks and system-level simulation tools are developed to synchronize sensing, processing, and actuation efficiently.
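A core difficulty in mixed-signal modeling is that analog-domain sensors sample asynchronously while the digital pipeline runs on a fixed tick. A common bridge is interpolation onto a shared timebase; this sketch uses simple linear interpolation over made-up sample data:

```python
def resample_to_tick(samples, tick_t):
    """Linearly interpolate an asynchronously sampled (analog-domain)
    sensor stream onto a digital compute tick, so sensing and
    computation can be validated against one timebase.
    samples: list of (timestamp, value) pairs in ascending time order."""
    for (t0, v0), (t1, v1) in zip(samples, samples[1:]):
        if t0 <= tick_t <= t1:
            frac = (tick_t - t0) / (t1 - t0)
            return v0 + frac * (v1 - v0)
    raise ValueError("tick outside sampled interval")

stream = [(0.0, 0.0), (1.0, 2.0), (2.0, 2.5)]
assert resample_to_tick(stream, 0.5) == 1.0
assert resample_to_tick(stream, 1.5) == 2.25
```

System-level simulation tools do essentially this at scale: they put lidar, radar, and MEMS streams onto a common clock so that real-time behavior across sensing and computation layers can be modeled and verified together.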
Challenge 5: Verifying Performance in Real-World Environments
Physical AI systems must prove reliability not just in simulation but under real-world conditions such as temperature shifts, vibration, and human interaction.
Technical implication: Hardware and AI models need continuous monitoring and validation once deployed in the field.
Design consequence: Adaptive calibration and feedback mechanisms are embedded into the design to maintain accuracy, stability, and safety across changing operating conditions.
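The adaptive calibration loop described above can be sketched as a drift monitor that smooths the sensor error and flags when it leaves tolerance. The smoothing factor and threshold here are arbitrary illustration values, not field-proven settings:

```python
class DriftMonitor:
    """Track a sensor's error with an exponential moving average and
    flag when recalibration is needed (threshold is illustrative)."""

    def __init__(self, alpha=0.1, threshold=0.5):
        self.alpha = alpha          # EMA smoothing factor
        self.threshold = threshold  # tolerated absolute error
        self.ema_error = 0.0

    def update(self, measured, reference):
        """Feed one measurement; returns True when drift exceeds tolerance."""
        err = measured - reference
        self.ema_error = (1 - self.alpha) * self.ema_error + self.alpha * err
        return abs(self.ema_error) > self.threshold

monitor = DriftMonitor()
# Simulate a sensor drifting 0.8 units per step away from ground truth.
needs_recal = [monitor.update(10.0 + 0.8 * i, 10.0) for i in range(20)]
```

The EMA keeps the monitor from reacting to single noisy readings while still catching sustained drift, which is the behavior the embedded feedback mechanisms above need across changing operating conditions.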
Outlook
Physical AI will accelerate hardware design activity across sectors including industrial, automotive, medical, and consumer applications. Each domain will demand customized chips optimized for specific environments, often built through modular, chiplet-based approaches that scale across product families. EDA (Electronic Design Automation), simulation, and emulation tools will need to evolve to handle these multi-die systems that operate, learn, and update continuously in the field.
Source: Multiple Challenges Emerge With Physical AI System Design (Semiconductor Engineering, 2025)
Spotlights
“NVIDIA founder and CEO Jensen Huang today visited TSMC’s semiconductor manufacturing facility in Phoenix to celebrate the first NVIDIA Blackwell wafer produced on U.S. soil, representing that Blackwell has reached volume production.
Onstage at the celebration, Huang joined Y.L. Wang, vice president of operations at TSMC, to sign the Blackwell wafer, commemorating a milestone that showcases how the engines of the world’s AI infrastructure are now being constructed domestically.”
“The problem has been attributed to Amazon’s AWS service, triggered by a DNS resolution failure tied to the DynamoDB API endpoint in AWS’s US-East-1 (Northern Virginia) region, with a flow-on to thousands of services. Amazon’s own offerings, such as Alexa, Ring and Prime Video, experienced problems, as did big names from around the web.
One thing is clear: when so much of Europe’s digital infrastructure runs on a handful of American cloud providers, resilience becomes as much a geopolitical issue as a technical one. It exposes the fragility of global digital supply chains and the UK’s growing challenge in ensuring digital sovereignty and resilience.”
Headlines
Last week’s headlines highlighted major semiconductor moves from Tesla, Intel, and Arm, new quantum breakthroughs from IonQ and IBM, advances in photonic and neuromorphic chips, and emerging space data center concepts.
🦾 Semiconductors
Tesla Unveils New AI5 Chip to Be Produced by Samsung and TSMC (MarketScreener)
The GAIN AI Act Will Undermine the Global Competitiveness of U.S. AI Chipmakers (CSIS)
With an Intel recovery underway, all eyes turn to its foundry business (TechCrunch)
NextSilicon’s Maverick-2 Powers AI and HPC Breakthroughs with up to 10× Performance Boost at Less Than Half the Power (Business Wire)
Infineon launches AURIX™ Configuration Studio to accelerate software development for AURIX devices (Infineon)
⚛️ Quantum
IonQ Achieves Landmark Result, Setting New World Record in Quantum Computing Performance (Oxford Ionics)
Forthcoming IBM Paper Expected to Show Quantum Algorithm Running on Inexpensive AMD Chips (The Quantum Insider)
Telecom at the Edge of Scale: How Quantum Technologies Are Recasting the Network Economy (The Quantum Insider)
U.S. Weighs Taking Equity Stakes in Quantum Computing Firms (The Quantum Insider)
QuTech Researchers Achieve Digital Control Breakthrough With Germanium Qubits (Quantum Zeitgeist)
UNSW Researchers Surpass 54% Efficiency With Fiber-Coupled Quantum Memory (Quantum Zeitgeist)
⚡️ Photonic / Optical
“Rainbow-on-a-chip” could help keep AI energy demands in check — and it was created by accident (LiveScience)
💥 Data Centers
How Will Crusoe and Starcloud Build Data Centres in Space? (Data Centre Magazine)
OpenAI, Oracle and Vantage plan Stargate Wisconsin data-center campus expected to be close to a gigawatt (Data Centre Dynamics)
☁️ Cloud
Argyll and SambaNova Partner to Deliver the UK’s First Renewable-Powered Sovereign AI Cloud (Business Wire)
Readings
This week’s reading list covers advances in memory and cooling, scalable photonics for AI, quantum state preparation and storage, neuromorphic processing for edge and optical AI, and evolving strategies in data centers and cloud infrastructure.
🦾 Semiconductors
Nanoimprint Lithography: Stop Saying It Will Replace EUV (SemiAnalysis) (39 mins)
Modulation of the Inner Gate Length in MFMIS NSFETs To Achieve Big Gains in Memory Window (SemiEngineering) (3 min)
Understanding and Mitigating Column-Based Read Disturbance in DRAM Chips (SemiEngineering) (4 min)
What’s Different About HBM4 (SemiEngineering) (14 min)
Digital Twins For Packaging: Bridging Design, Fab, Test, And Reliability (SemiEngineering) (16 min)
Diamond Thermal Conductivity: A New Era in Chip Cooling (IEEE Spectrum) (11 mins)
⚛️ Quantum
IRID + AIMING: The Pure-Play Quantum Computing Stocks vs Tech Giants Defining the Next Computing Era (Quantum Zeitgeist) (27 mins)
Efficient Quantum State Preparation with Bucket Brigade QRAM Achieves Logarithmic Data Retrieval for Machine Learning Applications (Quantum Zeitgeist) (10 mins)
Quantum Entanglement Nodes: Storage Capacity with Erasure-Prone Assistance Achieves Maximum Size (Quantum Zeitgeist) (14 mins)
⚡️ Photonic / Optical
AI Factories: Photonics at Scale (Optics & Photonics News) (27 min)
Non-volatile tuning of cryogenic silicon photonic micro-ring modulators (Nature Communications) (42 mins)
🧠 Neuromorphic
Why In-Memory Computation Is So Important For Edge AI (SemiEngineering) (6 min)
Are Reservoirs and Ising Machines Neuromorphic? (EE Times) (5 min)
High-throughput optical neuromorphic graphic processing at millions of images (EurekAlert) (11 min)
Resonate-and-fire Photonic-Electronic Spiking Neurons Enable Ultrafast, Energy-Efficient Neuromorphic Processing (Quantum Zeitgeist) (11 min)
💥 Data Centers
Data Center Liquid Cooling – Market Size to Exceed US$40 billion by 2036 (IDTechEx) (5 min)
AI Data Center Forecast: From Scramble to Strategy (Bain & Company) (10 mins)
☁️ Cloud
What Does a ‘Sovereign Cloud’ Really Mean? (TechPolicy.Press) (17 min)
Funding News
Last week saw an unusually high number of sub-$10M rounds across semiconductors, AI, and quantum. Activity above that level was limited to a few larger deals, including a $98M round in photonics and Crusoe’s $1.4B Series E in data centers.
| Amount | Name | Round | Category |
|---|---|---|---|
| £2M | | | Quantum |
| $2.5M | | | Semiconductors |
| $4.5M | | | AI |
| $5M | | | Neuromorphic |
| $6M | | | Semiconductors |
| $21M | | | Semiconductors |
| $98M | | | Photonics |
| $1.4B | Crusoe | Series E | Data Centers |
Bonus: Google Claims Quantum Advantage (Again)
Google’s breakthrough claim
Google has announced what it calls the first verifiable quantum advantage, showing that its Willow quantum processor can run a real algorithm on hardware 13,000× faster than the fastest classical supercomputers. The new Quantum Echoes algorithm was used to compute the structure of a molecule, and Google says it opens a path toward real-world applications in medicine and materials science.
The result, published in Nature, builds on six years of work: the 2019 “quantum supremacy” experiment proving raw speed and the 2024 Willow chip, which dramatically suppressed errors that had limited quantum hardware for decades.
Scientific caution
However, many scientists remain cautious. The algorithm has so far been applied only to relatively simple molecules that can already be simulated classically, and there is still no formal proof that an equally fast classical method does not exist. Several researchers argue that claims of “quantum advantage” should meet a high standard of evidence.
Others point out that Google’s projected five-year horizon for practical applications seems ambitious, even if the underlying progress in error suppression and algorithm design is widely acknowledged as an important step forward.

