⚛️🦾 Quantum Error Correction – What It Takes to Scale
A Newsletter for Entrepreneurs, Investors, and Computing Geeks
Happy Monday! This week’s deep dive looks at quantum error correction: why it matters, what causes errors, and the approaches being developed to address them.
In our spotlights, we feature a milestone in quantum networking as engineers transmit quantum signals using standard Internet Protocol, and a report on the rapid growth of the Data Center Interconnect market.
The headlines cover record sales in semiconductors, new funding and breakthroughs in quantum, advances in photonics and neuromorphic computing, shifts in data center infrastructure, and updates in AI.
This week’s readings include AI data center chips, water use in fabs, quantum RAM, photonic networks, and new neuromorphic designs.
Funding news brought (only) four rounds across AI, photonics, and semiconductors, ranging from a €1.5M Seed to a $55M Series C.
In our two bonus sections, we cover quantum’s new role in defense and, as in almost every issue, the U.S.–China race over AI chips and export controls.
Deep Dive: Quantum Error Correction – What It Takes to Scale
Why It Matters
Quantum computers are inherently noisy, and without quantum error correction (QEC) they will never scale to solve real-world problems. No amount of raw qubit count or algorithmic progress can compensate for this. QEC is not an add-on, but the only path to reliable, fault-tolerant quantum computation.
Causes of Errors
Decoherence: Qubits lose their quantum state due to unwanted coupling with the environment, leading to loss of quantum information over time.
Gate infidelity: Operations on qubits are imperfect due to calibration errors, control noise, or hardware limitations, resulting in small but compounding deviations from the intended state evolution.
Crosstalk: Control signals targeting one qubit unintentionally induce transitions or noise in neighboring qubits due to residual coupling.
Leakage: Qubits transition into higher energy levels outside the defined computational basis, making standard control and readout operations ineffective.
Readout errors: The final measurement of the qubit state yields incorrect results due to signal degradation, limited detector fidelity, or noise in the amplification chain.
Potential Solutions
Logical encoding: Physical qubits are grouped into larger structures called logical qubits using error-correcting codes, which allow errors to be detected and corrected without measuring the actual quantum information.
Real-time decoding: Classical processors continuously monitor the system by analyzing error signals, known as syndromes, during the computation. To be effective, decoding must happen within microseconds to keep up with qubit operation speeds.
Application-layer support: Instead of correcting errors physically, software-based strategies can reduce their impact by adapting algorithms to known noise patterns, which can be managed through techniques like error mitigation, circuit optimization, and repeated sampling.
Hardware–software co-design: Reliable QEC depends on seamless coordination between qubit control, measurement, and classical processing (including feedback, decoding, and compilation). This requires designing all system layers (from hardware to compilers) to support fast and synchronized error correction.
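To make the encode–measure–decode loop above concrete, here is a minimal classical sketch of the simplest error-correcting code: the 3-bit repetition code against bit flips. This is a toy model, not real QEC (quantum codes must also handle phase errors and cannot copy states), and every name and parameter below is illustrative.

```python
import random

def encode(bit):
    """Encode one logical bit into three physical bits (3-bit repetition code)."""
    return [bit, bit, bit]

def apply_noise(bits, p):
    """Flip each physical bit independently with probability p (bit-flip channel)."""
    return [b ^ (random.random() < p) for b in bits]

def syndromes(bits):
    """Parity checks on adjacent pairs: error signals read without
    looking at the encoded information itself."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def decode(bits):
    """Majority vote: corrects any single bit flip."""
    return int(sum(bits) >= 2)

random.seed(0)
trials, p = 100_000, 0.05
failures = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(f"physical error rate:  {p}")
print(f"logical failure rate: {failures / trials:.4f}")
```

With p = 0.05 the logical failure rate lands near 3p² ≈ 0.0075, roughly an order of magnitude below the physical rate: the basic payoff that encoding and syndrome decoding promise, provided the decode step keeps pace with the noise.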
Key Challenges
Overhead: Implementing QEC requires a significant number of physical qubits to encode a single logical qubit, especially when using surface codes. This creates a resource bottleneck for current-generation quantum hardware.
Latency: Decoding must be performed faster than new errors occur, typically within a microsecond, to prevent error accumulation during quantum operations. Low-latency feedback is essential for maintaining logical qubit fidelity.
Scalability: More efficient codes can reduce qubit overhead but often require complex connectivity and advanced control, which are difficult to implement at scale. These demands can limit compatibility with certain qubit modalities.
Integration: Effective QEC requires close coordination between hardware, firmware, and software components such as decoders, control systems, and compilers. All layers must be designed to support synchronized, real-time correction workflows.
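The overhead and latency trade-off above can be put in rough numbers with the standard back-of-the-envelope model for surface codes: the logical error rate falls as about (p/p_th)^((d+1)/2) with code distance d, while the physical qubit count grows as about 2d². The threshold, prefactor, and qubit formula below are illustrative textbook-style estimates, not figures for any particular machine.

```python
P_TH = 1e-2   # rough surface-code threshold (illustrative)
A = 0.1       # order-of-magnitude prefactor (illustrative)

def logical_error_rate(p, d):
    """Heuristic scaling A * (p/p_th)^((d+1)/2) for a distance-d surface code."""
    return A * (p / P_TH) ** ((d + 1) // 2)

def qubits_per_logical(d):
    """Data plus ancilla qubits in one distance-d surface-code patch: 2d^2 - 1."""
    return 2 * d * d - 1

def distance_for_target(p, target):
    """Smallest odd code distance whose estimated logical error rate meets the target."""
    d = 3
    while logical_error_rate(p, d) > target:
        d += 2
    return d

p, target = 1e-3, 1e-12
d = distance_for_target(p, target)
print(f"physical error rate {p}, target logical rate {target}")
print(f"code distance needed: {d}")
print(f"physical qubits per logical qubit: {qubits_per_logical(d)}")
```

Under these assumptions, reaching a 10⁻¹² logical error rate from 10⁻³ physical errors takes a distance around 21, i.e. on the order of 900 physical qubits per logical qubit, which is why overhead dominates the scaling discussion.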
If you want to learn more about hardware-agnostic QEC at the application level, check out our interview with Commutator Studios for our Future of Computing blog.
Spotlights
“In a first-of-its-kind experiment, engineers at the University of Pennsylvania brought quantum networking out of the lab and onto commercial fiber-optic cables using the same Internet Protocol (IP) that powers today's web.
Reported in Science, the work shows that fragile quantum signals can run on the same infrastructure that carries everyday online traffic. The team tested their approach on Verizon's campus fiber-optic network.”
💥 Data Center Interconnect Market Size, 2025 - 2034 (Global Market Insights) (27 mins)
This reading summarizes the rapid growth of the Data Center Interconnect (DCI) market, projected to rise from $11.9B in 2025 to $35.9B by 2034. Growth is driven by AI workloads, cloud adoption, and edge computing, with North America leading and Asia-Pacific expanding fastest.
This is relevant because DCI underpins the entire digital infrastructure, enabling low-latency, high-bandwidth connectivity that makes large-scale AI training, multi-cloud, and real-time applications possible.
Headlines
Last week’s headlines feature record sales and acquisitions in semiconductors, new initiatives and research breakthroughs in quantum, advances in photonic and neuromorphic systems, shifts in data center infrastructure, and updates in AI.
🦾 Semiconductors
Nvidia reports record sales as the AI boom continues (TechCrunch)
Denmark can now contribute to producing world-class chips (University of Copenhagen)
⚛️ Quantum Computing
SEC Documents Confirm Larger Quantinuum Funding Round (The Quantum Insider)
Researchers Develop Reliable Room-Temperature Single-Photon Emitters Using Readily Available C60 Fullerenes (Quantum Zeitgeist)
Europe’s Quantum Strategy Faces Calls for Urgent Action (The Quantum Insider)
EU Gives Greater Access to Quantum Computers to Accelerate Next-Generation Technology (The Quantum Insider)
NASA Selects Planette to Develop the First Quantum-Inspired AI System for Extreme Weather Prediction (The Quantum Insider)
Quantum Computing Joins the Next Frontier in Genomics: Sanger Institute Partners With Quantinuum in Q4Bio Bid (The Quantum Insider)
⚡️ Photonic / Optical Computing
Chinese scientists make breakthrough in ultra-wideband photonic-electronic 6G communication (China Daily)
Xscape Photonics and Tower Semiconductor Unveil the Industry’s First Optically Pumped On-Chip Multi-Wavelength Laser Platform for AI Datacenter Fabrics (Global Newswire)
🧠 Neuromorphic Computing
Researchers achieve 90.7% accuracy with novel optical neuromorphic system using nanocrystals (Quantum Zeitgeist)
💥 Data Centers
Meta Turns to Google Cloud for AI Data Centre Infrastructure (Data Centre Magazine)
Synopsys Embraces NVIDIA RTX PRO Servers to Accelerate Compute-Heavy Simulation Workloads (Synopsys Newsroom)
🤖 AI
Elon Musk says xAI has open sourced Grok 2.5 (TechCrunch)
Readings
This week’s reading list explores shifts in the AI chip market, water use and lithography in semiconductor fabs, new approaches to quantum memory and federated learning, analog breakthroughs in photonics, and brain-inspired circuit designs.
🦾 Semiconductors
AI data center chip market to hit $286 bn, growth likely peaking as custom ASICs gain ground (Omdia) (3 mins)
How Semiconductor Fabs Use Water (Semiconductor Engineering) (34 mins)
Semiconductor Lithography Materials Market Evolution: EUV, Integration, and Advanced Device Drivers (Techcet) (5 mins)
⚛️ Quantum Computing
Quantum memory array brings us closer to a quantum RAM (Phys.org) (6 mins)
Researchers Unlock Potential of Quantum Federated Learning for Decentralized Computing and Enhanced Privacy (Quantum Zeitgeist) (7 mins)
⚡️ Photonic / Optical Computing
Digital to analog in one smooth step: Device could replace signal modulators in fiber-optic networks (TechXplore) (5 mins)
🧠 Neuromorphic Computing
Artificial neuron merges DRAM with MoS₂ circuits to better emulate brain-like adaptability (TechXplore) (6 mins)
Turning spin loss into energy: New principle could enable ultra-low power devices (Phys.org) (6 mins)
💥 Data Centers
Data center engineers are skeptical of putting AI in charge (IEEE Spectrum) (11 mins)
DCD visits quantum data centers from IBM and IQM in Germany (Data Centre Dynamics) (22 mins)
Funding News
Last week brought four announced rounds across AI, photonics, and semiconductors, with more activity at the later stages. Investment sizes stretched from a €1.5M Seed to a $55M Series C.
| Amount | Name | Round | Category |
|---|---|---|---|
| €1.5M | | Seed | AI |
| $20M | | | AI |
| $34M | | | Photonics / Optical |
| $55M | | Series C | Semiconductors |
Bonus 1: Quantum for the Battlefield
Quantum is increasingly treated as a defense technology. Governments are funding sensor and navigation programs, militaries are awarding contracts, and companies are creating dedicated defense divisions. All of this signals that quantum is now seen as a strategic asset for national security.
Silicon Quantum Computing, CSIRO Win Australian Defense Contracts to Build Quantum Tech For Defense (The Quantum Insider)
Sydney quantum startup Q-CTRL bags $38 million US defence contract for GPS-free navigation sensors (Startup Daily)
Quantum Research Sciences Developing AI Platform to Help Air Force More Efficiently Connect With Industry (The Quantum Insider)
Bonus 2: China Doubles Down, U.S. Closes the Gaps
Another week of U.S. vs. China: China is ramping up domestic AI chip production, with new fabs aiming to triple output next year. Baidu launched its Baige 5.0 platform, powered by Chinese chips, which increased the efficiency of DeepSeek’s R1 model by about 50%. Alibaba followed with a new inference chip designed to reduce reliance on Nvidia.
China Aims to Triple AI Chip Production Amid Competition with US (UNN)
Baidu Unveils AI Computing Platform Powered by Chinese Chips to Push a Domestic Tech Stack (South China Morning Post)
Alibaba Develops New AI Chip to Help Fill Nvidia’s Absence in China (Seeking Alpha)
Meanwhile, TSMC stated that it will not utilize Chinese-made equipment in its most advanced 2-nanometer factories. Additionally, the U.S. has closed an export loophole that had allowed some firms to ship chip-making gear to China without a license.
Nvidia’s $50bn China Opportunity, TSMC Dumps China Tools (Nikkei Asia)
Department of Commerce Closes Export Controls Loophole for Foreign-Owned Semiconductor Fabs in China (U.S. Department of Commerce, Bureau of Industry & Security)