⚛️🦾 Quantum Error Correction – What It Takes to Scale

A Newsletter for Entrepreneurs, Investors, and Computing Geeks

Happy Monday! This week’s deep dive looks at quantum error correction: why it matters, what causes errors, and the approaches being developed to address them.

In our spotlights, we feature a milestone in quantum networking as engineers transmit quantum signals using standard Internet Protocol, and a report on the rapid growth of the Data Center Interconnect market.

The headlines cover record sales in semiconductors, new funding and breakthroughs in quantum, advances in photonics and neuromorphic computing, shifts in data center infrastructure, and updates in AI.

This week’s readings include AI data center chips, water use in fabs, quantum RAM, photonic networks, and new neuromorphic designs.

Funding news brought (only) four rounds across AI, photonics, and semiconductors, ranging from a €1.5M Seed to a $55M Series C.

In our two bonus sections, we cover quantum’s new role in defense and, as in almost every week, the U.S.–China race over AI chips and export controls.

Deep Dive: Quantum Error Correction – What It Takes to Scale

Why It Matters

Quantum computers are inherently noisy, and without quantum error correction (QEC) they will never scale to solve real-world problems. No amount of raw qubit count or algorithmic progress can compensate for this. QEC is not an add-on, but the only path to reliable, fault-tolerant quantum computation.

Causes of Errors

  • Decoherence: Qubits lose their quantum state due to unwanted coupling with the environment, leading to loss of quantum information over time.

  • Gate infidelity: Operations on qubits are imperfect due to calibration errors, control noise, or hardware limitations, resulting in small but compounding deviations from the intended state evolution.

  • Crosstalk: Control signals targeting one qubit unintentionally induce transitions or noise in neighboring qubits due to residual coupling.

  • Leakage: Qubits transition into higher energy levels outside the defined computational basis, making standard control and readout operations ineffective.

  • Readout errors: The final measurement of the qubit state yields incorrect results due to signal degradation, limited detector fidelity, or noise in the amplification chain.
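Why these small error sources matter so much: they compound. A minimal sketch, assuming a toy model where each gate fails independently with the same probability (real noise is correlated and more complex):

```python
# Sketch: how small per-gate errors compound over a circuit.
# Assumes each gate succeeds independently with probability (1 - p);
# this is an illustrative simplification, not a full noise model.

def circuit_success_probability(p_gate: float, n_gates: int) -> float:
    """Probability that none of n_gates independent gates fails."""
    return (1 - p_gate) ** n_gates

# Even 99.9% gate fidelity degrades quickly at useful circuit depths:
for n in (100, 1_000, 10_000):
    print(f"{n:>6} gates -> {circuit_success_probability(1e-3, n):.3f}")
# ≈ 0.905, 0.368, 0.000
```

This is why error rates of "only" one in a thousand are fatal for deep circuits, and why correction, not just better hardware, is required.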

Potential Solutions

  • Logical encoding: Physical qubits are grouped into larger structures called logical qubits using error-correcting codes, which allow errors to be detected and corrected without measuring the actual quantum information.

  • Real-time decoding: Classical processors continuously monitor the system by analyzing error signals, known as syndromes, during the computation. To be effective, decoding must happen within microseconds to keep up with qubit operation speeds.

  • Application-layer support: Instead of correcting errors physically, software-based strategies can reduce their impact by adapting algorithms to known noise patterns, using techniques such as error mitigation, circuit optimization, and repeated sampling.

  • Hardware–software co-design: Reliable QEC depends on seamless coordination between qubit control, measurement, and classical processing (including feedback, decoding, and compilation). This requires designing all system layers (from hardware to compilers) to support fast and synchronized error correction.
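The logical encoding and syndrome-decoding ideas above can be sketched with the simplest error-correcting code, the three-qubit bit-flip repetition code. This is a classical simulation for intuition only; the function names are illustrative, not any vendor's API:

```python
# Toy sketch of logical encoding + syndrome-based decoding, using the
# three-(qu)bit repetition code. Classical simulation for intuition only.
import random

def encode(bit: int) -> list[int]:
    """Encode one logical bit into three physical bits."""
    return [bit, bit, bit]

def apply_noise(code: list[int], p_flip: float) -> list[int]:
    """Flip each physical bit independently with probability p_flip."""
    return [b ^ (random.random() < p_flip) for b in code]

def syndrome(code: list[int]) -> tuple[int, int]:
    """Parity checks between neighboring bits: the 'error signals'
    a decoder analyzes, without reading the encoded value itself."""
    return (code[0] ^ code[1], code[1] ^ code[2])

def decode(code: list[int]) -> int:
    """Majority vote: corrects any single bit-flip error."""
    return int(sum(code) >= 2)

random.seed(0)
trials, p = 100_000, 0.05
failures = sum(decode(apply_noise(encode(0), p)) != 0 for _ in range(trials))
print(f"physical error rate {p}, logical error rate {failures / trials:.4f}")
# Logical rate ≈ 3p² − 2p³ ≈ 0.0073: encoding suppresses errors, because
# failure now requires two flips instead of one.
```

Real QEC codes (e.g. surface codes) follow the same pattern, but protect against phase errors as well and extract syndromes repeatedly during the computation.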

Key Challenges

  • Overhead: Implementing QEC requires a significant number of physical qubits to encode a single logical qubit, especially when using surface codes. This creates a resource bottleneck for current-generation quantum hardware.

  • Latency: Decoding must be performed faster than new errors occur, typically within a microsecond, to prevent error accumulation during quantum operations. Low-latency feedback is essential for maintaining logical qubit fidelity.

  • Scalability: More efficient codes can reduce qubit overhead but often require complex connectivity and advanced control, which are difficult to implement at scale. These demands can limit compatibility with certain qubit modalities.

  • Integration: Effective QEC requires close coordination between hardware, firmware, and software components such as decoders, control systems, and compilers. All layers must be designed to support synchronized, real-time correction workflows.
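To make the overhead challenge concrete, here is a back-of-envelope sketch using the commonly cited rotated-surface-code qubit count (2d² − 1 physical qubits per logical qubit at code distance d) and the standard heuristic scaling p_L ≈ A · (p/p_th)^((d+1)/2). The constants A, p, and p_th below are illustrative values, not measured figures:

```python
# Back-of-envelope surface-code overhead estimate. The scaling law and
# qubit count are standard; the numeric constants are illustrative.

def physical_qubits(d: int) -> int:
    """Rotated surface code: d² data qubits + (d² − 1) ancilla qubits."""
    return 2 * d * d - 1

def logical_error_rate(d: int, p: float, p_th: float = 1e-2, A: float = 0.03) -> float:
    """Heuristic: logical errors are suppressed exponentially in d
    once the physical error rate p is below the threshold p_th."""
    return A * (p / p_th) ** ((d + 1) / 2)

# Smallest odd code distance reaching a target logical error rate,
# assuming a physical error rate of 1e-3:
p_phys, target = 1e-3, 1e-12
d = 3
while logical_error_rate(d, p_phys) > target:
    d += 2
print(f"distance {d}: {physical_qubits(d)} physical qubits per logical qubit")
# → distance 21: 881 physical qubits per logical qubit
```

Hundreds of physical qubits per logical qubit, under optimistic assumptions, is the resource bottleneck the "Overhead" bullet refers to.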

If you want to learn more about hardware-agnostic QEC at the application level, check out our interview with Commutator Studios for our Future of Computing blog.

Other companies advancing the QEC stack include Riverlane, QuantWare, and Q-CTRL.

Spotlights

“In a first-of-its-kind experiment, engineers at the University of Pennsylvania brought quantum networking out of the lab and onto commercial fiber-optic cables using the same Internet Protocol (IP) that powers today's web.

Reported in Science, the work shows that fragile quantum signals can run on the same infrastructure that carries everyday online traffic. The team tested their approach on Verizon's campus fiber-optic network.”

💥 Data Center Interconnect Market Size, 2025 - 2034 (Global Market Insights) (27 mins)

This reading summarizes the rapid growth of the Data Center Interconnect (DCI) market, projected to rise from $11.9B in 2025 to $35.9B by 2034. Growth is driven by AI workloads, cloud adoption, and edge computing, with North America leading and Asia-Pacific expanding fastest.
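The projection above implies a steady compound growth rate, which we can back out from the report's own endpoints (a rough check on our part, not a figure from the report):

```python
# Implied compound annual growth rate (CAGR) from the DCI projection:
# $11.9B in 2025 to $35.9B in 2034, i.e. nine years of growth.
start, end, years = 11.9, 35.9, 2034 - 2025
cagr = (end / start) ** (1 / years) - 1
print(f"implied CAGR ≈ {cagr:.1%}")
# → implied CAGR ≈ 13.1%
```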

This is relevant because DCI underpins the entire digital infrastructure, enabling low-latency, high-bandwidth connectivity that makes large-scale AI training, multi-cloud, and real-time applications possible.

Headlines


Last week’s headlines feature record sales and acquisitions in semiconductors, new initiatives and research breakthroughs in quantum, advances in photonic and neuromorphic systems, shifts in data center infrastructure, and updates in AI.

🦾 Semiconductors

⚛️ Quantum Computing

⚡️ Photonic / Optical Computing

🧠 Neuromorphic Computing

💥 Data Centers

🤖 AI

Readings


This week’s reading list explores shifts in the AI chip market, water use and lithography in semiconductor fabs, new approaches to quantum memory and federated learning, analog breakthroughs in photonics, and brain-inspired circuit designs.

🦾 Semiconductors

How Semiconductor Fabs Use Water (Semiconductor Engineering) (34 mins)

⚛️ Quantum Computing

⚡️ Photonic / Optical Computing

🧠 Neuromorphic Computing

💥 Data Centers

Funding News


Last week brought four announced rounds across AI, photonics, and semiconductors, with more activity at the later stages. Investment sizes stretched from a €1.5M Seed to a $55M Series C.

Amount   Name         Round           Category
€1.5M    Farang       Seed            AI
$20M     FriendliAI   Seed Extension  AI
$34M     OpenLight    Series A        Photonics / Optical
$55M     Paragraf     Series C        Semiconductors

Bonus 1: Quantum for the Battlefield

Quantum is increasingly treated as a defense technology. Governments are funding sensor and navigation programs, militaries are awarding contracts, and companies are creating dedicated defense divisions. All of this signals that quantum is now seen as a strategic asset for national security.

Bonus 2: China Doubles Down, U.S. Closes the Gaps

Another week of U.S. vs. China: China is ramping up domestic AI chip production, with new fabs aiming to triple output next year. Baidu launched its Baige 5.0 platform, powered by chips, which increased the efficiency of DeepSeek’s R1 model by about 50%. Alibaba followed with a new inference chip designed to reduce reliance on Nvidia.

Meanwhile, TSMC stated that it will not utilize Chinese-made equipment in its most advanced 2-nanometer factories. Additionally, the U.S. has closed an export loophole that had allowed some firms to ship chip-making gear to China without a license.

❤️ Love these insights? Forward this newsletter to a friend or two. They can subscribe here.