Quantum is Mission Critical, IBM & Cisco, Riverlane, QuEra, Kvantify and more - The Week in Quantum Computing, November 24th, 2025
Issue #260
Quick Recap
The 2025 Annual Report to the US Congress redefined quantum computing as a “mission-critical national asset,” set a “Quantum First” 2030 objective for quantum advantage across key sectors, and urged a shift from funding science to “buying outcomes,” while warning of sovereign risks from China’s industrial-scale quantum investment. On the commercial front, Broadcom launched quantum-safe Gen 8 128G Fibre Channel SAN switches, integrating post-quantum cryptography and AI-driven automation to secure and modernize enterprise data infrastructure in preparation for both AI and quantum threats. In tandem, IBM and Cisco outlined plans to build a distributed, fault-tolerant quantum computing network by 2030, promising architectures capable of trillions of quantum gates through advanced networking technologies and a demonstration within three years. Research and industry partnerships remained central this week. Riverlane’s new report underscored the pressing challenge of quantum error correction, stressing that millions of physical qubits may be needed for a single logical qubit and reaffirming this bottleneck as a defining obstacle for usable quantum systems. On the hybrid-computing front, QuEra and Dell demonstrated integrated quantum–classical HPC workflows, while Kvantify launched Qrunch, a user-friendly quantum chemistry platform enabling practical applications in drug discovery for companies like Novonesis on IQM hardware. ORCA Computing and SiC Systems, in collaboration with DTU and Novo Nordisk, won the HPC Innovation Excellence Award for their hybrid photonic quantum–AI platform for biomanufacturing optimization. Research highlights include a new constraint-aware QAOA algorithm for constrained optimization and a comprehensive review of quantum machine learning’s current benefits and persistent open questions.
In conclusion, this week marked pivotal progress in both quantum policy and practical deployments, while reports on infrastructure and algorithms highlight ongoing challenges for real-world impact as we move into 2026.
Wow, that was a mouthful. Moving on.
What is Quantum Machine Learning?
A great paper by Su Yeon Chang and M. Cerezo from Los Alamos National Laboratory brings a survey of the state of QML and its applications. So I summarized and AI-ized it for your consumption.
Quantum machine learning (QML) is fundamentally defined as a computational paradigm that seeks to apply quantum-mechanical resources to solve learning problems.
QML arose from the effort to combine the theoretical promise of quantum computing with the empirical success of classical machine learning (ML).
Core Purpose and Scope
The primary goal of the QML framework is to leverage quantum processors to tackle various learning tasks more efficiently than purely classical models. These tasks include:
• Optimization
• Supervised learning
• Unsupervised learning
• Reinforcement learning
• Generative modeling
QML is an interdisciplinary endeavor, arising from the quest to combine quantum algorithmic techniques with the pragmatic, data-driven methods of modern ML. This combination leads to hopes of accelerating existing learning tasks and enabling new ones, particularly in regimes native to quantum data.
QML is described as “somewhat of an umbrella term spanning many settings”. The landscape of QML can be schematically organized along two axes: data type (classical versus quantum) and algorithm type (classical versus quantum). Schemes generally focus on models where the quantum device is used as the central information-processing or data-generating unit, relying on coherent state preparation, evolution, and measurements.
Historical Context
Despite its current widespread attention, QML is not a new field. Initial foundational concepts date back decades:
• The quantum perceptron was introduced in 1994.
• A quantum version of the probably approximately correct (PAC) learning formalism was developed in 1995.
• Initial ideas for quantum artificial neural networks emerged in the mid-1990s.
Approaches to QML
Broadly, QML approaches often fall into two strands when applied to classical data:
1. QML-from-the-best-quantum: Algorithms built directly from flagship quantum algorithms, such as leveraging primitives like quantum random access memory (QRAM) or the Harrow-Hassidim-Lloyd (HHL) algorithm for tasks like quantum Principal Component Analysis (qPCA) or recommendation systems.
2. QML-from-the-best-classical (Variational QML): This approach mimics modern classical ML, using parametrized hypothesis classes (like Parameterized Quantum Circuits or Quantum Neural Networks) where parameters are trained by minimizing an empirical loss, often following a hybrid classical-quantum computational paradigm.
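The variational strand can be sketched end-to-end with a toy example. The snippet below is a minimal NumPy sketch, not any particular framework's API: the "model" is a single-qubit circuit RY(θ)|0⟩, the loss is the expectation value of Pauli-Z, and the gradient comes from the parameter-shift rule commonly used to train parameterized quantum circuits.

```python
import numpy as np

# Toy variational QML loop on one simulated qubit.
# Model: |psi(theta)> = RY(theta)|0> = [cos(theta/2), sin(theta/2)].
# Loss: <psi| Z |psi> = cos(theta), minimized at theta = pi.

def expectation_z(theta: float) -> float:
    """Expectation of Pauli-Z in the state RY(theta)|0>."""
    state = np.array([np.cos(theta / 2), np.sin(theta / 2)])
    z = np.array([[1.0, 0.0], [0.0, -1.0]])
    return float(state @ z @ state)

def parameter_shift_grad(theta: float) -> float:
    """Exact gradient via the parameter-shift rule."""
    return 0.5 * (expectation_z(theta + np.pi / 2)
                  - expectation_z(theta - np.pi / 2))

theta, lr = 0.1, 0.4
for _ in range(100):                 # plain gradient descent
    theta -= lr * parameter_shift_grad(theta)

print(expectation_z(theta))  # approaches -1, the loss minimum
```

On hardware, `expectation_z` would be estimated from measurement shots rather than computed exactly, which is where the hybrid classical-quantum loop mentioned above comes in.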
For quantum data, QML involves methods for extracting information from quantum experiments and learning properties of quantum states and channels. This area has seen the clearest practical success by grounding access models in physical reality (e.g., addressing constraints like the no-cloning theorem).
The Week in Quantum Computing
QuEra to Showcase Quantum/Classical with Dell Integration at SC25
QuEra Computing announced at SC25 a collaboration with Dell Technologies, highlighting the integration of QuEra’s neutral-atom quantum systems with Dell’s HPC infrastructure, including PowerEdge servers, NVIDIA GPUs, and the Dell Quantum Intelligent Orchestrator (QIO). The SC25 demonstration features a co-located setup in Boston that simulates Greenberger–Horne–Zeilinger (GHZ) state generation, leveraging QuEra’s qubit shuttling and parallel gate execution. Described by QuEra CCO Yuval Boger as “a clear signal that hybrid quantum–classical computing is becoming practical,” this prototype allows rapid experimentation and transparent benchmarking for diverse sectors. The effort marks a practical step for hybrid quantum–classical computing workflows within HPC centers, hyperscalers, and enterprises, moving quantum computing closer to integration with traditional IT strategies.
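The GHZ circuit at the heart of the demo is simple enough to sketch: a Hadamard on the first qubit followed by a CNOT chain. The NumPy statevector simulation below is purely illustrative and is not QuEra or Dell code.

```python
import numpy as np

# Statevector sketch of GHZ-state preparation:
# H on qubit 0, then CNOTs 0->1, 1->2, ..., giving (|00...0> + |11...1>)/sqrt(2).

def apply_gate(state, gate, targets, n):
    """Apply a k-qubit gate matrix to the target qubits of an n-qubit state."""
    state = state.reshape([2] * n)
    k = len(targets)
    state = np.moveaxis(state, targets, range(k))   # bring targets to front
    state = gate @ state.reshape(2**k, -1)          # act on the target block
    state = np.moveaxis(state.reshape([2] * n), range(k), targets)
    return state.reshape(-1)

n = 3
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

state = np.zeros(2**n)
state[0] = 1.0                                      # start in |000>
state = apply_gate(state, H, [0], n)                # superpose qubit 0
for q in range(n - 1):
    state = apply_gate(state, CNOT, [q, q + 1], n)  # entangle the chain

print(np.round(state, 3))  # amplitude 1/sqrt(2) on |000> and |111>
```

In the actual demo the interesting part is not the circuit but the orchestration: QIO schedules this kind of job across the QPU and classical nodes.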
Paper: Empirical Quantum Advantage in Constrained Optimization from Encoded Unitary Designs
Researchers Chinonso Onah, Roman Firt, and Kristel Michielsen present the Constraint-Enhanced Quantum Approximate Optimization Algorithm (CE-QAOA), a shallow, constraint-aware quantum algorithm aimed at constrained optimization problems. Their ancilla-free encoder prepares W_n states with n-1 two-qubit rotations per block, and their two-local XY mixer maintains a constant spectral gap. The hybrid quantum-classical PHQC solver they introduce identifies optimal solutions in O(S n^2) time. Notably, CE-QAOA demonstrates a Θ(n^r) reduction in shot complexity and an exp(Θ(n^2)) minimax separation against certain classical baselines. In noiseless simulations of traveling salesman problems (TSP) with 4–10 cities from the QOPTLib library, global optima are recovered at depth p=1 using polynomial resources, suggesting a practical quantum advantage in constrained optimization.
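The core constraint-awareness idea is worth unpacking: an XY mixer only "hops" excitations between qubits, so a one-hot-encoded state (a W_n state) never leaks into infeasible bitstrings. The NumPy sketch below illustrates that generic property on three qubits; it is not the paper's encoder or solver, and the ring-mixer form is the usual QAOA convention, not taken from the paper.

```python
import numpy as np

# Show that evolution under an XY mixer preserves the Hamming-weight-1
# (one-hot) subspace, the feasibility constraint behind W-state encodings.

n = 3
dim = 2**n
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]])
I = np.eye(2, dtype=complex)

def kron_at(op, i, j):
    """Tensor product with op on qubits i and j, identity elsewhere."""
    mats = [I] * n
    mats[i], mats[j] = op, op
    out = mats[0]
    for m in mats[1:]:
        out = np.kron(out, m)
    return out

# Ring XY mixer: sum over edges of (X_i X_j + Y_i Y_j) / 2
H_xy = sum(0.5 * (kron_at(X, i, (i + 1) % n) + kron_at(Y, i, (i + 1) % n))
           for i in range(n))

# W_3 state: uniform superposition of 001, 010, 100 (indices 1, 2, 4)
w = np.zeros(dim, dtype=complex)
for idx in (1, 2, 4):
    w[idx] = 1 / np.sqrt(3)

# exp(-i * beta * H_xy) via eigendecomposition (H_xy is Hermitian)
evals, evecs = np.linalg.eigh(H_xy)
U = evecs @ np.diag(np.exp(-1j * 0.7 * evals)) @ evecs.conj().T
evolved = U @ w

weight = [bin(k).count("1") for k in range(dim)]
leak = sum(abs(evolved[k])**2 for k in range(dim) if weight[k] != 1)
print(leak)  # ~0: no probability escapes the feasible subspace
```

Because the mixer never produces infeasible states, no shots are wasted on constraint-violating bitstrings, which is the intuition behind the reported shot-complexity reduction.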
Kvantify launches Qrunch
Danish company Kvantify has launched Qrunch, quantum chemistry software designed to make quantum computing accessible, scalable, and cost-effective for non-experts, especially in pharmaceutical and materials science. Qrunch enables researchers to simulate complex chemistry problems on quantum hardware, with Danish biotech leader Novonesis among the first testers. Lars Olsen of Novonesis highlights improved enzyme product efficiency through deeper biological insights provided by Qrunch’s simulations. The platform operates on IQM Quantum Computers’ superconducting hardware, a collaboration that, according to Jan Goetz of IQM, aims to boost Europe’s quantum ecosystem through integrated hardware-software innovation.
ORCA Computing and SiC Systems Win 2025 HPC Innovation Excellence Award for Quantum-Accelerated Agentic AI
ORCA Computing and SiC Systems, in collaboration with the Technical University of Denmark (DTU) and Novo Nordisk, have won the 2025 HPC Innovation Excellence Award, presented by Hyperion Research, for their project on “Agentic AI for Biomanufacturing Optimization Using Hybrid Quantum–Classical HPC Systems.” The project features a Sense–Infer–Control (SIC) platform combining photonic quantum processors, GPU-accelerated HPC, and AI agents to optimize biomanufacturing workflows in real time.
IBM and Cisco Announce Plans to Build a Network of Large-Scale, Fault-Tolerant Quantum Computers
IBM and Cisco announced a collaboration to build a network of large-scale, fault-tolerant quantum computers, aiming for an initial demonstration by 2030 and targeting fully distributed quantum computing by the early 2030s. IBM’s quantum hardware will be linked by Cisco’s quantum networking technology to enable computations spanning tens to hundreds of thousands of qubits and trillions of quantum gates. This architecture relies on innovations such as microwave-optical transducers, quantum networking units (QNUs), and dynamic network software. Work with Fermi National Accelerator Laboratory’s SQMS will investigate scaling within data centers, with a demo planned in three years. This initiative could create the technological groundwork for a quantum computing internet spanning cities and potentially the globe by the late 2030s.
The “Science Project” Era is Over. Report to US Congress Calls for a “Quantum First” 2030 Goal
The 2025 Annual Report to US Congress marks a turning point for quantum computing, declaring it a “mission-critical national asset” beyond research. The report sets a “Quantum First” 2030 objective, targeting quantum advantage in cryptography, drug discovery, and materials science. It warns that mere qubit innovations are insufficient, urging modernization of cryogenic labs, fabrication lines, and measurement facilities. The report spotlights the intersection of quantum and AI as a transformative force and highlights China’s “industrial-scale funding” in dual-use quantum technologies as a sovereign risk. The Commission asserts that the era of “funding science” is over, demanding the US shift to “buying outcomes.” The urgency to act signifies a pivotal shift in US quantum policy for 2025 and beyond.
Broadcom Introduces the World’s First Quantum-Safe Gen 8 128G SAN Switch Portfolio
Broadcom has unveiled the industry’s first quantum-safe Gen 8 128G Fibre Channel SAN switch portfolio, featuring the Brocade X8 Directors and G820 56-port switch. Announced on November 19, 2025, these platforms deliver 128G performance and integrate quantum-resistant 256-bit encryption alongside post-quantum cryptography algorithms to address security threats posed by quantum computing. The switches incorporate embedded SAN AI for automated infrastructure management and offer high scalability, supporting up to 384×128G ports on the X8 Director. According to Dennis Makishima, VP and GM at Broadcom, this generation secures and automates storage networks for enterprise AI workloads. As organizations prepare for sophisticated threats from both AI and quantum technologies, enterprise infrastructure faces an inflection point with quantum-safe, AI-driven SAN solutions.
Riverlane report reveals scale of the Quantum Error Correction challenge
Riverlane has released a report detailing the immense scale of the Quantum Error Correction (QEC) challenge facing the field in 2025. The analysis underscores the need for millions of physical qubits to realize a single logical qubit, emphasizing error correction as a central hurdle for practical quantum computing. Riverlane’s findings highlight that, without significant advances in QEC, building useful quantum computers will remain out of reach. This report is a critical reminder that, despite recent progress and investments, the error correction bottleneck dominates near-term feasibility, reaffirming the gap between experimental prototypes and scalable, reliable quantum hardware.
Paper: A Primer on Quantum Machine Learning
Su Yeon Chang and M. Cerezo, in “A Primer on Quantum Machine Learning” (arXiv:2511.15969, November 2025), provide a comprehensive overview of quantum machine learning (QML), emphasizing its promise to tackle optimization, supervised, unsupervised, and reinforcement learning, and generative modeling more efficiently than classical models. The chapter outlines tensions between practicality and theoretical guarantees, differences in access models, and assessments of quantum speedups versus classical baselines, noting where evidence for quantum advantage is strong, conditional, or lacking. With 29+16 pages, 5 figures, and 15 boxes, the work aims to clarify the nuanced landscape and persistent open questions within QML. This survey underscores that the debate over when QML offers real benefits remains active and heavily dependent on context and assumptions.
Demystifying Logical Qubits and Fault Tolerance
IonQ’s November 2025 article clarifies the significance and challenges of logical qubits and fault tolerance in quantum computing. Logical qubits, first proposed by Peter Shor in 1995, are clusters of physical qubits designed for enhanced reliability through quantum error correction. Achieving practical logical qubits remains complex: attributes like physical-to-logical ratio, logical error rates, gate fidelity, speed, and universality all vary and are interdependent. Poor physical qubits yield ineffective logical qubits, meaning early demonstrations may offer little advantage over single physical qubits. IonQ highlights its ultra-high-fidelity trapped-ion hardware, notably leveraging barium-based qubits and its acquisition of Oxford Ionics, achieving record two-qubit gate fidelities of 99.99%. With current systems still in the NISQ era, IonQ’s approach aims for highly efficient, scalable fault-tolerant computation as physical qubit quality advances.
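The point that poor physical qubits yield ineffective logical qubits has a textbook illustration: the 3-qubit repetition code with majority-vote decoding. The sketch below is that standard back-of-envelope calculation, not IonQ's or Riverlane's actual codes, which are far more sophisticated.

```python
# 3-qubit repetition code against independent bit-flips with probability p.
# Majority vote fails when 2 or 3 qubits flip:
#   p_logical = 3 p^2 (1 - p) + p^3 = 3 p^2 - 2 p^3

def logical_error_rate(p: float) -> float:
    """Probability that majority voting over 3 qubits decodes wrongly."""
    return 3 * p**2 * (1 - p) + p**3

for p in (0.6, 0.1, 0.01):
    pl = logical_error_rate(p)
    verdict = "better" if pl < p else "worse"
    print(f"p={p:<5} -> p_logical={pl:.6f} ({verdict} than a bare qubit)")
```

Encoding only helps below a threshold (here p < 1/2): above it, the "logical" qubit is worse than a single bare qubit, which is exactly why gate fidelity and physical-to-logical ratio are interdependent.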
Single-photon switch could enable photonic computing
Researchers at Purdue University, led by Vladimir Shalaev and first author Demid Sychev, have demonstrated a “photonic transistor” operating at single-photon intensities, published in Nature Nanotechnology (2025, DOI: 10.1038/s41565-025-02056-2). Using the avalanche multiplication process from single-photon avalanche diodes (SPAD), their device achieves a nonlinear refractive index several orders of magnitude higher than prior materials. Three significant features are highlighted: operation at room temperature, compatibility with semiconductor manufacturing, and gigahertz speeds (potentially hundreds of gigahertz). This achievement addresses a critical bottleneck for scalable photonic and quantum computing, enabling low-power, fast photonic switches.
Physicists Take the Imaginary Numbers Out of Quantum Mechanics
In 2025, a longstanding debate about the necessity of imaginary numbers in quantum mechanics reignited when German and French theorists, followed by a quantum computing researcher, published real-valued formulations of quantum theory that are mathematically equivalent to the standard version, challenging earlier 2021 experimental claims by Marc-Olivier Renou, Nicolas Gisin, and colleagues. Their Nature experiment aimed to exclude real-number-based quantum theories, but the new work shows it depended on a contested assumption. Despite Erwin Schrödinger’s historical reservations and Bill Wootters noting the centrality of complex numbers in quantum theory, these developments suggest i may not be fundamental.
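One standard trick behind real-valued formulations makes the equivalence concrete: encode a complex state vector as [Re(ψ); Im(ψ)] and a complex unitary as a real orthogonal block matrix. The sketch below shows that trick in NumPy; it is an illustration of the general idea, not the specific construction in any of the papers mentioned.

```python
import numpy as np

# Map the complex formalism to a doubled real one:
#   psi -> [Re(psi); Im(psi)],  U -> [[Re U, -Im U], [Im U, Re U]]
# Unitaries become orthogonal matrices and probabilities are preserved.

def realify_state(psi):
    return np.concatenate([psi.real, psi.imag])

def realify_unitary(U):
    return np.block([[U.real, -U.imag],
                     [U.imag,  U.real]])

# A complex single-qubit unitary (phase gate after Hadamard) and a state
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
S = np.diag([1, 1j])
U = S @ H
psi = np.array([1, 0], dtype=complex)

complex_out = U @ psi
real_out = realify_unitary(U) @ realify_state(psi)

# |psi_k|^2 = Re(psi_k)^2 + Im(psi_k)^2, so measurement statistics agree
probs_complex = np.abs(complex_out)**2
probs_real = real_out[:2]**2 + real_out[2:]**2
print(np.allclose(probs_complex, probs_real))  # True
```

The debate is not whether such encodings exist (they clearly do) but whether they come at a physically meaningful cost, such as the extra structure needed for composite systems.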
Amazon S3 now supports post-quantum TLS key exchange on S3 endpoints
On November 19, 2025, Amazon Web Services announced that Amazon S3 now supports post-quantum TLS key exchange across all regional S3, S3 Tables, and S3 Express One Zone endpoints. This implementation uses Module Lattice-Based Key Encapsulation Mechanisms (ML-KEM), a National Institute of Standards and Technology (NIST) standardized post-quantum cryptographic algorithm, providing customers with quantum-resistant encryption for data in transit. Combined with default server-side AES-256 encryption, S3 users now gain end-to-end quantum-resistant protection for both data in-transit and at-rest, at no additional cost. By adopting ML-KEM at scale, AWS signals a significant move towards integrating NIST-standardized post-quantum cryptography in mainstream cloud storage services in 2025.
NVIDIA NVQLink Architecture Integrates Accelerated Computing with Quantum Processors
NVIDIA unveiled NVQLink a few weeks ago, an open platform architecture that integrates GPU-accelerated computing directly with quantum processors via a low-latency (<4 microseconds) RDMA over Ethernet interconnect using standard NVIDIA networking hardware. Quantinuum has adopted NVQLink for its Helios QPU, pairing it with an NVIDIA GH200 Grace Hopper host, and demonstrated real-time decoding of the Bring's code, a qLDPC code, achieving a 67 microsecond median decoding time and a 5.4x error rate improvement. Developers benefit from unified programming in C++ or Python using CUDA-Q, allowing direct invocation of classical functions from quantum kernels. This is the technical blog.
Love this!
The error correction bottleneck really puts things in perspective. If you need millions of physical qubits for one logical qubit, we're talking about scale challenges that make current semiconductor manufacturing look simple. Makes me wonder how realistic that 2030 timeline is for practical deployment.