Scientists build quantum computers that snap together like LEGO bricks
Quantum-computing companies have been competing for years to squeeze the most qubits onto a chip. But fabrication and connectivity challenges mean there are limits to this strategy. The focus is now shifting to linking multiple quantum processors together to build computers large enough to tackle real-world problems.
In January, the Canadian quantum-computing company Xanadu unveiled what it says is the first modular quantum computer. Xanadu’s approach uses photons as qubits—just one of many ways to create the quantum-computing equivalent of a classical bit. In a paper published that same month in Nature, researchers at the company outlined how they connected 35 photonic chips and 13 kilometers of optical fiber across four server racks to create a 12-qubit quantum computer called Aurora. Although there are quantum computers with many more qubits today, Xanadu says the design demonstrates all the key components for a modular architecture that could be scaled up to millions of qubits.
[Image: IBM plans to update its 156-qubit Heron processors to deliver a modular, 1,000+ qubit Flamingo processor this year.]
Scaling Quantum Machines
Quantum computing depends on the delicate control of qubits, the basic units of quantum information. Large-scale quantum processors may need millions of qubits, but assembling them into one massive, monolithic system has proven extremely difficult. Errors grow quickly as systems expand, limiting their usefulness for real-world problems.
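To see the scale of the problem, consider a toy calculation: if every operation succeeds independently with probability 1 − ε, a circuit of n operations succeeds with probability roughly (1 − ε)^n. A minimal sketch, with an error rate that is an illustrative assumption rather than a figure from any system described here:

```python
# Back-of-the-envelope sketch of how errors compound in a large system.
# The 0.1% gate error rate is an illustrative assumption, not a measured value.

def circuit_success_probability(gate_error: float, n_gates: int) -> float:
    """Probability that n_gates sequential operations all succeed,
    assuming independent errors at rate gate_error per operation."""
    return (1.0 - gate_error) ** n_gates

for n in (100, 1_000, 10_000, 100_000):
    p = circuit_success_probability(gate_error=1e-3, n_gates=n)
    print(f"{n:>7} gates at 0.1% error each -> {p:6.1%} chance of no error")

# ~90% at 100 gates, ~37% at 1,000, essentially zero by 10,000 --
# which is why large systems need error correction and architectures
# that keep error rates low as they grow.
```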
A team at the University of Illinois Urbana-Champaign, led by Wolfgang Pfaff, an assistant professor of physics, embraces modularity. Instead of building one enormous chip, the researchers create smaller, high-performance modules, quantum "blocks" that snap together into a bigger system, just as plastic LEGO bricks combine to form complex structures. The team demonstrated how two quantum modules can be connected by superconducting coaxial cables to share entanglement and perform joint operations.
Remarkably, the team achieved a SWAP gate fidelity of roughly 99 percent, meaning less than 1 percent of the quantum information was lost as qubit states were exchanged between modules. “We’ve created an engineering-friendly way of achieving modularity with superconducting qubits,” Pfaff explained. “Can I build a system that I can bring together, manipulate two qubits jointly, create entanglement, and later reconfigure if needed? The answer is yes — and with very high quality.”
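To make that number concrete, here is a minimal numpy sketch of a SWAP gate followed by two-qubit depolarizing noise. The noise model and its strength are generic illustrative choices on my part (picked to land near 99 percent), not the team's actual error analysis:

```python
import numpy as np

# Toy model: an ideal SWAP followed by a two-qubit depolarizing channel.
# The channel and its strength are illustrative, not the measured error model.

SWAP = np.array([[1, 0, 0, 0],
                 [0, 0, 1, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1]], dtype=complex)

def noisy_swap(rho: np.ndarray, p: float) -> np.ndarray:
    """Apply SWAP, then depolarize: rho -> (1 - p) * rho + p * I/4."""
    rho = SWAP @ rho @ SWAP.conj().T
    return (1 - p) * rho + p * np.eye(4) / 4

psi = np.zeros(4, dtype=complex)
psi[0b01] = 1.0                      # input state |01>
rho_out = noisy_swap(np.outer(psi, psi.conj()), p=0.0133)

target = np.zeros(4, dtype=complex)
target[0b10] = 1.0                   # ideal SWAP output |10>

fidelity = np.real(target.conj() @ rho_out @ target)
print(f"state fidelity after noisy SWAP: {fidelity:.4f}")  # ~0.9900
```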
Why Modularity Matters
Modularity has always been central to IBM’s quantum road map, says Oliver Dial, the chief technology officer of IBM Quantum. While the company has often led the field in packing more qubits into processors, there are limits to chip size. As they grow larger, wiring up the control electronics becomes increasingly challenging, says Dial. Building computers with smaller, testable, and replaceable components simplifies manufacturing and maintenance.
However, IBM uses superconducting qubits, which operate at high speeds and are relatively easy to fabricate but are less network-friendly than some other quantum technologies. Because they operate at microwave frequencies, these qubits can't easily interface with optical communications, so IBM has had to develop specialized couplers to connect both adjacent chips and more distant ones.
Monolithic superconducting systems are difficult to scale because small errors cascade as the number of qubits grows. Modular systems, by contrast, allow for hardware upgrades, targeted error detection, and incremental scaling.

For longer-distance links, IBM is also researching quantum transduction, which converts microwave photons into optical frequencies that can be transmitted over fiber optics. But the fidelity of current demonstrations is far from what is required, says Dial, so transduction isn't on IBM's official road map yet.
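The size of the gap transduction has to bridge is easy to see from the frequencies involved. A quick arithmetic sketch, using typical textbook values (a roughly 5 GHz superconducting qubit and 1,550-nanometer telecom-band light; both are generic assumptions, not IBM's specifications):

```python
# Rough arithmetic on the microwave-to-optical gap that quantum
# transduction must bridge. Typical textbook values, not IBM's specs.

PLANCK = 6.62607015e-34    # Planck constant, J*s
C = 2.99792458e8           # speed of light, m/s

f_microwave = 5e9          # ~5 GHz, a typical superconducting-qubit frequency
f_optical = C / 1550e-9    # telecom-band photon at 1550 nm, ~193 THz

print(f"frequency ratio: {f_optical / f_microwave:,.0f}x")
print(f"microwave photon energy: {PLANCK * f_microwave:.2e} J")
print(f"optical photon energy:   {PLANCK * f_optical:.2e} J")

# Each microwave photon (~20 micro-eV) must be converted into an optical
# photon carrying nearly 40,000 times more energy without adding noise,
# which hints at why demonstrated fidelities remain low.
```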
Modularity also pays off operationally: if one module fails, it can be swapped out without rebuilding the entire computer, much as classical supercomputers are built from clusters of modular processors rather than a single giant CPU. For researchers, this also means greater flexibility. Error rates can be mapped out on smaller modules before larger systems are assembled, reducing the risk of discovering catastrophic design flaws only after full integration.
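As a sketch of what that pre-integration characterization can look like, here is a randomized-benchmarking-style fit, a standard technique for extracting an average error rate per gate from how fidelity decays with circuit depth. The data below is synthetic and the decay parameters are invented for illustration:

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a randomized-benchmarking-style fidelity decay for one module.
# The "measurements" are synthetic; all parameters are invented.

rng = np.random.default_rng(0)

def decay(m, a, p, b):
    """Standard RB decay model: F(m) = a * p**m + b."""
    return a * p**m + b

depths = np.array([1, 5, 10, 25, 50, 100, 200])
data = decay(depths, 0.5, 0.995, 0.5) + rng.normal(0, 0.005, depths.size)

(a, p, b), _ = curve_fit(decay, depths, data, p0=[0.5, 0.99, 0.5],
                         bounds=(0, 1))

# For a single qubit (dimension d = 2), error per gate r = (1 - p)(d - 1)/d.
print(f"fitted decay p = {p:.4f}, error per gate ~ {(1 - p) / 2:.2%}")
```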
The Illinois team’s demonstration is a foundational milestone. The next challenge is to connect more than two modules while retaining high fidelity. If successful, this would pave the way for quantum “networks” of interchangeable processors that can be expanded almost endlessly. “Finding an approach that works has taken a while for our field,” Pfaff said. “We really want the ability to stitch bigger and bigger systems together through cables and still achieve numbers that justify scaling. Now, the question is: can we maintain this performance as we connect more devices?”
This modular approach addresses a critical bottleneck in the global race toward fault-tolerant quantum computing. It could accelerate the development of practical systems capable of solving problems in cryptography, materials science, drug discovery, and optimization that classical supercomputers cannot handle. By rethinking how quantum processors are built — not as indivisible chips but as reconfigurable, connectable components — scientists are inching closer to the day when quantum computers move from laboratory prototypes to reliable tools powering industry and research.