UNESCO designated 2025 as the International Year of Quantum Science and Technology, marking the 100th anniversary of the discoveries of quantum mechanics that turned our understanding of physics on its head. For the past century, we’ve had to live with the profoundly strange notions that matter is insubstantial, that particles exist in multiple states at once, and that nothing we know is certain.
Proposed as a theory in the 1980s, quantum computing envisions a revolutionary approach to the construction and operation of computers. Classical computing executes deterministically on data and instructions represented as binary digits, or bits. Quantum computers define quantum bits, or qubits, using the quantum properties of superposition, entanglement and uncertainty, offering a much faster, probabilistic approach to solving complex mathematical problems (a flavour of which is sketched in code below). The path from theory to reality, however, has been slow. For the past 40 years, meaningful quantum computers have always been said to be five to ten years off.
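To make those properties concrete, here is a minimal Qiskit sketch, assuming Qiskit 1.x with the Aer simulator installed, that puts one qubit into superposition and entangles it with a second. Repeated measurements return 00 and 11 with roughly equal probability, which is the probabilistic behaviour described above.

```python
# Minimal sketch: superposition and entanglement in Qiskit (assumes Qiskit 1.x + Aer).
from qiskit import QuantumCircuit
from qiskit_aer import AerSimulator

qc = QuantumCircuit(2, 2)
qc.h(0)                      # Hadamard: qubit 0 is now in a superposition of 0 and 1
qc.cx(0, 1)                  # CNOT: qubit 1 becomes entangled with qubit 0
qc.measure([0, 1], [0, 1])   # measurement collapses the state probabilistically

counts = AerSimulator().run(qc, shots=1000).result().get_counts()
print(counts)                # e.g. {'00': 496, '11': 504} -- never '01' or '10'
```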
2025 marks a turning point where practical, useful quantum computing goes mainstream, and IBM is leading the way. This article describes IBM’s roadmap, results achieved and use cases moving from the lab into the real world.
The road more travelled
IBM’s quantum computing roadmap dates back nearly a decade, to early experimental work in 2016. Like many others, the company built prototype quantum computers with only a handful of qubits, but it also provided access to its hardware over the cloud and fostered a robust developer ecosystem centred on its Qiskit tooling.
In the early 2020s IBM dramatically scaled up its quantum hardware, beginning with the 127-qubit Eagle processor in 2021 and continuing with the 433-qubit Osprey in 2022 and the 1,121-qubit Condor in 2023. The Eagle-based System 1 is now the foundational workhorse for the IBM Quantum Network, giving customers direct access to test and execute experimental quantum workloads. Standalone systems have also been installed in Canada, the United States, Germany, Japan and other countries.
IBM updates its roadmap annually, marking completed milestones and setting ambitious targets into the next decade. Hardware and software mature together, with enhancements to Qiskit and support for increasingly complex workloads as the targeted circuit size grows beyond the current 5,000 gates. A gate is roughly equivalent to a single quantum machine instruction, so gate count is a good measure of the complexity of a quantum algorithm.
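As an illustration of what that metric means, the short Qiskit sketch below builds an arbitrary toy circuit (not an IBM benchmark) and reports its gate counts and depth, the same quantities the roadmap’s gate targets refer to.

```python
# Counting gates and depth in Qiskit -- the metrics behind roadmap targets like "5,000 gates".
from qiskit import QuantumCircuit

qc = QuantumCircuit(3)
qc.h(0)
qc.cx(0, 1)
qc.cx(1, 2)
qc.rz(0.5, 2)
qc.cx(0, 2)

print(qc.count_ops())   # gate counts by type, e.g. {'cx': 3, 'h': 1, 'rz': 1}
print(qc.size())        # total number of operations in the circuit: 5
print(qc.depth())       # longest chain of dependent gates through the circuit
```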
Bigger isn’t better
The largest problem afflicting quantum computing is high error rates due to qubit decoherence. One cause of decoherence is interference, or crosstalk, from neighbouring qubits, which effectively limits the vertical scalability of quantum hardware. As a result, IBM’s Condor proved to be more of an engineering proof of concept, and other techniques were required to grow quantum computing capacity.
IBM’s focus shifted to parallel processing in quantum computing, based on the 133-qubit Heron processor introduced in late 2023. Subsequently, IBM announced and deployed a Quantum System 2 at its TJ Watson Research Center in Yorktown Heights, New York. The System 2 currently houses three Heron processors, and additional work is under way to stabilize the couplings between chips so that more processors can be added. The Qiskit tools have also been improved to support circuit cutting and circuit knitting, which distribute a computational workload across processors and recombine the partial results classically, as the sketch below illustrates.
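The toy example below, using only core Qiskit, shows the classical-recombination idea in its simplest form: when a circuit separates into two independent halves, an observable such as Z⊗Z factorizes and each half can run on a different processor. Real circuit cutting, for example via IBM’s circuit-cutting addon for Qiskit, generalizes this to circuits with entangling gates across the cut, at the cost of running extra sampled subcircuits.

```python
# Toy illustration of "knitting" results from two subcircuits back together classically.
# Uses only core Qiskit; real cutting tools also handle gates that span the cut.
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector, Pauli

# Two independent one-qubit subcircuits, as if assigned to two separate processors.
sub_a = QuantumCircuit(1)
sub_a.ry(0.4, 0)
sub_b = QuantumCircuit(1)
sub_b.x(0)

# Each processor reports the expectation value of Z on its own half.
exp_a = Statevector(sub_a).expectation_value(Pauli("Z")).real   # ~0.921
exp_b = Statevector(sub_b).expectation_value(Pauli("Z")).real   # -1.0

# Classical recombination: for a product state, <Z tensor Z> = <Z>_A * <Z>_B
print(exp_a * exp_b)   # ~-0.921
```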
Expect quantum computers to follow this pattern and expand horizontally, not vertically, in the years to come.
Dealing with errors
Quantum computers are notoriously error-prone, with an error rate of about one in a thousand, compared to a classical error rate of roughly one in a quintillion (10¹⁸). This is the biggest inhibitor to achieving quantum advantage, the point at which quantum computers will meaningfully outperform classical computers.
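A quick back-of-the-envelope calculation, assuming independent errors per operation, shows why that gap matters so much at today’s 5,000-gate circuit sizes.

```python
# Probability that a circuit completes with no errors, assuming independent errors per gate.
q_error, c_error = 1e-3, 1e-18   # quantum vs classical error rates cited above
gates = 5_000                    # current roadmap circuit size

print((1 - q_error) ** gates)    # ~0.0067: fewer than 1% of quantum runs are error-free
print((1 - c_error) ** gates)    # ~1.0: classically, errors are negligible at this scale
```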
In 2023, IBM introduced error mitigation: mathematical techniques that reduce the effect of errors. Zero-noise extrapolation, for example, deliberately re-runs a circuit at amplified noise levels and extrapolates the results back toward the zero-noise limit, producing a better approximate solution; it can be switched on as a simple option when submitting circuits through Qiskit. Error mitigation is a good intermediate step in improving the quality of quantum computers, but it adds runtime overhead.
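As a hedged sketch of what that simple option looks like in practice, the snippet below enables a resilience level that includes zero-noise extrapolation through the Qiskit Runtime Estimator. The option follows the qiskit-ibm-runtime EstimatorV2 interface at the time of writing and may differ between releases; ibm_brisbane is a placeholder backend name, and circuit and observable are assumed to be defined elsewhere.

```python
# Sketch: enabling built-in error mitigation via Qiskit Runtime options (names may vary by release).
from qiskit_ibm_runtime import QiskitRuntimeService, EstimatorV2 as Estimator

service = QiskitRuntimeService()                 # assumes an IBM Quantum account is configured
backend = service.backend("ibm_brisbane")        # placeholder backend name

estimator = Estimator(mode=backend)
estimator.options.resilience_level = 2           # level 2 adds zero-noise extrapolation (ZNE)

# job = estimator.run([(circuit, observable)])   # circuit and observable defined elsewhere
# print(job.result()[0].data.evs)                # mitigated expectation values
```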
IBM’s goal is fault-tolerant quantum computing, achieved by building logical qubits out of multiple physical qubits. This is tricky to achieve: the standard surface codes, which spread each logical qubit across a two-dimensional grid of physical qubits, aren’t efficient enough, requiring hundreds or thousands of physical qubits for a single logical qubit. In early 2024, IBM published a new error-correcting code, a quantum low-density parity-check (qLDPC) code that arranges qubit connections on a virtual torus, yielding a roughly ten-fold reduction in the number of physical qubits required per logical qubit. A proof of concept demonstrated 12 logical qubits formed from 288 physical qubits, where nearly 3,000 would have been required previously. More work will be needed to scale this up to hundreds of logical qubits, but it’s a step in the right direction.
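The arithmetic behind that ten-fold claim is straightforward, as the small calculation below shows; the 3,000-qubit surface-code figure is the rough prior estimate cited above, not a precise benchmark.

```python
# Physical-qubit overhead per logical qubit: new qLDPC code vs. a rough surface-code estimate.
logical_qubits = 12
qldpc_physical = 288            # IBM's demonstrated encoding
surface_code_physical = 3_000   # approximate prior requirement for the same 12 logical qubits

print(qldpc_physical / logical_qubits)         # 24 physical qubits per logical qubit
print(surface_code_physical / logical_qubits)  # 250 -> roughly a ten-fold reduction
```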
With error mitigation and early steps toward fault tolerance, IBM has achieved quantum utility—an intermediate step before quantum advantage, where quantum computers can produce results at least equivalent to high-performance classical computers, and the two can validate each other. The company has published several practical use cases in condensed-matter physics, statistics and materials science that demonstrate quantum utility.
A look ahead
IBM has consistently met the key milestones in its roadmap through 2024, and the next few years will bring further improvements in scalability and fault tolerance. The Heron processor will grow from 133 to 156 qubits this year, and the System 2 will expand to seven parallel Herons. By 2029, the roadmap targets quantum circuits of up to 100 million gates. By then, expect IBM to have achieved quantum advantage, with relevant use cases deployed in business and science.
Quantum, however, is not a general-purpose computing solution, and it won’t replace classical computing. The future will be hybrid, with quantum and classical computers combining forces to solve the most complex problems that neither can solve alone. Watch for IBM to elaborate its vision of quantum-centric supercomputing before the end of this decade.
My main ask of IBM is: get out of the lab! Quantum has always been a research activity for the company, but quantum utility is good enough for customers to invest in production today.
Let’s commercialize quantum computing now. It’s ready.