IBM Quantum Computer Demonstrates Next Step Towards Moving Beyond Classical Supercomputing
- New research published in Nature shows evidence of quantum utility
- Backed by this evidence, IBM to upgrade its full fleet of IBM Quantum systems to large-scale quantum processors of 100+ qubits
- Top research institutions and industry leaders including Boeing, Bosch, Cleveland Clinic, CERN, DESY, E.ON, ExxonMobil, Moderna, Oak Ridge National Lab, The University of Chicago, RIKEN, and Wells Fargo form working groups to pursue near-term quantum value
ARMONK, N.Y., June 14, 2023 – IBM (NYSE: IBM) today announced a new breakthrough, published on the cover of the scientific journal Nature, demonstrating for the first time that quantum computers can produce accurate results at a scale of 100+ qubits reaching beyond leading classical approaches.
One of the ultimate goals of quantum computing is to simulate components of materials that classical computers have never efficiently simulated. Being able to model these is a crucial step toward the ability to tackle challenges such as designing more efficient fertilizers, building better batteries, and creating new medicines. But today’s quantum systems are inherently noisy: the fragile nature of quantum bits, or qubits, and disturbances from their environment produce a significant number of errors that hamper performance.
In their experiment, the IBM team demonstrates that it is possible for a quantum computer to outperform leading classical simulations by learning and mitigating errors in the system. The team used the IBM Quantum ‘Eagle’ processor, composed of 127 superconducting qubits on a single chip, to generate large, entangled states that simulate the dynamics of spins in a model material and accurately predict properties such as its magnetization.
To verify the accuracy of this modeling, a team of scientists at UC Berkeley simultaneously performed these simulations on advanced classical computers located at Lawrence Berkeley National Lab’s National Energy Research Scientific Computing Center (NERSC) and Purdue University. As the scale of the model increased, the quantum computer continued to produce accurate results with the help of advanced error mitigation techniques, even as the classical computing methods eventually faltered and could no longer match the IBM Quantum system.
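To give a sense of the kind of computation involved, the sketch below builds a small Trotterized time-evolution circuit for a transverse-field Ising model of interacting spins in Qiskit and defines an average-magnetization observable. This is a minimal illustration of the general approach, not IBM's published code: the qubit count, chain layout, and angles are hypothetical placeholders, whereas the Nature experiment used all 127 qubits of the Eagle processor and its native coupling map.

```python
# Illustrative sketch only (not IBM's published code): Trotterized time
# evolution of a small transverse-field Ising spin chain, with the average
# magnetization defined as the observable to estimate.
from qiskit import QuantumCircuit
from qiskit.quantum_info import SparsePauliOp

num_qubits = 6       # placeholder; the Nature experiment used 127 qubits
trotter_steps = 4    # number of first-order Trotter steps
theta_zz = 0.3       # ZZ-coupling angle per step (illustrative value)
theta_x = 0.6        # transverse-field rotation angle per step (illustrative value)

circuit = QuantumCircuit(num_qubits)
for _ in range(trotter_steps):
    for q in range(num_qubits - 1):   # spin-spin interactions along a chain
        circuit.rzz(theta_zz, q, q + 1)
    for q in range(num_qubits):       # transverse field applied to every spin
        circuit.rx(theta_x, q)

# Observable: average magnetization <Z> over all qubits
magnetization = SparsePauliOp.from_sparse_list(
    [("Z", [q], 1.0 / num_qubits) for q in range(num_qubits)],
    num_qubits=num_qubits,
)
```

On hardware, a circuit and observable of this kind would be handed to an expectation-value (Estimator) primitive, with error mitigation applied to the estimates it returns.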
“This is the first time we have seen quantum computers accurately model a physical system in nature beyond leading classical approaches,” said Darío Gil, Senior Vice President and Director of IBM Research. “To us, this milestone is a significant step in proving that today’s quantum computers are capable, scientific tools that can be used to model problems that are extremely difficult – and perhaps impossible – for classical systems, signaling that we are now entering a new era of utility for quantum computing.”
IBM Commits to Utility-Scale Processors Across IBM Quantum Systems
Following this groundbreaking work, IBM is also announcing that its IBM Quantum systems running both on the cloud and on-site at partner locations will be powered by processors with a minimum of 127 qubits, an upgrade to be completed over the course of the next year.
These processors provide access to computational power large enough to surpass classical methods for certain applications, and they will offer improved coherence times and lower error rates than previous IBM quantum systems. Combined with continuously advancing error mitigation techniques, these capabilities enable IBM Quantum systems to meet a new threshold for the industry, which IBM has termed ‘utility-scale’: a point at which quantum computers could serve as scientific tools to explore a new scale of problems that classical systems may never be able to solve.
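One widely used error mitigation technique is zero-noise extrapolation: the same circuit is run at deliberately amplified noise levels, and the measured expectation values are extrapolated back to the zero-noise limit. The sketch below is a minimal, conceptual version using hypothetical measured values; the published experiment relied on a more sophisticated variant built on a learned model of the device noise.

```python
# Conceptual sketch of zero-noise extrapolation. The expectation values below
# are hypothetical results measured at artificially amplified noise levels;
# fitting them and evaluating the fit at zero noise gives the mitigated estimate.
import numpy as np

noise_scale_factors = np.array([1.0, 1.5, 2.0])        # noise amplification factors
measured_expectations = np.array([0.71, 0.62, 0.55])   # hypothetical hardware results

coeffs = np.polyfit(noise_scale_factors, measured_expectations, deg=1)
zero_noise_estimate = np.polyval(coeffs, 0.0)
print(f"Mitigated (zero-noise) estimate: {zero_noise_estimate:.3f}")
```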
“As we progress our mission to bring useful quantum computing to the world, we have solid evidence of the cornerstones needed to explore an entirely new class of computational problems,” said Jay Gambetta, IBM Fellow and Vice President, IBM Quantum. “By equipping our IBM Quantum systems with processors capable of utility scale, we are inviting our clients, partners and collaborators to bring their hardest problems to explore the limits of today’s quantum systems and to begin extracting real value.”
All IBM Quantum users will be able to run problems on utility-scale processors larger than 100 qubits. The over 2,000 participants in the IBM Quantum Spring Challenge had access to these utility-scale processors as they explored dynamic circuits, a technology that makes it easier to run more-advanced quantum algorithms.
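Dynamic circuits let a program measure qubits mid-circuit and condition later gates on the outcomes within a single execution. The sketch below is a minimal Qiskit illustration of the idea, not a circuit from the Challenge itself.

```python
# Minimal illustration of a dynamic circuit: a mid-circuit measurement whose
# classical outcome controls a later gate in the same execution.
from qiskit import QuantumCircuit

qc = QuantumCircuit(2, 2)
qc.h(0)
qc.measure(0, 0)                      # measure qubit 0 mid-circuit
with qc.if_test((qc.clbits[0], 1)):   # apply X on qubit 1 only if the outcome was 1
    qc.x(1)
qc.measure(1, 1)
```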
Global Researchers and Industry Leaders Pursue Value with IBM Quantum
As IBM expands its quantum technology stack, research institutions and private-sector leaders are mobilizing across industries for which quantum holds immediate potential. Equipped with more powerful quantum technology, including advanced hardware and tools to explore how error mitigation can enable accuracy today, pioneering organizations and universities are working with IBM to advance the value of quantum computing.
The working groups exploring the potential value of quantum computing include:
- Healthcare and Life Sciences: led by organizations such as Cleveland Clinic and Moderna, this group is exploring applications of quantum chemistry and quantum machine learning to challenges such as accelerated molecular discovery and patient risk prediction models.
- High Energy Physics: comprising groundbreaking research institutions such as CERN and DESY, this group is working to identify the quantum calculations best suited to areas such as identification and reconstruction algorithms for particle collision events and the investigation of theoretical models in high energy physics.
- Materials: spearheaded by teams at Boeing, Bosch, The University of Chicago, Oak Ridge National Lab, ExxonMobil, and RIKEN, this group aims to explore the best methods for building materials simulation workflows.
- Optimization: established across global institutions such as E.ON, Wells Fargo, and others, this group explores key questions to advance the identification of optimization problems best suited for quantum advantage in sustainability and finance.