Michel Devoret and the Google Quantum AI team push the frontier of computing beyond classical

In a study published in Nature, researchers from Google Quantum AI, including the physicist Michel H. Devoret, winner of the 2025 Nobel Prize in Physics, present the algorithm “Quantum Echoes”. Executed on the Willow quantum chip, this algorithm demonstrates for the first time a verifiable quantum advantage, that is to say a computation that is reproducible on other platforms yet impossible to simulate with a classical supercomputer.

According to the team, Quantum Echoes ran 13,000 times faster than the Frontier supercomputer, currently a world reference. This performance places Google at the head of a race the company launched in 2019 with its “quantum supremacy” experiment, whose usefulness remained symbolic. This time, the researchers claim to have crossed the threshold where the power of quantum computing becomes scientifically exploitable.

From supremacy to verifiability

The significance of the work published in Nature goes beyond the simple question of raw power. The “Quantum Echoes” algorithm is based on the measurement of a quantum observable called the Out-of-Time-Order Correlator (OTOC), a quantity that describes how a quantum system becomes chaotic.

The researchers evolved a system of 103 qubits in two temporal directions, one “forward” (U) and one “backward” (U†), separated by elementary perturbations. This process, repeated thousands of times, revealed constructive interference in a regime of quantum chaos.
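The echo protocol described above can be sketched numerically on a toy scale. The snippet below is a minimal illustration, not Google's implementation: it uses a small random Hermitian matrix as a stand-in for a chaotic many-body Hamiltonian, applies the forward evolution U, a single-qubit “butterfly” perturbation, then the backward evolution U†, and evaluates a second-order OTOC. All operator choices (Pauli X as the perturbation, Pauli Z as the probe) are illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(42)

n = 6                      # toy system: 6 qubits, far from Willow's 103
dim = 2 ** n

# A random dense Hermitian matrix stands in for a chaotic Hamiltonian.
A = rng.normal(size=(dim, dim)) + 1j * rng.normal(size=(dim, dim))
H = (A + A.conj().T) / 2
evals, evecs = np.linalg.eigh(H)

def U(t):
    """Forward evolution exp(-iHt); its conjugate transpose is the 'backward' U†."""
    return (evecs * np.exp(-1j * evals * t)) @ evecs.conj().T

X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
I2 = np.eye(2, dtype=complex)

def on_qubit(op, k):
    """Embed a single-qubit operator op on qubit k of the n-qubit register."""
    out = np.array([[1.0 + 0j]])
    for i in range(n):
        out = np.kron(out, op if i == k else I2)
    return out

W = on_qubit(X, 0)        # 'butterfly' perturbation on the first qubit
V = on_qubit(Z, n - 1)    # probe measured on the last qubit

psi = np.zeros(dim, dtype=complex)
psi[0] = 1.0              # initial state |00...0>

results = {}
for t in [0.0, 0.2, 0.5, 1.0]:
    Ut = U(t)
    Wt = Ut.conj().T @ W @ Ut                 # forward, perturb, backward
    C = np.vdot(psi, Wt @ V @ Wt @ V @ psi)   # second-order OTOC <W(t) V W(t) V>
    results[t] = C.real
    print(f"t={t:.1f}  OTOC={C.real:+.3f}")
```

At t = 0 the perturbation and the probe act on different qubits and commute, so the OTOC equals 1; as the evolution scrambles information across the system, the value decays, which is the signature of quantum chaos the article describes.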

Unlike previous “random circuit sampling” experiments, where each result was random and non-reproducible, Quantum Echoes produces verifiable physical values: magnetization, density, velocity or current. These data can be confirmed by other quantum computers or by natural systems, an essential milestone for the experimental traceability of quantum computing.

The butterfly effect at the qubit scale

The experiment carried out on Willow reproduces, in the quantum world, a well-known principle of classical chaos: the butterfly effect. A tiny perturbation applied to a single qubit propagates throughout the system, changing the correlations between particles.

The researchers observed that these correlations could be amplified when the system satisfies a “quantum resonance” condition. This slow amplification, following a power law rather than an exponential, marks a fundamental difference from classical models, where the signals dissipate quickly.

Google teams spent more than ten person-years trying to reproduce these results with nine different classical algorithms. None were able to simulate the observed behavior beyond the second order of the OTOC. Where Willow computed in two hours, a supercomputer would have required more than three years.

A new approach to understanding matter

The scientific interest of the project goes well beyond the demonstration of raw power. OTOCs can be used to characterize the internal interactions of complex physical systems, particularly in chemistry, materials physics and molecular biology.

In a second article, currently under peer review, the researchers show that the same protocol applied to nuclear magnetic resonance (NMR) makes it possible to refine molecular models. By simulating the signals measured on real molecules and comparing them to those generated by the quantum computer, the team obtains more precise estimates of atomic structures.

This method, called Hamiltonian learning, could become one of the first concrete applications of quantum computing, particularly in the design of new materials and drug discovery.
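The core idea of Hamiltonian learning, fitting the parameters of a model Hamiltonian so that its predicted signals match measured ones, can be shown with a deliberately simple toy. The sketch below is an assumption-laden stand-in for the NMR setting: a single spin precesses under H = (ω/2)Z, its signal is ⟨X⟩(t) = cos(ωt), and we recover the hidden frequency ω from noisy data by a least-squares grid search. Real molecular Hamiltonians have many coupled parameters; this one-parameter version only illustrates the fitting loop.

```python
import numpy as np

rng = np.random.default_rng(7)

# Hidden parameter of the toy Hamiltonian H = (omega/2) * Z.
omega_true = 2.3

# Simulated 'experimental' signal <X>(t) = cos(omega * t), with small noise.
ts = np.linspace(0.0, 4.0, 80)
data = np.cos(omega_true * ts) + 0.02 * rng.normal(size=ts.size)

def loss(omega):
    """Mean squared mismatch between the model's prediction and the data."""
    return np.mean((np.cos(omega * ts) - data) ** 2)

# Least-squares fit by brute-force grid search over candidate frequencies.
grid = np.linspace(0.0, 5.0, 5001)
omega_hat = grid[np.argmin([loss(w) for w in grid])]
print(f"recovered omega = {omega_hat:.3f} (true value {omega_true})")
```

In the article's scenario, the quantum computer plays the role that `np.cos` plays here: it generates the predicted signal for a candidate Hamiltonian that is too complex to simulate classically, and the parameters are adjusted until prediction and measurement agree.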