

Invented in 1794 by Carl Friedrich Gauss, fitting models to data has ever since been one of the most important tools in all of quantitative science. For very large amounts of data, however, fitting becomes computationally costly. In an article in Physical Review Letters, a collaboration of physicists from the LPT, the Institute for Quantum Computing (IQC, Waterloo, Canada), and the Massachusetts Institute of Technology (MIT, Cambridge, United States) now proposes a quantum algorithm that enables an exponential speed-up of the fitting procedure, for future generations of researchers with a quantum computer at their disposal.

Interest in quantum computing has developed spectacularly over the last twenty years, ever since Peter Shor found, in 1994, a quantum algorithm that allows factoring large integers exponentially faster than any known classical algorithm. Other quantum algorithms that outperform the best known classical ones were found later, but exponential speed-ups are hard to come by. A new type of quantum algorithm was found in 2009 by Aram Harrow (at the time at the University of Bristol), Avinatan Hassidim and Seth Lloyd (both at MIT), who showed how to efficiently solve an exponentially large system of linear equations. This type of problem is ubiquitous in all of physics and engineering. The only catch: their quantum algorithm does not spit out the solution explicitly (which by itself would take an exponential amount of time), but only renders the expectation value of an arbitrary observable in the final state. As a consequence, it remained unclear whether a useful application of the algorithm exists.

It now turns out that data fitting is a very natural application: the new quantum algorithm invented by Nathan Wiebe, Daniel Braun, and Seth Lloyd calculates in polynomial time the quality of the best fit for a given model. Thus, even without explicitly learning the result of the fit (i.e. the actual fit parameters), one can test many different models until finding one that fits the data well. In a second part of the algorithm, the fit parameters can then be obtained explicitly. The algorithm appears particularly well suited to fitting data that is itself produced by a quantum computer or quantum simulator, in which case it offers an attractive alternative to quantum tomography.
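To make concrete what "quality of the best fit" means, here is a minimal classical sketch (pure Python, hypothetical data) of ordinary least-squares fitting for a straight-line model y = a + b·t. The residual sum of squares returned at the end is the classical analogue of the fit-quality figure the quantum algorithm estimates; classically, computing it takes time that grows with the number of data points, which is what motivates the quantum speed-up.

```python
def fit_quality(ts, ys):
    """Fit y = a + b*t by ordinary least squares and return
    (a, b, rss), where rss is the residual sum of squares,
    i.e. the quality of the best fit (smaller is better)."""
    n = len(ts)
    mean_t = sum(ts) / n
    mean_y = sum(ys) / n
    # Closed-form normal-equation solution for the two parameters.
    sxx = sum((t - mean_t) ** 2 for t in ts)
    sxy = sum((t - mean_t) * (y - mean_y) for t, y in zip(ts, ys))
    b = sxy / sxx
    a = mean_y - b * mean_t
    # Fit quality: sum of squared residuals of the best fit.
    rss = sum((y - (a + b * t)) ** 2 for t, y in zip(ts, ys))
    return a, b, rss

# Noiseless data on the line y = 1 + 2t: the fit is exact and rss ~ 0.
ts = [0.0, 1.0, 2.0, 3.0]
ys = [1.0, 3.0, 5.0, 7.0]
a, b, rss = fit_quality(ts, ys)
```

In the spirit of the article, one would compare the rss of several candidate models and keep the model that fits well, without ever needing the parameters a and b themselves until the final step.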

The article was highlighted by the editors of Physical Review Letters in a synopsis in the online magazine "Physics".

*Contact for further information:* Daniel Braun

In the same section:

- Exceeding the Pauli limit
- Measuring the size of Schrödinger’s cat
- What is the true statistics of holons?
- Predicting the maximum or the minimum of a random signal
- How to reproduce bacterial propulsion in a biomimetic way?
- Dependence of DNA Persistence Length on Ionic Strength of Solutions with Monovalent and Divalent Salts
- Is 1D physics experimentally accessible in real systems?
- An Argentinian-French collaboration: a novel mechanism for fractional magnetization plateaus!
- An unconventional phase transition
- Entanglement-screening by nonlinear resonances
- DNA bubbles and bending: how conformational fluctuations modify its thermal denaturation
- Phase separation and flux quantization in the doped quantum dimer model on the square and triangular lattices
- Valence Bond Entanglement Entropy
- Spin gap and string order parameter in the ferromagnetic Spiral Staircase Heisenberg Ladder
- Generic mixed columnar-plaquette phases in Rokhsar-Kivelson models
- The Pairing Glue in High-Temperature Superconductors
- Effective Theory of Magnetization Plateaux in the Shastry-Sutherland Lattice
- A gauge theory picture of an exotic transition in a dimer model
- Why DNA chains are "kinked" when observed on surfaces
- Computing fidelity at magnetic quantum phase transitions