
Error-Correction Breakthrough Represents Significant Step Towards Powerful Quantum Computing

Hector Bombin, Helmut Katzgraber and Miguel Angel Martin-Delgado, researchers from the QUITEMAD Scientific Consortium (R&D Technologies Program, funded by the Community of Madrid), together with an international group of scientists from Switzerland (R. Andrist) and Japan (M. Ohzeki), have studied the use of two families of topological quantum codes for error correction in quantum systems. One of the most important results of this research is an improvement of 75% over the previously known tolerable qubit error rate, which opens the way for future improvements to quantum computers.

This research has been coordinated by Miguel Angel Martin-Delgado, Professor at the Universidad Complutense de Madrid and QUITEMAD Coordinator, and has made use of the most powerful supercomputer in Spain (Magerit-2) at the Supercomputing and Visualization Center of Madrid (CeSViMa), as well as supercomputers at the University of Texas and ETH Zurich. The result of this research places us one step closer to the success of quantum computation, whilst also bringing us closer to the major long-term challenge of building large quantum computers.

The study has been published in the journal Physical Review X (American Physical Society) under the title "Strong Resilience of Topological Codes to Depolarization", and was commented on by Prof. Daniel Gottesman (Perimeter Institute, Waterloo, Canada) in the Physics Viewpoint section of the APS in an article entitled "Keeping One Step Ahead of Errors". Hector Bombin, also a researcher at the Perimeter Institute, and colleagues have studied both topological toric codes and color codes, determining the degree of protection they provide against the most generic form of errors.

Quantum computers are much more vulnerable to noise than classical computers: the quantum states of the tiny qubits can be altered by the smallest disturbance, which easily leads to errors. Because decoherence effects are inevitably present in these systems, adequate error-correction schemes are needed to protect information from noise. Error correction is therefore a matter of great importance for the success of quantum computing. A very promising approach, and currently the best candidate for practical implementations, is the use of topological error-correction codes. The error-correcting performance of a topological code is essentially captured by its so-called 'error threshold': as long as the noise level stays below the natural threshold of the code in question, the noise-induced errors can be completely corrected by well-designed manipulations involving only a few qubits. However, for some of the reference topological codes, finding the threshold for the most generic form of noise turns out to be a difficult technical challenge.
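The generic form of noise at issue here is depolarizing noise, in which a qubit suffers an X, Y, or Z error with equal probability. As a minimal illustration (the function name and parameters below are assumptions for this sketch, not taken from the paper), such noise can be simulated classically like this:

```python
import random

def depolarize(qubit_paulis, p):
    """Apply depolarizing noise: with probability p, each qubit
    independently suffers an X, Y, or Z error, chosen uniformly."""
    noisy = []
    for pauli in qubit_paulis:
        if random.random() < p:
            noisy.append(random.choice("XYZ"))
        else:
            noisy.append(pauli)  # 'I' means no error on this qubit
    return noisy

random.seed(0)
# 20 error-free qubits hit by depolarizing noise at the ~19% threshold
errors = depolarize(["I"] * 20, p=0.19)
print(errors)
```

Below the threshold, a well-designed code can in principle identify and undo every error produced this way; above it, the errors accumulate faster than they can be corrected.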

In general, a quantum error-correction code works by first defining a set of error-identifying quantum measurements, then performing those measurements to identify the error (the so-called "error syndrome"), and finally prescribing and executing a set of quantum operations on the qubits to correct the errors. A topological code is characterized by two fundamental features. First, all the quantum measurements needed for error correction are "local", involving only a few qubits that can be viewed as "neighbors". Second, no local operation alone can change the encoded state of the entire computer. In essence, "topological" means robustness against local disturbances from the environment. The two families of topological codes on which the work has focused are the toric codes and the color codes. In the first case, the physical qubits are placed on a square-lattice grid on the surface of a torus; in the second, on the vertices of a trivalent (e.g. hexagonal) lattice, with the architecture of the qubit connectivity dictated by the nature of the quantum measurements involved in the codes.
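The measure-syndrome-then-correct cycle described above can be illustrated with the simplest classical analogue, a three-bit repetition code; real topological codes apply the same logic using local stabilizer measurements on a lattice. This toy example is purely illustrative and is not one of the codes studied in the paper:

```python
def measure_syndrome(bits):
    """Two local parity checks on neighbouring pairs -- the 'error syndrome'."""
    return (bits[0] ^ bits[1], bits[1] ^ bits[2])

def correct(bits):
    """Look up which single-bit flip explains the syndrome and undo it."""
    syndrome = measure_syndrome(bits)
    flip = {(1, 0): 0, (1, 1): 1, (0, 1): 2}.get(syndrome)  # (0, 0): no error
    if flip is not None:
        bits[flip] ^= 1
    return bits

# A single flip on the middle bit is detected and repaired:
print(correct([0, 1, 0]))  # -> [0, 0, 0]
```

Note that the checks only compare neighbours and never read the encoded value itself; this locality is exactly the property that topological codes generalize to two-dimensional lattices of qubits.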

A previous work by Dennis, Kitaev, Landahl, and Preskill in 2002 pioneered the conceptual approach of determining the error threshold by mapping the quantum problem onto a classical spin model. The form of noise investigated there was, however, only one of the three possible fundamental types. Hector Bombin and colleagues study a more generic form of noise, which includes not only all three types but also the correlations between them. They succeeded in showing that, for this form of noise, the classical counterpart of the toric code is an 8-vertex spin model. The error threshold then corresponds to the point, in the classical statistical model, where the magnetic ordering transition is lost due to disorder in the classical spins, which is equivalent to the presence of faulty qubits. Using Monte Carlo simulations and duality arguments, the error threshold is found to be at approximately 19%, higher than previously thought. Remarkably, the mapping of the color codes leads to new types of interacting 8-vertex models, yet their error thresholds are also found to be about 19%. This is very novel from the viewpoint of statistical mechanics.
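The Monte Carlo method behind such a calculation locates the point at which magnetic order is lost. A minimal sketch of Metropolis sampling on a plain 2D Ising model on a torus conveys the idea; the actual study simulates a disordered 8-vertex model, and every name and parameter below is an illustrative assumption:

```python
import math
import random

def metropolis_ising(L=16, T=1.5, sweeps=200, seed=1):
    """Toy Metropolis sampling of a 2D Ising model on an L x L torus.
    Returns the absolute magnetisation per spin: large below the
    ordering temperature, small above it. The error-threshold
    calculation locates the analogous transition in a disordered
    8-vertex model, where disorder strength plays the role of T."""
    rng = random.Random(seed)
    spins = [[1] * L for _ in range(L)]  # start fully magnetised
    for _ in range(sweeps):
        for _ in range(L * L):
            i, j = rng.randrange(L), rng.randrange(L)
            # Energy change from flipping spin (i, j), periodic neighbours
            nb = (spins[(i + 1) % L][j] + spins[(i - 1) % L][j]
                  + spins[i][(j + 1) % L] + spins[i][(j - 1) % L])
            dE = 2 * spins[i][j] * nb
            if dE <= 0 or rng.random() < math.exp(-dE / T):
                spins[i][j] *= -1
    return abs(sum(sum(row) for row in spins)) / (L * L)

print(metropolis_ising(T=1.5))  # ordered phase: |m| stays near 1
print(metropolis_ising(T=4.0))  # disordered phase: |m| drops towards 0
```

Sweeping the temperature (or, in the disordered model, the fraction of faulty qubits) and watching where the magnetisation collapses is how the transition point, and hence the ~19% threshold, is pinned down.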

The researchers conclude that this interdisciplinary effort brings us one step closer to the final goal of building large-scale, error-tolerant quantum computers, and they anticipate that the work will also be of interest to the statistical mechanics community.
