
Diversity Could Help Minimize Errors in Quantum Computing

Computer scientists have discovered that in quantum computing, as in team building, a little diversity helps get the job done better.

Image credit: Georgia Tech’s School of Electrical and Computer Engineering

Unlike conventional computers, quantum-based machines are noisy processors, with error rates far higher than those of silicon-based computers. Quantum operations are therefore repeated thousands of times so that the correct answer stands out statistically from all the incorrect ones.

But performing the same operation again and again on the same set of qubits may simply produce the same wrong answers, which can then appear, statistically, to be the right answer. The solution, according to scientists at the Georgia Institute of Technology, is to repeat the operation on other sets of qubits that have different error signatures and therefore will not produce the same correlated errors.

“The idea here is to generate a diversity of errors so you are not seeing the same error again and again,” said Moinuddin Qureshi, a professor in Georgia Tech’s School of Electrical and Computer Engineering, who came up with the method together with his senior Ph.D. student, Swamit Tannu. “Different qubits tend to have different error signatures. When you combine the results from diverse sets, the right answer appears even though each of them individually did not get the right answer,” said Tannu.

Tannu compares the method, called Ensemble of Diverse Mappings (EDM), to the game show "Who Wants to be a Millionaire?" Contestants who were unsure of the answer to a multiple-choice question could ask the studio audience for help.

The majority of the audience doesn't need to know the right answer. If even 20% know it, you can identify it. If the people who don't know split their answers evenly across the four choices, the right answer gets 40% of the votes, so you can pick it out even if only a relatively small number of people actually know it.

Moinuddin Qureshi, Professor, School of Electrical and Computer Engineering, Georgia Tech
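Qureshi's audience analogy can be checked with simple arithmetic. The sketch below (the 20% figure and four answer choices come from the quote above; the function name is ours) computes the expected vote shares:

```python
# Illustration of the "ask the audience" argument: if a fraction p_know
# of the audience knows the answer and everyone else guesses uniformly
# among the options, the correct option still receives the most votes.

def audience_poll(p_know, n_options, correct=0):
    """Return expected vote share per option; the correct option is at index `correct`."""
    shares = [(1 - p_know) / n_options] * n_options  # uniform guessers
    shares[correct] += p_know                        # informed voters
    return shares

shares = audience_poll(p_know=0.20, n_options=4)
# Correct option gets about 0.40; each wrong option gets about 0.20.
print(shares)
```

With only one in five audience members informed, the correct choice still polls twice as high as any incorrect one, which is the same margin EDM relies on when it pools diverse qubit mappings.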

Experiments on a current Noisy Intermediate Scale Quantum (NISQ) computer showed that EDM improves inference quality 2.3-fold compared with state-of-the-art mapping algorithms. By merging the output probability distributions of the diverse ensemble, EDM amplifies the correct answer while suppressing the incorrect ones.
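As a toy illustration (not the authors' implementation, and with made-up numbers), suppose three qubit mappings each return a measured output distribution whose dominant error differs, while all of them assign the same moderate probability to the correct outcome. Averaging the distributions lets the correct answer win even though no single mapping ranks it first:

```python
# Toy sketch of the EDM idea: each mapping's dominant error differs,
# so merging (averaging) the output distributions lets the shared
# correct answer rise above the uncorrelated wrong ones.

# Hypothetical measured distributions over 4 outcomes; "10" is correct.
mappings = {
    "mapping_A": {"00": 0.40, "01": 0.10, "10": 0.35, "11": 0.15},
    "mapping_B": {"00": 0.10, "01": 0.40, "10": 0.35, "11": 0.15},
    "mapping_C": {"00": 0.15, "01": 0.10, "10": 0.35, "11": 0.40},
}

# Each mapping alone picks a different wrong answer.
for name, dist in mappings.items():
    print(name, "->", max(dist, key=dist.get))

# Merge: average the distributions across the ensemble.
merged = {k: sum(d[k] for d in mappings.values()) / len(mappings)
          for k in mappings["mapping_A"]}
print("ensemble ->", max(merged, key=merged.get))  # prints: ensemble -> 10
```

Because the wrong answers are uncorrelated across mappings, each one is diluted to roughly 0.20 to 0.23 in the merged distribution, while the correct outcome keeps its full 0.35.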

Tannu acknowledges that the EDM technique is counterintuitive. Qubits can be ranked by their error rate on particular types of problems, and the obvious course of action would be to use the most accurate set. But even the best qubits make mistakes, and those mistakes are likely to be the same when the operation is performed thousands of times.

Selecting qubits with different error rates, and thus different types of error, guards against this by ensuring that the single correct answer rises above the multitude of errors.

The goal of the research is to create several different versions of the program, each of which can make mistakes, but not identical mistakes. As long as they make different mistakes, when you average the results the mistakes cancel out and the right answer emerges.

Swamit Tannu, Senior Ph.D. Student, School of Electrical and Computer Engineering, Georgia Tech

Qureshi compares the EDM technique to team-building methods endorsed by human resource professionals.

"If you form a team of experts with identical backgrounds, all of them may have the same blind spot," he said. "If you want to make a team resilient to blind spots, gather a group of people who have different blind spots. As a whole, the team will be protected against any specific blind spot."

Error rates in traditional silicon-based computers are vanishingly small, roughly one in a thousand trillion operations, whereas today's NISQ quantum computers produce an error about once every 100 operations.

These are really early-stage machines in which the devices have a lot of error. That will likely improve over time, but because we are dependent on matter that has extremely low energy and lacks stability, we will never get the reliability we have come to expect with silicon. Quantum states are inherently about a single particle, but with silicon you are packing a lot of molecules together and averaging their activity.

Moinuddin Qureshi, Professor, School of Electrical and Computer Engineering, Georgia Tech

Qureshi continued, “If the hardware is inherently unreliable, we have to write software to make the most of it. We have to take the hardware characteristics into account to make these unique machines useful.”

The idea of running a quantum operation thousands of times to obtain what is likely the correct answer may seem counterproductive at first. But quantum computing is so much faster than traditional computing that nobody would object to performing a few thousand duplicate runs.

“The objective with quantum computers is not to take a current program and run it faster,” Qureshi said. “Using quantum, we can solve problems that are virtually impossible to solve with even the fastest supercomputers. With several hundred qubits, which is beyond the current state of the art, we could solve problems that would take a thousand years with the fastest supercomputer.”

Added Qureshi: “You don’t mind doing the computation a few thousand times to get an answer like that.”

The quantum error mitigation scheme is slated for presentation on October 14th at the 52nd Annual IEEE/ACM International Symposium on Microarchitecture. The work was backed by a gift from Microsoft.

