Everyone working on quantum computers knows the devices are error prone. The basic unit of quantum programming – the quantum gate – fails about once every hundred operations. And that error rate is too high.
While hardware developers and programming analysts are fretting over failure rates, PNNL’s Nathan Wiebe is forging ahead writing code that he is confident will run on quantum computers when they are ready. In his joint appointment role as a professor of physics at the University of Washington, Wiebe is training the next generation of quantum computing theorists and programmers.
On one hand, Wiebe laments that “there’s such a huge gulf between where we are right now versus where we need to be.”
But just as quickly, he brushes aside doubt and explains that “we are already at the point where we are doing things that are really interesting.”
It’s this forge-ahead mentality that has made him a global leader in quantum algorithm development, with a dozen international partnerships and 91 publications on quantum algorithms in the last five years alone.
Gaming rules apply to quantum gates
Coding for quantum computers requires leaps of imagination that can be daunting on one level, but Wiebe points out that any 15-year-old Minecraft enthusiast would have no trouble understanding the basics of how it works. The wildly popular building block video game has spawned a community of enthusiastic coders who create virtual computers inside the game environment. Minecraft coders have simulated real-world physics and created virtual calculators, among other feats. The Minecraft universe has its own internal rules and some of them don’t quite make sense – much like some of the rules of the quantum universe don’t seem clear, even to physicists.
Minecraft players don’t need to understand why the game’s rules are what they are; they learn how the game’s physics behave and then exploit that knowledge to perform tasks the game’s creators may never have intended. Quantum computer programmers have a similar challenge. They are faced with the strange rules of quantum mechanics and try to find creative ways to “hack” them to build computers that, in some cases, can solve problems trillions of times faster than ordinary computers by using quantum effects like interference and entanglement that ordinary computers lack.
“On a quantum computer, when you try to measure the quantum bits, they revert to ordinary bits. In the process, they lose the very features that give quantum computing its power,” Wiebe said. “With a quantum computer you have to be more subtle than you do with ordinary computers. You have to coax out information about the system without damaging the information that was encoded in there.”
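The measurement effect Wiebe describes can be sketched in a few lines of plain NumPy. This is a minimal, hypothetical illustration (not any real quantum SDK): a qubit is put into an equal superposition, and measuring it yields only a classical bit, discarding the phase information the superposition carried.

```python
import numpy as np

# A single qubit as a 2-component complex state vector: |0> = [1, 0].
ket0 = np.array([1.0, 0.0], dtype=complex)

# The Hadamard gate puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
psi = H @ ket0  # amplitudes (1/sqrt(2), 1/sqrt(2))

# Measurement samples a classical bit with Born-rule probabilities
# |amplitude|^2, and the post-measurement state is just the matching basis
# vector: the relative phase in the superposition is irreversibly lost.
probs = np.abs(psi) ** 2
rng = np.random.default_rng(0)
outcome = int(rng.choice([0, 1], p=probs))
post = np.eye(2, dtype=complex)[outcome]

print(probs)  # → [0.5 0.5]
```

Algorithms therefore have to arrange, via interference, for the useful answer to end up concentrated in the measurement probabilities before the qubits are read out.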
“We found these weird rules of quantum mechanics,” he said. “But only now are we asking how we can exploit these rules in order to allow us to compute.”
It’s like steam engines
Wiebe likes to use the analogy of James Watt, inventor of the first modern steam engine. In the late 1700s, the limits on the power that could be extracted from a steam engine weren’t understood. Only later did the French physicist Sadi Carnot discover that immutable physical laws limit heat engine efficiency. This observation became known as the second law of thermodynamics and is now seen as a cornerstone of science. Just as the study of heat engine efficiency revealed the second law of thermodynamics, the study of quantum computing has the potential to reveal a deeper understanding of the limits physics places on our ability to compute, as well as the new opportunities it creates for collaboration across fields.
Quantum computing is not simply physics, Wiebe said. It exists in the intersection between many fields, including physics, computer science, mathematics, materials science, and increasingly, data science. Indeed, he sees a huge untapped role for data science and machine learning in quantum computing.
“Like Watt and Carnot, we don’t necessarily need to capture all of the minutia that is happening inside the system,” Wiebe said. “All we have to be able to do is predict input and output. So data science and machine learning tools could have a lot of influence in making quantum computers work in practical terms.”
Diamonds in the rough
One of the first useful quantum technologies is likely to be quantum sensors – devices that use quantum signals to measure things like temperature and magnetic fields. Wiebe worked with an international team of colleagues to apply machine learning techniques to a tricky problem in quantum sensing.
Biologists want to use these sensors to measure what’s going on inside individual cells. The sensors are made of diamonds with certain defects that can be used to send quantum signals. The problem is that, at room temperature, the quantum sensor signals contain too many errors to be practical. The research team could not get the experiments to work unless the whole thing was cooled to liquid helium temperatures (−452.2°F), which obviously isn’t good for living cells.
Wiebe and his colleagues solved the problem by running the experiments at room temperature and then applying an algorithm that used techniques from data analytics and machine learning to correct for the error-prone, noisy signal.
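The article doesn’t spell out the team’s algorithm, but the basic idea of statistically correcting an error-prone binary readout can be sketched as follows. This is an illustrative toy only, with assumed numbers: each sensor shot returns a bit whose true probability of reading “1” encodes the signal, thermal noise flips each shot with a known rate, and inverting that noise channel recovers the underlying signal from many noisy shots.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical setup: the quantity being sensed is encoded in p_true, the
# probability that a single clean readout returns 1. At room temperature
# each shot is flipped with probability eps (assumed known from calibration).
p_true = 0.73      # assumed true signal probability
eps = 0.2          # assumed readout flip (noise) rate
n_shots = 200_000

clean = rng.random(n_shots) < p_true
flips = rng.random(n_shots) < eps
noisy = clean ^ flips  # the error-prone shots we actually observe

# Statistical correction: the observed rate is q = p*(1-eps) + (1-p)*eps,
# so invert the noise channel to estimate p from q.
q = noisy.mean()
p_est = (q - eps) / (1 - 2 * eps)
```

With enough shots, `p_est` lands close to `p_true` despite the 20 percent per-shot error rate, which is the flavor of the result: accuracy recovered in software rather than by cryogenic cooling.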
“We got the same sensitivity as the very cold cryogenic experiment at no additional cost,” he said.
Wiebe said that applying the same principles may be just the thing needed to correct for noisy, error-prone quantum gates. The question he asks is: “How much quantum error correction do I need to guarantee that my algorithms are going to run?”
Wiebe is adamant that making quantum computing practical will require the combined interdisciplinary efforts of researchers in many fields learning to speak each other’s languages.
“If we can build a quantum computer, then we have the ability to solve currently intractable problems in chemistry and materials science and physics,” he said. “The challenge both imposes limitations and provides new opportunities. Quantum computing forces us to get a deeper understanding of what it means to compute.”
Reference: Pacific Northwest National Laboratory