Melbourne uni team claim 60 qubit quantum computing simulation world record
- 27 June, 2018 11:19
Researchers at the University of Melbourne say they have set a new world record by simulating the output of a 60-qubit quantum computer.
The previous record was set in October by IBM, which classically simulated 56 qubits in carefully chosen states.
“In terms of the number of qubits, this represents one of the largest simulations of a non-trivial quantum circuit ever performed,” the researchers said.
Simulating qubits using classical computers is tricky. While classical computers work with binary bits, programmed to encode and process data, a quantum computer’s qubits are quantum mechanical objects like atoms.
A qubit can be prepared in one of two states or, thanks to quantum superposition, effectively in both at the same time. Superposition means that two qubits can, in a sense, hold all four combinations of 0 and 1 simultaneously. That unique data-crunching power is further boosted by entanglement, where measuring the state of one qubit mysteriously dictates the state of another.
That quality gives a 50 qubit machine – about the limit of current quantum computer hardware – in principle the ability to simultaneously represent about a million billion number combinations.
“In order to simulate a quantum computer I need to store every one of these possible binary combinations that a quantum computer can effectively represent. If you have a simple question – can a classical computer simulate a quantum computer – then already at 50 qubits I need 2⁵⁰ numbers to be represented in my classical simulation,” explains the University of Melbourne’s Professor Lloyd Hollenberg.
“Each of those numbers is essentially a complex number, 128 bits, so the counting then is in petabytes. Supercomputers with thousands of nodes are around about that limit,” he says.
In other words, to simulate a random quantum state of a 50 qubit machine would chew up some 18 petabytes of classical computer memory, or the equivalent of more than a million 16 gigabyte RAM laptops.
Simulating a 60 qubit machine then would need 18,000 petabytes, or over a billion laptops.
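The arithmetic behind those figures is easy to check directly. A short sketch, assuming one 128-bit complex number per amplitude as Hollenberg describes:

```python
BYTES_PER_AMPLITUDE = 16  # one 128-bit complex number (two 64-bit floats)

def state_vector_bytes(n_qubits):
    """Memory needed to store all 2^n complex amplitudes of an n-qubit state."""
    return (2 ** n_qubits) * BYTES_PER_AMPLITUDE

PETABYTE = 10 ** 15
print(state_vector_bytes(50) / PETABYTE)  # ~18 petabytes
print(state_vector_bytes(60) / PETABYTE)  # ~18,000 petabytes
```

Each extra qubit doubles the memory requirement, which is why the jump from 50 to 60 qubits multiplies the cost by roughly a thousand.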
“But that’s not simulating a quantum computer doing anything,” says Hollenberg. “What we said is: well, what if the quantum computer is doing something useful, like running an algorithm?”
In a yet-to-be-published paper, Hollenberg (deputy director of the Centre for Quantum Computation and Communication Technology) and researchers Aidan Dang and Charles Hill describe how they optimised simulations of a quantum algorithm called Shor’s algorithm.
Shor’s algorithm finds the two prime numbers that, when multiplied together, give a semi-prime number. It is a quantum algorithm, specifically designed to run on a quantum computer.
Factoring is a ‘classically intractable’ problem and considered one of the areas where near-term quantum computers will surpass their classical counterparts.
Finding the primes of a 232-digit semi-prime number, for example, took researchers two years and a supercomputer. Had they used a normal laptop, it would have taken them 2,000 years. As you add digits, the challenge of finding a number’s prime factors on a classical computer gets much harder very quickly. That’s lucky for us, since the difficulty of factoring very large semi-prime numbers is the foundation underpinning the RSA public key cryptosystem used to secure online communications such as HTTPS.
The RSA-240 challenge (factoring a 240-digit number) has still not been achieved.
The actual problem posed to the Melbourne researchers' simulation was: which two prime numbers can be multiplied to give the answer 961,307?
The question can be answered using a standard laptop within a second; however, it is beyond the limit of what current quantum computer prototypes can solve.
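For scale, the classical version of this problem really is trivial: a few lines of trial division (a sketch for illustration, not the researchers' method) recover the factors almost instantly.

```python
def factor_semiprime(n):
    """Find the two prime factors of a semi-prime by trial division."""
    p = 2
    while p * p <= n:
        if n % p == 0:
            return p, n // p  # smallest factor of a semi-prime is prime
        p += 1
    raise ValueError(f"{n} has no non-trivial factors below its square root")

print(factor_semiprime(961_307))  # (619, 1553)
```

The quantum version of the same task on real hardware would require far more qubits, run far more reliably, than any machine built so far.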
“We wanted to push and see how far we could actually optimise our quantum simulation capability for a particular algorithm. In doing that it meant we could organise our simulation around the degree of quantum entanglement that was being produced in the algorithm itself,” Hollenberg says.
The researchers exploited the structure of Shor’s algorithm to make their classical simulation more efficient. Its “entanglement structure … lends itself to a particular matrix product state representation that quantifiably reduces the computational requirements for classical simulation,” they explain in their paper.
“That understanding of how it worked in this particular case allowed us to go to a high number of qubits,” Hollenberg says.
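A matrix product state stores a weakly entangled n-qubit state as a chain of small tensors instead of 2ⁿ amplitudes, so memory scales with the amount of entanglement (the "bond dimension") rather than the raw qubit count. A minimal NumPy sketch of the idea (illustrative only; the team's optimised implementation is far more involved):

```python
import numpy as np

def mps_decompose(state, n):
    """Split a 2^n state vector into a chain of MPS tensors via repeated SVDs."""
    tensors = []
    psi = state.reshape(1, -1)
    for _ in range(n - 1):
        chi = psi.shape[0]                    # current left bond dimension
        psi = psi.reshape(chi * 2, -1)        # peel off one qubit
        u, s, vh = np.linalg.svd(psi, full_matrices=False)
        keep = s > 1e-12                      # discard numerically zero modes
        u, s, vh = u[:, keep], s[keep], vh[keep]
        tensors.append(u.reshape(chi, 2, -1))
        psi = s[:, None] * vh                 # carry the remaining state onward
    tensors.append(psi.reshape(-1, 2, 1))
    return tensors

n = 12
plus = np.array([1.0, 1.0]) / np.sqrt(2)      # single-qubit |+> state
state = plus
for _ in range(n - 1):
    state = np.kron(state, plus)              # unentangled product state

mps = mps_decompose(state, n)
print(2 ** n)                                 # full state vector: 4096 amplitudes
print(sum(t.size for t in mps))               # MPS: 24 numbers (bond dimension 1)
```

With no entanglement every bond dimension collapses to 1, and storage grows linearly in the number of qubits; more entanglement inflates the bond dimensions until, in the worst case, the MPS is no cheaper than the full state vector.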
Using their simulation of a 60 qubit machine running the algorithm, the Melbourne researchers solved the problem with a total of 216 nodes, 5,184 cores and 13.824 TB of memory within eight hours on a machine at the Pawsey Supercomputing Centre near Perth.
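Those figures hang together neatly; the per-node breakdown below is derived here from the article's totals, not stated by the researchers.

```python
nodes = 216
cores = 5184
memory_gb = 13_824  # 13.824 TB

print(cores / nodes)      # 24.0 cores per node
print(memory_gb / nodes)  # 64.0 GB of simulation memory per node
```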
“The simulation used up almost all our allocated computing time at Pawsey, but we just made it,” Dang says. “To the best of our knowledge, this is the largest high-level simulation of Shor’s algorithm.”
Simulating quantum computers – even if the questions posed to the simulations are straightforward – helps researchers better understand and test the sorts of problems a future scaled-up quantum computer will be used for.
“The community has long since settled on the idea that once you get to 50 or 100 qubits you’re going to go beyond what a classical computer can simulate. So looking at problems right in that intermediate region and seeing if we can push the boundaries is really helping us set the bar for true quantum supremacy,” says the University of Sydney’s quantum computing Professor Stephen Bartlett.
The simulation research will also mean quantum computers can be better benchmarked and verified.
“The better your classical simulation capability is the better your benchmarking capability will be,” adds Hollenberg. “The capability to simulate quantum algorithms at this level is important to learning how a quantum computer will physically operate, how the software can work, and what sort of problems it can solve.”