
Quantum Computing: How Conditions Created By The COVID-19 Shutdown Are Delivering ‘The Best Data We Have Ever Seen’

Remotely controlled experiments are the way forward.

The COVID-19 pandemic and shutdown have been disastrous for many people. But one research project in my lab has been humming along, taking the best data my team has ever seen. It is an advanced ‘ion trap’ quantum computer, which uses laser beams to control an array of floating atoms.

We spent three years setting it up to run remotely and autonomously. Now, we think more labs should run quantum-computing experiments like this, to speed up research.

Quantum computers exploit the weird behaviour of matter at the atomic level. One particle can store many pieces of information, allowing the computers, in effect, to perform many calculations simultaneously. They promise to solve problems that are out of reach of conventional machines, and to speed up modelling of chemical reactions in batteries or drug design, or even simulations of information flow in black holes.
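
In the standard textbook notation (a general aside, not specific to this machine), the ‘many pieces of information’ point can be made exact: a single qubit holds a weighted superposition of its two basis states, and a register of n qubits is described by 2^n complex amplitudes.

```latex
% One qubit: a superposition of the two basis states.
\[
  \lvert\psi\rangle = \alpha\lvert 0\rangle + \beta\lvert 1\rangle,
  \qquad \lvert\alpha\rvert^{2} + \lvert\beta\rvert^{2} = 1 .
\]
% A register of n qubits: 2^n complex amplitudes, all evolving together.
\[
  \lvert\Psi\rangle = \sum_{x \in \{0,1\}^{n}} c_{x}\,\lvert x\rangle ,
  \qquad \sum_{x} \lvert c_{x}\rvert^{2} = 1 .
\]
% At n = 32 (the machine described below), that is 2^{32}, roughly
% 4.3 billion amplitudes.
```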

But good quantum hardware is extremely fragile, and the larger the system, the more easily it is perturbed. Some quantum components must be chilled to near absolute zero. Others must be stored in a vacuum more rarefied than that of outer space. It’s really hard to prepare and control precise quantum states, let alone keep them stable for hours. Stray currents, changes in temperature and vibrations can easily destabilize the system.

The quantum computer at the University of Maryland, which I lead with physicist Marko Cetina, uses up to 32 identical atoms as its quantum bits, or qubits. Each atom is levitated by electromagnetic fields and cooled by lasers until it sits almost at rest. Typically, such an apparatus has thousands of electronic and optical components, all aligned precisely on a 3-metre-wide, 500-kilogram steel table that is damped against vibrations. It takes an army of people to tweak mirrors and adjust signals, and the components must continually be replaced, tested, calibrated and updated.

But in 2016, we decided to redesign our system to run remotely — not just for convenience, but because that’s what our research goals require. We needed to add more qubits without increasing noise and errors, so that we could test complex quantum gate operations, circuits and algorithms.

This required a different approach. For qubits, we use particular energy states of ytterbium-171 atoms that are so stable they are widely used in atomic clocks. We miniaturized the most reliable control components, added transducers and feedback circuits, and ran everything from an open-access software platform. To make it all work, we collaborated closely with many industry partners and with engineers Jungsang Kim and Kenneth Brown at Duke University.
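
To give a feel for what the feedback circuits do, here is a minimal sketch of the general idea, with assumed names and a toy proportional rule rather than EURIQA’s actual control code: a transducer reading is compared with a setpoint and a small correction is applied, so slow drifts are cancelled without anyone entering the lab.

```python
# Illustrative sketch only; names and the simple proportional rule are
# assumptions, not EURIQA's real control software.
import random


def read_sensor() -> float:
    """Stand-in for a transducer reading, e.g. a photodiode voltage (V)."""
    return 1.00 + random.uniform(-0.05, 0.05)


def apply_correction(delta: float) -> None:
    """Stand-in for nudging an actuator (mirror mount, RF amplitude, ...)."""
    print(f"correction applied: {delta:+.4f} V")


SETPOINT = 1.00   # desired sensor value
GAIN = 0.5        # proportional gain; real loops are tuned far more carefully

for _ in range(10):              # in practice this loop runs indefinitely
    error = SETPOINT - read_sensor()
    apply_correction(GAIN * error)
```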

EURIQA (Error-corrected Universal Reconfigurable Ion trap Quantum Archetype) began operating autonomously in April 2019. The whole system now sits in a 1-cubic-metre box, and it is rarely opened. One researcher visits the lab for 10–20 minutes once a week to reboot the odd computer that has frozen or power supply that has tripped.

Since my university went into COVID-19 shutdown in March, EURIQA has kept running — all day, every day. And the data have been excellent because the campus has been a ghost town. The lab’s temperature hasn’t wavered and there’s little vibrational noise in the unoccupied building. It’s one of very few university quantum experiments making real progress right now.

But there’s a bigger picture. This remote mode of operation is exactly what’s needed in quantum-computing research. Companies including IBM, Google, Honeywell and a start-up I co-founded, IonQ (whose systems are based on EURIQA), are opening up commercial access to their early quantum-computing devices. By the end of 2020, several types of quantum computer will be available through cloud services hosted by Amazon and Microsoft. But researchers won’t have access to the inner workings to advance bespoke designs for particular scientific applications. They won’t be able to ‘co-design’, or fully exploit the interplay between computer fabrication and computer use.

Right now, most quantum-computing research involves the study of qubit properties, quantum gate operations and their control. In some cases, the components are wired together for a specific scientific application. Qubits, quantum logic operations and modes for executing programs are selected and optimized for one purpose. It would be great if, instead, some of those components were simple ‘plug-and-play’ commodities, like the flash memory in a smartphone camera. Then, researchers wouldn’t have to build everything from scratch: they could insert a module, tweak a parameter or remotely reprogram a circuit.
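
As a rough illustration of what ‘plug-and-play’ could mean in software, here is a minimal sketch under assumed names — the QubitModule interface and TrappedIonModule class are hypothetical, not an existing library or EURIQA’s stack:

```python
# Hypothetical sketch: a tiny, fixed contract that any qubit technology
# could sit behind. All names here are illustrative assumptions.
from abc import ABC, abstractmethod


class QubitModule(ABC):
    """A 'plug-and-play' qubit component with a minimal, stable interface."""

    @abstractmethod
    def initialize(self, n_qubits: int) -> None:
        """Prepare n qubits in a known starting state."""

    @abstractmethod
    def apply_gate(self, name: str, targets: list[int]) -> None:
        """Apply a named gate (e.g. 'h', 'cnot') to the target qubits."""

    @abstractmethod
    def measure(self) -> list[int]:
        """Read out all qubits as classical bits."""


class TrappedIonModule(QubitModule):
    """One possible implementation; a different technology could be dropped
    in behind the same interface without changing any user code."""

    def initialize(self, n_qubits: int) -> None:
        self.state = [0] * n_qubits   # placeholder for real trap-control code

    def apply_gate(self, name: str, targets: list[int]) -> None:
        pass                          # would dispatch to laser-pulse control

    def measure(self) -> list[int]:
        return list(self.state)       # would trigger fluorescence readout
```

The contract is deliberately tiny: everything hardware-specific hides behind it, which is what would let a researcher insert a module or tweak a parameter without rebuilding the rest of the stack.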

Such a system could be built by adapting a particular qubit technology and piling stacks of control hardware and software on top, as we have done with EURIQA. Qubits could be swapped and systems redesigned as technology evolves — just as in conventional computing, the vacuum-tube switches of the 1940s gave way to germanium semiconductors and then silicon wafers in the 1960s.
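
Continuing the hypothetical sketch above, swapping the qubit technology would then be a one-line change; the circuit-level code never notices which hardware is underneath:

```python
# Continuation of the hypothetical sketch above; names remain assumptions.
def run_bell_circuit(backend: QubitModule) -> list[int]:
    """Issue the gate sequence for a two-qubit Bell pair to whichever
    backend is plugged in, then read out the result."""
    backend.initialize(2)
    backend.apply_gate("h", [0])
    backend.apply_gate("cnot", [0, 1])
    return backend.measure()


# Today: trapped ions. Tomorrow: a different module, same calling code.
results = run_bell_circuit(TrappedIonModule())
```

The stable part is the interface, much as the programming model persisted while vacuum tubes gave way to transistors.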

Large quantum-computing initiatives in the United States, Europe, China, Canada, Australia, Singapore and Russia are investing in qubit research while also giving researchers access to commercial cloud services. But extensive research is needed between these extremes. Industry will ultimately mass-produce quantum computers, but the early ‘killer apps’ might well come from scientific discovery. Unleashing ‘full stack’ quantum computers into the research community will hasten that search.

Source: https://www.nature.com/articles/d41586-020-01937-x
