A University of Oxford study has used machine learning to overcome a significant hurdle facing quantum devices. The results point, for the first time, toward a way of bridging the "reality gap": the discrepancy between the predicted and observed behavior of quantum devices. The findings are published in the journal Physical Review X.
Quantum computing has the potential to accelerate a wide range of applications, from drug development and artificial intelligence to financial forecasting and climate modeling. Achieving this, however, requires effective ways of scaling and combining individual quantum devices, or qubits. A major obstacle is inherent variability: even nominally identical units behave differently.
This functional variability is assumed to be caused by nanoscale imperfections in the materials used to make quantum devices. Because these imperfections cannot be measured directly, the internal disorder cannot be captured in simulations, which leads to the gap between predicted and observed outcomes.
To overcome this, the research team used a "physics-informed" machine learning approach to infer these disorder characteristics indirectly. This was based on how the internal disorder affects the flow of electrons through the device.
Lead researcher Associate Professor Natalia Ares of the University of Oxford's Department of Engineering Science explained, "To give an analogy, when we play 'crazy golf,' the ball may enter a tunnel and exit at a speed or direction that does not match our expectations. But with a few more shots, a crazy golf simulator, and some machine learning, we could get better at predicting the ball's trajectory and narrow the reality gap."
The researchers measured the output current of a single quantum dot device across a range of voltage settings. A simulation was then used to calculate the difference between the measured current and the theoretical current that would flow in the absence of internal disorder.
By measuring the current at many different voltage settings, the simulation was constrained to find an arrangement of internal disorder that could explain the measurements at all of them. This approach combines mathematical and statistical techniques with deep learning.
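The inference loop described above can be sketched as a toy inverse problem. Everything below is illustrative: the current model, the two-parameter disorder profile, and the plain gradient descent are hypothetical stand-ins for the authors' actual simulation and deep-learning-based inference; the point is only to show how a hidden disorder profile can be recovered from current measurements taken at many voltage settings.

```python
import numpy as np

rng = np.random.default_rng(0)

# Voltage settings at which the device current is "measured".
voltages = np.linspace(0.0, 1.0, 50)

def simulated_current(v, disorder):
    # Toy transport model (hypothetical): an ideal term plus a
    # perturbation controlled by two unknown disorder parameters.
    ideal = np.tanh(5.0 * (v - 0.5))
    perturbation = disorder[0] * np.sin(10.0 * v) + disorder[1] * v**2
    return ideal + perturbation

# Synthetic "measurements" generated from a hidden ground-truth profile,
# standing in for the real device data.
true_disorder = np.array([0.15, -0.30])
measured = simulated_current(voltages, true_disorder) \
    + rng.normal(0.0, 0.01, voltages.size)

def loss(disorder):
    # Mismatch between simulation and measurement across ALL voltages:
    # constraining every setting at once is what pins down the profile.
    return np.mean((simulated_current(voltages, disorder) - measured) ** 2)

# Finite-difference gradient descent on the disorder parameters
# (a simple stand-in for the paper's learned inference).
disorder = np.zeros(2)
lr, eps = 0.5, 1e-5
for _ in range(2000):
    grad = np.array([
        (loss(disorder + eps * np.eye(2)[i])
         - loss(disorder - eps * np.eye(2)[i])) / (2 * eps)
        for i in range(2)
    ])
    disorder -= lr * grad

print(disorder)  # converges toward the hidden profile
```

Because the toy model is linear in the disorder parameters, the loss is convex and the descent recovers the hidden profile; the real problem is far higher-dimensional and non-convex, which is why the study needed deep learning rather than this simple fit.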
"In the crazy golf analogy, it would be equivalent to placing a series of sensors along the tunnel, so that we could take measurements of the ball's speed at different points," Associate Professor Ares added. "Although we still cannot see inside the tunnel, we can use the data to make better predictions of how the ball will behave when we take the shot."
In addition to identifying internal disorder profiles that explain the measured current values, the new model was able to accurately predict the voltage settings required for particular device operating regimes.
The model provides a new way of quantifying the variability between quantum devices. This could enable more accurate predictions of how devices will perform and help engineers choose the best materials for building them. It could also inform strategies for compensating for the unwanted effects of material imperfections in quantum devices.
Co-author David Craig, a Ph.D. student at the University of Oxford's Department of Materials, added, "Just as we cannot observe black holes directly but infer their presence from their effect on surrounding matter, we have used simple measurements as a proxy for the internal variability of nanoscale quantum devices."
"Although the real device is still more complex than the model can capture, our study has demonstrated the usefulness of physics-aware machine learning in closing the reality gap."