
Quantum Mechanics was developed in the first half of the 20th century to study the properties of molecules, atoms and subatomic particles. It is based on the idea that such particles can be described by probability density functions derived from their wave equations.

Artificial Intelligence is a branch of Computer Science that aims to make machines as intelligent as humans. Its most important subset is Machine Learning, which is based on the idea that machines can learn from data. Many of its principles also draw on statistics and probability.

Despite their relatively common base, it is only now that the two fields of study have started benefiting from each other.

*Quantum Mechanics helping Machine Learning models*


There are many models based on Quantum Mechanics that increase the speed of Machine Learning models, or reduce the data they need, compared to their classical counterparts.

**Speed**

The HHL algorithm, named after its inventors Harrow, Hassidim, and Lloyd, solves the linear system of equations Ax = b, where A is an N × N matrix, in O(log N) time under certain conditions, compared to O(N) on classical computers. Such systems appear frequently in Machine Learning, including in Support Vector Machine models. Grover's algorithm finds the input for a given output in O(sqrt(N)) time, compared to the O(N) needed on classical machines, where N is the size of the data; this can be used in K-Nearest Neighbour (KNN) models. Principal Component Analysis (PCA), widely used for reducing the dimensions of Machine Learning models, takes time polynomial in the number of feature dimensions, whereas its quantum version takes time polylogarithmic in that number.
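As a back-of-the-envelope illustration of Grover's quadratic speed-up, the sketch below (the helper name is ours, not from any library) computes the roughly (π/4)·√N iterations the algorithm needs, versus the up-to-N lookups of a classical unstructured search:

```python
import math

def grover_iterations(n_items: int) -> int:
    """Approximate optimal number of Grover iterations to find one
    marked item among n_items: about (pi/4) * sqrt(n_items)."""
    return max(1, round(math.pi / 4 * math.sqrt(n_items)))

# Classical search inspects up to N items; Grover needs ~sqrt(N) queries.
for n in (100, 10_000, 1_000_000):
    print(f"N = {n:>9}: classical up to {n} lookups, "
          f"Grover ~{grover_iterations(n)} iterations")
```

For a million items the gap is already three orders of magnitude, which is the point of the asymptotic claim above.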

**Data**

The Quantum Fast Fourier Transform (QFFT) can be executed using O(n^2) gates, compared to the O(n 2^n) gates needed by its classical counterpart acting on 2^n amplitudes. One of the uses of the FFT is to reduce the computational power needed for Convolutional Neural Networks.
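The classical side of this claim is easy to demonstrate: FFT-based convolution does in O(n log n) what the direct sum does in O(n²), which is the trick CNN libraries exploit for large kernels. A minimal NumPy sketch (function names illustrative):

```python
import numpy as np

def fft_convolve(signal, kernel):
    """Circular convolution via FFT in O(n log n): multiply the two
    spectra and transform back."""
    n = len(signal)
    return np.real(np.fft.ifft(np.fft.fft(signal) * np.fft.fft(kernel, n)))

def direct_convolve(signal, kernel):
    """Naive O(n^2) circular convolution, for comparison."""
    n = len(signal)
    k = np.zeros(n)
    k[:len(kernel)] = kernel
    return np.array([sum(signal[j] * k[(i - j) % n] for j in range(n))
                     for i in range(n)])

x = np.array([1.0, 2.0, 3.0, 4.0])
h = np.array([1.0, 0.5])
assert np.allclose(fft_convolve(x, h), direct_convolve(x, h))
```

Both routes give the same result; only the operation count differs, and the quantum version shrinks the count further still.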

One important application of Machine Learning is generating new or modified images, music, synthetic molecules, scans, etc. The generation process learns and uses the probability distribution of the underlying training data. The quantum counterpart can learn the probability distribution faster and also generate data faster; the latter can also be used to augment the datasets of detection models. D-Wave, a pioneer in quantum computing, is a company working in this area.

However, there are still implementation challenges. The classical data, which may be human readable, needs to be prepared for input into a quantum system, and the reverse needs to be done to read the output. Sometimes this may completely negate the benefit of quantum processing.

*Machine Learning helping Quantum Mechanical systems*

Machine Learning and especially Deep Learning models have been created to study and make predictions about quantum systems.

**Chemistry**

In Quantum Mechanics, solving quantum many-body problems requires an exponential amount of information; here, solving means finding either the ground state or the dynamics of states evolving in time. At least two approaches have been developed to study such systems. The first is a tensor-based approach in which each qubit is mapped to a rank-3 tensor, effectively a three-dimensional array. For many quantum systems with little entanglement, the entanglement entropy scales with the area rather than the volume of the system, and tensor-based neural networks for such systems scale polynomially with system size. One example of a tensor-based neural network used for quantum interactions is SchNet.
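The tensor idea can be made concrete with a generic matrix-product-state toy in NumPy (this is not SchNet itself): each qubit becomes a rank-3 tensor, and for a low-entanglement product state the bond dimension stays at 1, so storage grows linearly in n even though the full state vector has 2^n amplitudes.

```python
import numpy as np

def mps_to_statevector(tensors):
    """Contract a chain of rank-3 tensors of shapes (1, 2, b), (b, 2, b),
    ..., (b, 2, 1) into the full 2^n amplitude vector (for checking only:
    this step is exactly the exponential blow-up the MPS form avoids)."""
    state = tensors[0]
    for t in tensors[1:]:
        state = np.tensordot(state, t, axes=1)  # sum over the shared bond
    return state.reshape(-1)

n = 4
plus = np.array([1.0, 1.0]) / np.sqrt(2)             # the |+> state per site
tensors = [plus.reshape(1, 2, 1) for _ in range(n)]  # bond dimension 1

full = mps_to_statevector(tensors)
# The MPS stores 2n numbers; the full vector has 2^n amplitudes,
# all equal to 2^(-n/2) for this product state.
print(full.shape, full[0])
```

The interesting regime is when entanglement is small but nonzero: the bond dimension grows modestly, and storage stays polynomial in n.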

Another approach uses Restricted Boltzmann Machines (RBMs), neural networks with two layers: a visible layer whose neurons represent qubits and a hidden layer whose neurons describe additional degrees of freedom. Adding another hidden layer gives a deep Boltzmann Machine, which can describe almost all quantum systems with polynomial scaling, including heavily entangled ones. RBMs have one more use: reconstructing a quantum state from quantum measurements, known as quantum state tomography.
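A toy RBM amplitude in NumPy (sizes and weights are illustrative, not a trained model): after summing out the hidden layer, the unnormalised weight an RBM assigns to a visible spin configuration has a simple closed form.

```python
import numpy as np

rng = np.random.default_rng(0)
n_visible, n_hidden = 4, 3
a = rng.normal(size=n_visible)              # visible biases (one per qubit)
b = rng.normal(size=n_hidden)               # hidden biases
W = rng.normal(size=(n_visible, n_hidden))  # visible-hidden couplings

def rbm_amplitude(v):
    """Unnormalised weight of visible configuration v in {0,1}^n:
    summing exp(-E(v, h)) over all hidden h gives the closed form
    exp(a.v) * prod_j (1 + exp(b_j + (v W)_j))."""
    return np.exp(a @ v) * np.prod(1.0 + np.exp(b + v @ W))

print(rbm_amplitude(np.array([1, 0, 1, 0])))
```

Training adjusts a, b and W so these weights match either measured data (tomography) or the amplitudes of a target wave function.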

The above and other Machine Learning models are being used to predict interatomic potential energy surfaces, molecular forces, polarizations, subatomic densities, etc. This is used in research as well as to discover new molecules, reaction paths, catalysts and their optimal operating conditions, yields, and undesirable side effects.

**Physics**

Machine Learning models are also being used in Quantum Physics, e.g. in the Ising model, a mathematical model of ferromagnetism based on the spins of atoms arranged in a lattice. Machine Learning models trained on Ising-model configurations predict the transition temperature to paramagnetism and other properties of phase transitions. Similarly, Machine Learning models have been used to study data on electron motion obtained through a Scanning Tunnelling Microscope (STM), which can be used, for example, to study high-temperature superconductivity.
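The Ising setup is compact in code. A NumPy sketch (periodic boundaries, illustrative lattice size) of the energy function whose sampled spin configurations such models are trained on:

```python
import numpy as np

def ising_energy(spins, J=1.0):
    """Energy of a 2D Ising configuration: -J * sum over nearest-neighbour
    spin products, periodic boundaries, each bond counted once."""
    right = np.roll(spins, -1, axis=1)  # neighbour to the right
    down = np.roll(spins, -1, axis=0)   # neighbour below
    return -J * np.sum(spins * right + spins * down)

aligned = np.ones((4, 4))                         # fully ferromagnetic
checker = np.indices((4, 4)).sum(0) % 2 * 2 - 1   # alternating +1/-1 spins
print(ising_energy(aligned), ising_energy(checker))
```

A classifier is fed lattices like these, sampled at different temperatures, and learns to label the ferromagnetic and paramagnetic phases, locating the transition temperature between them.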

**Biology**

Recently, a company called DeepMind solved the protein folding problem, i.e. predicting the 3D structure a protein will take based on its 1D sequence of constituent amino acids. A protein can fold into an astronomical number of shapes. DeepMind used attention-based neural networks, which focus on subsets of their inputs, to solve it. Since a protein's shape determines its functions, this can help in developing treatments for diseases or enzymes for industrial use.
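The attention operation itself is small. A toy NumPy version (shapes and values illustrative; this is the generic mechanism, not AlphaFold's actual architecture): each position scores every other position, turns the scores into weights with a softmax, and takes a weighted average.

```python
import numpy as np

def attention(Q, K, V):
    """Scaled dot-product attention: softmax(Q K^T / sqrt(d)) V.
    Each row of the output is a weighted mix of the rows of V,
    letting every position attend to every other."""
    scores = Q @ K.T / np.sqrt(Q.shape[-1])
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)  # rows sum to 1
    return weights @ V

rng = np.random.default_rng(1)
Q = rng.normal(size=(5, 8))  # 5 "residues", 8-dimensional features
K = rng.normal(size=(5, 8))
V = rng.normal(size=(5, 8))
out = attention(Q, K, V)
print(out.shape)
```

The ability of every position to weigh every other is what lets such models relate amino acids that are far apart in the 1D sequence but close in the folded 3D structure.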

**Quantum computers**

Quantum computers are based on Quantum Mechanics and are fundamentally different from classical computers. Their basic building blocks are qubits, which are superpositions of both the '0' and '1' states, so n qubits actually represent 2^n states with varying probabilities; a classical bit represents either '0' or '1'. Hence increasing the number of qubits increases the possibilities exponentially and can drastically reduce the time needed to train Machine Learning models.
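The 2^n growth is easy to see numerically: a NumPy sketch of the equal-weight superposition that applying a Hadamard gate to each of n qubits produces, with 2^n amplitudes whose probabilities sum to 1.

```python
import numpy as np

def uniform_superposition(n):
    """State vector after a Hadamard on each of n qubits: all 2^n
    basis states carry the same amplitude 1/sqrt(2^n)."""
    dim = 2 ** n
    return np.full(dim, 1 / np.sqrt(dim), dtype=complex)

state = uniform_superposition(3)
print(state.size)                  # 2^3 = 8 amplitudes for 3 qubits
print(np.sum(np.abs(state) ** 2))  # probabilities sum to 1
```

Simulating 50 such qubits classically already needs 2^50 amplitudes, which is why the state-vector approach stops scaling and quantum hardware becomes interesting.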

But qubits must remain entangled, i.e. what happens to one should instantly affect the other, even when they're physically separated. Both superposition and entanglement are quantum effects that are not used in classical computers. However, qubits rapidly tend to lose superposition, leading to decoherence (the system becomes entangled with its environment). The current challenge is to keep a minimum number of qubits entangled long enough to process the data and deliver the output.
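Entanglement can likewise be sketched with plain matrices: a Hadamard on one qubit followed by a CNOT turns the |00> state into the Bell state (|00> + |11>)/sqrt(2), where measuring either qubit fixes the other.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate
I = np.eye(2)
CNOT = np.array([[1, 0, 0, 0],                # flip qubit 1 iff qubit 0 is 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

zero_zero = np.zeros(4)
zero_zero[0] = 1.0                            # the |00> basis state

# Hadamard on qubit 0, identity on qubit 1, then CNOT:
bell = CNOT @ np.kron(H, I) @ zero_zero
print(bell)
```

Only the |00> and |11> amplitudes are nonzero, each 1/sqrt(2): the two outcomes are perfectly correlated, which no product of independent single-qubit states can reproduce.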

Google, IBM and Intel have demonstrated quantum computers with between 50 and 100 qubits, but with the capability to work only on specific tasks. The aim is to create a general-purpose quantum computer that can surpass classical computers in performance.

**Future**

Eventually, quantum computers will achieve quantum supremacy, i.e. they will perform tasks that classical computers cannot accomplish. This could already have happened with around 50 qubits, but that number needs to increase to accommodate the errors inherent in quantum computers. They will exponentially reduce the time needed to train Machine Learning models, including the ones used to study quantum phenomena, bringing the two fields of study even closer.

*This article has been published from a wire agency feed without modifications to the text. Only the headline has been changed.*