
Researchers create Classical Conditioning for AI

Pavlov’s classical conditioning experiments from the early twentieth century served as an inspiration for University of Oxford researchers, who recently developed an on-chip optical processor that could pave the way for remarkable advances in Artificial Intelligence (AI) and Machine Learning (ML).

Oxford claims that its new system can detect similarities between datasets. Unlike traditional Machine Learning algorithms, which rely on electronic processors and conventional neural networks, Oxford’s system uses Pavlovian associative learning and runs on a backpropagation-free photonic network.

Getting Insight from Pavlov’s Dog

The process of associating two sensory stimuli so that both elicit the same response is known as classical conditioning. Sensory and motor neurons are involved in this process: sensory neurons receive the stimuli and relay signals to motor neurons, which produce the response.

Ivan Pavlov identified this concept in the early 1900s, discovering that by training a dog to link the sound of a bell with food, he could make the dog salivate whenever he rang the bell. The associative learning process links a neutral stimulus s2 (the bell sound) with a natural stimulus s1 (the sight or smell of food) so that both prompt the same response (salivation) in the dog.

Researchers at the University of Oxford applied this concept to a simplified neural circuit with two major roles: 1) converge and associate two inputs, and 2) store memories of these associations for later reference. Central to this research is the associative monadic learning element (AMLE), a device that efficiently performs the fundamental associative learning process of classical conditioning in order to advance AI/ML.

Information on the Associative Monadic Learning Element

To achieve associative learning, an AMLE combines a thin film of phase-change material with two coupled waveguides. The material, Ge2Sb2Te5 (GST), modulates the coupling between the waveguides: GST can be either amorphous or crystalline, and its phase determines how strongly the waveguides couple.

Stimuli s1 and s2 show no association while the GST is in its crystalline state. However, when the two stimuli (or inputs) arrive at the same time, the GST begins to amorphize and the stimuli become associated. The learning threshold is set by how much the GST amorphizes. The AMLE uses this photonic associative learning to provide a distinctive Machine Learning framework for general learning tasks.
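To make the threshold behaviour concrete, the following Python sketch simulates an abstract learning element in which coincident stimuli gradually amorphize a GST cell until an association forms. The function name, step size, and threshold here are illustrative assumptions, not parameters of the Oxford device.

def run_amle(s1_pulses, s2_pulses, threshold=0.5, step=0.2):
    """Toy AMLE: association strength grows only when s1 and s2 coincide."""
    amorphization = 0.0  # 0 = fully crystalline GST, no association
    for p1, p2 in zip(s1_pulses, s2_pulses):
        if p1 and p2:  # coincident stimuli partially amorphize the GST cell
            amorphization = min(1.0, amorphization + step)
    # The association "sticks" once amorphization crosses the learning threshold.
    return amorphization, amorphization >= threshold

print(run_amle([1, 1, 1, 1, 1], [0, 0, 0, 0, 0]))  # ~(0.0, False): s1 alone never associates
print(run_amle([1, 1, 1, 0, 0], [1, 1, 1, 0, 0]))  # ~(0.6, True): repeated coincidence learns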

AMLEs Eliminate Backpropagation and Increase Computational Speed

Traditional neural-network-based AI systems require large datasets for learning, which increases processing and computational costs. Conventional neural networks rely on backpropagation to achieve high-accuracy learning.

According to the University of Oxford, the AMLE eliminates the need for backpropagation by using its memory material to learn patterns and associate similar features in datasets. The absence of backpropagation results in faster AI/ML model training.

For example, while a traditional neural network-based AI system may need up to 10,000 rabbit/non-rabbit images to train a model to recognize a rabbit, the backpropagation-free AMLE can achieve comparable results with five rabbit/non-rabbit image pairs, at significantly lower processing and computational cost.
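The minimal Python sketch below illustrates the general idea of learning without backpropagation. It uses a simple stored-template associative memory with made-up feature vectors; the class, method names, and data are assumptions for illustration, not a description of Oxford’s photonic implementation.

import numpy as np

class AssociativeMemory:
    """Single-pass learner: each example is imprinted directly, no gradient descent."""
    def __init__(self):
        self.traces = []   # stored feature vectors (the "memory material")
        self.labels = []

    def imprint(self, features, label):
        # Learning is a single write: associate the pattern with its label.
        self.traces.append(np.asarray(features, dtype=float))
        self.labels.append(label)

    def recall(self, features):
        # Label a new input by its similarity to the stored associations.
        x = np.asarray(features, dtype=float)
        sims = [float(x @ t) / (np.linalg.norm(x) * np.linalg.norm(t) + 1e-12)
                for t in self.traces]
        return self.labels[int(np.argmax(sims))]

memory = AssociativeMemory()
memory.imprint([0.9, 0.1, 0.8], "rabbit")       # a handful of examples is enough
memory.imprint([0.1, 0.9, 0.2], "not rabbit")
print(memory.recall([0.85, 0.2, 0.7]))          # -> "rabbit"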

For faster computational speeds, the AMLE also employs wavelength-division multiplexing, which allows it to send multiple optical signals at different wavelengths over a single channel while avoiding backpropagation. Because the AMLE sends and receives data as light, these signals can be processed in parallel, resulting in higher information density and faster pattern recognition.
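As a rough illustration of why wavelength-division multiplexing speeds things up, the sketch below (with assumed channel counts and values) shows several inputs riding on separate wavelengths through one element, each scaled by its own weight and summed at a detector, so an entire weighted sum is produced in one parallel pass.

import numpy as np

wavelengths_nm = np.array([1550.0, 1550.8, 1551.6, 1552.4])  # one channel per wavelength
inputs  = np.array([0.2, 0.9, 0.4, 0.7])    # signal amplitude carried by each channel
weights = np.array([0.5, 1.0, 0.25, 0.75])  # per-channel transmission (0..1)

# All channels propagate through the same element simultaneously; the detector
# integrates across wavelengths, so the weighted sum arrives in a single shot.
detector_reading = float(np.sum(weights * inputs))
print(detector_reading)  # 1.625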

According to Professor Cheng, a co-author of the AMLE research, the newly developed design cannot completely replace conventional neural networks, but it can supplement them: the AMLE device significantly accelerates optical processing in high-volume, simpler dataset learning tasks.

AMLE Development on a Photonic Platform

The University of Oxford researchers implemented an AMLE on a photonic platform. Using this platform, the team demonstrated the operation and effectiveness of a backpropagation-free, single-layer-weight artificial neural network architecture.

The researchers found that two different inputs can produce the same output if they are applied simultaneously within an established optical delay. Provided the light signals do not interfere, this mechanism could allow multiple data streams, including streams carried on different wavelengths, to be associated over a single element.

According to the University of Oxford, this research could pave the way for next-generation Machine Learning algorithms and architectural advancements.

