Human Brain-like Transistors Developed

Scientists have created a transistor that stores and processes information in the same way the human brain does, allowing it to perform cognitive tasks that most artificial intelligence (AI) systems struggle with today.

This device, known as a “synaptic transistor,” is modeled after the architecture of the human brain, in which processing and memory are fully integrated in the same place. In traditional computing architecture, by contrast, the CPU and memory are physically separate units.

According to Mark Hersam, research co-leader and professor of materials science and engineering at Northwestern University, the brain has a fundamentally different design than a digital computer. When a computer attempts to handle many tasks at once, data moves back and forth between the microprocessor and memory, which consumes a lot of energy and creates a bottleneck.

Because it fully integrates memory and processing, the synaptic transistor can handle data quickly and with much greater energy efficiency, according to a study published on December 20. The scientists said the need for this new kind of computing architecture stems from the enormous energy that traditional electronics would consume in the big-data era, as demand for AI workloads continues to rise.
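The study’s circuit details aren’t described here, so the following is only an illustrative sketch of the general idea. A common way to picture computing where memory and processing coincide is a crossbar of synaptic devices: the stored conductances are the memory, and they simultaneously perform a matrix-vector multiply through Ohm’s and Kirchhoff’s laws, with no data shuttled to a separate processor. All the numbers below are made up for illustration.

```python
# Hypothetical sketch of in-memory computing with a synaptic-device crossbar.
# Nothing here comes from the paper; it only illustrates the general idea that
# the stored state (conductances) IS the memory and also performs the compute.
import numpy as np

rng = np.random.default_rng(0)

# Each crossbar cell stores a weight as a device conductance (siemens).
# In a von Neumann machine these weights would live in separate DRAM and be
# fetched to the processor for every multiply; here they never move.
conductances = rng.uniform(1e-6, 1e-5, size=(4, 3))  # 4 inputs x 3 outputs

# Input activations are applied as voltages on the crossbar rows.
voltages = np.array([0.1, 0.0, 0.2, 0.05])  # volts

# Ohm's law does the multiplications and Kirchhoff's current law does the
# sums: the column currents are the matrix-vector product, computed in place.
currents = voltages @ conductances  # amperes, one per output column

print(currents)
```

In this picture, the energy cost of fetching weights from a separate memory, the bottleneck Hersam describes, simply disappears, because the weights never leave the devices that compute with them.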

According to the researchers, synaptic transistors have been built before, but earlier devices worked only at very low temperatures. The new transistor, in contrast, is made from materials that function at room temperature.

Whereas conventional electronics are built by patterning transistors onto a silicon wafer, the researchers created the new synaptic transistor by stacking bilayer graphene (BLG) and hexagonal boron nitride (hBN), then deliberately twisting the layers to create a moiré pattern.

Rotating one layer relative to the other produced unique electrical properties that neither layer exhibits on its own. Getting the transistor to operate at room temperature required a precise twist angle and nearly perfect alignment between the hBN and BLG layers.
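The article doesn’t state the twist angle the team used, but the sensitivity it describes follows from the standard geometry of moiré patterns: for two identical lattices rotated by a small angle, the superlattice period is the lattice constant divided by twice the sine of half the twist angle, so the pattern, and with it the electronic behavior, changes dramatically with tiny changes in twist. Here is a quick back-of-the-envelope sketch (it ignores the small graphene–hBN lattice mismatch, which also contributes to the moiré pattern, and the angles are illustrative, not from the paper):

```python
# Back-of-the-envelope moire superlattice period for two identical lattices
# twisted by a small angle: period = a / (2 * sin(theta / 2)).
# Simplification: the ~1.8% graphene-hBN lattice mismatch is ignored here.
import math

a = 0.246  # graphene lattice constant in nanometers

for theta_deg in (0.5, 1.1, 2.0, 5.0):
    theta = math.radians(theta_deg)
    period = a / (2 * math.sin(theta / 2))
    print(f"twist {theta_deg:4.1f} deg -> moire period ~ {period:5.1f} nm")
```

Halving the twist angle roughly doubles the moiré period, which is why the fabrication demands such precise alignment.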

Before testing the chip, the researchers trained it on data so that it could identify patterns. They then fed the chip fresh sequences that resembled the training set but were distinct from it. Most machine learning systems struggle to carry out this kind of associative learning efficiently.

If AI is meant to emulate human reasoning, then classifying data, which is essentially sorting it into bins, is one of the lowest-level tasks, according to Hersam. The team’s objective is to move AI technology toward higher-order thinking. Because real-world conditions are often more complex than current AI algorithms can handle, the researchers tested their new devices under more complicated conditions to confirm their advanced capabilities.

In one exercise, the researchers trained the device to recognize the pattern 000. They then showed it similar patterns, such as 111 and 101. Even though 000 and 111 are different sequences, the device was able to work out that both consist of three identical digits in a row.

Although this seems straightforward, current AI technologies struggle with this kind of cognitive reasoning. In other tests, the researchers threw “curveballs” at the device by feeding it incomplete patterns. Even then, they noted, the chip continued to exhibit associative learning.
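The paper’s training scheme isn’t reproduced here, but a classic Hopfield associative memory gives a rough software analogue of what these experiments test: store 000 and 111 as patterns, then present a partial “curveball” input and watch the network complete it. Everything below, including the +1/−1 encoding and the use of 0 for a missing digit, is an illustrative assumption, not the device’s actual mechanism.

```python
# Hypothetical software analogue of the associative-learning demo, using a
# classic Hopfield network; the actual device realizes something like this in
# hardware, and the paper's method is not reproduced here. Bits are encoded
# as +1/-1, and a 0 marks a missing digit in a partial ("curveball") input.
import numpy as np

patterns = np.array([[ 1,  1,  1],    # "111"
                     [-1, -1, -1]])   # "000"

# Hebbian learning: each stored pattern strengthens the couplings between
# the units it activates together.
n = patterns.shape[1]
W = (patterns.T @ patterns).astype(float) / n
np.fill_diagonal(W, 0.0)

def recall(x, steps=5):
    """Iteratively settle the network into the nearest stored pattern."""
    x = x.astype(float)
    for _ in range(steps):
        h = W @ x
        # Each unit takes the sign of its input; on a tie, it keeps its state.
        x = np.where(h > 0, 1.0, np.where(h < 0, -1.0, x))
    return x

partial = np.array([1, 1, 0])   # "11?" -- last digit unknown
print(recall(partial))          # -> [1. 1. 1.], i.e. completes to "111"
```

A Hebbian weight matrix like this is one simple way to get associative recall from stored connection strengths; the moiré device presumably achieves something analogous through its tunable synaptic conductances.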

So far, the researchers have built the moiré synaptic transistor only from hBN and BLG. However, a wide variety of other two-dimensional materials can be combined into moiré heterostructures, so the team believes the field of moiré neuromorphic computing is still in its infancy and that its potential has barely begun to be explored.

According to Hersam, the characteristics of this experimental transistor could pave the way for highly energy-efficient processors that power advanced AI and machine learning systems.
