Taking inspiration from the human brain, researchers have developed a new synaptic transistor capable of higher-level thinking.
Designed by researchers at Northwestern University, Boston College and the Massachusetts Institute of Technology (MIT), the device simultaneously processes and stores information just like the human brain. In new experiments, the researchers demonstrated that the transistor goes beyond simple machine learning tasks to categorize data and is capable of performing associative learning.
Although previous studies have leveraged similar strategies to develop brain-like computing devices, those transistors cannot operate outside of cryogenic temperatures. The new device, in contrast, is stable at room temperature. It also operates at fast speeds, consumes very little power, and retains stored information even when power is removed, making it ideal for real-world applications.
The study will be published Wednesday (December 20) in the journal Nature.
“The brain has a fundamentally different architecture than a digital computer,” said Northwestern’s Mark C. Hersam, who co-led the research. “In a digital computer, data moves back and forth between a microprocessor and memory, which consumes a lot of energy and creates a bottleneck when trying to multitask. On the other hand, in the brain, memory and information processes are localized and fully integrated, resulting in orders of magnitude higher energy efficiency. Our synaptic transistor similarly achieves simultaneous memory and information processing functionality to more closely mimic the brain.”
Hersam is the Walter P. Murphy Professor of Materials Science and Engineering at Northwestern’s McCormick School of Engineering. He is also chair of the materials science and engineering department, director of the Materials Research Science and Engineering Center, and a member of the International Institute for Nanotechnology. Hersam co-led the research with Qiong Ma of Boston College and Pablo Jarillo-Herrero of MIT.
Recent advances in artificial intelligence (AI) have prompted researchers to develop computers that work more like the human brain. Conventional digital computing systems have separate processing and storage units, so data-intensive tasks consume large amounts of power. With smart devices constantly collecting massive amounts of data, researchers are trying to discover new ways to process it all without consuming an increasing amount of energy. Currently, the memory resistor, or “memristor,” is the most well-developed technology that can perform a combined processing and memory function. But memristors still suffer from energy-expensive switching.
“For several decades, the paradigm in electronics was to build everything out of transistors and use the same silicon architecture,” Hersam said. “Significant progress has been made by simply packing more and more transistors into integrated circuits. You can’t deny the success of this strategy, but it comes at the cost of high power consumption, especially in the current era of big data. We need to rethink computing hardware, especially for AI and machine learning tasks.”
To rethink this paradigm, Hersam and his team explored new advances in the physics of moiré patterns, a type of geometric pattern that results when two patterns are superimposed on each other. When two-dimensional materials are stacked, new properties appear that do not exist in just one layer. And when these layers are twisted to form a moiré pattern, unprecedented tuning of electronic properties becomes possible.
For the new device, the researchers combined two different types of atomically thin materials: bilayer graphene and hexagonal boron nitride. When deliberately stacked and twisted, the materials formed a moiré pattern. By rotating one layer relative to the other, the researchers could achieve different electronic properties in each graphene layer, even though they are only separated by atomic-scale dimensions. With the right choice of rotation, the researchers harnessed the physics of moiré for neuromorphic functionality at room temperature.
“With twist as a new design parameter, the number of permutations is huge,” Hersam said. “Graphene and hexagonal boron nitride are very similar structurally but just different enough to get extremely strong moiré effects.”
To test the transistor, Hersam and his team trained it to recognize similar — but not identical — patterns. Just earlier this month, Hersam unveiled a new nanoelectronic device capable of analyzing and categorizing data in an energy-efficient way, but his new synaptic transistor takes machine learning and artificial intelligence a step further.
“If AI is intended to mimic human thinking, one of the lower-level tasks would be data classification, which is just sorting into bins,” Hersam said. “Our goal is to push AI technology toward higher-level thinking. Real-world conditions are often more complex than current AI algorithms can handle, so we tested our new devices under more complex conditions to verify their advanced capabilities.”
First, the researchers showed the device a pattern: 000 (three zeros in a row). Then they asked the AI to identify similar patterns, such as 111 or 101. “If we train it to detect 000 and then give it 111 and 101, it knows 111 is more similar to 000 than 101,” Hersam explained. “000 and 111 are not exactly the same, but both are three of the same digit in a row. Recognizing that similarity is a form of higher-level cognition known as associative learning.”
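The distinction being drawn here can be made concrete with a toy comparison (a conceptual sketch only, not a model of how the transistor itself computes): a naive digit-by-digit comparison would actually rate 101 as closer to 000, since they differ in fewer positions, whereas a higher-level feature like “all digits alike” is what groups 111 with 000. The function names below are illustrative, not from the study.

```python
# Toy illustration of low-level vs. associative similarity on 3-digit patterns.
# Hypothetical sketch only; the device's physics is not modeled here.

def hamming_distance(a: str, b: str) -> int:
    """Low-level comparison: count positions where the patterns differ."""
    return sum(x != y for x, y in zip(a, b))

def is_uniform(p: str) -> bool:
    """Higher-level feature: are all digits in the pattern the same?"""
    return len(set(p)) == 1

trained = "000"

# Digit-by-digit, 101 looks closer to 000 (2 mismatches vs. 3) ...
print(hamming_distance(trained, "111"))  # 3
print(hamming_distance(trained, "101"))  # 2

# ... but the shared "three of the same digit" feature is what
# associates 111 with 000, mirroring the associative learning
# described in the experiment above.
print(is_uniform("000"), is_uniform("111"), is_uniform("101"))  # True True False
```

Recognizing patterns by shared abstract features rather than raw element-wise agreement is exactly what distinguishes associative learning from simple bin-sorting classification.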
In experiments, the new synaptic transistor successfully recognized similar patterns, displaying its associative memory. Even when the researchers threw curveballs, such as giving it incomplete patterns, it still successfully demonstrated associative learning.
“Current AI can be easily confused, which can cause big problems in some contexts,” Hersam said. “Imagine if you’re using a self-driving vehicle and the weather conditions get worse. The vehicle might not be able to interpret the more complex sensor data like a human driver could. But even when we gave imperfect input to our transistor, it could still detect the right answer.”
The study, “Synaptic Moiré Transistor with Room-Temperature Neuromorphic Functionality,” was primarily supported by the National Science Foundation.