© Mark Ollig
AI (Artificial Intelligence) and machine learning have seen notable advancements with cutting-edge interactive models like ChatGPT.
However, AI still lacks hardware and software that can generate, use, and store long- and short-term memory the way the human brain does.
A research team led by Shantanu Chakrabartty in the Preston M. Green Department of Electrical and Systems Engineering at Washington University in St. Louis has created a prototype device that mimics the dynamic synapses of the human brain.
Our brain’s synapses transmit signals between neurons and can store complex memories through interactions between different chemical pathways.
The artificial synapses in the neural networks of today’s AI systems are impressive; however, compared with the biological human brain, they are in their technological infancy.
Chakrabartty and his team’s artificial synapse is unique in its ability to mimic some of the human brain’s precision, enabling future AI systems to learn and execute new tasks effectively and continuously.
The newly developed artificial neural device contains two interconnected reservoirs of electrons; the electrons flow between the two chambers through a junction, or artificial synapse.
To create this junction, the research team uses quantum tunneling, specifically, Fowler-Nordheim (FN) quantum tunneling.
FN tunneling allows electrons to jump, or pass, through a triangular energy barrier, altering the barrier’s shape as they do. As a result, it provides a more direct and energy-efficient connection than alternative methods, which must be simplified for current computer modeling.
I was drawn to how the quantum-mechanical phenomenon of tunneling passes particles through barriers that are normally impassable.
Fowler-Nordheim tunneling is a type of quantum tunneling used in solid-state electronics.
Imagine, if you will, an electron confined behind an energy barrier.
In a device using FN tunneling, the particle would “tunnel” through the barrier via quantum mechanics and appear on the other side as if by magic.
Fowler-Nordheim tunneling describes how a strong electric field drives electrons through a thin insulating layer, such as the gate insulator of a FET (field-effect transistor), to control current flow in a semiconductor.
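For curious readers, the FN tunneling current is commonly approximated by a formula of the form J = A·F²·exp(−B/F), where F is the electric field across the barrier. The sketch below is illustrative only; the constants A and B are placeholder values I chose for demonstration, since real values depend on the barrier material and height.

```python
import math

# Illustrative sketch of the Fowler-Nordheim current-density formula:
#   J = A * F^2 * exp(-B / F), with F the electric field across the barrier.
# A and B below are hypothetical placeholder constants, not measured values.
A = 1.54e-6   # A/V^2 (hypothetical pre-factor)
B = 2.59e10   # V/m   (hypothetical barrier-shape constant)

def fn_current_density(field_v_per_m: float) -> float:
    """Approximate FN tunneling current density for a given electric field."""
    return A * field_v_per_m**2 * math.exp(-B / field_v_per_m)

# The exponential term makes the current extremely sensitive to the field:
low = fn_current_density(1e9)   # weaker field
high = fn_current_density(2e9)  # doubling the field
print(high / low)               # current grows by orders of magnitude
```

The steep exponential dependence is why small changes in the applied field give such fine control over the tunneling current, down to the single-electron precision Chakrabartty describes.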
“The beauty of this is that we can control this device up to a single electron because we precisely designed this quantum mechanical barrier,” states Chakrabartty.
Energize.
Yes, dear readers, for me, this transference of electrons sends me back to the transporter room on the USS Enterprise.
The Fowler-Nordheim tunneling mechanism controls the current flow, allowing electrons to pass through barriers they usually couldn’t cross.
Chakrabartty and his team of doctoral students created a prototype of 128 hourglass-shaped devices on a silicon chip less than one millimeter (0.0394 inches) across.
Then there are the artificial memory synapses.
“Our research shows that the FN synapse performs near-optimally in terms of synaptic lifetime and memory consolidation,” says Chakrabartty.
“This artificial synapse can tackle some of the challenges of continual learning, retaining what it has previously learned while allowing for both short-term and long-term memory on the same device,” he added.
According to Chakrabartty, their artificial synapse device uses minimal energy since it requires only a few electrons.
“In our design, we set a fixed number of electrons and don’t need to supply additional energy, as the electrons flow according to the physics of the device. By limiting the flow of electrons, we ensure that our device can operate continuously for extended periods,” Chakrabartty explained.
Energy requirements for advanced AI computations are rapidly increasing. For example, the next generation of AI models will require nearly 200 terajoules of energy to train just one system.
A terajoule is a unit of energy equal to one trillion joules.
Terajoule measurements are used in energy-intensive industries such as natural gas and electrical power generation.
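To put 200 terajoules in perspective, here is a quick back-of-envelope conversion into kilowatt-hours and household-years. The average household figure is an assumption I supply for illustration, not a number from the research.

```python
# Back-of-envelope conversion for the 200-terajoule AI training figure.
TJ_IN_JOULES = 1e12       # one terajoule = one trillion joules
JOULES_PER_KWH = 3.6e6    # one kilowatt-hour = 3.6 million joules

training_energy_j = 200 * TJ_IN_JOULES
training_energy_kwh = training_energy_j / JOULES_PER_KWH

# Assumed average annual US household electricity usage (illustrative only).
avg_home_kwh_per_year = 10_700

print(round(training_energy_kwh))  # roughly 55.6 million kWh
print(round(training_energy_kwh / avg_home_kwh_per_year))
```

Under these assumptions, training one next-generation model consumes about as much electricity as several thousand homes use in a year.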
Of course, today’s AI systems are nowhere near matching the human brain’s capacity, which has an electrochemical communications network of 100 billion neurons connected by nearly 1,000 trillion synapses.
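Those two figures imply a remarkable average fan-out per neuron, which a one-line calculation makes plain:

```python
# Average connectivity implied by the brain figures above.
neurons = 100e9        # 100 billion neurons
synapses = 1_000e12    # nearly 1,000 trillion synapses

print(synapses / neurons)  # about 10,000 synapses per neuron on average
```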
The FN-Synapse model is a synaptic device based on differential quantum tunneling that achieves nearly optimal memory consolidation and protects critical data without requiring additional computational or storage resources. It leverages the unique physics of its hourglass shape for synaptic intelligence and enables continual learning.
The research team’s evaluation shows that FN-Synapse devices outperform other continual-learning methods and provide insights into improving the accuracy of future AI synaptic models.
“We are currently unsure about how to train systems with even half a trillion parameters while maintaining energy sustainability. We need to find new solutions to provide enough energy or develop energy-efficient memory devices to train these large AI models,” Chakrabartty said.
Their research with artificial synaptic memory and quantum tunneling continues.