Analog deep learning, a new branch of artificial intelligence, promises faster computation with lower energy consumption. As researchers push the limits of machine learning, the time, effort, and money required to train ever more complex neural network models are soaring.
Just as transistors are the essential components of digital processors, programmable resistors are the fundamental building blocks of analog deep learning. By repeating arrays of programmable resistors in intricate layers, researchers have built a network of analog artificial “neurons” and “synapses” that performs computations much like a digital neural network. The network can then be trained to carry out complex AI tasks such as image recognition and natural language processing.
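To make the idea concrete, here is a minimal sketch, not drawn from the paper, of how a crossbar of programmable conductances can stand in for one layer of a neural network; the array sizes, the values, and the crossbar_layer helper are illustrative assumptions rather than the team’s hardware.

```python
# Illustrative sketch (assumed model, not the authors' device): each weight of a
# neural-network layer is stored as the conductance G[i, j] of one programmable
# resistor in a crossbar. Inputs are encoded as row voltages, and the currents
# collected on the columns are the layer's weighted sums.
import numpy as np

rng = np.random.default_rng(0)
n_inputs, n_outputs = 4, 3
G = rng.uniform(0.1, 1.0, size=(n_inputs, n_outputs))  # conductances ~ weights (arbitrary units)

def crossbar_layer(voltages, conductances):
    """Ohm's law per device (I = G * V) plus Kirchhoff's current law per column
    reproduces a matrix-vector product in a single analog step."""
    return voltages @ conductances  # column currents = weighted sums

x = np.array([0.2, -0.5, 0.9, 0.1])   # input voltages
y = np.tanh(crossbar_layer(x, G))     # nonlinearity supplied by the "neuron" circuitry
print(y)
```

Stacking several such crossbars, each followed by a nonlinearity, mirrors the layered structure described above.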
What is the goal of the research team?
An interdisciplinary MIT research team set out to increase the speed of a particular kind of artificial analog synapse it had previously created. By using a practical inorganic material in the fabrication process, the researchers enabled their devices to run about a million times faster than earlier versions, which is also roughly a million times faster than the synapses in the human brain.
This inorganic component also contributes to the resistor’s exceptional energy efficiency. The new material is compatible with silicon production methods, in contrast to materials employed in the earlier iteration of their device. This modification has made it possible to fabricate nanometer-scale devices and may open the door to their incorporation into commercial computing hardware for deep-learning applications.
“With that key insight and the very powerful nanofabrication techniques we have at MIT.nano, we have been able to put these pieces together and demonstrate that these devices are intrinsically very fast and operate with reasonable voltages. This work has really put these devices at a point where they now look really promising for future applications,” explained the senior author Jesús A. del Alamo, the Donner Professor in MIT’s Department of Electrical Engineering and Computer Science (EECS).
“The working mechanism of the device is the electrochemical insertion of the smallest ion, the proton, into an insulating oxide to modulate its electronic conductivity. Because we are working with very thin devices, we could accelerate the motion of this ion by using a strong electric field and push these ionic devices to the nanosecond operation regime,” stated senior author Bilge Yildiz, the Breene M. Kerr Professor in the departments of Nuclear Science and Engineering and Materials Science and Engineering.
“The action potential in biological cells rises and falls with a timescale of milliseconds since the voltage difference of about 0.1 volt is constrained by the stability of water. Here we apply up to 10 volts across a special solid glass film of nanoscale thickness that conducts protons without permanently damaging it. And the stronger the field, the faster the ionic devices,” explained senior author Ju Li, the Battelle Energy Alliance Professor of Nuclear Science and Engineering and professor of materials science and engineering.
Thanks to these programmable resistors, neural networks can be trained far faster and more cheaply than ever before. This could accelerate the pace at which scientists develop deep learning models, which could then be applied to fraud detection, self-driving cars, or medical image analysis.
“Once you have an analog processor, you will no longer be training networks everyone else is working on. You will be training networks with unprecedented complexities that no one else can afford to, and therefore vastly outperform them all. In other words, this is not a faster car; this is a spacecraft,” adds lead author and MIT postdoc Murat Onen.
The research, published in Science, is titled “Nanosecond Protonic Programmable Resistors for Analog Deep Learning.” Co-authors include Frances M. Ross, the Ellen Swallow Richards Professor in the Department of Materials Science and Engineering; postdocs Nicolas Emond and Baoming Wang; and Difei Zhang, an EECS graduate student.
Why is analog deep learning faster?
Analog deep learning is faster and more energy-efficient than its digital counterpart for two key reasons. First, computation is performed in memory rather than in a processor, so enormous amounts of data do not have to be shuttled back and forth between the two. Second, analog processors operate in parallel: because all of the computation happens simultaneously, an analog processor needs no extra time as the matrix grows in size.
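As a rough illustration of that scaling argument (a back-of-the-envelope sketch with assumed numbers, not figures from the article), the count of multiply-accumulate steps a sequential digital processor needs grows with the matrix, while an idealized analog crossbar performs them all in one parallel read:

```python
# Back-of-the-envelope comparison (illustrative assumption: an ideal crossbar
# in which every resistor multiplies and sums concurrently in a single read).
def digital_mac_steps(n_rows, n_cols):
    # A sequential digital processor performs one multiply-accumulate per weight.
    return n_rows * n_cols

def analog_read_steps(n_rows, n_cols):
    # An ideal analog crossbar computes every product and sum at once,
    # so the number of steps does not grow with the matrix size.
    return 1

for n in (64, 512, 4096):
    print(f"{n}x{n} matrix: digital MAC steps = {digital_mac_steps(n, n):>10,}, "
          f"analog read steps = {analog_read_steps(n, n)}")
```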
The key element of MIT’s new analog processor technology is a protonic programmable resistor. These nanoscale resistors are arranged in an array, like squares on a chessboard. (One nanometer is one billionth of a meter.)
In the human brain, learning occurs through the strengthening and weakening of synapses, the connections between neurons. Deep neural networks have long adopted this strategy, with training algorithms programming the network weights. The new processor enables analog machine learning by varying the electrical conductance of its protonic resistors.
The conductance is controlled by the movement of protons. To increase conductance, more protons are pushed into a channel in the resistor; to decrease it, protons are pulled out. This is accomplished using an electrolyte (similar to that of a battery) that conducts protons but blocks electrons.
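A toy model can capture that push/pull picture. The class below is a hypothetical illustration only; the name ProtonicResistor, the step size, and the conductance bounds are assumptions, not the device’s measured behavior.

```python
# Toy model (assumed for illustration, not the authors' device physics): the
# conductance of a protonic resistor is nudged up or down by programming pulses,
# mimicking protons being pushed into or pulled out of the channel through the
# proton-conducting, electron-blocking electrolyte.
class ProtonicResistor:
    def __init__(self, g=0.5, g_min=0.0, g_max=1.0, step=0.01):
        self.g, self.g_min, self.g_max, self.step = g, g_min, g_max, step

    def pulse(self, n):
        """Apply n programming pulses: positive pulses push protons into the
        channel (conductance rises), negative pulses pull them out (it falls)."""
        self.g = min(self.g_max, max(self.g_min, self.g + n * self.step))
        return self.g

r = ProtonicResistor()
print(r.pulse(+5))   # potentiate: conductance increases
print(r.pulse(-3))   # depress: conductance decreases
```

In a training loop, such increments and decrements would play the role of weight updates, with the stored conductances serving directly as the network weights.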
To create a programmable protonic resistor that is extremely fast and highly energy-efficient, the researchers investigated various electrolyte materials. Whereas other devices have used organic compounds, Onen focused on inorganic phosphosilicate glass (PSG).
PSG is essentially silicon dioxide, the powdery desiccant found in the little bags that come in boxes with new furniture to absorb moisture. It has been studied as a proton conductor for fuel cells under humidified conditions, and it is the most widely used oxide in silicon processing. To make PSG, a small amount of phosphorus is added to the silicon dioxide, giving it special characteristics for proton conduction.
Onen hypothesized that an optimized PSG could have high proton conductivity at room temperature without the need for water, which would make it an ideal solid electrolyte for this application. He was right.
Shocking speed
PSG enables rapid proton transport because it contains many nanometer-sized pores whose surfaces provide pathways for proton diffusion. It can also withstand very strong, pulsed electric fields. This is critical, Onen explains, because applying more voltage to the device lets protons move at dizzying speeds.
“The speed certainly was surprising. Normally, we would not apply such extreme fields across devices to prevent them from turning into ash. But instead, protons ended up shuttling at immense speeds across the device stack, specifically a million times faster than we had before. And this movement doesn’t damage anything, thanks to the small size and low mass of protons. It is almost like teleporting. The nanosecond timescale means we are close to the ballistic or even quantum tunneling regime for the proton, under such an extreme field,” said Li.
Because the protons do not damage the material, the resistor can run for millions of cycles without breaking down. The new electrolyte enabled a programmable protonic resistor that is a million times faster than their previous device and operates effectively at room temperature, which is important for incorporating it into computing hardware.
Because PSG is electrically insulating, almost no electric current flows through it as the protons move. Onen adds that this makes the device extremely energy-efficient.
Now that they have demonstrated the effectiveness of these programmable resistors, del Alamo says, the researchers plan to re-engineer them for high-volume manufacturing. They can then study the properties of resistor arrays and scale them up for use in systems.
They also intend to study the materials to remove bottlenecks that limit the voltage required to efficiently transfer protons into, through, and out of the electrolyte.
“Another exciting direction these ionic devices can enable is energy-efficient hardware to emulate the neural circuits and synaptic plasticity rules that are deduced in neuroscience, beyond analog deep neural networks. We have already started such a collaboration with neuroscience, supported by the MIT Quest for Intelligence,” said Yildiz.
“The collaboration that we have will be essential to innovate in the future. The path forward is still going to be very challenging, but at the same time, it is very exciting,” explains del Alamo.
“Intercalation reactions such as those found in lithium-ion batteries have been extensively explored for memory devices. This work demonstrates that proton-based memory devices deliver impressive and surprising switching speed and endurance. It lays the foundation for a new class of memory devices for powering deep learning algorithms,” stated William Chueh, associate professor of materials science and engineering at Stanford University, who was not involved with this study.
“This work demonstrates a significant breakthrough in biologically inspired resistive-memory devices. These all-solid-state protonic devices are based on exquisite atomic-scale control of protons, similar to biological synapses but at orders of magnitude faster rates. I commend the interdisciplinary MIT team for this exciting development, which will enable future-generation computational devices,” added Elizabeth Dickey, the Teddy & Wilton Hawkins Distinguished Professor and head of the Department of Materials Science and Engineering at Carnegie Mellon University, who also was not involved with this study.