An artificial synapse on a chip is able to learn autonomously

Brain-inspired deep learning neural networks have been behind many of the biggest breakthroughs in artificial intelligence seen over the past 10 years.

But a new research project from France's National Center for Scientific Research (CNRS), the University of Bordeaux, and the University of Evry could take these breakthroughs to the next level, thanks to the creation of an artificial synapse on a chip.

“There are many breakthroughs from software companies that use algorithms based on artificial neural networks for pattern recognition,” Dr. Vincent Garcia, a CNRS research scientist who worked on the project, told Digital Trends. “However, as these algorithms are simulated on standard processors, they require a lot of power. Developing artificial neural networks directly on a chip would make this kind of task available to everyone, and much more power-efficient.”

Synapses in the brain function as the connections between neurons. Learning takes place when these connections are reinforced, which happens when the synapses are repeatedly stimulated. The newly developed electronic devices, called “memristors,” emulate the behavior of these synapses by way of a variable resistance that depends on the history of the electrical excitations they receive.
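
To make that behavior concrete, here is a minimal Python sketch of a synapse whose conductance depends on its pulse history. It is purely illustrative: the class name, parameters, and update rule are invented for this example and are not a model of the team's actual device.

```python
# Illustrative sketch (not the CNRS team's device model): a memristor-like
# synapse whose conductance drifts with the history of voltage pulses.

class MemristiveSynapse:
    """Toy two-terminal device: conductance rises with positive pulses
    and falls with negative ones, clipped to a physical range."""

    def __init__(self, g_min=0.1, g_max=1.0, step=0.05):
        self.g_min, self.g_max, self.step = g_min, g_max, step
        self.g = g_min  # conductance plays the role of the synaptic "weight"

    def pulse(self, voltage):
        # The sign of the pulse potentiates or depresses the synapse,
        # so the device remembers its excitation history.
        self.g += self.step if voltage > 0 else -self.step
        self.g = max(self.g_min, min(self.g_max, self.g))
        return self.g

syn = MemristiveSynapse()
for v in [+1, +1, +1, -1]:  # three potentiating pulses, then one depressing
    print(f"pulse {v:+d} -> conductance {syn.pulse(v):.2f}")
```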

“Here we use a specific kind of memristor based on purely electronic effects,” Garcia continued. “In these devices, the active part is a ferroelectric film containing electric dipoles that can be switched with an electric field. Depending on the orientation of these dipoles, the resistance is on or off. In addition, we can control configurations in which domains with up or down dipoles coexist, with intermediate voltage pulses. This gives rise to an analog device with many resistance levels. In our paper, we were able to understand how the resistance of the memristor evolves with voltage pulses and make a model based on the dynamics of ferroelectric domains.”
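
The team's fitted model isn't reproduced here, but a toy sketch can capture the qualitative picture Garcia describes: a fraction of “up” domains that partially switches with each pulse above a coercive voltage, yielding many intermediate resistance levels. The switching law, coercive voltage, and time constant below are assumptions made for illustration only.

```python
# Hedged sketch of the idea described above: resistance is set by the
# fraction "s" of ferroelectric domains pointing "up". The switching
# law is a simplification, not the paper's fitted model.
import math

def apply_pulse(s, voltage, v_c=1.0, tau=1.0, dt=0.1):
    """Move the up-domain fraction s toward 1 (positive pulse above the
    assumed coercive voltage v_c) or toward 0 (negative pulse)."""
    if abs(voltage) < v_c:
        return s  # sub-coercive pulses leave the domains unchanged
    target = 1.0 if voltage > 0 else 0.0
    return target + (s - target) * math.exp(-dt / tau)

def resistance(s, r_on=1e3, r_off=1e6):
    # The two domain populations conduct in parallel, so intermediate
    # values of s give intermediate resistance levels.
    g = s / r_on + (1.0 - s) / r_off
    return 1.0 / g

s = 0.0
for _ in range(5):  # a train of identical positive pulses
    s = apply_pulse(s, +1.5)
    print(f"s = {s:.2f}, R = {resistance(s):.0f} ohms")
```

Each successive pulse switches only part of the remaining domains, which is why the device steps through many analog resistance levels rather than flipping between just two.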

The result was an array of 45 such memristors that learned to detect simple patterns without any assistance, something referred to as “unsupervised learning” in the machine learning community.
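
For readers unfamiliar with the term, the toy sketch below shows unsupervised learning in miniature: two output neurons discover two input patterns with no labels attached, using winner-take-all Hebbian updates. It is a software illustration of the concept only, not the team's on-chip training protocol, and every name and value in it is invented.

```python
# Toy unsupervised-learning sketch (not the team's training protocol):
# a tiny crossbar of synaptic weights learns to separate two patterns
# with winner-take-all Hebbian updates, with no labels involved.
import random

patterns = [
    [1, 1, 1, 0, 0, 0],  # "top bar"
    [0, 0, 0, 1, 1, 1],  # "bottom bar"
]

n_in, n_out, lr = 6, 2, 0.2
weights = [[random.random() for _ in range(n_in)] for _ in range(n_out)]

for _ in range(200):
    x = random.choice(patterns)
    # Winner-take-all: the output neuron with the strongest response fires.
    responses = [sum(w * xi for w, xi in zip(row, x)) for row in weights]
    winner = responses.index(max(responses))
    # Hebbian update: nudge the winner's weights toward the input, the
    # role played on chip by potentiating/depressing memristor pulses.
    weights[winner] = [w + lr * (xi - w) for w, xi in zip(weights[winner], x)]

for i, row in enumerate(weights):
    print(f"neuron {i}:", [round(w, 2) for w in row])
```

Run this and each neuron's weights typically converge toward one of the two patterns, which is the essence of learning without supervision.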

Now that the team can predict the behavior of an individual electronic synapse, the next goal is to develop neural networks on a chip containing hundreds of these ferroelectric memristors. Once that is achieved, the researchers will test the network by connecting it to an event-based camera and attempting to detect moving objects at high speed.

“The final goal of this project would be to integrate this bio-inspired camera in a car to assist the driver when unexpected objects or persons are crossing the road,” Garcia said.

Should all go according to plan, it may not be long before such neural networks are incorporated as a standard part of the processors found in our smartphones and other mobile devices.
