Brain cell differences could be key to learning in humans and AI

Neural networks identifying hand gestures. Credit: Imperial College London

Researchers have found that variability between brain cells might speed up learning and improve the performance of the brain and future AI.

The new study found that simulated networks of brain cells learned faster when the electrical properties of individual cells were varied than when every cell was identical.

They also found that the networks needed fewer of the tweaked cells to achieve the same results, and that the approach uses less energy than models built from identical cells.

The authors say that their findings could teach us about why our brains are so good at learning, and might also help us to build better artificially intelligent systems, such as digital assistants that can recognize voices and faces, or self-driving car technology.

First author Nicolas Perez, a Ph.D. student at Imperial College London’s Department of Electrical and Electronic Engineering, says that “the brain needs to be energy efficient while still being able to excel at solving complex tasks. Our work suggests that having a diversity of neurons in both brains and AI fulfills both these requirements and could boost learning.”

The research is published in Nature Communications.

Researchers tasked AI neural networks with learning and identifying hand gestures. Credit: Imperial College London

Why is a neuron like a snowflake?

The brain is made up of billions of cells called neurons, which are connected by vast ‘neural networks’ that allow us to learn about the world. Neurons are like snowflakes: they look the same from a distance but on further inspection it’s clear that no two are exactly alike.

By contrast, each cell in an artificial neural network—the technology on which AI is based—is identical, varying only in its connectivity. Despite the speed at which AI technology is advancing, its neural networks do not learn as accurately or quickly as the human brain—and the researchers wondered whether this lack of cell variability might be a culprit.

They set out to study whether emulating the brain by varying neural network cell properties could boost learning in AI. They found that the variability in the cells improved their learning and reduced energy consumption.

Senior author Dr. Dan Goodman, also of Imperial's Department of Electrical and Electronic Engineering, said: "Evolution has given us incredible brain functions—most of which we are only just beginning to understand. Our research suggests that we can learn vital lessons from our own biology to make AI work better for us."

Tweaked timing

To carry out the study, the researchers focused on tweaking the “time constant”—that is, how quickly each cell decides what it wants to do based on what the cells connected to it are doing. Some cells will decide very quickly, looking only at what the connected cells have just done. Other cells will be slower to react, basing their decision on what other cells have been doing for a while.
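In spiking-network models of this kind, that idea maps onto the membrane time constant of a leaky integrate-and-fire neuron. The sketch below is a minimal NumPy illustration, not the authors' code: the update rule, the parameter values, and the choice of gamma-distributed time constants are all assumptions made for demonstration. It shows how giving each simulated cell its own time constant makes some neurons track the most recent input while others integrate over longer windows:

```python
import numpy as np

rng = np.random.default_rng(0)

n_neurons = 100
dt = 1e-3          # simulation step: 1 ms
v_threshold = 1.0  # fire a spike when the membrane potential reaches this
v_reset = 0.0      # potential after a spike

# Homogeneous network: every cell shares one membrane time constant.
tau_homogeneous = np.full(n_neurons, 20e-3)  # 20 ms for every cell

# Heterogeneous network: each cell gets its own time constant, so some
# cells "decide quickly" while others average input over a longer period.
tau_heterogeneous = rng.gamma(shape=3.0, scale=10e-3, size=n_neurons)

def lif_step(v, input_current, tau):
    """One Euler step of a layer of leaky integrate-and-fire neurons."""
    # Small tau: v chases the latest input. Large tau: v changes slowly,
    # so the cell's decision reflects input accumulated over time.
    v = v + (dt / tau) * (input_current - v)
    spikes = v >= v_threshold
    v = np.where(spikes, v_reset, v)  # reset the cells that just fired
    return v, spikes

v = np.zeros(n_neurons)
for _ in range(200):  # 200 ms of toy input drive
    current = rng.normal(1.2, 0.5, size=n_neurons)
    v, spikes = lif_step(v, current, tau_heterogeneous)
```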

After varying the cells’ time constants, they tasked the network with performing some benchmark machine learning tasks: to classify images of clothing and handwritten digits; to recognize human gestures; and to identify spoken digits and commands.

The results show that by allowing the network to combine slow and fast information, it was better able to solve tasks in more complicated, real-world settings.

When they changed the amount of variability in the simulated networks, they found that the ones that performed best matched the amount of variability seen in the brain, suggesting that the brain may have evolved to have just the right amount of variability for optimal learning.
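One way to picture "changing the amount of variability" is to hold the mean time constant fixed while widening its spread, then retraining the network at each setting. Continuing the sketch above—again an illustrative parameterization, not the paper's actual protocol—a gamma distribution makes this a short sweep over the coefficient of variation of the time constants:

```python
# Continuing the sketch above: sweep heterogeneity with the mean fixed.
mean_tau = 20e-3
for cv in [0.0, 0.25, 0.5, 1.0]:  # coefficient of variation of tau
    if cv == 0.0:
        taus = np.full(n_neurons, mean_tau)  # fully homogeneous baseline
    else:
        shape = 1.0 / cv**2                  # for a gamma, CV = 1/sqrt(shape)
        taus = rng.gamma(shape, scale=mean_tau / shape, size=n_neurons)
    # ...train the network with these taus and record task accuracy;
    # per the study, brain-like variability gave the best performance.
```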

Nicolas added that they “demonstrated that AI can be brought closer to how our brains work by emulating certain brain properties. However, current AI systems are far from achieving the level of energy efficiency that we find in biological systems.

“Next, we will look at how to reduce the energy consumption of these networks to get AI networks closer to performing as efficiently as the brain.”

“Neural heterogeneity promotes robust learning” by Nicolas Perez-Nieves, Vincent C. H. Leung, Pier Luigi Dragotti, and Dan F. M. Goodman, published 4 October 2021 in Nature Communications.


More information: Nicolas Perez-Nieves et al, Neural heterogeneity promotes robust learning, Nature Communications (2021). DOI: 10.1038/s41467-021-26022-3

Provided by Imperial College London


Citation: Brain cell differences could be key to learning in humans and AI (2021, October 6), retrieved 6 October 2021 from https://techxplore.com/news/2021-10-brain-cell-differences-key-humans.html

This document is subject to copyright. Apart from any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for information purposes only.
