How the Brain Develops: A New Way to Shed Light on Cognition

Summary: A new computational neuroscience study sheds light on how the brain’s cognitive abilities develop and could help shape new AI research.

Source: University of Montreal

The study introduces a neurocomputational model of the human brain that could shed light on how the brain develops complex cognitive abilities and advance neuro-AI research.

Published Sept. 19, the study was carried out by an international group of researchers from the Institut Pasteur and Sorbonne Université in Paris, the CHU Sainte-Justine, Mila – Quebec Artificial Intelligence Institute, and Université de Montréal.

The model, which made the cover of the journal Proceedings of the National Academy of Sciences of the United States of America (PNAS), describes neural development over three hierarchical levels of information processing:

  • the first, sensorimotor level explores how the brain’s inner activity learns patterns from perception and associates them with action;
  • the second, cognitive level examines how the brain contextually combines those patterns;
  • finally, the third, conscious level considers how the brain dissociates from the outside world and manipulates, via memory, learned patterns that are no longer accessible to perception.

The team’s research offers clues to the core mechanisms underlying cognition thanks to the model’s focus on the interplay between two fundamental types of learning: Hebbian learning, which is associated with statistical regularity (i.e., repetition) – often summarized, after psychologist Donald Hebb, as “neurons that fire together, wire together” – and reinforcement learning, which is associated with reward and the neurotransmitter dopamine.
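The two learning rules can be sketched side by side. Here is a minimal, illustrative update step in Python; the variable names, the threshold, and the scalar "reward" gate are assumptions for illustration, not the paper's actual equations:

```python
import numpy as np

def hebbian_update(w, pre, post, eta=0.01):
    """Hebbian rule: a synapse strengthens when pre- and postsynaptic
    activity coincide ("neurons that fire together, wire together")."""
    return w + eta * post * pre

def reward_modulated_update(w, pre, post, reward, eta=0.01):
    """Reinforcement variant: the same coincidence term, but gated by a
    scalar reward signal (a crude stand-in for dopamine)."""
    return w + eta * reward * post * pre

w = np.array([0.2, -0.1, 0.3, 0.0])    # synaptic weights onto one neuron
pre = np.array([1.0, 0.0, 1.0, 0.0])   # presynaptic activity this step
post = float(w @ pre > 0)              # postsynaptic response (threshold 0)

w_hebb = hebbian_update(w, pre, post)               # active synapses grow
w_rl = reward_modulated_update(w, pre, post, -1.0)  # punished trial: they shrink
```

The contrast is the point: the Hebbian rule strengthens whichever synapses were co-active, while the reward-modulated rule applies the same change only in proportion to (and with the sign of) the reward.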

The model solves three tasks of increasing complexity across those levels, from visual recognition to cognitive manipulation of conscious perceptions. Each time, the team introduced a new core mechanism to enable it to progress.

The results highlight two fundamental mechanisms for the multilevel development of cognitive abilities in biological neural networks:

  • synaptic epigenesis, with Hebbian learning at the local scale and reinforcement learning at the global scale;
  • and self-organized dynamics, through spontaneous activity and balanced excitatory/inhibitory ratio of neurons.
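Synaptic epigenesis – selection and stabilization of overproduced synapses under spontaneous activity – can be illustrated with a toy simulation. This is a sketch under assumed parameters (rates, thresholds, and learning constants are all invented for illustration), not the study's model:

```python
import numpy as np

rng = np.random.default_rng(1)

# Overproduce labile synapses; activity will decide which survive.
n_syn = 20
w = np.full(n_syn, 0.5)

# Spontaneous activity: each synapse fires at its own intrinsic rate.
rates = rng.uniform(0.05, 0.95, size=n_syn)

for _ in range(200):
    pre = rng.random(n_syn) < rates       # spontaneous presynaptic spikes
    post = pre.mean() > rates.mean()      # crude postsynaptic response
    # Hebbian selection: co-active synapses strengthen, silent ones decay.
    w = np.clip(w + 0.02 * (pre & post) - 0.005 * ~pre, 0.0, 1.0)

stabilized = w > 0.8   # selected and kept
pruned = w < 0.2       # eliminated by disuse
```

Synapses driven by frequent spontaneous activity end up stabilized while rarely active ones are pruned away – a toy, local-scale version of the selection-and-stabilization process the researchers describe.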

“Our model demonstrates how the neuro-AI convergence highlights biological mechanisms and cognitive architectures that can fuel the development of the next generation of artificial intelligence and even ultimately lead to artificial consciousness,” said team member Guillaume Dumas, an assistant professor of computational psychiatry at UdeM, and a principal investigator at the CHU Sainte-Justine Research Centre.

Reaching this milestone may require integrating the social dimension of cognition, he added. The researchers are now looking at integrating biological and social dimensions at play in human cognition. The team has already pioneered the first simulation of two whole brains in interaction.

Anchoring future computational models in biological and social realities, the team believes, will not only continue to shed light on the core mechanisms underlying cognition but will also provide a unique bridge from artificial intelligence to the only known system with advanced social consciousness: the human brain.

About this computational neuroscience research news

Author: Julie Gazaille
Source: University of Montreal
Contact: Julie Gazaille – University of Montreal
Image: The image is in the public domain

Original Research: Open access.
“Multilevel development of cognitive abilities in an artificial neural network” by Guillaume Dumas et al. PNAS


Abstract


Multilevel development of cognitive abilities in an artificial neural network

Several neuronal mechanisms have been proposed to account for the formation of cognitive abilities through postnatal interactions with the physical and sociocultural environment.

Here, we introduce a three-level computational model of information processing and acquisition of cognitive abilities. We propose minimal architectural requirements to build these levels and describe how the parameters affect their performance and relationships.

The first, sensorimotor level handles local unconscious processing, here during a visual classification task. The second, cognitive level globally integrates the information from multiple local processors via long-range connections and synthesizes it in a global, but still unconscious, manner. The third and cognitively highest level handles the information globally and consciously. It is based on the global neuronal workspace (GNW) theory and is referred to as the conscious level.

We use the trace and delay conditioning tasks to, respectively, challenge the second and third levels. Results first highlight the necessity of epigenesis through the selection and stabilization of synapses at both local and global scales to allow the network to solve the first two tasks.

At the global scale, dopamine appears necessary to properly provide credit assignment despite the temporal delay between perception and reward. At the third level, the presence of interneurons becomes necessary to maintain a self-sustained representation within the GNW in the absence of sensory input.

Finally, while balanced spontaneous intrinsic activity facilitates epigenesis at both local and global scales, the balanced excitatory/inhibitory ratio increases performance. We discuss the plausibility of the model in both neurodevelopmental and artificial intelligence terms.
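The credit-assignment problem across a temporal delay can be sketched with a generic reward-modulated eligibility trace – a standard device in computational neuroscience, not necessarily the paper's exact rule; all names and constants here are illustrative:

```python
import numpy as np

# A fading "eligibility trace" tags recently co-active synapses so that a
# dopamine pulse arriving several steps later can still credit the right ones.
eta, decay = 0.1, 0.9
w = np.zeros(3)
trace = np.zeros(3)

cue = {0: np.array([1.0, 0.0, 0.0])}   # stimulus activates synapse 0 at t = 0
reward_step = 5                        # dopamine pulse arrives at t = 5

for t in range(8):
    pre = cue.get(t, np.zeros(3))
    post = 1.0 if pre.any() else 0.0
    trace = decay * trace + post * pre       # tag active synapses, then fade
    dopamine = 1.0 if t == reward_step else 0.0
    w += eta * dopamine * trace              # learn only when reward arrives
```

Only the cued synapse is credited, scaled down by the delay (w[0] ends at 0.1 * 0.9**5), while synapses that were never active stay at zero – the delayed dopamine signal still assigns credit to the cause.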
