Emergence and reconfiguration of modular structure for artificial neural networks during continual familiarity detection.

Journal: Science Advances
PMID:

Abstract

Advances in artificial intelligence enable neural networks to learn a wide variety of tasks, yet our understanding of the learning dynamics of these networks remains limited. Here, we study the temporal learning dynamics of Hebbian feedforward neural networks performing continual familiarity detection. Drawing inspiration from network neuroscience, we examine the networks' dynamic reconfiguration, focusing on how network modules evolve throughout learning. Through a comprehensive assessment of metrics such as network accuracy, modular flexibility, and distribution entropy across diverse learning modes, our approach reveals several previously unknown patterns of network reconfiguration. We find that the emergence of network modularity is a salient predictor of performance and that modularization strengthens with increasing flexibility throughout learning. These insights not only elucidate the nuanced interplay among network modularity, accuracy, and learning dynamics but also bridge our understanding of learning in artificial and biological agents.
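To make the task concrete, the sketch below shows a generic Hebbian familiarity detector in the spirit the abstract describes: a feedforward readout whose weights are updated Hebbianly toward each presented pattern, so previously seen stimuli yield a higher familiarity score than novel ones. This is a minimal illustration, not the authors' model; the learning rate, decay term, and binary stimuli are all assumptions for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

n_inputs = 100
eta = 0.5    # Hebbian learning rate (assumed value)
decay = 0.99  # weight decay so older memories gradually fade (assumed)

w = np.zeros(n_inputs)  # feedforward weights of the familiarity readout

def familiarity(x):
    """Scalar readout: large when x overlaps previously stored patterns."""
    return w @ x

def present(x):
    """Hebbian update: move the weights toward the presented pattern."""
    global w
    w = decay * w + eta * x

# One random binary pattern is presented (stored); a second stays novel.
seen = rng.choice([0.0, 1.0], size=n_inputs)
novel = rng.choice([0.0, 1.0], size=n_inputs)

present(seen)
score_seen = familiarity(seen)    # high: pattern overlaps the stored trace
score_novel = familiarity(novel)  # lower on average: only chance overlap
```

In a continual setting, `present` would be called on a stream of stimuli and a threshold on `familiarity` would classify each one as old or new; the paper's analysis concerns how the hidden structure of such networks modularizes during this kind of training.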

Authors

  • Shi Gu
  • Marcelo G Mattar
    Department of Bioengineering, University of Pennsylvania, Philadelphia, PA 19104, USA; Department of Psychology, University of Pennsylvania, Philadelphia, PA 19104, USA.
  • Huajin Tang
  • Gang Pan
    College of Computer Science and Technology, Zhejiang University, Hangzhou, Zhejiang, China.