Revealing the mechanisms of semantic satiation with deep learning models.

Journal: Communications Biology
Published Date:

Abstract

Semantic satiation, the loss of a word's or phrase's meaning after it is repeated many times, is a well-known psychological phenomenon. However, the microscopic neural computational principles underlying it remain unknown. In this study, we use a deep learning model of continuous coupled neural networks to investigate the mechanism underlying semantic satiation and to describe this process precisely in terms of neuronal components. Our results suggest that, from a mesoscopic perspective, semantic satiation may be a bottom-up process. This contrasts with existing macroscopic psychological studies, which suggest that semantic satiation is a top-down process; our simulations follow an experimental paradigm similar to that of classical psychology experiments and observe similar results. Satiation of semantic objectives, like the learning process of our network model for object recognition, relies on continual learning and switching between objects. The underlying neural coupling can strengthen or weaken satiation. Taken together, both neural and network mechanisms play a role in controlling semantic satiation.
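To make the idea of repetition-driven satiation and coupling-dependent attenuation concrete, the following is a minimal toy sketch, not the authors' continuous coupled neural network model: two mutually coupled, adapting rate units receive the same stimulus repeatedly, and their peak response fades with repetition, with the coupling constant modulating how the response evolves. All function names and parameter values here are illustrative assumptions.

```python
# Toy sketch (assumed, not the paper's CCNN): repeated stimulation of two
# coupled, adapting units produces a fading response ("satiation").
import numpy as np


def repeated_response(coupling: float, n_repeats: int = 30, dt: float = 0.1) -> np.ndarray:
    """Return the peak response of the coupled pair for each stimulus presentation."""
    a = np.zeros(2)                    # slow adaptation (fatigue) variables, carried across repeats
    peaks = []
    for _ in range(n_repeats):
        x = np.zeros(2)                # activity resets between presentations; adaptation lingers
        peak = 0.0
        for _ in range(int(1.0 / dt)):                        # one stimulus presentation
            drive = 1.0 + coupling * x[::-1]                  # external input + coupling from the other unit
            x = np.maximum(x + dt * (-x + drive - a), 0.0)    # rectified leaky integration
            a += dt * (0.05 * x - 0.01 * a)                   # adaptation builds up, decays slowly
            peak = max(peak, float(x.mean()))
        peaks.append(peak)
    return np.array(peaks)


if __name__ == "__main__":
    for c in (0.2, 0.8):
        p = repeated_response(coupling=c)
        print(f"coupling={c}: 1st repeat {p[0]:.3f} -> 30th repeat {p[-1]:.3f}")
```

Running the sketch shows the peak response declining across repetitions for both coupling values, with the coupling strength shifting how quickly and how strongly the response attenuates, which loosely mirrors the abstract's claim that the underlying neural coupling can strengthen or weaken satiation.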

Authors

  • Xinyu Zhang
    Wenzhou Medical University Renji College, Wenzhou, Zhejiang, China.
  • Jing Lian
    School of Electronic and Information Engineering, Lanzhou Jiaotong University, Lanzhou, Gansu, China.
  • Zhaofei Yu
  • Huajin Tang
  • Dong Liang
    Lauterbur Research Center for Biomedical Imaging, Shenzhen Institutes of Advanced Technology, Chinese Academy of Sciences, Shenzhen, 518055 China.
  • Jizhao Liu
    School of Information Science and Engineering, Lanzhou University, Lanzhou, Gansu, China.
  • Jian K Liu