Predicting artificial neural network representations to learn recognition model for music identification from brain recordings.

Journal: Scientific Reports
Published Date:

Abstract

Recent studies have demonstrated that the representations of artificial neural networks (ANNs) can exhibit notable similarities to cortical representations when both are exposed to identical auditory sensory inputs. In these studies, the ability to predict cortical representations is probed by regressing from ANN representations to cortical representations. Building upon this concept, our approach reverses the direction of prediction: we utilize ANN representations as a supervisory signal to train recognition models on noisy brain recordings obtained through non-invasive measurements. Specifically, we focus on constructing a recognition model for music identification, where electroencephalography (EEG) recordings collected during music listening serve as input. By training an EEG recognition model to predict ANN representations (representations associated with music identification), we observed a significant improvement in classification accuracy. This study introduces a novel approach to developing recognition models for brain recordings in response to external auditory stimuli. It holds promise for advancing brain-computer interfaces (BCI), neural decoding techniques, and our understanding of music cognition. Furthermore, it provides new insights into the relationship between auditory brain activity and ANN representations.
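To make the described training setup concrete, the following is a minimal sketch (not the authors' code) of how ANN representations of the music stimulus could serve as an auxiliary supervisory signal for an EEG classifier. All names, layer sizes, input shapes, and the weighting factor lambda_rep are illustrative assumptions; the paper's actual architecture and loss formulation may differ.

```python
# Hypothetical sketch: an EEG music-identification model trained with an
# auxiliary loss that predicts the ANN (audio-model) representation of the
# stimulus the subject was listening to. Shapes and hyperparameters are
# assumptions, not values from the paper.
import torch
import torch.nn as nn
import torch.nn.functional as F

class EEGRecognitionModel(nn.Module):
    def __init__(self, n_channels=64, n_classes=10, rep_dim=512):
        super().__init__()
        # Temporal convolutional front end over EEG input of shape
        # (batch, n_channels, time_samples).
        self.encoder = nn.Sequential(
            nn.Conv1d(n_channels, 128, kernel_size=25, stride=4), nn.GELU(),
            nn.Conv1d(128, 128, kernel_size=9, stride=2), nn.GELU(),
            nn.AdaptiveAvgPool1d(1), nn.Flatten(),
        )
        self.classifier = nn.Linear(128, n_classes)    # music-identification head
        self.rep_predictor = nn.Linear(128, rep_dim)   # predicts the ANN representation

    def forward(self, eeg):
        h = self.encoder(eeg)
        return self.classifier(h), self.rep_predictor(h)

def training_loss(model, eeg, labels, ann_reps, lambda_rep=1.0):
    """Cross-entropy for music identification plus an MSE term pulling the
    EEG embedding toward the ANN representation of the stimulus."""
    logits, pred_reps = model(eeg)
    loss_cls = F.cross_entropy(logits, labels)
    loss_rep = F.mse_loss(pred_reps, ann_reps)
    return loss_cls + lambda_rep * loss_rep
```

Under this reading, the representation-prediction term acts as regularization grounded in the stimulus: the ANN representations carry music-identity information, so predicting them encourages the EEG encoder to extract stimulus-related structure rather than subject- or noise-specific features.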

Authors

  • Taketo Akama
    Sony Computer Science Laboratories, Inc, Tokyo, Japan. taketo.akama@sony.com.
  • Zhuohao Zhang
    Shanghai Xuhui Central Hospital, Zhongshan-Xuhui Hospital, and the Shanghai Key Laboratory of Medical Epigenetics, the International Co-laboratory of Medical Epigenetics and Metabolism (Ministry of Science and Technology), Institutes of Biomedical Sciences, Fudan University, Shanghai, 200032, China.
  • Pengcheng Li
    Key Laboratory of Experimental Marine Biology, Institute of Oceanology, Chinese Academy of Sciences, Qingdao 266071, China. pcli@qdio.ac.cn.
  • Kotaro Hongo
    Sony Computer Science Laboratories, Inc, Tokyo, Japan.
  • Shun Minamikawa
    Sony Computer Science Laboratories, Inc, Tokyo, Japan.
  • Natalia Polouliakh
    Sony Computer Science Laboratories, Inc, Tokyo, Japan.