Classification of Targets and Distractors in an Audiovisual Attention Task Based on Electroencephalography.

Journal: Sensors (Basel, Switzerland)

Abstract

Within the broader context of improving interactions between artificial intelligence and humans, the question has arisen whether auditory and rhythmic support could increase attention to visual stimuli that do not stand out clearly from an information stream. To this end, we designed an experiment inspired by the pip-and-pop paradigm but more appropriate for eliciting attention and P3a event-related potentials (ERPs). In this study, the aim was to distinguish between targets and distractors based on the subject's electroencephalography (EEG) data. We achieved this objective by employing different machine learning (ML) methods for both individual-subject (IS) and cross-subject (CS) models. Finally, we used saliency maps to investigate which EEG channels and time points the model relied on to make its predictions. We were able to successfully perform the aforementioned classification task for both the IS and CS scenarios, reaching classification accuracies of up to 76%. In accordance with the literature, the model primarily relied on the parietal-occipital electrodes between 200 ms and 300 ms after the stimulus to make its prediction. The findings from this research contribute to the development of more effective P300-based brain-computer interfaces. Furthermore, they validate the EEG data collected in our experiment.
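
As an illustration of the kind of analysis summarized above, the following is a minimal sketch in Python (not the authors' actual code) of epoch-level target-versus-distractor classification from EEG, in the spirit of the individual-subject setup, followed by a rough analogue of the saliency analysis for a linear model. The data shapes, the 250 Hz sampling rate, the channel count, and the injected "P3a-like" effect are hypothetical placeholders; the study's real pipeline, model, and preprocessing are not reproduced here.

import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Hypothetical epoched EEG: 400 trials x 32 channels x 200 samples
# (0-800 ms at an assumed 250 Hz); labels: 1 = target, 0 = distractor.
n_trials, n_channels, n_samples = 400, 32, 200
X = rng.standard_normal((n_trials, n_channels, n_samples))
y = rng.integers(0, 2, size=n_trials)

# Inject a toy positivity on the last (posterior) channels around 200-300 ms
# so that this synthetic example contains a discriminative pattern.
X[y == 1, 24:, 50:75] += 0.5

# Flatten each epoch into a feature vector and classify with a linear model.
clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X.reshape(n_trials, -1), y, cv=5)
print(f"mean cross-validated accuracy: {scores.mean():.2f}")

# For a linear model, the absolute weights play a role similar to a saliency
# map: they indicate which (channel, time) features drive the decision.
clf.fit(X.reshape(n_trials, -1), y)
weights = np.abs(clf[-1].coef_).reshape(n_channels, n_samples)
peak_channel, peak_sample = np.unravel_index(weights.argmax(), weights.shape)
print(f"most influential feature: channel {peak_channel}, ~{peak_sample * 4} ms")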

Authors

  • Steven Mortier
    IDLab-Department of Computer Science, University of Antwerp-imec, Sint-Pietersvliet 7, 2000 Antwerp, Belgium.
  • Renata Turkeš
    IDLab-Department of Computer Science, University of Antwerp-imec, Sint-Pietersvliet 7, 2000 Antwerp, Belgium.
  • Jorg De Winne
    WAVES Research Group, Department of Information Technology, Ghent University, Technologiepark 126, Zwijnaarde, 9052 Ghent, Belgium.
  • Wannes Van Ransbeeck
    WAVES Research Group, Department of Information Technology, Ghent University, Technologiepark 126, Zwijnaarde, 9052 Ghent, Belgium.
  • Dick Botteldooren
    WAVES Research Group, Faculty of Engineering and Architecture, Ghent University, Technologiepark 126, 9052 Gent, Belgium.
  • Paul Devos
    WAVES Research Group, Department of Information Technology, Ghent University, Technologiepark 126, Zwijnaarde, 9052 Ghent, Belgium.
  • Steven Latré
    Department of Computer Science, University of Antwerp-imec, 2000 Antwerp, Belgium.
  • Marc Leman
    Department of Art, Music and Theater Studies, Institute for Psychoacoustics and Electronic Music (IPEM), Ghent University, 9000 Ghent, Belgium.
  • Tim Verdonck
    Department of Mathematics, University of Antwerp-imec, Middelheimlaan 1, 2000 Antwerp, Belgium.