AIMC Topic: Attention

Showing 81 to 90 of 587 articles

MACNet: A Multidimensional Attention-Based Convolutional Neural Network for Lower-Limb Motor Imagery Classification.

Sensors (Basel, Switzerland)
Decoding lower-limb motor imagery (MI) is highly important in brain-computer interfaces (BCIs) and rehabilitation engineering. However, it is challenging to classify lower-limb MI from electroencephalogram (EEG) signals, because lower-limb motions (L...
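The abstract is cut off before the method details, so the following is not the MACNet architecture itself; it is only a minimal sketch of the general idea named in the title, an attention-weighted convolutional block applied to multichannel EEG, assuming PyTorch and made-up shapes (batch, channels, time).

```python
# Minimal sketch of an attention-weighted convolutional block for multichannel
# EEG (batch, channels, time). Illustrative only; layer sizes and the two-class
# head are assumptions, not the MACNet design from the paper.
import torch
import torch.nn as nn

class ChannelAttentionConv(nn.Module):
    def __init__(self, in_channels=22, hidden=16):
        super().__init__()
        self.conv = nn.Conv1d(in_channels, hidden, kernel_size=7, padding=3)
        # Squeeze-and-excitation style attention over feature channels.
        self.attn = nn.Sequential(
            nn.AdaptiveAvgPool1d(1),
            nn.Flatten(),
            nn.Linear(hidden, hidden),
            nn.Sigmoid(),
        )
        self.head = nn.Linear(hidden, 2)  # e.g. two lower-limb MI classes

    def forward(self, x):                 # x: (batch, channels, time)
        feats = torch.relu(self.conv(x))  # (batch, hidden, time)
        weights = self.attn(feats)        # (batch, hidden)
        feats = feats * weights.unsqueeze(-1)
        return self.head(feats.mean(dim=-1))

logits = ChannelAttentionConv()(torch.randn(4, 22, 250))
```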

The use of machine learning to understand the role of visual attention in multi-attribute choice.

Acta psychologica
Whether eye movements (as a measure of visual attention) contribute to the understanding of how multi-attribute decisions are made is still a matter of debate. In this study, we show how machine learning methods can be used to separate the effects o...
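The snippet ends before the method is described. Purely as an illustration of one common way to ask whether gaze adds predictive value, the sketch below compares a choice model with and without gaze-derived features; the feature names and data are hypothetical, not from the study.

```python
# Illustrative only: compare predictive value of attribute features alone vs.
# attributes plus gaze-derived features (e.g. dwell-time shares).
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n = 500
attributes = rng.normal(size=(n, 4))   # e.g. price and quality differences
gaze = rng.normal(size=(n, 2))         # e.g. dwell-time share per option
choice = (attributes[:, 0] + 0.5 * gaze[:, 0] + rng.normal(size=n) > 0).astype(int)

base = cross_val_score(LogisticRegression(), attributes, choice, cv=5).mean()
full = cross_val_score(LogisticRegression(), np.hstack([attributes, gaze]), choice, cv=5).mean()
print(f"accuracy without gaze: {base:.3f}, with gaze: {full:.3f}")
```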

AFSleepNet: Attention-Based Multi-View Feature Fusion Framework for Pediatric Sleep Staging.

IEEE transactions on neural systems and rehabilitation engineering: a publication of the IEEE Engineering in Medicine and Biology Society
The widespread prevalence of sleep problems in children highlights the importance of timely and accurate sleep staging in the diagnosis and treatment of pediatric sleep disorders. However, most existing sleep staging methods rely on one-dimensional r...
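The following is not the AFSleepNet design, only a minimal sketch of the generic idea in the title, attention-weighted fusion of two feature views (for example a raw-signal view and a time-frequency view) before stage classification; dimensions are assumptions.

```python
# Minimal sketch of attention-based fusion of two feature views into one
# representation for sleep staging. Sizes are assumptions, not AFSleepNet's.
import torch
import torch.nn as nn

class AttentiveViewFusion(nn.Module):
    def __init__(self, dim=64, n_stages=5):
        super().__init__()
        self.score = nn.Linear(dim, 1)            # scores each view embedding
        self.classifier = nn.Linear(dim, n_stages)

    def forward(self, view_a, view_b):            # each: (batch, dim)
        views = torch.stack([view_a, view_b], dim=1)        # (batch, 2, dim)
        weights = torch.softmax(self.score(views), dim=1)   # (batch, 2, 1)
        fused = (weights * views).sum(dim=1)                # (batch, dim)
        return self.classifier(fused)

logits = AttentiveViewFusion()(torch.randn(8, 64), torch.randn(8, 64))
```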

A machine learning approach to understanding the road and traffic environments of crashes involving driver distraction and inattention (DDI) on rural multilane highways.

Journal of safety research
INTRODUCTION: Driver distraction and inattention (DDI) are major causes of road crashes, especially on rural highways. However, not all instances of distracted or inattentive driving lead to crashes. Previous studies indicate that DDI-related driving...
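The abstract is truncated before the analysis is described. As a generic illustration of the kind of workflow such studies often use, the sketch below fits a tree ensemble on crash-environment features and reads off feature importances; the feature names and data are hypothetical, not the paper's.

```python
# Illustrative workflow sketch: fit a tree ensemble on road/traffic features of
# crash records and inspect which features relate to DDI involvement.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

rng = np.random.default_rng(1)
X = rng.normal(size=(1000, 4))   # e.g. AADT, lane width, speed limit, curvature
y = (X[:, 0] + 0.5 * X[:, 2] + rng.normal(size=1000) > 0).astype(int)  # DDI crash?

model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X, y)
for name, imp in zip(["AADT", "lane_width", "speed_limit", "curvature"],
                     model.feature_importances_):
    print(f"{name}: {imp:.2f}")
```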

UMS-ODNet: Unified-scale domain adaptation mechanism driven object detection network with multi-scale attention.

Neural networks: the official journal of the International Neural Network Society
Unsupervised domain adaptation techniques improve the generalization capability and performance of detectors, especially when the source and target domains have different distributions. Compared with two-stage detectors, one-stage detectors (especial...
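The sketch below is not the UMS-ODNet mechanism; it only illustrates one simple reading of "multi-scale attention", pooling a feature map at several scales, scoring each scale, and recombining them, with assumed channel counts and scales.

```python
# Minimal sketch of multi-scale attention: pool a feature map at several scales,
# score each scale per location, and recombine. Purely illustrative.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MultiScaleAttention(nn.Module):
    def __init__(self, channels=32, scales=(1, 2, 4)):
        super().__init__()
        self.scales = scales
        self.score = nn.Conv2d(channels, 1, kernel_size=1)

    def forward(self, x):                              # x: (batch, C, H, W)
        h, w = x.shape[-2:]
        branches = []
        for s in self.scales:
            pooled = F.avg_pool2d(x, kernel_size=s) if s > 1 else x
            branches.append(F.interpolate(pooled, size=(h, w), mode="nearest"))
        stacked = torch.stack(branches, dim=1)                          # (B, S, C, H, W)
        scores = torch.stack([self.score(b) for b in branches], dim=1)  # (B, S, 1, H, W)
        weights = torch.softmax(scores, dim=1)
        return (weights * stacked).sum(dim=1)                           # (B, C, H, W)

out = MultiScaleAttention()(torch.randn(2, 32, 16, 16))
```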

Deep-learning models reveal how context and listener attention shape electrophysiological correlates of speech-to-language transformation.

PLoS computational biology
To transform continuous speech into words, the human brain must resolve variability across utterances in intonation, speech rate, volume, accents and so on. A promising approach to explaining this process has been to model electroencephalogram (EEG) ...
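The abstract is cut off before the modeling details. As a generic illustration of the broader approach of predicting EEG from speech features, the sketch below fits a ridge-regression forward model on time-lagged speech-envelope features; the data and lag choices are hypothetical and not taken from the paper.

```python
# Generic illustration of a forward model: predict one EEG channel from
# time-lagged speech-envelope features with ridge regression.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(2)
n_samples, n_lags = 2000, 20
envelope = rng.normal(size=n_samples)                  # simulated speech envelope
# Lagged design matrix: each column is the envelope shifted by one step.
X = np.column_stack([np.roll(envelope, lag) for lag in range(n_lags)])
eeg = X @ rng.normal(size=n_lags) + 0.5 * rng.normal(size=n_samples)

model = Ridge(alpha=1.0).fit(X[:1500], eeg[:1500])
print("held-out R^2:", round(model.score(X[1500:], eeg[1500:]), 3))
```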

Distinct connectivity patterns between perception and attention-related brain networks characterize dyslexia: Machine learning applied to resting-state fMRI.

Cortex; a journal devoted to the study of the nervous system and behavior
Diagnosis of dyslexia often occurs in late schooling years, leading to academic and psychological challenges. Furthermore, diagnosis is time-consuming, costly, and reliant on arbitrary cutoffs. On the other hand, automated algorithms hold great poten...
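The sketch below is only a generic pipeline of the kind implied by the title, vectorizing resting-state connectivity matrices and training a classifier; the regions, data, and labels are simulated placeholders, not the study's.

```python
# Illustrative pipeline sketch: vectorize connectivity matrices (upper triangle)
# and classify dyslexia vs. control. All data here are simulated placeholders.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(3)
n_subjects, n_regions = 60, 20
iu = np.triu_indices(n_regions, k=1)            # upper triangle, no diagonal

features = []
for _ in range(n_subjects):
    conn = np.corrcoef(rng.normal(size=(n_regions, 100)))  # fake connectivity
    features.append(conn[iu])
X = np.array(features)
y = rng.integers(0, 2, size=n_subjects)         # dyslexia (1) vs. control (0)

clf = LogisticRegression(max_iter=1000)
print("CV accuracy:", cross_val_score(clf, X, y, cv=5).mean())
```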

Semantic-guided attention and adaptive gating for document-level relation extraction.

Scientific reports
In natural language processing, document-level relation extraction is a complex task that aims to predict the relationships among entities by capturing contextual interactions from an unstructured document. Existing graph- and transformer-based model...
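The abstract stops before describing the model, so the sketch below is not the paper's architecture; it only illustrates the generic combination named in the title, an attention-pooled context vector mixed into an entity-pair representation through an adaptive gate, with made-up dimensions.

```python
# Minimal sketch of adaptive gating: a learned gate decides how much attention-
# pooled document context to mix into an entity-pair representation before
# relation classification. Sizes are assumptions, not the paper's model.
import torch
import torch.nn as nn

class GatedContextRelation(nn.Module):
    def __init__(self, dim=128, n_relations=10):
        super().__init__()
        self.attn = nn.Linear(dim, 1)                 # scores each token
        self.gate = nn.Linear(2 * dim, dim)
        self.classifier = nn.Linear(dim, n_relations)

    def forward(self, pair, tokens):                  # pair: (B, dim), tokens: (B, T, dim)
        weights = torch.softmax(self.attn(tokens), dim=1)     # (B, T, 1)
        context = (weights * tokens).sum(dim=1)               # (B, dim)
        g = torch.sigmoid(self.gate(torch.cat([pair, context], dim=-1)))
        fused = g * pair + (1 - g) * context
        return self.classifier(fused)

logits = GatedContextRelation()(torch.randn(4, 128), torch.randn(4, 30, 128))
```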

GradToken: Decoupling tokens with class-aware gradient for visual explanation of Transformer network.

Neural networks: the official journal of the International Neural Network Society
Transformer networks have been widely used in the fields of computer vision, natural language processing, graph-structured data analysis, etc. Consequently, explanations of the Transformer play a key role in helping humans understand and analyze its deci...
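The following is not the GradToken method itself, only a minimal sketch of the underlying idea of class-specific, gradient-based token relevance (gradient times activation for a chosen class logit), computed on a toy transformer encoder with assumed sizes.

```python
# Minimal sketch of a gradient-based token relevance score for a chosen class,
# the general idea behind class-aware token explanations. Toy model and data.
import torch
import torch.nn as nn

torch.manual_seed(0)
embed = nn.Embedding(100, 32)
encoder = nn.TransformerEncoderLayer(d_model=32, nhead=4, batch_first=True)
head = nn.Linear(32, 3)

tokens = torch.randint(0, 100, (1, 12))
x = embed(tokens)
x.retain_grad()                                   # keep gradients on embeddings
logits = head(encoder(x).mean(dim=1))             # (1, 3)
logits[0, 2].backward()                           # explain class 2

relevance = (x.grad * x).sum(dim=-1).squeeze(0)   # one score per token
print(relevance)
```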

DiamondNet: A Neural-Network-Based Heterogeneous Sensor Attentive Fusion for Human Activity Recognition.

IEEE transactions on neural networks and learning systems
With the proliferation of intelligent sensors integrated into mobile devices, fine-grained human activity recognition (HAR) based on lightweight sensors has emerged as a useful tool for personalized applications. Although shallow and deep learning al...
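The sketch below is not the DiamondNet design; it only illustrates the generic idea of attentive fusion over heterogeneous sensor streams, where each sensor gets its own encoder and attention weights decide how much each sensor contributes to the activity prediction. Sensor types and dimensions are assumptions.

```python
# Minimal sketch of attentive fusion over heterogeneous sensor streams for HAR.
import torch
import torch.nn as nn

class SensorAttentiveFusion(nn.Module):
    def __init__(self, sensor_dims=(3, 3, 1), hidden=32, n_activities=6):
        super().__init__()
        self.encoders = nn.ModuleList([nn.Linear(d, hidden) for d in sensor_dims])
        self.score = nn.Linear(hidden, 1)
        self.classifier = nn.Linear(hidden, n_activities)

    def forward(self, streams):                    # list of (batch, dim_i) tensors
        embs = torch.stack(
            [torch.relu(enc(s)) for enc, s in zip(self.encoders, streams)], dim=1)
        weights = torch.softmax(self.score(embs), dim=1)   # (batch, n_sensors, 1)
        return self.classifier((weights * embs).sum(dim=1))

# e.g. accelerometer (3-axis), gyroscope (3-axis), barometer (1 value)
out = SensorAttentiveFusion()([torch.randn(8, 3), torch.randn(8, 3), torch.randn(8, 1)])
```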