AIMC Topic: Attention

Showing 41 to 50 of 573 articles

EMBANet: A flexible efficient multi-branch attention network.

Neural networks: the official journal of the International Neural Network Society
Recent advances in the design of convolutional neural networks have shown that performance can be enhanced by improving the ability to represent multi-scale features. However, most existing methods either focus on designing more sophisticated attenti...

FLANet: A multiscale temporal convolution and spatial-spectral attention network for EEG artifact removal with adversarial training.

Journal of neural engineering
Removing artifacts, such as noise from muscle or cardiac activity, is a crucial and ubiquitous concern in neurophysiological signal processing, particularly for enhancing the signal-to-noise ratio in electroencephalography (EEG) analysis. Novel metho...

Enhanced electroencephalogram signal classification: A hybrid convolutional neural network with attention-based feature selection.

Brain research
Accurate recognition and classification of motor imagery electroencephalogram (MI-EEG) signals are crucial for the successful implementation of brain-computer interfaces (BCI). However, inherent characteristics in original MI-EEG signals, such as non...

Multi-branch convolutional neural network with cross-attention mechanism for emotion recognition.

Scientific reports
Emotion recognition is an interesting research area because of its wide-ranging applications in education, marketing, and medicine. This study proposes a multi-branch convolutional neural network model based on a cross-attention mechanism (MCN...

A comparative analysis of LSTM models aided with attention and squeeze and excitation blocks for activity recognition.

Scientific reports
Human Activity Recognition (HAR) plays a vital role in various fields, such as healthcare and smart environments. Traditional HAR methods rely on sensor or video data, but sensor-based systems have gained popularity due to their non-intrusive nature. Curre...
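The squeeze-and-excitation block named in the title above is a standard channel-attention primitive. A minimal generic NumPy sketch (not this paper's implementation; `w1`/`w2` are hypothetical reduction and expansion weights for a reduction ratio r):

```python
import numpy as np

def squeeze_excite(x, w1, w2):
    """Generic squeeze-and-excitation gate.

    x  : feature map of shape (C, H, W)
    w1 : reduction weights of shape (C // r, C)
    w2 : expansion weights of shape (C, C // r)
    """
    z = x.mean(axis=(1, 2))            # squeeze: global average pool -> (C,)
    s = np.maximum(w1 @ z, 0.0)        # excitation: FC + ReLU -> (C // r,)
    s = 1.0 / (1.0 + np.exp(-(w2 @ s)))  # FC + sigmoid -> per-channel gates in (0, 1)
    return x * s[:, None, None]        # recalibrate each channel by its gate
```

Because every gate lies in (0, 1), the block can only attenuate channels, never amplify them; the network learns which channels to keep.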

Multi-Label Zero-Shot Learning Via Contrastive Label-Based Attention.

International journal of neural systems
Multi-label zero-shot learning (ML-ZSL) strives to recognize all objects in an image, regardless of whether they are present in the training data. Recent methods incorporate an attention mechanism to locate labels in the image and generate class-spec...

ShadowGAN-Former: Reweighting self-attention based on mask for shadow removal.

Neural networks: the official journal of the International Neural Network Society
Shadow removal remains a challenging visual task aimed at restoring the original brightness of shadow regions in images. Many existing methods overlook the implicit clues within non-shadow regions, leading to inconsistencies in the color, texture, an...
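Reweighting self-attention with a mask, as the title above describes, is commonly done by suppressing attention scores before the softmax. A generic NumPy sketch of scaled dot-product self-attention with score masking (an illustration of the general technique, not ShadowGAN-Former's actual method; the weight matrices are hypothetical):

```python
import numpy as np

def masked_self_attention(X, Wq, Wk, Wv, mask=None):
    """Scaled dot-product self-attention over a sequence X of shape (n, d).

    mask: optional (n, n) boolean array; positions where mask is False
    receive a large negative score and get near-zero attention weight.
    """
    Q, K, V = X @ Wq, X @ Wk, X @ Wv
    scores = Q @ K.T / np.sqrt(K.shape[1])       # (n, n) similarity scores
    if mask is not None:
        scores = np.where(mask, scores, -1e9)    # reweight: suppress masked pairs
    w = np.exp(scores - scores.max(axis=1, keepdims=True))
    w /= w.sum(axis=1, keepdims=True)            # row-wise softmax, rows sum to 1
    return w @ V, w
```

Masking before the softmax (rather than zeroing weights afterwards) keeps each row a proper probability distribution over the unmasked positions.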

Paying more attention on backgrounds: Background-centric attention for UAV detection.

Neural networks: the official journal of the International Neural Network Society
With advances in artificial intelligence, Unmanned Aerial Vehicles (UAVs) offer efficient, flexible support for military reconnaissance, traffic monitoring, and crop analysis. However, UAV detection faces unique challenges due to the UAV's sma...

Towards parameter-free attentional spiking neural networks.

Neural networks: the official journal of the International Neural Network Society
Brain-inspired spiking neural networks (SNNs) are increasingly explored for their potential in spatiotemporal information modeling and energy efficiency on emerging neuromorphic hardware. Recent works incorporate attentional modules into SNNs, greatl...

Quantum mixed-state self-attention network.

Neural networks: the official journal of the International Neural Network Society
Attention mechanisms have revolutionized natural language processing, and combining them with quantum computing aims to advance the technology further. This paper introduces a novel Quantum Mixed-State Self-Attention Network (QMSAN) for natural language...