Automated detection of quiet eye durations in archery using electrooculography and comparative deep learning models.
Journal:
BMC Sports Science, Medicine and Rehabilitation
Published Date:
Aug 9, 2025
Abstract
This study presents a deep learning-based approach for the automated detection of Quiet Eye (QE) durations from electrooculography (EOG) signals in archery. QE, the final fixation or tracking of the gaze before a motor action is executed, is a critical factor in precision sports. Traditional detection methods, which rely on expert evaluation, are inherently subjective, time-consuming, and inconsistent. To overcome these limitations, EOG data were collected from 10 licensed archers during controlled shooting sessions and preprocessed with a wavelet transform and a Butterworth bandpass filter for noise reduction. We implemented and compared a traditional model (a support vector machine, SVM) and five deep learning models (CNN + LSTM, CNN + GRU, Transformer, UNet, and 1D CNN) for QE detection. The CNN + LSTM model achieved the highest accuracy (95%), followed closely by CNN + GRU (93%), demonstrating superior performance in capturing both the spatial and temporal dependencies in the EOG signals. Although the Transformer-based and UNet models performed competitively, they exhibited lower precision in distinguishing QE periods, and the traditional SVM was inferior to all of the deep learning approaches. These results indicate that deep learning provides an effective and scalable solution for objective QE analysis, substantially reducing the dependence on expert annotations. This automated approach can enhance sports training by offering real-time, data-driven feedback to athletes and coaches, and the methodology holds promise for broader applications in cognitive and motor skill assessment across various domains. Future work will focus on expanding the dataset, enabling real-time deployment, and evaluating model generalizability across different skill levels and sports disciplines.
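The preprocessing step described in the abstract (Butterworth bandpass filtering of the raw EOG trace) can be sketched as follows. The abstract does not state the filter's cutoff frequencies, order, or the sampling rate, so the values below (0.5-30 Hz passband, 4th order, 250 Hz sampling) are illustrative assumptions chosen to be typical for EOG work, not the authors' actual configuration.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def bandpass_eog(signal, fs, low=0.5, high=30.0, order=4):
    """Zero-phase Butterworth bandpass filter for a 1D EOG trace.

    Cutoffs, order, and sampling rate are assumed values for
    illustration; the paper's abstract does not specify them.
    """
    nyq = 0.5 * fs
    b, a = butter(order, [low / nyq, high / nyq], btype="band")
    # filtfilt applies the filter forward and backward, avoiding
    # phase distortion of the eye-movement waveform.
    return filtfilt(b, a, signal)

# Synthetic example: a slow 2 Hz eye-movement component
# contaminated with 50 Hz mains interference.
fs = 250  # sampling rate in Hz (assumed)
t = np.arange(0, 4, 1 / fs)
clean = np.sin(2 * np.pi * 2 * t)
noisy = clean + 0.5 * np.sin(2 * np.pi * 50 * t)
filtered = bandpass_eog(noisy, fs)
```

With a 30 Hz upper cutoff, the 50 Hz interference is strongly attenuated while the slow eye-movement component passes through largely unchanged, which is the behavior the denoising stage relies on before segments are fed to the classifiers.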