Transformer-based heart language model with electrocardiogram annotations.

Journal: Scientific Reports
PMID:

Abstract

This paper explores the potential of transformer-based foundation models to detect Atrial Fibrillation (AFIB), an arrhythmia characterized by an irregular heart rhythm without a pattern, in electrocardiogram (ECG) processing. We construct a language whose tokens are derived from heartbeat locations and detect irregular heart rhythms by applying a transformer-based neural network architecture previously used only for building natural language models. Our experiments cover input lengths of 41, 128, 256, and 512 tokens, each representing part of an ECG recording after tokenization. The method consists of pretraining the foundation model on annotated benchmark databases, then fine-tuning it on a much smaller dataset and evaluating it on ECG datasets different from those used for fine-tuning. The best-performing model achieved an F1 score of 93.33% when detecting AFIB in ECG segments of 41 heartbeats under this cross-dataset evaluation with distinct training and testing benchmarks. The results show that a foundation model trained on a large data corpus can be fine-tuned with a much smaller annotated dataset to detect and classify arrhythmia in ECGs. This work paves the way for foundation models to become invaluable assistants to cardiologists in the near future, and it opens the possibility of training foundation models on even more data to achieve even better performance scores.
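The abstract does not specify how heartbeat locations are turned into tokens. A minimal sketch of one plausible scheme, assuming tokens are obtained by uniformly binning the RR intervals (time differences between consecutive beat locations), is shown below; the function name, bin width, and vocabulary size are illustrative assumptions, not details from the paper:

```python
# Hypothetical sketch (not the authors' method): map heartbeat locations
# to a discrete token sequence by quantizing RR intervals.
from typing import List


def tokenize_rr_intervals(beat_times_ms: List[int],
                          bin_ms: int = 50,
                          max_token: int = 40) -> List[int]:
    """Quantize each RR interval (difference between consecutive beat
    times, in milliseconds) into a token id via uniform binning,
    capped at max_token to bound the vocabulary size."""
    tokens = []
    for prev, curr in zip(beat_times_ms, beat_times_ms[1:]):
        rr = curr - prev
        tokens.append(min(rr // bin_ms, max_token))
    return tokens


# A regular rhythm yields a repetitive token sequence, while an
# irregular (AFIB-like) rhythm yields varying tokens.
regular = [0, 800, 1600, 2400, 3200]
irregular = [0, 600, 1550, 2100, 3300]
print(tokenize_rr_intervals(regular))    # → [16, 16, 16, 16]
print(tokenize_rr_intervals(irregular))  # → [12, 19, 11, 24]
```

Under a scheme like this, a sequence of 41 tokens corresponds to a segment of 41 heartbeats, matching the best-performing configuration reported in the abstract.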

Authors

  • Stojancho Tudjarski
    Innovation Dooel, 1000, Skopje, North Macedonia.
  • Marjan Gusev
    Innovation Dooel, 1000, Skopje, North Macedonia. marjan.gushev@finki.ukim.mk.
  • Evangelos Kanoulas
    Informatics Institute, University of Amsterdam, Amsterdam, The Netherlands.