Disclosing neonatal pain in real-time: AI-derived pain sign from continuous assessment of facial expressions.
Journal:
Computers in biology and medicine
PMID:
40024188
Abstract
This study introduces an AI-derived pain sign for continuous neonatal pain assessment, addressing the limitations of existing pain scales and computational approaches. Traditional pain scales, though widely used, are hindered by inter-rater variability, discontinuity, and subjectivity. While AI, particularly deep learning, has shown promise, prior research has largely prioritized model performance over clinical applicability, often delivering static, binary predictions that lack interpretability in clinical practice. To bridge this gap, we developed a real-time pain-sign tracking tool based on facial expression analysis, a primary and non-invasive pain indicator in neonates. Leveraging benchmark datasets (iCOPE, iCOPEvid, and UNIFESP) and deep-learning frameworks (VGG-Face, N-CNN, and ViT-B/16), the models analyze video frames to generate a continuous visual representation of pain probability. Our results reveal the limitations of assigning a single label to a time interval, emphasizing the utility of a continuous monitoring visualization tool. The proposed pain sign effectively tracks dynamic changes in neonatal facial expressions, providing actionable and interpretable insights for healthcare professionals. We organized these insights into a novel classification scheme comprising stable, irregular, unstable, and indeterminate pain signs. By integrating this pain sign into clinical workflows as a potential vital sign, this approach enables personalized pain management and continuous monitoring of both current and historical pain states in neonates, enhancing neonatal care and improving outcomes for these vulnerable patients.
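To illustrate the idea of turning frame-wise model outputs into a continuous, categorized pain sign, the following is a minimal sketch. The abstract does not specify the smoothing method, thresholds, or classification rules, so the moving-average window, the `low`/`high`/`var_limit` cutoffs, and the mapping onto the four categories below are all hypothetical choices for demonstration only, not the authors' method.

```python
from statistics import mean, pstdev

def smooth(probs, window=5):
    """Moving average over a sequence of frame-wise pain probabilities,
    yielding the continuous trace that would be plotted as the pain sign."""
    half = window // 2
    return [
        mean(probs[max(0, i - half): i + half + 1])
        for i in range(len(probs))
    ]

def classify_sign(probs, low=0.3, high=0.7, var_limit=0.15):
    """Map a probability trace to one of the four categories named in the
    abstract. The thresholds and rules here are illustrative assumptions:
      - stable:        low variability, clearly low or clearly high level
      - irregular:     clear level but with large swings
      - unstable:      mid-range level with large swings
      - indeterminate: mid-range level, no clear trend either way
    """
    m, s = mean(probs), pstdev(probs)
    if low < m < high and s >= var_limit:
        return "unstable"
    if s >= var_limit:
        return "irregular"
    if m <= low or m >= high:
        return "stable"
    return "indeterminate"
```

In a real pipeline, `probs` would come from per-frame softmax outputs of one of the cited models (e.g. ViT-B/16), and the smoothed trace would be rendered alongside other vital signs rather than reduced to a single label.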