AIMC Topic: Facial Expression

Showing 21 to 30 of 227 articles

Neural network-based ensemble approach for multi-view facial expression recognition.

PloS one
In this paper, we developed a pose-aware facial expression recognition technique. The proposed technique employed K nearest neighbor for pose detection and a neural network-based extended stacking ensemble model for pose-aware facial expression recog...
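The two-stage idea in this entry (a K-nearest-neighbor pose detector routing faces to pose-aware expression models) can be sketched minimally. Everything below is illustrative: the 10-D landmark-geometry features, the three pose bins, and the synthetic clusters are assumptions, not details taken from the paper.

```python
# Minimal sketch of the first stage of a pose-aware FER pipeline:
# classify head pose with KNN, then route the face to a pose-specific
# expression model. Features, pose bins, and data here are synthetic
# stand-ins, not the paper's actual descriptors.
import numpy as np
from sklearn.neighbors import KNeighborsClassifier

rng = np.random.default_rng(0)

# Pretend each face is summarised by a 10-D landmark-geometry vector,
# with one noisy cluster per pose bin.
poses = ["frontal", "left", "right"]
X_train = rng.normal(size=(90, 10)) + np.repeat(np.arange(3), 30)[:, None]
y_train = np.repeat(poses, 30)

knn = KNeighborsClassifier(n_neighbors=5)
knn.fit(X_train, y_train)

# A new face is assigned a pose, which would select the matching
# expression-recognition model in the second stage.
query = rng.normal(size=(1, 10)) + 2.0  # drawn near the "right" cluster
pose = knn.predict(query)[0]
print(pose)
```

In the full technique a stacking ensemble of neural networks would then handle expression recognition for the detected pose; only the routing step is shown here.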

DCAlexNet: Deep coupled AlexNet for micro facial expression recognition based on double face images.

Computers in biology and medicine
Facial Micro-Expression Recognition (FER) presents challenges due to individual variations in emotional intensity and the complexity of feature extraction. While apex frames offer valuable emotional information, their precise role in FER remains uncl...

Overconfident, but angry at least. AI-based investigation of facial emotional expressions and self-assessment bias in human adults.

BMC psychology
Metacognition and facial emotional expressions both play a major role in human social interactions [1, 2] as inner narrative and primary communicational display, and both are limited by self-monitoring, control and their interaction with personal and...

Disclosing neonatal pain in real-time: AI-derived pain sign from continuous assessment of facial expressions.

Computers in biology and medicine
This study introduces an AI-derived pain sign for continuous neonatal pain assessment, addressing the limitations of existing pain scales and computational approaches. Traditional pain scales, though widely used, are hindered by inter-rater variabili...

Application of Multiple Deep Learning Architectures for Emotion Classification Based on Facial Expressions.

Sensors (Basel, Switzerland)
Facial expression recognition (FER) is essential for discerning human emotions and is applied extensively in big data analytics, healthcare, security, and user experience enhancement. This study presents a comprehensive evaluation of ten state-of-the...

Machine learning classification of active viewing of pain and non-pain images using EEG does not exceed chance in external validation samples.

Cognitive, affective & behavioral neuroscience
Previous research has demonstrated that machine learning (ML) could not effectively decode passive observation of neutral versus pain photographs by using electroencephalogram (EEG) data. Consequently, the present study explored whether active viewin...

A Multimodal Pain Sentiment Analysis System Using Ensembled Deep Learning Approaches for IoT-Enabled Healthcare Framework.

Sensors (Basel, Switzerland)
This study introduces a multimodal sentiment analysis system to assess and recognize human pain sentiments within an Internet of Things (IoT)-enabled healthcare framework. This system integrates facial expressions and speech-audio recordings to evalu...

An Artificial Intelligence Model for Sensing Affective Valence and Arousal from Facial Images.

Sensors (Basel, Switzerland)
Artificial intelligence (AI) models can sense subjective affective states from facial images. Although recent psychological studies have indicated that dimensional affective states of valence and arousal are systematically associated with facial expr...

AI and early diagnostics: mapping fetal facial expressions through development, evolution, and 4D ultrasound.

Journal of perinatal medicine
The development of facial musculature and expressions in the human fetus represents a critical intersection of developmental biology, neurology, and evolutionary anthropology, offering insights into early neurological and social development. Fetal fa...

Advancing Emotionally Aware Child-Robot Interaction with Biophysical Data and Insight-Driven Affective Computing.

Sensors (Basel, Switzerland)
This paper investigates the integration of affective computing techniques using biophysical data to advance emotionally aware machines and enhance child-robot interaction (CRI). By leveraging interdisciplinary insights from neuroscience, psychology, ...