AIMC Topic: Facial Expression

Showing 21 to 30 of 222 articles

Application of Multiple Deep Learning Architectures for Emotion Classification Based on Facial Expressions.

Sensors (Basel, Switzerland)
Facial expression recognition (FER) is essential for discerning human emotions and is applied extensively in big data analytics, healthcare, security, and user experience enhancement. This study presents a comprehensive evaluation of ten state-of-the...

Machine learning classification of active viewing of pain and non-pain images using EEG does not exceed chance in external validation samples.

Cognitive, affective & behavioral neuroscience
Previous research has demonstrated that machine learning (ML) could not effectively decode passive observation of neutral versus pain photographs from electroencephalogram (EEG) data. Consequently, the present study explored whether active viewin...

A Multimodal Pain Sentiment Analysis System Using Ensembled Deep Learning Approaches for IoT-Enabled Healthcare Framework.

Sensors (Basel, Switzerland)
This study introduces a multimodal sentiment analysis system to assess and recognize human pain sentiments within an Internet of Things (IoT)-enabled healthcare framework. This system integrates facial expressions and speech-audio recordings to evalu...

An Artificial Intelligence Model for Sensing Affective Valence and Arousal from Facial Images.

Sensors (Basel, Switzerland)
Artificial intelligence (AI) models can sense subjective affective states from facial images. Although recent psychological studies have indicated that dimensional affective states of valence and arousal are systematically associated with facial expr...

AI and early diagnostics: mapping fetal facial expressions through development, evolution, and 4D ultrasound.

Journal of perinatal medicine
The development of facial musculature and expressions in the human fetus represents a critical intersection of developmental biology, neurology, and evolutionary anthropology, offering insights into early neurological and social development. Fetal fa...

Advancing Emotionally Aware Child-Robot Interaction with Biophysical Data and Insight-Driven Affective Computing.

Sensors (Basel, Switzerland)
This paper investigates the integration of affective computing techniques using biophysical data to advance emotionally aware machines and enhance child-robot interaction (CRI). By leveraging interdisciplinary insights from neuroscience, psychology, ...

A Tutorial on the Use of Artificial Intelligence Tools for Facial Emotion Recognition in R.

Multivariate behavioral research
Automated detection of facial emotions has been a topic of interest in social and behavioral research for decades but has become feasible only recently. In this tutorial, we review three popular artificial intelligence based emotion detection ...

Face readers.

Science (New York, N.Y.)
Artificial intelligence is becoming better than humans at scanning animals' faces for signs of stress and pain. Are more complex emotions next?

An android can show the facial expressions of complex emotions.

Scientific reports
Building trust and rapport is essential for human-robot interaction, and producing emotional expressions on a robot's face is an effective way to achieve this. Androids can show human-like facial expressions of basic emotions. However, whether andr...

A facial expression recognition network using hybrid feature extraction.

PloS one
Facial expression recognition faces great challenges due to factors such as face similarity, image quality, and age variation. Although various existing end-to-end Convolutional Neural Network (CNN) architectures have achieved good classification res...