AI Medical Compendium Topic

Explore the latest research on artificial intelligence and machine learning in medicine.

Facial Expression

Showing 11 to 20 of 205 articles

Face readers.

Science (New York, N.Y.)
Artificial intelligence is becoming better than humans at scanning animals' faces for signs of stress and pain. Are more complex emotions next?

An android can show the facial expressions of complex emotions.

Scientific reports
Trust and rapport are essential for human-robot interaction, and producing emotional expressions on a robot's face is an effective way to foster them. Androids can show human-like facial expressions of basic emotions. However, whether andr...

A facial expression recognition network using hybrid feature extraction.

PloS one
Facial expression recognition faces great challenges due to factors such as face similarity, image quality, and age variation. Although various existing end-to-end Convolutional Neural Network (CNN) architectures have achieved good classification res...

A Multimodal Pain Sentiment Analysis System Using Ensembled Deep Learning Approaches for IoT-Enabled Healthcare Framework.

Sensors (Basel, Switzerland)
This study introduces a multimodal sentiment analysis system to assess and recognize human pain sentiments within an Internet of Things (IoT)-enabled healthcare framework. This system integrates facial expressions and speech-audio recordings to evalu...

An Artificial Intelligence Model for Sensing Affective Valence and Arousal from Facial Images.

Sensors (Basel, Switzerland)
Artificial intelligence (AI) models can sense subjective affective states from facial images. Although recent psychological studies have indicated that dimensional affective states of valence and arousal are systematically associated with facial expr...

Advancing Emotionally Aware Child-Robot Interaction with Biophysical Data and Insight-Driven Affective Computing.

Sensors (Basel, Switzerland)
This paper investigates the integration of affective computing techniques using biophysical data to advance emotionally aware machines and enhance child-robot interaction (CRI). By leveraging interdisciplinary insights from neuroscience, psychology, ...

AI and early diagnostics: mapping fetal facial expressions through development, evolution, and 4D ultrasound.

Journal of perinatal medicine
The development of facial musculature and expressions in the human fetus represents a critical intersection of developmental biology, neurology, and evolutionary anthropology, offering insights into early neurological and social development. Fetal fa...

Overconfident, but angry at least. AI-Based investigation of facial emotional expressions and self-assessment bias in human adults.

BMC psychology
Metacognition and facial emotional expressions both play a major role in human social interactions [1, 2] as inner narrative and primary communicational display, and both are limited by self-monitoring, control and their interaction with personal and...

Disclosing neonatal pain in real-time: AI-derived pain sign from continuous assessment of facial expressions.

Computers in biology and medicine
This study introduces an AI-derived pain sign for continuous neonatal pain assessment, addressing the limitations of existing pain scales and computational approaches. Traditional pain scales, though widely used, are hindered by inter-rater variabili...

Neural dynamics of mental state attribution to social robot faces.

Social cognitive and affective neuroscience
The interplay of mind attribution and emotional responses is considered crucial in shaping human trust and acceptance of social robots. Understanding this interplay can help us create the right conditions for successful human-robot social interaction...