Emotional impact of AI-generated vs. human-composed music in audiovisual media: A biometric and self-report study.
Journal:
PLOS ONE
Published Date:
Jun 25, 2025
Abstract
Generative artificial intelligence (AI) has evolved rapidly, sparking debates about its impact on the visual and sonic arts. Despite its growing integration into creative industries, public opinion remains sceptical, viewing creativity as uniquely human. In music production, AI tools are advancing, yet emotional expression remains largely overlooked in their development and in research. This study examined whether AI-generated music can evoke the same emotional impact as human-created music in audiovisual contexts. Participants (N = 88) watched videos accompanied by different audio tracks across three conditions: human-created music (HCM), AI-generated music produced with more sophisticated and detailed keyword prompts (AI-KP), and AI-generated music produced with simpler, less detailed prompts based on discrete and dimensional emotional values (AI-DP). Biometric data and self-reported affective responses were recorded throughout. The results show that both AI soundtracks elicited wider pupil dilation than human-created music but did not differ significantly from each other. AI-generated music with sophisticated prompts (AI-KP) resulted in a higher blink rate and skin impedance level, markers of attention and cognitive load, while emotional valence remained consistent across conditions. Participants rated AI-generated music as more arousing than HCM, while HCM was perceived as more familiar than both AI conditions.