Radiology Synthetic Confusion: How Generative Artificial Intelligence Amplifies Misunderstandings of Radiologists and Technologists in Patient-Facing Media
Journal:
Canadian Association of Radiologists Journal = Journal de l'Association canadienne des radiologistes
Published Date:
Jun 24, 2025
Abstract
Artificial intelligence (AI) tools, particularly generative models, are increasingly used to depict clinical roles in healthcare. This study evaluates whether generative AI systems accurately differentiate between radiologists and medical radiation technologists (MRTs), 2 roles often confused by patients and providers. We assessed 1380 images and videos generated by 8 text-to-image/video AI models. Five raters evaluated task-role accuracy, attire, equipment, lighting, isolation, and demographics. Statistical tests compared differences across models and roles. MRTs were depicted accurately in 82.0% of outputs, whereas only 56.2% of radiologist images/videos were role-appropriate. Among inaccurate radiologist depictions, 79.1% instead showed MRT tasks. Radiologists were more often depicted as male (73.8%) and White (79.7%), while MRTs were more diverse. Misuse of stethoscopes, the absence of disability and religious markers, and the overuse of business attire for radiologists further reflected bias. Generative AI frequently misrepresents radiologist roles and demographics, reinforcing stereotypes and public confusion. Greater oversight and inclusion standards are needed to ensure equitable AI-generated healthcare content.