The role of patient outcomes in shaping moral responsibility in AI-supported decision making.

Journal: Radiography (London, England : 1995)

Abstract

INTRODUCTION: Integrating decision support mechanisms utilising artificial intelligence (AI) into medical radiation practice introduces unique challenges to accountability for patient care outcomes. AI systems, often seen as "black boxes," can obscure decision-making processes, raising concerns about practitioner responsibility, especially in cases of adverse outcomes. This study examines how medical radiation practitioners perceive and attribute moral responsibility when interacting with AI-assisted decision-making tools.

Authors

  • C Edwards
    School of Clinical Sciences, Queensland University of Technology, Brisbane, Queensland, Australia.
  • A Murphy
    Queensland University of Technology, School of Clinical Sciences, Faculty of Health, Brisbane, QLD, Australia; Medical Imaging and Nuclear Medicine, Children's Health Queensland Hospital and Health Service, South Brisbane, QLD, Australia; Department of Medical Imaging, Princess Alexandra Hospital, Woolloongabba, QLD, Australia.
  • A Singh
    Department of Otorhinolaryngology and Head and Neck Surgery, All India Institute of Medical Sciences, Room no. 4057, ENT Office, 4th floor, Teaching Block, Ansari Nagar, New Delhi, 110029, India.
  • S Daniel
    Centre for Advanced Imaging, The University of Queensland, St Lucia, Australia; Department of Nuclear Medicine, Queensland Specialised PET Services, Royal Brisbane and Women's Hospital, Herston, Australia.
  • C Chamunyonga
    Queensland University of Technology, School of Clinical Sciences, Faculty of Health, Brisbane, QLD, Australia.