The three ghosts of medical AI: Can the black-box present deliver?

Journal: Artificial Intelligence in Medicine

Abstract

Our title alludes to the three Christmas ghosts encountered by Ebenezer Scrooge in A Christmas Carol, who guide Scrooge through the past, present, and future of the Christmas holiday. Similarly, our article takes readers on a journey through the past, present, and future of medical AI. In doing so, we focus on the crux of modern machine learning: its reliance on powerful but intrinsically opaque models. When applied to the healthcare domain, these models fail to meet the transparency needs of their clinician and patient end-users. We review the implications of this failure, and argue that opaque models (1) lack quality assurance, (2) fail to elicit trust, and (3) restrict physician-patient dialogue. We then discuss how upholding transparency in all aspects of model design and model validation can help ensure the reliability and success of medical AI.

Authors

  • Thomas P Quinn
    Centre for Pattern Recognition and Data Analytics, Deakin University, Geelong, Australia.
  • Stephan Jacobs
    Department of Cardiothoracic and Vascular Surgery, Deutsches Herzzentrum Berlin, Berlin, Germany; DZHK (German Centre for Cardiovascular Research), Berlin, Germany.
  • Manisha Senadeera
    Applied Artificial Intelligence Institute (A2I2), Deakin University, Geelong, Australia.
  • Vuong Le
Applied Artificial Intelligence Institute (A2I2), Deakin University, Geelong, Australia. vuong.le@deakin.edu.au.
  • Simon Coghlan
    School of Computing and Information Systems, University of Melbourne, Melbourne, Australia.