Exploring Differential Diagnosis-Based Explainable AI: A Case Study in Melanoma Detection.

Journal: Studies in health technology and informatics

Abstract

Melanoma is a significant global health concern, with rising incidence rates and high mortality when diagnosed late. Artificial Intelligence (AI) models, especially those using deep learning techniques, have shown promising results in melanoma detection. However, the complexity of these models often leads to a lack of transparency, making it difficult for clinicians to understand and trust AI-based diagnoses. This paper presents a novel Explainable AI (XAI) method that aligns with the differential diagnosis techniques commonly used in clinical settings, providing more comprehensive explanations. The novel XAI method and four commonly used XAI methods were evaluated with intended users with respect to perceived usability and trust. We found that the new method was considered more useful than the other methods tested. Notably, the widely used saliency mapping technique received the lowest ratings, performing even worse than providing no explanation at all.

Authors

  • Bjorn Buijing
    HU University of Applied Sciences Utrecht, Research Group Artificial Intelligence, Utrecht, The Netherlands.
  • Danielle Sent
    HU University of Applied Sciences Utrecht, Research Group Artificial Intelligence, Utrecht, The Netherlands.