The need for epistemic humility in AI-assisted pain assessment.

Journal: Medicine, health care, and philosophy
Abstract

It has historically been difficult for physicians, patients, and philosophers alike to quantify pain, given that pain is commonly understood as an individual and subjective experience. Measuring and diagnosing pain is thus often fraught and complicated. New diagnostic technologies assisted by artificial intelligence promise more accurate and efficient diagnosis for patients, but such tools are known to reproduce and further entrench existing problems within the healthcare system, such as poor patient treatment and the replication of systemic biases. In this paper, we argue that there are several ethical-epistemic issues with the potential implementation of these technologies in pain management settings, drawing on the literature on self-trust and on epistemic and testimonial injustice. We conclude by proposing that the adoption of epistemic humility on the part of both AI tool developers and clinicians can contribute to a climate of trust in and beyond the pain management context and lead to a more just approach to the implementation of AI in pain diagnosis and management.

Authors

  • Rachel A Katz
    Institute for the History & Philosophy of Science and Technology, University of Toronto, Toronto, ON, Canada.
  • S Scott Graham
    Department of Rhetoric and Writing, Center for Health Communication, The University of Texas at Austin, Austin, TX, USA.
  • Daniel Z Buchman
    Centre for Addiction and Mental Health, Toronto, ON, Canada. daniel.buchman@utoronto.ca.