Evaluation and Real-World Performance Monitoring of Artificial Intelligence Models in Clinical Practice: Try It, Buy It, Check It.

Journal: Journal of the American College of Radiology (JACR)
Published Date:

Abstract

The pace of regulatory clearance of artificial intelligence (AI) algorithms for radiology continues to accelerate, and numerous algorithms are becoming available for use in clinical practice. End users of AI in radiology should be aware that AI algorithms may not work as expected when used beyond the institutions in which they were trained and that model performance may degrade over time. In this article, we discuss why regulatory clearance alone may not be enough to ensure that AI will be safe and effective in all radiology practices, and we review strategies and available resources for evaluating AI models before clinical use and for monitoring their performance after deployment to ensure efficacy and patient safety.
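As a purely illustrative sketch of the kind of post-deployment monitoring the abstract refers to, the Python snippet below tracks an AI model's sensitivity against the interpreting radiologists' final reads over a rolling window of cases and flags when performance drifts below the level established during local acceptance testing. The data structure, window size, baseline, and tolerance are hypothetical examples chosen for illustration and are not drawn from the article; a real monitoring program would track additional metrics and stratify by patient and exam subgroups.

    # Illustrative sketch only: a minimal rolling-window performance monitor for a
    # binary-output imaging AI model. All names and thresholds here are hypothetical.
    from collections import deque
    from dataclasses import dataclass

    @dataclass
    class CaseResult:
        ai_positive: bool           # AI model's finding for the exam
        radiologist_positive: bool  # interpreting radiologist's final read (reference standard)

    def rolling_sensitivity(results, window=200):
        """Yield the model's sensitivity over a sliding window of recent cases;
        yields None while the window contains no radiologist-positive case."""
        recent = deque(maxlen=window)
        for case in results:
            recent.append(case)
            positives = [c for c in recent if c.radiologist_positive]
            if positives:
                true_positives = sum(1 for c in positives if c.ai_positive)
                yield true_positives / len(positives)
            else:
                yield None

    def flag_drift(results, baseline=0.90, tolerance=0.05, window=200):
        """Return case indices at which windowed sensitivity falls more than
        `tolerance` below the baseline measured during local acceptance testing."""
        flags = []
        for i, sensitivity in enumerate(rolling_sensitivity(results, window)):
            if sensitivity is not None and sensitivity < baseline - tolerance:
                flags.append(i)
        return flags

In practice, flagged cases would trigger review by the practice's quality program and, where appropriate, feedback to the vendor, rather than automated action.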

Authors

  • Bibb Allen
    Department of Radiology, Grandview Medical Center, Birmingham, Alabama. Electronic address: bibb@mac.com.
  • Keith Dreyer
    Department of Radiology, Massachusetts General Hospital, Boston, Massachusetts.
  • Robert Stibolt
    Diagnostic Radiology, Brookwood Baptist Health, Birmingham, Alabama.
  • Sheela Agarwal
    Lenox Hill Radiology, New York, New York.
  • Laura Coombs
    ACR Data Science Institute, Reston, Virginia.
  • Chris Treml
    ACR Data Science Institute, Reston, Virginia.
  • Mona Elkholy
    ACR Data Science Institute, Reston, Virginia.
  • Laura Brink
    American College of Radiology, Reston, Virginia.
  • Christoph Wald
Department of Radiology, Lahey Hospital & Medical Center; Tufts University Medical School; Chair, ACR Informatics Commission.