Medical Text Prediction and Suggestion Using Generative Pretrained Transformer Models with Dental Medical Notes.

Journal: Methods of information in medicine
Published Date:

Abstract

BACKGROUND: Generative pretrained transformer (GPT) models are among the latest large pretrained natural language processing models; they enable model training with limited datasets and reduce dependency on large datasets, which are scarce and costly to establish and maintain. There is rising interest in exploring the use of GPT models in health care.

Authors

  • Joseph Sirrianni
    The Abigail Wexner Research Institute, Nationwide Children's Hospital, Columbus, OH, United States.
  • Emre Sezgin
    The Abigail Wexner Research Institute, Nationwide Children's Hospital, Columbus, OH, United States.
  • Daniel Claman
Pediatric Dentistry, Nationwide Children's Hospital, Columbus, OH, United States.
  • Simon L Linwood
    School of Medicine, University of California Riverside, Riverside, CA, United States.