Reliability and agreement during the Rapid Entire Body Assessment: Comparing rater expertise and artificial intelligence.

Journal: PLOS ONE
PMID:

Abstract

The purpose of this study was to examine rater reliability and the agreement between human raters (novice, intermediate, and expert) and TuMeke Risk Suite when assessing work with the Rapid Entire Body Assessment (REBA). Twenty-one videos of veterinarians acquiring equine radiographs were assessed with REBA by the human raters and by TuMeke Risk Suite (ergonomic artificial intelligence software). Intra-rater reliability of the final REBA score was highest for TuMeke Risk Suite (ICC = 1.0), followed by the expert rater (ICC = 0.89 [0.78-0.95]), and lowest for the novice rater (ICC = 0.51 [0.25-0.74]). Agreement between the expert rater and TuMeke Risk Suite was highest for scores of the trunk, leg, and upper arm, and lowest for the neck, wrist, and lower arm. The REBA tool in TuMeke Risk Suite may benefit less experienced users by improving the reliability of their REBA assessments, especially when the trunk, legs, and upper arm are of primary interest.
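For readers unfamiliar with the intraclass correlation coefficient (ICC) reported above, the sketch below shows one common way such intra-rater reliability estimates are computed in Python with the pingouin library. The specific ICC model used by the study is not stated in the abstract, so the two-way formulation here is an assumption, and the scores are hypothetical placeholders rather than data from the paper.

    # Minimal sketch: estimating intra-rater reliability with an ICC.
    # Assumption: one rater scores the same videos in two sessions; the
    # sessions are treated as "raters" in a two-way model. Scores are
    # hypothetical, not taken from the study.
    import pandas as pd
    import pingouin as pg

    scores = pd.DataFrame({
        "video":   list(range(1, 6)) * 2,
        "session": ["first"] * 5 + ["second"] * 5,
        "reba":    [8, 10, 7, 9, 11,    # first scoring pass
                    8, 9, 7, 10, 11],   # second scoring pass
    })

    icc = pg.intraclass_corr(data=scores, targets="video",
                             raters="session", ratings="reba")
    # Each row is one ICC variant (ICC1, ICC2, ICC3, ...), with 95% CIs,
    # which is the form the point estimates and intervals above take.
    print(icc[["Type", "ICC", "CI95%"]])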

Authors

  • Denise Balogh
    Canadian Center for Rural and Agricultural Health, University of Saskatchewan, Saskatoon, Saskatchewan, Canada.
  • Xiaoxiao Cui
    College of Veterinary Medicine, South China Agricultural University, Guangzhou, Guangdong, China.
  • Monique Mayer
    Small Animal Clinical Sciences, Western College of Veterinary Medicine, University of Saskatchewan, Saskatoon, Saskatchewan, Canada.
  • Niels Koehncke
    Canadian Center for Rural and Agricultural Health, University of Saskatchewan, Saskatoon, Saskatchewan, Canada.
  • Ryan Dueck
    IPM Occupational Therapy, Saskatoon, Saskatchewan, Canada.
  • Angelica E Lang
    Canadian Center for Rural and Agricultural Health, University of Saskatchewan, Saskatoon, Saskatchewan, Canada.