Deep learning-based computer vision to recognize and classify suturing gestures in robot-assisted surgery.

Journal: Surgery
Published Date:

Abstract

BACKGROUND: Our previous work established a taxonomy of suturing gestures during the vesicourethral anastomosis of robotic radical prostatectomy and associated them with tissue tears and patient outcomes. Herein, we train deep learning-based computer vision models to automate the identification and classification of suturing gestures for needle driving attempts.

Authors

  • Francisco Luongo
    Department of Biology and Biological Engineering, Caltech, Pasadena, CA.
  • Ryan Hakim
    Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA.
  • Jessica H Nguyen
    Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA.
  • Animashree Anandkumar
    Department of Computing & Mathematical Sciences, Caltech, Pasadena, CA.
  • Andrew J Hung
Center for Robotic Simulation & Education, Catherine & Joseph Aresty Department of Urology, USC Institute of Urology, University of Southern California, Los Angeles, CA. Electronic address: Andrew.Hung@med.usc.edu.