CholecTriplet2022: Show me a tool and tell me the triplet - An endoscopic vision challenge for surgical action triplet detection.

Journal: Medical Image Analysis
Published Date:

Abstract

Formalizing surgical activities as triplets of the instruments used, actions performed, and target anatomies is becoming a gold-standard approach to surgical activity modeling. The benefit is that this formalization yields a more detailed understanding of tool-tissue interaction, which can be used to develop better artificial intelligence assistance for image-guided surgery. Earlier efforts, including the CholecTriplet challenge introduced in 2021, have put together techniques for recognizing these triplets from surgical footage. Also estimating the spatial locations of the triplets would offer more precise intraoperative context-aware decision support for computer-assisted intervention. This paper presents the CholecTriplet2022 challenge, which extends surgical action triplet modeling from recognition to detection. The task includes weakly-supervised bounding box localization of every visible surgical instrument (or tool), as the key actor, and the modeling of each tool activity in the form of an ‹instrument, verb, target› triplet. The paper describes a baseline method and 10 new deep learning algorithms presented at the challenge to solve the task. It also provides thorough methodological comparisons of the methods, an in-depth analysis of the results across multiple metrics and across visual and procedural challenges, a discussion of their significance, and useful insights for future research directions and applications in surgery.
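The triplet formalization and the detection output can be pictured concretely. The following Python snippet is a minimal sketch, not the challenge's official data format: the record layout, field names, and the normalized (x, y, w, h) box convention are illustrative assumptions, while the vocabulary sizes (6 instruments, 10 verbs, 15 targets) follow the CholecT50 dataset used by the challenge.

    from dataclasses import dataclass

    # Vocabulary sizes of the CholecT50 dataset used by the challenge:
    NUM_INSTRUMENTS = 6   # grasper, bipolar, hook, scissors, clipper, irrigator
    NUM_VERBS = 10        # e.g. dissect, retract, coagulate, ...
    NUM_TARGETS = 15      # e.g. gallbladder, liver, cystic duct, ...

    @dataclass
    class TripletDetection:
        """One detected <instrument, verb, target> triplet in a video frame.

        Hypothetical record layout for illustration; the box is the
        weakly-supervised localization of the instrument (the key actor).
        """
        instrument_id: int   # index into the 6-class instrument vocabulary
        verb_id: int         # index into the 10-class verb vocabulary
        target_id: int       # index into the 15-class target vocabulary
        score: float         # detection confidence in [0, 1]
        box_xywh: tuple      # (x, y, w, h), assumed normalized to [0, 1]

        def __post_init__(self) -> None:
            # Basic sanity checks on the class indices.
            assert 0 <= self.instrument_id < NUM_INSTRUMENTS
            assert 0 <= self.verb_id < NUM_VERBS
            assert 0 <= self.target_id < NUM_TARGETS

    # A frame may contain several concurrent tool-tissue interactions:
    frame_predictions = [
        TripletDetection(0, 1, 0, 0.92, (0.41, 0.35, 0.18, 0.22)),
        TripletDetection(2, 3, 4, 0.67, (0.10, 0.55, 0.25, 0.20)),
    ]

In this sketch, each frame yields a list of such records, one per detected tool-tissue interaction; evaluation would then match the predicted boxes and triplet labels against the ground truth.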

Authors

  • Chinedu Innocent Nwoye
    ICube, University of Strasbourg, CNRS, IHU, Strasbourg, France. nwoye.chinedu@gmail.com.
  • Tong Yu
  • Saurav Sharma
    ICube, University of Strasbourg, CNRS, Strasbourg 67000, France.
  • Aditya Murali
    University of Strasbourg, UMR 7357 CNRS, ICube, Strasbourg, France.
  • Deepak Alapatt
    ICube, University of Strasbourg, CNRS, IHU Strasbourg, France.
  • Armine Vardazaryan
    ICube, University of Strasbourg, CNRS, IHU Strasbourg, France.
  • Kun Yuan
    University of Ottawa, Ottawa, K1N 6N5, Canada. Electronic address: kyuan033@uottawa.ca.
  • Jonas Hajek
    Riwolink GmbH, Germany.
  • Wolfgang Reiter
    Wintegral GmbH, München, Germany. wolfgang.reiter@wintegral.net.
  • Amine Yamlahi
    Department of Computer Assisted Medical Interventions, German Cancer Research Center (DKFZ), Im Neuenheimer Feld 223, 69120, Heidelberg, BW, Germany.
  • Finn-Henri Smidt
    Division of Intelligent Medical Systems (IMSY), German Cancer Research Center (DKFZ), Heidelberg, Germany.
  • Xiaoyang Zou
    Institute of Medical Robotics, School of Biomedical Engineering, Shanghai Jiao Tong University, China.
  • Guoyan Zheng
    Institute for Surgical Technology and Biomechanics, University of Bern, Bern, Switzerland. guoyan.zheng@istb.unibe.ch.
  • Bruno Oliveira
  • Helena R Torres
  • Satoshi Kondo
    Konica Minolta, Inc., Osaka, 569-8503, Japan.
  • Satoshi Kasai
    Niigata University of Health and Welfare, Japan.
  • Felix Holm
Technical University of Munich, Germany.
  • Ege Özsoy
Technical University of Munich, Germany.
  • Shuangchun Gui
    Southern University of Science and Technology, China.
  • Han Li
  • Sista Raviteja
Indian Institute of Technology Kharagpur, India.
  • Rachana Sathish
    Indian Institute of Technology Kharagpur, India.
  • Pranav Poudel
    Redev Technology Ltd, UK.
  • Binod Bhattarai
    University of Aberdeen, Aberdeen, UK.
  • Ziheng Wang
  • Guo Rui
    Intuitive Surgical, USA.
  • Melanie Schellenberg
    Division of Intelligent Medical Systems (IMSY), German Cancer Research Center (DKFZ), Heidelberg, Germany; National Center for Tumor Diseases (NCT), Heidelberg, Germany.
  • Joao L Vilaca
  • Tobias Czempiel
Technical University of Munich, Germany.
  • Zhenkun Wang
Department of Computer Science, City University of Hong Kong, Hong Kong. Electronic address: zwang339@cityu.edu.hk.
  • Debdoot Sheet
    Department of Electrical Engineering, Indian Institute of Technology Kharagpur, West Bengal, India.
  • Shrawan Kumar Thapa
    Nepal Applied Mathematics and Informatics Institute for research (NAAMII), Nepal.
  • Max Berniker
    Intuitive Surgical, USA.
  • Patrick Godau
    Division of Intelligent Medical Systems (IMSY), German Cancer Research Center (DKFZ), Heidelberg, Germany; National Center for Tumor Diseases (NCT), Heidelberg, Germany.
  • Pedro Morais
  • Sudarshan Regmi
    Nepal Applied Mathematics and Informatics Institute for research (NAAMII), Nepal.
  • Thuy Nuong Tran
    Division of Intelligent Medical Systems (IMSY), German Cancer Research Center (DKFZ), Heidelberg, Germany.
  • Jaime Fonseca
  • Jan-Hinrich Nölke
    Division of Intelligent Medical Systems (IMSY), German Cancer Research Center (DKFZ), Heidelberg, Germany; National Center for Tumor Diseases (NCT), Heidelberg, Germany.
  • Estevao Lima
  • Eduard Vazquez
Redev Technology Ltd, UK.
  • Lena Maier-Hein
    German Cancer Research Center (DKFZ), Computer Assisted Medical Interventions, Heidelberg, Germany.
  • Nassir Navab
    Chair for Computer Aided Medical Procedures & Augmented Reality, TUM School of Computation, Information and Technology, Technical University of Munich, Munich, Germany.
  • Pietro Mascagni
    IHU Strasbourg, Strasbourg, France.
  • Barbara Seeliger
    IHU-Strasbourg Institute of Image-Guided Surgery, 1, place de l'Hôpital, 67091, Strasbourg Cedex, France. barbara.seeliger@ihu-strasbourg.eu.
  • Cristians Gonzalez
    University Hospital of Strasbourg, IHU Strasbourg, France.
  • Didier Mutter
Institut Hospitalo-Universitaire, Institute of Image-Guided Surgery, University of Strasbourg, Fédération de Médecine Translationnelle de Strasbourg, Strasbourg, France; Department of Digestive Surgery, Strasbourg University Hospital, Fédération de Médecine Translationnelle de Strasbourg, Strasbourg, France.
  • Nicolas Padoy
    IHU Strasbourg, Strasbourg, France.