Navigation of a robot-integrated fluorescence laparoscope in preoperative SPECT/CT and intraoperative freehand SPECT imaging data: a phantom study.

Journal: Journal of Biomedical Optics

Abstract

Robot-assisted laparoscopic surgery is becoming an established technique for prostatectomy and is increasingly being explored for other types of cancer. Linking intraoperative imaging techniques, such as fluorescence guidance, with the three-dimensional insights provided by preoperative imaging remains a challenge. Navigation technologies may provide a solution, especially when directly linked to both the robotic setup and the fluorescence laparoscope. We evaluated the feasibility of such a setup. Preoperative single-photon emission computed tomography/X-ray computed tomography (SPECT/CT) or intraoperative freehand SPECT (fhSPECT) scans were used to navigate an optically tracked robot-integrated fluorescence laparoscope via an augmented reality overlay in the laparoscopic video feed. The navigation accuracy was evaluated in soft tissue phantoms, followed by studies in a human-like torso phantom. Navigation accuracies found for SPECT/CT-based navigation were 2.25 mm (coronal) and 2.08 mm (sagittal). For fhSPECT-based navigation, these were 1.92 mm (coronal) and 2.83 mm (sagittal). All errors remained below the 1-cm detection limit of fluorescence imaging, allowing refinement of the navigation process using fluorescence findings. These phantom experiments suggest that SPECT-based navigation of the robot-integrated fluorescence laparoscope is feasible and may aid fluorescence-guided surgery procedures.

Authors

  • Matthias Nathanaël van Oosterom
    Leiden University Medical Center, Department of Surgery, Albinusdreef 2, Leiden 2333 ZA, The Netherlands; Leiden University Medical Center, Department of Radiology, Interventional Molecular Imaging Laboratory, Albinusdreef 2, Leiden 2333 ZA, The Netherlands.
  • Myrthe Adriana Engelen
    Leiden University Medical Center, Department of Radiology, Interventional Molecular Imaging Laboratory, Albinusdreef 2, Leiden 2333 ZA, The Netherlands.
  • Nynke Sjoerdtje van den Berg
    Leiden University Medical Center, Department of Radiology, Interventional Molecular Imaging Laboratory, Albinusdreef 2, Leiden 2333 ZA, The Netherlands; The Netherlands Cancer Institute, Antoni van Leeuwenhoek Hospital, Department of Urology, Plesmanlaan 121, Amsterdam 1066 CX, The Netherlands.
  • Gijs Hendrik KleinJan
    Leiden University Medical Center, Department of Radiology, Interventional Molecular Imaging Laboratory, Albinusdreef 2, Leiden 2333 ZA, The Netherlands; The Netherlands Cancer Institute, Antoni van Leeuwenhoek Hospital, Department of Urology, Plesmanlaan 121, Amsterdam 1066 CX, The Netherlands.
  • Henk Gerrit van der Poel
    The Netherlands Cancer Institute, Antoni van Leeuwenhoek Hospital, Department of Urology, Plesmanlaan 121, Amsterdam 1066 CX, The Netherlands.
  • Thomas Wendler
    Technische Universität München, Computer Aided Medical Procedures, Institut für Informatik, I16, Boltzmannstr. 3, Garching bei München 85748, Germany; SurgicEye GmbH, Friedenstraße 18A, München 81671, Germany.
  • Cornelis Jan Hadde van de Velde
    Leiden University Medical Center, Department of Surgery, Albinusdreef 2, Leiden 2333 ZA, The Netherlands.
  • Nassir Navab
    Chair for Computer Aided Medical Procedures & Augmented Reality, TUM School of Computation, Information and Technology, Technical University of Munich, Munich, Germany.
  • Fijs Willem Bernhard van Leeuwen
    Leiden University Medical Center, Department of Radiology, Interventional Molecular Imaging Laboratory, Albinusdreef 2, Leiden 2333 ZA, The Netherlands; The Netherlands Cancer Institute, Antoni van Leeuwenhoek Hospital, Department of Urology, Plesmanlaan 121, Amsterdam 1066 CX, The Netherlands.