Augmented Reality-Guided Visual Servoing for Flexible Endoscope Control

Abstract: This paper presents the integration of visual servoing and augmented reality (AR) for the control of a flexible robotic endoscope. Pre-operatively segmented anatomy can be visualized in the endoscopic view via AR to improve awareness of critical structures and to aid guidance towards regions of interest (ROIs). With AR, the camera view can also be virtually extended beyond the normal field of view, enhancing positional awareness relative to ROIs that are typically out of view. When combined with visual servoing-based control of the flexible endoscope, this allows the surgeon to directly prescribe targets in the image frame for automated navigation. We demonstrate these concepts experimentally within the nasal cavity of an anatomical skull phantom, where a virtual tumor target located in the extended view was successfully tracked. This approach has potential for autonomous guidance of laser-based tumor ablation.
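
As an illustration of the image-based visual servoing idea referenced in the abstract (not code from the paper), the sketch below maps the error between a prescribed image-frame target and a tracked feature to a camera velocity through the standard point-feature interaction (image Jacobian) matrix. The function names, gain, and depth estimate are assumptions for illustration only.

    # Minimal image-based visual servoing (IBVS) sketch for a single point feature.
    # Illustrative only: symbols, gain, and depth estimate are assumptions, not from the paper.
    import numpy as np

    def interaction_matrix(x, y, Z):
        """Interaction (image Jacobian) matrix for a normalized image point at depth Z."""
        return np.array([
            [-1.0 / Z, 0.0,       x / Z, x * y,       -(1.0 + x**2),  y],
            [0.0,      -1.0 / Z,  y / Z, 1.0 + y**2,  -x * y,        -x],
        ])

    def ibvs_camera_velocity(feature, target, Z, gain=0.5):
        """Camera twist (vx, vy, vz, wx, wy, wz) driving the tracked feature
        toward the prescribed image-frame target: v = -gain * L^+ * (s - s*)."""
        error = np.asarray(feature, dtype=float) - np.asarray(target, dtype=float)
        L = interaction_matrix(feature[0], feature[1], Z)
        return -gain * np.linalg.pinv(L) @ error

    # Example: target prescribed at the image center, feature currently offset.
    v = ibvs_camera_velocity(feature=(0.12, -0.05), target=(0.0, 0.0), Z=0.08)
    print(v)

In practice, the computed camera velocity would still have to be mapped to the actuation space of the flexible endoscope (e.g., tendon or joint commands); that mapping is specific to the robot and is beyond this sketch.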
All Author(s) List: Zhengyang Li, Ge Fang, Justin D.L. Ho, Christopher I. Lam, Y.W. Yim, Chan JYK, Ka-Wai Kwok
Year: 2019
Month: 5
Languages: English-United States
