Autonomous Intelligent Navigation for Flexible Endoscopy Using Monocular Depth Guidance and 3-D Shape Planning
Refereed conference paper presented and published in conference proceedings


Other information
Abstract: Recent advances in perception and decision-making for flexible endoscopes have shown great potential in computer-aided surgical interventions. However, owing to modeling uncertainty and inter-patient anatomical variation, efficient and safe navigation in patient-specific scenarios remains challenging. This paper presents a novel data-driven framework with self-contained visual-shape fusion for autonomous intelligent navigation of flexible endoscopes, requiring no a priori knowledge of system models or the global environment. A learning-based adaptive visual servoing controller is proposed to update the eye-in-hand vision-motor configuration online and steer the endoscope, guided by monocular depth estimation via a vision transformer (ViT). To prevent unnecessary and excessive interaction with the surrounding anatomy, an energy-motivated shape planning algorithm is introduced, using whole-endoscope 3-D proprioception from embedded fiber Bragg grating (FBG) sensors. Furthermore, a model predictive control (MPC) strategy is developed to minimize the elastic potential energy flow while simultaneously optimizing the steering policy. Navigation experiments on a robot-assisted flexible endoscope with an FBG fiber in several phantom environments demonstrate the effectiveness and adaptability of the proposed framework.
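The abstract's "learning-based adaptive visual servoing" with an online-updated vision-motor configuration is in the spirit of model-free visual servoing, where an image Jacobian estimate is refined from observed motion. A minimal sketch of that general idea (not the authors' implementation; the Broyden-style rank-1 update, damped least-squares step, and all names here are illustrative assumptions):

```python
import numpy as np

def broyden_update(J, dq, de, lam=0.1):
    """Rank-1 Broyden-style update of the vision-motor Jacobian estimate J,
    given actuator increment dq and the observed feature-error change de."""
    dq = dq.reshape(-1, 1)
    de = de.reshape(-1, 1)
    denom = float(dq.T @ dq) + 1e-9  # guard against tiny steps
    return J + lam * (de - J @ dq) @ dq.T / denom

def servo_step(J, e, gain=0.5, damping=1e-3):
    """Damped least-squares control increment that drives the
    (e.g., depth-derived) feature error e toward zero."""
    JtJ = J.T @ J + damping * np.eye(J.shape[1])
    return -gain * np.linalg.solve(JtJ, J.T @ e)

if __name__ == "__main__":
    # Toy stand-in for the unknown endoscope vision-motor map.
    A = np.array([[1.0, 0.2], [-0.1, 0.8]])
    J = np.eye(2)                     # initial Jacobian guess
    q = np.zeros(2)
    target = np.array([0.5, -0.3])
    e = A @ q - target
    for _ in range(30):
        dq = servo_step(J, e)
        q = q + dq
        e_new = A @ q - target
        J = broyden_update(J, dq, e_new - e)  # adapt the model online
        e = e_new
    print(np.linalg.norm(e))          # error shrinks without knowing A
```

The point of the sketch is only the control structure: the Jacobian is never derived from a kinematic model, but adapted from observed input-output pairs, which is one common way to realize servoing "with no a priori knowledge of system models."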
All Author(s) List: Yiang Lu, Ruofeng Wei, Bin Li, Wei Chen, Jianshu Zhou, Qi Dou, Dong Sun, Yun-hui Liu
Name of Conference: IEEE International Conference on Robotics and Automation (ICRA) 2023
Start Date of Conference: 29/05/2023
End Date of Conference: 02/06/2023
Place of Conference: London
Country/Region of Conference: Great Britain
Proceedings Title: 2023 IEEE International Conference on Robotics and Automation (ICRA)
Year: 2023
Pages: 6916 - 6922
ISBN: 979-8-3503-2366-5
eISBN: 979-8-3503-2365-8
Languages: English-United Kingdom

Last updated on 2023-10-29 at 03:11