CC BY-NC-ND 4.0 · Endosc Int Open 2018; 06(02): E205-E210
DOI: 10.1055/s-0043-121882
Original article
Owner and copyright © Georg Thieme Verlag KG 2018

Novel experimental and software methods for image reconstruction and localization in capsule endoscopy

Anastasios Koulaouzidis
Centre for Liver & Digestive Disorders, The Royal Infirmary of Edinburgh, Edinburgh, UK
Dimitris K. Iakovidis
University of Thessaly, Department of Computer Science and Biomedical Informatics, Lamia, Greece
Diana E. Yung
Centre for Liver & Digestive Disorders, The Royal Infirmary of Edinburgh, Edinburgh, UK
Evangelos Mazomenos
Centre for Medical Image Computing and Department of Computer Science, University College London, London, UK
Federico Bianchi
The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy
Alexandros Karagyris
IBM Research, Almaden, California, United States
George Dimas
University of Thessaly, Department of Computer Science and Biomedical Informatics, Lamia, Greece
Danail Stoyanov
Centre for Medical Image Computing and Department of Computer Science, University College London, London, UK
Henrik Thorlacius
Department of Clinical Sciences, Lund University, Malmö, Sweden
Ervin Toth
Department of Gastroenterology, Skåne University Hospital, Malmö, Sweden
Gastone Ciuti
The BioRobotics Institute, Scuola Superiore Sant’Anna, Pisa, Italy

Publication History

submitted 16 June 2017

accepted after revision 09 October 2017

Publication Date:
01 February 2018 (online)

Abstract

Background and study aims Capsule endoscopy (CE) is invaluable for minimally invasive endoscopy of the gastrointestinal tract; however, several technological limitations remain, including the lack of reliable lesion localization. We present an approach to 3D reconstruction and localization using visual information from 2D CE images.

Patients and methods Colored thumbtacks were secured in rows to the internal wall of a LifeLike bowel model. A PillCam SB3 was calibrated and navigated linearly through the lumen by a high-precision robotic arm. The motion estimation algorithm used cues from the 2D CE images in the video sequence (the light falling on the object, the fraction of light reflected, and the surface geometry) to achieve 3D reconstruction of the bowel model at various frames. The ORB-SLAM technique was used for 3D reconstruction and CE localization within the reconstructed model; this algorithm matches pairs of feature points between images for reconstruction and localization.
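To illustrate the point-pair matching that underpins an ORB-SLAM-style approach, the Python sketch below (using OpenCV) detects ORB keypoints in two consecutive CE frames, matches them, and recovers the relative capsule motion from the matched points. This is not the authors' implementation or ORB-SLAM itself, only a minimal example of the matching and pose-recovery step; the file names frame_a.png and frame_b.png and the intrinsic matrix K are hypothetical placeholders for the calibrated PillCam frames and intrinsics.

```python
# Minimal sketch: ORB feature matching and relative pose recovery between two
# capsule-endoscopy frames. Not the authors' pipeline; for illustration only.
import cv2
import numpy as np

# Hypothetical file names for two consecutive frames of the CE video.
img1 = cv2.imread("frame_a.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame_b.png", cv2.IMREAD_GRAYSCALE)

# Hypothetical camera intrinsics from the capsule calibration step.
K = np.array([[300.0, 0.0, 160.0],
              [0.0, 300.0, 160.0],
              [0.0, 0.0, 1.0]])

# Detect ORB keypoints and binary descriptors in both frames.
orb = cv2.ORB_create(nfeatures=1000)
kp1, des1 = orb.detectAndCompute(img1, None)
kp2, des2 = orb.detectAndCompute(img2, None)

# Match descriptors with Hamming distance and cross-checking.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)

# Collect the matched point pairs.
pts1 = np.float32([kp1[m.queryIdx].pt for m in matches])
pts2 = np.float32([kp2[m.trainIdx].pt for m in matches])

# Estimate the essential matrix and recover the relative rotation and
# translation, i.e. the capsule motion between frames (translation up to scale).
E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC,
                               prob=0.999, threshold=1.0)
_, R, t, _ = cv2.recoverPose(E, pts1, pts2, K, mask=mask)
print("Relative rotation:\n", R)
print("Relative translation (unit scale):\n", t.ravel())
```

A full SLAM system such as ORB-SLAM additionally builds and refines a map of 3D landmark points across many frames, which is what yields the reconstructed bowel surface and the capsule track reported in the Results.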

Results As the capsule moved through the model bowel, 42 to 66 video frames were obtained per pass. The mean absolute error in the estimated distance travelled by the CE was 4.1 ± 3.9 cm. Our algorithm was able to reconstruct the cylindrical shape of the model bowel with details of the attached thumbtacks. ORB-SLAM successfully reconstructed the bowel wall from successive frames of the CE video. The “track” in the reconstruction corresponded well with the linear forwards-backwards movement of the capsule through the model lumen.
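The error figure above is presumably the mean (± standard deviation) of the absolute differences between the vision-based distance estimates and the robot-arm ground truth. A minimal Python sketch of that calculation, using hypothetical placeholder values rather than the study data:

```python
import numpy as np

# Hypothetical placeholder values (cm); the study compared vision-based
# distance estimates against the robotic arm's ground-truth travel per pass.
estimated_cm = np.array([18.2, 25.4, 30.1, 22.8])
ground_truth_cm = np.array([20.0, 20.0, 25.0, 25.0])

abs_err = np.abs(estimated_cm - ground_truth_cm)
print(f"Mean absolute error = {abs_err.mean():.1f} ± {abs_err.std():.1f} cm")
```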

Conclusion The reconstruction methods described above achieved good-quality reconstruction of the bowel model and localization of the capsule trajectory using information from the CE video and images alone.