CC BY-NC-ND 4.0 · Endosc Int Open 2020; 08(11): E1553-E1559
DOI: 10.1055/a-1261-3349
Original article

Novel polyp detection technology for colonoscopy: 3D optical scanner

Hakki Refai
1   Optecks, LLC, Tulsa, Oklahoma, United States
,
Badia Koudsi
1   Optecks, LLC, Tulsa, Oklahoma, United States
,
Omar Yusef Kudsi
2   Department of Surgery, Good Samaritan Medical Center, Tufts University School of Medicine, Brockton, Massachusetts, United States
 

Abstract

Background and study aims Fifty-eight percent of American adults aged 50 to 75 undergo colonoscopies. Multiple factors cause lesions to be missed, at a rate of approximately 20 %, leaving patients at risk of colorectal cancer. We report on a miniaturized optical scanner and accompanying processing software capable of detecting, measuring, and locating polyps with sub-millimeter accuracy in real time.

Materials and methods A prototype 3D optical scanner was developed that fits within the dimensions of a standard endoscope. After calibration, the system was evaluated in an ex-vivo porcine colon model using silicone polyps.

Results The average distance between two adjacent points in the 3D point cloud was 94 µm. The results demonstrate high-accuracy measurements and 3D models while operating at short distances. The scanner detected 6 mm × 3 mm polyps in every trial and identified polyp location with 95-µm accuracy. Registration errors were less than 0.8 % between point clouds based on physical features.

Conclusion We demonstrated that a novel 3D optical scanning system improves the performance of colonoscopy procedures by using a combination of 3D and 2D optical scanning and fast, accurate software for extracting data and generating models. Further studies of the system are warranted.



Introduction

Existing colonoscopy systems use technology developed over 10 years ago that limits operator effectiveness in detecting abnormal tissue. Standard endoscopes use a visible light source and camera to view the colon. Ideally, current colonoscopy can find precancerous polyps and adenomas, facilitate their removal or treatment, and enable early detection of colorectal cancer (CRC). The Centers for Disease Control and Prevention (CDC) estimates that colonoscopies prevented 66,000 colorectal cancers between 2003 and 2007 alone [1]. Despite the endoscope’s capabilities and operator training, operators still miss polyps and adenomas that can lead to interval cancers. Missed polyps and adenomas occur for several reasons. Polyps and adenomas, particularly those < 6 mm, can grow in folds of the colon wall that block the operator’s view. Abnormal tissue can have coloring similar to that of surrounding tissue, causing it to blend into the background. The miss rate increases as the number and density of polyps increase. A shortage of endoscopists and an ever-increasing over-50 population compound the problem. In 2010, gastroenterologists performed approximately 50 % of colonoscopies [2], and studies show that inexperienced practitioners miss 11 % more polyps and adenomas than experienced practitioners [3]. A safe, accurate, efficient, augmented tool that is readily installed on the endoscopic instrument and easily employed by both specialists and non-specialists could significantly improve the accuracy and success rates of colonoscopy, increase efficiency to reduce procedure times, and improve patient access to screening, leading to earlier detection and more effective treatment of CRC. Here we present an endoscopic innovation and its potential to improve detection of lesions during colonoscopy.



Material and Methods

We have taken an alternate approach to commercially available colonoscopy solutions and optical scanning systems by complementing the accuracy and model-building capabilities of three-dimensional (3D) optical scanners with two-dimensional (2D) imaging methods and novel polyp detection software, with hardware miniaturized to fit within an endoscope’s dimensions. We combined miniature laser arrays, pattern and solid illumination generation, near-infrared (NIR) cameras, and advanced processing algorithms to meet the size, speed, and accuracy needs of colonoscopy procedures. Our optical scanning system, schematically depicted in [Fig. 1a], employs two Omnivision CMOS 1080p NIR cameras (circles) and two NIR vertical-cavity surface-emitting laser (VCSEL) sources (squares), containing II-VI multimode 0.9-W arrays operating at 940 nm, integrated with the endoscope. Each camera has a 2.73 × 1.55 mm image area, 1.4-µm pixels, and a 70° horizontal and 42° vertical field of view (FOV). Employing NIR sources and cameras takes advantage of high NIR tissue reflectivity to produce high-resolution recordings without interfering with the endoscope’s existing visual systems.

Fig. 1 a Scanner configuration on endoscope tip. b 3D point cloud generation. c Blood vessel imaging.
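To give a sense of the imaging geometry these camera specifications imply, the following back-of-the-envelope sketch (illustrative only, not part of the system software) estimates the field-of-view footprint and per-pixel lateral footprint on the colon wall; it assumes a nominal 1920 × 1080 readout and the 1- to 5-cm working distances discussed below.

```python
import math

# Camera parameters taken from the text: 70° horizontal / 42° vertical FOV,
# nominal 1080p readout (1920 x 1080 pixels, an assumption). Working
# distances of 10-50 mm reflect the colon geometry described in the paper.
H_FOV_DEG, V_FOV_DEG = 70.0, 42.0
H_PIXELS = 1920

for z_mm in (10, 25, 50):  # working distance from camera to colon wall
    width_mm = 2 * z_mm * math.tan(math.radians(H_FOV_DEG / 2))
    height_mm = 2 * z_mm * math.tan(math.radians(V_FOV_DEG / 2))
    # Lateral footprint of one pixel on the colon wall (small-angle estimate)
    px_um = width_mm / H_PIXELS * 1000
    print(f"z = {z_mm:2d} mm: view ≈ {width_mm:5.1f} x {height_mm:5.1f} mm, "
          f"~{px_um:4.0f} µm per pixel laterally")
```

At the nominal 25-mm working distance this gives roughly a 35 × 19 mm view and a lateral pixel footprint on the order of 20 µm, consistent with the sub-100-µm point spacing reported in the Results.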

The short 1- to 5-cm working distance in the colon, combined with the small component size and our software’s capabilities, allows the scanner to operate on a 5.5-mm baseline and still achieve sub-millimeter measurement accuracy in depth. The optical sources produce both patterned light (intensity variations over space) and solid illumination (no intensity variation over space) in alternating intervals. Patterned-light illumination of the colon wall ([Fig. 1b]), combined with the stereoscopic vision provided by the two cameras, produces data [4] that allow the software to accurately locate each part of the colon in 3D space, producing a 3D point cloud consisting of all the 3D points identified from the camera imaging data. The imaging and processing software uses triangulation algorithms and proprietary pattern-matching algorithms to construct 3D point clouds of each colon section with sub-millimeter accuracy. The software applies novel feature-extraction algorithms to the 3D point clouds that use geometrical analysis to detect polyps < 1 mm in size, even when polyp/adenoma coloring closely matches that of the colon wall. The solid-illumination NIR source accentuates the contrast between blood vessels and the wall tissue [5], providing a unique 2D topography ([Fig. 1c]). This allows the software to perform pattern matching between images to extract registration information (tilt and shift between successive images) that is critically important for extracting features from the otherwise featureless colon wall and for combining 3D point clouds into a single, integrated 3D model of the entire colon. Combining 3D and 2D imaging capabilities allows the software to accurately locate polyps within the colon and creates a record for tracking a patient’s colon health. Using both 2D visual imagery and 3D scanning data as inputs to artificial intelligence (AI) and deep learning algorithms properly trained on 2D visual images and 3D scans would significantly reduce or eliminate false-positive concerns, making such algorithms a valuable addition to colonoscopy analysis.
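The triangulation and pattern-matching software described above is proprietary. As an illustrative stand-in only, the sketch below shows the generic stereo route from a rectified NIR image pair under patterned illumination to a 3D point cloud, using OpenCV’s semi-global block matcher; the file names, focal length, and principal point are placeholders, and only the 5.5-mm baseline comes from the text.

```python
import cv2
import numpy as np

# Placeholder rectified stereo pair captured under patterned NIR illumination.
left = cv2.imread("left_nir.png", cv2.IMREAD_GRAYSCALE)
right = cv2.imread("right_nir.png", cv2.IMREAD_GRAYSCALE)

# Semi-global block matching as a generic stand-in for the proprietary matcher.
matcher = cv2.StereoSGBM_create(minDisparity=0, numDisparities=128,
                                blockSize=7, P1=8 * 7 * 7, P2=32 * 7 * 7)
disparity = matcher.compute(left, right).astype(np.float32) / 16.0  # sub-pixel

# Reprojection matrix Q from stereo rectification; values are illustrative
# (focal length in pixels, principal point) except the 5.5 mm baseline.
f_px, cx, cy, baseline_mm = 1400.0, 960.0, 540.0, 5.5
Q = np.float32([[1, 0, 0, -cx],
                [0, 1, 0, -cy],
                [0, 0, 0, f_px],
                [0, 0, -1.0 / baseline_mm, 0]])

points_3d = cv2.reprojectImageTo3D(disparity, Q)  # H x W x 3, in mm
point_cloud = points_3d[disparity > 0]            # N x 3 valid 3D points
print(point_cloud.shape)
```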

We constructed the miniaturized 3D scanning and mapping system and evaluated system performance using a distal colon from a 2-year-old pig. [Fig. 2] shows the construction and operation of the system’s optical scanning components. The hardware consists of the optical scanning head, wiring for powering and controlling the two NIR sources, and camera drivers for controlling and recovering images from the cameras. The camera drivers connect to a laptop computer via USB 3.0 cables. The optical scanning head has a 12.8-mm diameter, closely matching that of current endoscopes. Two high-definition NIR cameras and two VCSEL-based NIR sources (VCSEL array with integrated diffractive elements and projection optics) are placed within a 3D-printed housing designed in SolidWorks. The cameras and sources were secured within the housing with ultraviolet (UV)-curable glue. One source projects patterned NIR illumination using a diffractive optical element and projection optics integrated with the VCSEL array, and the other projects solid illumination consisting of uniform, flat-top illumination with a 75° × 55° field of illumination. The system has a 5.5-mm baseline distance between the cameras, giving a theoretical accuracy of 55 µm at a 25-mm operating distance. A 1.75-V, 20-mA switchable source powered the NIR sources. The software algorithms were implemented in C++.

Fig. 2 The optical scanning system, including wiring for connecting the sources to the driving electronics, camera drivers for controlling and recording images from the cameras, and the optical scanning head. A quarter next to the scanning head provides perspective on size. Close-up of scanning head showing cameras and sources and the housing manufactured in-house.
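The quoted 55-µm theoretical accuracy can be checked against the standard stereo depth-resolution relation, δZ ≈ Z² · δd / (f · B), using the camera and baseline parameters above. The short sketch below is illustrative only; the sub-pixel disparity precision is an assumption (roughly 0.7 pixel reproduces the quoted figure), not a value stated in the text.

```python
import math

# Parameters from the text: 5.5 mm baseline, 25 mm operating distance,
# 1.4 µm pixels, 2.73 mm sensor width with a 70° horizontal FOV.
baseline_mm = 5.5
z_mm = 25.0
pixel_mm = 0.0014
sensor_width_mm = 2.73
h_fov_deg = 70.0

# Effective focal length implied by the sensor width and horizontal FOV.
f_mm = (sensor_width_mm / 2) / math.tan(math.radians(h_fov_deg / 2))

# Depth error grows as Z^2 and shrinks with f*B; the achievable disparity
# error (in pixels) is an assumption about the matching algorithm.
for disparity_err_px in (1.0, 0.7, 0.5):
    dz_mm = (z_mm ** 2) * (disparity_err_px * pixel_mm) / (f_mm * baseline_mm)
    print(f"disparity error {disparity_err_px:.1f} px -> "
          f"depth error ≈ {dz_mm * 1000:.0f} µm")
```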

The scanner system was calibrated using a customized black and white checkerboard target with each square having dimensions of less than 500 µm. After calibration, the system produced an output image that matched the original target object.
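The paper does not detail the calibration procedure beyond the checkerboard target, so the following sketch shows only a standard OpenCV stereo-calibration workflow that such a target supports; the corner count, square size, and file names are placeholders (the text states only that the squares were smaller than 500 µm).

```python
import glob
import cv2
import numpy as np

# Placeholder board geometry: inner corners per row/column and square size.
PATTERN = (9, 6)
SQUARE_MM = 0.5

# 3D coordinates of the board corners in the board's own coordinate frame.
objp = np.zeros((PATTERN[0] * PATTERN[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:PATTERN[0], 0:PATTERN[1]].T.reshape(-1, 2) * SQUARE_MM

obj_pts, left_pts, right_pts = [], [], []
for lf, rf in zip(sorted(glob.glob("left_*.png")), sorted(glob.glob("right_*.png"))):
    l = cv2.imread(lf, cv2.IMREAD_GRAYSCALE)
    r = cv2.imread(rf, cv2.IMREAD_GRAYSCALE)
    ok_l, c_l = cv2.findChessboardCorners(l, PATTERN)
    ok_r, c_r = cv2.findChessboardCorners(r, PATTERN)
    if ok_l and ok_r:
        obj_pts.append(objp)
        left_pts.append(c_l)
        right_pts.append(c_r)

size = l.shape[::-1]
# Calibrate each camera, then the pair (recovers rotation R and translation T
# between the cameras; |T| should come out near the 5.5 mm baseline).
_, K1, d1, _, _ = cv2.calibrateCamera(obj_pts, left_pts, size, None, None)
_, K2, d2, _, _ = cv2.calibrateCamera(obj_pts, right_pts, size, None, None)
err, K1, d1, K2, d2, R, T, E, F = cv2.stereoCalibrate(
    obj_pts, left_pts, right_pts, K1, d1, K2, d2, size,
    flags=cv2.CALIB_FIX_INTRINSIC)
print("stereo reprojection error (px):", err, "baseline (mm):", np.linalg.norm(T))
```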

For testing, we inserted the optical scanning head into the porcine colon as shown in [Fig. 3]. An air compressor was used to inflate the colon. The scanning head was placed inside the colon, and a rubber band secured the near and far ends of the colon to the air compressor tube and a stick to help maintain inflation. To evaluate the system’s ability to detect polyps within the test colon, we inserted a 6 mm × 3 mm silicone polyp into the colon.

Fig. 3 The test set-up used to evaluate the optical scanning system, with the screen showing real-time images from the cameras and the point-cloud output of the processing software.


Results

The first experiments evaluated the accuracy attainable with the system. [Fig. 4a] shows the image from one NIR camera and [Fig. 4b] shows the corresponding 3D point cloud. [Fig. 4c] provides a zoomed-in view of the point cloud with a calculation of distances between distinct points. The average distance between two adjacent points in the 3D point cloud, taken as an average of 100 sets of 100-point pairs, was 94 µm, just under twice the theoretical target of 55 µm. The results verify that the triangulation algorithm and processing software can produce highly accurate 3D scans and modeling of the 3D space at short operating distances in the colon with the current miniaturized scanning hardware.

Fig. 4 a Image of colon wall from NIR camera. b 3D point cloud obtained from analysis software. c Calculation of measurement accuracy, shown by zooming in on a subset of points from the main point cloud.
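The point-spacing metric reported above can be reproduced on any point cloud by averaging nearest-neighbor distances over repeated random samples. The sketch below is illustrative only (it runs on a synthetic 0.1-mm grid rather than scanner data) and mirrors the "100 sets of 100-point pairs" averaging described in the text.

```python
import numpy as np
from scipy.spatial import cKDTree

def mean_adjacent_spacing(points_mm: np.ndarray, n_samples: int = 100) -> float:
    """Average distance from sampled points to their nearest neighbour (mm).

    points_mm is an (N, 3) point cloud; sampling 100 points per repetition
    mirrors the '100 sets of 100-point pairs' averaging in the text.
    """
    tree = cKDTree(points_mm)
    rng = np.random.default_rng(0)
    means = []
    for _ in range(n_samples):
        idx = rng.choice(len(points_mm), size=100, replace=False)
        # k=2 because the closest hit is the query point itself (distance 0).
        dists, _ = tree.query(points_mm[idx], k=2)
        means.append(dists[:, 1].mean())
    return float(np.mean(means))

# Example with a synthetic planar cloud on a ~0.1 mm grid (illustration only).
grid = np.stack(np.meshgrid(np.arange(0, 10, 0.1),
                            np.arange(0, 10, 0.1), [0.0]), axis=-1).reshape(-1, 3)
print(f"mean adjacent spacing ≈ {mean_adjacent_spacing(grid) * 1000:.0f} µm")
```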

We proceeded to evaluate the system’s ability to detect polyps within the test colon. Because the test colon did not contain naturally occurring polyps, we inserted silicone polyps of different dimensions into the colon. All of the simulated polyps were either Isp or Is in morphology [6]. The results presented here are for a 6 mm × 3 mm polyp placed 20 cm from the colon entrance in the lower right quarter of the colon wall. The evaluation focused on the system’s ability to correctly detect and locate the polyp. [Fig. 5] illustrates the process performed by the software algorithms to detect a polyp. A geometrical analysis algorithm identifies areas where the surface of the 3D point cloud differs from the background surface of the colon wall, indicating a possible polyp. Through a series of processing steps to eliminate false positives, the software generates a binary distance map, with likely polyps indicated as bright areas and other areas as dark. The software overlays the binary distance map onto the 3D point cloud to mark the boundary of the likely polyp, calculate the polyp’s perimeter, and determine the nominal height/size of the likely polyp. Over repeated attempts, the software correctly identified the existence of the polyp every time and placed the overlay onto the polyp with an average error of 95 µm. Once the system detects and locates the polyp, the software can interface with imagery provided by the endoscope’s existing visible-light imaging system to provide indicators and data useful to the operator. As an example, [Fig. 5] shows the perimeter of the bright area from the distance map overlaid on the image to highlight the position of the polyp and shows how the system can present measurement data to the operator directly on the image produced by the endoscope’s visible-light imaging system.

Fig. 5 After overlaying the binary distance map on the 3D point cloud, the software outlines the area indicated by the white section of the binary distance map, indicating the location of the detected polyp to the operator. Data relevant to the operator appears in the upper-left corner of the image.
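The detection algorithm itself is proprietary. Purely as an illustration of the residual-from-background idea described above, the sketch below flags protrusions by subtracting a heavily smoothed "background wall" surface from a dense depth map, thresholding the residual into a binary map, and reporting a perimeter and nominal height for each surviving region; the smoothing window, height threshold, and minimum area are illustrative parameters, not values from the paper.

```python
import cv2
import numpy as np
from scipy.ndimage import uniform_filter

def detect_protrusions(depth_mm: np.ndarray,
                       height_thresh_mm: float = 1.0,
                       min_area_px: int = 50):
    """Flag regions that protrude from the locally smooth colon wall.

    depth_mm: dense depth map (mm) derived from the 3D point cloud.
    Returns a binary map plus (perimeter_px, height_mm) per candidate polyp.
    """
    # Heavily smoothed depth approximates the background wall surface.
    background = uniform_filter(depth_mm.astype(np.float64), size=61)
    residual = background - depth_mm      # protrusions are closer -> positive

    binary = (residual > height_thresh_mm).astype(np.uint8)
    binary = cv2.morphologyEx(binary, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))

    candidates = []
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    for c in contours:
        if cv2.contourArea(c) < min_area_px:
            continue                      # suppress small false positives
        mask = np.zeros_like(binary)
        cv2.drawContours(mask, [c], -1, 1, thickness=-1)
        height = float(residual[mask.astype(bool)].max())
        candidates.append((cv2.arcLength(c, True), height))
    return binary, candidates
```

The binary map and contour perimeters returned here play the role of the binary distance map and polyp boundary overlay described in the text; a production system would replace the simple smoothing and thresholds with the paper’s proprietary geometrical analysis.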

We also performed an initial performance evaluation of the registration algorithms required to stitch the individual 3D point clouds together into a single, cohesive 3D model of the patient’s colon. We evaluated and verified the registration algorithms using physical features of the porcine colon, as the colon did not retain its original blood vessels. The software collected images from both cameras before and after the scanner moved within the colon, as shown in [Fig. 6]. The algorithm first extracted points of interest (points near edges or identifiable details) from the left camera image in frame one and matched these points with corresponding points in the right camera image of frame one, as shown in [Fig. 6c]. This step provided accurate 3D coordinates for each point of interest. The algorithm used the points of interest as input to the next step, in which the algorithm identified and matched the most robust points of interest (called rigid points) between the left camera image of frame one and the left camera image of frame two ([Fig. 6d]). Finally, the algorithm matched rigid points between the left and right camera images of frame two and used this result to match rigid points between the left image of frame one and the right image of frame two ([Fig. 6e]). Using the 3D locations of each point from the 3D point cloud, the algorithm derives a transformation matrix containing roll, pitch, yaw, and translation between images, which allows the software to reconcile and combine the images into a single 3D model. The 3D transformation matrix has the following form:

$$
T =
\begin{bmatrix}
R_{3\times 3} & t_{3\times 1} \\
0_{1\times 3} & 1
\end{bmatrix},
\qquad
t =
\begin{bmatrix}
t_x \\ t_y \\ t_z
\end{bmatrix},
$$

where the rotation block R encodes the roll, pitch, and yaw between frames and t the translation in x, y, and z.
Fig. 6 a Images from the left and right NIR cameras in the first timeframe. b Images from the left and right NIR cameras taken in the second timeframe after moving the sensor head a short distance along the porcine colon. c Matching of points of interest between the left and right camera images for the first frame. d Matching of only the rigid points between the left camera images for frames one and two. e Matching of the rigid points between the left image of frame one and the right image of frame two.

For the images in [Fig. 6], the computed transformation matrix yields an initial estimate of roll = 17.20°, pitch = –24.28°, yaw = 13.59°, and lateral shifts of x = 0.0586 mm, y = –0.0217 mm, and z = 0.0395 mm. Combining multiple successive frames eventually closes a loop (scanning an area both forward and backward), allowing further refinement of the calculations. The corrected transformation gives the actual movements as 0.0590 mm, –0.02161 mm, and 0.0392 mm, respectively. The resultant error between the initial estimates and the actual values ranged from –0.8 % to 0.8 %, verifying that the software, combined with the 2D camera images and the 3D point clouds, can accurately compute translation and rotation between images, meeting a key requirement for stitching frames recorded by the scanner into a single, cohesive colon model.
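For readers who want to reproduce this kind of estimate, the sketch below computes a least-squares rigid transform (Kabsch/SVD) from matched 3D rigid points and reports roll, pitch, yaw, and translation. It is not the authors’ implementation; the Euler-angle axis convention and the synthetic matched points are assumptions chosen only to mirror the values quoted above.

```python
import numpy as np
from scipy.spatial.transform import Rotation

def rigid_transform(src: np.ndarray, dst: np.ndarray) -> np.ndarray:
    """Least-squares rigid transform (Kabsch/SVD) mapping src -> dst.

    src, dst: (N, 3) arrays of matched 3D rigid points from two frames.
    Returns a 4x4 homogeneous matrix [R | t; 0 0 0 1].
    """
    src_c, dst_c = src - src.mean(0), dst - dst.mean(0)
    U, _, Vt = np.linalg.svd(src_c.T @ dst_c)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])  # no reflection
    R = Vt.T @ D @ U.T
    t = dst.mean(0) - R @ src.mean(0)
    T = np.eye(4)
    T[:3, :3], T[:3, 3] = R, t
    return T

# Synthetic matched rigid points (assumption) built from the rotation and
# translation quoted in the text, to show the round trip.
rng = np.random.default_rng(1)
src = rng.random((20, 3)) * 10
true_R = Rotation.from_euler("xyz", [17.20, -24.28, 13.59], degrees=True)
dst = true_R.apply(src) + np.array([0.0586, -0.0217, 0.0395])

T = rigid_transform(src, dst)
rpy = Rotation.from_matrix(T[:3, :3]).as_euler("xyz", degrees=True)
print("roll/pitch/yaw (deg):", np.round(rpy, 2),
      "shift (mm):", np.round(T[:3, 3], 4))
```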



Discussion

Optical scanning systems offer a candidate technology for producing high-accuracy 3D imagery and modeling of the colon. Current commercially available 3D optical scanners utilize a combination of technologies, including NIR and visible light, digital light projection (DLP) sources and VCSEL projectors, and high-resolution cameras, to produce 3D scans of an object. However, all currently available scanners use a large (≥ 8 cm) baseline (distance between centers of the optical source and/or recording cameras) and large (up to 1 cm) components to achieve large FOV and high-resolution depth measurements [7] [8] [9]. In the colon, however, the operating distance varies from 1 cm to 5 cm, depending on the individual patient, position within the colon, and positioning of the endoscope with respect to the colon wall, which precludes the use of long baselines and large components. Systems with larger dimensions and components would integrate poorly with the endoscope, most likely resulting in protrusions or separate systems that would hinder the procedure in much the same manner as mechanical solutions.

We demonstrated a novel 3D optical scanning system for improving the performance of colonoscopy procedures using a combination of 3D and 2D optical scanning and fast, accurate software for extracting data and generating models. The system accomplishes this using miniaturized sources and sensors located on a short 5.5-mm baseline, ideal for sub-millimeter measurement accuracy at working distances of up to 5 cm, and software that extracts features from 3D and 2D images to detect polyps/adenomas and construct 3D colon models. The system’s ability to accurately locate and measure polyps has the potential to reduce miss rates and improve decisions and procedures for polyp removal. The ability to accurately measure polyp size is critical to determining patient monitoring and treatment, as patients with polyps > 10 mm require shorter surveillance intervals. The system’s size and utility do not increase the risk of injury to the patient, do not require additional training to use, and can enhance the performance of less experienced operators, enabling wider availability of safe and effective treatment.

Based on the results presented here, the system offers several potential advantages compared with other proposed and commercially available solutions, including higher accuracy (sub-millimeter vs. centimeter), lower miss rates, real-time procedure imagery and data, and rendering of 3D colon models for establishing digital records of the patient’s colon health. In the next stage of development, the system can employ AI and deep learning algorithms to further improve polyp detection. The availability of both 3D and 2D data as inputs to the AI, as opposed to only the traditional 2D image data used in previously studied systems [10] [11], is expected to eliminate problems associated with the lack of strong features common in colonoscopy images and thus improve the accuracy of operation. A recent study [12] found that AI methods working with 2D colonoscopy images produced a high rate of false positives and miss rates comparable to those of experienced practitioners. The existence of false positives was also noted in [11] as an issue encountered with the AI method studied.

There are several implications for this technology. Management of polyps found during colonoscopy depends on patient characteristics as well as polyp size, morphology, and histology. For example, while adenomas < 2 mm may be removed using forceps, larger adenomas may require other methods of removal such as snare resection or a variety of advanced endoscopic resection methods. In terms of morphology, flat lesions are not only harder to detect visually than polypoid lesions but are also more advanced histologically for their size [13]. Sessile polyps also carry a higher risk of malignancy and are associated with higher rates of incomplete resection, especially with larger polyp sizes [14], which ultimately leads to higher rates of interval cancers. Moreover, patients with more adenomas detected at baseline are at a significantly higher risk of advanced adenoma and CRC within 3 to 5 years [15]. Accurate detection and measurement of polyp size, morphology, invasion, and margins are essential in determining appropriate management as well as surveillance. While the surveillance colonoscopy interval is 10 years for patients with no adenomas detected, patients with three to four small adenomas should undergo surveillance colonoscopy within 3 to 5 years. Improving the detection and characterization of these lesions may have crucial downstream benefits for these patients.

We can identify a few limitations of our study. The device has only been tested in an ex-vivo porcine model with silicone polyps. These polyps are uniform in shape, and the study focused on a single size to demonstrate the concept behind the optical scanner. Further studies involving in-vivo models, with varying polyp shapes and sizes, are needed to support the real-life applicability of this device and to provide quantitative measures of accuracy. The optics of the pattern generator used in conjunction with the optical source introduced variations in the smoothness of the reconstruction, which limits the accuracy obtained with the current version of the system. When scanning a flat surface, the reconstruction can vary by ± 25 µm from the actual surface as a result. Additional errors occurred because image capture between the two cameras was synchronized in software. Hardware-based synchronization is needed to further improve system accuracy toward the theoretical limit. The software has not been optimized, limiting the current system to a speed of one frame per second. Operation above 10 frames per second is needed to fully support real-time functionality.



Conclusion

We implemented and demonstrated a novel 3D optical scanning system for improving and extending colonoscopy procedures. A combination of 3D scanning for providing depth information, 2D scanning for performing registration of 3D images, and feature extraction algorithms allowed the system to detect and accurately measure simulated polyps in an ex-vivo porcine colon model. The system demonstrated sub-millimeter accuracy, providing the potential to resolve small (< 2 mm) polyps/adenomas, and constructed 3D models with < 1 % error, providing the potential to create models of a patient’s colon that would assist diagnosis and monitoring. Both capabilities present a significant improvement over the functionality and capabilities of current colonoscopy systems. The successful demonstration of the system, despite some experimental limitations, provides a strong impetus for further study and development.



Competing interests

Drs. Refai and Koudsi are CTO and CEO, respectively, of Optecks, LLC. Dr. Kudsi has received teaching course and/or consultancy fees from Intuitive Surgical, Bard-Davol, and W.L. Gore outside the submitted work.

Acknowledgments

The project is partially funded by the Oklahoma Center for the Advancement of Science and Technology (OCAST) (Contract No. AR18-045).

  • References

  • 1 Colorectal cancer. Centers for Disease Control and Prevention Vital Signs. https://www.cdc.gov/vitalsigns/cancerscreening/colorectalcancer/index.html
  • 2 Xirasagar S, Hurley TG, Sros L. et al. Quality and safety of screening colonoscopies performed by primary care physicians with standby specialist support. Med Care 2010; 48: 703-709
  • 3 Bressler B, Paszat LF, Chen Z. et al. Rates of new or missed colorectal cancers after colonoscopy and their risk factors: a population-based analysis. Gastroenterology 2007; 132: 96-102
  • 4 Bellocchio F, Borghese NA, Ferrari S. et al. 3D surface reconstruction: multi-scale hierarchical approaches. Springer Science & Business Media; 2012
  • 5 Mangold K, Shaw J. et al. The physics of near-infrared photography. Eur J Phys 2013; 34: S51-S71
  • 6 The Paris endoscopic classification of superficial neoplastic lesions: esophagus, stomach, and colon: November 30 to December 1, 2002. Gastrointest Endosc 2003; 58: S3-S43
  • 7 Frankowski G, Hainich R. DLP-based 3D metrology by structured light or projected fringe technology for life sciences and industrial metrology. In: Emerging digital micromirror device based systems and applications: International Society for Optics and Photonics. 72100C. 2009
  • 8 Frankowski G, Hainich R. DLP/DSP-based optical 3D sensors for the mass market in industrial metrology and life sciences. In: Emerging Digital Micromirror Device Based Systems and Applications III: International Society for Optics and Photonics. 79320D. 2011
  • 9 Zanuttigh P, Marin G, Dal Mutto C. et al. Time-of-flight and structured light depth cameras: Technology and applications. Springer; 2016
  • 10 Lui TKL, Guo CG, Leung WK. Accuracy of artificial intelligence on histology prediction and detection of colorectal polyps: a systematic review and meta-analysis. Gastrointest Endosc 2020; 92: 11-22
  • 11 Aziz M, Fatima R, Dong C. et al. The impact of deep convolutional neural network-based artificial intelligence on colonoscopy outcomes: A systematic review with meta-analysis. J Gastroenterol Hepatol 2020; 92: 11-26
  • 12 Wang P, Xiao X, Glissen Brown JR. et al. Development and validation of a deep-learning algorithm for the detection of polyps during colonoscopy. Nat Biomed Eng 2018; 2: 741-748
  • 13 Saitoh Y, Waxman I, West AB. et al. Prevalence and distinctive biologic features of flat colorectal adenomas in a North American population. Gastroenterology 2001; 120: 1657-1665
  • 14 Pohl H, Srivastava A, Bensen SP. et al. Incomplete polyp resection during colonoscopy-results of the complete adenoma resection (CARE) study. Gastroenterology 2013; 144: 74-80.e71
  • 15 Martínez ME, Baron JA, Lieberman DA. et al. A pooled analysis of advanced colorectal neoplasia diagnoses after colonoscopic polypectomy. Gastroenterology 2009; 136: 832-841

Corresponding author

Hakki Refai
4502 E. 41st St, 4W134
Tulsa, OK 74135
USA   
Phone: +1-918-625-3396

Publication History

Received: 18 March 2020

Accepted: 03 August 2020

Article published online: 21 October 2020

© 2020. The Author(s). This is an open access article published by Thieme under the terms of the Creative Commons Attribution-NonCommercial-NoDerivatives License, permitting copying and reproduction so long as the original work is given appropriate credit. Contents may not be used for commercial purposes, or adapted, remixed, transformed or built upon. (https://creativecommons.org/licenses/by-nc-nd/4.0/)

Georg Thieme Verlag KG
Rüdigerstraße 14, 70469 Stuttgart, Germany

