US20210161604A1 - Systems and methods of navigation for robotic colonoscopy - Google Patents
- Publication number
- US20210161604A1 (U.S. application Ser. No. 17/257,470)
- Authority
- US
- United States
- Prior art keywords
- endoscope
- determining
- image
- blurred
- endoscope image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Abandoned
Classifications
- A61B1/00—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
- A61B1/00149—Holding or positioning arrangements using articulated arms
- A61B1/31—Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for the rectum, e.g. proctoscopes, sigmoidoscopes, colonoscopes
- A61B5/6886—Monitoring or controlling distance between sensor and tissue
- A61B34/20—Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
- A61B2034/2055—Optical tracking systems
- A61B2034/2065—Tracking using image or pattern recognition
- A61B2576/02—Medical imaging apparatus involving image processing or analysis specially adapted for a particular organ or body part
- G06T7/12—Edge-based segmentation
- G06T7/13—Edge detection
- G06T7/168—Segmentation; Edge detection involving transform domain methods
- G06T2207/10068—Endoscopic image
- G06T2207/20048—Transform domain processing
- G06T2207/30028—Colon; Small intestine
- G16H20/40—ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance relating to mechanical, radiation or invasive therapies, e.g. surgery, laser therapy, dialysis or acupuncture
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
Description
- the present invention is directed to systems and methods for robotically assisted medical procedures, in particular for navigation of medical instruments for diagnosis and surgery.
- a colonoscope typically includes a light source and an image capture device and is externally steerable through the colon. Colonoscopy is frequently performed to screen for asymptomatic cancers at an early stage or to find and to remove precancerous polyps. Colonoscopy may also be performed to diagnose rectal bleeding or changes in bowel habits and inflammatory bowel disease. Although the procedure is common, safe navigation of a colonoscope through the colon can be difficult due to the colon's distensible and highly mobile nature. Colonoscopy complications that can occur include colon perforation, hemorrhage, or severe abdominal pain. Robotic control of colonoscope movement is being developed to provide greater precision and speed for the procedure; however, robotic control must also contend with the varying, dynamic anatomy of the colon.
- Prevalent manual and automatic visual methods for navigating the colonoscope rely on directing the colonoscope towards dark regions of images acquired by the colonoscope. Khan and Gillies (“Vision based navigation for an endoscope”, Image and Vision Computing, 14 (1996), pp. 763-772) describe determining and visually presenting the colon space, for manual and/or automatic navigation of the colon. The method includes directing the colonoscope towards a dark region of the lumen, according to the colonoscope image. The method also includes determining distances between muscle contours of the colon. Khan and Gillies note that a complete colon representation is necessary for guiding the endoscope around the bends when the view of the lumen is lost and muscle contours are not visible, or when the colon includes pockets or unforeseen obstacles.
- U.S. Patent Publication 2003/0167007 to Belson is directed to a method of colonoscopy whereby a spectroscopy device is attached to the tip of the colonoscope to create a three dimensional map of the colon for use by an automated method of advancing the colonoscope. U.S. Pat. No. 8,795,157 to Yaron and Frenkel is directed to a method for advancing a colonoscope by acquiring a stereoscopic image pair of a region of the interior of a colon, identifying a plurality of topographical features on an inner wall of the colon, determining the depth of each topographical feature, determining a radius of curvature of the colon, and advancing the colonoscope according to the direction of the topographical feature with greatest depth, according to the radius of curvature.
- U.S. Pat. No. 8,514,218 to Hong and Paladini is directed to navigating a colonoscope according to a depth image captured by an angular fish eye lens. In such a lens, the resolution is approximately equal across the whole image. The depth image is generated according to a ray-casting volume rendering scheme. In the depth image, the gray level is proportional to the distance from the camera to the colon surface, the brighter region corresponding to the colon lumen, which is far away from the current camera location, called the target region.
- U.S. Patent Publication 2003/0152897 to Geiger is directed to navigating a colonoscope by ray-casting, whereby for every pixel of an acquired colon image, a ray is cast and its intersection with an organ wall is calculated, to determine a longest ray. The colonoscope is then navigated in the direction of the longest ray.
- the aforementioned methods do not overcome a variety of difficulties of colonoscope navigation that stem from the dynamic anatomy of the colon, as well as the highly varied surface structure.
- the colon may include dark pockets, including disease related distortions, such as diverticulitis.
- the colon image may also include image artifacts, such as fluids or bubbles on the lens. When the endoscope tip is close to the colon wall, the image may display ‘red-out’ or ‘wall view’. Camera movement may also cause motion blur artifacts. These complicating factors and artifacts have impeded successful implementation of automated navigation.
- Embodiments of the present invention provide systems and methods for navigating an endoscope during colonoscopy.
- a method of endoscope navigation includes: receiving an endoscope image from an endoscope in a body lumen; determining a wall section of the endoscope image representing a proximate wall of the body lumen; determining a movement vector directing away from the wall section of the image; and applying a mechanical motion to the endoscope, according to the movement vector, to move the endoscope away from the proximate wall.
- the endoscope may be a colonoscope.
- Determining the wall section of the image may include determining a blurred portion of the endoscope image, and determining the blurred portion of the endoscope image may include: generating by edge detection an edge-rendered mapping from the endoscope image, dividing the edge-rendered mapping into sectors, determining a variance of pixel intensity for each sector, and determining the blurred portion as a subset of the sectors having variances less than a threshold value.
- the edge detection may be performed by applying a 3×3 Laplacian operator to the endoscope image to generate a second order derivative mapping.
- the threshold value may be a variance less than a preset percentile of the variances of all the image subsections. Alternatively, the threshold value may be a preset variance value.
- the subset of the sectors having variances less than a threshold is a “blurred subset”, and determining the vector directing away from the wall section may include determining coordinates in the endoscope image of a center of gravity (COG) of the blurred subset.
- the movement vector may be determined as a displacement from the coordinates of the COG towards a point in the endoscope image representing a current orientation of the endoscope tip.
- the movement vector may be determined as a weighted average of a displacement from the coordinates of the COG towards a point in the endoscope image representing a current orientation of the endoscope tip, and of a displacement to one or more additional targets, including at least one of a dark region target and a previous endoscope image target.
- a system for endoscope navigation may include a processor and a memory with computer-readable instructions that when executed cause the processor to perform steps of: receiving an endoscope image from an endoscope in a body lumen; determining a wall section of the endoscope image representing a proximate wall of the body lumen; determining a movement vector directing away from the wall section of the image; and applying a mechanical motion to the endoscope, according to the movement vector, to move the endoscope away from the proximate wall.
- FIG. 1 is a block diagram of a system for endoscope navigation, in accordance with an embodiment of the present invention
- FIG. 2 shows an endoscope image acquired during endoscope navigation, in accordance with an embodiment of the present invention
- FIG. 3 shows an edge-rendered mapping of the endoscope image, in accordance with an embodiment of the present invention
- FIG. 4 shows the edge-rendered mapping divided into sectors, in accordance with an embodiment of the present invention
- FIG. 5 shows a table of variances of pixel intensities of the sectors of the edge-rendered mapping, in accordance with an embodiment of the present invention
- FIG. 6 shows a blurred subset of the sectors, indicating proximity of a wall of a body lumen, in accordance with an embodiment of the present invention
- FIG. 7 shows vectors of several navigation targets, which are added to create a final target, during a process of endoscope navigation, in accordance with an embodiment of the present invention.
- FIG. 8 is a schematic, flow diagram of a process for endoscope navigation, according to an embodiment of the present invention.
- FIG. 1 is a block diagram of a system 20 for endoscope navigation, in accordance with an embodiment of the present invention.
- System 20 includes an endoscope 22 designed to be steered through a body lumen. In some embodiments, the endoscope 22 is a colonoscope, and the body lumen is a patient's colon.
- Situated at the endoscope tip 24 may be several devices, such as a light emitter 26 , a water nozzle 28 , or other surgical instruments 30 (such as cautery probes, cold or hot forceps, snares, injection needles, suction, nets, or endoloops).
- the endoscope tip 24 also has a camera 32 , which typically sends images as video frames to a computer controller 40 .
- the lens system of the camera 32 is designed to provide a range of focus that is typically not less than 3 mm, and the steering of the endoscope is guided according to the range of focus.
- the computer controller 40 typically includes a processor 42 and a memory 44 , the memory including instructions for image processing described further hereinbelow.
- the computer controller may also be configured to receive input from a user control 60 , which may, for example be steering instructions for controlling the endoscope 22 .
- the computer controller 40 may be configured to control automated navigation of the endoscope 22 at least for a portion of the process of inserting the endoscope into, and removing it from, the body lumen.
- the computer controller 40 may also send navigation instructions 44 to a mechanical directional controller 50, for controlling a 3D position (i.e., x, y, and z axes) of the endoscope tip 24.
- the directional controller 50 typically has motors that operate cables of the endoscope 22 to move the endoscope tip 24 .
- the computer controller 40 may also present video from the camera 32 on a user display 62 . Additional objects rendered by the computer controller 40 , such as a navigation target may also be presented on the user display 62 .
- the endoscope tip may include additional navigation sensors, such as distance sensors and remote tracking sensors, not shown.
- the navigation methods provided by the present invention reduce the need for such additional sensors.
- measurements by such navigation sensors may also be applied to complement and to confirm the navigation process described hereinbelow.
- FIG. 2 shows an endoscope image 120 acquired during endoscope navigation, in accordance with an embodiment of the present invention.
- the computer controller 40 receives the endoscope image 120 from the camera 32 , typically as a frame of a video transmission.
- the endoscope image is calibrated to indicate the orientation of the endoscope tip, such that the tip is oriented towards an image center 125 .
- a current orientation of the endoscope tip may be indicated by other preset points of the image.
- the endoscope image 120 shown in the figure is an image of a colon, taken during a colonoscopy procedure.
- the computer controller 40 determines whether features of the image indicate that the endoscope tip is closer than a preset threshold to a wall of the body lumen. In an embodiment of the present invention, the computer controller processes the image to determine an out-of-focus (i.e., blurry) region, indicative of a region that is closer to the camera than the minimum focal length of the camera lens.
- pixel saturation may be employed, as saturation may be indicative of high reflection very close to the endoscope light.
- a blurry region may be categorized according to a method described in Pech-Pacheco, et al., “Diatom autofocusing in brightfield microscopy: a comparative study” (Proceedings 15th International Conference on Pattern Recognition, IEEE, Sept. 2000).
- An edge-rendered mapping is first generated by an edge detection operation. The variance of the pixel intensity in the mapping is then calculated. A high variance is indicative of good focus, while a low variance is indicative of poor focus.
- An edge-rendered mapping of an image may be calculated by several methods known in the art, such as by convolution of the image with a discrete Laplace operator mask.
- a common mask for edge detection has the form of the 3×3 discrete Laplacian:

   0  1  0
   1 -4  1
   0  1  0
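The edge-rendered mapping can be sketched in NumPy as a convolution with this mask; the function name, padding mode, and the particular mask coefficients here are illustrative choices, not taken from the patent:

```python
import numpy as np

# One common 3x3 discrete Laplacian mask (an assumed form; the text
# specifies a 3x3 Laplacian operator without fixing its coefficients).
LAPLACIAN = np.array([[0,  1, 0],
                      [1, -4, 1],
                      [0,  1, 0]], dtype=float)

def edge_rendered_mapping(image):
    """Convolve a 2-D grayscale image with the Laplacian mask.

    Edges appear as large magnitudes in the resulting second-order
    derivative mapping; flat regions map to zero.
    """
    h, w = image.shape
    padded = np.pad(image, 1, mode="edge")  # replicate border pixels
    out = np.zeros((h, w), dtype=float)
    for dy in range(3):                     # accumulate the 9 shifted terms
        for dx in range(3):
            out += LAPLACIAN[dy, dx] * padded[dy:dy + h, dx:dx + w]
    return out
```

Because the Laplacian mask is symmetric, correlation and convolution coincide, so the shifted-sum above is a valid convolution.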
- FIG. 3 shows an edge-rendered mapping 130 , generated by detecting edges in the endoscope image, in accordance with an embodiment of the present invention.
- the edge-rendered mapping 130 is divided into sectors 142 , as shown in FIG. 4 .
- a variance of pixel intensities within each sector of the edge-rendered mapping is then calculated.
- FIG. 5 shows a table of variances 150 for each sector, in accordance with an embodiment of the present invention. The table is superimposed on the original endoscope image 120 .
- the size of sectors for a given mapping may be varied, depending on factors such as camera resolution and quality.
- a typical sector size for a given mapping may be, for example, 10×10 pixels.
- the sectors may then be classified as either blurry or focused.
- a preset threshold of sector variance is set to distinguish between blurry and non-blurry sectors.
- a percentile of the overall range of variances can be used to distinguish blurry from non-blurry sectors.
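A minimal sketch of this sector-variance test, assuming a NumPy edge map; the 10-pixel sector size and 25th-percentile cutoff are illustrative parameter choices:

```python
import numpy as np

def blurred_sectors(edge_map, sector=10, percentile=25.0):
    """Per-sector variance of an edge-rendered mapping, plus a blur mask.

    Returns the grid of sector variances and a boolean grid in which
    True marks a sector whose variance falls at or below the given
    percentile of all sector variances (i.e. poor focus).
    """
    h, w = edge_map.shape
    rows, cols = h // sector, w // sector
    var = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            block = edge_map[r * sector:(r + 1) * sector,
                             c * sector:(c + 1) * sector]
            var[r, c] = block.var()
    threshold = np.percentile(var, percentile)  # percentile-based cutoff
    return var, var <= threshold
```

A fixed preset variance value could replace the percentile cutoff, matching the alternative threshold described in the text.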
- FIG. 6 shows a blurred subset 160 of the sectors, which have been distinguished as described above from the focused sectors.
- the blurriness of the blurred sectors is caused by the distance between a wall of the lumen (e.g., a wall of the colon) and the endoscope tip being less than the minimum field depth of the camera lens. Consequently the blurred subset represents a proximate wall of the lumen.
- the computer controller calculates a direction for navigating the endoscope tip away from the proximate wall.
- the direction is determined as a vector extending from a “center of mass” or “center of gravity” (COG) of the blurred subset 160 of sectors towards a preset or interactively set point of the endoscope image, such as the center 125 .
- the variance is inversely proportional to a measure of blurriness; consequently, the COG, i.e., a “center of blurriness”, may be calculated by assigning to each sector an inverse value of the variance (e.g., a constant divided by the variance).
- the inverse variance value is used to represent mass for the COG calculation.
- Other inverse variance indices may also be used.
- the two-dimensional coordinates of the COG are calculated from a standard COG formula, with the inverse variance of each sector serving as its mass:

  COG = ( Σ_i r_i / v_i ) / ( Σ_i 1 / v_i )

- where v_i is the variance of sector i, and r_i represents the coordinates of sector i, which may be, for example, coordinates of a central pixel of the sector, as measured from an arbitrary point, such as a corner or center point of the image.
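The COG computation with inverse-variance masses can be sketched as follows; the helper name and the small epsilon guarding against a zero variance are illustrative additions:

```python
import numpy as np

def blur_center_of_gravity(variances, centers, eps=1e-9):
    """Coordinates of the 'center of blurriness' of the blurred subset.

    variances: 1-D array of the blurred sectors' variances (v_i).
    centers:   (N, 2) array of the sectors' pixel coordinates (r_i).
    Each sector's mass is the inverse of its variance, so blurrier
    (lower-variance) sectors pull the COG more strongly.
    """
    weights = 1.0 / (np.asarray(variances, dtype=float) + eps)
    centers = np.asarray(centers, dtype=float)
    return (weights[:, None] * centers).sum(axis=0) / weights.sum()
```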
- the COG of the inverse variance of blurry sectors is indicated as point 162 in the figure.
- the computer controller calculates a movement vector 164 starting from point 162 and extending in the direction of a point of the endoscope image 120 that indicates the current orientation of the endoscope tip, such as the center 125.
- the movement vector 164 indicates a directional motion, which the computer controller then directs the directional controller 50 to apply to the endoscope.
- the origin of the movement vector 164 is transformed to the image center 125 (or other indicative orientation point), to generate a transformed vector 166 .
- the transformed vector indicates a navigation target 168 .
- FIG. 7 shows additional navigation targets, which may be aggregated with the navigation target 168 to create an aggregated final target 176 .
- the additional navigation targets may be, for example, a previously calculated navigation target 172 , calculated from a prior image, and a dark region target 174 , calculated by identifying the center of the darkest region of the image, by methods of image processing known in the art.
- the weights assigned to each target are preset.
- the final target may be calculated by assigning an averaging weight of 50% to the previous target, 30% to the new target, and 20% to the dark region target.
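That example weighting amounts to a simple weighted average of target coordinates; the function and parameter names below are hypothetical:

```python
def aggregate_targets(previous, new, dark,
                      w_prev=0.5, w_new=0.3, w_dark=0.2):
    """Weighted average of three navigation targets in image coordinates.

    Defaults mirror the illustrative 50/30/20 split from the text; in
    practice the weights would be preset for the system.
    """
    x = w_prev * previous[0] + w_new * new[0] + w_dark * dark[0]
    y = w_prev * previous[1] + w_new * new[1] + w_dark * dark[1]
    return (x, y)
```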
- FIG. 8 is a schematic, flow diagram of a process 200 for endoscope navigation, according to an embodiment of the present invention.
- the navigation begins, with the insertion of the endoscope into the body lumen, for example, by beginning a colonoscopy procedure.
- An iterative process of computer-assisted or computer-controlled navigation then begins at a step 212 , with the capture of the first endoscope image, typically a frame of an endoscope video.
- pixels of the image are analyzed by computer methods to determine whether there is an indication of a proximate lumen wall in the image.
- the analysis may include methods described above for determining a section of saturated pixels, or, alternatively or additionally, a blurred area of the image, indicating that a wall of the lumen is closer than the minimum focal length of the endoscope camera.
- the process of determining the blurred area includes detecting edges in the image by edge detection methods such as applying a Laplace operator to the pixels of the image to generate an edge-rendered mapping. Other methods of edge detection may also be employed to generate the edge-rendered mapping. After the edge-rendered mapping is generated, the mapping may be divided into sectors and the variance of pixel intensity in each sector calculated to generate an indication of blurriness, thereby indicating that the blurred subset of sectors is a wall section of the image.
- the computer controller calculates a center of gravity (COG), which is a center of weighted values of the subset of pixel sectors that is determined to represent a proximate wall.
- the weighted center may be calculated from inverse values of the variances.
- the computer controller then calculates a vector extending from the COG to the center of the endoscope image to turn the endoscope tip away from the proximate wall.
- the vector may also be calculated as a weighted average of multiple targets, which may include a prior vector target as well as a dark region target.
- the computer controller sends a signal indicative of the calculated vector to the mechanical directional controller of the endoscope to navigate the endoscope tip away from the wall, and then waits to receive a new image, that is, the process returns to step 212 .
- the mechanical directional controller is pre-calibrated to convert the signal into an endoscope tip motion within a safe working range of operation.
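One plausible form of such a pre-calibrated conversion is a fixed gain followed by a step limit; the gain and maximum-step values here are hypothetical calibration constants, not values from the patent:

```python
import math

def to_tip_motion(vector, gain=0.05, max_step=1.0):
    """Map an image-space movement vector to a bounded tip motion.

    The gain scales pixels to the controller's motion units, and the
    step limit clamps the commanded motion to a safe working range.
    """
    dx, dy = vector[0] * gain, vector[1] * gain
    norm = math.hypot(dx, dy)
    if norm > max_step:                 # clamp to the safe range
        dx, dy = dx * max_step / norm, dy * max_step / norm
    return (dx, dy)
```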
- Computer processing elements described may be distributed processing elements, implemented over wired and/or wireless networks. Such computing systems may furthermore be implemented by multiple alternative and/or cooperative configurations, such as a data center server or a cloud configuration of processors and data repositories. Processing elements of the system may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations thereof.
- Such elements can be implemented as a computer program product, tangibly embodied in an information carrier, such as a non-transient, machine-readable storage device, for execution by, or to control the operation of, data processing apparatus, such as a programmable processor, computer, or deployed to be executed on multiple computers at one site or distributed across multiple sites.
- Memory storage may also include multiple distributed memory units, including one or more types of storage media.
- Processing elements and memory storage such as databases, may be implemented so as to include security features, such as authentication processes known in the art.
Abstract
A computerized system and method of endoscope navigation is provided that includes receiving an endoscope image from the endoscope in a body lumen, determining from the endoscope image a proximate wall of the body lumen, determining a movement vector directing away from the proximate wall, and applying a mechanical motion to the endoscope, according to the movement vector, to move the endoscope away from the proximate wall.
Description
- The present invention is directed to systems and methods for robotically assisted medical procedures, in particular for navigation of medical instruments for diagnosis and surgery.
- A colonoscope typically includes a light source and an image capture device and is externally steerable through the colon. Colonoscopy is frequently performed to screen for asymptomatic cancers at an early stage or to find and to remove precancerous polyps. Colonoscopy may also be performed to diagnose rectal bleeding or changes in bowel habits and inflammatory bowel disease. Although the procedure is common, safe navigation of a colonoscope through the colon can be difficult due to the colon's distensible and highly mobile nature. Colonoscopy complications that can occur include colon perforation, hemorrhage, or severe abdominal pain. Robotic control of colonoscope movement is being developed to provide greater precision and speed for the procedure; however, robotic control must also contend with the varying, dynamic anatomy of the colon.
- Prevalent manual and automatic visual methods for navigating the colonoscope rely on directing the colonoscope towards dark regions of images acquired by the colonoscope. Khan and Gillies (“Vision based navigation for an endoscope”, Image and Vision Computing, 14 (1996), pp. 763-772) describe determining and visually presenting the colon space, for manual and/or automatic navigation of the colon. The method includes directing the colonoscope towards a dark region of the lumen, according to the colonoscope image. The method also includes determining distances between muscle contours of the colon. Khan and Gillies note that a complete colon representation is necessary for guiding the endoscope around the bends when the view of the lumen is lost and muscle contours are not visible, or when the colon includes pockets or unforeseen obstacles.
- U.S. Patent Publication 2003/0167007 to Belson is directed to a method of colonoscopy whereby a spectroscopy device is attached to the tip of the colonoscope to create a three dimensional map of the colon for use by an automated method of advancing the colonoscope. U.S. Pat. No. 8,795,157 to Yaron and Frenkel is directed to a method for advancing a colonoscope by acquiring a stereoscopic image pair of a region of the interior of a colon, identifying a plurality of topographical features on an inner wall of the colon, determining the depth of each topographical feature, determining a radius of curvature of the colon, and advancing the colonoscope according to the direction of the topographical feature with greatest depth, according to the radius of curvature.
- U.S. Pat. No. 8,514,218 to Hong and Paladini is directed to navigating a colonoscope according to a depth image captured by an angular fisheye lens. In such a lens, the resolution is approximately equal across the whole image. The depth image is generated according to a ray-casting volume rendering scheme. In the depth image, the gray level is proportional to the distance from the camera to the colon surface, with the brighter region corresponding to the colon lumen, which is far from the current camera location and is called the target region.
- Similarly, U.S. Patent Publication 2003/0152897 to Geiger is directed to navigating a colonoscope by ray-casting, whereby for every pixel of an acquired colon image, a ray is cast and its intersection with an organ wall is calculated, to determine a longest ray. The colonoscope is then navigated in the direction of the longest ray.
- The aforementioned methods do not overcome a variety of difficulties of colonoscope navigation that stem from the dynamic anatomy of the colon, as well as its highly varied surface structure. The colon may include dark pockets, including disease-related distortions such as diverticulitis. The colon image may also include image artifacts, such as fluids or bubbles on the lens. When the endoscope tip is close to the colon wall, the image may display ‘red-out’ or ‘wall view’. Camera movement may also cause motion blur artifacts. These complicating factors and artifacts have impeded successful implementation of automated navigation.
- Embodiments of the present invention provide systems and methods for navigating an endoscope during colonoscopy. In some embodiments, a method of endoscope navigation includes: receiving an endoscope image from an endoscope in a body lumen; determining a wall section of the endoscope image representing a proximate wall of the body lumen; determining a movement vector directing away from the wall section of the image; and applying a mechanical motion to the endoscope, according to the movement vector, to move the endoscope away from the proximate wall. In further embodiments, the endoscope may be a colonoscope. Determining the wall section of the image may include determining a blurred portion of the endoscope image, and determining the blurred portion of the endoscope image may include: generating by edge detection an edge-rendered mapping from the endoscope image, dividing the edge-rendered mapping into sectors, determining a variance of pixel intensity for each sector, and determining the blurred portion as a subset of the sectors having variances less than a threshold value.
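By way of non-limiting illustration, one iteration of the above method may be sketched as follows; determine_wall_section, compute_movement_vector, and apply_motion are hypothetical stand-ins for the steps named above, not functions defined in this disclosure:

```python
def navigate_step(image, determine_wall_section, compute_movement_vector, apply_motion):
    """One navigation iteration: determine the wall section of the endoscope
    image, derive a movement vector directing away from it, and apply the
    corresponding mechanical motion to the endoscope."""
    wall_section = determine_wall_section(image)
    if wall_section is None:          # no proximate wall detected in this frame
        return None
    vector = compute_movement_vector(wall_section)
    apply_motion(vector)              # e.g., forwarded to a directional controller
    return vector
```

In practice, the three callables would wrap the blur analysis, the center-of-gravity computation, and the mechanical directional controller described hereinbelow.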
- In further embodiments, the edge detection may be performed by applying a 3×3 Laplacian operator to the endoscope image to generate a second order derivative mapping.
- The threshold value may be a variance less than a preset percentile of the variances of all the image subsections. Alternatively, the threshold value may be a preset variance value. The subset of the sectors having variances less than a threshold is a “blurred subset”, and determining the vector directing away from the wall section may include determining coordinates in the endoscope image of a center of gravity (COG) of the blurred subset. In some embodiments, the movement vector may be determined as a displacement from the coordinates of the COG towards a point in the endoscope image representing a current orientation of the endoscope tip. In alternative embodiments, the movement vector may be determined as a weighted average of a displacement from the coordinates of the COG towards a point in the endoscope image representing a current orientation of the endoscope tip, and of a displacement to one or more additional targets, including at least one of a dark region target and a previous endoscope image target.
- In further embodiments, a system for endoscope navigation may include a processor and a memory with computer-readable instructions that when executed cause the processor to perform steps of: receiving an endoscope image from an endoscope in a body lumen; determining a wall section of the endoscope image representing a proximate wall of the body lumen; determining a movement vector directing away from the wall section of the image; and applying a mechanical motion to the endoscope, according to the movement vector, to move the endoscope away from the proximate wall.
- The present invention will be more fully understood from the following detailed description of embodiments thereof.
- In the following detailed description of various embodiments, reference is made to the following drawings that form a part thereof, and in which are shown by way of illustration specific embodiments by which the invention may be practiced, wherein:
- FIG. 1 is a block diagram of a system for endoscope navigation, in accordance with an embodiment of the present invention;
- FIG. 2 shows an endoscope image acquired during endoscope navigation, in accordance with an embodiment of the present invention;
- FIG. 3 shows an edge-rendered mapping of the endoscope image, in accordance with an embodiment of the present invention;
- FIG. 4 shows the edge-rendered mapping divided into sectors, in accordance with an embodiment of the present invention;
- FIG. 5 shows a table of variances of pixel intensities of the sectors of the edge-rendered mapping, in accordance with an embodiment of the present invention;
- FIG. 6 shows a blurred subset of the sectors, indicating proximity of a wall of a body lumen, in accordance with an embodiment of the present invention;
- FIG. 7 shows vectors of several navigation targets, which are added to create a final target, during a process of endoscope navigation, in accordance with an embodiment of the present invention; and
- FIG. 8 is a schematic flow diagram of a process for endoscope navigation, according to an embodiment of the present invention.
- In the following detailed description of various embodiments, it is understood that other embodiments may be utilized and structural changes may be made without departing from the scope of the present invention.
-
FIG. 1 is a block diagram of a system 20 for endoscope navigation, in accordance with an embodiment of the present invention. System 20 includes an endoscope 22 designed to be steered through a body lumen. In some embodiments, the endoscope is a colonoscope, and the body lumen is a patient's colon. Situated at the endoscope tip 24 may be several devices, such as a light emitter 26, a water nozzle 28, or other surgical instruments 30 (such as cautery probes, cold or hot forceps, snares, injection needles, suction, nets, or endoloops). The endoscope tip 24 also has a camera 32, which typically sends images as video frames to a computer controller 40. As described further hereinbelow, the lens system of the camera 32 is designed to provide a range of focus that is typically not less than 3 mm, and the steering of the endoscope is guided according to the range of focus. - The
computer controller 40 typically includes a processor 42 and a memory 44, the memory including instructions for image processing described further hereinbelow. The computer controller may also be configured to receive input from a user control 60, which may, for example, be steering instructions for controlling the endoscope 22. Alternatively or additionally, the computer controller 40 may be configured to control automated navigation of the endoscope 22 at least for a portion of the process of inserting and removing the endoscope from the body lumen. The computer controller 40 may also send navigation instructions 44 to a mechanical directional controller 50, for controlling a 3D position (i.e., x, y, and z axes) of the endoscope tip 24. The directional controller 50 typically has motors that operate cables of the endoscope 22 to move the endoscope tip 24. - The
computer controller 40 may also present video from the camera 32 on a user display 62. Additional objects rendered by the computer controller 40, such as a navigation target, may also be presented on the user display 62.
- The endoscope tip may include additional navigation sensors, such as distance sensors and remote tracking sensors, not shown. The navigation methods provided by the present invention reduce the need for such additional sensors. However, measurements by such navigation sensors may also be applied to complement and to confirm the navigation process described hereinbelow.
-
FIG. 2 shows an endoscope image 120 acquired during endoscope navigation, in accordance with an embodiment of the present invention. The computer controller 40 receives the endoscope image 120 from the camera 32, typically as a frame of a video transmission. Typically, the endoscope image is calibrated to indicate the orientation of the endoscope tip, such that the tip is oriented towards an image center 125. Alternatively, a current orientation of the endoscope tip may be indicated by other preset points of the image. The endoscope image 120 shown in the figure is an image of a colon, taken during a colonoscopy procedure. - Upon receiving the
endoscope image 120, the computer controller 40 determines whether features of the image indicate that the endoscope tip is closer than a preset threshold to a wall of the body lumen. In an embodiment of the present invention, the computer controller processes the image to determine an out-of-focus (i.e., blurry) region, indicative of a region that is closer to the camera than the minimum focal length of the camera lens. Alternatively, other methods of analyzing pixels of the image to determine wall proximity may be incorporated. For example, pixel saturation may be employed, as saturation may be indicative of high reflection very close to the endoscope light.
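By way of example only, a saturation-based proximity check of the kind mentioned above may be sketched as follows; the threshold values level and frac are illustrative assumptions, not values given in this disclosure:

```python
import numpy as np

def wall_proximity_by_saturation(gray, level=250, frac=0.3):
    """Flag possible wall proximity when a large fraction of pixels is near
    saturation (high reflection very close to the endoscope light).
    `level` (8-bit intensity cutoff) and `frac` (fraction of saturated
    pixels) are illustrative thresholds, not values from the text."""
    gray = np.asarray(gray)
    return float((gray >= level).mean()) >= frac
```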
- An edge-rendered mapping of an image may be calculated by several methods known in the art, such as by convolution of the image with a discrete Laplace operator mask. A common mask for edge detection has the form:
-
-
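By way of a non-limiting sketch, convolution of a grayscale image with such a 3×3 Laplacian mask may be implemented as follows; the edge-replicated boundary handling is an illustrative choice not specified in the text:

```python
import numpy as np

def edge_rendered_mapping(image):
    """Convolve a grayscale image with the 4-neighbor 3x3 Laplacian mask
    (0,1,0 / 1,-4,1 / 0,1,0) to produce a second-order-derivative
    (edge-rendered) mapping, using edge-replicated padding at the borders."""
    img = np.asarray(image, dtype=float)
    p = np.pad(img, 1, mode="edge")  # replicate border pixels
    # up + down + left + right - 4 * center
    return (p[:-2, 1:-1] + p[2:, 1:-1] + p[1:-1, :-2] + p[1:-1, 2:]
            - 4.0 * p[1:-1, 1:-1])
```

With this padding, uniform (featureless) regions map to zero, so their variance in the mapping is low, consistent with the blur measure described above.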
FIG. 3 shows an edge-rendered mapping 130, generated by detecting edges in the endoscope image, in accordance with an embodiment of the present invention. In order to differentiate between levels of blurriness in regions of the endoscope image, the edge-rendered mapping 130 is divided into sectors 142, as shown in FIG. 4. A variance of pixel intensities within each sector of the edge-rendered mapping is then calculated. FIG. 5 shows a table of variances 150 for each sector, in accordance with an embodiment of the present invention. The table is superimposed on the original endoscope image 120. The size of sectors for a given mapping may be varied, depending on factors such as camera resolution and quality. A typical sector size for a given mapping may be, for example, 10×10 pixels.
- After the variances are calculated for each sector of the endoscope image, the sectors may be differentiated between blurry sectors and focused sectors. In some embodiments, a preset threshold of sector variance is set to distinguish between blurry and non-blurry sectors. Alternatively, a percentile of the overall range of variances can be used to distinguish blurry from non-blurry sectors.
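The sector division, per-sector variance, and percentile thresholding described above may be sketched, by way of example, as follows; the 50th-percentile cutoff is an illustrative choice:

```python
import numpy as np

def blurred_sectors(edge_map, sector=10, percentile=50.0):
    """Divide an edge-rendered mapping into sector x sector tiles, compute
    the variance of pixel intensity in each tile, and flag as blurred the
    tiles whose variance falls below the given percentile of all tile
    variances. The 10x10 sector size and the use of a percentile threshold
    follow the text; the exact percentile value is an illustrative choice.
    Returns (variances, blurred_mask), each of shape (rows, cols)."""
    h, w = edge_map.shape
    rows, cols = h // sector, w // sector
    variances = np.empty((rows, cols))
    for r in range(rows):
        for c in range(cols):
            tile = edge_map[r * sector:(r + 1) * sector,
                            c * sector:(c + 1) * sector]
            variances[r, c] = tile.var()
    threshold = np.percentile(variances, percentile)
    return variances, variances < threshold
```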
-
FIG. 6 shows a blurred subset 160 of the sectors, which have been distinguished, as described above, from the focused sectors. The blurriness of the blurred sectors is caused by the distance between a wall of the lumen (e.g., a wall of the colon) and the endoscope tip being less than the minimum field depth of the camera lens. Consequently, the blurred subset represents a proximate wall of the lumen. - From the
blurred subset 160 of the sectors, the computer controller calculates a direction for navigating the endoscope tip away from the proximate wall. The direction is determined as a vector extending from a “center of mass” or “center of gravity” (COG) of theblurred subset 160 of sectors towards a preset or interactively set point of the endoscope image, such as thecenter 125. - The variance is inversely proportional to a measure of blurriness; consequently, the COG, i.e., a “center of blurriness”, may be calculated by assigning to each sector an inverse value of the variance (e.g., a constant divided by the variance). The inverse variance value is used to represent mass for the COG calculation. Other inverse variance indices may also be used. In one embodiment, by way of example, the two-dimensional coordinates of the COG are calculated from a standard COG formula, as follows:
-
COG = [ Σ (ri / vi) ] / [ Σ (1 / vi) ]
- The COG of the inverse variance of blurry sectors is indicated as
point 162 in the figure. - The computer controller calculates a
movement vector 164 starting frompoint 162 and extending in the direction of a point indicating a current orientation of theendoscope image 120, such as thecenter 125. Themovement vector 164 indicates a directional motion which the computer controller then directs thedirection controller 50 to apply to the endoscope. - In further embodiments, the origin of the
movement vector 164 is transformed to the image center 125 (or other indicative orientation point), to generate a transformedvector 166. The transformed vector indicates anavigation target 168.FIG. 7 shows additional navigation targets, which may be aggregated with thenavigation target 168 to create an aggregatedfinal target 176. The additional navigation targets may be, for example, a previously calculatednavigation target 172, calculated from a prior image, and adark region target 174, calculated by identifying the center of the darkest region of the image, by methods of image processing known in the art. In one embodiment, the weights assigned to each target are preset. For example, the final target may be calculated by according an averaging weight of 50% to the previous target, 30% to the new target, and 20% to the dark region target. -
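By way of example, the weighted aggregation of targets described above may be sketched as follows, using the 50%/30%/20% weights from the text:

```python
def aggregate_targets(previous, new, dark, weights=(0.5, 0.3, 0.2)):
    """Weighted average of navigation targets (image-plane (x, y) points),
    using the example weights from the text: 50% previous target, 30% new
    target, 20% dark-region target."""
    wp, wn, wd = weights
    return (wp * previous[0] + wn * new[0] + wd * dark[0],
            wp * previous[1] + wn * new[1] + wd * dark[1])
```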
FIG. 8 is a schematic flow diagram of a process 200 for endoscope navigation, according to an embodiment of the present invention. At an initial step 210, the navigation begins with the insertion of the endoscope into the body lumen, for example, by beginning a colonoscopy procedure. An iterative process of computer-assisted or computer-controlled navigation then begins at a step 212, with the capture of the first endoscope image, typically a frame of an endoscope video. - At a
step 214, pixels of the image are analyzed by computer methods to determine whether there is an indication of a proximate lumen wall in the image. The analysis may include the methods described above for determining a section of saturated pixels, or alternatively or additionally a blurred area of the image, indicating that a wall of the lumen is closer than the minimum focal length of the endoscope camera.
- As described above, in some embodiments, the process of determining the blurred area includes detecting edges in the image by edge detection methods, such as applying a Laplace operator to the pixels of the image to generate an edge-rendered mapping. Other methods of edge detection may also be employed to generate the edge-rendered mapping. After the edge-rendered mapping is generated, the mapping may be divided into sectors and the variance of pixel intensity in each sector calculated to generate an indication of blurriness, thereby indicating that the blurred subset of sectors is a wall section of the image.
- At a
step 216, the computer controller calculates a center of gravity (COG), which is a center of weighted values of the subset of pixel sectors that is determined to represent a proximate wall. When the proximate wall is determined by the variance of pixel intensity in the edge-rendered mapping, the weighted center may be calculated from inverse values of the variances. - At a
step 220, the computer controller then calculates a vector extending from the COG to the center of the endoscope image, to turn the endoscope tip away from the proximate wall. In further embodiments, the vector may also be calculated as a weighted average of multiple targets, which may include a prior vector target as well as a dark region target. At a step 222, the computer controller sends a signal indicative of the calculated vector to the mechanical directional controller of the endoscope to navigate the endoscope tip away from the wall, and then waits to receive a new image; that is, the process returns to step 212. The mechanical directional controller is pre-calibrated to convert the signal into an endoscope tip motion within a safe working range of operation.
- It is to be understood that the embodiments described hereinabove are cited by way of example, and that the present invention is not limited to what has been particularly shown and described hereinabove. The scope of the present invention includes variations and modifications thereof which would occur to persons skilled in the art upon reading the foregoing description and which are not disclosed in the prior art. Computer processing elements described may be distributed processing elements, implemented over wired and/or wireless networks. Such computing systems may furthermore be implemented by multiple alternative and/or cooperative configurations, such as a data center server or a cloud configuration of processors and data repositories. Processing elements of the system may be implemented in digital electronic circuitry, or in computer hardware, firmware, software, or in combinations thereof.
Such elements can be implemented as a computer program product, tangibly embodied in an information carrier, such as a non-transient, machine-readable storage device, for execution by, or to control the operation of, data processing apparatus, such as a programmable processor or computer, or deployed to be executed on multiple computers at one site or distributed across multiple sites. Memory storage may also include multiple distributed memory units, including one or more types of storage media.
- Communications between systems and devices described above are assumed to be performed by software modules and hardware devices known in the art. Processing elements and memory storage, such as databases, may be implemented so as to include security features, such as authentication processes known in the art.
- Method steps associated with the system and process can be rearranged and/or one or more such steps can be omitted to achieve the same, or similar, results to those described herein.
Claims (21)
1. A method of endoscope navigation comprising:
receiving an endoscope image from an endoscope in a body lumen;
determining a wall section of the endoscope image representing a proximate wall of the body lumen, as a blurred section of the endoscope image;
determining a movement vector as a displacement from the blurred section; and
applying a mechanical motion to the endoscope, according to the movement vector, to move the endoscope away from the proximate wall.
2. The method of claim 1 , wherein the endoscope is a colonoscope and the body lumen is a colon.
3. (canceled)
4. The method of claim 1 , wherein determining the blurred portion of the endoscope image comprises: generating by edge detection an edge-rendered mapping from the endoscope image, dividing the edge-rendered mapping into sectors, determining a variance of pixel intensity for each sector, and determining the blurred portion as a subset of the sectors having variances less than a threshold value.
5. The method of claim 4 , wherein the edge detection is performed by applying a 3×3 Laplacian operator to the endoscope image to generate a second order derivative mapping.
6. The method of claim 4 , wherein the threshold value is a variance less than a preset percentile of the variances of all the image subsections.
7. The method of claim 4 , wherein the threshold value is a preset variance value.
8. The method of claim 4 , wherein the subset of the sectors having variances less than a threshold is a “blurred subset”, and wherein determining the movement vector directing away from the wall section comprises determining coordinates in the endoscope image of a center of gravity (COG) of the blurred subset.
9. The method of claim 8 , wherein the movement vector is determined as a displacement from the coordinates of the COG towards a point in the endoscope image representing a current orientation of the endoscope tip.
10. The method of claim 8 , wherein the movement vector is determined as a weighted average of a displacement from the coordinates of the COG towards one or more target points, wherein the one or more target points include one or more of:
a point in the endoscope image representing a current orientation of the endoscope tip; a dark region target;
and a previous endoscope image target.
11. A system for endoscope navigation, comprising a processor and a non-transient memory with computer-readable instructions that when executed cause the processor to perform steps of:
receiving an endoscope image from an endoscope in a body lumen;
determining a wall section of the endoscope image representing a proximate wall of the body lumen, as a blurred section of the endoscope image;
determining a movement vector as a displacement from the blurred section; and
applying a mechanical motion to the endoscope, according to the movement vector, to move the endoscope away from the proximate wall.
12. The system of claim 11 , wherein the endoscope is a colonoscope and the body lumen is a colon.
13. (canceled)
14. The system of claim 11 , wherein determining the blurred portion of the endoscope image comprises: generating by edge detection an edge-rendered mapping from the endoscope image, dividing the edge-rendered mapping into sectors, determining a variance of pixel intensity for each sector, and determining the blurred portion as a subset of the sectors having variances less than a threshold value.
15. The system of claim 14 , wherein the edge detection is performed by applying a 3×3 Laplacian operator to the endoscope image to generate a second order derivative mapping.
16. The system of claim 14 , wherein the threshold value is a variance less than a preset percentile of the variances of all the image subsections.
17. The system of claim 14 , wherein the threshold value is a preset variance value.
18. The system of claim 14 , wherein the subset of the sectors having variances less than a threshold is a “blurred subset”, and wherein determining the movement vector comprises determining coordinates in the endoscope image of a center of gravity (COG) of the blurred subset.
19. The system of claim 18 , wherein the movement vector is determined as a displacement from the coordinates of the COG towards a point in the endoscope image representing a current orientation of the endoscope tip.
20. The system of claim 18 , wherein the movement vector is determined as a weighted average of a displacement from the coordinates of the COG towards one or more target points, wherein the one or more target points include one or more of: a point in the endoscope image representing a current orientation of the endoscope tip; a dark region target; and a previous endoscope image target.
21. A system for colonoscope navigation, comprising a processor and a non-transient memory with computer-readable instructions that when executed cause the processor to perform steps of:
receiving a colonoscope image from a colonoscope in a colon;
determining a wall section of the colonoscope image representing a proximate wall of the colon, as a blurred section of the colonoscope image;
determining a movement vector as a displacement from the blurred section; and
applying a mechanical motion to the colonoscope, according to the movement vector, to move the colonoscope away from the proximate wall.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US17/257,470 US20210161604A1 (en) | 2018-07-17 | 2019-07-15 | Systems and methods of navigation for robotic colonoscopy |
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US201862699185P | 2018-07-17 | 2018-07-17 | |
PCT/IL2019/050793 WO2020016886A1 (en) | 2018-07-17 | 2019-07-15 | Systems and methods of navigation for robotic colonoscopy |
US17/257,470 US20210161604A1 (en) | 2018-07-17 | 2019-07-15 | Systems and methods of navigation for robotic colonoscopy |
Publications (1)
Publication Number | Publication Date |
---|---|
US20210161604A1 true US20210161604A1 (en) | 2021-06-03 |
Family
ID=69164808
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/257,470 Abandoned US20210161604A1 (en) | 2018-07-17 | 2019-07-15 | Systems and methods of navigation for robotic colonoscopy |
Country Status (2)
Country | Link |
---|---|
US (1) | US20210161604A1 (en) |
WO (1) | WO2020016886A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113907693A (en) * | 2021-12-10 | 2022-01-11 | 极限人工智能有限公司 | Operation mapping ratio adjusting method and device, electronic equipment and storage medium |
WO2024029502A1 (en) * | 2022-08-01 | 2024-02-08 | 日本電気株式会社 | Endoscopic examination assistance device, endoscopic examination assistance method, and recording medium |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US11445108B1 (en) | 2021-03-05 | 2022-09-13 | International Business Machines Corporation | Turn direction guidance of an endoscopic device |
Citations (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US10022192B1 (en) * | 2017-06-23 | 2018-07-17 | Auris Health, Inc. | Automatically-initialized robotic systems for navigation of luminal networks |
Family Cites Families (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7894648B2 (en) * | 2005-06-17 | 2011-02-22 | Mayo Foundation For Medical Education And Research | Colonoscopy video processing for quality metrics determination |
US9661991B2 (en) * | 2005-08-24 | 2017-05-30 | Koninklijke Philips N.V. | System, method and devices for navigated flexible endoscopy |
US20150374210A1 (en) * | 2013-03-13 | 2015-12-31 | Massachusetts Institute Of Technology | Photometric stereo endoscopy |
US11116383B2 (en) * | 2014-04-02 | 2021-09-14 | Asensus Surgical Europe S.à.R.L. | Articulated structured light based-laparoscope |
-
2019
- 2019-07-15 US US17/257,470 patent/US20210161604A1/en not_active Abandoned
- 2019-07-15 WO PCT/IL2019/050793 patent/WO2020016886A1/en active Application Filing
Also Published As
Publication number | Publication date |
---|---|
WO2020016886A1 (en) | 2020-01-23 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11145053B2 (en) | Image processing apparatus and computer-readable storage medium storing instructions for specifying lesion portion and performing differentiation classification in response to judging that differentiation classification operation is engaged based on signal from endoscope | |
US10694933B2 (en) | Image processing apparatus and image processing method for image display including determining position of superimposed zoomed image | |
CN107292857B (en) | Image processing apparatus and method, and computer-readable storage medium | |
US20210161604A1 (en) | Systems and methods of navigation for robotic colonoscopy | |
US10827906B2 (en) | Endoscopic surgery image processing apparatus, image processing method, and program | |
US9482855B2 (en) | Microscope system | |
US9826884B2 (en) | Image processing device for correcting captured image based on extracted irregularity information and enhancement level, information storage device, and image processing method | |
WO2018230098A1 (en) | Endoscope system, and method for operating endoscope system | |
WO2017203701A1 (en) | Image processing device, operation method for image processing device, and operation program for image processing device | |
JP6956853B2 (en) | Diagnostic support device, diagnostic support program, and diagnostic support method | |
JP7385731B2 (en) | Endoscope system, image processing device operating method, and endoscope | |
CN114022547A (en) | Endoscope image detection method, device, equipment and storage medium | |
JP4077716B2 (en) | Endoscope insertion direction detection device | |
US11857153B2 (en) | Systems and methods for multi-modal sensing of depth in vision systems for automated surgical robots | |
JP6664486B2 (en) | Image processing apparatus, operation method of image processing apparatus, and operation program of image processing apparatus | |
van der Stap et al. | Image-based navigation for a robotized flexible endoscope | |
US11000247B2 (en) | Method for operating an X-ray device with enhanced depiction of a medical component | |
US20210258507A1 (en) | Method and system for depth-based illumination correction | |
WO2018158817A1 (en) | Image diagnosis device, image diagnosis method, and program | |
Hong et al. | Colonoscopy simulation | |
Martínez et al. | Estimating the size of polyps during actual endoscopy procedures using a spatio-temporal characterization | |
CN114785948B (en) | Endoscope focusing method and device, endoscope image processor and readable storage medium | |
KR102378497B1 (en) | Method and Apparatus for Measuring Object Size | |
JP2005160916A (en) | Method, apparatus and program for determining calcification shadow | |
US20220346632A1 (en) | Image processing apparatus, image processing method, and non-transitory storage medium storing computer program |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STPP | Information on status: patent application and granting procedure in general |
Free format text: APPLICATION UNDERGOING PREEXAM PROCESSING |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STCB | Information on status: application discontinuation |
Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION |