US20190231167A1 - System and method for guiding and tracking a region of interest using an endoscope - Google Patents

System and method for guiding and tracking a region of interest using an endoscope

Info

Publication number
US20190231167A1
US20190231167A1 (application US16/382,589; publication US 2019/0231167 A1)
Authority
US
United States
Prior art keywords
interest
region
display screen
viewing element
target area
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US16/382,589
Inventor
Achia Kronman
Ron Sharoni
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
EndoChoice Inc
Original Assignee
EndoChoice Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by EndoChoice Inc filed Critical EndoChoice Inc
Priority to US16/382,589
Publication of US20190231167A1
Legal status: Abandoned

Classifications

    • A61B1/0005: Display arrangement combining images, e.g. side-by-side, superimposed or tiled
    • A61B1/00009: Operational features of endoscopes characterised by electronic signal processing of image signals during use of the endoscope
    • A61B1/000094: Electronic signal processing of image signals for extracting biological structures
    • A61B1/00039: Operational features of endoscopes provided with input arrangements for the user
    • A61B1/0004: Input arrangements for the user for electronic operation
    • A61B1/00042: Input arrangements for the user for mechanical operation
    • A61B1/00114: Electrical cables in or with an endoscope
    • A61B1/0016: Holding or positioning arrangements using motor drive units
    • A61B1/018: Internal passages or accessories for receiving instruments
    • A61B1/05: Image sensor, e.g. camera, in the distal end portion
    • A61B1/06: Endoscopes with illuminating arrangements
    • G06F3/04842: Selection of displayed objects or displayed text elements (GUI interaction techniques)
    • H04N23/56: Cameras or camera modules provided with illuminating means
    • H04N5/2256
    • A61B2034/2055: Optical tracking systems (surgical navigation)
    • H04N2005/2255
    • H04N23/555: Constructional details for picking up images in sites inaccessible due to their dimensions or hazardous conditions, e.g. endoscopes or borescopes

Definitions

  • the present specification relates generally to endoscopes, and more specifically, to a system and method for repositioning a region of interest from a first location to a second location on at least one display screen of the endoscope.
  • An endoscope is a medical instrument used for examining and treating internal body parts such as the alimentary canals, airways, the gastrointestinal system, and other organ systems.
  • Conventionally used endoscopes comprise an insertion tube, either flexible or rigid, carrying illuminators such as light-emitting diodes (LED) or a fiber optic light guide for directing light from an external light source situated at a proximal end of the tube to a distal tip.
  • most endoscopes are equipped with one or more channels through which medical devices, such as forceps, probes, and other tools, may be passed.
  • fluids such as water, saline, drugs, contrast material, dyes, or emulsifiers are often introduced or evacuated via the insertion tube.
  • a plurality of channels, one each for introduction and suctioning of liquids, may be provided within the insertion tube.
  • Endoscopes have attained great acceptance within the medical community, since they provide a means for performing procedures with minimal patient trauma, while enabling the physician to view the internal anatomy of the patient.
  • numerous endoscopes have been developed and categorized according to specific applications, such as cystoscopy, colonoscopy, bronchoscopy, laparoscopy, and upper gastrointestinal (GI) endoscopy, among others.
  • Endoscopes may be inserted into the body's natural orifices or through an incision in the skin.
  • Endoscopes typically have a front camera for viewing the internal organ, such as the colon, and an illuminator for illuminating the field of view of the camera. Some endoscopes also comprise one or more side cameras and the corresponding illuminators for viewing the internal organs that are not in direct field of view of the front camera.
  • the camera(s) and illuminators are located in a tip of the endoscope and are used to capture images of the internal walls of the body cavity being endoscopically scanned. The captured images are sent to a control unit coupled with the endoscope via one of the channels present in the insertion tube, for being displayed on a screen coupled with the control unit.
  • endoscopes help in the detection and cure of a number of diseases in a non-invasive manner
  • conventional endoscopes suffer from the drawback of having a limited field of view.
  • the field of view is limited by the narrow internal geometry of organs as well as the insertion port, which may be the body's natural orifices or an incision in the skin.
  • the image of the body cavity captured by the cameras such as a front camera is displayed on a display screen coupled with the camera.
  • irregularities such as polyps are observed forming on internal walls of a body cavity being scanned.
  • a clear display of the polyp during an endoscopic procedure such as colonoscopy is important in order to enable a medical practitioner to operate on the polyp by inserting medical tools via the working channel of the endoscope. It is advantageous to position a region of interest like the polyp in a distinct location such as a center of the display screen while operating on the polyp.
  • the present specification discloses a method of displaying a region of interest within an endoscopic image on a target area of a pre-defined display screen of an endoscopy system comprising an insertion portion having a tip section with at least one viewing element for capturing images of a body cavity comprising the region of interest, the display screen being coupled with the at least one viewing element, the method comprising: detecting the region of interest on a wall of the body cavity; displaying the region of interest on the display screen; marking the region of interest on the display screen; determining a transformation to be applied to the region of interest; and applying the determined transformation for causing the region of interest to be displayed on a target area of the pre-defined display screen.
  • the target area comprises the center of the display screen.
  • the region of interest is a body abnormality.
  • the region of interest is a polyp.
  • the pre-defined display screen is a central display screen coupled with a front viewing element of the endoscopy system and wherein the endoscopy system comprises a working channel located proximate to the front viewing element.
  • marking the region of interest comprises selecting the region of interest by using a graphical user interface (GUI). Still optionally, marking the region of interest comprises selecting the region of interest by using a mouse click, a touch screen, a hand gesture, or eye tracking.
  • determining a transformation to be applied to the region of interest comprises determining an angle between the region of interest and the viewing element.
  • applying the determined transformation comprises providing guidance to manually move the tip section of the insertion portion of the endoscope for positioning the region of interest in a target area of the display screen.
  • applying the determined transformation comprises automatically moving the tip section of the insertion portion of the endoscope for positioning the region of interest in a target area of the display screen.
  • an endoscope comprising an insertion portion having a tip section with at least one viewing element for capturing images of a body cavity comprising a region of interest, the at least one viewing element being coupled with a display screen, the display screen being coupled with a processor for performing a sequence of steps for displaying the region of interest within an endoscopic image on a target area of the display screen, the sequence of steps comprising: detecting the region of interest on a wall of the body cavity; displaying the region of interest on a display screen; marking the region of interest on the display screen; determining a transformation to be applied to the region of interest; and applying the determined transformation for causing the region of interest to be displayed in the center of the pre-defined display screen.
  • the present specification discloses a method of repositioning a region of interest from a first location to a second location on a display screen of an endoscopy system comprising: selecting a first location comprising a detected region of interest on the display screen; selecting a second location on the display screen; estimating a transformation to be applied to cause said region of interest to move in the direction of said second location; applying said estimated transformation; and evaluating if the region of interest is displayed at said second location on the display screen and if said region of interest is not displayed at said second location, repeatedly estimating a new transformation based on the most recent location of the region of interest and applying the same to cause said region of interest to be displayed at said second location on the display screen.
  • said first location and said second location may be on the same display screen.
  • said first location is on a first display screen and said second location is on a second display screen.
  • said second location is substantially the center of the display screen or it is the focal point of the images being captured by the viewing elements of an endoscope during an endoscopic procedure.
  • said region of interest is a body abnormality.
  • said region of interest is a polyp.
  • said first and second locations are selected by using a graphical user interface (GUI).
  • said first and second locations are selected by using a mouse click, a touch screen, a hand gesture, or eye tracking.
  • said estimation of transformation to be applied to the region of interest comprises determining an angle between the region of interest and the viewing element.
  • applying the estimated transformation comprises providing guidance to manually move the tip section of the insertion portion of the endoscope for positioning the region of interest to said second location of the display screen.
  • applying the estimated transformation comprises automatically moving the tip section of the insertion portion of the endoscope to reposition the region of interest to said second location of the display screen.
  • said automatic application comprises using electrical motors to move the tip section.
  • an endoscope system comprising: an insertion tube coupled to a distal tip section wherein said distal tip section comprises at least one viewing element; a main control unit coupled to said insertion tube and coupled to at least one display screen; a means for selecting a region of interest at a first location on said display screen; a means for selecting a second/target location on said display screen; a controller for estimating a transformation based on the most recent location of said region of interest and causing said transformation to be applied to cause said region of interest to move in the direction of said second location; and, wherein after every application of said transformation the system evaluates the latest position of said region of interest and in case said region of interest is not positioned at such second location, a new transformation is estimated based on the most recent location of the region of interest and the same is applied to cause said region of interest to move in the direction of said second location.
  • FIG. 1 illustrates a multiple viewing element endoscopy system, according to some embodiments of the present specification
  • FIG. 2 is an illustration of an endoscope in which the method and system of the present specification may be implemented, in accordance with an embodiment
  • FIG. 3A is an exemplary schematic diagram showing a point of interest detected by a side viewing element and a chosen target point on a front viewing display screen of a multiple camera endoscope in accordance with an embodiment of the present specification;
  • FIG. 3B is a flowchart describing a method of repositioning a region of interest (ROI) from a first location on a display screen of the endoscope to a second location on at least one display screen, in accordance with an embodiment of the present specification.
  • FIG. 3C illustrates a navigation process to move a region of interest detected by a side pointing camera and displayed at a first location on display screen to a target point located at a second location on a display screen in accordance with an embodiment of the present specification
  • FIG. 3D illustrates the navigation process as depicted in FIG. 3C wherein the incremental movement of ROI towards the target point is shown in each image snapshot in accordance with an embodiment
  • FIG. 4 illustrates a camera calibration method, in accordance with an embodiment of the present specification
  • FIG. 5A illustrates a polyp located on the wall of a body cavity, such as a colon, that can be seen using a viewing element of an endoscope;
  • FIG. 5B illustrates the polyp of FIG. 5A marked as a ROI
  • FIG. 5C illustrates the polyp/ROI of FIG. 5B detected by using an image detection algorithm
  • FIG. 5D illustrates a guiding arrow on a central display screen
  • FIG. 5E illustrates the guiding arrow repositioned toward the polyp/ROI to compensate for movement of the endoscope's tip
  • FIG. 5F is a diagram showing the polyp/ROI as it is moved so that it is displayed on a central screen with an arrow, guiding a direction of movement of the endoscopic tip in order to position the polyp/ROI in the center of the display screen;
  • FIG. 5G illustrates the polyp/ROI positioned in the center of the central display screen
  • FIG. 6A illustrates an exemplary screenshot of an application program interface depicting instructions to move a knob on an endoscope handle in a clockwise direction, in accordance with an embodiment of the present specification
  • FIG. 6B illustrates an exemplary screenshot of an application program interface depicting instructions to move a knob on an endoscope handle in a counterclockwise direction, in accordance with an embodiment of the present specification.
  • the present specification provides a system and a method for displaying a region of interest (ROI) in the image captured by a viewing element of the endoscope at a user defined location on the display screen of the endoscope.
  • the user identifies a region of interest (ROI) such as a polyp (or any other abnormality) at a first location in the image displayed on the display screen of the endoscope.
  • the user identifies a second, target location on the display screen at which the image should be repositioned. Consequently, the entire image is repositioned such that the region of interest (ROI) is displayed at the second location identified on the display screen.
  • the second location is substantially the center of the display screen or it is representative of the focal point of the images being captured by the viewing elements of an endoscope during an endoscopic procedure.
  • the first location is on a first screen and the second location is on a second screen.
  • the first screen displays the images captured by a side viewing element or camera and the second screen displays the images captured by a front viewing element or camera of the endoscopy system.
  • the first screen displays the images captured by a front viewing element or camera and the second screen displays the images captured by a side viewing element or camera of the endoscopy system.
  • a first screen may display images captured by a first side viewing element
  • a second screen (positioned in the center) may display images captured by a front viewing element
  • a third screen may display images captured by a second side viewing element. It should be understood that any number of discrete displays may be employed to accommodate each camera or viewing element and the images captured can be displayed on any corresponding display, either dedicated or set by the user.
  • the ROI may denote an abnormality such as a polyp detected in a body cavity which the user may want to continuously track or display prominently during a medical procedure.
  • the present specification allows a user to view an image of the detected polyp on the center of a display or screen or on a central and/or front screen of the endoscopy system where the system has more than one display, thereby making operating upon the polyp much easier.
  • the present specification allows a user to view an image of a detected polyp at a target region in the display screen.
  • each of the words “comprise” “include” and “have”, and forms thereof, are not necessarily limited to members in a list with which the words may be associated.
  • FIG. 1 illustrates a multi-viewing element endoscopy system in which the method and system for repositioning a region of interest (ROI) from a first location to a second location, such as the center of a display or the focal point of images being captured by the viewing elements of an endoscope during an endoscopic procedure, may be implemented.
  • the method and system for repositioning a region of interest (ROI) from a first location to a second location such as the center of images being captured by the viewing elements of an endoscope during an endoscopic procedure may be implemented in any endoscope comprising one or more viewing elements for capturing images of the insides of a body cavity.
  • the method and system for keeping a region of interest (ROI) in the center of images being captured by the viewing elements of an endoscope during an endoscopic procedure may be implemented in any endoscope system comprising at least one display screen.
  • System 100 may include a multi-viewing elements endoscope 102 .
  • Multi-viewing elements endoscope 102 may include a handle 104 , from which an elongated shaft 106 emerges. Elongated shaft 106 terminates with a tip section 108 which is turnable by way of a bending section 110 .
  • Handle 104 may be used for maneuvering elongated shaft 106 within a body cavity.
  • the handle may include one or more buttons and/or knobs and/or switches 105 which control bending section 110 as well as functions such as fluid injection and suction.
  • Handle 104 may further include at least one, and in some embodiments more than one, working channel opening 112 through which surgical tools may be inserted. In embodiments, the handle 104 also includes one or more side service/working channel openings.
  • Tip 108 may include multi-viewing elements.
  • tip 108 includes a front viewing element and one or more side viewing elements.
  • tip 108 may include only a front viewing element.
  • tip 108 may include one or more service/working channel exit points.
  • tip 108 includes a front service/working channel exit point and at least one side service channel exit point.
  • tip 108 may include two front service/working channel exit points.
  • a utility cable 114 may connect between handle 104 and a Main Control Unit 199 .
  • Utility cable 114 may include therein one or more fluid channels and one or more electrical channels.
  • the electrical channel(s) may include at least one data cable for receiving video signals from the front and side-pointing viewing elements, as well as at least one power cable for providing electrical power to the viewing elements and to the discrete illuminators.
  • the main control unit 199 contains the controls required for displaying the images of internal organs captured by the endoscope 102 .
  • the main control unit 199 may govern power transmission to the endoscope's 102 tip section 108 , such as for the tip section's viewing elements and illuminators.
  • the main control unit 199 may further control one or more fluid, liquid and/or suction pump(s) which supply corresponding functionalities to the endoscope 102 .
  • One or more input devices 118 such as a keyboard, a touch screen and the like may be connected to the main control unit 199 for the purpose of human interaction with the main control unit 199 .
  • the main control unit 199 comprises a screen/display 120 for displaying operation information concerning an endoscopy procedure when the endoscope 102 is in use.
  • the screen 120 may be configured to display images and/or video streams received from the viewing elements of the multi-viewing element endoscope 102 .
  • the screen 120 may further be operative to display a user interface for allowing a human operator to set various features of the endoscopy system.
  • the video streams received from the different viewing elements of the multi-viewing element endoscope 102 may be displayed separately on at least one monitor (not seen) by uploading information from the main control unit 199 , either side-by-side or interchangeably (namely, the operator may switch between views from the different viewing elements manually).
  • these video streams may be processed by the main control unit 199 to combine them into a single, panoramic video frame, based on an overlap between fields of view of the viewing elements.
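  • As an illustration only (not part of the patent text), the combination of overlapping video streams into a single panoramic frame could be sketched with OpenCV's high-level stitcher; the file names below are placeholder assumptions:

```python
import cv2

# Placeholder frames from the left, front and right viewing elements; they are
# assumed to have overlapping fields of view.
left = cv2.imread("left_view.png")
front = cv2.imread("front_view.png")
right = cv2.imread("right_view.png")

# Combine the three views into one panoramic frame based on their overlap.
stitcher = cv2.Stitcher_create(cv2.Stitcher_PANORAMA)
status, panorama = stitcher.stitch([left, front, right])
if status == cv2.Stitcher_OK:
    cv2.imwrite("panoramic_view.png", panorama)
```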
  • two or more displays may be connected to the main control unit 199 , each for displaying a video stream from a different viewing element of the multi-viewing element endoscope 102 .
  • the main control unit 199 is described in U.S. Patent Application Ser. No. 14/263,896, entitled “Method and System for Video Processing in a Multi-Viewing Element Endoscope” and filed on Apr. 28, 2014, which is herein incorporated by reference in its entirety.
  • FIG. 2 illustrates another view of an endoscope in which the method and system for repositioning a region of interest from a first location to a second location on the display screen of the endoscope may be implemented.
  • Endoscope 200 comprises a handle 204 , from which an elongated shaft 206 emerges.
  • Elongated shaft 206 terminates with a tip section 208 which is turnable by way of a bending section 210 .
  • the tip section 208 comprises at least one viewing element which is used to capture images of the body cavity which are then displayed on a display screen coupled with the viewing element.
  • Handle 204 may be used for maneuvering elongated shaft 206 within a body cavity.
  • handle 204 comprises a service channel opening 212 through which a medical tool may be inserted.
  • tip 208 includes at least one service/working channel exit point from which the medical tool exits and is placed in close proximity to a region of interest/polyp on a wall of the body cavity being endoscopically examined.
  • a utility cable 214 is used to connect handle 204 with a controller (not shown) via an electrical connector 215 .
  • the controller governs power transmission to the tip section 208 .
  • the controller also controls one or more fluid, liquid and/or suction pump which supply corresponding functionalities to endoscope 200 .
  • One or more input devices such as a keyboard, a computer, a touch screen, a voice recognition system and the like, may be connected to the controller for the purpose of user interaction with the controller.
  • the method and system of the present specification may be applied to a multiple viewing elements endoscope having a front viewing element and side viewing elements.
  • the method and system of the present specification enable a user to locate a polyp in a body cavity using a viewing element and then have the polyp displayed at a target location on a display screen coupled with the viewing element while operating on the polyp using medical tools inserted via a working channel of the endoscope located in proximity to the viewing element.
  • the description provided herein is directed towards an endoscope having a working channel located near a front viewing element/camera so that a polyp observed in a body cavity, such as a colon, is displayed on a front display screen of the endoscope.
  • the method and system of the present specification can easily be applied to an endoscope having a side working channel, wherein the polyp is observed via a side viewing element positioned near the side working channel.
  • the system and method of the present specification also allows a user to identify the region of interest, such as a polyp, at a first location on a first screen and subsequently identify a second location on a second screen such that the system displays the region of interest at the second location on the second screen of the endoscope.
  • FIG. 3A is an exemplary schematic diagram showing a point of interest detected by a side viewing element and a chosen target point on a front viewing display screen as can be seen using a multi-viewing element endoscope, such as the multi viewing element endoscope 102 described in FIG. 1 .
  • multi- viewing element endoscope 102 is connected to the controller 199 through a utility cable 114 .
  • endoscope 102 provides three simultaneous endoscopic views using three cameras housed in the tip 108 of endoscope 102.
  • Controller 199 is connected to three display screens, 330 a , 330 b , and 330 c , wherein each display screen may be configured to display a corresponding view of the three endoscopic views provided by endoscope system 102 .
  • the display screen 330 b displays the view captured by a front camera and the display screens 330 a and 330 c display the views captured by two side cameras housed in tip 108 of the endoscope 102 .
  • display screens 330 a , 330 b , and 330 c are in the form of a single large screen 335 .
  • the videos are either kept separate from one another or stitched together to form a panoramic view.
  • the abnormality should be located in front of the center viewing element, such that a medical tool emerging from the front working channel is easily detected by the front viewing element.
  • the abnormality should preferably be located in front of the side viewing element, such that a medical tool emerging from the side working channel is easily detected by the side viewing element.
  • this task is not trivial and consumes time during the procedure.
  • the present specification discloses systems and methods to enable displaying the ROI 322 at the required TP 323 on a specific display screen.
  • FIG. 3B is a flowchart depicting a method of repositioning a region of interest (ROI) from a first location on a display screen of the endoscope to a second location on the same display screen or on another display screen of the endoscope, in accordance with an embodiment of the present specification.
  • a region of interest (ROI) is detected in the body cavity during an endoscopic procedure.
  • a ROI is an abnormality, such as a polyp or a lesion, that may be detected in a body cavity such as a colon.
  • the abnormality is detected via a viewing element of the endoscope, such as a front camera and/or a side camera, and displayed on the display screen of the endoscope.
  • since the endoscope comprises a working channel located in proximity to a front viewing element/camera, there is an interest in keeping the detected abnormality within the field of view of the front viewing element, which is in data communication with the front display screen.
  • the region of interest is identified by selecting/marking the same on the display screen.
  • the ROI is identified and marked by using a graphical user interface (GUI).
  • the medical practitioner may mark the ROI/polyp using known methods such as by a mouse click, a touch screen, a hand gesture, or eye tracking.
  • a user may navigate an icon on the display to visually coincide with the displayed point of interest 322 , resulting in an icon being visually superimposed over the point of interest 322 .
  • the display coordinates of the icon are recorded by a processor in memory and those coordinates are associated with the displayed position of the point of interest 322 .
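  • A minimal sketch of how a click on the display could be recorded as the ROI or the target-point coordinates, assuming an OpenCV window; the window name and button mapping are illustrative assumptions, not part of the patent:

```python
import cv2

roi_point = None       # display coordinates of the marked region of interest
target_point = None    # display coordinates of the chosen target location

def on_mouse(event, x, y, flags, param):
    """Record a left click as the ROI and a right click as the target point."""
    global roi_point, target_point
    if event == cv2.EVENT_LBUTTONDOWN:
        roi_point = (x, y)
    elif event == cv2.EVENT_RBUTTONDOWN:
        target_point = (x, y)

cv2.namedWindow("endoscope view")
cv2.setMouseCallback("endoscope view", on_mouse)
```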
  • the endoscopy system comprises an integrated mouse on the scope handle which is used by the user to select an ROI.
  • the ROI is detected and marked using image tracking algorithms known in the art, such as the Lucas-Kanade algorithm.
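  • The Lucas-Kanade tracking named above could be sketched with OpenCV's pyramidal implementation; the window size, pyramid depth and helper name are assumptions rather than the patent's parameters:

```python
import cv2
import numpy as np

def track_roi_lk(prev_gray, curr_gray, roi_point):
    """Follow a marked ROI from one grayscale frame to the next using
    pyramidal Lucas-Kanade optical flow."""
    p0 = np.array([[roi_point]], dtype=np.float32)          # shape (1, 1, 2)
    p1, status, _err = cv2.calcOpticalFlowPyrLK(
        prev_gray, curr_gray, p0, None, winSize=(21, 21), maxLevel=3)
    if status[0, 0] == 1:
        return float(p1[0, 0, 0]), float(p1[0, 0, 1])       # new (x, y) of the ROI
    return None                                             # tracking lost
```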
  • the ROI is detected and marked by matching various neighboring pixels contained in that specific area of the image in accordance with their normalized cross correlation with surrounding areas.
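  • Similarly, the normalized cross-correlation matching described above could be sketched with template matching; the helper name and the use of a fixed template are simplifying assumptions:

```python
import cv2

def locate_roi_ncc(frame_gray, template_gray):
    """Find the patch of the current frame that best matches a template cut
    around the marked ROI, scored by normalized cross-correlation."""
    result = cv2.matchTemplate(frame_gray, template_gray, cv2.TM_CCORR_NORMED)
    _, max_val, _, max_loc = cv2.minMaxLoc(result)
    h, w = template_gray.shape[:2]
    center = (max_loc[0] + w // 2, max_loc[1] + h // 2)
    return center, max_val       # best-match centre and its correlation score
```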
  • the ROI may be detected by using one or more sensors incorporated in the insertion tube of the endoscope.
  • the system comprises sensors such as an accelerometer or a gyroscope or an electromagnetic tracker or a combination of these sensors.
  • at least one sensor is used together with the camera to assess the location of a target or ROI relative to the camera.
  • a three-dimensional scene is constructed using displacement estimation from two images taken at different locations and/or from different cameras.
  • structured light-based 3D estimation is used for 3D scene construction and assessment.
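  • As a rough sketch (not the patent's specific method) of building a 3D scene from two images taken at different tip locations, matched image points from two calibrated views can be triangulated; the projection matrices are assumed to be known from calibration and the measured displacement:

```python
import cv2
import numpy as np

def triangulate_roi(P1, P2, pts1, pts2):
    """Recover 3-D coordinates of matched image points seen from two tip
    positions. P1 and P2 are 3x4 projection matrices; pts1 and pts2 are
    2xN arrays of corresponding pixel coordinates."""
    homog = cv2.triangulatePoints(P1, P2,
                                  pts1.astype(np.float32),
                                  pts2.astype(np.float32))
    return (homog[:3] / homog[3]).T      # N x 3 points in the first camera frame
```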
  • the location of the sensor is important in various embodiments. For example, in an embodiment, an accelerometer must be placed within the distal tip such that it has the same motion as the camera. In another embodiment, a structured light emitter must have the largest parallax possible from the camera.
  • a second/target location at which the region of interest is to be positioned is selected on the display screen.
  • the second location is positioned on the same display screen.
  • the second location is positioned on a second display screen.
  • the target location is identified and marked by using a graphical user interface (GUI).
  • the medical practitioner may mark the target location using known methods such as by a mouse click, a touch screen, a hand gesture, or eye tracking. Specifically, upon visually identifying a target point 323 on a display screen, a user may navigate an icon on the display to visually coincide with the target point 323 , resulting in an icon being visually superimposed over the target point 323 .
  • the display coordinates of the icon are recorded by a processor in memory and those coordinates are associated with the target point 323 .
  • the endoscopy system comprises an integrated mouse on the scope handle which is used by the user to select the target location.
  • the target point is positioned in the center of the display screen. In other embodiments, the target point is not positioned in the center of the display screen.
  • applying the transformation to the detected ROI comprises determining an angle between the ROI and the viewing element/camera.
  • the angle may be determined by using a camera calibration method.
  • the calibration method comprises mapping each pixel in an image captured by the camera with a vector connecting a center of the camera lens with the outside world.
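  • Assuming a pinhole model with an intrinsic matrix K obtained from calibration, the pixel-to-ray mapping and the angle between the ROI and the viewing element could be sketched as follows; the function names are illustrative:

```python
import numpy as np

def pixel_to_ray(K, pixel):
    """Map an image pixel to the unit ray from the lens centre into the scene."""
    u, v = pixel
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return ray / np.linalg.norm(ray)

def angle_between_pixels(K, roi_pixel, target_pixel):
    """Angle (degrees) between the rays through the ROI and the target point."""
    r1, r2 = pixel_to_ray(K, roi_pixel), pixel_to_ray(K, target_pixel)
    return np.degrees(np.arccos(np.clip(np.dot(r1, r2), -1.0, 1.0)))
```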
  • the transformation estimated in step 306 is used to guide the user to position the ROI/abnormality at the target point (TP) or the second location of the display screen and maintain the position while operating upon the abnormality by inserting medical tools via the working channel.
  • the transformation or movement to the second location is estimated as a distance in millimeters from the first location or as an angle relative to the first location.
  • Exemplary transformations include pulling the distal tip 5 mm backward or moving the distal tip 30° to the left. In some embodiments, any transformation that places the ROI in the target location is acceptable.
  • the user is guided by means of icons, guides, or other data displayed on a graphical user interface (GUI) which directs a user to manually move the distal tip of the insertion portion of the endoscope in order to position the abnormality in the target region of the display screen.
  • the distal tip is moved automatically based on the applied transformation to position the abnormality in the target region of the display screen.
  • electronic motors are used to move the distal tip and facilitate the transformation.
  • the abnormality is positioned in the target region of the display screen by using a combination of manual and automatic methods.
  • the angular rotation of the tip section is automatic, while the longitudinal movement of the complete insertion tube is performed manually by the user.
  • FIG. 3C and FIG. 3D illustrate a navigation process to move a region of interest from a first location on the display screen to a target point/ second location on the display screen, in accordance with an embodiment of the present specification.
  • a structure of interest (polyp) 320 is shown at the top of the image 325 .
  • the polyp 320 is captured in the image at a point 321 which is termed as the region of interest (ROI).
  • the goal of the navigation method of the present specification is to move the distal tip 326 of an endoscope capturing the image 325 , such that the image ROI 321 is shifted to a desired target region 322 , which in an embodiment is in the center of the image.
  • the distal tip 326 is required to be rotated in a clockwise direction. This is done via a feedback process, wherein at each step the position of the ROI is detected and a transformation is computed and used to move the distal tip 326 to bring ROI 321 as near as possible to the target region 322.
  • In FIG. 3C, four different image snapshots I, II, III and IV are shown that represent the position of the ROI at four different time steps, such that at each step a unique transformation is computed based on the position of the ROI at that time.
  • Image snapshot I represents the starting position of the ROI and the target position, and in each subsequent snapshot, due to the movement of the distal tip 326 controlled by computed transformations, the ROI 321 is positioned closer to the target position as compared to the previous snapshot. Finally, in snapshot IV, the ROI 321 is positioned in the target region 322 .
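  • The iterative feedback process described above could be sketched as a simple proportional loop; the helper callables, gain and tolerance below are assumptions, not the patent's implementation:

```python
import numpy as np

def move_roi_to_target(detect_roi, steer_tip, target,
                       gain=0.05, tolerance=5, max_steps=200):
    """detect_roi() is assumed to return the current (x, y) of the tracked ROI,
    and steer_tip(dx, dy) to command a small articulation of the bending section
    (manually guided or motor driven)."""
    for _ in range(max_steps):
        roi = np.array(detect_roi(), dtype=float)
        error = np.array(target, dtype=float) - roi
        if np.linalg.norm(error) <= tolerance:   # ROI has reached the target region
            return True
        # Re-estimate a small correction at every step, as the feedback loop describes.
        steer_tip(gain * error[0], gain * error[1])
    return False
```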
  • the system has an option wherein a user can lock the position of the ROI once it is positioned at the desired point on the display screen.
  • the position of the ROI is locked by monitoring the movements of the endoscope and applying a compensation, based on said movements, to the distal tip. Once the position of the ROI is locked, the ROI is always shown at the same location on the display screen irrespective of the movement of the tip section of the endoscopy system.
  • the transformation can be applied automatically or manually.
  • electrical motors rotate the distal tip in the desired directions, and/or push/pull the insertion tube in and out.
  • For a manual transformation, the user receives the transformation instructions through an API (Application Program Interface) and applies them directly to the distal tip.
  • An example of such an API can be implemented by showing the user a scheme of the scope knobs with arrows that indicate the transformation direction.
  • the direction and length of the arrows change dynamically as they indicate the desired transformation needed to be performed in each step.
  • FIG. 3D also illustrates the navigation process as depicted in FIG. 3C wherein the incremental movement of ROI towards the target region is clearly shown in each image snapshot.
  • the current position of ROI is shown as 321 while the last position of ROI (prior to application of last transformation) is depicted as 321 p.
  • FIG. 4 illustrates a camera calibration method for determining the transformation to be applied to an endoscope tip for moving a region of interest to a target point on the display, in accordance with an embodiment of the present specification.
  • the calibration method comprises mapping each pixel in an image captured by the camera with a vector connecting a center of the camera lens with the outside world.
  • a ray orientation pixel mapping is obtained with respect to an object/abnormality 402 in the outside world by measuring a distance between the object/abnormality 402 and a pixel 410 located in its image 404 taken by a camera lens 406 .
  • a ray orientation pixel mapping is obtained by measuring a distance between the object/abnormality 402 and another pixel 412 located in the image 404 .
  • the calibration parameters of the camera may be obtained by using a camera calibration algorithm.
  • Zhang algorithm/method is used for the purpose of camera calibration.
  • the camera calibration provides information on where a 3D object located at some coordinate relative to the camera would appear in the image.
  • To estimate the transformation, the location at which the image would appear in the camera, given a candidate transformation applied to the image, is itself estimated.
  • the camera parameters for the estimation are obtained by calibration. An optimization process is then performed to improve the transformation until a suitable transformation is reached.
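  • A minimal calibration and reprojection sketch, using OpenCV's implementation of Zhang's planar method; collecting the checkerboard correspondences is assumed to happen elsewhere:

```python
import cv2

def calibrate(obj_points, img_points, image_size):
    """Zhang-style calibration from planar target correspondences; returns the
    intrinsic matrix and distortion coefficients."""
    _rms, K, dist, _rvecs, _tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    return K, dist

def project(points_3d, rvec, tvec, K, dist):
    """Predict where 3-D points at a given pose would appear in the image."""
    img_pts, _ = cv2.projectPoints(points_3d, rvec, tvec, K, dist)
    return img_pts.reshape(-1, 2)
```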
  • FIGS. 5A to 5G illustrate a schema demonstrating the steps for maintaining a ROI in a center of a front display screen of an endoscopy system, in accordance with an embodiment of the present specification. It should be understood by those of ordinary skill in the art that the methods of the present specification can be employed with systems having any number of screens.
  • the endoscopy system has more than one display screen.
  • the endoscopy system comprises only one screen and in this case the ROI can be positioned at a central focal point (or any other target region provided by the user) for the ease of viewing and operation.
  • FIGS. 5A-5G illustrate displays associated with a left side camera 501, a front camera 506, and a right side camera 508, respectively.
  • an insertion portion of an endoscope is inserted into a body cavity and the internal walls of the cavity are examined by a medical practitioner via the endoscopic images displayed on a monitoring screen coupled with a viewing element located in the tip of the insertion portion, until a polyp is observed on an internal wall of the body cavity.
  • FIG. 5A illustrates a polyp 502 located on the wall of a body cavity such as a colon captured via a viewing element of an endoscope.
  • the medical practitioner selects and marks the polyp as a region of interest, as discussed above.
  • FIG. 5B illustrates the polyp 502 marked as a ROI.
  • the ROI may be selected and marked by using a graphical user interface (GUI).
  • the medical practitioner may mark the ROI/polyp 502 using known methods such as by a mouse click, a touch screen, a hand gesture, or eye tracking.
  • the endoscopy system comprises an integrated mouse on the scope handle which is used by the user to select ROI 502 .
  • the marked ROI is detected by a software program running on a processor coupled with the endoscope and the monitoring screen by using any suitable image detection method.
  • FIG. 5C illustrates the polyp/ROI 502 being detected by using an image detection algorithm.
  • an arrow 504 appears on the central display monitor 506 of the endoscopy system as shown in FIG. 5D .
  • FIG. 5D illustrates guiding arrow 504 on central display screen 506 .
  • the arrow 504 guides the medical practitioner to move the tip of the endoscope in a direction that would enable the ROI to be displayed in the center of the central display screen 506.
  • a direction of rotation is computed by determining the rotational direction between the camera ray corresponding to the target point and the camera ray corresponding to the ROI.
  • the amount of movement in the computed direction is determined in a feedback loop: at each real-time sample the new rotation direction is computed and a small rotation in that direction is applied.
  • the rotational or angular direction between the camera center, the z-axis and any pixel in the image is known.
  • the new ROI location is detected in the image and the process is repeated in an iterative manner until the ROI is placed on top of the target.
  • An illustrative navigation process to reposition a ROI (region of interest) from a first location on a display screen to a TP (target point) at a second location in the same or another display screen is also described in FIG. 3C and FIG. 3D .
  • FIG. 5E illustrates the guiding arrow repositioned toward the polyp/ROI to compensate for movement of the endoscope's tip.
  • FIG. 5E shows that if the endoscope's tip is moved in an upwards direction, the arrow 504 repositions pointing in a downward direction towards the polyp/ROI 502 .
  • the polyp/ROI 502 is caused to be displayed on the central display screen 506 .
  • FIG. 5F shows the polyp/ROI 502 being displayed on the central screen 506 with the arrow 504 pointing towards the direction in which the endoscope tip should be moved in order to position the polyp/ROI 502 in the center of the display screen 506 .
  • FIG. 5G illustrates the polyp/ROI 502 positioned in the center of the central display screen 506. Once positioned in the center, the polyp may be operated upon easily.
  • FIGS. 6A and 6B illustrate exemplary screenshots 611 , 621 of an application program interface depicting instructions to move a knob 605 on an endoscope handle 604 in a clockwise direction and counterclockwise direction, respectively, in accordance with an embodiment of the present specification.
  • a display of the application program interface shows the user how to move one or more knobs of the endoscope handle in order to move the distal tip of the endoscope in the appropriate direction to position a ROI in the display screen (move the ROI to a second location).
  • the application program interface uses a clockwise curving arrow 603 with text stating “Move the endoscope distally” 609 to instruct the user to rotate knob 605 in a clockwise direction to move the endoscope in a distal direction in order to position the ROI in the second location/display screen.
  • the application program interface uses a counterclockwise curving arrow 613 with text stating “Move the endoscope proximally” 619 to instruct the user to rotate knob 605 in a counterclockwise direction to move the endoscope in a proximal direction in order to position the ROI in the second location/display screen.
  • the application program interface uses superimposed arrows, text, and other graphics to direct the user to move the endoscope tip in the correct direction in order to position the ROI in the second location/display screen.
  • the direction and length of the arrows or other graphics and wording of the text change dynamically as they indicate the required movements needed to position the ROI in the second location/display screen.
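  • A rough sketch of how such a dynamically updated guidance overlay might be rendered; mapping the horizontal image error to a clockwise or counterclockwise knob instruction is a simplifying assumption, not the patent's rule:

```python
import cv2

def draw_guidance(frame, roi, target):
    """Overlay an arrow from the ROI toward the target point, whose length shrinks
    as the ROI approaches the target, plus a short text instruction."""
    overlay = frame.copy()
    cv2.arrowedLine(overlay, tuple(map(int, roi)), tuple(map(int, target)),
                    color=(0, 255, 0), thickness=2, tipLength=0.2)
    direction = "clockwise" if target[0] > roi[0] else "counterclockwise"
    cv2.putText(overlay, "Rotate knob " + direction, (10, 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 0), 2)
    return overlay
```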

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Pathology (AREA)
  • Radiology & Medical Imaging (AREA)
  • Veterinary Medicine (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Public Health (AREA)
  • Optics & Photonics (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Signal Processing (AREA)
  • Human Computer Interaction (AREA)
  • General Physics & Mathematics (AREA)
  • Multimedia (AREA)
  • Mechanical Engineering (AREA)
  • Endoscopes (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

A system and method of repositioning a region of interest within an endoscopic image from a first location to a target location on the display screen of an endoscopy system is provided. The endoscopy system includes an insertion portion having a tip section with at least one viewing element for capturing images of a body cavity having the region of interest. The method includes detecting a region of interest within a body cavity, displaying the region of interest on a display screen in communication with the endoscopy system, marking the region of interest on the display screen, selecting a target location on the display screen, estimating a transformation to be applied to the region of interest, and applying the estimated transformation for causing the region of interest to be displayed at the target location of the display screen.

Description

    CROSS REFERENCE
  • The present application relies on U.S. Provisional Patent Application No. 62/307,779, entitled “System and Method for Guiding and Tracking a Region of Interest Using an Endoscope” and filed on Mar. 14, 2016, for priority.
  • The aforementioned application is incorporated herein by reference in its entirety.
  • FIELD
  • The present specification relates generally to endoscopes, and more specifically, to a system and method for repositioning a region of interest from a first location to a second location on at least one display screen of the endoscope.
  • BACKGROUND
  • An endoscope is a medical instrument used for examining and treating internal body parts such as the alimentary canals, airways, the gastrointestinal system, and other organ systems. Conventionally used endoscopes comprise an insertion tube, either flexible or rigid, carrying illuminators such as light-emitting diodes (LED) or a fiber optic light guide for directing light from an external light source situated at a proximal end of the tube to a distal tip. Also, most endoscopes are equipped with one or more channels through which medical devices, such as forceps, probes, and other tools, may be passed. Further, during an endoscopic procedure, fluids, such as water, saline, drugs, contrast material, dyes, or emulsifiers are often introduced or evacuated via the insertion tube. A plurality of channels, one each for introduction and suctioning of liquids, may be provided within the insertion tube.
  • Endoscopes have attained great acceptance within the medical community, since they provide a means for performing procedures with minimal patient trauma, while enabling the physician to view the internal anatomy of the patient. Over the years, numerous endoscopes have been developed and categorized according to specific applications, such as cystoscopy, colonoscopy, bronchoscopy, laparoscopy, and upper gastrointestinal (GI) endoscopy, among others. Endoscopes may be inserted into the body's natural orifices or through an incision in the skin.
  • Endoscopes that are currently in use typically have a front camera for viewing the internal organ, such as the colon, and an illuminator for illuminating the field of view of the camera. Some endoscopes also comprise one or more side cameras and corresponding illuminators for viewing the internal organs that are not in the direct field of view of the front camera. The camera(s) and illuminators are located in a tip of the endoscope and are used to capture images of the internal walls of the body cavity being endoscopically scanned. The captured images are sent to a control unit coupled with the endoscope via one of the channels present in the insertion tube, to be displayed on a screen coupled with the control unit.
  • While endoscopes help in the detection and cure of a number of diseases in a non-invasive manner, conventional endoscopes suffer from the drawback of having a limited field of view. The field of view is limited by the narrow internal geometry of organs as well as by the insertion port, which may be the body's natural orifices or an incision in the skin. The image of the body cavity captured by a camera, such as the front camera, is displayed on a display screen coupled with the camera. Sometimes irregularities, such as polyps, are observed forming on the internal walls of the body cavity being scanned. A clear display of the polyp during an endoscopic procedure, such as a colonoscopy, is important in order to enable a medical practitioner to operate on the polyp by inserting medical tools via the working channel of the endoscope. It is advantageous to position a region of interest, such as the polyp, at a distinct location, such as the center of the display screen, while operating on the polyp.
  • It is often difficult to maintain the display of the polyp constantly in the center of the display screen due to the movement of the body cavity, such as the colon, in which the polyp may be detected. It may also be beneficial for the medical practitioner to record the location of the polyp for operating upon it in the future.
  • To operate accurately and easily on a region of interest within a body cavity endoscopically, it is essential that the region of interest remain displayed at a desired location within the image captured by the endoscope throughout the endoscopic procedure.
  • Hence, there is a need for a method that enables an operating medical practitioner to maintain the position of the region of interest at a desired location, such as the center of the display screen, during an endoscopic procedure.
  • SUMMARY
  • In some embodiments, the present specification discloses a method of displaying a region of interest within an endoscopic image on a target area of a pre-defined display screen of an endoscopy system comprising an insertion portion having a tip section with at least one viewing element for capturing images of a body cavity comprising the region of interest, the display screen being coupled with the at least one viewing element, the method comprising: detecting the region of interest on a wall of the body cavity; displaying the region of interest on the display screen; marking the region of interest on the display screen; determining a transformation to be applied to the region of interest; and applying the determined transformation for causing the region of interest to be displayed on a target area of the pre-defined display screen.
  • Optionally, the target area comprises the center of the display screen.
  • Optionally, the region of interest is a body abnormality.
  • Optionally, the region of interest is a polyp.
  • Optionally, the pre-defined display screen is a central display screen coupled with a front viewing element of the endoscopy system and wherein the endoscopy system comprises a working channel located proximate to the front viewing element.
  • Optionally, marking the region of interest comprises selecting the region of interest by using a graphical user interface (GUI). Still optionally, marking the region of interest comprises selecting the region of interest by using a mouse click, a touch screen, a hand gesture, or eye tracking.
  • Optionally, determining a transformation to be applied to the region of interest comprises determining an angle between the region of interest and the viewing element.
  • Optionally, applying the determined transformation comprises providing guidance to manually move the tip section of the insertion portion of the endoscope for positioning the region of interest in a target area of the display screen.
  • Optionally, applying the determined transformation comprises automatically moving the tip section of the insertion portion of the endoscope for positioning the region of interest in a target area of the display screen.
  • In some embodiments, the present specification discloses an endoscope comprising an insertion portion having a tip section with at least one viewing element for capturing images of a body cavity comprising a region of interest, the at least one viewing element being coupled with a display screen, the display screen being coupled with a processor for performing a sequence of steps for displaying the region of interest within an endoscopic image on a target area of the display screen, the sequence of steps comprising: detecting the region of interest on a wall of the body cavity; displaying the region of interest on a display screen; marking the region of interest on the display screen; determining a transformation to be applied to the region of interest; and applying the determined transformation for causing the region of interest to be displayed in the center of the pre-defined display screen.
  • In some embodiments, the present specification discloses a method of repositioning a region of interest from a first location to a second location on a display screen of an endoscopy system comprising: selecting a first location comprising a detected region of interest on the display screen; selecting a second location on the display screen; estimating a transformation to be applied to cause said region of interest to move in the direction of said second location; applying said estimated transformation; and evaluating if the region of interest is displayed at said second location on the display screen and if said region of interest is not displayed at said second location, repeatedly estimating a new transformation based on the most recent location of the region of interest and applying the same to cause said region of interest to be displayed at said second location on the display screen.
  • In some embodiments, said first location and said second location may be on the same display screen. Optionally, said first location is on a first display screen and said second location is on a second display screen. Optionally, said second location is substantially the center of the display screen or it is the focal point of the images being captured by the viewing elements of an endoscope during an endoscopic procedure.
  • Optionally, said region of interest is a body abnormality. Optionally, said region of interest is a polyp.
  • Optionally, said first and second locations are selected by using a graphical user interface (GUI). Optionally, said first and second locations are selected by using a mouse click, a touch screen, a hand gesture, or eye tracking.
  • Optionally, said estimation of transformation to be applied to the region of interest comprises determining an angle between the region of interest and the viewing element.
  • Optionally, applying the estimated transformation comprises providing guidance to manually move the tip section of the insertion portion of the endoscope for positioning the region of interest to said second location of the display screen.
  • Optionally, applying the estimated transformation comprises automatically moving the tip section of the insertion portion of the endoscope to reposition the region of interest to said second location of the display screen.
  • Optionally, said automatic application comprises using electrical motors to move the tip section.
  • In some embodiments, the present specification discloses an endoscope system comprising: an insertion tube coupled to a distal tip section wherein said distal tip section comprises at least one viewing element; a main control unit coupled to said insertion tube and coupled to at least one display screen; a means for selecting a region of interest at a first location on said display screen; a means for selecting a second/target location on said display screen; a controller for estimating a transformation based on the most recent location of said region of interest and causing said transformation to be applied to cause said region of interest to move in the direction of said second location; and, wherein after every application of said transformation the system evaluates the latest position of said region of interest and in case said region of interest is not positioned at such second location, a new transformation is estimated based on the most recent location of the region of interest and the same is applied to cause said region of interest to move in the direction of said second location.
  • The aforementioned and other embodiments of the present specification shall be described in greater depth in the drawings and detailed description provided below.
  • BRIEF DESCRIPTION OF THE DRAWINGS
  • These and other features and advantages of the present invention will be appreciated, as they become better understood by reference to the following detailed description when considered in connection with the accompanying drawings, wherein:
  • FIG. 1 illustrates a multiple viewing element endoscopy system, according to some embodiments of the present specification;
  • FIG. 2 is an illustration of an endoscope in which the method and system of the present specification may be implemented, in accordance with an embodiment;
  • FIG. 3A is an exemplary schematic diagram showing a point of interest detected by a side viewing element and a chosen target point on a front viewing display screen of a multiple camera endoscope in accordance with an embodiment of the present specification;
  • FIG. 3B is a flowchart describing a method of repositioning a region of interest (ROI) from a first location on a display screen of the endoscope to a second location on at least one display screen, in accordance with an embodiment of the present specification;
  • FIG. 3C illustrates a navigation process to move a region of interest detected by a side pointing camera and displayed at a first location on a display screen to a target point located at a second location on a display screen, in accordance with an embodiment of the present specification;
  • FIG. 3D illustrates the navigation process as depicted in FIG. 3C wherein the incremental movement of ROI towards the target point is shown in each image snapshot in accordance with an embodiment;
  • FIG. 4 illustrates a camera calibration method, in accordance with an embodiment of the present specification;
  • FIG. 5A illustrates a polyp located on the wall of a body cavity, such as a colon, that can be seen using a viewing element of an endoscope;
  • FIG. 5B illustrates the polyp of FIG. 5A marked as a ROI;
  • FIG. 5C illustrates the polyp/ROI of FIG. 5B detected by using an image detection algorithm;
  • FIG. 5D illustrates a guiding arrow on a central display screen;
  • FIG. 5E illustrates the guiding arrow repositioned toward the polyp/ROI to compensate for movement of the endoscope's tip;
  • FIG. 5F is a diagram showing the polyp/ROI as it is moved so that it is displayed on a central screen with an arrow, guiding a direction of movement of the endoscopic tip in order to position the polyp/ROI in the center of the display screen;
  • FIG. 5G illustrates the polyp/ROI positioned in the center of the central display screen;
  • FIG. 6A illustrates an exemplary screenshot of an application program interface depicting instructions to move a knob on an endoscope handle in a clockwise direction, in accordance with an embodiment of the present specification; and
  • FIG. 6B illustrates an exemplary screenshot of an application program interface depicting instructions to move a knob on an endoscope handle in a counterclockwise direction, in accordance with an embodiment of the present specification.
  • DETAILED DESCRIPTION
  • The present specification provides a system and a method for displaying a region of interest (ROI) in the image captured by a viewing element of the endoscope at a user defined location on the display screen of the endoscope. In an embodiment, the user identifies a region of interest (ROI) such as a polyp (or any other abnormality) at a first location in the image displayed on the display screen of the endoscope. The user then identifies a second, target location on the display screen at which the image should be repositioned. Consequently, the entire image is repositioned such that the region of interest (ROI) is displayed at the second location identified on the display screen. In an embodiment, the second location is substantially the center of the display screen or it is representative of the focal point of the images being captured by the viewing elements of an endoscope during an endoscopic procedure.
  • In an embodiment comprising an endoscopy system with multiple display screens, the first location is on a first screen and the second location is on a second screen. In an embodiment, the first screen displays the images captured by a side viewing element or camera and the second screen displays the images captured by a front viewing element or camera of the endoscopy system. In an embodiment, the first screen displays the images captured by a front viewing element or camera and the second screen displays the images captured by a side viewing element or camera of the endoscopy system. In some embodiments, a first screen may display images captured by a first side viewing element, a second screen (positioned in the center) may display images captured by a front viewing element, and a third screen may display images captured by a second side viewing element. It should be understood that any number of discrete displays may be employed to accommodate each camera or viewing element and the images captured can be displayed on any corresponding display, either dedicated or set by the user.
  • In an embodiment, the ROI may denote an abnormality such as a polyp detected in a body cavity which the user may want to continuously track or display prominently during a medical procedure. The present specification allows a user to view an image of the detected polyp in the center of a display or screen, or on a central and/or front screen of the endoscopy system where the system has more than one display, thereby making it much easier to operate upon the polyp. In embodiments, the present specification allows a user to view an image of a detected polyp at a target region in the display screen.
  • The present specification is directed towards multiple embodiments. The following disclosure is provided in order to enable a person having ordinary skill in the art to practice the invention. Language used in this specification should not be interpreted as a general disavowal of any one specific embodiment or used to limit the claims beyond the meaning of the terms used therein. The general principles defined herein may be applied to other embodiments and applications without departing from the spirit and scope of the invention. Also, the terminology and phraseology used is for the purpose of describing exemplary embodiments and should not be considered limiting. Thus, the present invention is to be accorded the widest scope encompassing numerous alternatives, modifications and equivalents consistent with the principles and features disclosed. For purpose of clarity, details relating to technical material that is known in the technical fields related to the invention have not been described in detail so as not to unnecessarily obscure the present invention. In the description and claims of the application, each of the words “comprise” “include” and “have”, and forms thereof, are not necessarily limited to members in a list with which the words may be associated.
  • As used herein, the indefinite articles “a” and “an” mean “at least one” or “one or more” unless the context clearly dictates otherwise. It should be noted herein that any feature or component described in association with a specific embodiment may be used and implemented with any other embodiment unless clearly indicated otherwise.
  • FIG. 1 illustrates a multi-viewing element endoscopy system in which the method and system for repositioning a region of interest (ROI) from a first location to a second location, such as the center of a display or the focal point of the images being captured by the viewing elements of an endoscope during an endoscopic procedure, may be implemented. As would be apparent to persons of skill in the art, the method and system for repositioning a ROI from a first location to a second location may be implemented in any endoscope comprising one or more viewing elements for capturing images of the inside of a body cavity. Further, the method and system for keeping a ROI in the center of the images being captured by the viewing elements of an endoscope during an endoscopic procedure may be implemented in any endoscope system comprising at least one display screen.
  • Reference is now made to FIG. 1, which shows a multi-viewing elements endoscopy system 100. System 100 may include a multi-viewing elements endoscope 102. Multi-viewing elements endoscope 102 may include a handle 104, from which an elongated shaft 106 emerges. Elongated shaft 106 terminates with a tip section 108 which is turnable by way of a bending section 110. Handle 104 may be used for maneuvering elongated shaft 106 within a body cavity. The handle may include one or more buttons and/or knobs and/or switches 105 which control bending section 110 as well as functions such as fluid injection and suction. Handle 104 may further include at least one, and in some embodiments, one or more working channel openings 112 through which surgical tools may be inserted. In embodiments, the handle 104 also includes one or more side service/working channel openings.
  • Tip 108 may include multi-viewing elements. In accordance with an embodiment, tip 108 includes a front viewing element and one or more side viewing elements. In another embodiment, tip 108 may include only a front viewing element.
  • In addition, tip 108 may include one or more service/working channel exit points. In accordance with an embodiment, tip 108 includes a front service/working channel exit point and at least one side service channel exit point. In another embodiment, tip 108 may include two front service/working channel exit points.
  • A utility cable 114, also referred to as an umbilical tube, may connect between handle 104 and a Main Control Unit 199. Utility cable 114 may include therein one or more fluid channels and one or more electrical channels. The electrical channel(s) may include at least one data cable for receiving video signals from the front and side-pointing viewing elements, as well as at least one power cable for providing electrical power to the viewing elements and to the discrete illuminators.
  • The main control unit 199 contains the controls required for displaying the images of internal organs captured by the endoscope 102. The main control unit 199 may govern power transmission to the endoscope's 102 tip section 108, such as for the tip section's viewing elements and illuminators. The main control unit 199 may further control one or more fluid, liquid and/or suction pump(s) which supply corresponding functionalities to the endoscope 102. One or more input devices 118, such as a keyboard, a touch screen and the like may be connected to the main control unit 199 for the purpose of human interaction with the main control unit 199.
  • In the embodiment shown in FIG. 1, the main control unit 199 comprises a screen/display 120 for displaying operation information concerning an endoscopy procedure when the endoscope 102 is in use. The screen 120 may be configured to display images and/or video streams received from the viewing elements of the multi-viewing element endoscope 102. The screen 120 may further be operative to display a user interface for allowing a human operator to set various features of the endoscopy system.
  • Optionally, the video streams received from the different viewing elements of the multi-viewing element endoscope 102 may be displayed separately on at least one monitor (not shown) by uploading information from the main control unit 199, either side-by-side or interchangeably (namely, the operator may switch between views from the different viewing elements manually). Alternatively, these video streams may be processed by the main control unit 199 to combine them into a single, panoramic video frame, based on an overlap between fields of view of the viewing elements. In an embodiment, two or more displays may be connected to the main control unit 199, each for displaying a video stream from a different viewing element of the multi-viewing element endoscope 102. The main control unit 199 is described in U.S. Patent Application Ser. No. 14/263,896, entitled “Method and System for Video Processing in a Multi-Viewing Element Endoscope” and filed on Apr. 28, 2014, which is herein incorporated by reference in its entirety.
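  • The specification does not prescribe how the per-camera frames are composed for display; the following is a minimal sketch, in Python with OpenCV and NumPy, of one way three camera frames could be tiled side-by-side into a single output frame. The function name, target height, and layout are illustrative assumptions rather than the disclosed method.

      import cv2
      import numpy as np

      def compose_views(left, center, right, target_height=720):
          # Resize each camera frame to a common height, then tile the
          # frames side-by-side into one wide frame for a single monitor.
          def fit(frame):
              h, w = frame.shape[:2]
              scale = target_height / float(h)
              return cv2.resize(frame, (int(w * scale), target_height))
          return np.hstack([fit(left), fit(center), fit(right)])

  • A true panoramic frame based on overlapping fields of view would instead require image stitching (for example, OpenCV's cv2.Stitcher_create()), which is beyond this sketch.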
  • FIG. 2 illustrates another view of an endoscope in which the method and system for repositioning a region of interest from a first location to a second location on the display screen of the endoscope may be implemented. Endoscope 200 comprises a handle 204, from which an elongated shaft 206 emerges. Elongated shaft 206 terminates with a tip section 208 which is turnable by way of a bending section 210. The tip section 208 comprises at least one viewing element which is used to capture images of the body cavity, which are then displayed on a display screen coupled with the viewing element. Handle 204 may be used for maneuvering elongated shaft 206 within a body cavity. In various embodiments, handle 204 comprises a service channel opening 212 through which a medical tool may be inserted. Further, in various embodiments, tip 208 includes at least one service/working channel exit point from which the medical tool exits and is placed in close proximity to a region of interest/polyp on a wall of the body cavity being endoscopically examined. A utility cable 214 is used to connect handle 204 with a controller (not shown) via an electrical connector 215. The controller governs power transmission to the tip section 208. In an embodiment, the controller also controls one or more fluid, liquid and/or suction pumps which supply corresponding functionalities to endoscope 200. One or more input devices, such as a keyboard, a computer, a touch screen, a voice recognition system and the like, may be connected to the controller for the purpose of user interaction with the controller.
  • The method and system of the present specification may be applied to a multiple viewing elements endoscope having a front viewing element and side viewing elements. The method and system of the present specification enable a user to locate a polyp in a body cavity using a viewing element and then have the polyp displayed at a target location on a display screen coupled with the viewing element while operating on the polyp using medical tools inserted via a working channel of the endoscope located in proximity to the viewing element. The description provided herein is directed towards an endoscope having a working channel located near a front viewing element/camera so that a polyp observed in a body cavity, such as a colon, is displayed on a front display screen of the endoscope. However, it will be apparent to persons of skill that the method and system of the present specification can easily be applied to an endoscope having a side working channel, wherein the polyp is observed via a side viewing element positioned near the side working channel. In an embodiment, the system and method of the present specification also allows a user to identify the region of interest, such as a polyp, at a first location on a first screen and subsequently identify a second location on a second screen such that the system displays the region of interest at the second location on the second screen of the endoscope.
  • FIG. 3A is an exemplary schematic diagram showing a point of interest detected by a side viewing element and a chosen target point on a front viewing display screen, as can be seen using a multi-viewing element endoscope, such as the multi-viewing element endoscope 102 described in FIG. 1. Referring to FIG. 1 and FIG. 3A, multi-viewing element endoscope 102 is connected to the controller 199 through a utility cable 114. In embodiments, endoscope 102 provides three simultaneous endoscopic views using three cameras housed in the tip 108 of endoscope 102. Controller 199 is connected to three display screens 330a, 330b, and 330c, wherein each display screen may be configured to display a corresponding view of the three endoscopic views provided by endoscope 102. In an embodiment, the display screen 330b displays the view captured by a front camera and the display screens 330a and 330c display the views captured by two side cameras housed in tip 108 of the endoscope 102. In some embodiments, display screens 330a, 330b, and 330c are in the form of a single large screen 335. In some embodiments, the videos are either displayed separately from one another or stitched together to form a panoramic view.
  • In an embodiment, during an endoscopic procedure there is an interest in keeping a certain point of interest/region of interest (ROI) 322, shown on side display 330a, on the center display screen 330b, and more specifically at a target point (TP) 323 located on the center display screen 330b. For example, during an abnormality/polyp treatment, the abnormality should be located in front of the center viewing element, such that a medical tool emerging from the front working channel is easily detected by the front viewing element. In another embodiment, wherein the scope has a side working channel, the abnormality should preferably be located in front of the side viewing element, such that a medical tool emerging from the side working channel is easily detected by the side viewing element. However, due to the movement of either the colon or the multiple viewing elements, this task is not trivial and consumes time during the procedure. In subsequent sections, the present specification discloses systems and methods to enable displaying the ROI 322 at the required TP 323 on a specific display screen.
  • FIG. 3B is a flowchart depicting a method of repositioning a region of interest (ROI) from a first location on a display screen of the endoscope to a second location on the same display screen or on another display screen of the endoscope, in accordance with an embodiment of the present specification. At step 300, a region of interest (ROI) is detected in the body cavity during an endoscopic procedure. In various embodiments, a ROI is an abnormality, such as a polyp or a lesion, that may be detected in a body cavity such as a colon. The abnormality is detected via a viewing element of the endoscope, such as a front camera and/or a side camera, and displayed on the display screen of the endoscope. In an embodiment wherein the endoscope comprises a working channel located in proximity to a front viewing element/camera, there is an interest in keeping the detected abnormality within the field of view of the front viewing element, which is in data communication with the front display screen.
  • At step 302, the region of interest is identified by selecting/marking the same on the display screen. In an embodiment, the ROI is identified and marked by using a graphical user interface (GUI). In various embodiments, the medical practitioner may mark the ROI/polyp using known methods such as by a mouse click, a touch screen, a hand gesture, or eye tracking. Specifically, upon visually noticing a point of interest 322 on a display screen, a user may navigate an icon on the display to visually coincide with the displayed point of interest 322, resulting in an icon being visually superimposed over the point of interest 322. When a user clicks on the icon, now visually superimposed over the point of interest 322, the display coordinates of the icon are recorded by a processor in memory and those coordinates are associated with the displayed position of the point of interest 322.
  • In an embodiment, the endoscopy system comprises an integrated mouse on the scope handle which is used by the user to select an ROI. In an embodiment, the ROI is detected and marked using image tracking algorithms known in the art, such as the Lucas-Kanade algorithm. In another embodiment, the ROI is detected and marked by matching various neighboring pixels contained in that specific area of the image in accordance with their normalized cross-correlation with surrounding areas. In another embodiment, the ROI may be detected by using one or more sensors incorporated in the insertion tube of the endoscope. In embodiments, the system comprises sensors such as an accelerometer, a gyroscope, an electromagnetic tracker, or a combination of these sensors. In various embodiments, at least one sensor is used together with the camera to assess the location of a target or ROI relative to the camera. For example, in an embodiment, using an accelerometer, a three-dimensional scene is constructed using displacement estimation from two images taken at different locations and/or from different cameras. In some embodiments, structured light-based 3D estimation is used for 3D scene construction and assessment. In addition, the location of the sensor is important in various embodiments. For example, in an embodiment, an accelerometer must be placed within the distal tip such that it has the same motion as the camera. In another embodiment, a structured light emitter must have the largest parallax possible from the camera.
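  • As a hedged illustration of the normalized cross-correlation variant mentioned above, the sketch below (Python/OpenCV) re-locates a marked rectangular ROI patch in the next frame with template matching. The confidence threshold and the (x, y, w, h) box format are assumptions, and a Lucas-Kanade tracker (cv2.calcOpticalFlowPyrLK) could be substituted for the matching step.

      import cv2

      def track_roi(prev_frame, curr_frame, roi_box, min_score=0.8):
          # roi_box is (x, y, w, h) in prev_frame coordinates.
          x, y, w, h = roi_box
          template = prev_frame[y:y + h, x:x + w]
          # Normalized cross-correlation of the ROI patch against the new frame.
          scores = cv2.matchTemplate(curr_frame, template, cv2.TM_CCORR_NORMED)
          _, best, _, best_loc = cv2.minMaxLoc(scores)
          if best < min_score:
              return None  # track lost; the user may need to re-mark the ROI
          return (best_loc[0], best_loc[1], w, h)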
  • At step 304, a second/target location at which the region of interest is to be positioned is selected on the display screen. In an embodiment, the second location is positioned on the same display screen. In another embodiment comprising an endoscope with multiple display screens, the second location is positioned on a second display screen. In an embodiment, the target location is identified and marked by using a graphical user interface (GUI). In various embodiments, the medical practitioner may mark the target location using known methods such as by a mouse click, a touch screen, a hand gesture, or eye tracking. Specifically, upon visually identifying a target point 323 on a display screen, a user may navigate an icon on the display to visually coincide with the target point 323, resulting in an icon being visually superimposed over the target point 323. When a user clicks on the icon, now visually superimposed over the target point 323, the display coordinates of the icon are recorded by a processor in memory and those coordinates are associated with the target point 323. In an embodiment, the endoscopy system comprises an integrated mouse on the scope handle which is used by the user to select the target location. In some embodiments, the target point is positioned in the center of the display screen. In other embodiments, the target point is not positioned in the center of the display screen.
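  • A minimal sketch of recording the ROI and target-point coordinates from mouse clicks on the display window is shown below (Python/OpenCV). The window name and the convention that the first click marks the ROI and the second marks the target point are assumptions made for illustration.

      import cv2

      clicks = {}

      def on_mouse(event, x, y, flags, param):
          # First left click marks the ROI, second marks the target point.
          if event == cv2.EVENT_LBUTTONDOWN:
              key = "roi" if "roi" not in clicks else "target"
              clicks[key] = (x, y)

      cv2.namedWindow("endoscope view")
      cv2.setMouseCallback("endoscope view", on_mouse)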
  • At step 306, a suitable transformation to be applied to the detected ROI is estimated. In an embodiment, estimating the transformation to be applied to the detected ROI comprises determining an angle between the ROI and the viewing element/camera. The angle may be determined by using a camera calibration method. The calibration method comprises mapping each pixel in an image captured by the camera to a vector connecting the center of the camera lens with the corresponding point in the outside world.
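  • One way to carry out this angle determination can be sketched as follows (Python/OpenCV), assuming a calibrated pinhole camera with intrinsic matrix K and distortion coefficients dist; the helper names are illustrative and not taken from the specification.

      import numpy as np
      import cv2

      def pixel_to_ray(pixel, K, dist):
          # Map an image pixel to a unit viewing ray in the camera frame.
          pts = np.array([[pixel]], dtype=np.float64)    # shape (1, 1, 2)
          norm = cv2.undistortPoints(pts, K, dist)       # normalized (x, y)
          ray = np.array([norm[0, 0, 0], norm[0, 0, 1], 1.0])
          return ray / np.linalg.norm(ray)

      def angle_roi_to_target(roi_px, target_px, K, dist):
          # Angle (radians) between the rays through the ROI pixel and the
          # target pixel; a proxy for the tip rotation still required.
          r1 = pixel_to_ray(roi_px, K, dist)
          r2 = pixel_to_ray(target_px, K, dist)
          return float(np.arccos(np.clip(np.dot(r1, r2), -1.0, 1.0)))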
  • At step 308, the transformation estimated in step 306 is used to guide the user to position the ROI/abnormality at the target point (TP) or the second location of the display screen and to maintain the position while operating upon the abnormality by inserting medical tools via the working channel. In various embodiments, the transformation or movement to the second location is estimated as a distance in millimeters from the first location or as an angle relative to the first location. Exemplary transformations include pulling the distal tip 5 mm backward or moving the distal tip 30° to the left. In some embodiments, any transformation that places the ROI in the target location is acceptable. In an embodiment, the user is guided by means of icons, guides, or other data displayed on a graphical user interface (GUI) which directs a user to manually move the distal tip of the insertion portion of the endoscope in order to position the abnormality in the target region of the display screen. In another embodiment, the distal tip is moved automatically based on the applied transformation to position the abnormality in the target region of the display screen. In an embodiment, in an automatic system, electrical motors are used to move the distal tip and facilitate the transformation. In yet another embodiment, the abnormality is positioned in the target region of the display screen by using a combination of manual and automatic methods. In an embodiment comprising a semi-automatic system, the angular rotation of the tip section is automatic, while the longitudinal movement of the complete insertion tube is performed manually by the user. At step 310, it is determined whether the ROI is positioned at the target point of the display screen. If the ROI is not located at the target point, method steps 306 to 310 are repeated.
  • FIG. 3C and FIG. 3D illustrate a navigation process to move a region of interest from a first location on the display screen to a target point/second location on the display screen, in accordance with an embodiment of the present specification. In FIG. 3C, a structure of interest (polyp) 320 is shown at the top of the image 325. The polyp 320 is captured in the image at a point 321 which is termed the region of interest (ROI). The goal of the navigation method of the present specification is to move the distal tip 326 of an endoscope capturing the image 325 such that the image ROI 321 is shifted to a desired target region 322, which in an embodiment is in the center of the image. As shown in the figure, the distal tip 326 is required to be rotated in a clockwise direction. This is done via a feedback process, wherein at each step the position of the ROI is detected and a transformation is computed and used to move the distal tip 326 to bring ROI 321 as near as possible to the target region 322. In FIG. 3C, four different image snapshots I, II, III and IV are shown that represent the position of the ROI at four different time steps, such that at each step a unique transformation is computed based on the position of the ROI at that time. Image snapshot I represents the starting position of the ROI and the target position, and in each subsequent snapshot, due to the movement of the distal tip 326 controlled by the computed transformations, the ROI 321 is positioned closer to the target position as compared to the previous snapshot. Finally, in snapshot IV, the ROI 321 is positioned in the target region 322. One of ordinary skill in the art can appreciate that the number of steps required to bring the ROI over the target region can vary depending on the starting positions of the ROI and the target region and the efficiency and accuracy of the system deployed for facilitating the navigation process. In an embodiment, the system has an option wherein a user can lock the position of the ROI once it is positioned at the desired point on the display screen. In an embodiment, the position of the ROI is locked by monitoring the movements of the endoscope and applying a compensation, based on said movements, to the distal tip. Once the position of the ROI is locked, the ROI is subsequently always shown at the same location on the display screen irrespective of the movement of the tip section of the endoscopy system.
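  • The feedback process of FIG. 3C can be sketched as a simple proportional loop (Python). Here get_frame, detect_roi, and command_tip_rotation are hypothetical hooks into the video source, the ROI tracker, and the manual or motorized tip control, and the gain, tolerance, and step limit are illustrative values rather than disclosed parameters.

      import numpy as np

      def navigate_to_target(get_frame, detect_roi, command_tip_rotation,
                             target_px, tol_px=10, gain=0.3, max_steps=50):
          for _ in range(max_steps):
              roi_px = detect_roi(get_frame())
              if roi_px is None:
                  return False                       # track lost
              err = np.subtract(target_px, roi_px)   # pixel error vector
              if np.hypot(err[0], err[1]) < tol_px:
                  return True                        # ROI is on the target point
              # Small step toward the target; the error is re-estimated on every
              # iteration, so colon and scope motion are absorbed by the feedback.
              command_tip_rotation(gain * err)
          return False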
  • In embodiments, the transformation can be applied automatically or manually. In case an automatic transformation is applied, electrical motors rotate the distal tip in the desired directions, and/or push/pull the insertion tube in and out. In case a manual transformation is applied, the user receives the transformation instructions through an API (Application Program Interface) and applies them directly to the distal tip. Such an API can be implemented, for example, by showing the user a scheme of the scope knobs with arrows that indicate the transformation direction. In an embodiment, the direction and length of the arrows change dynamically to indicate the transformation needed at each step.
  • FIG. 3D also illustrates the navigation process as depicted in FIG. 3C, wherein the incremental movement of the ROI towards the target region is clearly shown in each image snapshot. In the image snapshots II, III and IV shown in FIG. 3D, the current position of the ROI is shown as 321, while the last position of the ROI (prior to application of the last transformation) is depicted as 321p.
  • FIG. 4 illustrates a camera calibration method for determining the transformation to be applied to an endoscope tip for moving a region of interest to a target point on the display, in accordance with an embodiment of the present specification. A camera (in the endoscope's tip) is calibrated to obtain a transformation to be applied to move the endoscope tip, and thereby the region of interest, to the target point. The calibration method comprises mapping each pixel in an image captured by the camera to a vector connecting the center of the camera lens with the outside world. During the calibration process, a ray-orientation pixel mapping is obtained with respect to an object/abnormality 402 in the outside world by measuring a distance between the object/abnormality 402 and a pixel 410 located in its image 404 taken by a camera lens 406. Similarly, a ray-orientation pixel mapping is obtained by measuring a distance between the object/abnormality 402 and another pixel 412 located in the image 404. By obtaining multiple such mappings with respect to a camera, the calibration parameters of the camera may be obtained by using a camera calibration algorithm. In an embodiment of the present specification, the Zhang algorithm/method is used for the purpose of camera calibration. The camera calibration provides information on where a 3D object located at some coordinate relative to the camera would appear in the image. To estimate the transformation, the location at which the object would appear in the image, given some candidate transformation applied to the camera, is predicted. The camera parameters needed for this prediction are obtained by calibration. An optimization process is then performed to improve the transformation until a suitable transformation is reached.
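  • A minimal sketch of Zhang-style calibration from checkerboard views, using OpenCV, is given below. The board dimensions and square size are assumptions about the calibration target, not values taken from the specification, and at least a few valid views are assumed to be supplied.

      import cv2
      import numpy as np

      def calibrate(images, board=(9, 6), square_mm=2.0):
          # Build the 3D checkerboard corner coordinates on the z = 0 plane.
          objp = np.zeros((board[0] * board[1], 3), np.float32)
          objp[:, :2] = np.mgrid[0:board[0], 0:board[1]].T.reshape(-1, 2) * square_mm
          obj_pts, img_pts, size = [], [], None
          for img in images:
              gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
              size = gray.shape[::-1]
              found, corners = cv2.findChessboardCorners(gray, board)
              if found:
                  obj_pts.append(objp)
                  img_pts.append(corners)
          # Zhang's method as implemented by cv2.calibrateCamera: returns the
          # reprojection error, the intrinsic matrix K, and distortion coefficients.
          rms, K, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts, size, None, None)
          return K, dist, rms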
  • FIGS. 5A to 5G illustrate a schema demonstrating the steps for maintaining a ROI in a center of a front display screen of an endoscopy system, in accordance with an embodiment of the present specification. It should be understood by those of ordinary skill in the art that the methods of the present specification can be employed with systems having any number of screens. In an embodiment, the endoscopy system has more than one display screen. In another embodiment, the endoscopy system comprises only one screen and in this case the ROI can be positioned at a central focal point (or any other target region provided by the user) for the ease of viewing and operation.
  • FIGS. 5A-5G illustrate displays 501, 506 and 508 associated with a left side camera, a front camera and a right side camera, respectively. First, an insertion portion of an endoscope is inserted into a body cavity and the internal walls of the cavity are examined by a medical practitioner via the endoscopic images displayed on a monitoring screen coupled with a viewing element located in the tip of the insertion portion, until a polyp is observed on an internal wall of the body cavity. FIG. 5A illustrates a polyp 502 located on the wall of a body cavity, such as a colon, captured via a viewing element of an endoscope. Next, the medical practitioner selects and marks the polyp as a region of interest, as discussed above.
  • FIG. 5B illustrates the polyp 502 marked as a ROI. In an embodiment, the ROI may be selected and marked by using a graphical user interface (GUI). In various embodiments, the medical practitioner may mark the ROI/polyp 502 using known methods such as by a mouse click, a touch screen, a hand gesture, or eye tracking. In an embodiment, the endoscopy system comprises an integrated mouse on the scope handle which is used by the user to select ROI 502. In various embodiments, the marked ROI is detected by a software program running on a processor coupled with the endoscope and the monitoring screen by using any suitable image detection method.
  • FIG. 5C illustrates the polyp/ROI 502 being detected by using an image detection algorithm. In an embodiment, upon detection of the ROI an arrow 504 appears on the central display monitor 506 of the endoscopy system as shown in FIG. 5D.
  • FIG. 5D illustrates guiding arrow 504 on central display screen 506. The arrow 504 guides the medical practitioner to move the tip of the endoscope in a direction that would enable the ROI to be displayed in the center of the central display screen 506. A direction of rotation is computed by determining the rotational direction between the direction of the camera corresponding to the target point and the direction of the camera corresponding to the ROI. In an embodiment, the amount of movement in the computed direction is determined in a feedback loop, where real-time sampling determines the new rotation direction and a small rotation in this direction is applied.
  • In an embodiment of the present specification comprising a calibrated camera, the rotational or angular direction between the camera center, the z-axis and any pixel in the image is known. Thus, one can compute the angle between the camera, the ROI and the target location in the image and, accordingly, the rotation that should be applied to the camera to put the ROI on the target location can be estimated. After applying the transformation, the new ROI location is detected in the image and the process is repeated in an iterative manner until the ROI is placed on top of the target. An illustrative navigation process to reposition a ROI (region of interest) from a first location on a display screen to a TP (target point) at a second location in the same or another display screen is also described in FIG. 3C and FIG. 3D.
  • FIG. 5E illustrates the guiding arrow repositioned toward the polyp/ROI to compensate for movement of the endoscope's tip. Thus, FIG. 5E shows that if the endoscope's tip is moved in an upward direction, the arrow 504 repositions to point in a downward direction towards the polyp/ROI 502. Upon moving the endoscope tip in accordance with the direction pointed to by the arrow 504, the polyp/ROI 502 is caused to be displayed on the central display screen 506.
  • FIG. 5F shows the polyp/ROI 502 being displayed on the central screen 506 with the arrow 504 pointing towards the direction in which the endoscope tip should be moved in order to position the polyp/ROI 502 in the center of the display screen 506. FIG. 5G illustrates the polyp/ROI 502 positioned in the center of the central display screen 506. Once positioned in the center, the polyp may be operated upon easily.
  • FIGS. 6A and 6B illustrate exemplary screenshots 611, 621 of an application program interface depicting instructions to move a knob 605 on an endoscope handle 604 in a clockwise direction and counterclockwise direction, respectively, in accordance with an embodiment of the present specification. A display of the application program interface shows the user how to move one or more knobs of the endoscope handle in order to move the distal tip of the endoscope in the appropriate direction to position a ROI in the display screen (move the ROI to a second location). In some embodiments, referring to FIG. 6A, the application program interface uses a clockwise curving arrow 603 with text stating “Move the endoscope distally” 609 to instruct the user to rotate knob 605 in a clockwise direction to move the endoscope in a distal direction in order to position the ROI in the second location/display screen. In some embodiments, referring to FIG. 6B, the application program interface uses a counterclockwise curving arrow 613 with text stating “Move the endoscope proximally” 619 to instruct the user to rotate knob 605 in a counterclockwise direction to move the endoscope in a proximal direction in order to position the ROI in the second location/display screen. In various embodiments, the application program interface uses superimposed arrows, text, and other graphics to direct the user to move the endoscope tip in the correct direction in order to position the ROI in the second location/display screen. In various embodiments, the direction and length of the arrows or other graphics and wording of the text change dynamically as they indicate the required movements needed to position the ROI in the second location/display screen.
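  • A minimal sketch of how the interface text of FIGS. 6A and 6B could be driven from the remaining error is shown below (Python). The sign convention linking a positive error to clockwise rotation, and the deadband value, are assumptions made for illustration.

      def knob_instruction(signed_error_deg, deadband_deg=2.0):
          # Translate the remaining angular error into the on-screen knob cue.
          if abs(signed_error_deg) < deadband_deg:
              return "ROI on target - hold position"
          if signed_error_deg > 0:
              return "Rotate the knob clockwise (move the endoscope distally)"
          return "Rotate the knob counterclockwise (move the endoscope proximally)"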
  • The above examples are merely illustrative of the many applications of the system of present invention. Although only a few embodiments of the present invention have been described herein, it should be understood that the present invention might be embodied in many other specific forms without departing from the spirit or scope of the invention. Therefore, the present examples and embodiments are to be considered as illustrative and not restrictive, and the invention may be modified within the scope of the appended claims.

Claims (21)

1-21. (canceled)
22. A system for displaying a region of interest within a medical image on a target area of at least one display screen, the system comprising:
an insertion portion having a tip section with at least one viewing element for capturing images comprising the region of interest;
the at least one display screen coupled to the at least one viewing element; and
one or more processors configured for:
receiving an image of the region of interest captured through the at least one viewing element;
transmitting the image of the region of interest to the at least one display screen;
recording a demarcation of the region of interest on the display screen;
recording a demarcation of the target area on the at least one display screen;
determining a transformation to be applied to the region of interest; and
transmitting the determined transformation to the at least one display screen to provide guidance to direct a movement of the insertion portion toward the target area based on the recorded demarcation of the target area such that the region of interest is positioned in the target area on the at least one display screen.
23. The system of claim 22, wherein the target area comprises a center of one of the at least one display screen.
24. The system of claim 22, wherein the insertion portion comprises a working channel located proximate to the front viewing element.
25. The system of claim 22, wherein the at least one viewing element includes a front viewing element, wherein the at least one display screen includes a central display screen coupled with the front viewing element.
26. The system of claim 22, wherein the one or more processors are further configured for receiving a user's selection of a portion of the at least one display screen using a mouse click, an actuator present on a handle of the medical device, a touch screen, a hand gesture, or eye tracking, wherein the selection is indicative of the region of interest.
27. The system of claim 22, wherein determining a transformation to be applied to the region of interest comprises determining an angle between the region of interest and the at least one viewing element.
28. The system of claim 22, wherein the guidance to direct a movement of the insertion portion toward the target area includes generating and electronically displaying a visual guide that directs a movement of the insertion portion such that the region of interest moves into the target area on the at least one display screen.
29. The system of claim 22, wherein the one or more processors are further configured for transmitting the determined transformation to one or more motors to automatically move the insertion portion in order to position the region of interest in the target area.
30. One or more processors for executing a plurality of programmatic instructions, stored in non-transient memory, for displaying a region of interest within a medical image on a target area of at least one display screen, the plurality of programmatic instructions comprising:
capturing an image of the region of interest through at least one viewing element of an insertion portion of a medical device, wherein the insertion portion has a tip section including the at least one viewing element;
displaying the region of interest on the at least one display screen;
recording a demarcation of the region of interest on the display screen;
recording a demarcation of the target area on the display screen;
determining a transformation to be applied to the region of interest; and
applying the determined transformation to provide guidance to direct a movement of the insertion portion toward the target area based on said recorded demarcation of the target area such that the region of interest is positioned in the target area on the display screen.
31. The one or more processors of claim 30, wherein determining a transformation to be applied to the region of interest comprises determining an angle between the region of interest and the at least one viewing element.
32. The one or more processors of claim 30, wherein applying the determined transformation comprises generating and electronically displaying a visual guide that directs a movement of the insertion portion such that the region of interest moves into the target area on the display screen.
33. The one or more processors of claim 32, wherein the visual guide includes an image of a handle of the medical device and an indicator indicating a direction to move an actuator of the handle.
34. The one or more processors of claim 30, wherein applying the determined transformation comprises transmitting the determined transformation to one or more motors to automatically move the insertion portion in order to position the region of interest in the target area.
35. The one or more processors of claim 30, wherein recording the demarcation of the region of interest on the display screen and/or recording the demarcation of the target area on the display screen includes receiving a user's selection of a portion of the at least one display screen using a mouse click, an actuator present on a handle of the medical device, a touch screen, a hand gesture, or eye tracking, wherein the selection is indicative of the region of interest.
36. A system for displaying a region of interest within a medical image on a target area of at least one display screen, the system comprising:
a medical device with at least one viewing element for capturing images comprising the region of interest;
the at least one display screen coupled to the at least one viewing element; and
one or more processors configured for:
receiving an image of the region of interest captured through the at least one viewing element;
transmitting the image of the region of interest to the at least one display screen;
recording a demarcation of the region of interest on the display screen;
recording a demarcation of the target area on the at least one display screen;
determining a transformation to be applied to the region of interest, wherein the transformation comprises generating and displaying on the at least one display screen a visual guide that directs a movement of the medical device such that a portion of the medical device moves towards the region of interest; and
transmitting the determined transformation to the at least one display screen.
37. The system of claim 36, wherein the target area comprises the center of one of the at least one display screen.
38. The system of claim 36, wherein the at least one viewing element includes a front viewing element, wherein the at least one display screen includes a central display screen coupled with the front viewing element.
39. The system of claim 36, wherein the one or more processors are further configured for receiving a user's selection of a portion of the at least one display screen using a mouse click, an actuator present on a handle of the medical device, a touch screen, a hand gesture, or eye tracking, wherein the selection is indicative of the region of interest.
40. The system of claim 36, wherein determining a transformation to be applied to the region of interest comprises determining an angle between the region of interest and the at least one viewing element.
41. The system of claim 36, wherein the visual guide includes an image of an endoscope handle and an arrow indicating a direction of rotation of a knob on the endoscope handle.
US16/382,589 2016-03-14 2019-04-12 System and method for guiding and tracking a region of interest using an endoscope Abandoned US20190231167A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US16/382,589 US20190231167A1 (en) 2016-03-14 2019-04-12 System and method for guiding and tracking a region of interest using an endoscope

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201662307779P 2016-03-14 2016-03-14
US15/458,239 US10292570B2 (en) 2016-03-14 2017-03-14 System and method for guiding and tracking a region of interest using an endoscope
US16/382,589 US20190231167A1 (en) 2016-03-14 2019-04-12 System and method for guiding and tracking a region of interest using an endoscope

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US15/458,239 Continuation US10292570B2 (en) 2016-03-14 2017-03-14 System and method for guiding and tracking a region of interest using an endoscope

Publications (1)

Publication Number Publication Date
US20190231167A1 true US20190231167A1 (en) 2019-08-01

Family

ID=59788280

Family Applications (2)

Application Number Title Priority Date Filing Date
US15/458,239 Active 2037-06-27 US10292570B2 (en) 2016-03-14 2017-03-14 System and method for guiding and tracking a region of interest using an endoscope
US16/382,589 Abandoned US20190231167A1 (en) 2016-03-14 2019-04-12 System and method for guiding and tracking a region of interest using an endoscope

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US15/458,239 Active 2037-06-27 US10292570B2 (en) 2016-03-14 2017-03-14 System and method for guiding and tracking a region of interest using an endoscope

Country Status (2)

Country Link
US (2) US10292570B2 (en)
WO (1) WO2017160792A1 (en)

US4727859A (en) 1986-12-29 1988-03-01 Welch Allyn, Inc. Right angle detachable prism assembly for borescope
JP2697822B2 (en) 1987-05-25 1998-01-14 オリンパス光学工業株式会社 Endoscope objective lens
JP2804267B2 (en) 1988-05-02 1998-09-24 オリンパス光学工業株式会社 Endoscope objective lens
US4825850A (en) 1988-05-13 1989-05-02 Opielab, Inc. Contamination protection system for endoscope control handles
JPH0223931A (en) 1988-07-13 1990-01-26 Asahi Optical Co Ltd Brake means of curving device of endoscope
JPH07122692B2 (en) 1988-09-29 1995-12-25 富士写真光機株式会社 Objective lens for endoscope
US5007406A (en) 1988-11-04 1991-04-16 Asahi Kogaku Kogyo Kabushiki Kaisha Bending control device of endoscope
US5193525A (en) 1990-11-30 1993-03-16 Vision Sciences Antiglare tip in a sheath for an endoscope
US5224929A (en) 1990-12-21 1993-07-06 C. R. Bard, Inc. Irrigation/aspiration cannula and valve assembly
JPH0827429B2 (en) 1991-03-04 1996-03-21 オリンパス光学工業株式会社 Objective lens for endoscope
DE69228410T2 (en) 1991-08-21 1999-07-08 Smith & Nephew Inc Liquid treatment system
US5359456A (en) 1991-10-15 1994-10-25 Olympus Optical Co., Ltd. Objective lens system for endoscopes
DE69321963T2 (en) 1992-09-01 1999-04-01 Adair Edwin Lloyd STERILIZABLE ENDOSCOPE WITH A DETACHABLE DISPOSABLE PIPE ARRANGEMENT
US5630782A (en) 1992-09-01 1997-05-20 Adair; Edwin L. Sterilizable endoscope with separable auxiliary assembly
US5685821A (en) 1992-10-19 1997-11-11 Arthrotek Method and apparatus for performing endoscopic surgical procedures
US5836894A (en) 1992-12-21 1998-11-17 Artann Laboratories Apparatus for measuring mechanical parameters of the prostate and for imaging the prostate using such parameters
JP3372980B2 (en) 1993-01-22 2003-02-04 オリンパス光学工業株式会社 Endoscope
US5863286A (en) 1993-01-27 1999-01-26 Olympus Optical Company, Ltd. Endoscope system including endoscope and disposable protection cover
US5674182A (en) 1993-02-26 1997-10-07 Olympus Optical Co., Ltd. Endoscope system including endoscope and protection cover
US5460167A (en) 1993-03-04 1995-10-24 Olympus Optical Co., Ltd. Endoscope cover with channel
US5569157A (en) 1993-05-07 1996-10-29 Olympus Optical Co., Ltd. Endoscope
US5475420A (en) 1993-06-09 1995-12-12 Origin Medsystems, Inc. Video imaging system with image processing optimized for small-diameter endoscopes
US5447148A (en) 1993-07-08 1995-09-05 Vision Sciences, Inc. Endoscopic contamination protection system to facilitate cleaning of endoscopes
US5830121A (en) 1993-10-27 1998-11-03 Asahi Kogaku Kogyo Kabushiki Kaisha Endoscopic apparatus having an endoscope and a peripheral device wherein total usage of the endoscope is quantified and recorded
JP3392920B2 (en) 1993-11-26 2003-03-31 ペンタックス株式会社 Endoscope tip
JP3272516B2 (en) 1993-11-18 2002-04-08 旭光学工業株式会社 Endoscope tip
US5725477A (en) 1993-11-18 1998-03-10 Asahi Kogaku Kogyo Kabushiki Kaisha Front end structure of endoscope
ATE196809T1 (en) 1993-11-29 2000-10-15 Olympus Optical Co ARRANGEMENT FOR IMAGE ROTATION AND OVERLAYING
US5395329A (en) 1994-01-19 1995-03-07 Daig Corporation Control handle for steerable catheter
US5464007A (en) 1994-02-23 1995-11-07 Welch Allyn, Inc. Fluid insensitive braking for an endoscope
US5575755A (en) 1994-02-23 1996-11-19 Welch Allyn, Inc. Fluid insensitive braking for an endoscope
US5685823A (en) 1994-03-30 1997-11-11 Asahi Kogaku Kogyo Kabushiki Kaisha End structure of endoscope
US5547455A (en) 1994-03-30 1996-08-20 Medical Media Systems Electronically steerable endoscope
US5782751A (en) 1994-05-26 1998-07-21 Asahi Kogaku Kogyo Kabushiki Kaisha Side view endoscope
JP3379821B2 (en) 1994-05-31 2003-02-24 オリンパス光学工業株式会社 Endoscope
US5518502A (en) 1994-06-08 1996-05-21 The United States Surgical Corporation Compositions, methods and apparatus for inhibiting fogging of endoscope lenses
JP3051035B2 (en) 1994-10-18 2000-06-12 富士写真光機株式会社 Objective lens for endoscope
US5940126A (en) 1994-10-25 1999-08-17 Kabushiki Kaisha Toshiba Multiple image video camera apparatus
JPH0910170A (en) 1995-06-29 1997-01-14 Olympus Optical Co Ltd Objective optical system of endoscope
JPH0980305A (en) 1995-09-11 1997-03-28 Fuji Photo Optical Co Ltd Endoscope objective lens
US5810717A (en) 1995-09-22 1998-09-22 Mitsubishi Cable Industries, Ltd. Bending mechanism and stereoscope using same
US5810715A (en) 1995-09-29 1998-09-22 Olympus Optical Co., Ltd. Endoscope provided with function of being locked to flexibility of insertion part which is set by flexibility modifying operation member
US6117068A (en) 1995-10-19 2000-09-12 Elite Genetics, Inc Artificial insemination system
US5725478A (en) 1996-03-14 1998-03-10 Saad; Saad A. Methods and apparatus for providing suction and/or irrigation in a rigid endoscope while maintaining visual contact with a target area through the endoscope
US5860913A (en) 1996-05-16 1999-01-19 Olympus Optical Co., Ltd. Endoscope whose distal cover can be freely detachably attached to main distal part thereof with high positioning precision
JPH1043129A (en) 1996-08-02 1998-02-17 Olympus Optical Co Ltd Endoscope
US7018331B2 (en) 1996-08-26 2006-03-28 Stryker Corporation Endoscope assembly useful with a scope-sensing light cable
DE19636152C2 (en) 1996-09-06 1999-07-01 Schneider Co Optische Werke Compact wide angle lens
US5810770A (en) 1996-12-13 1998-09-22 Stryker Corporation Fluid management pump system for surgical procedures
JP4054094B2 (en) 1996-12-27 2008-02-27 オリンパス株式会社 Electronic endoscope
AUPO478397A0 (en) 1997-01-31 1997-02-20 Fairmont Medical Products Pty Ltd Endoscopic drape
US6058109A (en) 1997-02-04 2000-05-02 The Kohl Group, Inc. Combined uniform rate and burst rate transmission system
US6095970A (en) 1997-02-19 2000-08-01 Asahi Kogaku Kogyo Kabushiki Kaisha Endoscope
JPH10239740A (en) 1997-02-28 1998-09-11 Toshiba Corp Endoscope device
US5924976A (en) 1997-08-21 1999-07-20 Stelzer; Paul Minimally invasive surgery device
US20040052076A1 (en) 1997-08-26 2004-03-18 Mueller George G. Controlled lighting methods and apparatus
EP1009283A1 (en) 1997-08-28 2000-06-21 Qualia Computing, Inc. Method and system for automated detection of clustered microcalcifications from digital mammograms
US20110034769A1 (en) 1997-10-06 2011-02-10 Micro-Imaging Solutions Llc Reduced area imaging device incorporated within wireless endoscopic devices
US6095971A (en) 1997-10-22 2000-08-01 Fuji Photo Optical Co., Ltd. Endoscope fluid controller
JPH11137512A (en) 1997-11-07 1999-05-25 Toshiba Corp Endoscopic equipment
US7598686B2 (en) 1997-12-17 2009-10-06 Philips Solid-State Lighting Solutions, Inc. Organic light emitting diode methods and apparatus
US6277064B1 (en) 1997-12-30 2001-08-21 Inbae Yoon Surgical instrument with rotatably mounted offset endoscope
US6196967B1 (en) 1998-03-18 2001-03-06 Linvatec Corporation Arthroscopic component joining system
US6419626B1 (en) 1998-08-12 2002-07-16 Inbae Yoon Surgical instrument endoscope with CMOS image sensor and physical parameter sensor
JP4014186B2 (en) 1998-11-30 2007-11-28 フジノン株式会社 Endoscope objective lens
JP3704248B2 (en) 1999-02-04 2005-10-12 ペンタックス株式会社 Endoscope operation wire connection structure
US7015950B1 (en) 1999-05-11 2006-03-21 Pryor Timothy R Picture taking method and apparatus
JP4245731B2 (en) 1999-06-08 2009-04-02 オリンパス株式会社 Liquid crystal lens unit and liquid crystal lens assembly
US6690337B1 (en) 1999-06-09 2004-02-10 Panoram Technologies, Inc. Multi-panel video display
IL143258A0 (en) 2001-05-20 2002-04-21 Given Imaging Ltd A method for in vivo imaging of the gastrointestinal tract in unmodified conditions
US20020087047A1 (en) 1999-09-13 2002-07-04 Visionscope, Inc. Miniature endoscope system
JP2001095747A (en) 1999-09-30 2001-04-10 Olympus Optical Co Ltd Electronic endoscope
JP2001177824A (en) 1999-12-17 2001-06-29 Asahi Optical Co Ltd Signal changeover device for electronic endoscope
US6699179B2 (en) 2000-01-27 2004-03-02 Scimed Life Systems, Inc. Catheter introducer system for exploration of body cavities
WO2001069919A1 (en) 2000-03-10 2001-09-20 Datacube, Inc. Image processing system using an array processor
IL138632A (en) 2000-09-21 2008-06-05 Minelu Zonnenschein Multiple view endoscopes
US6837846B2 (en) 2000-04-03 2005-01-04 Neo Guide Systems, Inc. Endoscope having a guide tube
US6712760B2 (en) 2000-04-10 2004-03-30 Pentax Corporation Television device of portable endoscope
US6673012B2 (en) 2000-04-19 2004-01-06 Pentax Corporation Control device for an endoscope
US20010055062A1 (en) 2000-04-20 2001-12-27 Keiji Shioda Operation microscope
WO2002007587A2 (en) 2000-07-14 2002-01-31 Xillix Technologies Corporation Compact fluorescent endoscopy video system
JP3945133B2 (en) 2000-08-02 2007-07-18 フジノン株式会社 Endoscope observation window cleaning device
US6717092B2 (en) 2000-08-11 2004-04-06 Pentax Corporation Method of manufacturing treatment instrument of endoscope
KR100870033B1 (en) 2001-01-16 2008-11-21 기븐 이미징 리미티드 System and method for wide field imaging of body lumens
WO2002062262A2 (en) 2001-02-02 2002-08-15 Insight Instruments, Inc. Endoscope system and method of use
JP3939158B2 (en) 2001-02-06 2007-07-04 オリンパス株式会社 Endoscope device
US20020109771A1 (en) 2001-02-14 2002-08-15 Michael Ledbetter Method and system for videoconferencing
JP2002263055A (en) 2001-03-12 2002-09-17 Olympus Optical Co Ltd Tip hood for endoscope
DE10121450A1 (en) 2001-04-27 2002-11-21 Storz Endoskop Gmbh Schaffhaus Optical instrument, in particular an endoscope, with an exchangeable head
US6618205B2 (en) 2001-05-14 2003-09-09 Pentax Corporation Endoscope objective optical system
DE60228165D1 (en) 2001-05-16 2008-09-25 Olympus Corp Endoscope with image processing device
US7231135B2 (en) 2001-05-18 2007-06-12 Pentax Of American, Inc. Computer-based video recording and management system for medical diagnostic equipment
US7123288B2 (en) 2001-09-28 2006-10-17 Fujinon Corporation Electronic endoscope eliminating influence of light distribution in optical zooming
US6980227B2 (en) 2001-10-01 2005-12-27 Pentax Corporation Electronic endoscope with light-amount adjustment apparatus
US8038602B2 (en) 2001-10-19 2011-10-18 Visionscope Llc Portable imaging system employing a miniature endoscope
US6863651B2 (en) 2001-10-19 2005-03-08 Visionscope, Llc Miniature endoscope with imaging fiber system
US20070167681A1 (en) 2001-10-19 2007-07-19 Gill Thomas J Portable imaging system employing a miniature endoscope
EP1313066B1 (en) 2001-11-19 2008-08-27 STMicroelectronics S.r.l. A method for merging digital images to obtain a high dynamic range digital image
US7428019B2 (en) 2001-12-26 2008-09-23 Yeda Research And Development Co. Ltd. System and method for increasing space or time resolution in video
US20030158503A1 (en) 2002-01-18 2003-08-21 Shinya Matsumoto Capsule endoscope and observation system that uses it
JP4147033B2 (en) 2002-01-18 2008-09-10 オリンパス株式会社 Endoscope device
US20030153897A1 (en) 2002-02-12 2003-08-14 Russo Ronald D. Closed system drainage and infusion connector valve
US8988599B2 (en) 2010-08-31 2015-03-24 University Of Southern California Illumination sphere with intelligent LED lighting units in scalable daisy chain with interchangeable filters
JP4170685B2 (en) 2002-07-05 2008-10-22 フジノン株式会社 Endoscope bending operation mechanism
US20040061780A1 (en) 2002-09-13 2004-04-01 Huffman David A. Solid-state video surveillance system
CA2498578C (en) 2002-09-13 2011-11-22 Karl Storz Imaging, Inc. Video recording and image capture device
JP4131012B2 (en) 2002-10-10 2008-08-13 Hoya株式会社 Endoscope with sheath
US7097632B2 (en) 2002-10-28 2006-08-29 Sherwood Services Ag Automatic valve
US7537561B2 (en) 2002-11-27 2009-05-26 Olympus Corporation Endoscope apparatus
FR2857200B1 (en) 2003-07-01 2005-09-30 Tokendo VIDEO PROCESSOR FOR ENDOSCOPY
DE102004006541B4 (en) 2003-02-10 2016-11-10 Hoya Corp. endoscope
US7027231B2 (en) 2003-02-14 2006-04-11 Fujinon Corporation Endoscope objective lens
US7160249B2 (en) 2003-03-28 2007-01-09 Olympus Corporation Endoscope image pickup unit for picking up magnified images of an object, a focus adjustment apparatus and method, and a focus range check apparatus and method for the same
US8118732B2 (en) 2003-04-01 2012-02-21 Boston Scientific Scimed, Inc. Force feedback control system for video endoscope
US20050222499A1 (en) 2003-04-01 2005-10-06 Banik Michael S Interface for video endoscope system
WO2004096008A2 (en) 2003-05-01 2004-11-11 Given Imaging Ltd. Panoramic field of view imaging device
JP4144444B2 (en) 2003-06-20 2008-09-03 フジノン株式会社 Endoscope fluid delivery device
US7153259B2 (en) 2003-09-01 2006-12-26 Olympus Corporation Capsule type endoscope
US7154378B1 (en) 2003-09-11 2006-12-26 Stryker Corporation Apparatus and method for using RFID to track use of a component within a device
JP4533695B2 (en) 2003-09-23 2010-09-01 オリンパス株式会社 Treatment endoscope
US20050251127A1 (en) 2003-10-15 2005-11-10 Jared Brosch Miniature ultrasonic transducer with focusing lens for intracardiac and intracavity applications
WO2005046462A1 (en) 2003-11-14 2005-05-26 Apricot Co., Ltd. Endoscope device and imaging method using the same
WO2005053539A1 (en) 2003-12-02 2005-06-16 Olympus Corporation Ultrasonographic device
EP1707102B1 (en) 2004-01-19 2010-05-05 Olympus Corporation Capsule type medical treatment device
EP2572626B1 (en) 2004-01-19 2016-03-23 Olympus Corporation Capsule type endoscope
JP4448348B2 (en) 2004-03-10 2010-04-07 Hoya株式会社 Endoscope water channel
AU2005228956B2 (en) 2004-03-23 2011-08-18 Boston Scientific Limited In-vivo visualization system
JP2005278762A (en) 2004-03-29 2005-10-13 Fujinon Corp Centesis type probe for endoscope
US7976462B2 (en) 2004-04-06 2011-07-12 Integrated Endoscopy, Inc. Endoscope designs and methods of manufacture
US20050272977A1 (en) 2004-04-14 2005-12-08 Usgi Medical Inc. Methods and apparatus for performing endoluminal procedures
US8512229B2 (en) 2004-04-14 2013-08-20 Usgi Medical Inc. Method and apparatus for obtaining endoluminal access
US8277373B2 (en) 2004-04-14 2012-10-02 Usgi Medical, Inc. Methods and apparaus for off-axis visualization
JP4500096B2 (en) 2004-04-27 2010-07-14 オリンパス株式会社 Endoscope and endoscope system
US20050277808A1 (en) 2004-05-14 2005-12-15 Elazar Sonnenschein Methods and devices related to camera connectors
JP3967337B2 (en) 2004-05-14 2007-08-29 オリンパス株式会社 Endoscope and endoscope apparatus
WO2005112737A1 (en) 2004-05-24 2005-12-01 Olympus Corporation Light source device for endoscope
IL162390A0 (en) 2004-06-07 2005-11-20 Medigus Ltd Multipurpose endoscopy suite
US8500630B2 (en) 2004-06-30 2013-08-06 Given Imaging Ltd. In vivo device with flexible circuit board and method for assembly thereof
JP4544924B2 (en) 2004-07-12 2010-09-15 フォルテ グロウ メディカル株式会社 Endoscopy tube
US7335159B2 (en) 2004-08-26 2008-02-26 Scimed Life Systems, Inc. Endoscope having auto-insufflation and exsufflation
JP4649606B2 (en) 2004-08-31 2011-03-16 国立大学法人 名古屋工業大学 Spherical capsule type omnidirectional endoscope
WO2006025058A1 (en) 2004-09-03 2006-03-09 Stryker Gi Ltd. Optical head for endoscope
WO2006041426A2 (en) 2004-09-15 2006-04-20 Adobe Systems Incorporated Locating a feature in a digital image
US8480566B2 (en) 2004-09-24 2013-07-09 Vivid Medical, Inc. Solid state illumination for endoscopy
US8353860B2 (en) 2004-09-30 2013-01-15 Boston Scientific Scimed, Inc. Device for obstruction removal with specific tip structure
US8199187B2 (en) 2004-09-30 2012-06-12 Boston Scientific Scimed, Inc. Adapter for use with digital imaging medical device
US7241263B2 (en) 2004-09-30 2007-07-10 Scimed Life Systems, Inc. Selectively rotatable shaft coupler
US20090023998A1 (en) 2007-07-17 2009-01-22 Nitesh Ratnakar Rear view endoscope sheath
US11653816B2 (en) 2004-10-11 2023-05-23 Nitesh Ratnakar Next generation endoscope
US20080275298A1 (en) 2004-10-11 2008-11-06 Novation Science, Llc Dual View Endoscope
US7621869B2 (en) 2005-05-06 2009-11-24 Nitesh Ratnakar Next generation colonoscope
US8585584B2 (en) 2004-10-11 2013-11-19 Nitesh Ratnakar Dual view endoscope
US9131861B2 (en) 2004-11-30 2015-09-15 Academisch Medisch Centrum Pulsed lighting imaging systems and methods
US20060149127A1 (en) 2004-12-30 2006-07-06 Seddiqui Fred R Disposable multi-lumen catheter with reusable stylet
US8872906B2 (en) 2005-01-05 2014-10-28 Avantis Medical Systems, Inc. Endoscope assembly with a polarizing filter
US20070293720A1 (en) 2005-01-05 2007-12-20 Avantis Medical Systems, Inc. Endoscope assembly and method of viewing an area inside a cavity
US20060149129A1 (en) 2005-01-05 2006-07-06 Watts H D Catheter with multiple visual elements
US20080021274A1 (en) 2005-01-05 2008-01-24 Avantis Medical Systems, Inc. Endoscopic medical device with locking mechanism and method
US8797392B2 (en) 2005-01-05 2014-08-05 Avantis Medical Sytems, Inc. Endoscope assembly with a polarizing filter
US8182422B2 (en) 2005-12-13 2012-05-22 Avantis Medical Systems, Inc. Endoscope having detachable imaging device and method of using
US8289381B2 (en) 2005-01-05 2012-10-16 Avantis Medical Systems, Inc. Endoscope with an imaging catheter assembly and method of configuring an endoscope
US8029439B2 (en) 2005-01-28 2011-10-04 Stryker Corporation Disposable attachable light source unit for an endoscope
US7668450B2 (en) 2005-01-28 2010-02-23 Stryker Corporation Endoscope with integrated light source
US20060215406A1 (en) 2005-02-18 2006-09-28 William Thrailkill Medical diagnostic instrument with highly efficient, tunable light emitting diode light source
US7998064B2 (en) 2005-03-15 2011-08-16 Olympus Medical Systems Corp. Endoscope insertion portion
KR100895750B1 (en) 2005-03-18 2009-04-30 올림푸스 메디칼 시스템즈 가부시키가이샤 Endoscope, endoscope system, and switching circuit member for endoscope
US8388523B2 (en) 2005-04-01 2013-03-05 Welch Allyn, Inc. Medical diagnostic instrument having portable illuminator
US8092475B2 (en) 2005-04-15 2012-01-10 Integra Lifesciences (Ireland) Ltd. Ultrasonic horn for removal of hard tissue
JP4754871B2 (en) 2005-05-11 2011-08-24 オリンパスメディカルシステムズ株式会社 End of the endoscope
US20060293556A1 (en) 2005-05-16 2006-12-28 Garner David M Endoscope with remote control module or camera
JP4987257B2 (en) 2005-06-29 2012-07-25 オリンパス株式会社 Endoscope
US20070015989A1 (en) 2005-07-01 2007-01-18 Avantis Medical Systems, Inc. Endoscope Image Recognition System and Method
CA2620196A1 (en) 2005-08-24 2007-03-01 Traxtal Inc. System, method and devices for navigated flexible endoscopy
BRPI0615651A2 (en) 2005-09-06 2011-05-24 Stryker Gi Ltd disposable endoscope cap
WO2007034664A1 (en) 2005-09-22 2007-03-29 Olympus Corporation Endoscope insertion part
GB0519769D0 (en) 2005-09-28 2005-11-09 Imp College Innovations Ltd Imaging system
US7430682B2 (en) 2005-09-30 2008-09-30 Symbol Technologies, Inc. Processing image data from multiple sources
US7918788B2 (en) 2005-10-31 2011-04-05 Ethicon, Inc. Apparatus and method for providing flow to endoscope channels
US20070162095A1 (en) 2006-01-06 2007-07-12 Ezc Medical Llc Modular visualization stylet apparatus and methods of use
WO2007087421A2 (en) 2006-01-23 2007-08-02 Avantis Medical Systems, Inc. Endoscope
JP5435957B2 (en) 2006-01-23 2014-03-05 アヴァンティス メディカル システムズ インコーポレイテッド Endoscope
WO2007086073A2 (en) 2006-01-30 2007-08-02 Vision - Sciences Inc. Controllable endoscope
ES2614487T3 (en) 2006-01-30 2017-05-31 Covidien Lp Device for balancing the targets and applying an anti-fogging agent to medical videoscopes before medical procedures
EP1988812B1 (en) 2006-02-06 2012-11-21 Avantis Medical Systems, Inc. Endoscope with an imaging catheter assembly and method of configuring an endoscope
EP2520218A1 (en) 2006-02-09 2012-11-07 Avantis Medical Systems, Inc. Endoscope assembly with a polarizing filter
DE102007038859B4 (en) 2006-02-23 2020-03-12 Atmos Medizintechnik Gmbh & Co. Kg Method and arrangement for generating a signal according to patent DE 10 2006 008 990 B4 corresponding to the opening state of the vocal folds of the larynx
WO2007100846A2 (en) 2006-02-28 2007-09-07 Emphasys Medical, Inc. Endoscopic tool
US7834910B2 (en) 2006-03-01 2010-11-16 David M. DeLorme Method and apparatus for panoramic imaging
US9814372B2 (en) 2007-06-27 2017-11-14 Syntheon, Llc Torque-transmitting, variably-flexible, locking insertion device and method for operating the insertion device
US20070225562A1 (en) 2006-03-23 2007-09-27 Ethicon Endo-Surgery, Inc. Articulating endoscopic accessory channel
US8063933B2 (en) 2006-03-27 2011-11-22 Given Imaging Ltd. Battery contacts for an in-vivo imaging device
US7794393B2 (en) 2006-04-13 2010-09-14 Larsen Dane M Resectoscopic device and method
US20070241895A1 (en) 2006-04-13 2007-10-18 Morgan Kelvin L Noise reduction for flexible sensor material in occupant detection
US8287446B2 (en) 2006-04-18 2012-10-16 Avantis Medical Systems, Inc. Vibratory device, endoscope having such a device, method for configuring an endoscope, and method of reducing looping of an endoscope
US7955255B2 (en) 2006-04-20 2011-06-07 Boston Scientific Scimed, Inc. Imaging assembly with transparent distal cap
US20070247867A1 (en) 2006-04-21 2007-10-25 Sunoptic Technologies Llc Portable LED Light Source for an Endoscope or Boroscope
JP4928969B2 (en) 2006-04-26 2012-05-09 Hoya株式会社 Endoscope bending retention mechanism
FR2900741B1 (en) 2006-05-05 2008-07-18 Mauna Kea Technologies Soc Par MINIATURIZED OPTICAL HEAD WITH HIGH SPACE RESOLUTION AND HIGH SENSITIVITY, ESPECIALLY FOR FIBROUS CONFOCAL FLUORESCENCE IMAGING
JP2007301092A (en) 2006-05-10 2007-11-22 Pentax Corp Connector device of electronic endoscope
US8038598B2 (en) 2006-05-15 2011-10-18 Baystate Health, Inc. Balloon endoscope device
EP2023794A2 (en) 2006-05-19 2009-02-18 Avantis Medical Systems, Inc. System and method for producing and improving images
JP2007325724A (en) 2006-06-07 2007-12-20 Olympus Medical Systems Corp Management system for cleaning and disinfection of endoscope
US20080065101A1 (en) 2006-06-13 2008-03-13 Intuitive Surgical, Inc. Minimally invasive surgical apparatus with side exit instruments
US20080025413A1 (en) 2006-07-28 2008-01-31 John Apostolopoulos Selecting bit rates for encoding multiple data streams
EP1884805A1 (en) 2006-08-01 2008-02-06 Varioptic Liquid lens with four liquids
US7927272B2 (en) 2006-08-04 2011-04-19 Avantis Medical Systems, Inc. Surgical port with embedded imaging device
US20080036864A1 (en) 2006-08-09 2008-02-14 Mccubbrey David System and method for capturing and transmitting image data streams
US8248413B2 (en) * 2006-09-18 2012-08-21 Stryker Corporation Visual navigation system for endoscopic surgery
US7967745B2 (en) 2006-09-28 2011-06-28 Given Imaging, Ltd. In vivo imaging device and method of manufacture thereof
US20080091065A1 (en) 2006-10-04 2008-04-17 Olympus Medical Systems Corporation Medical image processing apparatus, endoscope system and medical image processing system
JP5203962B2 (en) 2006-11-24 2013-06-05 オリンパスメディカルシステムズ株式会社 Capsule endoscope
JP5226533B2 (en) 2006-11-28 2013-07-03 オリンパス株式会社 Endoscope device
US7813047B2 (en) 2006-12-15 2010-10-12 Hand Held Products, Inc. Apparatus and method comprising deformable lens element
JP4912141B2 (en) 2006-12-26 2012-04-11 キヤノン株式会社 Imaging display device, imaging display control method and system
US20130317295A1 (en) 2006-12-29 2013-11-28 GE Inspection Technologies Light assembly for remote visual inspection apparatus
US7870765B2 (en) 2007-01-04 2011-01-18 Scot Incorporated Safing lock mechanism
EP3120752A1 (en) 2007-01-19 2017-01-25 Sunnybrook Health Sciences Centre Scanning mechanisms for imaging probe
JP5086648B2 (en) 2007-01-19 2012-11-28 オリンパスメディカルシステムズ株式会社 Treatment tool
DE102007015492B4 (en) 2007-01-30 2011-03-24 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Illumination device for an image capture device at the distal end of an endoscope
US8672836B2 (en) 2007-01-31 2014-03-18 The Penn State Research Foundation Method and apparatus for continuous guidance of endoscopy
WO2008093322A2 (en) 2007-01-31 2008-08-07 Stryker Gi Ltd. Endoscope stand
US20090231419A1 (en) 2007-02-06 2009-09-17 Avantis Medical Systems, Inc. Endoscope Assembly and Method of Performing a Medical Procedure
US20080221388A1 (en) 2007-03-09 2008-09-11 University Of Washington Side viewing optical fiber endoscope
US20080246771A1 (en) 2007-04-03 2008-10-09 Dell Products L.P. Graphics processing system and method
US8064666B2 (en) 2007-04-10 2011-11-22 Avantis Medical Systems, Inc. Method and device for examining or imaging an interior surface of a cavity
US7813538B2 (en) 2007-04-17 2010-10-12 University Of Washington Shadowing pipe mosaicing algorithms with application to esophageal endoscopy
JP2008301968A (en) 2007-06-06 2008-12-18 Olympus Medical Systems Corp Endoscopic image processing apparatus
WO2008153841A2 (en) 2007-06-08 2008-12-18 Thomas Hsu Devices and methods for removal of debris from the objective lens of an endoscope
EP2172146B1 (en) 2007-06-22 2015-03-04 Olympus Medical Systems Corp. Capsule-type medical device
EP2190341A2 (en) 2007-07-26 2010-06-02 Avantis Medical Systems, Inc. Endoscope system
JP2009047947A (en) 2007-08-21 2009-03-05 Fujinon Corp Imaging lens and imaging apparatus
CN100582855C (en) 2007-08-23 2010-01-20 鸿富锦精密工业(深圳)有限公司 Endoscopy lens and endoscopy device
JP2009056105A (en) 2007-08-31 2009-03-19 Olympus Medical Systems Corp Focal distance variable endoscope
JP4904470B2 (en) 2007-09-19 2012-03-28 富士フイルム株式会社 Observation image forming apparatus
JP2009080413A (en) 2007-09-27 2009-04-16 Fujinon Corp Imaging optical system, image pickup apparatus for endoscope
JP2009082504A (en) 2007-09-28 2009-04-23 Fujifilm Corp Image pickup device and endoscope provided with image pickup device
EP2211683A2 (en) 2007-10-11 2010-08-04 Avantis Medical Systems, Inc. Endoscope assembly comprising retrograde viewing imaging device and instrument channel
WO2009049324A1 (en) 2007-10-11 2009-04-16 Avantis Medical Systems, Inc. Method and device for reducing the fixed pattern noise of a digital image
US9118850B2 (en) 2007-11-27 2015-08-25 Capso Vision, Inc. Camera system with multiple pixel arrays on a chip
US20090143647A1 (en) 2007-12-03 2009-06-04 Olympus Medical Systems Corp. Medical appliance, endoscope overtube, and endoscope apparatus
US8360964B2 (en) 2007-12-10 2013-01-29 Stryker Corporation Wide angle HDTV endoscope
US7885010B1 (en) 2008-01-14 2011-02-08 Integrated Medical Systems International Endoscope objective lens and method of assembly
US8723756B2 (en) 2008-01-15 2014-05-13 Synaptics Incorporated System having capability for daisy-chained serial distribution of video display data
US8529441B2 (en) 2008-02-12 2013-09-10 Innurvation, Inc. Ingestible endoscopic optical scanning device
JP2009189528A (en) 2008-02-14 2009-08-27 Fujinon Corp Processor unit for endoscope and endoscopic system
US8414478B2 (en) 2008-02-26 2013-04-09 Fujifilm Corporation Endoscopic aspiration device
GB2493606B (en) 2008-03-07 2013-03-27 Milwaukee Electric Tool Corp Visual inspection device
JP2009213673A (en) 2008-03-11 2009-09-24 Fujinon Corp Endoscope system and endoscope inspecting method
BRPI0906187A2 (en) 2008-03-18 2020-07-14 Novadaq Technologies Inc. image representation method and system for the acquisition of nir images and full color images
ES2580177T3 (en) 2008-05-13 2016-08-19 Boston Scientific Scimed, Inc. Steering system with locking mechanism
DE602009001103D1 (en) 2008-06-04 2011-06-01 Fujifilm Corp Lighting device for use in endoscopes
US7630148B1 (en) 2008-06-11 2009-12-08 Ge Inspection Technologies, Lp System for providing zoom, focus and aperture control in a video inspection device
US8617058B2 (en) 2008-07-09 2013-12-31 Innurvation, Inc. Displaying image data from a scanner capsule
JP2010017483A (en) 2008-07-14 2010-01-28 Olympus Corp Endoscope bending tube and endoscope with bending tube
JP5435916B2 (en) 2008-09-18 2014-03-05 富士フイルム株式会社 Electronic endoscope system
JP5558058B2 (en) 2008-09-19 2014-07-23 オリンパスメディカルシステムズ株式会社 Endoscopic endoscope
JP2010069231A (en) 2008-09-22 2010-04-02 Fujifilm Corp Imaging apparatus and endoscope
CA2732261A1 (en) 2008-09-24 2010-04-01 Code 3, Inc. Light bar
WO2010034107A1 (en) 2008-09-24 2010-04-01 Dentsply International Inc. Imaging device for dental instruments and methods for intra-oral viewing
WO2010057082A2 (en) 2008-11-17 2010-05-20 Mayo Foundation For Medical Education And Research Diagnostic capsules, delivery/retrieval systems, kits and methods
US8107170B2 (en) 2008-11-19 2012-01-31 Olympus Corporation Objective optical system
JP2010136032A (en) 2008-12-04 2010-06-17 Hitachi Ltd Video monitoring system
US8425406B2 (en) 2008-12-19 2013-04-23 Boston Scientific Scimed, Inc. Systems and methods for directing instruments to varying positions at the distal end of a guide tube
JP5235706B2 (en) 2009-02-03 2013-07-10 Hoya株式会社 Treatment endoscope
US8939894B2 (en) 2009-03-31 2015-01-27 Intuitive Surgical Operations, Inc. Three-dimensional target devices, assemblies and methods for calibrating an endoscopic camera
PL2414015T4 (en) 2009-03-31 2022-06-13 Dilon Technologies, Inc. Laryngoscope and system
CN102460266B (en) 2009-04-16 2014-10-15 奥林巴斯医疗株式会社 objective optical system
US20120281536A1 (en) 2009-06-12 2012-11-08 Cygnus Broadband, Inc. Systems and methods for detection for prioritizing and scheduling packets in a communication network
US9474440B2 (en) 2009-06-18 2016-10-25 Endochoice, Inc. Endoscope tip position visual indicator and heat management system
US9713417B2 (en) 2009-06-18 2017-07-25 Endochoice, Inc. Image capture assembly for use in a multi-viewing elements endoscope
US8926502B2 (en) 2011-03-07 2015-01-06 Endochoice, Inc. Multi camera endoscope having a side service channel
US9101268B2 (en) 2009-06-18 2015-08-11 Endochoice Innovation Center Ltd. Multi-camera endoscope
US9101287B2 (en) 2011-03-07 2015-08-11 Endochoice Innovation Center Ltd. Multi camera endoscope assembly having multiple working channels
US10524645B2 (en) 2009-06-18 2020-01-07 Endochoice, Inc. Method and system for eliminating image motion blur in a multiple viewing elements endoscope
US9901244B2 (en) 2009-06-18 2018-02-27 Endochoice, Inc. Circuit board assembly of a multiple viewing elements endoscope
US9642513B2 (en) 2009-06-18 2017-05-09 Endochoice Inc. Compact multi-viewing element endoscope system
US9402533B2 (en) 2011-03-07 2016-08-02 Endochoice Innovation Center Ltd. Endoscope circuit board assembly
US9492063B2 (en) 2009-06-18 2016-11-15 Endochoice Innovation Center Ltd. Multi-viewing element endoscope
US9706903B2 (en) 2009-06-18 2017-07-18 Endochoice, Inc. Multiple viewing elements endoscope system with modular imaging units
US20120232345A1 (en) 2011-03-07 2012-09-13 Peer Medical Ltd. Camera assembly for medical probes
US8516691B2 (en) 2009-06-24 2013-08-27 Given Imaging Ltd. Method of assembly of an in vivo imaging device with a flexible circuit board
WO2011041720A2 (en) 2009-10-01 2011-04-07 Jacobsen Stephen C Method and apparatus for manipulating movement of a micro-catheter
US8816856B2 (en) 2009-10-13 2014-08-26 Augusta E.N.T., P.C. Medical instrument cleaning system and method
US8447132B1 (en) 2009-12-09 2013-05-21 CSR Technology, Inc. Dynamic range correction based on image content
EP2477053B1 (en) 2009-12-11 2019-06-19 Olympus Corporation Endoscope objective optical system
DE102009058660A1 (en) 2009-12-16 2011-06-22 Karl Storz GmbH & Co. KG, 78532 Test device for an optical examination system
US20110184243A1 (en) 2009-12-22 2011-07-28 Integrated Endoscopy, Inc. Endoscope with different color light sources
MX2012008056A (en) 2010-01-11 2012-11-23 Motus Gi Medical Technologies Ltd Systems and methods for cleaning body cavities.
US20110169931A1 (en) 2010-01-12 2011-07-14 Amit Pascal In-vivo imaging device with double field of view and method for use
JP2011143029A (en) 2010-01-13 2011-07-28 Olympus Corp Endoscope-bending operation apparatus
JP4902033B1 (en) 2010-04-07 2012-03-21 オリンパスメディカルシステムズ株式会社 Objective lens and endoscope using the same
JP5597021B2 (en) 2010-04-15 2014-10-01 オリンパス株式会社 Image processing apparatus and program
US20110292258A1 (en) 2010-05-28 2011-12-01 C2Cure, Inc. Two sensor imaging systems
CN102612337B (en) 2010-07-08 2014-12-10 奥林巴斯医疗株式会社 Endoscope system and endoscope actuator control method
WO2012005108A1 (en) 2010-07-09 2012-01-12 オリンパスメディカルシステムズ株式会社 Image recording/regenerating system
EP3718466B1 (en) 2010-09-20 2023-06-07 EndoChoice, Inc. Endoscope distal section comprising a unitary fluid channeling component
US9560953B2 (en) 2010-09-20 2017-02-07 Endochoice, Inc. Operational interface in a multi-viewing element endoscope
US20150208900A1 (en) 2010-09-20 2015-07-30 Endochoice, Inc. Interface Unit In A Multiple Viewing Elements Endoscope System
JP5259882B2 (en) 2010-09-30 2013-08-07 オリンパスメディカルシステムズ株式会社 Imaging device
US20140276207A1 (en) 2010-10-25 2014-09-18 Endosee Corporation Method and appartus for hysteroscopy and endometrial biopsy
US20150320300A1 (en) 2010-10-28 2015-11-12 Endochoice, Inc. Systems and Methods of Distributing Illumination for Multiple Viewing Element and Multiple Illuminator Endoscopes
US9706908B2 (en) 2010-10-28 2017-07-18 Endochoice, Inc. Image capture and video processing systems and methods for multiple viewing element endoscopes
EP3540495A1 (en) 2010-10-28 2019-09-18 EndoChoice Innovation Center Ltd. Optical systems for multi-sensor endoscopes
EP2606811B1 (en) 2010-11-25 2014-07-02 Olympus Medical Systems Corp. Bending operation device for endoscope and the endoscope
US20130303979A1 (en) 2010-11-26 2013-11-14 Universitat Bern Verwaltungsdirektion Suction and irrigation device
EP2646173A2 (en) 2010-12-03 2013-10-09 Research Triangle Institute Ultrasound device, and associated cable assembly
EP3420886B8 (en) 2010-12-09 2020-07-15 EndoChoice, Inc. Flexible electronic circuit board multi-camera endoscope
JP6054874B2 (en) 2010-12-09 2016-12-27 エンドチョイス イノベーション センター リミテッド Flexible electronic circuit board for multi-camera endoscope
DE202010016900U1 (en) 2010-12-22 2011-05-19 Peer Medical Ltd. Multi-camera endoscope
JP2012135432A (en) 2010-12-27 2012-07-19 Fujifilm Corp Endoscope
WO2012096102A1 (en) 2011-01-14 2012-07-19 コニカミノルタオプト株式会社 Probe, optical measurement system, provision method, and packaged article for diagnosis purposes
CN103491854B (en) 2011-02-07 2016-08-24 恩多卓斯创新中心有限公司 Multicomponent cover for many cameras endoscope
KR101964642B1 (en) 2011-02-15 2019-04-02 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Seals and sealing methods for a surgical instrument having an articulated end effector actuated by a drive shaft
US10499804B2 (en) 2011-02-24 2019-12-10 DePuy Synthes Products, Inc. Imaging sensor providing improved visualization for surgical scopes
JP5185477B2 (en) 2011-03-29 2013-04-17 オリンパスメディカルシステムズ株式会社 Endoscopy adapter, endoscope processor, and endoscope system
JP5318142B2 (en) 2011-03-31 2013-10-16 富士フイルム株式会社 Electronic endoscope
CN103209634B (en) 2011-04-01 2016-02-03 奥林巴斯株式会社 Receiving system and capsule-type endoscope system
JP5954661B2 (en) 2011-08-26 2016-07-20 パナソニックIpマネジメント株式会社 Imaging device and imaging apparatus
JP5303015B2 (en) 2011-08-29 2013-10-02 富士フイルム株式会社 Endoscopic diagnosis device
TW201315361A (en) 2011-09-22 2013-04-01 Altek Corp Electronic device and image sensor heat dissipation structure
US8890945B2 (en) 2011-11-14 2014-11-18 Omnivision Technologies, Inc. Shared terminal of an image sensor system for transferring image data and control signals
US9332193B2 (en) 2011-11-14 2016-05-03 Omnivision Technologies, Inc. Synchronization of image acquisition in multiple image sensors with a synchronization clock signal
EP3517018A1 (en) 2011-11-21 2019-07-31 Boston Scientific Scimed, Inc. Endoscopic system for optimized visualization
KR20130059150A (en) 2011-11-28 2013-06-05 삼성전자주식회사 Objective lens for endoscopic device, actuator for focusing and endoscopic system
JP5863428B2 (en) 2011-12-05 2016-02-16 Hoya株式会社 Electronic endoscope scope, white balance adjustment method, electronic endoscope system, white balance adjustment jig
EP3659491A1 (en) 2011-12-13 2020-06-03 EndoChoice Innovation Center Ltd. Removable tip endoscope
EP2604172B1 (en) 2011-12-13 2015-08-12 EndoChoice Innovation Center Ltd. Rotatable connector for an endoscope
JP5355824B1 (en) 2011-12-16 2013-11-27 オリンパスメディカルシステムズ株式会社 Resin tube and endoscope
US20130197309A1 (en) 2012-01-31 2013-08-01 Olympus Medical Systems Corp. Endoscope
KR101905648B1 (en) 2012-02-27 2018-10-11 삼성전자 주식회사 Apparatus and method for shooting a moving picture of camera device
DE102012205598A1 (en) 2012-04-04 2013-10-10 Henke-Sass, Wolf Gmbh Protective sleeve for an endoscope tube having endoscope
US9289110B2 (en) 2012-04-05 2016-03-22 Stryker Corporation Control for surgical fluid management pump system
US8702647B2 (en) 2012-04-19 2014-04-22 Medtronic Ablation Frontiers Llc Catheter deflection anchor
US9560954B2 (en) 2012-07-24 2017-02-07 Endochoice, Inc. Connector for use with endoscope
WO2014061023A1 (en) 2012-10-18 2014-04-24 Endochoice Innovation Center Ltd. Multi-camera endoscope
JP5846385B2 (en) 2012-11-07 2016-01-20 国立大学法人東京工業大学 Endoscope operation system
WO2014084135A1 (en) 2012-11-27 2014-06-05 オリンパスメディカルシステムズ株式会社 Endoscope device
US9841280B2 (en) 2012-12-31 2017-12-12 Karl Storz Imaging, Inc. Modular medical imaging system
US9986899B2 (en) 2013-03-28 2018-06-05 Endochoice, Inc. Manifold for a multiple viewing elements endoscope
US20140364691A1 (en) 2013-03-28 2014-12-11 Endochoice, Inc. Circuit Board Assembly of A Multiple Viewing Elements Endoscope
US20140316204A1 (en) 2013-03-28 2014-10-23 Endochoice Inc. Removably Attachable Endoscopic Water Bottle Assembly and A Flushing Adaptor Thereof
US9993142B2 (en) 2013-03-28 2018-06-12 Endochoice, Inc. Fluid distribution device for a multiple viewing elements endoscope
WO2014160983A2 (en) 2013-03-28 2014-10-02 Endochoice, Inc. Compact multi-viewing element endoscope system
US9636003B2 (en) 2013-06-28 2017-05-02 Endochoice, Inc. Multi-jet distributor for an endoscope
EP2991537B1 (en) 2013-04-29 2019-12-18 EndoChoice, Inc. Video processing in a compact multi-viewing element endoscope system
CN105338875B (en) 2013-05-06 2018-11-23 恩多巧爱思股份有限公司 For observing the image collection assembly in element endoscope more
CN105358043B (en) 2013-05-07 2018-12-21 恩多巧爱思股份有限公司 The white balance shell being used together with more observation element endoscopes
WO2014183012A1 (en) 2013-05-09 2014-11-13 Endochoice, Inc. Operational interface in a multi-viewing elements endoscope
CN105358042B (en) 2013-05-13 2018-05-29 恩多巧爱思股份有限公司 Endoscope tip position visual detector and thermal management
US20140343361A1 (en) 2013-05-16 2014-11-20 Endochoice, Inc. Tip Protector for a Multi-Viewing Elements Endoscope
JP6438460B2 (en) 2013-05-17 2018-12-12 アヴァンティス メディカル システムズ, インコーポレイテッド Secondary imaging endoscope device
US20150313450A1 (en) 2013-05-17 2015-11-05 Endochoice, Inc. Stopper for the Cables of a Bending Section of An Endoscope
WO2014186525A1 (en) 2013-05-17 2014-11-20 Endochoice, Inc. Interface unit in a multiple viewing elements endoscope system
CN109549613A (en) 2013-05-17 2019-04-02 恩多巧爱思股份有限公司 There are two more observation element endoscopes of preceding service channel for tool
US9949623B2 (en) 2013-05-17 2018-04-24 Endochoice, Inc. Endoscope control unit with braking system
WO2014186519A2 (en) 2013-05-17 2014-11-20 Endochoice, Inc. Endoscope control unit with braking system
EP3013209B1 (en) 2013-06-28 2020-07-22 EndoChoice, Inc. Multiple viewing elements endoscope system with modular imaging units
CN105636499A (en) 2013-07-01 2016-06-01 恩多巧爱思股份有限公司 Circuit board assembly of a multiple viewing elements endoscope
US10064541B2 (en) 2013-08-12 2018-09-04 Endochoice, Inc. Endoscope connector cover detection and warning system
US20150057500A1 (en) 2013-08-26 2015-02-26 Endochoice Inc. System for Connecting and Disconnecting A Main Connector and A Main Control Unit of An Endoscope
WO2015047631A1 (en) 2013-09-24 2015-04-02 Endochoice, Inc. Circuit board assembly of a multiple viewing elements endoscope
DE102013110860A1 (en) 2013-10-01 2015-04-02 Endochoice Gmbh Endoscope with a supply cable attached to the endoscope
US9943218B2 (en) 2013-10-01 2018-04-17 Endochoice, Inc. Endoscope having a supply cable attached thereto
US20150099925A1 (en) 2013-10-03 2015-04-09 Endochoice, Inc. Endoscope with Integrated Sensors
EP3583884B1 (en) 2013-12-02 2022-05-04 EndoChoice, Inc. Fluid distribution device for a viewing endoscope
US9968242B2 (en) 2013-12-18 2018-05-15 Endochoice, Inc. Suction control unit for an endoscope having two working channels
WO2015112747A2 (en) 2014-01-22 2015-07-30 Endochoice, Inc. Image capture and video processing systems and methods for multiple viewing element endoscopes
EP3096675B1 (en) 2014-01-25 2020-09-02 EndoChoice, Inc. System for eliminating image motion blur in a multiple viewing elements endoscope
WO2015134060A1 (en) 2014-03-04 2015-09-11 Endochoice, Inc. Manifold for a multiple viewing elements endoscope
CN106456267B (en) * 2014-03-28 2020-04-03 直观外科手术操作公司 Quantitative three-dimensional visualization of an instrument in a field of view
EP3136943A4 (en) 2014-05-01 2017-12-27 EndoChoice, Inc. System and method of scanning a body cavity using a multiple viewing elements endoscope
WO2015175246A1 (en) 2014-05-02 2015-11-19 Endochoice, Inc. Stopper for the cables of a bending section of an endoscope
US11234581B2 (en) 2014-05-02 2022-02-01 Endochoice, Inc. Elevator for directing medical tool
WO2015171732A1 (en) 2014-05-07 2015-11-12 Endochoice, Inc. Endoscope illumination with multiple viewing elements and illuminators
US20150374206A1 (en) 2014-06-26 2015-12-31 Endochoice, Inc. Methods and Systems for Managing Information Generated From and Transmitted To An Endoscopic System
EP3171752B1 (en) 2014-07-21 2019-12-11 EndoChoice, Inc. Multi-focal, multi-camera endoscope systems
US10542877B2 (en) 2014-08-29 2020-01-28 Endochoice, Inc. Systems and methods for varying stiffness of an endoscopic insertion tube

Also Published As

Publication number Publication date
US20170258295A1 (en) 2017-09-14
WO2017160792A1 (en) 2017-09-21
US10292570B2 (en) 2019-05-21

Similar Documents

Publication Publication Date Title
US11529197B2 (en) Device and method for tracking the position of an endoscope within a patient's body
KR102558061B1 (en) A robotic system for navigating the intraluminal tissue network that compensates for physiological noise
US10786319B2 (en) System, control unit and method for control of a surgical robot
US10856770B2 (en) Method and system for providing visual guidance to an operator for steering a tip of an endoscopic device towards one or more landmarks in a patient
US20150313445A1 (en) System and Method of Scanning a Body Cavity Using a Multiple Viewing Elements Endoscope
US20100249506A1 (en) Method and system for assisting an operator in endoscopic navigation
US20070015967A1 (en) Autosteering vision endoscope
JP6749020B2 (en) Endoscope navigation device
JP2016511049A (en) Re-identifying anatomical locations using dual data synchronization
US20190231167A1 (en) System and method for guiding and tracking a region of interest using an endoscope
WO2019107226A1 (en) Endoscopic apparatus
US20190069955A1 (en) Control unit, system and method for controlling hybrid robot having rigid proximal portion and flexible distal portion
US11871904B2 (en) Steerable endoscope system with augmented view
US20220047339A1 (en) Endoluminal robotic (elr) systems and methods
US20160310043A1 (en) Endoscopic Polyp Measurement Tool and Method for Using the Same
US20220218180A1 (en) Endoscope insertion control device, endoscope insertion control method, and non-transitory recording medium in which endoscope insertion control program is recorded
WO2017002417A1 (en) Ultrasonic observation apparatus, ultrasonic observation apparatus operation method, and ultrasonic observation apparatus operation program
US20220202273A1 (en) Intraluminal navigation using virtual satellite targets
KR20200132174A (en) AR colonoscopy system and method for monitoring by using the same
US20240032772A1 (en) Medical device tracking systems and methods of using the same
WO2023154931A1 (en) Robotic catheter system and method of replaying targeting trajectory
JP2019097665A (en) Endoscope apparatus

Legal Events

Date Code Title Description
STPP Information on status: patent application and granting procedure in general

Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION

STPP Information on status: patent application and granting procedure in general

Free format text: NON FINAL ACTION MAILED

STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION