WO2012035492A1 - Robotic control of an endoscope from blood vessel tree images - Google Patents


Info

Publication number
WO2012035492A1
WO2012035492A1 PCT/IB2011/053998
Authority
WO
WIPO (PCT)
Prior art keywords
blood vessel
vessel tree
operative
intra
robot
Prior art date
Application number
PCT/IB2011/053998
Other languages
French (fr)
Inventor
Aleksandra Popovic
Original Assignee
Koninklijke Philips Electronics N.V.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Koninklijke Philips Electronics N.V. filed Critical Koninklijke Philips Electronics N.V.
Priority to JP2013528806A priority Critical patent/JP5955847B2/en
Priority to RU2013116901/14A priority patent/RU2594813C2/en
Priority to US13/822,001 priority patent/US9615886B2/en
Priority to BR112013005879A priority patent/BR112013005879A2/en
Priority to EP11764337.9A priority patent/EP2615993B1/en
Priority to CN201180044480.6A priority patent/CN103108602B/en
Publication of WO2012035492A1 publication Critical patent/WO2012035492A1/en
Priority to US15/483,615 priority patent/US10182704B2/en

Links

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147 Holding or positioning arrangements
    • A61B1/00149 Holding or positioning arrangements using articulated arms
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00004 Operational features of endoscopes characterised by electronic signal processing
    • A61B1/00006 Operational features of endoscopes characterised by electronic signal processing of control signals
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00002 Operational features of endoscopes
    • A61B1/00043 Operational features of endoscopes provided with output arrangements
    • A61B1/00045 Display arrangement
    • A61B1/0005 Display arrangement combining images e.g. side-by-side, superimposed or tiled
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/04 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances
    • A61B1/05 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor combined with photographic or television appliances characterised by the image sensor, e.g. camera, being in the distal end portion
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/313 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes
    • A61B1/3137 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor for introducing through surgical openings, e.g. laparoscopes for examination of the interior of blood vessels
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 Computer-aided planning, simulation or modelling of surgical operations
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/30 Determination of transform parameters for the alignment of images, i.e. image registration
    • G06T7/33 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods
    • G06T7/344 Determination of transform parameters for the alignment of images, i.e. image registration using feature-based methods involving models
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/70 Determining position or orientation of objects or cameras
    • G06T7/73 Determining position or orientation of objects or cameras using feature-based methods
    • G06T7/75 Determining position or orientation of objects or cameras using feature-based methods involving models
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147 Holding or positioning arrangements
    • A61B1/0016 Holding or positioning arrangements using motor drive units
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B17/00 Surgical instruments, devices or methods, e.g. tourniquets
    • A61B17/00234 Surgical instruments, devices or methods, e.g. tourniquets for minimally invasive surgery
    • A61B2017/00238 Type of minimally invasive operation
    • A61B2017/00243 Type of minimally invasive operation cardiac
    • A61B2017/00247 Making holes in the wall of the heart, e.g. laser Myocardial revascularization
    • A61B2017/00252 Making holes in the wall of the heart, e.g. laser Myocardial revascularization for by-pass connections, i.e. connections from heart chamber to blood vessel or from blood vessel to blood vessel
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 Surgical robots
    • A61B2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 Image-producing devices, e.g. surgical cameras
    • A61B2090/3614 Image-producing devices, e.g. surgical cameras using optical fibre
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 Correlation of different images or relation of image positions in respect to the body
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10068 Endoscopic image
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/10 Image acquisition modality
    • G06T2207/10072 Tomographic images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20072 Graph-based image processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30101 Blood vessel; Artery; Vein; Vascular

Definitions

  • the present invention generally relates to robotic control of an endoscope during a minimally invasive surgical procedure (e.g., a minimally invasive coronary bypass grafting surgery).
  • the present invention specifically relates to a matching of a graphical representation of a pre-operative three-dimensional ("3D") blood vessel tree image to a graphical representation of an intra-operative endoscopic blood vessel tree image as a basis for robotic guiding of an endoscope.
  • Coronary artery bypass grafting ("CABG") is a surgical procedure for the revascularization of obstructed coronary arteries.
  • In minimally invasive CABG, the aforementioned problem of conventional CABG is amplified because the surgeon cannot palpate the heart surface. Additionally, the length of the surgical instruments used in minimally invasive CABG prevents any tactile feedback from the proximal end of the tool.
  • One known technique for addressing the problems with conventional CABG is to register an intra-operative site with a pre-operative 3D coronary artery tree.
  • an optically tracked pointer is used to digitize the position of the arteries in an open-heart setting, and the position data is registered to the pre-operative tree using an Iterative Closest Point ("ICP") algorithm known in the art.
  • this technique, as with any related approach that matches digitized arteries to pre-operative data, is impractical for minimally invasive CABG because of the spatial constraints imposed by the small port access. Also, this technique requires most of the arteries to be either visible or palpable by the surgeon, which is impossible in minimally invasive CABG.
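The ICP registration named above can be sketched in a few lines. The following is a generic, minimal rigid-ICP illustration in Python/NumPy (brute-force nearest neighbours plus a Kabsch alignment step) with synthetic point sets; it is not the patent's implementation.

```python
import numpy as np

def icp(source, target, iterations=20):
    """Minimal rigid ICP: estimate R, t aligning 3D point set `source` to `target`."""
    src = source.copy()
    R_total, t_total = np.eye(3), np.zeros(3)
    for _ in range(iterations):
        # Brute-force nearest-neighbour correspondences.
        d = np.linalg.norm(src[:, None, :] - target[None, :, :], axis=2)
        matched = target[d.argmin(axis=1)]
        # Best rigid transform for these correspondences (Kabsch via SVD).
        mu_s, mu_t = src.mean(axis=0), matched.mean(axis=0)
        H = (src - mu_s).T @ (matched - mu_t)
        U, _, Vt = np.linalg.svd(H)
        R = Vt.T @ U.T
        if np.linalg.det(R) < 0:   # guard against reflections
            Vt[-1] *= -1
            R = Vt.T @ U.T
        t = mu_t - R @ mu_s
        src = src @ R.T + t
        R_total, t_total = R @ R_total, R @ t_total + t
    return R_total, t_total

# Synthetic example: a pre-operative tree and a slightly rotated/translated copy
# standing in for digitized artery positions.
rng = np.random.default_rng(0)
pre_op = rng.random((30, 3))
angle = 0.05
R_true = np.array([[np.cos(angle), -np.sin(angle), 0.0],
                   [np.sin(angle),  np.cos(angle), 0.0],
                   [0.0, 0.0, 1.0]])
digitized = pre_op @ R_true.T + np.array([0.02, -0.01, 0.01])
R_est, t_est = icp(pre_op, digitized)
print(np.linalg.norm(pre_op @ R_est.T + t_est - digitized) < 1e-3)
```

Because correspondences are re-estimated each iteration, the method needs a dense sampling of the arteries to avoid locking onto a wrong local minimum, which is consistent with the limitation stated above.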
  • One known technique for addressing the problems with minimally invasive CABG is to implement a registration method in which the heart surface is reconstructed using an optically tracked endoscope and matched to pre-operative computed tomography ("CT") data of the same surface.
  • this technique may fail if the endoscope view used to derive the surface is too small.
  • furthermore, the algorithm of this technique more often than not converges to a suboptimal local maximum.
  • Another known technique for addressing the problems with minimally invasive CABG is to label a coronary tree extracted from a new patient using a database of previously labeled cases and graph based matching.
  • this technique works only if a complete tree is available, and its goal is to label the tree rather than to match the geometry.
  • a further problem of minimally invasive CABG is the orientation and guidance of the endoscope once global positioning with respect to the pre-operative 3D images is reached.
  • the goal of registration is to facilitate localization of the anastomosis site and the stenosis.
  • In a standard setup, the endoscope is held by an assistant while the surgeon holds two instruments. The surgeon issues commands to the assistant, and the assistant moves the endoscope accordingly.
  • This kind of setup hinders the hand-eye coordination of the surgeon, because the assistant must intuitively translate the surgeon's commands, typically issued in the surgeon's frame of reference, into the assistant's frame of reference and the endoscope's frame of reference.
  • This plurality of coordinate systems may cause various handling errors, prolong the surgery or cause misidentification of the coronary arteries.
  • a surgical endoscope assistant designed to allow a surgeon to directly control an endoscope via a sensed movement of the surgeon's head may solve some of these problems by removing the assistant from the control loop, but the problem of transformation between the surgeon's frame of reference and the endoscope's frame of reference remains.
  • the present invention provides methods for matching graphical representations of a blood vessel tree (e.g., a furcation of arteries, capillaries or veins) as shown in a pre-operative three-dimensional ("3D") image (e.g., a CT image, a cone beam CT image, a 3D X-ray image or an MRI image) and in an intra-operative endoscopic image, overlaying the blood vessel tree from the pre-operative 3D image onto the intra-operative endoscopic image, and using the overlay to guide a robot holding an endoscope toward a location as defined in the pre-operative 3D image.
  • One form of the present invention is a robotic guiding system employing a robot unit and a control unit.
  • the robot unit includes an endoscope for generating an intra-operative endoscopic image of a blood vessel tree within an anatomical region, and a robot for moving the endoscope within the anatomical region.
  • the control unit includes an endoscope controller for generating an endoscopic path within the anatomical region, wherein the endoscopic path is derived from a matching of a graphical representation of the intra-operative endoscopic image of the blood vessel tree to a graphical representation of a pre-operative three-dimensional image of the blood vessel tree.
  • the control unit further includes a robot controller for commanding the robot to move the endoscope within the anatomical region in accordance with the endoscopic path.
  • a second form of the present invention is a robot guiding method involving a generation of an intra-operative endoscopic image of a blood vessel tree within an anatomical region and a generation of an endoscopic path within the anatomical region, wherein the endoscopic path is derived from a matching of a graphical representation of the intra- operative endoscopic image of the blood vessel tree to a graphical representation of a preoperative three-dimensional image of the blood vessel tree.
  • the robot guiding method further involves a commanding of a robot to move an endoscope within the anatomical region in accordance with the endoscopic path.
  • pre-operative as used herein is broadly defined to describe any activity executed before, during or after an endoscopic imaging of an anatomical region for purposes of acquiring a three-dimensional image of the anatomical region.
  • intraoperative as used herein is broadly defined to describe any activity executed by the robot unit and the control unit during an endoscopic imaging of the anatomical region.
  • Examples of an endoscopic imaging of an anatomical region include, but are not limited to, a CABG, a bronchoscopy, a colonoscopy, a laparoscopy, and a brain endoscopy.
  • FIG. 1 illustrates an exemplary embodiment of a robotic guiding system in accordance with the present invention.
  • FIG. 2 illustrates a flowchart representative of an exemplary embodiment of a robotic guidance method in accordance with the present invention.
  • FIG. 3 illustrates an exemplary surgical implementation of the flowchart shown in FIG. 2.
  • FIG. 4 illustrates a flowchart representative of an exemplary embodiment of a graph matching method in accordance with the present invention.
  • FIGS. 5 and 6 illustrate an exemplary ordering of main graphs of a blood vessel tree in accordance with the present invention.
  • FIG. 7 illustrates an exemplary overlay of a geometrical representation on an endoscopic image in accordance with the present invention.
  • FIG. 8 illustrates exemplary robot paths within the overlay shown in FIG. 7 in accordance with the present invention.
  • a robotic guiding system employs a robot unit 10 and a control unit 20 for any endoscopic procedure involving an endoscopic imaging of a blood vessel tree having one or more furcations (i.e., branches).
  • endoscopic procedures include, but are not limited to, minimally invasive cardiac surgery (e.g., coronary artery bypass grafting or mitral valve replacement).
  • Robot unit 10 includes a robot 11, an endoscope 12 rigidly attached to robot 11 and a video capture device 13 attached to the endoscope 12.
  • Robot 11 is broadly defined herein as any robotic device structurally configured with motorized control of one or more joints for maneuvering an end-effector as desired for the particular endoscopic procedure.
  • robot 11 may have four (4) degrees-of-freedom, such as, for example, a serial robot having joints serially connected with rigid segments, a parallel robot having joints and rigid segments mounted in parallel order (e.g., a Stewart platform known in the art) or any hybrid combination of serial and parallel kinematics.
  • Endoscope 12 is broadly defined herein as any device structurally configured with ability to image from inside a body.
  • Examples of endoscope 12 for purposes of the present invention include, but are not limited to, any type of scope, flexible or rigid (e.g., endoscope, arthroscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, gastroscope, hysteroscope, laparoscope, laryngoscope, neuroscope, otoscope, push enteroscope, rhino laryngoscope, sigmoidoscope, sinuscope, thorascope, etc.) and any device similar to a scope that is equipped with an image system (e.g., a nested cannula with imaging).
  • the imaging is local, and surface images may be obtained optically with fiber optics, lenses, and miniaturized (e.g., CCD-based) imaging systems.
  • endoscope 12 is mounted to the end-effector of robot 11.
  • a pose of the end-effector of robot 11 is a position and an orientation of the end-effector within a coordinate system of robot 11 actuators.
  • any given pose of the field-of-view of endoscope 12 within an anatomical region corresponds to a distinct pose of the end-effector of robot 11 within the robotic coordinate system. Consequently, each individual endoscopic image of a blood vessel tree generated by endoscope 12 may be linked to a corresponding pose of endoscope 12 within the anatomical region.
  • Video capture device 13 is broadly defined herein as any device structurally configured with a capability to convert an intra-operative endoscopic video signal from endoscope 12 into a computer readable temporal sequence of intra-operative endoscopic image ("IOEI") 14.
  • video capture device 13 may employ a frame grabber of any type for capturing individual digital still frames from the intra-operative endoscopic video signal.
  • control unit 20 includes a robot controller 21 and an endoscope controller 22.
  • Robot controller 21 is broadly defined herein as any controller structurally configured to provide one or more robot actuator commands (“RAC”) 26 to robot 11 for controlling a pose of the end-effector of robot 11 as desired for the endoscopic procedure. More particularly, robot controller 21 converts endoscope position commands (“EPC") 25 from endoscope controller 22 into robot actuator commands 26.
  • endoscope position commands 25 may indicate an endoscopic path leading to a desired 3D position of a field-of-view of endoscope 12 within an anatomical region, whereby robot controller 21 converts commands 25 into commands 26 including an actuation current for each motor of robot 11 as needed to move endoscope 12 to the desired 3D position.
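The conversion of a position command into joint-level commands can be illustrated with a standard differential-kinematics step. The Jacobian, gain and 4-DOF layout below are hypothetical stand-ins, not taken from the patent; an actual robot controller 21 would additionally map joint velocities to motor currents.

```python
import numpy as np

def position_command_to_joint_command(target_pos, current_pos, jacobian, gain=0.5):
    """Map a desired 3D endoscope displacement to joint-velocity commands
    using the pseudo-inverse of the robot Jacobian."""
    error = target_pos - current_pos        # 3D positioning error
    J_pinv = np.linalg.pinv(jacobian)       # for a 4-DOF robot, J is 3x4
    return gain * (J_pinv @ error)          # joint velocities

# Hypothetical 3x4 Jacobian for a 4-DOF endoscope holder.
J = np.array([[1.0, 0.0, 0.2, 0.0],
              [0.0, 1.0, 0.0, 0.1],
              [0.0, 0.0, 1.0, 0.3]])
dq = position_command_to_joint_command(np.array([0.1, 0.0, 0.05]),
                                       np.zeros(3), J)
print(dq.shape)  # (4,)
```

The pseudo-inverse resolves the redundancy of the extra joint; a real controller would also enforce the remote-center-of-motion constraint mentioned later in the description.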
  • Endoscope controller 22 is broadly defined herein as any controller structurally configured for implementing a robotic guidance method in accordance with the present invention and exemplary shown in FIG. 2.
  • endoscope controller 22 may incorporate an image processing module ("IPM") 23, which is broadly defined herein as any module structurally configured for executing an anatomical object image registration of the present invention.
  • Endoscope controller 22 may further incorporate a visual servo module ("VSM") 24, which is broadly defined herein as any module structurally configured for generating endoscope position commands 25 indicating an endoscopic path leading to a desired 3D position of a field-of-view of endoscope 12 within an anatomical region.
  • endoscope position commands 25 are derived from the blood vessel tree image registration as exemplarily implemented by a stage S34 of flowchart 30 shown in FIG. 2.
  • a stage S31 of flowchart 30 encompasses an extraction of a geometrical representation of a blood vessel tree from a pre-operative 3D image.
  • a 3D imaging device (e.g., a CT device, an X-ray device or an MRI device) is operated to generate a pre-operative 3D image 42 of the anatomical region.
  • a blood vessel tree extractor 43 is operated to extract a geometrical representation 44 of a coronary arterial tree from image 42, which may be stored in a database 45.
  • a Brilliance iCT scanner sold by Philips may be used to generate image 42 and to extract a 3D dataset of the coronary arterial tree from image 42.
  • a stage S32 of flowchart 30 encompasses image processing module 23 matching the graphical representation of one or more intra-operative endoscopic images 14 (FIG. 1) of the blood vessel tree to a graphical representation of pre-operative 3D image 44 (FIG. 1) of the blood vessel tree.
  • endoscope 12 generates an intra-operative endoscopy video of a chest region of patient 50 that is captured by video capture device 13 and converted into intra-operative endoscopic images 14 whereby image processing module 23 of endoscope controller 22 matches a graphical representation of the intra-operative endoscopic image(s) 14 of the coronary arterial tree to a graphical representation of pre-operative 3D image 44 of the coronary arterial tree.
  • image processing module 23 executes a blood vessel tree image matching method of the present invention as exemplarily represented by a flowchart 60 shown in FIG. 4, which will be described herein in the context of the blood vessel tree being a coronary arterial tree.
  • a stage S61 of flowchart 60 encompasses image processing module 23 generating a coronary arterial tree main graph from a geometrical representation of the coronary arterial tree in accordance with any representation method known in the art.
  • a geometrical representation 70 of a coronary arterial tree is converted into a main graph 71 having nodes representing each furcation (e.g., a bifurcation or trifurcation) of coronary arterial tree geometrical representation 70 and further having branch connections between the nodes.
  • Stage S61 may be performed pre-operatively (e.g., days before the endoscopic surgery or any time prior to an introduction of endoscope 12 within patient 50), or intra-operatively by means of a C-arm angiography or other suitable system.
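The conversion of a geometrical representation into a furcation graph can be sketched as follows; the segment list is hypothetical, and the adjacency-dictionary encoding is just one of many possible graph representations.

```python
# Build a main graph whose nodes are furcations and whose edges are branch
# connections, from a (hypothetical) list of centerline branch segments.
# Each segment is (parent_furcation, child_furcation).
segments = [
    ("root", "b1"), ("b1", "b2"), ("b1", "b3"),
    ("b2", "b4"), ("b2", "b5"), ("b3", "b6"),
]

main_graph = {}
for parent, child in segments:
    main_graph.setdefault(parent, []).append(child)
    main_graph.setdefault(child, [])

print(len(main_graph))           # 7 furcation nodes
print(sorted(main_graph["b1"]))  # ['b2', 'b3']
```

Storing children per furcation also makes it easy to impose the directional (top-bottom, left-right) node orderings discussed below.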
  • a stage S62 of flowchart 60 encompasses image processing module 23 generating a coronary arterial tree subgraph from a portion of a coronary arterial tree visible in an intra- operative endoscopic image 14 in accordance with any graphical representation method known in the art.
  • endoscope 12 is introduced into patient 50 whereby image processing module 23 performs a detection of a coronary arterial structure within the intraoperative endoscopic image 14.
  • some arterial structures may be visible while other arterial structures may be hidden by a layer of fatty tissue.
  • image processing module 23 may implement an automatic detection of visible coronary arterial structure(s) by known image processing operations (e.g., threshold detection by the distinct red color of the visible coronary arterial structure(s)), or a surgeon may manually use an input device to outline the visible coronary arterial structure(s) on the computer display.
  • Upon a detection of the arterial structure(s), image processing module 23 generates the coronary arterial tree subgraph in a similar manner to the generation of the coronary arterial tree main graph.
  • a geometrical representation 72 of the coronary arterial structure(s) is converted into a subgraph 73 having nodes representing each furcation (e.g., a bifurcation or trifurcation) of coronary arterial tree geometrical representation 72 and further having branch connections between the nodes. Since both trees come from the same patient, it is understood that the graph derived from the endoscopy images is a subgraph of the graph derived from the 3D images.
  • a stage S63 of flowchart 60 encompasses image processing module 23 matching the subgraph to the main graph in accordance with any known graph matching method (e.g., maximum common subgraph or McGregor common subgraph). For example, as shown in stage S63, the nodes of subgraph 73 are matched to a subset of nodes of main graph 71.
  • subgraph 73 may only be partially detected within intra-operative endoscopic image 14 or some nodes/connections of subgraph 73 may be missing from intraoperative endoscopic image 14. To improve upon the matching accuracy of stage S62, an additional ordering of main graph 71 and subgraph 73 may be implemented.
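A rooted-tree variant of this subgraph matching can be sketched as follows. This brute-force matcher is an illustrative stand-in for the maximum common subgraph or McGregor methods named above, and the graphs are hypothetical; it tolerates partially detected subgraphs by letting a subgraph leaf match a main-graph node that has additional children.

```python
from itertools import permutations

def match_subtree(sub, sub_node, main, main_node):
    """Return a node mapping if the subgraph tree rooted at `sub_node` embeds in
    the main-graph tree rooted at `main_node` (children may be a subset)."""
    s_kids, m_kids = sub.get(sub_node, []), main.get(main_node, [])
    if len(s_kids) > len(m_kids):
        return None
    for chosen in permutations(m_kids, len(s_kids)):
        mapping = {sub_node: main_node}
        ok = True
        for sk, mk in zip(s_kids, chosen):
            child_map = match_subtree(sub, sk, main, mk)
            if child_map is None:
                ok = False
                break
            mapping.update(child_map)
        if ok:
            return mapping
    return None

# Hypothetical main graph (pre-operative tree) and endoscopic subgraph.
main_graph = {"root": ["b1"], "b1": ["b2", "b3"], "b2": ["b4", "b5"],
              "b3": ["b6"], "b4": [], "b5": [], "b6": []}
subgraph = {"s1": ["s2", "s3"], "s2": [], "s3": []}

# Try to anchor the endoscopic subgraph at each main-graph node.
for anchor in main_graph:
    mapping = match_subtree(subgraph, "s1", main_graph, anchor)
    if mapping:
        print(mapping["s1"])  # b1
        break
```

Enforcing a fixed child order (per the vertical and horizontal node orderings described next) would shrink the permutation search considerably.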
  • a vertical node ordering of main graph 71 is implemented based on a known orientation of patient 50 during the image scanning of stage S61.
  • the main graph nodes may be directionally linked to preserve a top-bottom order as exemplarily shown in FIG. 5 via the solid arrows.
  • the orientation of patient 50 relative to endoscope 12 may not be known. However, knowing that branches of the coronary arterial tree reduce in diameter as they extend top to bottom, the varying sizes of the arterial branches in intra-operative endoscopic image 14 may indicate orientation.
  • a horizontal node ordering of main graph 71 may be implemented based on the known orientation of patient 50 during the image scanning of stage S61. Specifically, the main graph nodes may be directionally linked to preserve a left-right node order as exemplarily shown in FIG. 6 via the dashed arrows. For subgraph 73, with the orientation of patient 50 relative to endoscope 12 more than likely being unknown, the horizontal node order of subgraph 73 may be set by the operating surgeon or an assistant via a graphical user interface.
  • a stage S33 of flowchart 30 encompasses an overlay of the geometrical representation of pre-operative 3D image 44 (FIG. 1) of the blood vessel tree on the intra-operative endoscopic image 14 of the blood vessel tree. This is done by using the geometrical representation uniquely associated with the main graph.
  • the entire geometry may be directly translated to intra-operative endoscopic image 14 using a perspective transformation.
  • the perspective transformation may be determined from matched nodes in intra-operative endoscopic image 14 and nodes in pre-operative 3D image 44 using matching algorithms known in the art, such as, for example, homography matching.
  • FIG. 7 illustrates a geometrical representation 80 of a coronary arterial tree having nodes matched to nodes 91-95 of an intra-operative endoscopic image 90.
  • the distance between each node pair among nodes 91-95 may be used to determine a scaling factor for geometrical representation 80 to thereby enable geometrical representation 80 to overlay intra-operative endoscopic image 90 as shown.
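The homography-based overlay can be sketched with a direct linear transform (DLT) estimate from at least four matched furcation nodes; the node coordinates below are hypothetical, and the DLT is one standard way to realize the homography matching named above. Note that the homography subsumes the scaling factor described for geometrical representation 80.

```python
import numpy as np

def homography_dlt(src, dst):
    """Estimate the 3x3 homography mapping 2D points src -> dst (DLT + SVD)."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    _, _, Vt = np.linalg.svd(np.asarray(A, float))
    H = Vt[-1].reshape(3, 3)          # null-space vector = homography entries
    return H / H[2, 2]

def project(H, pts):
    """Apply homography H to an (N, 2) array of points."""
    p = np.hstack([pts, np.ones((len(pts), 1))]) @ H.T
    return p[:, :2] / p[:, 2:3]

# Hypothetical matched furcation nodes: pre-operative tree plane -> image pixels.
tree_nodes  = np.array([[0, 0], [1, 0], [1, 1], [0, 1]], float)
image_nodes = np.array([[10, 20], [110, 25], [115, 130], [12, 125]], float)
H = homography_dlt(tree_nodes, image_nodes)
overlay = project(H, tree_nodes)      # tree geometry warped onto the image
print(np.allclose(overlay, image_nodes))  # True
```

Once H is known, the entire tree geometry, not just the matched nodes, can be projected onto intra-operative endoscopic image 14 with `project`.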
  • If stage S32 (FIG. 2) yields multiple results, then all possible overlays may be displayed to the surgeon, who may select via a graphical user interface the matching result believed to be the most likely match. Given that the surgeon knows the position of endoscope 12 relative to at least some structures in intra-operative endoscopic image 14, the selection may be relatively straightforward.
  • a stage S34 of flowchart 30 encompasses visual servo module 24 generating an endoscopic path within the overlay of the geometrical representation of pre-operative 3D image 44 (FIG. 1) of the blood vessel tree on intra-operative endoscopic image 14 (FIG. 1) of the blood vessel tree. Based on the endoscopic path, visual servo module 24 generates endoscope position commands 25 to robot controller 21 to thereby guide endoscope 12 (FIG. 1) along the endoscopic path to a desired position within the anatomical region. Specifically, once the exact overlay is found, robot 11 may be commanded to guide endoscope 12 to positions the surgeon selects on pre-operative 3D image 44.
  • the surgeon or the assistant may select a point of blood vessel tree, and robot 11 may guide endoscope 12 towards that desired position along any suitable path.
  • robot 11 may move endoscope 12 along a shortest path 101 to a desired position 100 or along a coronary arterial path 102 to desired position 100.
  • Coronary arterial path 102 is the preferred embodiment, because coronary arterial path 102 allows the surgeon to observe visible arteries as robot 11 moves endoscope 12. In addition, it may help the surgeon decide whether the matching was successful.
  • Coronary arterial path 102 may be defined using methods known in the art (e.g., Dijkstra's shortest path algorithm).
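The arterial-path planning can be sketched with Dijkstra's algorithm over a weighted branch graph; the tree topology and branch lengths below are hypothetical.

```python
import heapq

def dijkstra_path(graph, start, goal):
    """Shortest path in a weighted graph {node: [(neighbor, length), ...]}."""
    queue, seen = [(0.0, start, [start])], set()
    while queue:
        cost, node, path = heapq.heappop(queue)
        if node == goal:
            return path, cost
        if node in seen:
            continue
        seen.add(node)
        for nxt, w in graph.get(node, []):
            if nxt not in seen:
                heapq.heappush(queue, (cost + w, nxt, path + [nxt]))
    return None, float("inf")

# Hypothetical arterial tree with branch lengths in millimetres.
arteries = {
    "root": [("b1", 12.0)],
    "b1":   [("b2", 8.0), ("b3", 15.0)],
    "b2":   [("b4", 6.0), ("b5", 9.0)],
    "b3":   [("b6", 7.0)],
}
path, length = dijkstra_path(arteries, "root", "b5")
print(path)    # ['root', 'b1', 'b2', 'b5']
print(length)  # 29.0
```

The resulting node sequence is what visual servo module 24 would translate into successive endoscope position commands 25 so that visible arteries stay in view along the way.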
  • the movement of robot 11 may be commanded using uncalibrated visual servoing with a remote center of motion, and the field of view of endoscope 12 may be extended to enable a larger subgraph during matching stage S32.
  • stages S32-S34 may either be executed one time, or on a periodical basis until such time robot 11 has moved endoscope 12 to the desired position within the anatomical region, or multiple times as dictated by the surgeon.
  • modules 23 and 24 may be implemented by hardware, software and/or firmware integrated within endoscope controller 22 as shown.
  • From the description of FIGS. 1-8 herein, those having ordinary skill in the art will appreciate the numerous benefits of the present invention including, but not limited to, an application of the present invention to any type of endoscopic surgery performed on any type of blood vessels.

Abstract

A robot guiding system employs a robot unit (10) and a control unit (20). The robot unit (10) includes an endoscope (12) for generating an intra-operative endoscopic image (14) of a blood vessel tree within an anatomical region, and a robot (11) for moving the endoscope (12) within the anatomical region. The control unit (20) includes an endoscope controller (22) for generating an endoscopic path within the anatomical region, wherein the endoscopic path is derived from a matching of a graphical representation of the intra-operative endoscopic image (14) of the blood vessel tree to a graphical representation of a pre-operative three-dimensional image (44) of the blood vessel tree. The control unit (20) further includes a robot controller (21) for commanding the robot (11) to move the endoscope (12) within the anatomical region in accordance with the endoscopic path.

Description

ROBOTIC CONTROL OF AN ENDOSCOPE FROM BLOOD VESSEL TREE IMAGES
The present invention generally relates to robotic control of an endoscope during a minimally invasive surgical procedure (e.g., a minimally invasive coronary bypass grafting surgery). The present invention specifically relates to a matching of a graphical representation of a pre-operative three-dimensional ("3D") blood vessel tree image to a graphical representation of an intra-operative endoscopic blood vessel tree image as a basis for robotic guiding of an endoscope.
Coronary artery bypass grafting ("CABG") is a surgical procedure for revascularization of obstructed coronary arteries. Approximately 500,000 such operations are performed annually in the United States. In conventional CABG, the patient's sternum is opened and the patient's heart is fully exposed to a surgeon. Despite the exposure of the heart, some arteries may be partially invisible due to a layer of fatty tissue above them. For such arteries, the surgeon may palpate the heart surface and feel both the blood pulsating from the arteries and any stenosis of the arteries. However, this information is sparse and may not be sufficient to transfer a surgical plan to the surgical site.
In minimally invasive CABG, the aforementioned problem of conventional CABG is amplified because a surgeon cannot palpate the heart surface. Additionally, the length of surgical instruments used in minimally invasive CABG prevents any tactile feedback from the proximal end of the tool.
One known technique for addressing the problems with conventional CABG is to register the intra-operative site with a pre-operative 3D coronary artery tree. Specifically, an optically tracked pointer is used to digitize the position of the arteries in an open-heart setting, and the position data is registered to the pre-operative tree using an Iterative Closest Point ("ICP") algorithm known in the art. However, this technique, as with any related approach matching digitized arteries to pre-operative data, is impractical for minimally invasive CABG because of the spatial constraints imposed by the small port access. Moreover, this technique requires most of the arteries to be either visible or palpated by the surgeon, which is impossible in minimally invasive CABG.
One known technique for addressing the problems with minimally invasive CABG is to implement a registration method in which the heart surface is reconstructed using an optically tracked endoscope and matched to pre-operative computed tomography ("CT") data of the same surface. However, this technique, as with any related approach proposing surface-based matching, may fail if the endoscope view used to derive the surface is too small. Furthermore, as the heart surface is relatively smooth and without distinctive surface features, the algorithm of this technique more often than not converges to a suboptimal local maximum.
Another known technique for addressing the problems with minimally invasive CABG is to label a coronary tree extracted from a new patient using a database of previously labeled cases and graph-based matching. However, this technique works only if a complete tree is available, and its goal is to label the tree rather than to match the geometry.
A further problem of minimally invasive CABG is an orientation and a guidance of the endoscope once the global positioning with respect to pre-operative 3D images is reached. The goal of registration is to facilitate localization of the anastomosis site and the stenosis. In a standard setup, the endoscope is held by an assistant while the surgeon holds two instruments. The surgeon issues commands to the assistant and the assistant moves the endoscope accordingly. This kind of setup hinders hand-eye coordination of the surgeon, because the assistant needs to intuitively translate the surgeon's commands, typically issued in the surgeon's frame of reference, to the assistant's frame of reference and the endoscope's frame of reference. This plurality of coordinate systems may cause various handling errors, prolong the surgery or cause misidentification of the coronary arteries.
A surgical endoscope assistant designed to allow a surgeon to directly control an endoscope via a sensed movement of the surgeon's head may solve some of those problems by removing the assistant from the control loop, but the problem of transformation between the surgeon's frame of reference and the endoscope's frame of reference remains.
The present invention provides methods for matching graphical representations of a blood vessel tree (e.g., a furcation of arteries, capillaries or veins) as shown in a pre-operative three-dimensional ("3D") image (e.g., a CT image, a cone beam CT image, a 3D X-ray image or an MRI image) and in an intra-operative endoscopic image, overlaying the blood vessel tree from the pre-operative 3D image onto the intra-operative endoscopic image, and using the overlay to guide a robot holding an endoscope toward a location as defined in the pre-operative 3D image.
One form of the present invention is a robot guiding system employing a robot unit and a control unit. The robot unit includes an endoscope for generating an intra-operative endoscopic image of a blood vessel tree within an anatomical region, and a robot for moving the endoscope within the anatomical region. The control unit includes an endoscope controller for generating an endoscopic path within the anatomical region, wherein the endoscopic path is derived from a matching of a graphical representation of the intra-operative endoscopic image of the blood vessel tree to a graphical representation of a pre-operative three-dimensional image of the blood vessel tree. The control unit further includes a robot controller for commanding the robot to move the endoscope within the anatomical region in accordance with the endoscopic path.
A second form of the present invention is a robot guiding method involving a generation of an intra-operative endoscopic image of a blood vessel tree within an anatomical region and a generation of an endoscopic path within the anatomical region, wherein the endoscopic path is derived from a matching of a graphical representation of the intra-operative endoscopic image of the blood vessel tree to a graphical representation of a pre-operative three-dimensional image of the blood vessel tree. The robot guiding method further involves a commanding of a robot to move an endoscope within the anatomical region in accordance with the endoscopic path.
The term "pre-operative" as used herein is broadly defined to describe any activity executed before, during or after an endoscopic imaging of an anatomical region for purposes of acquiring a three-dimensional image of the anatomical region, and the term "intra-operative" as used herein is broadly defined to describe any activity executed by the robot unit and the control unit during an endoscopic imaging of the anatomical region. Examples of an endoscopic imaging of an anatomical region include, but are not limited to, a CABG, a bronchoscopy, a colonoscopy, a laparoscopy, and a brain endoscopy.
The foregoing forms and other forms of the present invention as well as various features and advantages of the present invention will become further apparent from the following detailed description of various embodiments of the present invention read in conjunction with the accompanying drawings. The detailed description and drawings are merely illustrative of the present invention rather than limiting, the scope of the present invention being defined by the appended claims and equivalents thereof.
FIG. 1 illustrates an exemplary embodiment of a robotic guiding system in accordance with the present invention.
FIG. 2 illustrates a flowchart representative of an exemplary embodiment of a robotic guidance method in accordance with the present invention.
FIG. 3 illustrates an exemplary surgical implementation of the flowchart shown in FIG. 2.
FIG. 4 illustrates a flowchart representative of an exemplary embodiment of a graph matching method in accordance with the present invention.
FIGS. 5 and 6 illustrate an exemplary ordering of main graphs of a blood vessel tree in accordance with the present invention.
FIG. 7 illustrates an exemplary overlay of a geometrical representation on an endoscopic image in accordance with the present invention.
FIG. 8 illustrates exemplary robot paths within the overlay shown in FIG. 7 in accordance with the present invention.
As shown in FIG. 1, a robotic guiding system employs a robot unit 10 and a control unit 20 for any endoscopic procedure involving an endoscopic imaging of a blood vessel tree having one or more furcations (i.e., branches). Examples of such endoscopic procedures include, but are not limited to, minimally invasive cardiac surgery (e.g., coronary artery bypass grafting or mitral valve replacement).
Robot unit 10 includes a robot 11, an endoscope 12 rigidly attached to robot 11 and a video capture device 13 attached to the endoscope 12.
Robot 11 is broadly defined herein as any robotic device structurally configured with motorized control of one or more joints for maneuvering an end-effector as desired for the particular endoscopic procedure. In practice, robot 11 may have four (4) degrees-of-freedom, such as, for example, a serial robot having joints serially connected with rigid segments, a parallel robot having joints and rigid segments mounted in parallel order (e.g., a Stewart platform known in the art) or any hybrid combination of serial and parallel kinematics.
Endoscope 12 is broadly defined herein as any device structurally configured with an ability to image from inside a body. Examples of endoscope 12 for purposes of the present invention include, but are not limited to, any type of scope, flexible or rigid (e.g., endoscope, arthroscope, bronchoscope, choledochoscope, colonoscope, cystoscope, duodenoscope, gastroscope, hysteroscope, laparoscope, laryngoscope, neuroscope, otoscope, push enteroscope, rhinolaryngoscope, sigmoidoscope, sinuscope, thoracoscope, etc.) and any device similar to a scope that is equipped with an image system (e.g., a nested cannula with imaging). The imaging is local, and surface images may be obtained optically with fiber optics, lenses, and miniaturized (e.g., CCD-based) imaging systems.
In practice, endoscope 12 is mounted to the end-effector of robot 11. A pose of the end-effector of robot 11 is a position and an orientation of the end-effector within a coordinate system of robot 11 actuators. With endoscope 12 mounted to the end-effector of robot 11, any given pose of the field-of-view of endoscope 12 within an anatomical region corresponds to a distinct pose of the end-effector of robot 11 within the robotic coordinate system. Consequently, each individual endoscopic image of a blood vessel tree generated by endoscope 12 may be linked to a corresponding pose of endoscope 12 within the anatomical region.
Video capture device 13 is broadly defined herein as any device structurally configured with a capability to convert an intra-operative endoscopic video signal from endoscope 12 into a computer readable temporal sequence of intra-operative endoscopic image ("IOEI") 14. In practice, video capture device 13 may employ a frame grabber of any type for capturing individual digital still frames from the intra-operative endoscopic video signal.
Still referring to FIG. 1, control unit 20 includes a robot controller 21 and an endoscope controller 22.
Robot controller 21 is broadly defined herein as any controller structurally configured to provide one or more robot actuator commands ("RAC") 26 to robot 11 for controlling a pose of the end-effector of robot 11 as desired for the endoscopic procedure. More particularly, robot controller 21 converts endoscope position commands ("EPC") 25 from endoscope controller 22 into robot actuator commands 26. For example, endoscope position commands 25 may indicate an endoscopic path leading to a desired 3D position of a field-of-view of endoscope 12 within an anatomical region, whereby robot controller 21 converts commands 25 into commands 26, including an actuation current for each motor of robot 11, as needed to move endoscope 12 to the desired 3D position.
Endoscope controller 22 is broadly defined herein as any controller structurally configured for implementing a robotic guidance method in accordance with the present invention, as exemplarily shown in FIG. 2. To this end, endoscope controller 22 may incorporate an image processing module ("IPM") 23, which is broadly defined herein as any module structurally configured for executing an anatomical object image registration of the present invention, in particular a blood vessel tree image registration as exemplarily implemented by stages S32 and S33 of flowchart 30 shown in FIG. 2. Endoscope controller 22 may further incorporate a visual servo module ("VSM") 24, which is broadly defined herein as any module structurally configured for generating endoscope position commands 25 indicating an endoscopic path leading to a desired 3D position of a field-of-view of endoscope 12 within an anatomical region. In particular, endoscope position commands 25 are derived from the blood vessel tree image registration as exemplarily implemented by a stage S34 of flowchart 30 shown in FIG. 2.
A description of flowchart 30 will now be provided herein to facilitate a further understanding of endoscope controller 22. Referring to FIG. 2, a stage S31 of flowchart 30 encompasses an extraction of a geometrical representation of a blood vessel tree from a pre-operative 3D image. For example, as shown in FIG. 3, a 3D imaging device (e.g., a CT device, an X-ray device, or an MRI device) is operated to generate a pre-operative 3D image 42 of a chest region of a patient 50 illustrating left and right coronary arteries 51 and 52 of patient 50. Thereafter, a blood vessel tree extractor 43 is operated to extract a geometrical representation 44 of a coronary arterial tree from image 42, which may be stored in a database 45. In practice, a Brilliance iCT scanner sold by Philips may be used to generate image 42 and to extract a 3D dataset of the coronary arterial tree from image 42.
Referring back to FIG. 2, a stage S32 of flowchart 30 encompasses image processing module 23 matching the graphical representation of one or more intra-operative endoscopic images 14 (FIG. 1) of the blood vessel tree to a graphical representation of pre-operative 3D image 44 (FIG. 1) of the blood vessel tree. For example, as shown in FIG. 3, endoscope 12 generates an intra-operative endoscopy video of a chest region of patient 50 that is captured by video capture device 13 and converted into intra-operative endoscopic images 14 whereby image processing module 23 of endoscope controller 22 matches a graphical representation of the intra-operative endoscopic image(s) 14 of the coronary arterial tree to a graphical representation of pre-operative 3D image 44 of the coronary arterial tree. In one exemplary embodiment, image processing module 23 executes a blood vessel tree image matching method of the present invention as exemplarily represented by a flowchart 60 shown in FIG. 4, which will be described herein in the context of the blood vessel tree being a coronary arterial tree.
Referring to FIG. 4, a stage S61 of flowchart 60 encompasses image processing module 23 generating a coronary arterial tree main graph from a geometrical representation of the coronary arterial tree in accordance with any representation method known in the art. For example, as shown in stage S61, a geometrical representation 70 of a coronary arterial tree is converted into a main graph 71 having nodes representative of each furcation (e.g., a bifurcation or trifurcation) of coronary arterial tree geometrical representation 70 and further having branch connections between the nodes. Stage S61 may be performed pre-operatively (e.g., days before the endoscopic surgery or any time prior to an introduction of endoscope 12 within patient 50), or intra-operatively by means of a C-arm angiography or other suitable system.
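The graph generation of stage S61 can be sketched in a few lines. The following is a minimal illustration, not the patent's implementation: the `VesselGraph` class, node names and coordinates are all hypothetical, standing in for a geometrical representation whose furcations become nodes and whose branches become edges.

```python
# Minimal sketch of a main graph for stage S61: nodes are furcation points,
# edges are the branches between them. All data is illustrative.
class VesselGraph:
    def __init__(self):
        self.nodes = {}      # node id -> 3D position of the furcation
        self.edges = set()   # undirected branch connections (node id pairs)

    def add_furcation(self, node_id, position):
        self.nodes[node_id] = position

    def add_branch(self, a, b):
        self.edges.add(frozenset((a, b)))

# A toy main graph: a root furcation branching into two sub-furcations.
main_graph = VesselGraph()
main_graph.add_furcation("root", (0.0, 0.0, 0.0))
main_graph.add_furcation("b1", (5.0, -3.0, 1.0))
main_graph.add_furcation("b2", (5.0, 3.0, 1.0))
main_graph.add_branch("root", "b1")
main_graph.add_branch("root", "b2")
print(len(main_graph.nodes), len(main_graph.edges))  # 3 2
```

The same structure would be built from the endoscopic image in stage S62, yielding the subgraph to be matched.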
A stage S62 of flowchart 60 encompasses image processing module 23 generating a coronary arterial tree subgraph from a portion of the coronary arterial tree visible in an intra-operative endoscopic image 14 in accordance with any graphical representation method known in the art. Specifically, endoscope 12 is introduced into patient 50, whereby image processing module 23 performs a detection of a coronary arterial structure within the intra-operative endoscopic image 14. In practice, some arterial structures may be visible while other arterial structures may be hidden by a layer of fatty tissue. As such, image processing module 23 may implement an automatic detection of visible coronary arterial structure(s) by known image processing operations (e.g., threshold detection by the distinct red color of the visible coronary arterial structure(s)), or a surgeon may manually use an input device to outline the visible coronary arterial structure(s) on the computer display. Upon a detection of the arterial structure(s), image processing module 23 generates the coronary arterial tree subgraph in a similar manner to the generation of the coronary arterial tree main graph. For example, as shown in stage S62, a geometrical representation 72 of the coronary arterial structure(s) is converted into a subgraph 73 having nodes representative of each furcation (e.g., a bifurcation or trifurcation) of coronary arterial tree geometrical representation 72 and further having branch connections between the nodes. Since both trees come from the same patient, it is understood that the graph derived from the endoscopy images is a subgraph of the graph derived from the 3D images.
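The automatic detection mentioned above (threshold detection by the distinct red color) can be illustrated with a crude sketch. The `detect_red_pixels` helper, its thresholds and the sample image are hypothetical; a real system would operate on calibrated endoscopic video.

```python
# Crude stand-in for "threshold detection by the distinct red color":
# flag a pixel when its red channel exceeds r_min and dominates the green
# and blue channels by at least 'gap'. Thresholds are illustrative.
def detect_red_pixels(image, r_min=150, gap=60):
    return [[(r > r_min and r - g > gap and r - b > gap)
             for (r, g, b) in row] for row in image]

# A toy 2x2 image: three reddish pixels, one gray pixel.
image = [[(200, 50, 50), (100, 100, 100)],
         [(180, 90, 90), (255, 10, 10)]]
mask = detect_red_pixels(image)
print(sum(flag for row in mask for flag in row))  # 3 pixels flagged
```

The flagged region would then be skeletonized and its furcations extracted to build subgraph 73.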
A stage S63 of flowchart 60 encompasses image processing module 23 matching the subgraph to the main graph in accordance with any graph matching method known in the art (e.g., maximum common subgraph or McGregor common subgraph). For example, as shown in stage S63, the nodes of subgraph 73 are matched to a subset of the nodes of main graph 71.
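A brute-force sketch of the matching in stage S63, under stated assumptions: rather than a maximum-common-subgraph method such as McGregor's, this toy `match_subgraph` function exhaustively tries injective node assignments that preserve every branch connection, which is only feasible for very small graphs. All graph data is illustrative.

```python
from itertools import permutations

# Toy subgraph-to-main-graph matcher: enumerate injective assignments of
# subgraph nodes onto main-graph nodes and keep those preserving all edges.
def match_subgraph(sub_nodes, sub_edges, main_nodes, main_edges):
    matches = []
    for assignment in permutations(main_nodes, len(sub_nodes)):
        mapping = dict(zip(sub_nodes, assignment))
        if all(frozenset((mapping[a], mapping[b])) in main_edges
               for a, b in sub_edges):
            matches.append(mapping)
    return matches

# Main graph: four furcations; subgraph: one visible branch (two nodes).
main_nodes = ["A", "B", "C", "D"]
main_edges = {frozenset(p) for p in [("A", "B"), ("A", "C"), ("B", "D")]}
candidates = match_subgraph(["x", "y"], [("x", "y")], main_nodes, main_edges)
print(len(candidates))  # 6: each of the 3 branches, in both orientations
```

The multiplicity of candidates is exactly the ambiguity the node-ordering constraints of the next paragraphs are meant to reduce.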
In practice, subgraph 73 may only be partially detected within intra-operative endoscopic image 14, or some nodes/connections of subgraph 73 may be missing from intra-operative endoscopic image 14. To improve upon the matching accuracy of stage S63, an additional ordering of main graph 71 and subgraph 73 may be implemented.
In one embodiment, a vertical node ordering of main graph 71 is implemented based on a known orientation of patient 50 during the image scanning of stage S61. Specifically, the main graph nodes may be directionally linked to preserve a top-bottom node order, as exemplarily shown in FIG. 5 via the solid arrows. For subgraph 73, the orientation of patient 50 relative to endoscope 12 may not be known. However, knowing that branches of the coronary arterial tree reduce in diameter as they extend from top to bottom, the varying sizes of the arterial branches in intra-operative endoscopic image 14 may indicate orientation.
In another embodiment, a horizontal node ordering of main graph 71 may be implemented based on the known orientation of patient 50 during the image scanning of stage S61. Specifically, the main graph nodes may be directionally linked to preserve a left-right node order, as exemplarily shown in FIG. 6 via the dashed arrows. For subgraph 73, with the orientation of patient 50 relative to endoscope 12 more than likely being unknown, the horizontal node order of subgraph 73 may be set by the operating surgeon or an assistant via a graphical user interface.
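The ordering constraints above can be sketched as a filter over candidate matches. The `respects_vertical_order` helper and all node names, depths and candidate mappings are illustrative; the same pattern applies to the horizontal left-right ordering.

```python
# A candidate match survives only if every "upper/lower" node pair observed
# in the endoscopic subgraph maps to main-graph nodes in the same
# top-bottom order. All data below is illustrative.
def respects_vertical_order(mapping, sub_order, main_depth):
    return all(main_depth[mapping[u]] < main_depth[mapping[v]]
               for u, v in sub_order)

main_depth = {"A": 0, "B": 1, "C": 1, "D": 2}   # top-bottom rank per node
sub_order = [("x", "y")]                        # node x observed above node y
candidates = [{"x": "A", "y": "B"}, {"x": "B", "y": "A"},
              {"x": "A", "y": "C"}, {"x": "C", "y": "A"},
              {"x": "B", "y": "D"}, {"x": "D", "y": "B"}]
filtered = [m for m in candidates
            if respects_vertical_order(m, sub_order, main_depth)]
print(len(filtered))  # ordering prunes six candidates down to 3
```

Pruning with such constraints shrinks the search space of the matcher, which is why the ordering both speeds up matching and reduces the number of possible matches.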
While the use of ordering may decrease the time for matching the graphs and reduce the number of possible matches, theoretically multiple matches between the graphs may still be obtained by the matching algorithm. Such a case of multiple matches is addressed during a stage S33 of flowchart 30.
Referring again to FIG. 2, based on the matching of the graphs, a stage S33 of flowchart 30 encompasses an overlay of the geometrical representation of pre-operative 3D image 44 (FIG. 1) of the blood vessel tree on the intra-operative endoscopic image 14 of the blood vessel tree. This is done by using the geometrical representation uniquely associated with the main graph. Thus, the entire geometry may be directly translated to intra-operative endoscopic image 14 using a perspective transformation. The perspective transformation may be derived from the matched nodes in intra-operative endoscopic image 14 and in pre-operative 3D image 44 using matching algorithms known in the art, such as, for example, homography matching.
For example, FIG. 7 illustrates a geometrical representation 80 of a coronary arterial tree having nodes matched to nodes 91-95 within an intra-operative endoscopic image 90. The distance between each node pair among nodes 91-95 may be used to determine a scaling factor for geometrical representation 80, thereby enabling geometrical representation 80 to overlay intra-operative endoscopic image 90 as shown.
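The scaling step described above can be sketched as follows, assuming matched 2D node positions are available. The `scale_factor` function name and sample coordinates are illustrative, and a full overlay would additionally estimate a homography rather than a single scale.

```python
import math
from itertools import combinations

# Toy scaling-factor estimate: compare the distance between every matched
# node pair in the pre-operative projection with the same pair in the
# endoscopic image, and average the ratios. Points are in matched order.
def scale_factor(preop_pts, endo_pts):
    dist = lambda p, q: math.hypot(p[0] - q[0], p[1] - q[1])
    ratios = [dist(endo_pts[i], endo_pts[j]) / dist(preop_pts[i], preop_pts[j])
              for i, j in combinations(range(len(preop_pts)), 2)]
    return sum(ratios) / len(ratios)

# Three matched nodes; the endoscopic view is twice the pre-operative scale.
preop = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
endo = [(5.0, 5.0), (25.0, 5.0), (5.0, 25.0)]
print(scale_factor(preop, endo))  # 2.0
```

Averaging over all node pairs makes the estimate less sensitive to a single poorly localized furcation.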
In practice, if the graph matching of stage S32 (FIG. 2) yields multiple results, then all possible overlays may be displayed to the surgeon whereby the surgeon may select the matching result the surgeon believes is the most likely match via a graphical user interface. Given that the surgeon knows the position of endoscope 12 relative to at least some structures in intra-operative endoscopic image 14, the selection may be relatively straightforward.
Referring back to FIG. 2, a stage S34 of flowchart 30 encompasses visual servo module 24 generating an endoscopic path within the overlay of the geometrical representation of pre-operative 3D image 44 (FIG. 1) of the blood vessel tree on intra-operative endoscopic image 14 (FIG. 1) of the blood vessel tree. Based on the endoscopic path, visual servo module 24 provides endoscope position commands 25 to robot controller 21 to thereby guide endoscope 12 (FIG. 1) along the endoscopic path to a desired position within the anatomical region. Specifically, once the exact overlay is found, robot 11 may be commanded to guide endoscope 12 to positions the surgeon selects on pre-operative 3D image 44. The surgeon or the assistant may select a point of the blood vessel tree, and robot 11 may guide endoscope 12 toward that desired position along any suitable path. For example, as shown in FIG. 8, robot 11 may move endoscope 12 along a shortest path 101 to a desired position 100 or along a coronary arterial path 102 to desired position 100. Coronary arterial path 102 is the preferred embodiment, because coronary arterial path 102 allows the surgeon to observe visible arteries as robot 11 moves endoscope 12. In addition, it may help the surgeon to decide whether the matching was successful. Coronary arterial path 102 may be defined using methods known in the art (e.g., Dijkstra's shortest path algorithm).
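The arterial path choice of stage S34 can be sketched with Dijkstra's algorithm over the matched graph. The adjacency structure and branch lengths below are illustrative, not derived from any actual coronary tree.

```python
import heapq

# Dijkstra sketch for stage S34: guide the endoscope along the arterial
# graph (branch lengths as edge weights) instead of a straight line.
# 'adj' maps each node to {neighbor: branch length}; data is illustrative.
def arterial_path(adj, start, goal):
    dist, prev, heap = {start: 0.0}, {}, [(0.0, start)]
    while heap:
        d, u = heapq.heappop(heap)
        if u == goal:
            break
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in adj[u].items():
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v], prev[v] = nd, u
                heapq.heappush(heap, (nd, v))
    path = [goal]               # walk predecessors back to the start
    while path[-1] != start:
        path.append(prev[path[-1]])
    return path[::-1]

adj = {
    "root": {"b1": 4.0, "b2": 1.0},
    "b1": {"root": 4.0, "b2": 1.0, "target": 1.0},
    "b2": {"root": 1.0, "b1": 1.0},
    "target": {"b1": 1.0},
}
print(arterial_path(adj, "root", "target"))  # ['root', 'b2', 'b1', 'target']
```

The returned node sequence would then be translated by the visual servo module into successive endoscope position commands 25.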
In practice, the movement of robot 11 may be commanded using uncalibrated visual servoing with a remote center of motion, and the field of view of endoscope 12 may be extended to enable a larger subgraph during matching stage S32.
Referring back to FIG. 2, stages S32-S34 may be executed one time, on a periodic basis until such time as robot 11 has moved endoscope 12 to the desired position within the anatomical region, or multiple times as dictated by the surgeon.
In practice, modules 23 and 24 (FIG. 1) may be implemented by hardware, software and/or firmware integrated within endoscope controller 22 as shown.
From the description of FIGS. 1-8 herein, those having ordinary skill in the art will appreciate the numerous benefits of the present invention including, but not limited to, an application of the present invention to any type of endoscopy surgery performed on any type of blood vessels.
Although the present invention has been described with reference to exemplary aspects, features and implementations, the disclosed systems and methods are not limited to such exemplary aspects, features and/or implementations. Rather, as will be readily apparent to persons skilled in the art from the description provided herein, the disclosed systems and methods are susceptible to modifications, alterations and enhancements without departing from the spirit or scope of the present invention. Accordingly, the present invention expressly encompasses such modifications, alterations and enhancements within the scope hereof.

Claims

1. A robot guiding system, comprising:
a robot unit (10) including
an endoscope (12) operable for generating an intra-operative endoscopic image (14) of a blood vessel tree within an anatomical region, and
a robot (11) operable for moving the endoscope (12) within the anatomical region; and
a control unit (20) including
an endoscope controller (22) operable for generating an endoscopic path within the anatomical region, wherein the endoscopic path is derived from a matching of a graphical representation of the intra-operative endoscopic image (14) of the blood vessel tree to a graphical representation of a pre-operative three-dimensional image (44) of the blood vessel tree, and
a robot controller (21) operable for commanding the robot (11) to move the endoscope (12) within the anatomical region in accordance with the endoscopic path.
2. The robot guiding system of claim 1, wherein the matching of the graphical representation of the intra-operative endoscopic image (14) of the blood vessel tree to the graphical representation of the pre-operative three-dimensional image (44) of the blood vessel tree includes:
generating a main graph derived from a geometrical representation of the pre-operative three-dimensional image (44) of the blood vessel tree;
generating a subgraph derived from a geometrical representation of the intra-operative endoscopic image (14) of the blood vessel tree; and
matching the subgraph to the main graph.
3. The robot guiding system of claim 2,
wherein the main graph includes a main set of nodes representative of each furcation of the blood vessel tree within the pre-operative three-dimensional image (44) of the blood vessel tree; and
wherein the subgraph includes a subset of the main set of nodes, the subset of nodes being representative of each furcation of the blood vessel tree within the intra-operative endoscopic image (14) of the blood vessel tree.
4. The robot guiding system of claim 3, wherein the matching of the subgraph to the main graph includes:
establishing at least one of a vertical ordering and a horizontal ordering of the nodes in the main graph.
5. The robot guiding system of claim 2, wherein the endoscope controller (22) is further operable for overlaying the geometrical representation of the pre-operative three-dimensional image (44) of the blood vessel tree onto the intra-operative endoscopic image (14) of the blood vessel tree in accordance with the matching of the graphical representation of the intra-operative endoscopic image (14) of the blood vessel tree to the graphical representation of the pre-operative three-dimensional image (44) of the blood vessel tree.
6. The robot guiding system of claim 5, wherein the endoscope controller (22) is further operable for generating the endoscopic path within the overlay of the geometrical representation of the pre-operative three-dimensional image (44) of the blood vessel tree onto the intra-operative endoscopic image (14) of the blood vessel tree.
7. The robot guiding system of claim 2,
wherein the matching of the subgraph to the main graph includes a plurality of matching results of the subset of nodes to the main set of nodes; and
wherein one of the plurality of matching results is selected as the match of the subgraph to the main graph.
8. The robot guiding system of claim 1, wherein the blood vessel tree is a coronary artery tree.
9. A control unit (20) for an endoscope (12) operable for generating an intra-operative endoscopic image (14) of a blood vessel tree within an anatomical region and a robot (11) operable for moving the endoscope (12) within the anatomical region, the control unit (20) comprising:
an endoscope controller (22) operable for generating an endoscopic path within the anatomical region, wherein the endoscopic path is derived from a matching of a graphical representation of the intra-operative endoscopic image (14) of the blood vessel tree to a graphical representation of a pre-operative three-dimensional image (44) of the blood vessel tree; and
a robot controller (21) operable for commanding the robot (11) to move the endoscope (12) within the anatomical region in accordance with the endoscopic path.
10. The control unit (20) of claim 9, wherein the matching of the graphical representation of the intra-operative endoscopic image (14) of the blood vessel tree to the graphical representation of the pre-operative three-dimensional image (44) of the blood vessel tree includes:
generating a main graph derived from a geometrical representation of the pre-operative three-dimensional image (44) of the blood vessel tree;
generating a subgraph derived from a geometrical representation of the intra-operative endoscopic image (14) of the blood vessel tree; and
matching the subgraph to the main graph.
11. The control unit (20) of claim 10,
wherein the main graph includes a main set of nodes representative of each furcation of the blood vessel tree within the pre-operative three-dimensional image (44) of the blood vessel tree; and
wherein the subgraph includes a subset of the main set of nodes, the subset of nodes being representative of each furcation of the blood vessel tree within the intra-operative endoscopic image (14) of the blood vessel tree.
12. The control unit (20) of claim 11, wherein the matching of the subgraph to the main graph includes:
establishing at least one of a vertical ordering and a horizontal ordering of the nodes in the main graph.
13. The control unit (20) of claim 10, wherein the endoscope controller (22) is further operable for overlaying the geometrical representation of the pre-operative three-dimensional image (44) of the blood vessel tree onto the intra-operative endoscopic image (14) of the blood vessel tree in accordance with the matching of the graphical representation of the intra-operative endoscopic image (14) of the blood vessel tree to the graphical representation of the pre-operative three-dimensional image (44) of the blood vessel tree.
14. The control unit (20) of claim 13, wherein the endoscope controller (22) is further operable for generating the endoscopic path within an overlay of the geometrical representation of the pre-operative three-dimensional image (44) of the blood vessel tree onto the intra-operative endoscopic image (14) of the blood vessel tree.
15. The control unit (20) of claim 10,
wherein the matching of the subgraph to the main graph includes a plurality of matching results of the subset of nodes to the main set of nodes; and
wherein one of the plurality of matching results is selected as the match of the subgraph to the main graph.
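Claim 15 requires only that several candidate matchings exist and that one be selected. A minimal sketch under stated assumptions: candidates are order-preserving assignments of the subgraph's node sequence into the main graph's node sequence, and selection uses a caller-supplied scoring function (the scoring criterion itself is not specified by the claim and is hypothetical here).

```python
from itertools import combinations

def candidate_matches(sub_nodes, main_nodes):
    """Enumerate order-preserving assignments of the subgraph node sequence
    into the main-graph node sequence -- the 'plurality of matching results'
    of claim 15. Node sequences are assumed already ordered."""
    for indices in combinations(range(len(main_nodes)), len(sub_nodes)):
        yield dict(zip(sub_nodes, (main_nodes[i] for i in indices)))

def select_match(candidates, score):
    """Select one matching result; here, the highest-scoring candidate.
    The scoring function is an assumption for illustration."""
    return max(candidates, key=score)

# Two intra-operative furcations matched into a three-furcation main graph.
cands = list(candidate_matches(["a", "b"], ["X", "Y", "Z"]))
print(cands)  # three order-preserving candidate assignments
```

In practice the score might compare branch angles or vessel radii around each furcation pair, but any criterion that picks a single result satisfies the claim language.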
16. A robot guiding method, comprising:
generating an intra-operative endoscopic image (14) of a blood vessel tree within an anatomical region;
generating an endoscopic path within the anatomical region, wherein the endoscopic path is derived from a matching of a graphical representation of the intra-operative endoscopic image (14) of the blood vessel tree to a graphical representation of a pre-operative three-dimensional image (44) of the blood vessel tree; and
commanding a robot (11) to move an endoscope (12) within the anatomical region in accordance with the endoscopic path.
17. The robot guiding method of claim 16, wherein the matching of the graphical representation of the intra-operative endoscopic image (14) of the blood vessel tree to the graphical representation of the pre-operative three-dimensional image (44) of the blood vessel tree includes:
generating a main graph derived from a geometrical representation of the pre-operative three-dimensional image (44) of the blood vessel tree;
generating a subgraph derived from a geometrical representation of the intra-operative endoscopic image (14) of the blood vessel tree; and
matching the subgraph to the main graph.
18. The robot guiding method of claim 17, wherein the main graph includes a main set of nodes representative of each furcation of the blood vessel tree within the pre-operative three-dimensional image (44) of the blood vessel tree; and
wherein the subgraph includes a subset of the main set of nodes, the subset of nodes being representative of each furcation of the blood vessel tree within the intra-operative endoscopic image (14) of the blood vessel tree.
19. The robot guiding method of claim 17, further comprising:
overlaying the geometrical representation of the pre-operative three-dimensional image (44) of the blood vessel tree onto the intra-operative endoscopic image (14) of the blood vessel tree in accordance with the matching of the graphical representation of the intra-operative endoscopic image (14) of the blood vessel tree to the graphical representation of the pre-operative three-dimensional image (44) of the blood vessel tree.
20. The robot guiding method of claim 19, wherein the endoscopic path is generated within the overlay of the geometrical representation of the pre-operative three-dimensional image (44) of the blood vessel tree onto the intra-operative endoscopic image (14) of the blood vessel tree.
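The overlay step of claims 13 and 19 requires bringing the pre-operative geometry into the coordinate frame of the endoscopic image using the matched furcations. As a minimal sketch, the snippet below estimates a pure 2-D translation from matched furcation point pairs; a full implementation would use a projective or deformable registration, so the translation model and the point data are illustrative assumptions only.

```python
def overlay_offset(pre_op_pts, endo_pts):
    """Estimate the 2-D translation that carries matched furcation points of
    the pre-operative tree onto their counterparts in the endoscopic image.
    Each argument is a list of (x, y) pairs in corresponding order."""
    if len(pre_op_pts) != len(endo_pts) or not pre_op_pts:
        raise ValueError("need equally sized, non-empty point lists")
    n = len(pre_op_pts)
    dx = sum(e[0] - p[0] for p, e in zip(pre_op_pts, endo_pts)) / n
    dy = sum(e[1] - p[1] for p, e in zip(pre_op_pts, endo_pts)) / n
    return (dx, dy)

# Two matched furcations, both displaced by (2, 3) in the endoscopic image.
print(overlay_offset([(0, 0), (1, 1)], [(2, 3), (3, 4)]))
```

Once the pre-operative geometry is drawn in image coordinates, the endoscopic path of claims 14 and 20 can be traced along the overlaid vessel centerlines and handed to the robot controller as motion commands.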
PCT/IB2011/053998 2010-09-15 2011-09-13 Robotic control of an endoscope from blood vessel tree images WO2012035492A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
JP2013528806A JP5955847B2 (en) 2010-09-15 2011-09-13 Endoscopic robot control based on blood vessel tree image
RU2013116901/14A RU2594813C2 (en) 2010-09-15 2011-09-13 Robot control for an endoscope from blood vessel tree images
US13/822,001 US9615886B2 (en) 2010-09-15 2011-09-13 Robotic control of an endoscope from blood vessel tree images
BR112013005879A BR112013005879A2 (en) 2010-09-15 2011-09-13 '' Robot guidance system, control units for an endoscope and robot guidance method ''
EP11764337.9A EP2615993B1 (en) 2010-09-15 2011-09-13 Robotic control of an endoscope from blood vessel tree images
CN201180044480.6A CN103108602B (en) 2010-09-15 2011-09-13 From the robot controlling of vascular tree image endoscope
US15/483,615 US10182704B2 (en) 2010-09-15 2017-04-10 Robotic control of an endoscope from blood vessel tree images

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US38298010P 2010-09-15 2010-09-15
US61/382,980 2010-09-15

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US13/822,001 A-371-Of-International US9615886B2 (en) 2010-09-15 2011-09-13 Robotic control of an endoscope from blood vessel tree images
US15/483,615 Continuation US10182704B2 (en) 2010-09-15 2017-04-10 Robotic control of an endoscope from blood vessel tree images

Publications (1)

Publication Number Publication Date
WO2012035492A1 true WO2012035492A1 (en) 2012-03-22

Family

ID=44736002

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2011/053998 WO2012035492A1 (en) 2010-09-15 2011-09-13 Robotic control of an endoscope from blood vessel tree images

Country Status (7)

Country Link
US (2) US9615886B2 (en)
EP (1) EP2615993B1 (en)
JP (1) JP5955847B2 (en)
CN (1) CN103108602B (en)
BR (1) BR112013005879A2 (en)
RU (1) RU2594813C2 (en)
WO (1) WO2012035492A1 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2013118047A1 (en) * 2012-02-06 2013-08-15 Koninklijke Philips Electronics N.V. Invisible bifurcation detection within vessel tree images
WO2013173227A1 (en) 2012-05-14 2013-11-21 Intuitive Surgical Operations Systems and methods for registration of a medical device using a reduced search space
WO2014001980A1 (en) * 2012-06-28 2014-01-03 Koninklijke Philips N.V. Enhanced visualization of blood vessels using a robotically steered endoscope
WO2014001981A1 (en) * 2012-06-28 2014-01-03 Koninklijke Philips N.V. Evaluation of patency using photo-plethysmography on endoscope images
WO2014181222A1 (en) * 2013-05-09 2014-11-13 Koninklijke Philips N.V. Robotic control of an endoscope from anatomical features
WO2015110934A1 (en) * 2014-01-24 2015-07-30 Koninklijke Philips N.V. Continuous image integration for robotic surgery
JP2015529477A (en) * 2012-06-28 2015-10-08 コーニンクレッカ フィリップス エヌ ヴェ Fiber optic sensor guided navigation for blood vessel visualization and monitoring
JP2015530903A (en) * 2012-08-14 2015-10-29 インテュイティブ サージカル オペレーションズ, インコーポレイテッド System and method for registration of multiple vision systems
EP3041409A1 (en) * 2013-09-06 2016-07-13 Koninklijke Philips N.V. Navigation system
US10039473B2 (en) 2012-05-14 2018-08-07 Intuitive Surgical Operations, Inc. Systems and methods for navigation based on ordered sensor records
EP3102141B1 (en) 2014-02-04 2019-08-14 Koninklijke Philips N.V. A system for visualising an anatomical target
US10772684B2 (en) 2014-02-11 2020-09-15 Koninklijke Philips N.V. Spatial visualization of internal mammary artery during minimally invasive bypass surgery
US11963730B2 (en) 2021-11-30 2024-04-23 Endoquest Robotics, Inc. Steerable overtube assemblies for robotic surgical systems

Families Citing this family (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11857149B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. Surgical robotic systems with target trajectory deviation monitoring and related methods
US11786324B2 (en) 2012-06-21 2023-10-17 Globus Medical, Inc. Surgical robotic automation with tracking markers
US11857266B2 (en) 2012-06-21 2024-01-02 Globus Medical, Inc. System for a surveillance marker in robotic-assisted surgery
US11974822B2 (en) 2012-06-21 2024-05-07 Globus Medical Inc. Method for a surveillance marker in robotic-assisted surgery
US11399900B2 (en) 2012-06-21 2022-08-02 Globus Medical, Inc. Robotic systems providing co-registration using natural fiducials and related methods
US11864839B2 (en) 2012-06-21 2024-01-09 Globus Medical Inc. Methods of adjusting a virtual implant and related surgical navigation systems
US11963755B2 (en) 2012-06-21 2024-04-23 Globus Medical Inc. Apparatus for recording probe movement
US10624710B2 (en) 2012-06-21 2020-04-21 Globus Medical, Inc. System and method for measuring depth of instrumentation
US10874466B2 (en) 2012-06-21 2020-12-29 Globus Medical, Inc. System and method for surgical tool insertion using multiaxis force and moment feedback
US11896446B2 (en) 2012-06-21 2024-02-13 Globus Medical, Inc Surgical robotic automation with tracking markers
US11793570B2 (en) 2012-06-21 2023-10-24 Globus Medical Inc. Surgical robotic automation with tracking markers
US11589771B2 (en) 2012-06-21 2023-02-28 Globus Medical Inc. Method for recording probe movement and determining an extent of matter removed
US11298196B2 (en) 2012-06-21 2022-04-12 Globus Medical Inc. Surgical robotic automation with tracking markers and controlled tool advancement
US11317971B2 (en) 2012-06-21 2022-05-03 Globus Medical, Inc. Systems and methods related to robotic guidance in surgery
US11253327B2 (en) 2012-06-21 2022-02-22 Globus Medical, Inc. Systems and methods for automatically changing an end-effector on a surgical robot
US10842461B2 (en) 2012-06-21 2020-11-24 Globus Medical, Inc. Systems and methods of checking registrations for surgical systems
US11045267B2 (en) 2012-06-21 2021-06-29 Globus Medical, Inc. Surgical robotic automation with tracking markers
US11864745B2 (en) 2012-06-21 2024-01-09 Globus Medical, Inc. Surgical robotic system with retractor
KR102397254B1 (en) 2014-03-28 2022-05-12 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Quantitative three-dimensional imaging of surgical scenes
US10555788B2 (en) 2014-03-28 2020-02-11 Intuitive Surgical Operations, Inc. Surgical system with haptic feedback based upon quantitative three-dimensional imaging
CN111184577A (en) 2014-03-28 2020-05-22 直观外科手术操作公司 Quantitative three-dimensional visualization of an instrument in a field of view
EP3125807B1 (en) 2014-03-28 2022-05-04 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging of surgical scenes from multiport perspectives
US10350009B2 (en) 2014-03-28 2019-07-16 Intuitive Surgical Operations, Inc. Quantitative three-dimensional imaging and printing of surgical implants
US10517684B2 (en) * 2015-02-26 2019-12-31 Covidien Lp Robotically controlling remote center of motion with software and guide tube
JP6912481B2 (en) 2015-12-30 2021-08-04 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Image-based robot guidance
US11883217B2 (en) 2016-02-03 2024-01-30 Globus Medical, Inc. Portable medical imaging system and method
EP3500406B1 (en) 2016-08-22 2021-12-01 Canon Kabushiki Kaisha Continuum robot and control method of continuum robot
US9931025B1 (en) * 2016-09-30 2018-04-03 Auris Surgical Robotics, Inc. Automated calibration of endoscopes with pull wires
CN110049742B (en) 2016-12-07 2023-01-03 皇家飞利浦有限公司 Image-guided motion scaling for robot control
US11123139B2 (en) 2018-02-14 2021-09-21 Epica International, Inc. Method for determination of surgical procedure access
US10430949B1 (en) * 2018-04-24 2019-10-01 Shenzhen Keya Medical Technology Corporation Automatic method and system for vessel refine segmentation in biomedical images using tree structure based deep learning model
CN110575255B (en) * 2018-06-07 2022-08-16 格罗伯斯医疗有限公司 Robotic system and related methods for providing co-registration using natural fiducials
WO2020110278A1 (en) * 2018-11-30 2020-06-04 オリンパス株式会社 Information processing system, endoscope system, trained model, information storage medium, and information processing method
USD1022197S1 (en) 2020-11-19 2024-04-09 Auris Health, Inc. Endoscope
WO2023148812A1 (en) * 2022-02-01 2023-08-10 日本電気株式会社 Image processing device, image processing method, and storage medium

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2855292A1 (en) * 2003-05-22 2004-11-26 Inst Nat Rech Inf Automat Magnetic resonance image pattern readjusting device for use during tele-surgery, has processing unit to readjust selected pattern of portion of image at selected angle at which image is captured based on designed attribute
US20070001879A1 (en) * 2005-06-22 2007-01-04 Siemens Corporate Research Inc System and Method For Path Based Tree Matching

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH08164148A (en) * 1994-12-13 1996-06-25 Olympus Optical Co Ltd Surgical operation device under endoscope
AU2001224721A1 (en) * 2000-01-10 2001-08-07 Super Dimension Ltd. Methods and systems for performing medical procedures with reference to projective images and with respect to pre-stored images
US6610007B2 (en) * 2000-04-03 2003-08-26 Neoguide Systems, Inc. Steerable segmented endoscope and method of insertion
JP4656700B2 (en) * 2000-07-11 2011-03-23 オリンパス株式会社 Endoscopic surgery system
US7233820B2 (en) * 2002-04-17 2007-06-19 Superdimension Ltd. Endoscope structures and techniques for navigating to a target in branched structure
US7822461B2 (en) * 2003-07-11 2010-10-26 Siemens Medical Solutions Usa, Inc. System and method for endoscopic path planning
RU2290055C2 (en) * 2004-04-06 2006-12-27 Государственное образовательное учреждение высшего профессионального образования Новосибирская государственная медицинская академия Министерства здравоохранения Российской Федерации Neuronavigation endoscopic system
US20070167784A1 (en) * 2005-12-13 2007-07-19 Raj Shekhar Real-time Elastic Registration to Determine Temporal Evolution of Internal Tissues for Image-Guided Interventions
US7804990B2 (en) 2006-01-25 2010-09-28 Siemens Medical Solutions Usa, Inc. System and method for labeling and identifying lymph nodes in medical images
US9037215B2 (en) * 2007-01-31 2015-05-19 The Penn State Research Foundation Methods and apparatus for 3D route planning through hollow organs
US8672836B2 (en) 2007-01-31 2014-03-18 The Penn State Research Foundation Method and apparatus for continuous guidance of endoscopy
US20090156895A1 (en) * 2007-01-31 2009-06-18 The Penn State Research Foundation Precise endoscopic planning and visualization
EP2117436A4 (en) * 2007-03-12 2011-03-02 David Tolkowsky Devices and methods for performing medical procedures in tree-like luminal structures
DE102008016146B4 (en) 2008-03-28 2010-01-28 Aktormed Gmbh Operation assistance system for guiding a surgical auxiliary instrument
JP5372406B2 (en) * 2008-05-23 2013-12-18 オリンパスメディカルシステムズ株式会社 Medical equipment
JP5572440B2 (en) * 2009-09-15 2014-08-13 富士フイルム株式会社 Diagnosis support system, diagnosis support program, and diagnosis support method
US20120071753A1 (en) * 2010-08-20 2012-03-22 Mark Hunter Apparatus and method for four dimensional soft tissue navigation including endoscopic mapping

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR2855292A1 (en) * 2003-05-22 2004-11-26 Inst Nat Rech Inf Automat Magnetic resonance image pattern readjusting device for use during tele-surgery, has processing unit to readjust selected pattern of portion of image at selected angle at which image is captured based on designed attribute
US20070001879A1 (en) * 2005-06-22 2007-01-04 Siemens Corporate Research Inc System and Method For Path Based Tree Matching

Non-Patent Citations (3)

* Cited by examiner, † Cited by third party
Title
ALEKSANDRA POPOVIC ET AL: "An approach to robotic guidance of an uncalibrated endoscope in beating heart surgery", BIOMEDICAL ROBOTICS AND BIOMECHATRONICS (BIOROB), 2010 3RD IEEE RAS AND EMBS INTERNATIONAL CONFERENCE ON, IEEE, PISCATAWAY, NJ, USA, 26 September 2010 (2010-09-26), pages 106 - 113, XP031793447, ISBN: 978-1-4244-7708-1 *
C. GNAHM, C. HARTUNG, R. FRIEDL, M. HOFFMANN, K. DIETMAYER: "Towards navigation on the heart surface during coronary artery bypass grafting", INT. J. CARS, 4 November 2008 (2008-11-04), XP009155721 *
SCHIRNIBECK E U ET AL: "Automatic coronary artery detection on in situ heart images", COMPUTERS IN CARDIOLOGY, 2004 CHICAGO, IL, USA SEPT. 19-22, 2004, PISCATAWAY, NJ, USA,IEEE, 19 September 2004 (2004-09-19), pages 785 - 788, XP010814143, ISBN: 978-0-7803-8927-4, DOI: 10.1109/CIC.2004.1443057 *

Cited By (47)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9280823B2 (en) 2012-02-06 2016-03-08 Koninklijke Philips N.V. Invisible bifurcation detection within vessel tree images
WO2013118047A1 (en) * 2012-02-06 2013-08-15 Koninklijke Philips Electronics N.V. Invisible bifurcation detection within vessel tree images
US11375919B2 (en) 2012-05-14 2022-07-05 Intuitive Surgical Operations, Inc. Systems and methods for registration of a medical device using a reduced search space
EP3524184A1 (en) * 2012-05-14 2019-08-14 Intuitive Surgical Operations Inc. Systems for registration of a medical device using a reduced search space
US10039473B2 (en) 2012-05-14 2018-08-07 Intuitive Surgical Operations, Inc. Systems and methods for navigation based on ordered sensor records
KR20150017326A (en) * 2012-05-14 2015-02-16 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Systems and methods for registration of a medical device using a reduced search space
US11266327B2 (en) 2012-05-14 2022-03-08 Intuitive Surgical Operations, Inc. Systems and methods for registration of a medical device using a reduced search space
CN109452930B (en) * 2012-05-14 2021-10-29 直观外科手术操作公司 Registration system and method for medical devices using reduced search space
KR102214145B1 (en) 2012-05-14 2021-02-09 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Systems and methods for registration of a medical device using a reduced search space
US11633125B2 (en) 2012-05-14 2023-04-25 Intuitive Surgical Operations, Inc. Systems and methods for navigation based on ordered sensor records
US10299698B2 (en) 2012-05-14 2019-05-28 Intuitive Surgical Operations, Inc. Systems and methods for registration of a medical device using a reduced search space
CN109452930A (en) * 2012-05-14 2019-03-12 直观外科手术操作公司 For using the registration arrangement and method of the Medical Devices of the search space of reduction
US11737682B2 (en) 2012-05-14 2023-08-29 Intuitive Surgical Operations, Inc Systems and methods for registration of a medical device using a reduced search space
WO2013173227A1 (en) 2012-05-14 2013-11-21 Intuitive Surgical Operations Systems and methods for registration of a medical device using a reduced search space
US10154800B2 (en) 2012-05-14 2018-12-18 Intuitive Surgical Operations, Inc. Systems and methods for registration of a medical device using a reduced search space
EP2849670A4 (en) * 2012-05-14 2016-08-17 Intuitive Surgical Operations Systems and methods for registration of a medical device using a reduced search space
US10194801B2 (en) 2012-06-28 2019-02-05 Koninklijke Philips N.V. Fiber optic sensor guided navigation for vascular visualization and monitoring
JP2015529477A (en) * 2012-06-28 2015-10-08 コーニンクレッカ フィリップス エヌ ヴェ Fiber optic sensor guided navigation for blood vessel visualization and monitoring
WO2014001980A1 (en) * 2012-06-28 2014-01-03 Koninklijke Philips N.V. Enhanced visualization of blood vessels using a robotically steered endoscope
US9750575B2 (en) 2012-06-28 2017-09-05 Koninklijke Philips N.V. Evaluation of patency using photo-plethysmography on endoscope images
WO2014001981A1 (en) * 2012-06-28 2014-01-03 Koninklijke Philips N.V. Evaluation of patency using photo-plethysmography on endoscope images
US11278182B2 (en) 2012-06-28 2022-03-22 Koninklijke Philips N.V. Enhanced visualization of blood vessels using a robotically steered endoscope
CN104411226A (en) * 2012-06-28 2015-03-11 皇家飞利浦有限公司 Enhanced visualization of blood vessels using a robotically steered endoscope
JP2015525599A (en) * 2012-06-28 2015-09-07 コーニンクレッカ フィリップス エヌ ヴェ Evaluation of patency using photoplethysmography in endoscopic images
US10278615B2 (en) 2012-08-14 2019-05-07 Intuitive Surgical Operations, Inc. Systems and methods for registration of multiple vision systems
US11219385B2 (en) 2012-08-14 2022-01-11 Intuitive Surgical Operations, Inc. Systems and methods for registration of multiple vision systems
JP2015530903A (en) * 2012-08-14 2015-10-29 インテュイティブ サージカル オペレーションズ, インコーポレイテッド System and method for registration of multiple vision systems
CN106562757B (en) * 2012-08-14 2019-05-14 直观外科手术操作公司 The system and method for registration for multiple vision systems
US11896364B2 (en) 2012-08-14 2024-02-13 Intuitive Surgical Operations, Inc. Systems and methods for registration of multiple vision systems
CN106562757A (en) * 2012-08-14 2017-04-19 直观外科手术操作公司 System and method for registration of multiple vision systems
JP2016524487A (en) * 2013-05-09 2016-08-18 コーニンクレッカ フィリップス エヌ ヴェKoninklijke Philips N.V. Endoscopic robot control from anatomical features
CN105188594A (en) * 2013-05-09 2015-12-23 皇家飞利浦有限公司 Robotic control of an endoscope from anatomical features
RU2692206C2 (en) * 2013-05-09 2019-06-21 Конинклейке Филипс Н.В. Robotic control of endoscope based on anatomical features
WO2014181222A1 (en) * 2013-05-09 2014-11-13 Koninklijke Philips N.V. Robotic control of an endoscope from anatomical features
US20160066768A1 (en) * 2013-05-09 2016-03-10 Koninklijke Philips N.V. Robotic control of an endoscope from anatomical features
US11284777B2 (en) 2013-05-09 2022-03-29 Koninklijke Philips N.V. Robotic control of an endoscope from anatomical features
US11395702B2 (en) 2013-09-06 2022-07-26 Koninklijke Philips N.V. Navigation system
EP3041409A1 (en) * 2013-09-06 2016-07-13 Koninklijke Philips N.V. Navigation system
US20160331475A1 (en) * 2014-01-24 2016-11-17 Koninklijke Philips N.V. Continuous image integration for robotic surgery
WO2015110934A1 (en) * 2014-01-24 2015-07-30 Koninklijke Philips N.V. Continuous image integration for robotic surgery
US11083529B2 (en) 2014-01-24 2021-08-10 Koninklijke Philips N.V. Continuous image integration for robotic surgery
US11523874B2 (en) 2014-02-04 2022-12-13 Koninklijke Philips N.V. Visualization of depth and position of blood vessels and robot guided visualization of blood vessel cross section
US20230065264A1 (en) * 2014-02-04 2023-03-02 Koninklijke Philips N.V. Visualization of depth and position of blood vessels and robot guided visualization of blood vessel cross section
EP3102141B1 (en) 2014-02-04 2019-08-14 Koninklijke Philips N.V. A system for visualising an anatomical target
US11980505B2 (en) 2014-02-04 2024-05-14 Koninklijke Philips N.V. Visualization of depth and position of blood vessels and robot guided visualization of blood vessel cross section
US10772684B2 (en) 2014-02-11 2020-09-15 Koninklijke Philips N.V. Spatial visualization of internal mammary artery during minimally invasive bypass surgery
US11963730B2 (en) 2021-11-30 2024-04-23 Endoquest Robotics, Inc. Steerable overtube assemblies for robotic surgical systems

Also Published As

Publication number Publication date
US20170209028A1 (en) 2017-07-27
EP2615993B1 (en) 2015-03-18
US9615886B2 (en) 2017-04-11
US20130165948A1 (en) 2013-06-27
CN103108602B (en) 2015-09-30
EP2615993A1 (en) 2013-07-24
JP2013541365A (en) 2013-11-14
CN103108602A (en) 2013-05-15
RU2013116901A (en) 2014-10-20
US10182704B2 (en) 2019-01-22
BR112013005879A2 (en) 2016-05-10
JP5955847B2 (en) 2016-07-20
RU2594813C2 (en) 2016-08-20

Similar Documents

Publication Publication Date Title
US10182704B2 (en) Robotic control of an endoscope from blood vessel tree images
US10453174B2 (en) Endoscopic registration of vessel tree images
US9280823B2 (en) Invisible bifurcation detection within vessel tree images
US10835344B2 (en) Display of preoperative and intraoperative images
US11284777B2 (en) Robotic control of an endoscope from anatomical features
JP6725423B2 (en) System for visualizing anatomical targets
EP2838412B1 (en) Guidance tools to manually steer endoscope using pre-operative and intra-operative 3d images
EP2866638B1 (en) Enhanced visualization of blood vessels using a robotically steered endoscope
US20190069955A1 (en) Control unit, system and method for controlling hybrid robot having rigid proximal portion and flexible distal portion
WO2012156873A1 (en) Endoscope segmentation correction for 3d-2d image overlay

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201180044480.6

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 11764337

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2011764337

Country of ref document: EP

WWE Wipo information: entry into national phase

Ref document number: 13822001

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2013528806

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2013116901

Country of ref document: RU

Kind code of ref document: A

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112013005879

Country of ref document: BR

ENP Entry into the national phase

Ref document number: 112013005879

Country of ref document: BR

Kind code of ref document: A2

Effective date: 20130312