WO2011055245A1 - Collision avoidance and detection using distance sensors - Google Patents

Collision avoidance and detection using distance sensors Download PDF

Info

Publication number
WO2011055245A1
Authority
WO
WIPO (PCT)
Prior art keywords
endoscopic
endoscope
distance
monocular
images
Application number
PCT/IB2010/054481
Other languages
French (fr)
Inventor
Aleksandra Popovic
Mareike Klee
Bout Marcelis
Christianus Martinus Van Heesch
Original Assignee
Koninklijke Philips Electronics N.V.
Application filed by Koninklijke Philips Electronics N.V.
Priority to EP10779336A (EP2496128A1)
Priority to JP2012535970A (JP2013509902A)
Priority to CN2010800498322A (CN102595998A)
Priority to US13/502,412 (US20120209069A1)
Publication of WO2011055245A1

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/06 - Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B5/065 - Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147 - Holding or positioning arrangements
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00147 - Holding or positioning arrangements
    • A61B1/00149 - Holding or positioning arrangements using articulated arms
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B1/00 - Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B1/00163 - Optical arrangements
    • A61B1/00193 - Optical arrangements adapted for stereoscopic vision
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 - Surgical robots
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 - Image analysis
    • G06T7/50 - Depth or shape recovery
    • G06T7/55 - Depth or shape recovery from multiple images
    • G06T7/579 - Depth or shape recovery from multiple images from motion
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/10 - Computer-aided planning, simulation or modelling of surgical operations
    • A61B2034/101 - Computer-aided simulation of surgical operations
    • A61B2034/105 - Modelling of the patient, e.g. for ligaments or bones
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 - Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/30 - Surgical robots
    • A61B2034/301 - Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/06 - Measuring instruments not otherwise provided for
    • A61B2090/062 - Measuring instruments not otherwise provided for penetration depth
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/08 - Accessories or related features not otherwise provided for
    • A61B2090/0801 - Prevention of accidental cutting or pricking
    • A61B2090/08021 - Prevention of accidental cutting or pricking of the patient or his organs
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • A61B90/361 - Image-producing devices, e.g. surgical cameras
    • A61B2090/3614 - Image-producing devices, e.g. surgical cameras using optical fibre
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/364 - Correlation of different images or relation of image positions in respect to the body
    • A61B2090/367 - Correlation of different images or relation of image positions in respect to the body creating a 3D dataset from 2D images using position information
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 - Image-producing devices or illumination devices not otherwise provided for
    • A61B90/37 - Surgical systems with images on a monitor during operation
    • A61B2090/378 - Surgical systems with images on a monitor during operation using ultrasound
    • A61B2090/3782 - Surgical systems with images on a monitor during operation using ultrasound transmitter or receiver in catheter or minimal invasive instrument
    • A61B2090/3784 - Surgical systems with images on a monitor during operation using ultrasound transmitter or receiver in catheter or minimal invasive instrument both receiver and transmitter being in the instrument or receiver being also transmitter
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 - Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/50 - Supports for surgical instruments, e.g. articulated arms
    • A61B2090/506 - Supports for surgical instruments, e.g. articulated arms using a parallelogram linkage, e.g. panthograph
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/10 - Image acquisition modality
    • G06T2207/10068 - Endoscopic image
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 - Indexing scheme for image analysis or image enhancement
    • G06T2207/30 - Subject of image; Context of image processing
    • G06T2207/30004 - Biomedical image processing


Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • Veterinary Medicine (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Animal Behavior & Ethology (AREA)
  • Public Health (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Pathology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • Optics & Photonics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Robotics (AREA)
  • Human Computer Interaction (AREA)
  • Endoscopes (AREA)
  • Ultrasonic Diagnosis Equipment (AREA)
  • Instruments For Viewing The Inside Of Hollow Bodies (AREA)

Abstract

An endoscopic method involves an advancement of an endoscope (20), as controlled by an endoscopic robot (31), to a target location within an anatomical region of a body, and a generation of a plurality of monocular endoscopic images (80) of the anatomical region as the endoscope (20) is advanced to the target location by the endoscopic robot (31). For avoiding or detecting a collision of the endoscope (20) with an object within the monocular endoscopic images (80) (e.g., a ligament within monocular endoscopic images of a knee), the method further involves a generation of distance measurements (81) of the endoscope (20) from the object as the endoscope (20) is advanced to the target location by the endoscopic robot (31), and a reconstruction of a three-dimensional image of a surface of the object within the monocular endoscopic images (80) as a function of the distance measurements (81).

Description

COLLISION AVOIDANCE AND DETECTION USING DISTANCE SENSORS
The present invention generally relates to minimally invasive surgeries involving an endoscope manipulated by an endoscopic robot. The present invention specifically relates to avoiding and detecting a collision of an endoscope with an object within an anatomical region of a body, using distance sensors and a reconstruction of the surface imaged by the endoscope.
Generally, a minimally invasive surgery utilizes an endoscope, which is a long, flexible or rigid tube having an imaging capability. Upon insertion into a body through a natural orifice or a small incision, the endoscope provides an image of the region of interest that may be viewed through an eyepiece or on a screen as a surgeon performs the operation. Essential to the surgery is depth information for the object(s) within the image, which enables the surgeon to advance the endoscope while avoiding those object(s). However, the frames of an endoscopic image are two-dimensional, and the surgeon therefore may lose the perception of the depth of object(s) viewed on the screen.
More particularly, rigid endoscopes are used to provide visual feedback during major types of minimally invasive procedures including, but not limited to, endoscopic procedures for cardiac surgery, laparoscopic procedures for the abdomen, endoscopic procedures for the spine and arthroscopic procedures for joints (e.g., a knee). During such procedures, a surgeon may use an active endoscopic robot that moves the endoscope autonomously or by commands from the surgeon. In either case, the endoscopic robot should be able to avoid collision of the endoscope with important objects within the region of interest in the patient's body. Such collision avoidance may be difficult for procedures involving real-time changes in the operating site (e.g., real-time changes in a knee during ACL arthroscopy due to removal of damaged ligament, repair of menisci and/or drilling of a channel), and/or different positioning of the patient's body during surgery than in preoperative imaging (e.g., the knee is straight during a preoperative computed tomography scan but bent during the surgery).
The present invention provides a technique that utilizes endoscopic video frames from the monocular endoscopic images and distance measurements of an object within the monocular endoscopic images to reconstruct a 3D image of the surface of the object viewed by the endoscope, for the purposes of avoiding and detecting any collision of the endoscope with the object.
One form of the present invention is an endoscopic system employing an endoscope and an endoscopic control unit having an endoscopic robot. In operation, the endoscope generates a plurality of monocular endoscopic images of an anatomical region of a body as the endoscope is advanced by the endoscopic robot to a target location within the anatomical region.
Additionally, the endoscope includes one or more distance sensors for generating measurements of a distance of the endoscope from an object within the monocular endoscopic images as the endoscope is advanced to the target location by the endoscopic robot (e.g., distance to a ligament within monocular endoscopic images of a knee). For avoiding or detecting a collision of the endoscope with the object, the endoscopic control unit receives the monocular endoscopic images and distance measurements to reconstruct a three-dimensional image of a surface of the object within the monocular endoscopic images as a function of the distance measurements.
A second form of the present invention is an endoscopic method involving an advancement of an endoscope by an endoscopic robot to a target location within an anatomical region of a body and a generation of a plurality of monocular endoscopic images of the anatomical region as the endoscope is advanced by the endoscopic robot to the target location within the anatomical region. For avoiding or detecting a collision of the endoscope with an object within the monocular endoscopic images (e.g., a ligament within monocular endoscopic images of a knee), the method further involves a generation of distance measurements of the endoscope from the object as the endoscope is advanced to the target location by the endoscopic robot, and a reconstruction of a three-dimensional image of a surface of the object within the monocular endoscopic images as a function of the distance measurements.
FIG. 1 illustrates an exemplary embodiment of an endoscopic system in accordance with the present invention.
FIG. 2 illustrates a first exemplary embodiment of a distal end of an endoscope in accordance with the present invention.
FIG. 3 illustrates a second exemplary embodiment of a distal end of an endoscope in accordance with the present invention.
FIG. 4 illustrates a flowchart representative of an exemplary embodiment of a collision avoidance/detection method in accordance with the present invention.
FIG. 5 illustrates a schematic representation of an arthroscopic surgery in accordance with the present invention.
FIG. 6 illustrates an exemplary application of the flowchart illustrated in FIG. 4 during the arthroscopic surgery illustrated in FIG. 5.
FIG. 7 illustrates a flowchart representative of an exemplary embodiment of an object detection in accordance with the present invention.
FIG. 8 illustrates an exemplary stereo matching of two synthetic knee images in accordance with the present invention.
As shown in FIG. 1, an endoscopic system 10 of the present invention employs an endoscope 20 and an endoscopic control unit 30 for any applicable type of medical procedure. Examples of such medical procedures include, but are not limited to, minimally invasive cardiac surgery (e.g., coronary artery bypass grafting or mitral valve replacement), minimally invasive abdominal surgery (laparoscopy) (e.g., prostatectomy or cholecystectomy), and natural orifice translumenal endoscopic surgery.
Endoscope 20 is broadly defined herein as any device structurally configured for imaging an anatomical region of a body (e.g., human or animal) via an imaging device 21 (e.g., fiber optics, lenses, miniaturized CCD-based imaging systems, etc.). Examples of endoscope 20 include, but are not limited to, any type of imaging scope (e.g., a bronchoscope, a colonoscope, a laparoscope, an arthroscope, etc.) and any scope-like device equipped with an imaging system (e.g., an imaging cannula).
Endoscope 20 is further equipped on its distal end with one or more distance sensors 22 as individual element(s) or array(s). In one exemplary embodiment, a distance sensor 22 may be an ultrasound transducer element or array for transmitting and receiving ultrasound signals having a time of flight that is indicative of a distance to an object (e.g., a bone within a knee). The ultrasound transducer element/array may be a thin film micro-machined (e.g., piezoelectric thin film or capacitive micro-machined) transducer, which may also be disposable. In particular, a capacitive micro-machined ultrasound transducer array has AC characteristics for time-of-flight distance measurement of an object, and DC characteristics for direct measurement of any pressure being exerted by the object on the membrane of the array.
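As a point of reference, the time-of-flight relation above reduces to a one-line computation: distance equals speed of sound times round-trip time, divided by two. The following is a minimal sketch, assuming a nominal speed of sound in soft tissue of roughly 1540 m/s; the function name and constant are illustrative, not taken from the patent.

```python
# Time-of-flight distance estimation for an ultrasound distance sensor.
# Assumes a nominal speed of sound in soft tissue; the pulse travels to
# the object and back, hence the division by two.

SPEED_OF_SOUND_TISSUE = 1540.0  # m/s, nominal value for soft tissue (assumption)

def tof_to_distance(round_trip_time_s: float) -> float:
    """Convert a round-trip echo time (seconds) to a one-way distance (meters)."""
    return SPEED_OF_SOUND_TISSUE * round_trip_time_s / 2.0

# Example: an echo received 13 microseconds after transmission corresponds
# to roughly 1 cm between the sensor and the object.
print(tof_to_distance(13e-6))  # ~0.010 m
```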
In practice, distance sensor(s) 22 are located on a distal end of endoscope 20 relative to imaging device 21 to facilitate collision avoidance and detection by endoscope 20 with an object. In one exemplary embodiment as shown in FIG. 2, distance sensors in the form of ultrasound transducer array 42 and ultrasound transducer array 43 are positioned around a circumference and a front surface, respectively, of a distal end of an endoscope shaft 40 having an imaging device 41 on the front surface of its distal end. For this embodiment, arrays 42 and 43 provide sensing around a significant length of endoscope shaft 40. By making use of 1D or 2D ultrasound transducer arrays, steering of the ultrasound beam at an angle of +/-45 degrees to transmit and receive ultrasound signals is obtained, whereby objects positioned in the direct line of the ultrasound sensors, as well as objects located at an angle, may be detected and collision with these objects may be avoided.
In another exemplary embodiment as shown in FIG. 3, a distance sensor in the form of a single ultrasound linear element 52 encircles an imaging device 51 on a top distal end of an endoscope shaft 50. Alternatively, ultrasound linear element 52 may consist of several elements serving as a phased array for beam-forming and beam-steering.
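For illustration, the per-element firing delays that steer such a phased array follow the standard delay law t_n = n * d * sin(theta) / c. The sketch below illustrates that law under made-up assumptions (element count, pitch and steering angle); it is not a description of the patent's hardware.

```python
import math

def steering_delays(num_elements: int, pitch_m: float, angle_deg: float,
                    c: float = 1540.0) -> list:
    """Per-element firing delays (seconds) that steer a linear phased array
    to the given angle, using the delay law t_n = n * d * sin(theta) / c."""
    theta = math.radians(angle_deg)
    delays = [n * pitch_m * math.sin(theta) / c for n in range(num_elements)]
    # Shift so the earliest-firing element has zero delay (handles negative angles).
    t0 = min(delays)
    return [t - t0 for t in delays]

# Steer an assumed 8-element, 0.2 mm pitch array to +45 degrees.
print(steering_delays(8, 0.2e-3, 45.0))
```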
Referring again to FIG. 1, endoscopic robot 31 of unit 30 is broadly defined herein as any robotic device structurally configured with motorized control to maneuver endoscope 20 during a minimally invasive surgery, and robot controller 32 of unit 30 is broadly defined herein as any controller structurally configured to provide motor signals to endoscopic robot 31 for the purposes of maneuvering endoscope 20 during the minimally invasive surgery. Exemplary input device(s) 33 for robot controller 32 include, but are not limited to, a 2D/3D mouse and a joystick.
Collision avoidance/detection device 34 of unit 30 is broadly defined herein as any device structurally configured for providing a surgeon operating an endoscope or an endoscopic robot with real-time collision avoidance/detection of endoscope 20 with an object within an anatomical region of a body using a combination of imaging device 21 and distance sensors 22. In practice, collision avoidance/detection device 34 may operate independently of robot controller 32 as shown, or be internally incorporated within robot controller 32.
Flowchart 60 as shown in FIG. 4 represents a collision avoidance/detection method of the present invention as executed by collision avoidance/detection device 34. For this method, collision avoidance/detection device 34 initially executes a stage S61 for acquiring monocular endoscopic images of an object within the anatomical region of a body from imaging device 21, and a stage S62 for receiving distance measurements of endoscope 20 from the object from distance sensor(s) 22 while endoscope 20 is advanced to a target location within the anatomical region of the body by endoscopic robot 31. From the image acquisition and distance measurements, collision avoidance/detection device 34 proceeds to a stage S63 of flowchart 60 to detect the object, whereby the surgeon may manually operate endoscopic robot 31 or endoscopic robot 31 may be autonomously operated to avoid or detect any collision of endoscope 20 with the object. The detection of the object involves a 3D reconstruction of a surface of the object as viewed by endoscope 20, which provides critical information for avoiding and detecting any collision of endoscope 20 with the object including, but not limited to, a 3D shape of the object and a depth of every point on the surface of the object.
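Reduced to its control logic, the stage S61-S63 cycle amounts to: acquire an image, read the distance sensors, and act when clearance shrinks. The following is a minimal sketch of that decision step only; the 5 mm safety margin and the function interface are illustrative assumptions, not values taken from the patent.

```python
# Hedged sketch of the stage S61-S63 decision logic: halt or retract when any
# sensed distance falls below a safety margin, otherwise keep advancing.

SAFETY_MARGIN_M = 0.005  # assumed 5 mm clearance (illustrative)

def collision_action(distances_m):
    """Return the robot command for one S61-S63 cycle, given the current
    distance-sensor readings (meters), one reading per sensor."""
    if min(distances_m) < SAFETY_MARGIN_M:
        return "retract"   # too close: stop advancing and maneuver away
    return "advance"       # clearance is adequate: continue toward target

print(collision_action([0.012, 0.004]))  # -> "retract"
```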
To facilitate an understanding of flowchart 60, stages S61-S63 will now be described in more detail in the context of an arthroscopic surgical procedure 70 as shown in FIGS. 5 and 6. Specifically, FIG. 5 illustrates a patella 72, a ligament 73 and a damaged cartilage 74 of a knee 71. An irrigating instrument 75, a trimming instrument 76 and an arthroscope 77, the latter having an imaging device (not shown) and a distance sensor in the form of an ultrasound transducer array (not shown), are being used for purposes of repairing the damaged cartilage 74. Also illustrated are ultrasound transducers 78a-78d for determining a relative positioning of the ultrasound transducer array within knee 71.
FIG. 6 illustrates a control of arthroscope 77 by an endoscopic robot 31a.
Referring to FIG. 4, the image acquisition of stage S61 involves the imaging device of arthroscope 77 providing a two-dimensional image temporal sequence 80 (FIG. 6) to collision avoidance/detection device 34 as arthroscope 77 is being advanced to a target location within knee 71 by endoscopic robot 31a as controlled by robot controller 32.
Alternatively, the ultrasound transducer array of arthroscope 77 may be utilized to provide two-dimensional temporal sequence 90.
The distance measurements of stage S62 involve the ultrasound transducer array of arthroscope 77 transmitting and receiving ultrasound signals within knee 71 having a time of flight that is indicative of a distance to an object, which provides collision avoidance/detection device 34 with distance measurement signals 81 (FIG. 6). In one embodiment, distance measurement signals may have AC signal components for time-of-flight distance measurement of an object, and DC signal components for direct measurement of any pressure being exerted by the object on the membrane of the ultrasound transducer array.
The object depth estimation of stage S63 involves collision avoidance/detection device 34 using a combination of image temporal sequence 80 and distance measurement signals 81 to provide control signals 82 to robot controller 32 and/or display image data 83 to a monitor 35 as needed to enable a surgeon or endoscopic robot 31 to avoid the object or to maneuver away from the object in the case of a collision. The display of image data 83 further provides information for facilitating the surgeon in making any necessary intraoperative decisions, particularly the 3D shape of the object and the depth of each point on the surface of the object. Flowchart 110 as shown in FIG. 7 represents an exemplary embodiment of stage S63 (FIG. 4). Specifically, the detection of the object by device 34 is achieved by an implementation of a multiple stereo matching algorithm based on epipolar geometry.
First, a calibration of the imaging device is executed during a stage S111 of flowchart 110, prior to an insertion of arthroscope 77 within knee 71. In one embodiment of stage S111, a standardized checkerboard method may be used to obtain intrinsic imaging device parameters (e.g., focal point and lens distortion coefficients) in a 3x3 imaging device intrinsic matrix (K).
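A minimal sketch of such a checkerboard calibration using OpenCV's standard routines follows; the board size and image paths are illustrative assumptions.

```python
import glob
import cv2
import numpy as np

# Standard checkerboard calibration yielding the 3x3 intrinsic matrix K and
# lens distortion coefficients. Board size and image paths are assumptions.
BOARD = (9, 6)  # inner corners per row and column
objp = np.zeros((BOARD[0] * BOARD[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:BOARD[0], 0:BOARD[1]].T.reshape(-1, 2)

obj_points, img_points = [], []
for path in glob.glob("calib/*.png"):
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, BOARD)
    if found:
        obj_points.append(objp)
        img_points.append(corners)

ret, K, dist, rvecs, tvecs = cv2.calibrateCamera(
    obj_points, img_points, gray.shape[::-1], None, None)
print("intrinsic matrix K:\n", K)
```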
Second, as arthroscope 77 is being advanced to a target location within knee 71, a reconstruction of a 3D surface of an object from two or more images of the same scene taken at different time moments is executed during a stage S112 of flowchart 110. Specifically, motion of the endoscope is known from control of endoscopic robot 31, so a relative rotation (3x3 matrix R) and a translation (3x1 vector t) between the two respective imaging device positions is also known. Using a knowledge set (K,R,t), comprising both intrinsic and extrinsic imaging device parameters, image rectification is implemented to build a 3D depth map from the two images. In this process, using (K,R,t), the images are warped so that their vertical components are aligned. The process of rectification results in 3x3 warping matrices and a 4x3 disparity-to-depth mapping matrix.
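A sketch of this rectification step with OpenCV, assuming the robot-reported (R, t) between the two frames; the intrinsics, pose values and image size below are made-up placeholders, and OpenCV's disparity-to-depth matrix Q is 4x4 in its convention.

```python
import cv2
import numpy as np

# Rectification from a known (K, R, t): the two imaging device poses differ by
# the robot-reported rotation R and translation t. All numeric values are
# illustrative placeholders, not values from the patent.
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
dist = np.zeros(5)                                      # assume distortion corrected
R, _ = cv2.Rodrigues(np.array([[0.0], [0.02], [0.0]]))  # small inter-frame rotation
t = np.array([[2.0], [0.0], [0.0]])                     # lateral translation (arbitrary units)
size = (640, 480)

R1, R2, P1, P2, Q, roi1, roi2 = cv2.stereoRectify(K, dist, K, dist, size, R, t)
map1 = cv2.initUndistortRectifyMap(K, dist, R1, P1, size, cv2.CV_32FC1)
map2 = cv2.initUndistortRectifyMap(K, dist, R2, P2, size, cv2.CV_32FC1)
# rect1 = cv2.remap(frame1, *map1, cv2.INTER_LINEAR)  # warped so vertical components align
# rect2 = cv2.remap(frame2, *map2, cv2.INTER_LINEAR)
```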
Next, an optical flow is computed between the two images during stage S112, using point correspondences as known in the art. Specifically, the optical flow (u,v) at each 2D point (x,y) represents that point's movement between the two images. Since the images are rectified (i.e., warped to be parallel), v = 0. Finally, from the optical flow, the disparity at every image element is u = (x1 - x2). Re-projecting the disparity map using the 4x3 disparity-to-depth mapping matrix will result in the 3D shape of the object in front of the lens of the imaging device. FIG. 8 illustrates an exemplary result of a 3D surface reconstruction 100 from image temporal sequence 80.
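Continuing the sketch above, dense optical flow between the rectified frames yields the disparity (the horizontal component u, since v is approximately zero after rectification), which is then re-projected to 3D. The placeholder frames below stand in for two rectified endoscopic images so the sketch runs on its own.

```python
import cv2
import numpy as np

# Dense optical flow between two rectified frames gives per-pixel disparity;
# re-projection through the disparity-to-depth matrix recovers the surface.
# rect1/rect2 and Q would come from the rectification step above; a shifted
# synthetic frame and an identity Q stand in here.
rect1 = np.random.randint(0, 255, (480, 640), np.uint8)
rect2 = np.roll(rect1, 3, axis=1)      # placeholder: 3-pixel horizontal shift
Q = np.eye(4, dtype=np.float32)        # stand-in disparity-to-depth matrix

flow = cv2.calcOpticalFlowFarneback(rect1, rect2, None,
                                    0.5, 3, 15, 3, 5, 1.2, 0)
disparity = flow[..., 0].astype(np.float32)  # u component; v is ~0 when rectified
points_3d = cv2.reprojectImageTo3D(disparity, Q)
print(points_3d.shape)  # (480, 640, 3): a 3D point for every image element
```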
It is possible in this way to detect the distance between the lens and other structures. However, given immeasurable imperfections in image temporal sequence 80 and any discretization errors, a stage S113 of flowchart 110 is implemented to correct the 3D surface reconstruction as needed. The correction starts with a comparison of the depths ds_i, i = 1, ..., N, measured by N (one or more) distance sensors 22 and the depths dI_i, i = 1, ..., N, measured from the reconstructed images. These distances should be the same; however, because of measurement noise, each of the N measurement positions will have an error associated with it: e_i = |ds_i - dI_i|, i = 1, ..., N. The direct measurement using distance sensors 22 is significantly more precise than the image-based method, but the image-based method provides denser measurements. Therefore, the error set e_i is used to perform an elastic warping of the reconstructed surface to improve precision.
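One way to realize such an elastic warping is to interpolate the sparse sensor-versus-image depth residuals over the whole image and add the interpolated field to the dense depth map. The sketch below uses a thin-plate-spline interpolator for this; that choice, the sensor pixel locations and all numbers are assumptions, since the patent specifies only an elastic warping driven by the error set.

```python
import numpy as np
from scipy.interpolate import RBFInterpolator

# Hedged sketch of the stage S113 correction: at N sensed pixel locations,
# compare sensor depths ds_i with image-based depths dI_i, then warp the dense
# depth map by interpolating the sparse residuals over the image. The patent
# defines e_i = |ds_i - dI_i|; signed residuals are used here so the warp
# moves the surface in the right direction. All values are illustrative.
depth_map = np.full((480, 640), 0.020)                             # image-based depths (m)
sensor_px = np.array([[100, 120], [320, 240], [500, 400]], float)  # N = 3 sensed pixels
ds = np.array([0.0185, 0.0192, 0.0210])                            # sensor depths (m)
dI = depth_map[sensor_px[:, 1].astype(int), sensor_px[:, 0].astype(int)]

warp = RBFInterpolator(sensor_px, ds - dI, kernel="thin_plate_spline")
ys, xs = np.mgrid[0:480, 0:640]
grid = np.column_stack([xs.ravel(), ys.ravel()]).astype(float)
depth_corrected = depth_map + warp(grid).reshape(480, 640)
print(depth_corrected[240, 320])  # pulled toward the sensor reading at that pixel
```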
Although the present invention has been described with reference to exemplary aspects, features and implementations, the disclosed systems and methods are not limited to such exemplary aspects, features and/or implementations. Rather, as will be readily apparent to persons skilled in the art from the description provided herein, the disclosed systems and methods are susceptible to modifications, alterations and enhancements without departing from the spirit or scope of the present invention. Accordingly, the present invention expressly encompasses such modifications, alterations and enhancements within the scope hereof.

Claims

1. An endoscopic system (10), comprising:
an endoscope (20) for generating a plurality of monocular endoscopic images (80) of an anatomical region (71) of a body as the endoscope (20) is advanced to a target location within the anatomical region (71),
wherein the endoscope (20) includes at least one distance sensor (22) for generating measurements (81) of a distance of the endoscope (20) from an object within the monocular endoscopic images (80) as the endoscope (20) is advanced to the target location; and an endoscopic control unit (30) in communication with the endoscope (20) to receive the monocular endoscopic images (80) and the distance measurements (81),
wherein the endoscopic control unit (30) includes an endoscopic robot (31) operable to advance the endoscope (20) to the target location, and
wherein the endoscopic control unit (30) is operable to reconstruct a three-dimensional image of a surface of the object within the monocular endoscopic images (80) as a function of the distance measurements (81).
2. The endoscopic system (10) of claim 1, wherein the reconstruction of the three-dimensional image of the surface of the object includes:
building a three-dimensional depth map of the object from a temporal sequence of the monocular endoscopic images (80) of the anatomical region (71); and
correcting the three-dimensional depth map of the object relative to at least two distance measurements, each distance measurement being associated with one of the monocular endoscopic images.
3. The endoscopic system (10) of claim 2, wherein the correction of the three-dimensional image of the surface of the object includes:
generating an error set representative of a comparison of the depth map to a depth of each point of a surface of the object as indicated by the at least two distance measurements.
4. The endoscopic system (10) of claim 3, wherein the correction of the three-dimensional image of the surface of the object further includes: performing an elastic warping of the reconstruction of the three-dimensional image of the surface of the object as a function of the error set.
5. The endoscopic system (10) of claim 1, wherein the at least one distance sensor (22) is operable to provide a measurement of any pressure being exerted by the object on the at least one distance sensor (22).
6. The endoscopic system (10) of claim 1, wherein the at least one distance sensor (22) includes at least one ultrasound transducer element (43) for transmitting and receiving ultrasound signals having a time of flight that is indicative of the distance from the endoscope (20) to the object.
7. The endoscopic system (10) of claim 1, wherein the at least one distance sensor (22) includes at least one ultrasound transducer array (42) for transmitting and receiving ultrasound signals having a time of flight that is indicative of the distance from the endoscope (20) to the object.
8. The endoscopic system (10) of claim 1, wherein the at least one distance sensor (22) is a piezoelectric ceramic transducer.
9. The endoscopic system (10) of claim 1, wherein the at least one distance sensor (22) is a single crystal transducer.
10. The endoscopic system (10) of claim 1, wherein the at least one distance sensor (22) is a piezoelectric thin film micro-machined transducer.
11. The endoscopic system (10) of claim 1, wherein the at least one distance sensor (22) is built using capacitive micro-machining.
12. The endoscopic system (10) of claim 1,
wherein the endoscope (20) further includes an imaging device (51) on a top distal end of a shaft of endoscope (20); and
wherein the at least one distance sensor (22) includes an ultrasound linear element (52) encircling the imaging device (51).
13. The endoscopic system (10) of claim 1, wherein the at least one distance sensor (22) includes a plurality of sensor elements serving as a phased array for beam-forming and beam-steering.
14. An endoscopic method (60), comprising:
controlling an endoscopic robot (31) to advance an endoscope (20) to a target location within an anatomical region of a body;
generating a plurality of monocular endoscopic images (80) of the anatomical region (71) as the endoscope (20) is advanced to the target location by the endoscopic robot (31); generating measurements of a distance of the endoscope (20) from an object within the monocular endoscopic images (80) as the endoscope (20) is advanced to the target location by the endoscopic robot (31); and
reconstructing a three-dimensional image of a surface of the object within the monocular endoscopic images (80) as a function of the distance measurements.
15. The endoscopic method (60) of claim 14, wherein the reconstruction of the three-dimensional image of the surface of the object includes:
building a three-dimensional depth map of the object from a temporal sequence of the monocular endoscopic images (80) of the anatomical region (71); and
correcting the three-dimensional depth map of the object relative to at least two distance measurements, each distance measurement being associated with one of the monocular endoscopic images.
16. The endoscopic method (60) of claim 15, wherein the correction of the three-dimensional image of the surface of the object includes:
generating an error set representative of a comparison of the depth map to a depth of each point of a surface of the object as indicated by the at least two distance measurements.
17. The endoscopic method (60) of claim 16, wherein the correction of the three-dimensional image of the surface of the object further includes:
performing an elastic warping of the reconstruction of the three-dimensional image of the surface of the object as a function of the error set.
18. The endoscopic method (60) of claim 14, further comprising:
generating measurements of a pressure being exerted by the object on the endoscope (20).
19. An endoscopic control unit (30), comprising:
an endoscopic robot (31) for advancing an endoscope (20) to a target location within an anatomical region (71) of a body; and
a collision avoidance/detection unit (34) operable, as the endoscope (20) is advanced to the target location by the endoscopic robot (31), to receive a plurality of monocular endoscopic images (80) of the anatomical region (71) and to receive measurements (81) of a distance of the endoscope (20) from an object within the monocular endoscopic images (80), wherein the collision avoidance/detection unit (34) is further operable to reconstruct a three-dimensional image of a surface of the object within the monocular endoscopic images (80) as a function of the distance measurements (81).
20. The endoscopic control unit (30) of claim 19, wherein the reconstruction of the three-dimensional image of the surface of the object includes:
building a three-dimensional depth map of the object from a temporal sequence of the monocular endoscopic images (80) of the anatomical region (71); and
correcting the three-dimensional depth map of the object relative to at least two distance measurements (81), each distance measurement (81) being associated with one of the monocular endoscopic images.
PCT/IB2010/054481 2009-11-04 2010-10-04 Collision avoidance and detection using distance sensors WO2011055245A1 (en)

Priority Applications (4)

Application Number Priority Date Filing Date Title
EP10779336A EP2496128A1 (en) 2009-11-04 2010-10-04 Collision avoidance and detection using distance sensors
JP2012535970A JP2013509902A (en) 2009-11-04 2010-10-04 Collision avoidance and detection using distance sensors
CN2010800498322A CN102595998A (en) 2009-11-04 2010-10-04 Collision avoidance and detection using distance sensors
US13/502,412 US20120209069A1 (en) 2009-11-04 2010-10-04 Collision avoidance and detection using distance sensors

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US25785709P 2009-11-04 2009-11-04
US61/257,857 2009-11-04

Publications (1)

Publication Number Publication Date
WO2011055245A1 true WO2011055245A1 (en) 2011-05-12

Family

ID=43355722

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2010/054481 WO2011055245A1 (en) 2009-11-04 2010-10-04 Collision avoidance and detection using distance sensors

Country Status (6)

Country Link
US (1) US20120209069A1 (en)
EP (1) EP2496128A1 (en)
JP (1) JP2013509902A (en)
CN (1) CN102595998A (en)
TW (1) TW201124106A (en)
WO (1) WO2011055245A1 (en)

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2833186A1 (en) * 2013-07-30 2015-02-04 Olympus Corporation Blade inspection apparatus and blade inspection method
JP2018522622A (en) * 2015-06-05 2018-08-16 シーメンス アクチエンゲゼルシヤフトSiemens Aktiengesellschaft Method and system for simultaneous scene analysis and model fusion for endoscopic and laparoscopic navigation
US10366531B2 (en) * 2017-10-24 2019-07-30 Lowe's Companies, Inc. Robot motion planning for photogrammetry
US10424110B2 (en) 2017-10-24 2019-09-24 Lowe's Companies, Inc. Generation of 3D models using stochastic shape distribution
CN110811527A (en) * 2019-12-05 2020-02-21 中山大学附属第一医院 Endoscope with shape estimation and disease online auxiliary diagnosis functions
CN110811491A (en) * 2019-12-05 2020-02-21 中山大学附属第一医院 Online disease identification endoscope with three-dimensional reconstruction function

Families Citing this family (52)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US8672837B2 (en) 2010-06-24 2014-03-18 Hansen Medical, Inc. Methods and devices for controlling a shapeable medical device
JP5988786B2 (en) * 2012-09-07 2016-09-07 オリンパス株式会社 Ultrasonic unit and ultrasonic endoscope
GB2505926A (en) * 2012-09-14 2014-03-19 Sony Corp Display of Depth Information Within a Scene
KR102087595B1 (en) * 2013-02-28 2020-03-12 삼성전자주식회사 Endoscope system and control method thereof
US9057600B2 (en) 2013-03-13 2015-06-16 Hansen Medical, Inc. Reducing incremental measurement sensor error
US9014851B2 (en) 2013-03-15 2015-04-21 Hansen Medical, Inc. Systems and methods for tracking robotically controlled medical instruments
US9271663B2 (en) 2013-03-15 2016-03-01 Hansen Medical, Inc. Flexible instrument localization from both remote and elongation sensors
US9629595B2 (en) 2013-03-15 2017-04-25 Hansen Medical, Inc. Systems and methods for localizing, tracking and/or controlling medical instruments
US11020016B2 (en) 2013-05-30 2021-06-01 Auris Health, Inc. System and method for displaying anatomy and devices on a movable display
US9452531B2 (en) 2014-02-04 2016-09-27 Microsoft Technology Licensing, Llc Controlling a robot in the presence of a moving object
JP6358811B2 (en) * 2014-02-13 2018-07-18 オリンパス株式会社 Manipulator and manipulator system
US10555788B2 (en) 2014-03-28 2020-02-11 Intuitive Surgical Operations, Inc. Surgical system with haptic feedback based upon quantitative three-dimensional imaging
US11266465B2 (en) 2014-03-28 2022-03-08 Intuitive Surgical Operations, Inc. Quantitative three-dimensional visualization of instruments in a field of view
KR102405687B1 (en) * 2014-03-28 2022-06-07 인튜어티브 서지컬 오퍼레이션즈 인코포레이티드 Quantitative three-dimensional imaging and printing of surgical implants
DE102014210619A1 (en) * 2014-06-04 2015-12-17 Olympus Winter & Ibe Gmbh Endoscope with non-contact distance measurement
CN105881535A (en) 2015-02-13 2016-08-24 鸿富锦精密工业(深圳)有限公司 Robot capable of dancing with musical tempo
CN107624193A (en) * 2015-04-29 2018-01-23 西门子公司 The method and system of semantic segmentation in laparoscope and endoscope 2D/2.5D view data
JP6177488B2 (en) 2015-07-23 2017-08-09 オリンパス株式会社 Manipulator and medical system
US10195740B2 (en) 2015-09-10 2019-02-05 X Development Llc Using object observations of mobile robots to generate a spatio-temporal object inventory, and using the inventory to determine monitoring parameters for the mobile robots
US9727963B2 (en) 2015-09-18 2017-08-08 Auris Surgical Robotics, Inc. Navigation of tubular networks
US10143526B2 (en) 2015-11-30 2018-12-04 Auris Health, Inc. Robot-assisted driving systems and methods
WO2017103984A1 (en) * 2015-12-15 2017-06-22 オリンパス株式会社 Medical manipulator system and operation method therefor
US10244926B2 (en) 2016-12-28 2019-04-02 Auris Health, Inc. Detecting endolumenal buckling of flexible instruments
CN108990412B (en) 2017-03-31 2022-03-22 奥瑞斯健康公司 Robot system for cavity network navigation compensating physiological noise
US10022192B1 (en) 2017-06-23 2018-07-17 Auris Health, Inc. Automatically-initialized robotic systems for navigation of luminal networks
KR102578978B1 (en) 2017-06-28 2023-09-19 아우리스 헬스, 인코포레이티드 Electromagnetic distortion detection
WO2019005699A1 (en) 2017-06-28 2019-01-03 Auris Health, Inc. Electromagnetic field generator alignment
WO2019050829A1 (en) 2017-09-05 2019-03-14 Covidien Lp Collision handling algorithms for robotic surgical systems
US11058493B2 (en) 2017-10-13 2021-07-13 Auris Health, Inc. Robotic system configured for navigation path tracing
US10555778B2 (en) 2017-10-13 2020-02-11 Auris Health, Inc. Image-based branch detection and mapping for navigation
CN107811710B (en) * 2017-10-31 2019-09-17 微创(上海)医疗机器人有限公司 Operation aided positioning system
EP3684562A4 (en) 2017-12-14 2021-06-30 Auris Health, Inc. System and method for estimating instrument location
WO2019125964A1 (en) 2017-12-18 2019-06-27 Auris Health, Inc. Methods and systems for instrument tracking and navigation within luminal networks
JP7225259B2 (en) 2018-03-28 2023-02-20 オーリス ヘルス インコーポレイテッド Systems and methods for indicating probable location of instruments
US10524866B2 (en) 2018-03-28 2020-01-07 Auris Health, Inc. Systems and methods for registration of location sensors
WO2019231895A1 (en) 2018-05-30 2019-12-05 Auris Health, Inc. Systems and methods for location sensor-based branch prediction
CN112236083A (en) 2018-05-31 2021-01-15 奥瑞斯健康公司 Robotic system and method for navigating a luminal network detecting physiological noise
EP3801189A4 (en) 2018-05-31 2022-02-23 Auris Health, Inc. Path-based navigation of tubular networks
MX2020012904A (en) 2018-05-31 2021-02-26 Auris Health Inc Image-based airway analysis and mapping.
CN108836406A (en) * 2018-06-01 2018-11-20 Southern Medical University Single laparoscopic surgical system and method based on speech recognition
WO2020070883A1 (en) * 2018-10-05 2020-04-09 Olympus Corporation Endoscopic system
US11801113B2 (en) * 2018-12-13 2023-10-31 Covidien Lp Thoracic imaging, distance measuring, and notification system and method
CN110082359A (en) * 2019-05-10 2019-08-02 Baoshan Iron & Steel Co., Ltd. Positioning structure mechanical device of an image-detection-based steel tube thread detection system
WO2021038495A1 (en) 2019-08-30 2021-03-04 Auris Health, Inc. Instrument image reliability systems and methods
KR20220058569A (en) 2019-08-30 2022-05-09 Auris Health, Inc. System and method for weight-based registration of position sensors
WO2021044297A1 (en) 2019-09-03 2021-03-11 Auris Health, Inc. Electromagnetic distortion detection and compensation
WO2021059100A1 (en) * 2019-09-26 2021-04-01 Auris Health, Inc. Systems and methods for collision avoidance using object models
US11298195B2 (en) 2019-12-31 2022-04-12 Auris Health, Inc. Anatomical feature identification and targeting
CN114929148A (en) 2019-12-31 2022-08-19 Auris Health, Inc. Alignment interface for percutaneous access
WO2021137109A1 (en) 2019-12-31 2021-07-08 Auris Health, Inc. Alignment techniques for percutaneous access
KR20230079417A (en) * 2020-09-30 2023-06-07 Auris Health, Inc. Collision avoidance of surgical robots based on detection of contact information
CN113838052B (en) * 2021-11-25 2022-02-18 Jixian Artificial Intelligence Co., Ltd. Collision warning device, electronic apparatus, storage medium, and endoscopic video system

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE1766904B1 (en) * 1967-08-08 1971-05-19 Olympus Optical Co Endoscope with a device for determining the object distance
US20060142657A1 (en) * 2002-03-06 2006-06-29 Mako Surgical Corporation Haptic guidance system and method
DE102006017003A1 (en) * 2006-04-11 2007-10-18 Friedrich-Alexander-Universität Erlangen-Nürnberg Endoscope for depth data acquisition in e.g. medical area, has modulation unit controlling light source based on modulation data so that source transmits modulated light signal and evaluation unit evaluating signal to estimate depth data
EP2108943A2 (en) * 2008-04-11 2009-10-14 Storz Endoskop Produktions GmbH Device and method for fluorescence imaging

Family Cites Families (17)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR1532340A (en) * 1967-04-06 1968-07-12 Comp Generale Electricite Device for measuring the width of a cavity in the circulatory system
JPS5745835A (en) * 1980-09-02 1982-03-16 Olympus Optical Co Endoscope apparatus
US5113869A (en) * 1990-08-21 1992-05-19 Telectronics Pacing Systems, Inc. Implantable ambulatory electrocardiogram monitor
US5417210A (en) * 1992-05-27 1995-05-23 International Business Machines Corporation System and method for augmentation of endoscopic surgery
DE19804797A1 (en) * 1998-02-07 1999-08-12 Storz Karl Gmbh & Co Device for endoscopic fluorescence diagnosis of tissue
EP2281503B1 (en) * 1999-09-24 2015-04-29 National Research Council of Canada Method for performing intra-operative angiography
JP3939652B2 (en) * 2000-11-15 2007-07-04 Koninklijke Philips Electronics N.V. Multidimensional ultrasonic transducer array
US6773402B2 (en) * 2001-07-10 2004-08-10 Biosense, Inc. Location sensing with real-time ultrasound imaging
WO2005089065A2 (en) * 2004-03-23 2005-09-29 Dune Medical Devices Ltd. Clean margin assessment tool
EP1489972B2 (en) * 2002-03-15 2013-04-10 Bjorn A. J. Angelsen Multiple scan-plane ultrasound imaging of objects
US20040199052A1 (en) * 2003-04-01 2004-10-07 Scimed Life Systems, Inc. Endoscopic imaging system
DE102004008164B3 (en) * 2004-02-11 2005-10-13 Karl Storz Gmbh & Co. Kg Method and device for creating at least a section of a virtual 3D model of a body interior
CA2826925C (en) * 2005-02-22 2017-01-24 Mako Surgical Corp. Haptic guidance system and method
US20060241438A1 (en) * 2005-03-03 2006-10-26 Chung-Yuo Wu Method and related system for measuring intracranial pressure
US7305883B2 (en) * 2005-10-05 2007-12-11 The Board Of Trustees Of The Leland Stanford Junior University Chemical micromachined microsensors
US20070167793A1 (en) * 2005-12-14 2007-07-19 Ep Medsystems, Inc. Method and system for enhancing spectral doppler presentation
FR2923372B1 (en) * 2007-11-08 2010-10-29 Theraclion DEVICE AND METHOD FOR NON-INVASIVE REPORTING OF A STRUCTURE SUCH AS A NERVE.

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
EP2833186A1 (en) * 2013-07-30 2015-02-04 Olympus Corporation Blade inspection apparatus and blade inspection method
US9316564B2 (en) 2013-07-30 2016-04-19 Olympus Corporation Blade inspection apparatus and blade inspection method
JP2018522622A (en) * 2015-06-05 2018-08-16 Siemens Aktiengesellschaft Method and system for simultaneous scene analysis and model fusion for endoscopic and laparoscopic navigation
US10366531B2 (en) * 2017-10-24 2019-07-30 Lowe's Companies, Inc. Robot motion planning for photogrammetry
US10424110B2 (en) 2017-10-24 2019-09-24 Lowe's Companies, Inc. Generation of 3D models using stochastic shape distribution
CN110811527A (en) * 2019-12-05 2020-02-21 First Affiliated Hospital of Sun Yat-sen University Endoscope with shape estimation and online assisted disease diagnosis functions
CN110811491A (en) * 2019-12-05 2020-02-21 First Affiliated Hospital of Sun Yat-sen University Online disease identification endoscope with three-dimensional reconstruction function

Also Published As

Publication number Publication date
JP2013509902A (en) 2013-03-21
US20120209069A1 (en) 2012-08-16
TW201124106A (en) 2011-07-16
EP2496128A1 (en) 2012-09-12
CN102595998A (en) 2012-07-18

Similar Documents

Publication Publication Date Title
US20120209069A1 (en) Collision avoidance and detection using distance sensors
AU2018380139B2 (en) Systems and methods to correct for uncommanded instrument roll
US11510736B2 (en) System and method for estimating instrument location
US11800970B2 (en) Computerized tomography (CT) image correction using position and direction (P and D) tracking assisted optical visualization
US20180206791A1 (en) Medical imaging apparatus and method
WO2018159338A1 (en) Medical support arm system and control device
US10945796B2 (en) Robotic control of surgical instrument visibility
EP3359012B1 (en) A laparoscopic tool system for minimally invasive surgery
US20160213436A1 (en) Medical system and method of controlling medical treatment tools
WO2017014303A1 (en) Medical system and operation method therefor
JP6334714B2 (en) Control unit or robot guide system for continuous image integration for robotic surgery
WO2018088105A1 (en) Medical support arm and medical system
Edgcumbe et al. Calibration and stereo tracking of a laparoscopic ultrasound transducer for augmented reality in surgery
Tamadazte et al. Weakly calibrated stereoscopic visual servoing for laser steering: Application to phonomicrosurgery
US20220061927A1 (en) Robotically controllable field generators for detecting distortions
JP6150968B1 (en) Endoscope system
WO2023276242A1 (en) Medical observation system, information processing device, and information processing method
WO2022201933A1 (en) Intravital observation system, observation system, intravital observation method, and intravital observation device
WO2022219878A1 (en) Medical observation system, medical image processing method, and information processing device
CN116456925A (en) Robot type controllable field generator

Legal Events

Code Title Description
WWE Wipo information: entry into national phase. Ref document number: 201080049832.2; Country of ref document: CN
121 Ep: the EPO has been informed by WIPO that EP was designated in this application. Ref document number: 10779336; Country of ref document: EP; Kind code of ref document: A1
WWE Wipo information: entry into national phase. Ref document number: 2010779336; Country of ref document: EP
WWE Wipo information: entry into national phase. Ref document number: 13502412; Country of ref document: US
WWE Wipo information: entry into national phase. Ref document number: 2012535970; Country of ref document: JP
NENP Non-entry into the national phase. Ref country code: DE