WO2019046411A1 - Structured light projection from an optical fiber - Google Patents

Structured light projection from an optical fiber

Info

Publication number
WO2019046411A1
Authority
WO
WIPO (PCT)
Prior art keywords
light
optical fiber
structured
pattern
further including
Prior art date
Application number
PCT/US2018/048522
Other languages
French (fr)
Inventor
Ian E. Mcdowall
Brian D. Hoffman
Theodore W. Rogers
Original Assignee
Intuitive Surgical Operations, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intuitive Surgical Operations, Inc. filed Critical Intuitive Surgical Operations, Inc.
Publication of WO2019046411A1 publication Critical patent/WO2019046411A1/en

Classifications

    All classifications fall under A (Human Necessities), A61 (Medical or Veterinary Science; Hygiene), A61B (Diagnosis; Surgery; Identification):

    • A61B 1/00: Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; illuminating arrangements therefor
    • A61B 1/00163: Optical arrangements
    • A61B 1/00193: Optical arrangements adapted for stereoscopic vision
    • A61B 1/06: Endoscopes with illuminating arrangements
    • A61B 1/0638: Illuminating arrangements providing two or more wavelengths
    • A61B 1/0655: Control for illuminating arrangements
    • A61B 1/0661: Endoscope light sources
    • A61B 1/0669: Endoscope light sources at the proximal end of an endoscope
    • A61B 1/0684: Endoscope light sources using light emitting diodes [LED]
    • A61B 1/07: Illuminating arrangements using light-conductive means, e.g. optical fibres
    • A61B 34/30: Surgical robots
    • A61B 34/37: Master-slave robots
    • A61B 90/30: Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B 2090/306: Devices for illuminating a surgical field using optical fibres
    • A61B 90/36: Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37: Surgical systems with images on a monitor during operation
    • A61B 2090/371: Surgical systems with images on a monitor during operation with simultaneous use of two cameras

Definitions

  • instrument carriage 75 does not house teleoperated actuators. Teleoperated actuators that enable the variety of movements of the end effector of the surgical instrument 26 are housed in a location remote from the instrument carriage 75, e.g., elsewhere on patient-side cart 54.
  • a cable-based force transmission mechanism or the like is used to transfer the motions of each of the remotely located teleoperated actuators to a corresponding instrument-interfacing actuator output located on instrument carriage 75.
  • the surgical instrument 26 is mechanically coupled to a first actuator, which controls a first motion of the surgical instrument such as longitudinal (z-axis) rotation.
  • the surgical instrument 26 is mechanically coupled to a second actuator, which controls a second motion of the surgical instrument such as two-dimensional (x, y) motion.
  • the surgical instrument 26 is mechanically coupled to a third actuator, which controls a third motion of the surgical instrument such as opening and closing of jaws of an end effector, for example.
  • Figure 6 is an illustrative simplified block diagram showing an example positioning of mechanical support arms 72A-72C of the teleoperation surgery system 10 during a surgical procedure in accordance with some embodiments.
  • the patient-side system 54 includes at least three mechanical support arms 72A-72C.
  • each of the mechanical support arms 72A-72C includes rotatably mounted first, second and third segments 72-1, 72-2 and 72-3.
  • a center-located mechanical support arm 72 may support an endoscopic camera 28 suitable for capture of images within a field of view of the camera.
  • the mechanical support arms 72 to the left and right of center may support surgical instruments 26A and 26B, respectively, which may manipulate anatomical tissue.
  • the support arm segments are pre-positioned to support the endoscope and instruments in a precise position and orientation for robot-assisted manipulation by a surgeon to perform a medical procedure.
  • a user or operator O performs a surgical procedure on patient P by manipulating control input devices 36, such as hand grips and foot pedals at a master control console 16.
  • the operator can view video frames of images of a surgical site inside a patient's body through a stereo display viewer 31.
  • a computer processor 58 of the console 16 directs movement of teleoperationally controlled instruments 26, 26A-26B and 28 via control lines 159, effecting movement of the instruments using a patient-side system 24 (also referred to as a patient-side cart).
  • Figure 7 is a side view of a surgical instrument 26, which includes an elongated tubular shaft 610 having a centerline longitudinal axis 611, a distal (first) portion 650 for insertion into a patient's body cavity and proximal (second) portion 656 coupled adjacent a control mechanism 640.
  • the surgical instrument 26 is used to carry out surgical or diagnostic procedures.
  • the distal portion 650 of the surgical instrument 26 can provide any of a variety of end effectors 654, such as the forceps shown, a needle driver, a cautery device, a cutting tool, an imaging device (e.g., an endoscope or ultrasound probe), or the like.
  • the surgical end effector 654 can include a functional mechanical degree of freedom, such as jaws that open or close, or a knife that translates along a path.
  • the end effector 654 is coupled to the elongate tube 610 by a wrist 652 that allows the end effector to be oriented relative to the elongate tube centerline axis 611.
  • the control mechanism controls movement of the overall instrument and the end effector at its distal portion.
  • FIG 8 is an illustrative drawing of the endoscope 28 of Figure 1 shown positioned via a cannula 27, to penetrate body tissue 130 to provide visual access to a surgical scene that includes a tissue object 120 to be viewed.
  • the endoscope 28 includes an elongated portion 202, which includes a distal (first) end portion 204, a proximal (second) end portion 206, and a tip portion 208 at the first end portion 204.
  • the elongated portion 202 is dimensioned to be inserted into a human body cavity.
  • the elongated portion 202 has a length sufficient to position the tip portion 208 close enough to an object to be viewed within the body cavity that the object can be imaged by the stereo image capture devices 913, 914, described more fully below.
  • the second end portion 206 is disposed outside the body cavity.
  • FIG. 9 is an illustrative schematic three-dimensional view of a stereographic imaging system 900 in accordance with some embodiments.
  • the stereographic imaging system includes a pair of cameras 902, 904 to capture right and left images of the scene.
  • a non-zero spatial distance between the cameras, often referred to as the basis or interpupillary distance (IPD), allows observation of the scene from two distinct positions.
  • a matching process uses a right image captured by the right camera 902 and a left image captured by the left camera 904 to compute a distance from a distal portion of the endoscope 28, based upon the basis and the positions of corresponding reflected light pattern features in a left image of the scene captured by the left camera and in a right image of the scene captured by the right camera.
  • This method of triangulation relies on knowing the basis and matching features in the left and right eye images. An inability to precisely match features in the two images leads to an area of the image where the depth is not strictly deterministic; in surgical applications, it is desirable to resolve that ambiguity deterministically. More precise a priori knowledge of the exact basis distance also allows more exact distance determination in physical units such as mm. This distance may be calibrated at the time the stereo endoscope is manufactured and, as it will vary a small amount from unit to unit, that calibration data may be stored in a storage memory in the cloud associated with a unique ID of the device, in a storage memory on the device itself, or in a table in a storage memory on a system using the device.
  • tissue surface features captured in both camera images may be used to calculate the distances from the camera to the features and, with enough features, a three-dimensional depth map of the tissue surface may be computed for the scene, as sketched below.
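
As a rough illustration of the triangulation described in the preceding bullets, the following minimal sketch computes the depth of a single matched feature from its horizontal positions in rectified left and right images. The function name and the example calibration numbers (focal length in pixels, basis in mm) are hypothetical, not values from the patent.

```python
# Minimal stereo triangulation sketch (illustrative; assumed names and values).
def feature_depth_mm(x_left_px: float, x_right_px: float,
                     focal_length_px: float, basis_mm: float) -> float:
    """Depth z = f * B / d of a feature seen at x_left_px and x_right_px
    in rectified left/right images, where d is the disparity."""
    disparity_px = x_left_px - x_right_px  # positive for points in front of the cameras
    if disparity_px <= 0:
        raise ValueError("non-positive disparity: bad match or point at infinity")
    return focal_length_px * basis_mm / disparity_px

# Example with assumed values: a light spot at x = 412 px (left) and 380 px
# (right), f = 900 px, basis B = 4.2 mm gives z = 900 * 4.2 / 32 ≈ 118 mm.
```
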
  • U.S. Patent No. 9,526,587, which is expressly incorporated herein in its entirety by this reference, describes stereographic imaging system feature-matching processes that may be used to match structured light features reflected from a tissue surface.
  • the stereographic imaging system 900 includes a stereo endoscope 28 that includes first and second light illumination guides 930, 932 (only one visible) to transmit illumination light 934.1, 932.1 (only one visible) produced by one or both of first and second light sources 962, 964, to and from an object (e.g., tissue object 120) to be viewed, image capture sensors 913, 914, stereo image processing components 920, 922, and stereo display components 924, 926.
  • the stereo endoscope 28 includes a distal (first) end portion 204 suited for insertion to a patient's body cavity to view a surgical site and a proximal (second) end portion 206 to be disposed outside the body cavity.
  • the image capture sensors, which in some embodiments include charge coupled devices (CCDs) 913 and 914, operate together with optical lenses 916 and 918 to capture image information.
  • Image converter circuits 920, 922 which may include an appropriate electronically driven system such as a camera control unit (CCU), transform optical information captured by the image capture sensors 913, 914 to a digital format.
  • An electronic processor (not shown) is in operative communication with the image converter circuits 920, 922.
  • the system 900 further includes two visual image display units, one for a left eye and the other for a right eye, schematically indicated at 924, 926, each of which is operatively connected to one of the image converter circuits 920, 922 as indicated by lines 928, 930 so as to convert digital information received from the image converter circuits 920, 922 into corresponding visual images that can be viewed by a user such as a surgeon.
  • the display units may include an appropriate visual screening apparatus such as a liquid crystal display, an OLED, or the like.
  • An assembly of reflecting surfaces or a reflective train, e.g., comprising mirrors, or the like, is generally indicated by reference numeral 932.
  • the assembly 932 conveys the images from the display units 924, 926 to a viewer at 934. It will be appreciated that the direction of view at 934 is normal to the page, a right eye image being indicated at 934.1, and a left eye image being indicated at 934.2 for example.
  • the stereo endoscope 28 includes first and second optical lenses 936, 938.
  • the optical lenses 936, 938 have viewing axes as indicated by lines 940, 942.
  • the lenses 936, 938 may be positioned so that the axes 940, 942 intersect at 944. If an object, such as an anatomical tissue surface 120, is viewed at a working distance generally corresponding to the intersection 944, namely on a plane at Al, the object is viewed generally at an optically correct working distance. The farther the object is removed from the intersection 944, the more optically incorrect the object is viewed.
  • a viewing area, or the field of view, of each lens 936, 938 can be represented by a section through an imaginary conical volume indicated by reference numerals 936.1, 938.1.
  • the object is viewed at an optically correct working distance. It will be appreciated that the axes 940, 942 are positioned at the centers of the oval shaped area 960 in an optically aligned system.
  • images of a tissue object 120 viewed at plane A are passed through the lenses 936, 938, along first (left) and second (right) camera reflected light guides 944, 946 in the endoscope 28 as indicated by arrows 944.1, 946.1, and are then magnified through lenses 916, 918 and projected onto optically sensitive surfaces of, e.g., the image capture sensors 913, 914, at a proximal end of the endoscope 28.
  • the image capture sensors are disposed at the distal end portion of the endoscope.
  • the light sources 962, 964 may provide white light, color filtered light or light at some selected wavelength.
  • the light source may include one or more laser or light emitting diode (LED) light sources for example.
  • the laser may provide coherent light at a selected wavelength, for example.
  • Alternate light sources may be provided that may be selectively coupled to provide different light wavelengths during different portions of a surgical procedure, for example.
  • Each illumination light guide may include a bundle of individual optical fibers, which may include thousands of optical fibers each having a diameter in a range of 30-70 microns, to propagate light between the light source and the end of the endoscope disposed within a body cavity near the tissue object 120 viewed at A1.
  • the illumination light guides 930, 932 may be incorporated into the endoscope 28 with right and left camera reflected light guides 944, 946 disposed between them.
  • the color of the light used to illuminate the tissue object 120 viewed at A is determined by the wavelength of the light supplied into the optical fibers of the illumination light guides.
  • Figure 10 is an illustrative drawing showing certain details of the first end portion 204 of a stereo endoscope 28 in accordance with some embodiments.
  • Respective first and second illumination light guide exit windows 1012, 1014 are disposed on opposite sides of (e.g., above and below) a camera window 1016 to provide uniform illumination of a surgical scene 1018 that includes a tissue surface 120.
  • illumination light 930, 930.1 to illuminate the tissue object 120 may be transmitted from the first light source 962 via first and second illumination light guides 930, 932 disposed behind the first and second exit windows 1012, 1014.
  • First and second (e.g., left and right) image capture sensors 913, 914 may be disposed behind the camera window 1016.
  • the first illumination light guide 930 extends between the first light exit window 1012 at the distal (first) end portion 204 of the endoscope 28 and the first light source 962 at the proximal (second) end portion 206 of the endoscope 28.
  • the second illumination light guide 932 extends between the second light exit window 1014 at the distal (first) end portion 204 and the first light source 962, which illuminates the surgical scene for the camera(s) and may be located at the proximal (second) end portion of the endoscope 28.
  • Each illumination light guide includes a multiplicity of individual optical fibers to transmit light from the first light source 962 to the first and second light guide exit windows 1012, 1014.
  • locating the first and second light sources 962, 964 at the second end portion 206 of the endoscope 28, which is physically spaced apart from the first end portion 204 of the endoscope 28, isolates the tissue from the bulk and potentially harmful heat generated by the first and second light sources 962, 964.
  • light source 962 is located in an enclosure a meter or several meters away from where the endoscope is used and the light is delivered to the fiber bundles in the endoscope through a light guide cable which serves this purpose.
  • Light source 964 is used for creating a pattern of illumination; it is typically a laser source and is delivered to the endoscope tip through a single optical fiber. As such, it is advantageous to locate the second light source 964 in a proximal portion of the endoscope. A more precise alignment is required between the second light source 964 and the single optical fiber used to deliver the pattern of illumination.
  • the light guide alignment required for delivering white light illumination from the first light source 962 does not need to be as precise.
  • a structured light forming optical fiber 1080 is disposed within the first illumination light guide 930 and has an exit surface disposed at the first illumination light guide exit window 1012.
  • the structured light forming optical fiber 1080 includes a distal (first) end including an exit surface having a pattern forming light-transmissive structure thereon and a proximal (second) end portion disposed to receive light from the second light source 964.
  • second light source 964 may be packaged together with the delivery fiber attached to the laser diode module and thus may be configured as a single element.
  • the second light source 964 is typically a coherent light source such as a laser or laser diode.
  • the pattern forming structure that the light from 964 passes through at the exit surface of the optical fiber creates a non-homogeneous pattern of light to illuminate clinically relevant tissue surfaces 120, for example.
  • This pattern creates a surface texture which enables pattern matching between the left and right eye cameras over portions of the tissue 120 which do not inherently contain sufficient features to perform the left-right eye correlations.
  • structured light refers to a known pattern, such as a pattern of light spots, a pattern of horizontal bars of light, or a curvilinear pattern of light, such as a sinusoidal pattern, that may be projected onto a tissue surface.
  • the image forming pattern splits the light into a structured light pattern that projects onto a clinically relevant surface (e.g., a tissue or instrument surface) at a surgical work zone within the endoscope field of view.
  • Figure 10 shows an illustrative structured light pattern that includes a multiplicity of light spots 1070 projected from the structured light forming optical fiber 1080 onto the tissue surface 120.
  • the light spots 1070 incident on the tissue surface 120 are visible to the stereographic imaging system 900.
  • the pattern may extend over a portion of, all of, or overfill the field of view of the cameras, depending on the implementation (illustrative pattern examples are sketched below).
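
Purely as an illustration of the pattern types named above (spots, bars, sinusoidal fringes), the following sketch synthesizes such patterns as mask images with NumPy. The resolutions and periods are arbitrary assumptions; in the patent the pattern is formed optically by a diffractive or mask element, not computed.

```python
# Illustrative structured light pattern masks (assumed sizes and periods).
import numpy as np

H, W = 480, 640  # arbitrary mask resolution

# A grid of light spots every 32 pixels.
spots = np.zeros((H, W), dtype=np.uint8)
spots[16::32, 16::32] = 255

# Horizontal bars: 8 pixels on, 8 pixels off.
bars = np.where((np.arange(H) // 8) % 2 == 0, 255, 0).astype(np.uint8)
bars = np.repeat(bars[:, None], W, axis=1)

# A sinusoidal fringe with a 24-pixel vertical period.
y = np.arange(H, dtype=np.float64)[:, None]
fringe = (127.5 * (1.0 + np.sin(2.0 * np.pi * y / 24.0))).astype(np.uint8)
fringe = np.repeat(fringe, W, axis=1)
```
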
  • the structured light forming optical fiber 1080 is disposed equidistant from the centers of the left camera side and right camera side of the camera window 1016 for efficiency of stereographic processing, as this reduces (but does not eliminate) problems due to shadowing of the pattern.
  • Positioning the structured light forming optical fiber 1080 between the two cameras thus reduces the probability of one camera having its view of a tissue surface feature occluded by a different tissue surface feature, such as a ridge of tissue, while the other camera can view it.
  • this type of shadowing is generally not that problematic.
  • Stereo depth reconstruction processing is performed upon the images, which contain both the structured light pattern reflected from the tissue surface and the light from the 'normal' illuminator, captured by the first and second cameras 902, 904, to determine distances of tissue surface portions from the distal end portion 204 of the endoscope 28 and to thereby determine a tissue surface depth map for the tissue surface 120.
  • the details of the capture approach are implementation-dependent.
  • the camera may have the ability to image RGB and IR where the RGB pixels do not see the IR light and the IR pixels do not see the RGB illumination.
  • the pattern laser may be implemented in the IR and used concurrently with the RGB imaging modes.
  • if the cameras see only RGB, the pattern may be projected in the blue, for example, only when the depth reconstruction from the stereo matching of the RGB images fails or when additional confidence is needed in the depth estimation, for example if a tool is being computer controlled to follow the tissue surface.
  • if the pattern is chosen carefully such that it provides unique patterns over the area illuminated, then it also affords the ability for other tools to use the pattern for relative registration in the coordinate frame of the endoscope and tissue.
  • a robotically controlled tool with a built-in camera can use vision algorithms to locate the section of the tissue and pattern being illuminated by the pattern fiber and then follow the surface reconstructed from stereo matching; knowing which section of the pattern it can see provides orientation and position for the tool. A sketch of such a stereo reconstruction follows.
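
The following is a minimal sketch of dense stereo depth reconstruction of the kind described above, assuming rectified left/right images in which the projected pattern supplies texture. OpenCV's semi-global block matcher is used here as one possible matcher; the patent does not prescribe a particular matching algorithm, and the parameter values are placeholders.

```python
# Dense depth map from rectified stereo images (illustrative; assumed values).
import cv2
import numpy as np

def tissue_depth_map_mm(left_gray: np.ndarray, right_gray: np.ndarray,
                        focal_length_px: float, basis_mm: float) -> np.ndarray:
    matcher = cv2.StereoSGBM_create(
        minDisparity=0,
        numDisparities=64,  # disparity search range; must be a multiple of 16
        blockSize=5,
    )
    # StereoSGBM returns fixed-point disparities scaled by 16.
    disparity_px = matcher.compute(left_gray, right_gray).astype(np.float32) / 16.0
    depth_mm = np.full(disparity_px.shape, np.nan, dtype=np.float32)
    valid = disparity_px > 0  # pixels that failed to match stay NaN
    depth_mm[valid] = focal_length_px * basis_mm / disparity_px[valid]
    return depth_mm
```
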
  • a tissue surface depth map may be generated from a stereographic camera viewpoint, based upon scene geometry.
  • a depth map may provide the surgical system with tissue contour information about a surgical scene which may not be readily visible to a surgeon viewing the scene through a stereoscopic viewer and camera system. For example, having an accurate depth map for the scene enables the surgical system to determine placement of marker images or alphanumeric character images in a scene so that they appear to be lying on the tissue surface.
  • the system also may use a depth map to more effectively manage tasks such as automated tool exchange, such that a newly introduced surgical tool does not intersect tissue without surgeon direction or control.
  • the surface geometry may be placed in a coordinate system of the surgical system so that a coordinate transformation between additional surgical tools and robot arms may be calculated with respect to the stereo endoscope and a tissue surface depth map, as sketched below. More accurately knowing the tissue surface also enables applications such as matching the geometry currently being observed to pre-operative images from imaging modalities such as CT or MRI, where an internal 3D surface may be calculated.
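
A minimal sketch of placing a depth-map point into the surgical system's coordinate frame follows. The 4x4 transform T_base_from_scope, the pinhole intrinsics, and the function name are hypothetical stand-ins for the system's actual kinematic and calibration data.

```python
# Express a depth-map point in the system base frame (illustrative sketch).
import numpy as np

def pixel_to_base_frame(u_px: float, v_px: float, depth_mm: float,
                        fx_px: float, fy_px: float, cx_px: float, cy_px: float,
                        T_base_from_scope: np.ndarray) -> np.ndarray:
    """Back-project pixel (u, v) at known depth into the endoscope camera
    frame, then apply a rigid transform into the system base frame."""
    p_scope = np.array([
        (u_px - cx_px) * depth_mm / fx_px,  # X in camera frame (mm)
        (v_px - cy_px) * depth_mm / fy_px,  # Y in camera frame (mm)
        depth_mm,                           # Z along the optical axis (mm)
        1.0,                                # homogeneous coordinate
    ])
    return (T_base_from_scope @ p_scope)[:3]
```
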
  • Structured light may be used to determine distance of tissue surface portions illuminated by structured light patterns from a distal end 204 of the endoscope 28 to thereby produce a tissue surface depth map.
  • Structured light reflected from the tissue surface may be captured within camera images of both cameras 902, 904 of the stereo imaging system 900 and may be used to calculate the distances from the stereo endoscope 28 to individual tissue surface portions illuminated by individual portions (e.g., individual light spots) of the structured light. With enough light spots, a three-dimensional depth map of the tissue surface may be computed for the scene.
  • Figures 11A-11B are illustrative drawings showing a side view (Figure 11A) and a perspective view (Figure 11B) of a distal end portion of a first structured light forming optical fiber 1080A having an exit surface 1082A with a structured image forming pattern 1084A thereon.
  • the structured image forming pattern 1084A is implemented as a small structure formed on or etched into the surface of the fiber so that light incident upon it from within the optical fiber 1080A is diffracted to create patterns of light of different intensities by angle, so as to form a structured light image pattern, conceptually a pattern of light spots, that then falls or projects onto an object such as a tissue surface 120 disposed in front of the exit window 1082A and camera.
  • the creation of the diffractive element may include small chemically etched, laser-written, or holographic features in gel or UV-cured materials, which may include a pattern of variations in the optical density, refractive index, or surface profile that imparts variation onto the light incident upon it from within the optical fiber, thus creating the structured illumination.
  • a diffraction element may be produced at an exit end portion of the optical fiber through removal of material.
  • a photoresist may be used to create a pattern for use in selective etching of an exit end portion of an optical fiber to pattern a diffraction element.
  • laser ablation may be used to selectively remove material.
  • an e-beam may be used to selectively etch away material to pattern the diffraction element.
  • a diffraction element may be produced through an additive process, such as molding a replicated pattern in a UV-cured material or adhesive that may be formed at the end of the fiber to create a diffraction element pattern.
  • Figures 12A-12B are illustrative drawings showing a side view (Figure 12A) and a perspective partially transparent view (Figure 12B) of a second structured light forming optical fiber 1080B that includes an optical fiber having a smoothly polished exit surface 1086 and a sleeve 1088 having a second transparent exit window 1082B with a second structured image forming pattern 1084B formed thereon or as an intrinsic property of the window 1082B.
  • the second structured image forming pattern 1084B may include a diffractive element or a hologram and may be composed of features that are both diffractive and refractive in nature.
  • the sleeve 1088 is placed over the smooth exit surface 1086 so that light 1090 that emanates from the core of fiber 1080B is incident upon the second exit window 1082B, which contains structures or properties to create the desired structured pattern of illumination 1084B.
  • An air gap (or adhesive-filled gap) 1092 may be provided between an exit surface 1086 of the second fiber 1080B and the window 1082B, allowing the light from the fiber to pass across the gap to the window.
  • the exit surface 1086 and the exit window 1082B are bonded directly together.
  • Figures 13A-13B are illustrative drawings showing a side view ( Figure 13A) and a perspective partially transparent view ( Figure 13B) of a third structured light forming optical fiber 1080C that includes an optical fiber having an optical fiber tip 1098 affixed thereto that has a third transparent exit window 1082C having a third structured image forming pattern 1084C formed thereon.
  • the third structured image forming pattern 1084C may include a diffractive, holographic, or replicated structure.
  • the optical fiber tip 1098 may be bonded to a polished exit surface 1096 of the fiber 1080C with an adhesive or by fusion splicing, for example.
  • An advantage of forming the structured image forming pattern 1084C on a tip cap 1098 or on a spliced portion is ease of fabrication of the image forming pattern 1084C, since the fabrication process may not readily accommodate a long fiber tail and attached laser device.
  • Figure 14 is a cross-sectional view of a Gradient Index (GRIN) lens 1402 at the exit surface 1404 of an optical fiber 1406; the assembly shown here serves the same function as the previously described 1080 fibers with integrated elements.
  • a structured light image forming pattern element 1408 includes a first side 1410 and a second side 1412. The first side 1410 of the structured light pattern forming element 1408 faces the optical fiber exit surface 1404 and has a light bending element 1414 thereon that bends laser light 1415, which propagates through the optical fiber 1406 and exits through the exit surface 1404, into an acceptance angle of the GRIN lens.
  • the light bending element 1414 includes a hologram.
  • the light bending element 1414 includes a micro Fresnel lens.
  • an air gap 1416 may be provided between the optical fiber exit window 1404 and the structured light pattern forming element 1408.
  • the first surface may be bonded to the exit surface of the fiber with a UV- or heat-cured polymer adhesive material.
  • the second side 1412 of the structured light pattern forming element 1408 includes a mask pattern 1418 printed or formed in an opaque material that faces the GRIN lens 1402. The second side 1412 may be bonded to a light entry portion of the GRIN lens 1402.
  • element 1408 is simply a mask with a pattern of micro apertures; the element may include a diffuser or a micro-Fresnel or diffractive component to get the light to enter the GRIN lens as required.
  • the GRIN lens 1402 acts as a tiny projector lens that projects the light pattern 1420 formed by the mask pattern 1418 to produce a structured image 1422 at an anatomical surface.
  • the GRIN lens may include an aperture on the exit surface, which is either painted or applied to the surface of the lens (or a small annulus part if desired), and black absorbing materials on the sides of the GRIN lens to improve the imaging performance and to reduce stray light. A back-of-envelope example of the projection geometry follows.
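
Treating the GRIN lens, to first order, as a thin projection lens of focal length f (an assumption for illustration only; the patent gives no lens prescription), a mask at object distance u is imaged at distance v with magnification m:

```latex
\frac{1}{u} + \frac{1}{v} = \frac{1}{f}, \qquad m = \frac{v}{u}
```

With assumed values f = 0.5 mm and the mask at u = 0.505 mm, the pattern focuses near v ≈ 50 mm with m ≈ 100, so a 10 µm mask feature would appear as a roughly 1 mm feature on tissue at a typical endoscopic working distance.
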
  • the first and second light sources 962, 964 may be configured to select, for optimal imaging by the cameras and the desired color representations, light of multiple colors within the first and second illumination light guides and within the structured light forming optical fiber.
  • the first light source 962 provides visible light to provide images visible to a surgeon during a surgical procedure.
  • the second light source 964 may provide visible or non-visible light, such as short-wave blue or near-infrared (NIR) light, for transmission within the structured light forming optical fiber.
  • an image capture sensor includes an additional channel configured to capture reflected structured light patterns outside the visible light range.
  • a Bayer color filter array may be reformulated such that there are RGB pixels and separate NIR pixels, whereby the NIR pixels may be used for detection of the reflected structured light pattern and separation of the pattern from the visible image, as sketched below.
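
As a minimal sketch of such a reformulated filter array, the following assumes a 2x2 tile in which one of the two Bayer green sites is replaced by an NIR site; the [[R, G], [NIR, B]] layout is an assumption, since the patent does not fix a mosaic layout.

```python
# Split a raw RGB-NIR mosaic into per-channel planes (illustrative sketch).
import numpy as np

def split_rgb_nir(mosaic: np.ndarray):
    """mosaic: single-channel raw frame tiled as [[R, G], [NIR, B]]."""
    r   = mosaic[0::2, 0::2]
    g   = mosaic[0::2, 1::2]
    nir = mosaic[1::2, 0::2]  # carries the reflected structured light pattern
    b   = mosaic[1::2, 1::2]
    return r, g, b, nir
```
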
  • the first light source 962 may provide light for transmission within the first and second illumination light guides at a color or wavelength that accentuates the surface layer of the tissue. For example, blue light typically does not penetrate as deep into anatomical tissue as NIR light, and therefore blue light produces sharper, more distinct structured light features than does red light. Thus, blue light also may be better suited for use by the second light source 964 than NIR light for accurate determination of structured light pattern edges during depth mapping, for example.
  • in some embodiments, the second light source 964 provides NIR illumination, particularly around 800 nm, where the apparent color of hemoglobin is the same for both oxygenated and deoxygenated blood.
  • One reason to use a structured light pattern in the NIR is that one may use the same acquisition hardware that is already implemented for viewing fluorescence in the infra red.
  • projecting the pattern at 830 nm provides a structured pattern which is in the emission band for indocyanine green (ICG), a relatively common NIR fluorescent dye which is used clinically.
  • the absorbance of ICG is relatively low at 830 nm, so shining an 830 nm pattern into the scene primarily images the 830 nm pattern rather than ICG fluorescence, although fluorescence will contribute a very small signal if ICG is present in the tissues being illuminated.

Abstract

A stereo endoscope is provided that includes: a housing; right and left camera windows; a light guide; and an optical fiber having a structured image forming pattern at an end thereof extending within the housing.

Description

STRUCTURED LIGHT PROJECTION FROM AN OPTICAL FIBER
CLAIM OR PRIORITY
This application claims the benefit of priority to U.S. Provisional Patent Application No. 62/551,674, filed on August 29, 2017, which is incorporated by reference herein in its entirety.
BACKGROUND
Minimally invasive medical techniques are intended to reduce the amount of tissue that is damaged during diagnostic or surgical procedures, thereby reducing patient recovery time, discomfort, and deleterious side effects. A stereoscopic endoscope may be inserted into a patient's body cavity to view a surgical scene during a minimally invasive surgical procedure. The surgical scene includes tissue structures and may include one or more surgical instruments inserted into the body cavity. Tissue surface features may be difficult to visually discern due to uniformity of tissue color or tissue smoothness, for example. Structured light has been used to discern contours and depth in a scene. However, space to project structured light is limited within a surgical scene during minimally invasive surgery.
SUMMARY
In one aspect a structured light forming device includes an optical fiber having a structured image forming pattern at an end thereof.
In another aspect, a stereo endoscope includes an elongated housing having right and left camera windows and a light guide. An optical fiber having a structured image forming pattern at an end thereof extends within the housing.
In another aspect, a method is provided to project structured light. Light is transmitted within an optical fiber from a proximal end portion of the optical fiber to a distal end portion of the optical fiber. Light is emitted at the distal end portion of the optical fiber in a structured light pattern.
In another aspect, a method is provided to produce a depth map for an object. Light is transmitted within an optical fiber from a proximal end portion of the optical fiber to a distal end portion of the optical fiber. Light is emitted at the distal end portion of the optical fiber in a structured light pattern. The distal end of the optical fiber is positioned to direct the structured light pattern to be incident upon the object. A structured light pattern feature reflected from the object is captured at right and left light sensors. A surface depth of a portion of the object on which the captured light pattern feature is incident is determined based upon a location of the captured structured light pattern feature in the right light sensor and a location of the captured structured light pattern feature in the left light sensor.
BRIEF DESCRIPTION OF THE DRAWINGS
Certain embodiments are illustrated by way of example and not limitation in the figures of the accompanying drawings.
Figure 1 is a plan view of a minimally invasive teleoperated surgical system.
Figure 2 is a perspective view of a surgeon's console.
Figure 3 is a perspective view of an electronics cart.
Figure 4 is a diagrammatic illustration of a teleoperated surgical system.
Figure 5 is a perspective view of a patient-side cart of a minimally invasive teleoperated surgical system, in accordance with embodiments.
Figure 6 is an illustrative simplified block diagram showing an example positioning of mechanical support arms of the teleoperation surgery system during a surgical procedure in accordance with some embodiments.
Figure 7 is a side elevation view of a surgical instrument.
Figure 8 is an illustrative drawing of a stereo endoscope shown positioned, via a cannula, to penetrate body tissue to provide visual access to a surgical scene that includes a tissue object to be viewed.
Figure 9 is an illustrative schematic three-dimensional view of a stereographic imaging system in accordance with some embodiments.
Figure 10 is an illustrative drawing of structured light from a first end portion of a stereo endoscope projected onto a tissue object.
Figures 11A-11B are illustrative drawings showing a side view (Figure 11A) and a perspective view (Figure 11B) of a distal end portion of a first structured light forming optical fiber having a first exit surface with a first structured image forming pattern thereon.
Figures 12A-12B are illustrative drawings showing a side view (Figure 12A) and a perspective partially transparent view (Figure 12B) of a second structured light forming optical fiber that includes an optical fiber having a smoothly polished exit surface and a sleeve having a second transparent exit window with a second structured image forming pattern formed thereon.
Figures 13A-13B are illustrative drawings showing a side view (Figure 13A) and a perspective partially transparent view (Figure 13B) of a third structured light forming optical fiber that includes an optical fiber having an optical fiber tip affixed thereto that has a third transparent exit window having a third structured image forming pattern formed thereon.
Figure 14 is a cross-sectional view of a Gradient Index lens at the exit surface of an optical fiber.
DESCRIPTION OF EMBODIMENTS
Minimally Invasive Surgical System
Figure 1 is a plan view of a minimally invasive teleoperated surgical system 10, typically used for performing a minimally invasive diagnostic or surgical procedure on a patient 12 who is lying on a mobile operating table 14. The system includes a mobile surgeon's console 16 for use by a surgeon 18 during the procedure. One or more surgical team members 20 may also participate in the procedure. The minimally invasive teleoperated surgical system 10 further includes a mobile patient-side cart 22 and a mobile electronics cart 24. In some embodiments, the table 14, surgeon's console 16, patient-side cart 22, and the electronics cart 24.
The patient-side cart 22 includes multiple segmented mechanical support arms 72, each having one end portion rotatably mounted to a vertical support structure 74 and having another end mounting a removably coupled surgical instrument 26. In some embodiments, each mechanical support arm 72 includes a first segment 72-1, a second segment 72-2 and a third segment 72-3. During setup for a surgical procedure, the multiple segments of at least one support arm 72 are moved to position a surgical instrument for insertion within a minimally invasive incision in the body of the patient 12. During the surgical procedure, while surgical instruments are inserted within a patient's body cavity, the surgeon 18 views the surgical site through the surgeon's console 16. An image of the surgical site can be obtained by an endoscope 28, such as a stereoscopic endoscope, which a surgeon can manipulate at the patient-side cart 22 to orient the endoscope 28. Computer processors located on the electronics cart 24 can be used to process the images of the surgical site for subsequent display to the surgeon 18 through the surgeon's console 16. The number of surgical instruments 26 used at one time will generally depend on the diagnostic or surgical procedure.
Figure 2 is a perspective view of the surgeon's console 16. The surgeon's console 16 includes a left eye display 32 and a right eye display 34 for presenting the surgeon 18 with a coordinated stereoscopic view of the surgical site that enables depth perception. The console 16 further includes one or more control inputs 36. One or more surgical instruments installed for use on the patient-side cart 22 (shown in Figure 1) move in response to surgeon 18's manipulation of the one or more control inputs 36. The control inputs 36 can provide the same mechanical degrees of freedom as their associated surgical instruments 26 (shown in Figure 1) to provide the surgeon 18 with telepresence, or the perception that the control inputs 36 are integral with the instruments 26 so that the surgeon has a strong sense of directly controlling the instruments 26. To this end, position, force, and tactile feedback sensors (not shown) may be employed to transmit position, force, and tactile sensations from the surgical instruments 26 back to the surgeon's hands through the control inputs 36.
Figure 3 is a perspective view of the electronics cart 24. The electronics cart 24 can be coupled with the endoscope 28 and includes a computer processor to process captured images for subsequent display, such as to a surgeon on the surgeon's console, or on another suitable display located locally and/or remotely. For example, if a stereoscopic endoscope is used, a computer processor on electronics cart 24 can process the captured images to present the surgeon with coordinated stereo images of the surgical site. Such coordination can include alignment between the opposing images and can include adjusting the stereo working distance of the stereoscopic endoscope. As another example, image processing can include the use of previously determined camera calibration parameters to compensate for imaging errors of the image capture device, such as optical aberrations. Optionally, equipment in the electronics cart may be integrated into the surgeon's console or the patient-side cart, or it may be distributed in various other locations in the operating room.
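
As a minimal sketch of applying previously determined calibration parameters, the following assumes OpenCV's pinhole camera model with radial and tangential distortion; the intrinsic matrix and distortion coefficients shown are placeholders, not values from any actual endoscope calibration.

```python
# Compensate lens aberrations with stored calibration data (illustrative).
import cv2
import numpy as np

K = np.array([[900.0,   0.0, 320.0],
              [  0.0, 900.0, 240.0],
              [  0.0,   0.0,   1.0]])          # intrinsic matrix (placeholder)
dist = np.array([-0.30, 0.12, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3 (placeholder)

def compensate_aberrations(frame: np.ndarray) -> np.ndarray:
    """Undistort a captured frame using the unit's calibration parameters."""
    return cv2.undistort(frame, K, dist)
```
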
Figure 4 diagrammatically illustrates a teleoperated surgical system 50 (such as the minimally invasive teleoperated surgical system 10 of Figure 1). A surgeon's console 52 (such as the surgeon's console 16 in Figure 1) can be used by a surgeon to control a patient-side cart 54 (such as the patient-side cart 22 in Figure 1) during a minimally invasive procedure. The patient-side cart 54 can use an imaging device, such as a stereoscopic endoscope, to capture images of a surgical site and output the captured images to a computer processor located on an electronics cart 56 (such as the electronics cart 24 in Figure 1). The computer processor typically includes one or more data processing boards purposed for executing computer readable code stored in a non-volatile memory device of the computer processor. In one aspect, the computer processor can process the captured images in a variety of ways prior to any subsequent display. For example, the computer processor can overlay the captured images with a virtual control interface prior to displaying the combined images to the surgeon via the surgeon's console 52.
Figure 5 is a perspective view of a patient-side cart 54 of a minimally invasive teleoperated surgical system 10, in accordance with embodiments. The patient-side cart 54 includes four mechanical support arms 72. A surgical instrument manipulator 73, which includes motors to control instrument motion, is mounted at the end of each support arm assembly 72. Additionally, each support arm 72 can optionally include one or more setup joints (e.g., unpowered and/or lockable) that are used to position the attached surgical instrument manipulator 73 in relation to the patient for surgery. While the patient-side cart 54 is shown as including four surgical instrument manipulators 73, more or fewer surgical instrument manipulators 73 may be used.
A functional teleoperated surgical system will generally include a vision system portion that enables a user of the teleoperated surgical system to view the surgical site from outside the patient's body 12. The vision system typically includes an endoscopic camera instrument 28 for capturing video images and one or more video displays for displaying the captured video images. In some surgical system configurations, the endoscopic camera 28 includes optics that transfer the images from a distal end of the endoscopic camera 28 to one or more imaging sensors (e.g., CMOS, CCD, or other array-type detector sensors) outside of the patient's body 12. Alternatively, the imaging sensor(s) can be positioned at the distal end of the endoscopic camera 28, and the signals produced by the sensor(s) can be transmitted along a lead or wirelessly for processing and display on the one or more video displays.
Referring to Figure 5, in one aspect, for example, an individual surgical instrument 26 and a cannula 27 are removably coupled to manipulator 73, with the surgical instrument 26 inserted through the cannula 27. One or more teleoperated actuators of the manipulator 73 move the surgical instrument 26 as a whole. The manipulator 73 further includes an instrument carriage 75. The surgical instrument 26 is detachably connected to the instrument carriage 75. In one aspect, the instrument carriage 75 houses one or more teleoperated actuators inside that provide a number of controller motions that the surgical instrument 26 translates into a variety of movements of an end effector on the surgical instrument 26. Thus, the teleoperated actuators in the instrument carriage 75 move only one or more components of the surgical instrument 26 rather than the instrument as a whole. Inputs to control either the instrument as a whole or the instrument's components are such that the input provided by a surgeon or other medical person to the control input (a "master" command) is translated into a corresponding action by the surgical instrument (a "slave" response).
In an alternate embodiment, instrument carriage 75 does not house teleoperated actuators. Teleoperated actuators that enable the variety of movements of the end effector of the surgical instrument 26 are housed in a location remote from the instrument carriage 75, e.g., elsewhere on patient-side cart 54. A cable-based force transmission mechanism or the like is used to transfer the motions of each of the remotely located teleoperated actuators to a corresponding instrument-interfacing actuator output located on instrument carriage 75. In some embodiments, the surgical instrument 26 is mechanically coupled to a first actuator, which controls a first motion of the surgical instrument such as longitudinal (z-axis) rotation. The surgical instrument 26 is mechanically coupled to a second actuator, which controls a second motion of the surgical instrument such as two-dimensional (x, y) motion. The surgical instrument 26 is mechanically coupled to a third actuator, which controls a third motion of the surgical instrument such as opening and closing of jaws of an end effector.
Figure 6 is an illustrative simplified block diagram showing an example positioning of mechanical support arms 72A-72C of the teleoperated surgical system 10 during a surgical procedure in accordance with some embodiments. In some embodiments, the patient-side system 54 includes at least three mechanical support arms 72A-72C. In some embodiments, each of the mechanical support arms 72A-72C includes rotatably mounted first, second and third segments 72-1, 72-2 and 72-3. A center-located mechanical support arm 72 may support an endoscopic camera 28 suitable for capture of images within a field of view of the camera. The mechanical support arms 72 to the left and right of center may support surgical instruments 26A and 26B, respectively, which may manipulate anatomical tissue. During setup for a surgical procedure, the support arm segments are pre-positioned to support the endoscope and instruments in precise positions and orientations for robot-assisted manipulation by a surgeon to perform a medical procedure.
A user or operator O (generally a surgeon) performs a surgical procedure on patient P by manipulating control input devices 36, such as hand grips and foot pedals at a master control console 16. The operator can view video frames of images of a surgical site inside a patient's body through a stereo display viewer 31. A computer processor 58 of the console 16 directs movement of teleoperationally controlled instruments 26, 26A-26B and 28 via control lines 159, effecting movement of the instruments using a patient-side system 54 (also referred to as a patient-side cart).
Surgical Instrument
Figure 7 is a side view of a surgical instrument 26, which includes an elongated tubular shaft 610 having a centerline longitudinal axis 611, a distal (first) portion 650 for insertion into a patient's body cavity and proximal (second) portion 656 coupled adjacent a control mechanism 640. The surgical instrument 26 is used to carry out surgical or diagnostic procedures. The distal portion 650 of the surgical instrument 26 can provide any of a variety of end effectors 654, such as the forceps shown, a needle driver, a cautery device, a cutting tool, an imaging device (e.g., an endoscope or ultrasound probe), or the like. The surgical end effector 654 can include a functional mechanical degree of freedom, such as jaws that open or close, or a knife that translates along a path. In the embodiment shown, the end effector 654 is coupled to the elongate tube 610 by a wrist 652 that allows the end effector to be oriented relative to the elongate tube centerline axis 611. The control mechanism controls movement of the overall instrument and the end effector at its distal portion.
Stereographic Imaging System
Figure 8 is an illustrative drawing of the endoscope 28 of Figure 1 shown positioned via a cannula 27, to penetrate body tissue 130 to provide visual access to a surgical scene that includes a tissue object 120 to be viewed. The endoscope 28 includes elongated portion 202, which includes a distal (first) end portion 204, a proximal (second) end portion 206, and a tip portion 208 of the first end portion 204. The elongated portion 202 is dimensioned to be inserted into a human body cavity. The elongated portion 202 has a length sufficient to position the tip portion 208 close enough to an object to be viewed within the body cavity that the object can be imaged by the stereo image capture devices 913, 914, described more fully below. The second end portion 206 is disposed outside the body cavity.
Figure 9 is an illustrative schematic three-dimensional view of a stereographic imaging system 900 in accordance with some embodiments. The stereographic imaging system includes a pair of cameras 902, 904 to capture right and left images of the scene. A non-zero spatial distance between the cameras, often referred to as the basis or interpupillary distance (IPD), allows observation of the scene from two distinct positions. A matching process uses a right image captured by the right camera 902 and a left image captured by the left camera 904 to compute a distance from a distal portion of the endoscope 28 based upon the basis and the positions of corresponding reflected light pattern features in a left image of the scene captured by the left camera and in a right image of the scene captured by the right camera. This method of triangulation relies on knowing the basis and matching features in the left and right eye images. An inability to precisely match features in the two images leads to an area of the image where the depth is not strictly deterministic. In surgical applications, of course, it is desirable to determine depth deterministically. More precise a priori knowledge of the exact basis distance also allows more exact distance determination in physical units such as millimeters. This distance may be calibrated at the time the stereo endoscope is manufactured and, as it will vary a small amount from unit to unit, the calibration data may be stored in a storage memory in the cloud associated with a unique ID associated with the device, stored in a storage memory on the device itself, or stored in a table in a storage memory on a system using the device. The locations of tissue surface features captured in both camera images may be used to calculate the distances from the camera to the features, and with enough features, a three-dimensional depth map of the tissue surface may be computed for the scene. U.S. Patent No. 9,526,587 (Zhao et al.), which is expressly incorporated herein in its entirety by this reference, describes stereographic imaging system feature matching processes that may be used to match structured light features reflected from a tissue surface.
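To make the triangulation concrete, the relation below is the standard rectified-stereo depth formula. The following is a minimal illustrative sketch; the function and parameter names are our assumptions, not part of this disclosure, and in practice the basis and focal length would come from the per-unit calibration data described above.

```python
def depth_from_disparity(f_px: float, basis_mm: float,
                         x_left_px: float, x_right_px: float) -> float:
    """Distance (mm) from the cameras to a matched feature, assuming a
    rectified pinhole stereo pair.

    f_px     -- focal length in pixels (from camera calibration)
    basis_mm -- inter-camera basis, calibrated per unit at manufacture
    x_left_px, x_right_px -- horizontal coordinates of the same reflected
                             light pattern feature in the left/right images
    """
    disparity = x_left_px - x_right_px  # larger disparity => closer surface
    if disparity <= 0:
        raise ValueError("matched feature must have positive disparity")
    return f_px * basis_mm / disparity

# Example: f = 800 px, basis = 4 mm, disparity = 20 px -> depth = 160 mm
```

An imprecise feature match shifts the disparity and hence the computed depth, which is why poorly matched regions yield the non-deterministic depth noted above.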
More particularly, the stereographic imaging system 900 includes a stereo endoscope 28 that includes first and second light illumination guides 930, 932 (only one visible) to transmit illumination light 930.1, 932.1 (only one visible) produced by one or both of first and second light sources 962, 964 to and from an object (e.g., tissue object 120) to be viewed, image capture sensors 913, 914, stereo image processing components 920, 922, and stereo display components 924, 926. As described above, the stereo endoscope 28 includes a distal (first) end portion 204 suited for insertion into a patient's body cavity to view a surgical site and a proximal (second) end portion 206 to be disposed outside the body cavity. The image capture sensors, which in some embodiments include charge coupled devices (CCDs) 913 and 914, operate together with optical lenses 916 and 918 to capture image information. Image converter circuits 920, 922, which may include an appropriate electronically driven system such as a camera control unit (CCU), transform optical information captured by the image capture sensors 913, 914 to a digital format. An electronic processor (not shown) is in operative communication with the image converter circuits 920, 922. The system 900 further includes two visual image display units, one for a left eye and the other for a right eye, schematically indicated at 924, 926, each of which is operatively connected to one of the image converter circuits 920, 922, as indicated by lines 928, 930, so as to convert digital information received from the image converter circuits 920, 922 into corresponding visual images that can be viewed by a user such as a surgeon. The display units may include an appropriate visual screening apparatus such as a liquid crystal display, an OLED, or the like. An assembly of reflecting surfaces or a reflective train, e.g., comprising mirrors or the like, is generally indicated by reference numeral 932. The assembly 932 conveys the images from the display units 924, 926 to a viewer at 934. It will be appreciated that the direction of view at 934 is normal to the page, a right eye image being indicated at 934.1 and a left eye image being indicated at 934.2, for example.
The stereo endoscope 28 includes first and second optical lenses 936, 938. The optical lenses 936, 938 have viewing axes as indicated by lines 940, 942. The lenses 936, 938 may be positioned so that the axes 940, 942 intersect at 944. If an object, such as an anatomical tissue surface 120, is viewed at a working distance generally corresponding to the intersection 944, namely on a plane at A1, the object is viewed at a generally optically correct working distance. The farther the object is removed from the intersection 944, the less optically correct the view of the object becomes. A viewing area, or the field of view, of each lens 936, 938 can be represented by a section through an imaginary conical volume indicated by reference numerals 936.1, 938.1. If an object is on a plane at A1, within the bounds of the optical field of view indicated by the oval shaped area 960, the object is viewed at an optically correct working distance. It will be appreciated that the axes 940, 942 are positioned at the centers of the oval shaped area 960 in an optically aligned system.
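As an illustrative aside (our notation, not the disclosure's), for a basis $b$ and viewing axes 940, 942 intersecting at working distance $z$, the vergence angle between the axes is

$$ \alpha = 2\arctan\!\left(\frac{b}{2z}\right), $$

so, for example, a 4 mm basis with a 50 mm working distance gives $\alpha = 2\arctan(0.04) \approx 4.6^\circ$; objects nearer or farther than the intersection 944 are viewed at increasingly mismatched vergence.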
In accordance with some embodiments, images of a tissue object 120 viewed at plane A1 are passed through the lenses 936, 938, along first (left) and second (right) camera reflected light guides 944, 946 in the endoscope 28, as indicated by arrows 944.1, 946.1, and are then magnified through lenses 916, 918 and projected onto optically sensitive surfaces of, e.g., the image capture sensors 913, 914, at a proximal end of the endoscope 28. In some embodiments, the image capture sensors are disposed at the distal end portion of the endoscope.
The light sources 962, 964 may provide white light, color filtered light, or light at some selected wavelength. Each light source may include one or more laser or light emitting diode (LED) light sources, for example. A laser may provide coherent light at a selected wavelength, for example. Alternate light sources may be provided that may be selectively coupled to provide different light wavelengths during different portions of a surgical procedure, for example. Each illumination light guide may include a bundle of thousands of individual optical fibers, each having a diameter in a range of 30-70 microns, to propagate light between the light source and the end of the endoscope disposed within a body cavity near the tissue object 120 viewed at A1. The illumination light guides 930, 932 (only one visible) may be incorporated into the endoscope 28 with the right and left camera reflected light guides 944, 946 disposed between them. The color of the light used to illuminate the tissue object 120 viewed at A1 is determined by the wavelength of the light supplied into the optical fibers of the illumination light guides.
Distal End Portion of Endoscope
Figure 10 is an illustrative drawing showing certain details of the first end portion 204 of a stereo endoscope 28 in accordance with some
embodiments. Respective first and second illumination light guide exit windows 1012, 1014 are disposed on opposite sides of (e.g., above and below) a camera window 1016 to provide uniform illumination of a surgical scene 1018 that includes a tissue surface 120. Referring to Figures 9-10, illumination light 930.1 to illuminate the tissue object 120 may be transmitted from the first light source 962 via first and second illumination light guides 930, 932 disposed behind the first and second exit windows 1012, 1014. First and second (e.g., left and right) image capture sensors 913, 914 may be disposed behind the camera window 1016.
The first illumination light guide 930 extends between the first light exit window 1012 at the distal (first) end portion 204 of the endoscope 28 and the first light source 962 at the proximal (second) end portion 206 of the endoscope 28. Similarly, the second illumination light guide 932 extends between the second light exit window 1014 at the distal (first) end portion 204 and the first light source 962, which illuminates the surgical scene for the camera(s) and may be located at the proximal (second) end portion of the endoscope 28. Each illumination light guide includes a multiplicity of individual optical fibers to transmit light from the first light source 962 to the first and second light guide exit windows 1012, 1014. Placing the first and second light sources 962, 964 at the second end portion 206 of the endoscope 28, which is physically spaced apart from the first end portion 204 of the endoscope 28, isolates the tissue from the bulk of, and the potentially harmful heat generated by, the first and second light sources 962, 964. Additionally and generally, the light source 962 is located in an enclosure a meter or several meters away from where the endoscope is used, and the light is delivered to the fiber bundles in the endoscope through a light guide cable which serves this purpose. The light source 964 is used for creating a pattern of illumination, is typically a laser source, and is delivered to the endoscope tip through a single optical fiber. As such, it is advantageous to locate the second light source 964 in a proximal portion of the endoscope. A more precise alignment is required between the second light source 964 and the single optical fiber used to deliver the pattern of illumination. The light guide alignment required for delivering white light illumination from the first light source 962 does not need to be as precise.
Structured Light Forming Optical Fiber

A structured light forming optical fiber 1080 is disposed within the first illumination light guide 930 and has an exit surface disposed at the first illumination light guide exit window 1012. The structured light forming optical fiber 1080 includes a distal (first) end including an exit surface having a pattern forming light-transmissive structure thereon and a proximal (second) end portion disposed to receive light from the second light source 964. In practice, the second light source 964 may be packaged together with the delivery fiber attached to the laser diode module and may thus be configured as a single element. The second light source 964 is typically a coherent light source such as a laser or laser diode. The pattern forming structure that the light from 964 passes through at the exit surface of the optical fiber creates a non-homogeneous pattern of light to illuminate clinically relevant tissue surfaces 120, for example. This pattern creates a surface texture that enables pattern matching between the left and right eye cameras over portions of the tissue 120 that do not inherently contain sufficient features to perform the left-right eye correlations. As used herein, "structured light" refers to a known pattern, such as a pattern of light spots, a pattern of horizontal bars of light, or a curvilinear pattern of light, such as a sinusoidal pattern, that may be projected onto a tissue surface. The way that the structured light pattern deforms when striking surfaces within the endoscope field of view provides a basis for the stereographic vision system to calculate depth and surface information of objects that are in view. As light propagated within the structured light forming optical fiber 1080 exits the distal end of the fiber, the image forming pattern splits the light into a structured light pattern that projects onto a clinically relevant surface (e.g., a tissue or instrument surface) at a surgical work zone within the endoscope field of view.
Figure 10 shows an illustrative structured light pattern that includes a multiplicity of light spots 1070 projected from the structured light forming optical fiber 1080 onto the tissue surface 120. The light spots 1070 incident on the tissue surface 120 are visible to the stereographic imaging system 900. Note that the pattern may extend over a portion of, all of, or overfill the field of view of the cameras, depending on the implementation. In some embodiments, the structured light forming optical fiber 1080 is disposed equidistant from the centers of the left camera side and right camera side of the camera window 1016 for efficiency of stereographic processing, as this reduces (but does not eliminate) problems due to shadowing of the pattern. Positioning the structured light forming optical fiber 1080 between the two cameras thus reduces the probability of one camera having its view of a tissue surface feature occluded by a different tissue surface feature, such as a ridge of tissue, while the other camera can view it. Because the areas where the pattern is most useful are those with high uniformity at the wavelengths used by the camera, this type of shadowing is generally not very problematic. Stereo depth reconstruction processing is performed upon the images, which contain both the structured light pattern reflected from the tissue surface 120 and the light from the 'normal' illuminator, captured by the first and second cameras 902, 904, to determine distances of tissue surface portions from the distal end portion 204 of the endoscope 28 and to thereby determine a tissue surface depth map for the tissue surface 120. The capture approach depends on the implementation. For example, the camera may have the ability to image RGB and IR, where the RGB pixels do not see the IR light and the IR pixels do not see the RGB illumination. In this case, the pattern laser may be implemented in the IR and used concurrently with the RGB imaging modes. Alternatively, if the cameras see only RGB, the pattern may be projected in the blue, for example, only when the depth reconstruction from the stereo matching of the RGB images fails or when additional confidence is needed in the depth estimation - for example, if a tool is being computer controlled to follow the tissue surface.
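As a schematic illustration of the left-right correlation step (a minimal sketch under simplifying assumptions, not the system's actual matching process; see the incorporated Zhao references for that), a structured light spot in a rectified left IR image can be matched along the same row of the right image by normalized cross-correlation:

```python
import numpy as np

def match_spot_disparity(left_ir: np.ndarray, right_ir: np.ndarray,
                         row: int, col: int,
                         half: int = 7, max_disp: int = 64) -> int:
    """Disparity of the pattern feature centered at (row, col) of the left
    image, found by scanning the same row of the right image (rectified
    pair). Assumes the correlation window stays inside both images.
    """
    def norm(p):
        # zero-mean, unit-variance patch for correlation scoring
        return (p - p.mean()) / (p.std() + 1e-9)

    ref = norm(left_ir[row - half:row + half + 1, col - half:col + half + 1])
    best_d, best_score = 0, -np.inf
    for d in range(max_disp):
        c = col - d
        if c - half < 0:
            break
        cand = norm(right_ir[row - half:row + half + 1,
                             c - half:c + half + 1])
        score = float((ref * cand).sum())  # cross-correlation score
        if score > best_score:
            best_d, best_score = d, score
    return best_d
```

Combined with the triangulation sketch above, each matched spot yields one depth sample; dense sampling over the projected pattern yields the tissue surface depth map.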
If the pattern is chosen carefully such that it provides unique patterns over the area illuminated, then it also affords the ability for other tools to use the pattern for relative registration in the coordinate frame of the endoscope and tissue. Thus, a robotically controlled tool with a built-in camera can use vision algorithms to locate the section of the tissue and pattern being illuminated by the pattern fiber and then follow the surface reconstructed from stereo matching, since the section of the pattern it can see provides orientation and position for the tool.
More particularly, in a minimally invasive teleoperated surgical system, a tissue surface depth map may be generated from a stereographic camera viewpoint, based upon scene geometry. A depth map may provide to the surgical system tissue contour information about a surgical scene which may not be readily visible to a surgeon viewing the scene through a stereoscopic viewer and camera system. For example, having an accurate depth map for the scene enables the surgical system to determine placement of marker images or alphanumeric character images in a scene so that they appear to be lying on the tissue surface. The system also may use a depth map to more effectively manage tasks such as automated tool exchange, so that a newly introduced surgical tool does not intersect tissue without surgeon direction or control. Moreover, in conjunction with a known pose of a stereo endoscopic camera from a robotic endoscope holder, the surface geometry may be placed in a coordinate system of the surgical system so that a coordinate transformation between additional surgical tools and robot arms may be calculated with respect to the stereo endoscope and a tissue surface depth map. More accurately knowing the tissue surface also enables applications such as matching the geometry currently observed to pre-operative images from imaging modalities such as CT or MRI, from which an internal 3D surface may be calculated.
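A minimal sketch of the coordinate transformation mentioned above, assuming the endoscope pose is available as a 4x4 homogeneous transform from the robotic holder's kinematics (the names and conventions here are ours, for illustration only):

```python
import numpy as np

def depth_map_points_to_world(points_cam_mm: np.ndarray,
                              T_world_cam: np.ndarray) -> np.ndarray:
    """Re-express tissue surface points (N x 3, stereo camera frame, mm)
    in the surgical system's world frame using the 4x4 endoscope pose.
    """
    n = points_cam_mm.shape[0]
    homog = np.hstack([points_cam_mm, np.ones((n, 1))])  # N x 4
    return (T_world_cam @ homog.T).T[:, :3]
```

With the depth map expressed in the system frame, tasks such as automated tool exchange reduce to point-to-surface clearance checks between tool trajectories and the mapped tissue.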
Structured light may be used to determine the distance of tissue surface portions illuminated by structured light patterns from a distal end 204 of the endoscope 28 to thereby produce a tissue surface depth map. Structured light reflected from the tissue surface may be captured within camera images of both cameras 902, 904 of the stereo imaging system 900 and may be used to calculate the distances from the stereo endoscope 28 to individual tissue surface portions illuminated by individual portions (e.g., individual light spots) of the structured light. With enough light spots, a three-dimensional depth map of the tissue surface may be computed for the scene. U.S. Patent No. 8,147,503 (Zhao et al.), which is expressly incorporated herein in its entirety by this reference, describes stereographic imaging system feature matching processes that may be used to produce a tissue surface depth map. Generally, however, the light pattern may be used in addition to the natural tissue features, so the pattern matching between left and right uses the structured illumination to augment the natural features. Some surfaces, such as bone, have relatively few intrinsic features, so the structured light provides a means to create features on an otherwise featureless surface so the stereo matching algorithm can find correlations. Note that while, for clarity of understanding, the phrase 'spots' has been used to describe the pattern, the actual pattern may be smoother or more continuous than spots.
Locating the image forming pattern at the exit surface of a structured light forming optical fiber 1080, which has a small diameter in a range of 30-200 microns, while locating the second light source 964, which provides light to the fiber 1080, outside the surgical site at a proximal end portion of the fiber, provides a structured light projection system that is compact in size at the distal end portion of the fiber, which is closest to an anatomical tissue surface that is to be imaged. Physical space at the tip of the endoscope typically is quite limited, and the ability to dissipate heat is also challenging.
Image Pattern Forming Structures
Figures 11A-11B are illustrative of a side view (Figure 11A) and a perspective view (Figure 11B) of a distal end portion of a first structured light forming optical fiber 1080A having an exit surface 1082A with a structured image forming pattern 1084A thereon. In some embodiments, the structured image forming pattern 1084A is implemented as a small structure on, or etched into, the surface of the fiber so that light incident upon it from within the optical fiber 1080A is diffracted to create patterns of light of different intensities by angle, so as to form a structured light image pattern, conceptually a pattern of light spots, that then falls or projects onto an object such as a tissue surface 120 disposed in front of the exit window 1082A and camera. The diffractive element may include small chemically etched, laser-written, or holographic structures in gel or UV-cured materials that include a pattern of variations in optical density, refractive index, or surface profile that imparts variation into the light incident upon it from within the optical fiber, thus creating the structured illumination.
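The fan-out angles of such a diffractive structure follow the standard grating equation (the numerical example below is ours, for illustration only). For grating period $d$, wavelength $\lambda$, and diffraction order $m$:

$$ d\sin\theta_m = m\lambda \quad\Rightarrow\quad \theta_m = \arcsin\!\left(\frac{m\lambda}{d}\right) $$

For example, a grating period $d = 4\ \mu\text{m}$ with NIR light at $\lambda = 830\ \text{nm}$ gives a first-order angle $\theta_1 \approx 12^\circ$, so the projected spot pattern fans out from the fiber tip and spreads linearly with working distance.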
A diffraction element may be produced at an exit end portion of the optical fiber through removal of material. A photoresist may be used to create a pattern for use in selective etching of an exit end portion of an optical fiber to pattern a diffraction element. Alternatively, for example, laser ablation may be used to selectively remove material. As yet another alternative, for example, an e-beam may be used to selectively etch away material to pattern the diffraction element. As yet another alternative, a diffraction element may be produced through an additive process, such as molding a replicated pattern in a UV-cured material or adhesive formed at the end of the fiber to create a diffraction element pattern.
Figures 12A-12B are illustrative drawings showing a side view (Figure 12A) and a perspective partially transparent view (Figure 12B) of a second structured light forming optical fiber 1080B that includes an optical fiber having a smoothly polished exit surface 1086 and a sleeve 1088 having a second transparent exit window 1082B with a second structured image forming pattern 1084B formed thereon or as an intrinsic property of 1082B. In some embodiments, the second structured image forming pattern 1084B may include a diffractive element or may include a hologram, and could be composed of features both diffractive and refractive in nature. The sleeve 1088 is placed over the smooth exit surface 1086 so that light 1090 that emanates from the core of fiber 1080B is incident upon the second exit window 1082B, which contains structures or properties to create the desired structured pattern of illumination 1084B. An air gap (or adhesive-filled gap) 1092 may be provided between the exit surface 1086 of the second fiber 1080B and the window 1082B. In an alternative embodiment, the exit surface 1086 and the exit window 1082B are bonded directly together.

Figures 13A-13B are illustrative drawings showing a side view (Figure 13A) and a perspective partially transparent view (Figure 13B) of a third structured light forming optical fiber 1080C that includes an optical fiber having an optical fiber tip 1098 affixed thereto that has a third transparent exit window 1082C having a third structured image forming pattern 1084C formed thereon. In some embodiments, the third structured image forming pattern 1084C may include a diffractive, holographic, or replicated structure. The optical fiber tip 1098 may be bonded to a polished exit surface 1096 of the fiber 1080C with an adhesive or by fusion splicing, for example. An advantage of forming the structured image forming pattern 1084C on a tip cap 1098 or on a spliced portion is ease of fabrication of the image forming pattern 1084C, which may not readily accommodate a long tail and attached laser device.
Figure 14 is a cross-sectional view of a Gradient Index (GRIN) lens 1402 at the exit surface 1404 of an optical fiber 1406; the assembly shown here serves the same function as the previously described 1080 fibers with integrated elements. A structured light image forming pattern element 1408 includes a first side 1410 and a second side 1412. The first side 1410 of the structured light pattern forming element 1408 faces the optical fiber exit surface 1404 and has a light bending element 1414 thereon that bends laser light 1415 that propagates through the optical fiber 1406 and that exits through the exit surface 1404 into an acceptance angle of the GRIN lens. In some embodiments, the light bending element 1414 includes a hologram. In some embodiments, the light bending element 1414 includes a micro Fresnel lens. In some embodiments, an air gap 1416 may be provided between the optical fiber exit surface 1404 and the structured light pattern forming element 1408. In some embodiments, the first surface may be bonded to the exit surface of the fiber with a UV- or heat-cured polymer adhesive material. The second side 1412 of the structured light pattern forming element 1408 includes a mask pattern 1418, printed or formed in an opaque material, that faces the GRIN lens 1402. The second side 1412 may be bonded to a light entry portion of the GRIN lens 1402. In some embodiments, element 1408 is simply a mask with a pattern of micro apertures; the element may include a diffuser or a micro Fresnel or diffractive component to get the light to enter the GRIN lens as required. The GRIN lens 1402 acts as a tiny projector lens that projects the light pattern 1420 formed by the mask pattern 1418 to produce a structured image 1422 at an anatomical surface. Note that the GRIN lens may include an aperture on the exit surface, either painted or applied to the surface of the lens (or a small annulus part if desired), and black absorbing materials on the sides of the GRIN lens to improve the imaging performance and to reduce stray light.
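Under a paraxial thin-lens approximation (our simplification; the disclosure does not specify GRIN lens parameters), the projected pattern scale follows the usual projector relation

$$ M \approx \frac{z}{f_{\mathrm{GRIN}}}, \qquad p_{\mathrm{projected}} \approx M\,p_{\mathrm{mask}}, $$

so, for instance, a hypothetical 10 µm mask aperture pitch behind a GRIN lens of 0.5 mm effective focal length would project to roughly 1 mm spot spacing at a 50 mm working distance; like the diffractive designs above, the projected pattern grows linearly with working distance.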
Light Color Selection
The first and second light sources 962, 964 may be configured to select, for optimal imaging by the cameras and the desired color representations, light of multiple colors within the first and second illumination light guides and within the structured light forming optical fiber. The first light source 962 provides visible light to provide images visible to a surgeon during a surgical procedure. The second light source 964 may provide visible or non-visible light, such as short wave blue or near infrared (NIR) light, for transmission within the structured light forming optical fiber. In some embodiments, an image capture sensor includes an additional channel configured to capture reflected structured light patterns outside the visible light range. U.S. Patent No. (add reference to the dual sensor Intuitive Surgical utility patent applications) describes an image sensor that includes channels to capture light within the visible light spectrum and to capture certain light outside the visible light spectrum. Alternatively, a Bayer color filter array may be reformulated such that there are RGB pixels and separate NIR pixels, whereby the NIR pixels may be used for the detection of the reflected structured light pattern and the separation of the pattern from the visible image.
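As an illustration of the reformulated color filter array described above, the sketch below separates the NIR plane, which carries the reflected structured light pattern, from the RGB planes; the 2x2 mosaic layout is a hypothetical assumption for this example, not a layout specified by the disclosure.

```python
import numpy as np

def split_rgb_nir(raw: np.ndarray):
    """Split a raw frame with a hypothetical 2x2 mosaic
       [[R,  G ],
        [B,  NIR]]
    into quarter-resolution color planes plus an NIR plane; the NIR plane
    sees the structured light pattern but not the visible illumination.
    """
    r = raw[0::2, 0::2]
    g = raw[0::2, 1::2]
    b = raw[1::2, 0::2]
    nir = raw[1::2, 1::2]  # structured-light detection channel
    return r, g, b, nir
```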
During surface mapping, the first light source 962 may provide light for transmission within the first and second illumination light guides at a color or wavelength that accentuates the surface layer of the tissue. For example, blue light typically does not penetrate as deeply into anatomical tissue as NIR light, and therefore blue light produces sharper, more distinct structured light features than does red light. Thus, blue light also may be better suited than NIR light for use by the second light source 964 for accurate determination of structured light pattern edges during depth mapping, for example.
Despite the relative fuzziness of structured light features produced with red light, there are competing advantages to the second light source 964 providing NIR illumination, particularly around 800nm, where the apparent color of hemoglobin is the same for both oxygenated and deoxygenated blood. One reason to use a structured light pattern in the NIR is that one may use the same acquisition hardware that is already implemented for viewing fluorescence in the infrared. Thus, projecting the pattern at 830nm, for example, provides a structured pattern which is in the emission band for indocyanine green (ICG), a relatively common NIR fluorescent dye which is used clinically. Thus, by using the NIR pattern, one may make use of the acquisition hardware already implemented for seeing ICG fluorescence. The absorbance of ICG is relatively low at 830nm, so shining an 830nm pattern into the scene primarily images the 830nm pattern rather than ICG fluorescence, although ICG fluorescence will contribute a very small signal if ICG is present in the tissues being illuminated.
In the preceding description, numerous details are set forth for the purpose of explanation. However, one of ordinary skill in the art will realize that the invention might be practiced without the use of these specific details. In other instances, well-known processes are shown in block diagram form in order not to obscure the description of the invention with unnecessary detail. Identical reference numerals may be used to represent different views of the same or similar item in different drawings. Thus, the foregoing description and drawings of embodiments in accordance with the present invention are merely illustrative of the principles of the invention. Therefore, it will be understood that various modifications can be made to the embodiments by those skilled in the art without departing from the scope of the invention, which is defined in the appended claims.

Claims

CLAIMS

What is claimed is:
1. A structured light forming device comprising:
an optical fiber having a structured image forming pattern at a first end thereof.
2. The structured light forming device of claim 1,
wherein the structured image forming pattern includes a diffractive element.
3. The structured light forming device of claim 1,
wherein the structured image forming pattern includes a hologram.
4. The structured light forming device of claim 1 further including: a Gradient Index (GRIN) lens;
wherein the structured image forming pattern is disposed between the GRIN lens and the first end of the optical fiber.
5. The structured light forming device of claim 1 further including: a Gradient Index (GRIN) lens;
wherein the structured image forming pattern is disposed between the GRIN lens and the first end of the optical fiber;
wherein the structured image forming pattern includes a light bending element and a mask pattern.
6. The structured light forming device of claim 1 further including: a light source disposed to provide light to a second end of the optical fiber.
7. The structured light forming device of claim 1 further including: a light emitting diode disposed to provide light to a second end of the optical fiber.
8. The structured light forming device of claim 1 further including: a laser disposed to provide light to a second end of the optical fiber.
9. A stereo endoscope comprising:
an elongated housing including a proximal end portion and a distal end portion;
a right camera window at the distal end portion of the housing;
a left camera window at the distal end portion of the housing;
a light guide; and
a structured light forming optical fiber extending between the proximal end portion and the distal end portion of the housing, having a structured image forming pattern at a first end thereof.
10. The stereo endoscope of claim 9 further including:
a light source disposed to provide light to a proximal end of the optical fiber.
11. The stereo endoscope of claim 9 further including:
a light source to provide infra red wavelength light to a proximal end of the optical fiber.
12. The stereo endoscope of claim 9 further including:
a light source to provide blue wavelength light to a proximal end of the optical fiber.
13. The stereo endoscope of claim 9 further including:
a light emitting diode disposed to provide light to a proximal end of the optical fiber.
14. The stereo endoscope of claim 9 further including:
a laser disposed to provide light to a proximal end of the optical fiber.
15. The stereo endoscope of claim 9 further including:
a light source configured to selectably provide light to a proximal end of the optical fiber and to selectably provide the light to the light guide.
16. The stereo endoscope of claim 9 further including:
an infra red wavelength light source configured to selectably provide light to a proximal end of the optical fiber and to selectably provide the light to the light guide.
17. The stereo endoscope of claim 9 further including:
a blue wavelength light source configured to selectably provide light to a proximal end of the optical fiber and to selectably provide the light to the light guide.
18. The stereo endoscope of claim 9 further including:
a first light source disposed to provide light to a proximal end of the optical fiber; and
a second light source disposed to provide light to the light guide.
19. The stereo endoscope of claim 9 further including:
a laser disposed to provide first light to a proximal end of the optical fiber; and
a second light source disposed to provide white light to the light guide.
20. The stereo endoscope of claim 9 further including:
a light emitting diode disposed to provide light to a proximal end of the optical fiber; and
a second light source disposed to provide white light to the light guide.
21. The stereo endoscope of claim 9,
wherein the distal end portion of the structured light forming optical fiber is disposed between the right camera window and the left camera window; further including:
a light source disposed to provide light to a proximal end of the optical fiber.
22. The stereo endoscope of claim 9,
wherein the distal end portion of the structured light forming optical fiber is disposed between the right camera window and the left camera window.
23. The stereo endoscope of claim 9,
wherein the light guide includes a plurality of optical fibers extending between the proximal end portion and the distal end portion of the housing; and wherein the structured light forming optical fiber is disposed among the plurality of optical fibers.
24. The stereo endoscope of claim 9,
wherein the structured image forming pattern includes a diffraction element.
25. The stereo endoscope of claim 9,
wherein the structured image forming pattern includes a hologram.
26. The stereo endoscope of claim 9 further including:
a Gradient Index (GRIN) lens;
wherein the structured image forming pattern is disposed between the GRIN lens and the first end of the optical fiber.
27. The stereo endoscope of claim 9 further including:
a Gradient Index (GRIN) lens;
wherein the structured image forming pattern is disposed between the GRIN lens and the first end of the optical fiber;
wherein the structured image forming pattern includes a light bending element and a mask pattern.
28. A method to project structured light comprising:
transmitting light within an optical fiber from a proximal end portion of the optical fiber to a distal end portion of the optical fiber; and
emitting the light in a structured light pattern at the distal end portion of the optical fiber.
29. The method of claim 28 further including:
positioning the distal end of the optical fiber to direct the structured light pattern to be incident upon an object.
30. The method of claim 28 further including:
capturing, at a right light sensor and at a left light sensor, a structured light pattern feature reflected from an object.
31. The method of claim 28 further including:
positioning the distal end of the optical fiber to direct the structured light pattern to be incident upon an object.
32. The method of claim 28 further including:
capturing, at a right light sensor and at a left light sensor, a structured light pattern feature reflected from the object.
33. The method of claim 28,
wherein emitting the light in a structured light pattern having a multiplicity of structured light pattern features includes diffracting the light.
34. The method of claim 28,
wherein emitting the light in a structured light pattern having a multiplicity of structured light pattern features includes producing a hologram.
35. The method of claim 28,
wherein emitting the light in a structured light pattern having a multiplicity of structured light pattern features includes bending the light and directing the light through a mask pattern.
36. The method of claim 28,
wherein the structured light pattern includes a multiplicity of light spots.
37. The method of claim 28,
wherein the transmitted light includes light from a laser.
38. The method of claim 28,
wherein the transmitted light includes light from a light emitting diode.
39. A method to produce a depth map for an object comprising: transmitting light within an optical fiber from a proximal end portion of the optical fiber to a distal end portion of the optical fiber;
emitting the light in a structured light pattern having a multiplicity of structured light pattern features, at the distal end portion of the optical fiber; positioning the distal end of the optical fiber to direct the structured light pattern to be incident upon the object;
capturing, at a right light sensor and at a left light sensor, a structured light pattern feature reflected from the object; and
determining a surface depth of a portion of the object on which the captured light pattern feature is incident based upon a location of the captured structured light pattern feature in the right light sensor and a location of the captured structured light pattern feature in the left light sensor.
40. The method of claim 39,
wherein emitting the light in a structured light pattern having a multiplicity of structured light pattern features includes diffracting the light.
41. The method of claim 39,
wherein emitting the light in a structured light pattern having a multiplicity of structured light pattern features includes producing a hologram.
42. The method of claim 39,
wherein emitting the light in a structured light pattern having a multiplicity of structured light pattern features includes bending the light and directing the light through a mask pattern.
43. The method of claim 39,
wherein the structured light pattern includes a multiplicity of light spots.
44. The method of claim 39,
wherein the transmitted light includes light from a laser.
45. The method of claim 39,
wherein the transmitted light includes light from a light emitting diode.
PCT/US2018/048522 2017-08-29 2018-08-29 Structured light projection from an optical fiber WO2019046411A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201762551674P 2017-08-29 2017-08-29
US62/551,674 2017-08-29

Publications (1)

Publication Number Publication Date
WO2019046411A1 true WO2019046411A1 (en) 2019-03-07

Family

ID=65527509

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2018/048522 WO2019046411A1 (en) 2017-08-29 2018-08-29 Structured light projection from an optical fiber

Country Status (1)

Country Link
WO (1) WO2019046411A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090292168A1 (en) * 2004-09-24 2009-11-26 Vivid Medical Wavelength multiplexing endoscope
US20110057930A1 (en) * 2006-07-26 2011-03-10 Inneroptic Technology Inc. System and method of using high-speed, high-resolution depth extraction to provide three-dimensional imagery for endoscopy
US20160128553A1 (en) * 2014-11-07 2016-05-12 Zheng Jason Geng Intra- Abdominal Lightfield 3D Endoscope and Method of Making the Same
US20160143509A1 (en) * 2014-11-20 2016-05-26 The Johns Hopkins University System for stereo reconstruction from monoscopic endoscope images
JP2016191888A (en) * 2015-03-31 2016-11-10 オリンパス株式会社 Pattern projection optical system for stereo measurement and stereo measurement endoscope device including the same

Similar Documents

Publication Publication Date Title
US20230255446A1 (en) Surgical visualization systems and displays
US10136956B2 (en) Apparatus and method for robot-assisted surgery as well as positioning device
EP3679851A1 (en) Endoscopic imaging with augmented parallax
Maier-Hein et al. Optical techniques for 3D surface reconstruction in computer-assisted laparoscopic surgery
KR102512876B1 (en) Medical devices, systems, and methods integrating eye gaze tracking for stereo viewer
KR102410247B1 (en) Systems and methods for displaying an instrument navigator in a teleoperated system
CN112074248A (en) Three-dimensional visual camera and integrated robot technology platform
KR101772187B1 (en) Method and device for stereoscopic depiction of image data
US20220192777A1 (en) Medical observation system, control device, and control method
CN110177518A (en) System and method for the detection object in the visual field of image capture apparatus
US10772701B2 (en) Method and apparatus to project light pattern to determine distance in a surgical scene
WO2019046411A1 (en) Structured light projection from an optical fiber
US20220039904A1 (en) Jig assembled on stereoscopic surgical microscope for applying augmented reality techniques to surgical procedures
EP3666166B1 (en) System and method for generating a three-dimensional model of a surgical site
US20220015857A1 (en) Illumination corrected near-infrared (nir) imaging for image guided surgery
EP3871193B1 (en) Mixed reality systems and methods for indicating an extent of a field of view of an imaging device
US20230363830A1 (en) Auto-navigating digital surgical microscope
US20230099835A1 (en) Systems and methods for image mapping and fusion during surgical procedures
JP6687877B2 (en) Imaging device and endoscope device using the same
WO2023003734A1 (en) Imaging systems with mutiple fold optical path
WO2022192690A1 (en) Automated touchless registration for surgical navigation

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 18850899

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 18850899

Country of ref document: EP

Kind code of ref document: A1