US20160287344A1 - Systems and methods for reducing measurement error in optical fiber shape sensors - Google Patents

Systems and methods for reducing measurement error in optical fiber shape sensors

Info

Publication number
US20160287344A1
US20160287344A1 (application US15/183,936)
Authority
US
United States
Prior art keywords
shape
shape sensor
sensor
instrument
neutral axis
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Abandoned
Application number
US15/183,936
Inventor
Caitlin Q. Donhowe
Stephen J. Blumenkranz
Vincent Duindam
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Intuitive Surgical Operations Inc
Original Assignee
Intuitive Surgical Operations Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Intuitive Surgical Operations Inc filed Critical Intuitive Surgical Operations Inc
Priority to US15/183,936
Publication of US20160287344A1
Status: Abandoned

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 1/00 Instruments for performing medical examinations of the interior of cavities or tubes of the body by visual or photographical inspection, e.g. endoscopes; Illuminating arrangements therefor
    • A61B 1/00163 Optical arrangements
    • A61B 1/00165 Optical arrangements with light-conductive means, e.g. fibre optics
    • A61B 1/00167 Details of optical fibre bundles, e.g. shape or fibre distribution
    • A61B 1/005 Flexible endoscopes
    • A61B 1/009 Flexible endoscopes with bending or curvature detection of the insertion part
    • A61B 18/00 Surgical instruments, devices or methods for transferring non-mechanical forms of energy to or from the body
    • A61B 2018/00636 Sensing and controlling the application of energy
    • A61B 2018/00773 Sensed parameters
    • A61B 2018/00791 Temperature
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2061 Tracking techniques using shape-sensors, e.g. fiber shape sensors with Bragg gratings
    • A61B 34/30 Surgical robots
    • A61B 2034/301 Surgical robots for introducing or steering flexible instruments inserted into the body, e.g. catheters or endoscopes
    • A61B 34/37 Master-slave robots
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/06 Measuring instruments not otherwise provided for
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/361 Image-producing devices, e.g. surgical cameras
    • A61B 2090/3614 Image-producing devices, e.g. surgical cameras using optical fibre
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 6/00 Light guides; Structural details of arrangements comprising light guides and other optical elements, e.g. couplings

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Surgery (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Veterinary Medicine (AREA)
  • Public Health (AREA)
  • Animal Behavior & Ethology (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Molecular Biology (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Robotics (AREA)
  • Optics & Photonics (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Radiology & Medical Imaging (AREA)
  • General Physics & Mathematics (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Endoscopes (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

A shape sensing apparatus comprises an instrument including an elongated shaft with a neutral axis. The shape sensing apparatus also includes a first shape sensor with an elongated optical fiber extending within the elongated shaft at a first radial distance from the neutral axis. The apparatus also includes a shape sensor compensation device extending within the elongated shaft and including a temperature sensor. The apparatus also comprises a tracking system for receiving shape data from the first shape sensor and compensating data from the shape sensor compensation device for use in calculating a bend measurement for the instrument.

Description

    CROSS-REFERENCE TO RELATED APPLICATIONS
  • This application claims the benefit of U.S. Provisional Application 61/663,951 filed Jun. 25, 2012, which is incorporated by reference herein in its entirety.
  • FIELD
  • The present disclosure is directed to systems and methods for reducing measurement error in a shape sensing optical fiber, and more particularly to systems and methods for reducing measurement error in shape sensing optical fibers used in surgical instruments.
  • BACKGROUND
  • Minimally invasive medical techniques are intended to reduce the amount of tissue that is damaged during diagnostic or surgical procedures, thereby reducing patient recovery time, discomfort, and deleterious side effects. Such minimally invasive techniques may be performed through natural orifices in a patient anatomy or through one or more surgical incisions. Through these natural orifices or incisions clinicians may insert surgical instruments to reach a target tissue location. To reach the target tissue location, the minimally invasive surgical instruments may navigate natural or surgically created passageways in anatomical systems such as the lungs, the colon, the intestines, the kidneys, the heart, the circulatory system, or the like. Navigational assist systems help the clinician route the surgical instruments and avoid damage to the anatomy. These systems can incorporate the use of shape sensors to more accurately describe the shape, position, orientation, and pose of the surgical instrument in real space or with respect to pre-procedural or concurrent images. The accuracy and precision of these shape sensors may be compromised by many factors including temperature variations, the location of the shape sensor within the instrument, and axial loading on the sensor. Improved systems and methods are needed for increasing the accuracy and precision of navigational assist systems, including minimizing the effects of factors that compromise shape sensor accuracy.
  • SUMMARY
  • The embodiments of the invention are summarized by the claims that follow below.
  • In one embodiment, a shape sensing apparatus comprises an instrument including an elongated shaft with a neutral axis. The shape sensing apparatus also includes a first shape sensor with an elongated optical fiber generally parallel to and at a first radial distance from the neutral axis of the elongated shaft. The apparatus also includes a shape sensor compensation device extending within the elongated shaft parallel to the neutral axis. The apparatus also comprises a tracking system for receiving shape data from the first shape sensor and compensating data from the shape sensor compensation device for use in calculating a bend measurement for the instrument.
  • In another embodiment, a shape sensing method comprises providing an instrument including an elongated shaft defining a neutral axis. The method also comprises receiving shape data from a first shape sensor. The first shape sensor includes an elongated optical fiber extending within the elongated shaft parallel to and at a first radial distance from the neutral axis. The method also comprises receiving compensation data from a shape sensor compensation device aligned with the elongated shaft parallel to the neutral axis. The method also comprises generating an instrument bend measurement based upon the received shape data and the compensation data.
  • In another embodiment, a medical instrument system comprises a surgical instrument with an elongated shaft having a neutral axis. The system also includes a first shape sensor including an elongated optical fiber extending within the elongated shaft parallel to and at a first radial distance from the neutral axis. The system further includes a shape sensor compensation device extending within the elongated shaft parallel to the neutral axis. The system further includes a tracking system adapted to receive shape data from the first shape sensor and compensating data from the shape sensor compensation device for calculating a bend measurement for the instrument. The system further includes a display system for displaying a virtual image of the surgical instrument using the bend measurement.
  • BRIEF DESCRIPTIONS OF THE DRAWINGS
  • Aspects of the present disclosure are best understood from the following detailed description when read with the accompanying figures. It is emphasized that, in accordance with the standard practice in the industry, various features are not drawn to scale. In fact, the dimensions of the various features may be arbitrarily increased or reduced for clarity of discussion. In addition, the present disclosure may repeat reference numerals and/or letters in the various examples. This repetition is for the purpose of simplicity and clarity and does not in itself dictate a relationship between the various embodiments and/or configurations discussed.
  • FIG. 1 is a robotic surgical system, in accordance with embodiments of the present disclosure.
  • FIG. 2 illustrates a surgical instrument system utilizing aspects of the present disclosure.
  • FIGS. 3 and 4 are cross-sectional views of a surgical instrument including an optical fiber shape sensor according to one embodiment of the present disclosure.
  • FIGS. 5 and 6 are cross-sectional views of a surgical instrument including an optical fiber shape sensor and a shape sensor compensation device according to another embodiment of the present disclosure.
  • FIG. 7 is a cross-sectional view of a surgical instrument including an optical fiber shape sensor and a shape sensor compensation device according to other embodiments of the present disclosure.
  • FIGS. 8 and 9 are cross-sectional views of a surgical instrument including an optical fiber shape sensor and a shape sensor compensation device according to another embodiment of the present disclosure.
  • FIG. 10 is a cross-sectional view of a surgical instrument including an optical fiber shape sensor and a shape sensor compensation device according to another embodiment of the present disclosure.
  • FIGS. 11 and 12 illustrate the surgical instrument of FIG. 10 bent in different planes.
  • FIG. 13 is a flowchart describing a method according to an embodiment of the disclosure.
  • DETAILED DESCRIPTION
  • In the following detailed description of the embodiments of the invention, numerous specific details are set forth in order to provide a thorough understanding of the disclosed embodiments. However, it will be obvious to one skilled in the art that the embodiments of this disclosure may be practiced without these specific details. In other instances well known methods, procedures, components, and circuits have not been described in detail so as not to unnecessarily obscure aspects of the embodiments of the invention.
  • The embodiments below will describe various instruments and portions of instruments in terms of their state in three-dimensional space. As used herein, the term “position” refers to the location of an object or a portion of an object in a three-dimensional space (e.g., three degrees of translational freedom along Cartesian X,Y,Z coordinates). As used herein, the term “orientation” refers to the rotational placement of an object or a portion of an object (three degrees of rotational freedom—e.g., roll, pitch, and yaw). As used herein, the term “pose” refers to the position of an object or a portion of an object in at least one degree of translational freedom and to the orientation of that object or portion of the object in at least one degree of rotational freedom (up to six total degrees of freedom). As used herein, the term “shape” refers to a set of poses, positions, or orientations measured along an object.
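  • The following sketch (not part of the original disclosure; the class and field names are illustrative assumptions) shows one way the position, orientation, pose, and shape terms defined above might be represented in software:

```python
# Illustrative only: a possible in-memory representation of the terms defined
# above. Names and types are hypothetical, not taken from the patent.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class Pose:
    position: Tuple[float, float, float]     # X, Y, Z location (3 translational DOF)
    orientation: Tuple[float, float, float]  # roll, pitch, yaw (3 rotational DOF)

# A "shape" is a set of poses (or positions/orientations) measured along the
# object, e.g. one pose per sensed segment of the instrument.
Shape = List[Pose]
```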
  • Referring to FIG. 1 of the drawings, a robotic surgical system is generally indicated by the reference numeral 100. As shown in FIG. 1, the robotic system 100 generally includes a surgical manipulator assembly 102 for operating a surgical instrument 104 in performing various procedures on the patient P. The assembly 102 is mounted to or near an operating table O. A master assembly 106 allows the surgeon S to view the surgical site and to control the manipulator assembly 102.
  • In alternative embodiments, the robotic system may include more than one manipulator assembly. The exact number of manipulator assemblies will depend on the surgical procedure and the space constraints within the operating room among other factors.
  • The master assembly 106 may be located at a surgeon's console C which is usually located in the same room as operating table O. However, it should be understood that the surgeon S can be located in a different room or a completely different building from the patient P. Master assembly 106 generally includes an optional support 108 and one or more control device(s) 112 for controlling the manipulator assemblies 102. The control device(s) 112 may include any number of a variety of input devices, such as joysticks, trackballs, gloves, trigger-guns, hand-operated controllers, voice recognition devices or the like. In some embodiments, the control device(s) 112 will be provided with the same degrees of freedom as the associated surgical instruments 104 to provide the surgeon with telepresence, or the perception that the control device(s) 112 are integral with the instruments 104 so that the surgeon has a strong sense of directly controlling instruments 104. In some embodiments, the control devices 112 are manual input devices which move with six degrees of freedom, and which may also include an actuatable handle for actuating instruments (for example, for closing grasping jaws, applying an electrical potential to an electrode, delivering a medicinal treatment, or the like).
  • A visualization system 110 may include a viewing scope assembly (described in greater detail below) such that a concurrent or real-time image of the surgical site is provided to the surgeon console C. The concurrent image may be, for example, a two- or three-dimensional image captured by an endoscope positioned within the surgical site. In this embodiment, the visualization system 110 includes endoscopic components that may be integrally or removably coupled to the surgical instrument 104. However, in alternative embodiments, a separate endoscope attached to a separate manipulator assembly may be used with the surgical instrument to image the surgical site. The visualization system 110 may be implemented as hardware, firmware, software or a combination thereof which interact with or are otherwise executed by one or more computer processors, which may include the processors of a control system 116 (described below).
  • A display system 111 may display an image of the surgical site and surgical instruments captured by the visualization system 110. The display 111 and the master control devices 112 may be oriented such that the relative positions of the imaging device in the scope assembly and the surgical instruments are similar to the relative positions of the surgeon's eyes and hands so the operator can manipulate the surgical instrument 104 and the hand control as if viewing the workspace in substantially true presence. By true presence, it is meant that the presentation of an image is a true perspective image simulating the viewpoint of an operator that is physically manipulating the surgical instruments 104.
  • Alternatively or additionally, monitor 111 may present images of the surgical site recorded and/or modeled preoperatively using imaging technology such as computerized tomography (CT), magnetic resonance imaging (MRI), fluoroscopy, thermography, ultrasound, optical coherence tomography (OCT), thermal imaging, impedance imaging, laser imaging, or nanotube X-ray imaging. In some embodiments, the monitor 111 may display a virtual navigational image in which the actual location of the surgical instrument is dynamically referenced with preoperative images to present the surgeon S with a virtual image of the surgical site at the location of the tip of the surgical instrument. An image of the tip of the surgical instrument or other graphical or alphanumeric indicators may be superimposed on the virtual image to assist the surgeon in controlling the surgical instrument.
  • As shown in FIG. 1, a control system 116 includes at least one processor and typically a plurality of processors for effecting control between the surgical manipulator assembly 102, the master assembly 106, and the image and display system 110. The control system 116 also includes software programming instructions to implement some or all of the methods described herein. While control system 116 is shown as a single block in the simplified schematic of FIG. 1, the system may comprise a number of data processing circuits (e.g., on the surgical manipulator assembly 102 and/or on the master assembly 106), with at least a portion of the processing optionally being performed adjacent an input device, a portion being performed adjacent a manipulator, and the like. Any of a wide variety of centralized or distributed data processing architectures may be employed. Similarly, the programming code may be implemented as a number of separate programs or subroutines, or may be integrated into a number of other aspects of the robotic systems described herein. In one embodiment, control system 116 may support wireless communication protocols such as Bluetooth, IrDA, HomeRF, IEEE 802.11, DECT, and Wireless Telemetry.
  • In some embodiments, control system 116 may include servo controllers to provide force and torque feedback from the surgical instruments 104 to the hand-operated control device 112. Any suitable conventional or specialized servo controller may be used. A servo controller may be separate from, or integral with manipulator assemblies 102. In some embodiments, the servo controller and manipulator assembly are provided as part of a robotic arm cart positioned adjacent to the patient's body. The servo controller transmits signals instructing the manipulator assemblies to move instruments which extend into an internal surgical site within the patient body via openings in the body.
  • Each of the manipulator assemblies 102 supports a surgical instrument 104 and may comprise a series of manually articulatable linkages, generally referred to as set-up joints, and a robotic manipulator. The robotic manipulator assemblies 102 may be driven by a series of actuators (e.g., motors). These motors actively move the robotic manipulators in response to commands from the control system 116. The motors are further coupled to the surgical instrument so as to advance the surgical instrument into a naturally or surgically created anatomical orifice and to move the distal end of the surgical instrument in multiple degrees of freedom that may include three degrees of linear motion (e.g., X, Y, Z linear motion) and three degrees of rotational motion (e.g., roll, pitch, yaw). Additionally, the motors can be used to actuate an articulatable end effector of the instrument for grasping tissues in the jaws of a biopsy device or the like.
  • FIG. 2 illustrates a shape sensing apparatus 118 which includes the surgical instrument system 104 and its interfacing systems. The surgical instrument system 104 includes a steerable instrument 120 coupled by an interface 122 to manipulator assembly 102 and visualization system 110. The instrument 120 has a flexible body 124, a steerable tip 126 at its distal end 128, and the interface 122 at its proximal end 130. The body 124 houses cables, linkages, or other steering controls (not shown) that extend between the interface 122 and the tip 126 to controllably bend or turn the tip, as shown for example by the dotted line versions of the bent tip 126, and in some embodiments control an optional end effector 132. The end effector is a working distal part that is manipulable for a medical function, e.g., for effecting a predetermined treatment of a target tissue. For instance, some end effectors have a single working member such as a scalpel, a blade, or an electrode. Other end effectors, such as the embodiment of FIG. 2, have a pair or plurality of working members such as forceps, graspers, scissors, or clip appliers, for example. Examples of electrically activated end effectors include electrosurgical electrodes, transducers, sensors, and the like. End effectors may also include conduits to convey fluids, gases, or solids to perform, for example, suction, insufflation, irrigation, treatments requiring fluid delivery, accessory introduction, biopsy extraction, and the like. In other embodiments, flexible body 124 can define one or more lumens through which surgical instruments can be deployed and used at a target surgical location.
  • The instrument 120 can also include an image capture element 134 which may include a stereoscopic or monoscopic camera disposed at the distal end 128 for capturing images that are transmitted to and processed by the visualization system 110 for display by the display system 111. Alternatively, the image capture element 134 may be a coherent fiber-optic bundle that couples to an imaging and processing system on the proximal end of the instrument 120, such as a fiberscope. The image capture element 134 may be single or multi-spectral for capturing image data in the visible or infrared/ultraviolet spectrum.
  • A tracking system 136 interfaces with a sensor system 138 for determining the shape (and optionally, pose) of the distal end 128 and/or of one or more segments 137 along the instrument 120. Although only an exemplary set of segments 137 is depicted in FIG. 2, the entire length of the instrument 120, between the distal end 128 and the proximal end 130 and including the tip 126, may be effectively divided into segments, the shape (and location, pose, and/or position) of which may be determined by the sensor system 138. The tracking system 136 may be implemented as hardware, firmware, software or a combination thereof which interact with or are otherwise executed by one or more computer processors, which may include the processors of a control system 116.
  • The sensor system 138 includes an optical fiber 140 aligned with the flexible body 124 (e.g., provided within an interior channel (not shown) or mounted externally). The tracking system 136 is coupled to a proximal end of the optical fiber 140. In this embodiment, the fiber 140 has a diameter of approximately 200 μm. In other embodiments, the dimensions may be larger or smaller.
  • The optical fiber 140 forms a fiber optic bend sensor for determining the shape of the instrument 120. In one alternative, optical fibers including Fiber Bragg Gratings (FBG) are used to provide strain measurements in structures in one or more dimensions. Various systems and methods for monitoring the shape and relative position of an optical fiber in three dimensions are described in U.S. patent application publication no. 2006/0013523, filed on Jul. 13, 2005, U.S. provisional patent application Ser. No. 60/588,336, filed on Jul. 16, 2004, and U.S. Pat. No. 6,389,187, filed on Jun. 17, 1998, the disclosures of which are incorporated herein in their entireties. In other alternatives, sensors employing other strain sensing techniques such as Rayleigh scattering, Raman scattering, Brillouin scattering, and fluorescence scattering may be suitable. In other alternative embodiments, the shape of the instrument 120 may be determined using other techniques. For example, if the history of the instrument tip's pose is stored for an interval of time that is smaller than the period for refreshing the navigation display or for alternating motion (e.g., inhalation and exhalation), the pose history can be used to reconstruct the shape of the device over the interval of time. As another example, historical pose, position, or orientation data may be stored for a known point of an instrument along a cycle of alternating motion, such as breathing. This stored data may be used to develop shape information about the instrument. Alternatively, a series of positional sensors, such as electromagnetic (EM) sensors, positioned along the instrument can be used for shape sensing. Alternatively, a history of data from a positional sensor, such as an EM sensor, on the instrument during a procedure may be used to represent the shape of the instrument, particularly if an anatomical passageway is generally static. Alternatively, a wireless device with position or orientation controlled by an external magnetic field may be used for shape sensing. The history of its position may be used to determine a shape for the navigated passageways.
  • The optical fiber 140 is used to monitor the shape of at least a portion of the instrument 120. More specifically, light passing through the optical fiber 140 is processed by the tracking system 136 for detecting the shape of the surgical instrument 120 and for utilizing that information to assist in surgical procedures. The tracking system 136 may include a detection system for generating and detecting the light used for determining the shape of the instrument 120. This information, in turn, can be used to determine other related variables, such as velocity and acceleration of the parts of a surgical instrument. By obtaining accurate measurements of one or more of these variables in real time, the controller can improve the accuracy of the robotic surgical system and compensate for errors introduced in driving the component parts. The sensing may be limited only to the degrees of freedom that are actuated by the robotic system, or may be applied to both passive (e.g., unactuated bending of the rigid members between joints) and active (e.g., actuated movement of the instrument) degrees of freedom.
  • The information from the tracking system 136 may be sent to the navigation system 142 where it is combined with information from the visualization system 110 and/or the preoperatively taken images to provide the surgeon or other operator with real-time position information on the display system 111 for use in the control of the instrument 120. The control system 116 may utilize the position information as feedback for positioning the instrument 120. Various systems for using fiber optic sensors to register and display a surgical instrument with surgical images are provided in U.S. patent application Ser. No. 13/107,562, entitled “Medical System Providing Dynamic Registration of a Model of an Anatomical Structure for Image-Guided Surgery,” which is incorporated by reference herein in its entirety.
  • In the embodiment of FIG. 2, the instrument 104 is teleoperated within the robotic surgical system 100. In an alternative embodiment, the manipulator assembly may be replaced by direct operator control. In the direct operation alternative, various handles and operator interfaces may be included for hand-held operation of the instrument.
  • FIGS. 3 and 4 are cross-sectional views of the surgical instrument 120 including the optical fiber shape sensor 140 according to one embodiment of the present disclosure. To simplify the illustration, details of the steering components and visual imaging system have been omitted. The illustration is not drawn to scale. In this embodiment, the optical fiber 140 comprises four cores 144a-144d contained within a single cladding 146. Each core may be single-mode with sufficient distance and cladding separating the cores such that the light in each core does not interact significantly with the light carried in other cores. In other embodiments, the number of cores may vary or each core may be contained in a separate optical fiber. In the embodiments of FIGS. 3 and 4, the fiber cores are arranged with 90° spacing about the center of the fiber 140. In other embodiments, four cores may be arranged with one core in the center of the fiber and three cores spaced at 120° intervals about the center.
  • In some embodiments, an array of FBG's is provided within each core. Each FBG comprises a series of modulations of the core's refractive index so as to generate a spatial periodicity in the refraction index. The spacing may be chosen so that the partial reflections from each index change add coherently for a narrow band of wavelengths, and therefore reflect only this narrow band of wavelengths while passing through a much broader band. During fabrication of the FBG's, the modulations are spaced by a known distance, thereby causing reflection of a known band of wavelengths. However, when a strain is induced on the fiber core, the spacing of the modulations will change, depending on the amount of strain in the core. Alternatively, backscatter or other optical phenomena that vary with bending of the optical fiber can be used to determine strain within each core.
  • Thus, to measure strain, light is sent down the fiber, and characteristics of the returning light are measured. For example, FBG's produce a reflected wavelength that is a function of the strain on the fiber and its temperature. This FBG technology is commercially available from a variety of sources, such as Smart Fibres Ltd. of Bracknell, England. Use of FBG technology in position sensors for robotic surgery is described in U.S. Pat. No. 7,930,065 which is incorporated by reference herein in its entirety.
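  • For background, the standard first-order FBG relations (textbook physics, not quoted from this disclosure) that make the reflected wavelength a function of strain and temperature are:

```latex
% Bragg condition and first-order wavelength shift for a fiber Bragg grating.
% \Lambda: grating period, n_{eff}: effective refractive index,
% p_e: effective photo-elastic coefficient, \alpha: thermal expansion
% coefficient, \xi: thermo-optic coefficient.
\[
  \lambda_B = 2\, n_{\mathrm{eff}}\, \Lambda
\]
\[
  \frac{\Delta \lambda_B}{\lambda_B} \;\approx\; (1 - p_e)\,\varepsilon \;+\; (\alpha + \xi)\,\Delta T
\]
```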
  • The shape sensor may provide shape data to the tracking system in the form of strain data. Additionally, strain data may be supplemented with data related to light response, temperature errors, twist errors, or other data that may contribute to determining shape.
  • When applied to a multicore fiber, bending of the optical fiber induces strain on the cores that can be measured by monitoring the wavelength shifts in each core. By having two or more cores disposed off-axis in the fiber, bending of the fiber induces different strains on each of the cores. These strains are a function of the local bend radius of the fiber, the radial position of the core with respect to the fiber centerline, and the angular position of the core about the core centerline with respect to the plane of fiber bending. For example, strain-induced wavelength shifts in regions of the cores containing FBG's located at points where the fiber is bent can thereby be used to determine the amount of bending at those points. These data, combined with the known spacings of the FBG regions, can be used to reconstruct the shape of the fiber. Such a system has been described by Luna Innovations, Inc. of Blacksburg, Va.
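  • A common first-order model for the bend-induced strain described above (general background, not a formula stated in the disclosure) is:

```latex
% Strain in off-axis core i of a multicore fiber under a local bend.
% \kappa = 1/R is the local curvature (R = bend radius), r_i the radial offset
% of core i from the fiber centerline, \theta_i its angular position, and
% \theta_b the angular direction of the bending plane.
\[
  \varepsilon_i \;\approx\; \kappa\, r_i \cos\!\left(\theta_b - \theta_i\right)
\]
```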
  • In the embodiment of FIGS. 3 and 4, the fiber 140 includes the four optical cores 144a-144d disposed at equal radial distances from and equal angular intervals about the axis of the fiber 140 such that in cross-section, opposing pairs of cores 144a-144c and 144b-144d form orthogonal axes. The sensing locations along the four optical cores are aligned such that measurements from each core are from substantially correlated axial regions along the optical fiber. For example, in one embodiment, each core 144a-144d includes an array of collinear FBG's that are disposed at known positions along the length of each core such that the FBG's for all four cores 144a-144d are aligned at a plurality of sensor segments 137, including the steerable tip 126. In this embodiment, the fiber 140 is centered at a radial distance D1 from a neutral axis 150 that extends longitudinally through the instrument 120. In alternative embodiments, the fiber 140 may be centered about the neutral axis 150 or located at a different radial distance. In this embodiment, the fiber 140 may be offset from the neutral axis to accommodate other components of the instrument 120 such as cables or other steering components or visualization components (not shown) that may be centered on or clustered about the neutral axis 150. In this embodiment, the neutral axis 150 extends generally along the central axis of the instrument 120. The neutral axis is the axis of the instrument 120 along which no axial strains (due to tension or compression) occur during bending.
  • A bending of the fiber 140 in one of the sensor segments 137 will lengthen at least one core 144a-144d with respect to the opposing core. Interrogation of this length differential along the fiber enables the angle and radius of bending to be extracted. This interrogation may be performed using the tracking system 136. There are a variety of ways of multiplexing the FBG's so that a single fiber core can carry many sensors and the readings of each sensor can be distinguished. Some of the various ways are described in U.S. patent application Ser. No. 13/049,012 which is incorporated by reference herein in its entirety.
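  • As a hypothetical sketch of this interrogation step (not the patent's algorithm; function and variable names are assumptions), the differential strain of the two opposing core pairs can be converted to a bend angle and curvature for one sensor segment:

```python
import numpy as np

def bend_from_core_strains(eps_a, eps_b, eps_c, eps_d, r):
    """Recover curvature and bend direction at one segment from four cores at
    90 degree intervals at radius r, where (a, c) and (b, d) are opposing pairs.
    Differencing opposing cores cancels common-mode strain (axial load,
    temperature) and leaves the bend-induced differential signal."""
    kappa_x = (eps_a - eps_c) / (2.0 * r)   # curvature component from pair a-c
    kappa_y = (eps_b - eps_d) / (2.0 * r)   # curvature component from pair b-d
    kappa = np.hypot(kappa_x, kappa_y)      # curvature magnitude, 1 / bend radius
    theta = np.arctan2(kappa_y, kappa_x)    # angular direction of the bend plane
    return kappa, theta
```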
  • In alternative embodiments, fibers with fewer or more cores may be used. Likewise, the fiber cores may be arranged in different patterns, such as a central core with additional cores spaced at angular intervals around the central core. In an alternative embodiment, the instrument body may include an internal channel sized to accommodate the optical fiber and separate it from the steering or visualization components, which themselves may be accommodated through separate channels. In one embodiment, a hollow utility channel may provide access for removable devices including removable surgical instruments, removable steering components, removable visualization components, or the like.
  • When a fiber optic shape sensor is positioned offset from the neutral axis, the fiber is subject to axial tensile and compressive forces during bending which strain all of the fiber cores and may contribute to bending measurement error. Temperature variations in the fiber optic shape sensor alter the cores' index of refraction and create apparent strain that also may contribute to bending measurement error. Temperature variations in the shape sensor may result, for example, from patient body heat, friction, heat generated by components of the steering system, or heat generated by components of the visualization system. Both strain and temperature effects appear as common-mode perturbations to the fibers in the shape sensors. Given that the shape sensor provides the bend information as a differential measurement, under ideal circumstances the strain or temperature common-mode perturbations may have only limited impact, if any. However, in practice, the common-mode rejection ratio of the shape sensor is finite and, due to constructional differences, for example, its fibers may react differently to these common-mode perturbations. As a consequence, an undesired differential bend signal may result and negatively affect the accuracy of the bend measurement. Because the strain due to axial forces may not be distinguishable from the apparent strain induced by temperature variations, it may be difficult to determine the magnitude of the bending measurement error due to axial forces versus temperature. Thus, it may be difficult to correct or compensate for the measurement error that results from axial forces and temperature. The problem is compounded because adjusting compensation to make the shape sensor insensitive to axial forces may make the sensor increasingly sensitive to temperature and vice versa.
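  • The common-mode versus differential behavior described above can be pictured with a simple decomposition (an illustrative sketch only; names are hypothetical and this is not the disclosure's method):

```python
import numpy as np

def split_common_and_differential(core_strains):
    """Split per-core strain readings into a common-mode part (axial load plus
    apparent thermal strain, ideally identical in every core) and a
    differential part (the bend signal). With imperfect core matching, some of
    the common-mode term leaks into the differential channel, which is the
    error source discussed above."""
    eps = np.asarray(core_strains, dtype=float)
    common_mode = eps.mean()           # shared axial / thermal apparent strain
    differential = eps - common_mode   # residual, bend-related strain per core
    return common_mode, differential
```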
  • Knowledge about the temperature variations in the instrument near the shape sensor may be used to identify the effects of temperature on the bending measurements and may be used to separate the measurement error caused by temperature from the error caused by axial loading. Information about the effects of the axial forces and the temperature may then be used to algorithmically compensate the computed bending measurements for the instrument. FIGS. 5 and 6 are cross-sectional views of a surgical instrument including an optical fiber shape sensor and a sensor compensation device according to one embodiment of the present disclosure. In this embodiment, an instrument 200 is similar to the instrument 120, with the differences described below. The instrument 200 includes a fiber 202, similar to the fiber 140, which is offset from the neutral axis 204 of the instrument 200 by the distance D1. In this embodiment, the instrument 200 includes a sensor compensation device 206 which measures temperature and/or temperature variations in the instrument 200. The temperature-measuring sensor compensation device 206 is an optical fiber with at least one core. This fiber extends through the instrument 200 generally parallel to the neutral axis 204. The sensor compensation device 206 is centered at a radial distance D2 from the neutral axis 204 of the instrument 200. In this embodiment, the radial distance D2 is smaller than the radial distance D1, and the sensor compensation device 206 is between the neutral axis and the fiber 202. In alternative embodiments, the sensor compensation device may be positioned at a radial distance greater than D1. To accurately determine the effects of temperature on the shape sensor 202, the sensor compensation device may be located anywhere generally near the sensor within the instrument 200. Optical fiber-based temperature measurement devices may determine temperature or temperature variations in a variety of ways. For example, an optical fiber may be constructed so that temperature modulates the intensity, phase, polarization, wavelength, or transit time of light in the fiber. Alternatively, the optical fiber may have evanescent wave loss that varies with temperature, or temperature may be determined by scatter analysis. Knowledge of the temperature or temperature variations within the instrument near the shape sensor may allow for the separation of the effects of temperature and axial forces and for the identification of their effect on bending measurements for the overall instrument 200. Algorithmic compensation techniques are used to remove the effects of temperature and/or axial forces from the final bending measurements. Alternative systems for measuring temperature within the instrument may also be suitable for use in correcting the effects of temperature on final measurements. Alternative temperature measurement systems may include the use of a plurality of discrete single-point temperature sensors, such as thermocouples, resistance-based temperature sensors, and silicon diodes, at different locations along the instrument. The measurements taken at the different locations may be interpolated.
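  • One way such a compensation could look in practice is sketched below (assumed implementation, not taken from the disclosure; the thermal coefficient would come from calibration of the particular fiber):

```python
def temperature_compensated_strains(measured_strains, delta_t, thermal_coeff):
    """Subtract the apparent strain predicted from a co-located temperature
    measurement before the bend calculation.

    measured_strains: per-core strain readings from the shape sensor fiber
    delta_t:          temperature change reported by the compensation fiber (K)
    thermal_coeff:    apparent strain per kelvin for this fiber type (assumed
                      known from calibration)"""
    apparent_thermal_strain = thermal_coeff * delta_t
    return [eps - apparent_thermal_strain for eps in measured_strains]
```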
  • In some embodiments, compensation for temperature effects on shape sensor 202 can be performed using predetermined characterization data. This temperature characterization data can be generated empirically via controlled application of a temperature gradient(s) (either continuous or localized) to shape sensor 202. By holding shape sensor 202 in a fixed configuration (e.g., straight, or with a known curvature), applying the predetermined temperature gradient (e.g., using individual or continuous heating elements), and monitoring the change in sensor output, the effects of temperature variations on sensor 202 can be characterized. In some embodiments, a single temperature gradient can be applied to shape sensor 202, while in other embodiments, multiple temperature gradients can be applied to shape sensor 202. In any event, the generated temperature characterization data can be compiled in any format, such as a lookup table housing the empirically-generated data, a mathematical model of the thermal response, or a simple scaling factor as a function of temperature, among others. This temperature characterization data can then be provided as a discrete package (e.g., to be stored in or loaded into a computer-readable memory), combined with other sensor-related data (e.g., incorporated into an overall calibration file for sensor 202), or can be integrated into the measurement algorithms/software/hardware for reading the output of sensor 202 (e.g., an interrogator or other electronic system such as tracking system 136).
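  • For the lookup-table form of the characterization data, a minimal sketch (placeholder values and names; the real table would come from the empirical characterization described above) might be:

```python
import numpy as np

# Placeholder characterization data: bend (curvature) error observed at a few
# calibration temperatures while the sensor was held in a known configuration.
calib_temps_c = np.array([20.0, 30.0, 40.0, 50.0])          # deg C
calib_bend_error = np.array([0.0, 0.8e-3, 1.7e-3, 2.9e-3])  # 1/m

def temperature_correction(measured_bend, sensor_temp_c):
    """Correct a curvature reading by interpolating the characterization table."""
    error = np.interp(sensor_temp_c, calib_temps_c, calib_bend_error)
    return measured_bend - error
```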
  • In an alternative embodiment, the optical fiber temperature sensor may be shrouded in an insulation material together with the fiber shape sensor. The insulation may be used to reduce the effect of external temperature changing sources.
  • FIG. 7 is a cross-sectional view of a surgical instrument including an optical fiber shape sensor and a sensor compensation device according to another embodiment of the present disclosure. In this embodiment, an instrument 300 is similar to the instrument 120, with the differences described below. The instrument 300 includes a fiber shape sensor 302, similar to the fiber shape sensor 140, which is offset from the neutral axis of the instrument 300 by the distance D1. In this embodiment, the instrument 300 includes a sensor compensation device 304 which reduces the temperature variations across the transverse dimensions of the fiber 302. The sensor compensation device 304 may be a highly thermally conductive coating, layer, jacket, or tubing that serves to distribute heat uniformly around and along the optical fiber sensor 302. The sensor compensation device 304 may alternatively be a highly insulating coating, layer, jacket, or tubing that insulates the optical fiber sensor 302 from external temperature variations. By maintaining a relatively constant temperature across the fiber shape sensor 302, the sensor compensation device 304 reduces the measurement error that may be caused by high temperatures or temperature variations in the sensor.
  • Knowledge about the axial forces causing compression and tension in the shape sensor may be used to identify the magnitude and/or effects of axial forces on the bending measurements and may also be used to separate the measurement error caused by axial forces from that caused by temperature. Information about the effects of the axial forces and the temperature may then be used to algorithmically compensate the computed bending measurements for the instrument. FIGS. 8 and 9 are cross-sectional views of a surgical instrument including an optical fiber shape sensor and a sensor compensation device according to another embodiment of the present disclosure. In this embodiment, an instrument 400 is similar to the instrument 120, with the differences described below. The instrument 400 includes a fiber shape sensor 402, similar to the fiber shape sensor 140, which is offset from the neutral axis 404 of the instrument 400 by the distance D1. In this embodiment, the instrument 400 includes an optical fiber sensor compensation device 406 which measures the same bend in the instrument 400 but with different axial forces. The sensor compensation device 406 is an optical fiber identical to or substantially similar to the shape sensor fiber 402.
  • The sensor compensation device 406 is centered at a radial distance D2 from the neutral axis 404 of the instrument 400, but is positioned on the opposite side of the neutral axis (approximately 180° angular displacement) from the shape sensor fiber 402. In this embodiment, the radial distance D2 is the same as the radial distance D1. In alternative embodiments, the fiber optic sensor compensation device may be positioned within the instrument at other distances from the neutral axis or at other angular displacements from the shape sensor.
  • When positioned at opposite but equal distances from the neutral axis 404, a bend about the X-axis (bending in the YZ-plane) will place one of the fibers 402, 406 into compression and the other into tension. Thus the bend measurements derived from each of the fibers 402, 406 would exhibit errors due to opposite axial forces. The true value of the bend, at the neutral axis, would be a value between the two fiber bend measurements. Knowledge of the axial force measurement error within the instrument may also allow for the separation of the effects of temperature and axial forces and for the identification of their respective effects on bending measurements. Algorithmic compensation techniques are then used to remove the effects of axial forces and/or temperature from the final bending measurements. Although the fiber 402 is identified as the shape sensor and the fiber 406 is identified as the shape sensor compensation device, it is understood that because the fibers are operatively the same, either one can be considered the sensor and the other the compensation device.
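  • A minimal sketch of this compensation (an assumed implementation detail; the patent describes the principle, not this code) simply averages the two per-segment bend measurements:

```python
def neutral_axis_bend(bend_from_fiber_402, bend_from_fiber_406):
    """Combine bend measurements from two fibers at equal radial distances on
    opposite sides of the neutral axis. A bend puts one fiber in tension and
    the other in compression, so their axial-force errors have opposite signs
    and cancel (to first order) in the average."""
    return 0.5 * (bend_from_fiber_402 + bend_from_fiber_406)
```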
  • FIG. 10 is a cross-sectional view of a surgical instrument including an optical fiber shape sensor and a shape sensor compensation device according to another embodiment of the present disclosure. In this embodiment, an instrument 500 is similar to the instrument 400 of FIG. 8, but includes a plurality of radially spaced optical fiber shape sensors 502-508, each of which is identical to or substantially similar to shape sensor 140 of FIG. 1. In this embodiment, each shape sensor 502-508 serves as a sensor compensation device to the oppositely positioned sensor. For example, the axial load measurement error associated with shape sensor 504 may be used to compensate the measurements derived from shape sensor 508 and vice versa, to achieve a compensated bend measurement for the neutral axis. Using a plurality of radially spaced optical fiber shape sensors allows for axial force compensation through different bending planes.
  • For example, FIGS. 11 and 12 illustrate the surgical instrument 500 of FIG. 10 bent in the YZ and XZ planes, respectively. In FIG. 11, the bend in the YZ plane puts shape sensor 502 into compression and shape sensor 506 into tension. The equal and opposite positions of the shape sensors 502, 506 about the neutral axis will cause equal and opposite axial strains in the sensors. Shape sensors 504, 508 would experience minimal or no axial forces and should provide the same bending measurement. In FIG. 12, the bend in the XZ plane puts shape sensor 504 into compression and shape sensor 508 into tension. The equal and opposite positions of the shape sensors 504, 508 about the neutral axis will cause equal and opposite axial strains in the sensors. Shape sensors 502, 506 would experience minimal or no axial forces and should provide the same bending measurement. Using a radial array of shape sensors may more accurately reduce the measurement errors due to axial strain for bending in a variety of different planes.
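  • Extending the same idea to the four-sensor ring of FIG. 10, a sketch (the pairing follows the text; the averaging itself is an assumed implementation) could compensate each opposing pair independently:

```python
def compensated_bend_from_ring(bend_502, bend_504, bend_506, bend_508):
    """Average each opposing pair so that the axial-load errors within a pair
    cancel, then combine the two pair estimates. Sensors 502/506 cancel errors
    for bends in the YZ plane, and sensors 504/508 for bends in the XZ plane."""
    pair_502_506 = 0.5 * (bend_502 + bend_506)
    pair_504_508 = 0.5 * (bend_504 + bend_508)
    return 0.5 * (pair_502_506 + pair_504_508)
```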
  • For alternative embodiments in which the instrument is not symmetric about a neutral axis, shape sensor placement about the neutral axis may be calculated according to known methods so that the effects of the axial forces are equal and opposite.
  • FIG. 13 is a flowchart providing a method 600 for using the shape sensing apparatus. Prior to implementation of this method, preoperative images (e.g., CT, MR, or the like) of the patient may be captured. During the surgical procedure, the instrument may be inserted into the patient through a natural or surgically created opening. The display system may display endoscopic images from the visualization system and may also display preoperative images associated with the current location of the tip of the instrument. As the instrument navigates natural or surgically created passageways within the patient, the tracking system and the navigation system are used to determine the bend of various segments of the instrument and thus the overall shape of the instrument. At 602, shape data is received from a fiber optic shape sensor extending within the surgical instrument parallel to the neutral axis of the surgical instrument. At 604, compensating data is received from a sensor compensation device. For example, temperature data may be received from a temperature sensor located near the shape sensor, or axial load shape data may be received from another shape sensor located opposite the neutral axis of the instrument from the shape sensor. Optionally, temperature sensor information may be modified or otherwise combined with predetermined temperature characterization data. At 606, a bend measurement for the surgical instrument is calculated using the data received from the shape sensor and the sensor compensation device.
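  • A minimal sketch of the flow of method 600 (the callables are hypothetical stand-ins for the tracking-system interfaces; the patent defines the steps, not this API):

```python
def method_600(read_shape_sensor, read_compensation_device, compute_compensated_bend):
    """602: receive shape data from the fiber optic shape sensor.
    604: receive compensating data (e.g., temperature data, or axial-load shape
         data from an opposing shape sensor), optionally combined with
         predetermined temperature characterization data.
    606: calculate the bend measurement from both data sets."""
    shape_data = read_shape_sensor()                                 # step 602
    compensation_data = read_compensation_device()                   # step 604
    return compute_compensated_bend(shape_data, compensation_data)   # step 606
```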
  • Although the shape sensors and sensor compensation devices have been described herein with respect to teleoperated or hand operated surgical systems, these sensors and sensor compensation systems will find application in a variety of medical and non-medical instruments in which accurate instrument bending measurements would otherwise be compromised by temperature, sensor location, or other physical conditions of the shape sensors.
  • One or more elements in embodiments of the invention may be implemented in software to execute on a processor of a computer system such as control system 116. When implemented in software, the elements of the embodiments of the invention are essentially the code segments to perform the necessary tasks. The program or code segments can be stored in a processor readable storage medium or device that may have been downloaded by way of a computer data signal embodied in a carrier wave over a transmission medium or a communication link. The processor readable storage device may include any medium that can store information, including an optical medium, a semiconductor medium, and a magnetic medium. Processor readable storage device examples include an electronic circuit, a semiconductor device, a semiconductor memory device, a read only memory (ROM), a flash memory, an erasable programmable read only memory (EPROM), a floppy diskette, a CD-ROM, an optical disk, a hard disk, or another storage device. The code segments may be downloaded via computer networks such as the Internet, an intranet, etc.
  • Note that the processes and displays presented may not inherently be related to any particular computer or other apparatus. Various general-purpose systems may be used with programs in accordance with the teachings herein, or it may prove convenient to construct a more specialized apparatus to perform the operations described. The required structure for a variety of these systems will appear as elements in the claims. In addition, the embodiments of the invention are not described with reference to any particular programming language. It will be appreciated that a variety of programming languages may be used to implement the teachings of the invention as described herein.
  • While certain exemplary embodiments of the invention have been described and shown in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not restrictive on the broad invention, and that the embodiments of the invention not be limited to the specific constructions and arrangements shown and described, since various other modifications may occur to those ordinarily skilled in the art.

Claims (21)

1-26. (canceled)
27. A shape sensing apparatus comprising:
an instrument including an elongated shaft with a neutral axis;
a first shape sensor including an elongated optical fiber extending within the elongated shaft at a first radial distance from the neutral axis;
a shape sensor compensation device extending within the elongated shaft, the shape sensor compensation device including a temperature sensor; and
a tracking system adapted to receive shape data from the first shape sensor and compensating data from the shape sensor compensation device for calculating a bend measurement for the instrument.
28. The apparatus of claim 27, wherein the shape sensor compensation device is located at a second radial distance from the neutral axis.
29. The apparatus of claim 27, wherein the shape sensor compensation device is located between the neutral axis and the first radial distance.
30. The apparatus of claim 27, wherein the temperature sensor and the first shape sensor are enclosed in insulation.
31. The apparatus of claim 27, wherein the shape sensor compensation device includes a plurality of shape sensors that, together with the first shape sensor, are arranged at angular intervals about the neutral axis at the first radial distance.
32. The apparatus of claim 27, wherein the first shape sensor includes a plurality of optical cores, each of which includes a fiber Bragg grating.
33. A method of operating a shape sensing apparatus comprising:
providing an instrument including an elongated shaft defining a neutral axis;
receiving shape data from a first shape sensor, the first shape sensor including an elongated optical fiber extending within the elongated shaft at a first radial distance from the neutral axis;
receiving compensation data from a shape sensor compensation device extending within the elongated shaft, wherein the shape sensor compensation device includes a temperature sensor; and
generating an instrument bend measurement based upon the received shape data and the compensation data.
34. The method of claim 33, wherein the temperature sensor includes a fiber optic temperature sensor.
35. The method of claim 33, wherein receiving compensation data includes receiving temperature data from the temperature sensor.
36. The method of claim 35, wherein generating the instrument bend measurement comprises adjusting the shape data based on the temperature data and a set of predetermined temperature characterization data for the first shape sensor.
37. The method of claim 33, wherein the shape sensor compensation device is located at a second radial distance from the neutral axis.
38. The method of claim 33, wherein the shape sensor compensation device is located between the neutral axis and the first radial distance.
39. The method of claim 33, wherein the step of receiving compensation data includes receiving axial load data from a second shape sensor which includes a second elongated optical fiber extending within the elongated shaft at a second radial distance from the neutral axis, and wherein a displacement angle between the first and second shape sensors about the neutral axis is approximately 180° and the first and second radial distances are approximately equal.
40. The method of claim 33, wherein the shape sensor compensation device includes a plurality of shape sensors that, together with the first shape sensor, form a ring about the neutral axis at the first radial distance, and wherein receiving compensation data further includes receiving the compensation data from one of the plurality of shape sensors located on a common bending plane with the first shape sensor.
41. A medical instrument system comprising:
an instrument including an elongated shaft with a neutral axis;
a first shape sensor including an elongated optical fiber extending within the elongated shaft at a first radial distance from the neutral axis; and
a shape sensor compensation device extending within the elongated shaft, wherein the shape sensor compensation device includes a temperature sensor;
a tracking system adapted to receive shape data from the first shape sensor and compensating data from the shape sensor compensation device for calculating a bend measurement for the instrument; and
a display system adapted to display a virtual image of the instrument using the bend measurement.
42. The medical instrument system of claim 41, wherein the instrument includes an end effector for manipulating tissue in response to a teleoperation command.
43. The medical instrument system of claim 41, wherein the temperature sensor includes an optical fiber.
44. The medical instrument system of claim 41, wherein the shape sensor compensation device is located at a second radial distance from the neutral axis.
45. The medical instrument system of claim 41, wherein the shape sensor compensation device is located between the neutral axis and the first radial distance.
46. The medical instrument system of claim 41, wherein the shape sensor compensation device includes a plurality of shape sensors that, together with the first shape sensor, are arranged at angular intervals about the neutral axis at the first radial distance.
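
Illustrative example (not part of the original disclosure): the compensation described in method claims 36 and 39 can be sketched in code. The Python snippet below is a minimal, hypothetical illustration assuming a linear apparent-strain-versus-temperature characterization coefficient and two fibers placed approximately 180° apart at equal radial distance from the neutral axis; all function names, parameters, and numeric values are invented for illustration and are not recited in the claims.

```python
"""
Minimal sketch (not the patent's implementation) of the compensation ideas
recited in claims 36 and 39: correcting shape-sensor strain for temperature
using predetermined characterization data, and cancelling axial-load strain
by differencing two fibers placed ~180 degrees apart at equal radius.
All names, coefficients, and numbers below are hypothetical.
"""

import numpy as np


def temperature_compensated_curvature(strain, delta_t, radius_m,
                                      apparent_strain_per_degc):
    """Curvature from a single off-axis fiber after removing the apparent
    strain caused by a temperature change (claim 36 style correction).

    strain                   -- measured strain along the fiber (unitless)
    delta_t                  -- temperature change reported by the compensation sensor (degC)
    radius_m                 -- fiber offset from the neutral axis (m)
    apparent_strain_per_degc -- predetermined characterization coefficient
    """
    corrected = np.asarray(strain) - apparent_strain_per_degc * delta_t
    # For a fiber offset r from the neutral axis, bending strain = curvature * r.
    return corrected / radius_m


def differential_curvature(strain_a, strain_b, radius_m):
    """Curvature from two fibers ~180 degrees apart at equal radius (claim 39 style).

    Axial-load and temperature strains are common to both fibers and cancel
    in the difference; bending strain has opposite sign on the two sides.
    """
    return (np.asarray(strain_a) - np.asarray(strain_b)) / (2.0 * radius_m)


if __name__ == "__main__":
    # Synthetic example: 0.5 1/m curvature, 100 microstrain axial load,
    # 5 degC temperature rise, fibers offset 200 micrometers from the axis.
    kappa_true, r = 0.5, 200e-6
    eps_axial, d_t, k_t = 100e-6, 5.0, 8e-6  # hypothetical values
    eps_a = kappa_true * r + eps_axial + k_t * d_t
    eps_b = -kappa_true * r + eps_axial + k_t * d_t

    print(differential_curvature(eps_a, eps_b, r))                   # ~0.5; axial and thermal terms cancel
    print(temperature_compensated_curvature(eps_a, d_t, r, k_t))     # thermal term removed, axial bias remains
```

In this synthetic example the differential calculation recovers the true curvature because the axial-load and temperature strains are common to both fibers, while the single-fiber temperature correction alone still carries the axial-load bias; this is the motivation for combining both forms of compensation.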
US15/183,936 2012-06-25 2016-06-16 Systems and methods for reducing measurement error in optical fiber shape sensors Abandoned US20160287344A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
US15/183,936 US20160287344A1 (en) 2012-06-25 2016-06-16 Systems and methods for reducing measurement error in optical fiber shape sensors

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
US201261663951P 2012-06-25 2012-06-25
US13/925,965 US9429696B2 (en) 2012-06-25 2013-06-25 Systems and methods for reducing measurement error in optical fiber shape sensors
US15/183,936 US20160287344A1 (en) 2012-06-25 2016-06-16 Systems and methods for reducing measurement error in optical fiber shape sensors

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
US13/925,965 Division US9429696B2 (en) 2012-06-25 2013-06-25 Systems and methods for reducing measurement error in optical fiber shape sensors

Publications (1)

Publication Number Publication Date
US20160287344A1 true US20160287344A1 (en) 2016-10-06

Family

ID=49775045

Family Applications (2)

Application Number Title Priority Date Filing Date
US13/925,965 Active 2034-07-15 US9429696B2 (en) 2012-06-25 2013-06-25 Systems and methods for reducing measurement error in optical fiber shape sensors
US15/183,936 Abandoned US20160287344A1 (en) 2012-06-25 2016-06-16 Systems and methods for reducing measurement error in optical fiber shape sensors

Family Applications Before (1)

Application Number Title Priority Date Filing Date
US13/925,965 Active 2034-07-15 US9429696B2 (en) 2012-06-25 2013-06-25 Systems and methods for reducing measurement error in optical fiber shape sensors

Country Status (1)

Country Link
US (2) US9429696B2 (en)

Cited By (35)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20190029763A1 (en) * 2016-04-05 2019-01-31 Olympus Corporation Bend information computation apparatus and endoscope system
CN111006708A (en) * 2019-11-28 2020-04-14 北京航天控制仪器研究所 Measuring point positioning error compensation method for distributed optical fiber sensor
US11172989B2 (en) 2014-07-02 2021-11-16 Covidien Lp Dynamic 3D lung map view for tool navigation inside the lung
US11253325B2 (en) 2018-02-08 2022-02-22 Covidien Lp System and method for catheter detection in fluoroscopic images and updating displayed position of catheter
US11269173B2 (en) 2019-08-19 2022-03-08 Covidien Lp Systems and methods for displaying medical video images and/or medical 3D models
US11341720B2 (en) 2018-02-08 2022-05-24 Covidien Lp Imaging reconstruction system and method
US11341692B2 (en) 2017-06-29 2022-05-24 Covidien Lp System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
US11357593B2 (en) 2019-01-10 2022-06-14 Covidien Lp Endoscopic imaging with augmented parallax
US11364004B2 (en) 2018-02-08 2022-06-21 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US11380060B2 (en) 2020-01-24 2022-07-05 Covidien Lp System and method for linking a segmentation graph to volumetric data
US11547377B2 (en) 2015-08-06 2023-01-10 Covidien Lp System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction
US11559266B2 (en) 2015-08-06 2023-01-24 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11564649B2 (en) 2017-10-10 2023-01-31 Covidien Lp System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction
US11564751B2 (en) 2019-02-01 2023-01-31 Covidien Lp Systems and methods for visualizing navigation of medical devices relative to targets
US11617493B2 (en) 2018-12-13 2023-04-04 Covidien Lp Thoracic imaging, distance measuring, surgical awareness, and notification system and method
US11625825B2 (en) 2019-01-30 2023-04-11 Covidien Lp Method for displaying tumor location within endoscopic images
US11627924B2 (en) 2019-09-24 2023-04-18 Covidien Lp Systems and methods for image-guided navigation of percutaneously-inserted devices
US11705238B2 (en) 2018-07-26 2023-07-18 Covidien Lp Systems and methods for providing assistance during surgery
US11701179B2 (en) 2018-07-26 2023-07-18 Covidien Lp Modeling a collapsed lung using CT data
US11707241B2 (en) 2015-08-06 2023-07-25 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11744643B2 (en) 2019-02-04 2023-09-05 Covidien Lp Systems and methods facilitating pre-operative prediction of post-operative tissue function
US11793579B2 (en) 2017-02-22 2023-10-24 Covidien Lp Integration of multiple data sources for localization and navigation
US11793402B2 (en) 2018-12-10 2023-10-24 Covidien Lp System and method for generating a three-dimensional model of a surgical site
US11798178B2 (en) 2014-07-02 2023-10-24 Covidien Lp Fluoroscopic pose estimation
US11801113B2 (en) 2018-12-13 2023-10-31 Covidien Lp Thoracic imaging, distance measuring, and notification system and method
US11819285B2 (en) 2019-04-05 2023-11-21 Covidien Lp Magnetic interference detection systems and methods
US11847730B2 (en) 2020-01-24 2023-12-19 Covidien Lp Orientation detection in fluoroscopic images
US11864935B2 (en) 2019-09-09 2024-01-09 Covidien Lp Systems and methods for pose estimation of a fluoroscopic imaging device and for three-dimensional imaging of body structures
US11871913B2 (en) 2014-10-31 2024-01-16 Covidien Lp Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
US11877806B2 (en) 2018-12-06 2024-01-23 Covidien Lp Deformable registration of computer-generated airway models to airway trees
US11925333B2 (en) 2019-02-01 2024-03-12 Covidien Lp System for fluoroscopic tracking of a catheter to update the relative position of a target and the catheter in a 3D model of a luminal network
US11931111B2 (en) 2019-09-09 2024-03-19 Covidien Lp Systems and methods for providing surgical guidance
US11931141B2 (en) 2008-06-06 2024-03-19 Covidien Lp Hybrid registration method
US11944388B2 (en) 2018-09-28 2024-04-02 Covidien Lp Systems and methods for magnetic interference correction
US11950950B2 (en) 2020-07-24 2024-04-09 Covidien Lp Zoom detection and fluoroscope movement detection for target overlay

Families Citing this family (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109199388B (en) 2013-07-29 2022-03-18 直观外科手术操作公司 Shape sensor system with redundant sensing
JP6153414B2 (en) * 2013-08-06 2017-06-28 オリンパス株式会社 Insertion system and method for adjusting shape detection characteristics of shape sensor
FR3014200B1 (en) * 2013-12-02 2017-05-26 Commissariat Energie Atomique CONTROL OF INDUSTRIAL STRUCTURE
US10019800B2 (en) * 2014-01-06 2018-07-10 Koninklijke Philips N.V. Deployment modelling
JP6270499B2 (en) * 2014-01-21 2018-01-31 オリンパス株式会社 Endoscope device
WO2015152999A1 (en) * 2014-03-31 2015-10-08 Regents Of The University Of Minnesota Navigation tools using shape sensing technology
US9872692B2 (en) * 2014-04-24 2018-01-23 The Johns Hopkins University Motion-compensated micro-forceps system and method
US20160018245A1 (en) * 2014-07-17 2016-01-21 Schlumberger Technology Corporation Measurement Using A Multi-Core Optical Fiber
CN105321415A (en) * 2014-08-01 2016-02-10 卓思生命科技有限公司 Surgery simulation system and method
US10376134B2 2014-10-17 2019-08-13 Intuitive Surgical Operations, Inc. Systems and methods for reducing measurement error using optical fiber shape sensors
JP2018501834A (en) 2014-11-27 2018-01-25 Koninklijke Philips N.V. Device for determining the position of an interventional instrument in a projected image
JP6500096B2 (en) * 2015-05-01 2019-04-10 オリンパス株式会社 Curve information deriving device, endoscope system including curve information deriving device, curve information deriving method, and program for deriving curve information
JP6177488B2 (en) * 2015-07-23 2017-08-09 オリンパス株式会社 Manipulator and medical system
US9498300B1 (en) * 2015-07-30 2016-11-22 Novartis Ag Communication system for surgical devices
US10058371B2 (en) * 2015-11-18 2018-08-28 Medtronic Cryocath Lp Multi-lobe balloon for cryoablation
JP2020518326A (en) * 2017-04-18 2020-06-25 インテュイティブ サージカル オペレーションズ, インコーポレイテッド Graphical user interface for monitoring image-guided procedures
WO2019211112A1 (en) * 2018-05-02 2019-11-07 Koninklijke Philips N.V. Optical shape sensing device with integrated force sensing region and tip integration
CN112804946A (en) * 2018-08-07 2021-05-14 奥瑞斯健康公司 Combining strain-based shape sensing with catheter control
WO2020064084A1 (en) * 2018-09-24 2020-04-02 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Fiber-optic sensor, data glove and method for detecting curvature

Family Cites Families (32)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5404743A (en) 1993-08-12 1995-04-11 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Pulsed phase locked loop strain monitor
US5841032A (en) 1996-02-02 1998-11-24 The United States Of America As Represented By The Administrator National Aeronautics And Space Administration Variable and fixed frequency pulsed phase locked loop
US5798521A (en) 1996-02-27 1998-08-25 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Apparatus and method for measuring strain in bragg gratings
US5771204A (en) 1996-07-18 1998-06-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Apparatus and method for measuring relative phase of signals in a multiple-echo system
GB9713018D0 (en) 1997-06-20 1997-08-27 Secr Defence Optical fibre bend sensor
US6566648B1 (en) 1999-03-25 2003-05-20 The United States Of America As Represented By The United States National Aeronautics And Space Administration Edge triggered apparatus and method for measuring strain in bragg gratings
US6545760B1 (en) 1999-03-25 2003-04-08 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration Apparatus and method for measuring strain in optical fibers using rayleigh scatter
US6376830B1 (en) 1999-09-14 2002-04-23 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration System and method for measuring the transfer function of a guided wave device
US6426496B1 (en) 2000-08-22 2002-07-30 The United States Of America As Represented By The Administrator Of The National Aeronautics And Space Administration High precision wavelength monitor for tunable laser systems
US6856400B1 (en) 2000-12-14 2005-02-15 Luna Technologies Apparatus and method for the complete characterization of optical devices including loss, birefringence and dispersion effects
WO2004005973A2 (en) 2002-07-09 2004-01-15 Luna Technologies Polarization diversity detection without a polarizing beam splitter
US7388673B2 (en) 2002-07-09 2008-06-17 Luna Innovations Incorporated Heterodyne optical spectrum analyzer
WO2004090507A2 (en) 2003-04-02 2004-10-21 Luna Technologies, Inc. Apparatus and method for correcting errors generated by a laser with non-ideal tuning characteristics
US7440087B2 (en) 2004-02-24 2008-10-21 Luna Innovations Incorporated Identifying optical fiber segments and determining characteristics of an optical device under test based on fiber segment scatter pattern data
US7772541B2 2004-07-16 2010-08-10 Luna Innovations Incorporated Fiber optic position and/or shape sensing based on rayleigh scatter
US7781724B2 (en) 2004-07-16 2010-08-24 Luna Innovations Incorporated Fiber optic position and shape sensing device and method relating thereto
US20060013523A1 (en) 2004-07-16 2006-01-19 Luna Innovations Incorporated Fiber optic position and shape sensing device and method relating thereto
US7633607B2 (en) 2004-09-01 2009-12-15 Luna Innovations Incorporated Method and apparatus for calibrating measurement equipment
CA2590790C (en) 2004-12-14 2014-09-02 Luna Innovations Inc. Compensating for time varying phase changes in interferometric measurements
WO2006099056A2 (en) 2005-03-10 2006-09-21 Luna Innovations Inc. Calculation of birefringence in a waveguide based on rayleigh scatter
US7930065B2 (en) 2005-12-30 2011-04-19 Intuitive Surgical Operations, Inc. Robotic surgery system including position sensors using fiber bragg gratings
EP2035792B1 (en) 2006-06-16 2018-05-23 Intuitive Surgical Operations, Inc. Distributed strain and temperature discrimination in polarization maintaining fiber
WO2008013705A2 (en) 2006-07-26 2008-01-31 Luna Innovations Incorporated High resolution interferometric optical frequency domain reflectometry ( ofdr) beyond the laser coherence length
US7720322B2 (en) 2008-06-30 2010-05-18 Intuitive Surgical, Inc. Fiber optic shape sensor
US8537367B2 (en) 2009-01-17 2013-09-17 Luna Innovations Incorporated Optical imaging for optical device inspection
US8773650B2 (en) 2009-09-18 2014-07-08 Intuitive Surgical Operations, Inc. Optical position and/or shape sensing
US9285246B2 (en) 2010-02-12 2016-03-15 Intuitive Surgical Operations, Inc. Method and system for absolute three-dimensional measurements using a twist-insensitive shape sensor
US8714026B2 (en) 2010-04-09 2014-05-06 Intuitive Surgical Operations, Inc. Strain sensing with optical fiber rosettes
US8400620B2 (en) 2010-06-01 2013-03-19 Luna Innovations Incorporated Registration of an extended reference for parameter measurement in an optical sensing system
US8531655B2 (en) 2010-09-17 2013-09-10 Luna Innovations Incorporated Compensating for non-ideal multi-core optical fiber structure
EP2518436B1 (en) * 2011-04-28 2015-06-17 Storz Endoskop Produktions GmbH Bend sensor
US8900131B2 (en) 2011-05-13 2014-12-02 Intuitive Surgical Operations, Inc. Medical system providing dynamic registration of a model of an anatomical structure for image-guided surgery

Cited By (45)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11931141B2 (en) 2008-06-06 2024-03-19 Covidien Lp Hybrid registration method
US11172989B2 (en) 2014-07-02 2021-11-16 Covidien Lp Dynamic 3D lung map view for tool navigation inside the lung
US11607276B2 (en) 2014-07-02 2023-03-21 Covidien Lp Dynamic 3D lung map view for tool navigation inside the lung
US11798178B2 (en) 2014-07-02 2023-10-24 Covidien Lp Fluoroscopic pose estimation
US11547485B2 (en) 2014-07-02 2023-01-10 Covidien Lp Dynamic 3D lung map view for tool navigation inside the lung
US11529192B2 (en) 2014-07-02 2022-12-20 Covidien Lp Dynamic 3D lung map view for tool navigation inside the lung
US11877804B2 (en) 2014-07-02 2024-01-23 Covidien Lp Methods for navigation of catheters inside lungs
US11389247B2 (en) 2014-07-02 2022-07-19 Covidien Lp Methods for navigation of a probe inside a lung
US11871913B2 (en) 2014-10-31 2024-01-16 Covidien Lp Computed tomography enhanced fluoroscopic system, device, and method of utilizing the same
US11559266B2 (en) 2015-08-06 2023-01-24 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11547377B2 (en) 2015-08-06 2023-01-10 Covidien Lp System and method for navigating to target and performing procedure on target utilizing fluoroscopic-based local three dimensional volume reconstruction
US11707241B2 (en) 2015-08-06 2023-07-25 Covidien Lp System and method for local three dimensional volume reconstruction using a standard fluoroscope
US11478305B2 (en) * 2016-04-05 2022-10-25 Olympus Corporation Bend information computation apparatus and endoscope system
US20190029763A1 (en) * 2016-04-05 2019-01-31 Olympus Corporation Bend information computation apparatus and endoscope system
US11793579B2 (en) 2017-02-22 2023-10-24 Covidien Lp Integration of multiple data sources for localization and navigation
US11341692B2 (en) 2017-06-29 2022-05-24 Covidien Lp System and method for identifying, marking and navigating to a target using real time two dimensional fluoroscopic data
US11564649B2 (en) 2017-10-10 2023-01-31 Covidien Lp System and method for identifying and marking a target in a fluoroscopic three-dimensional reconstruction
US11896414B2 (en) 2018-02-08 2024-02-13 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US11253325B2 (en) 2018-02-08 2022-02-22 Covidien Lp System and method for catheter detection in fluoroscopic images and updating displayed position of catheter
US11364004B2 (en) 2018-02-08 2022-06-21 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US11712213B2 (en) 2018-02-08 2023-08-01 Covidien Lp System and method for pose estimation of an imaging device and for determining the location of a medical device with respect to a target
US11701184B2 (en) 2018-02-08 2023-07-18 Covidien Lp System and method for catheter detection in fluoroscopic images and updating displayed position of catheter
US11341720B2 (en) 2018-02-08 2022-05-24 Covidien Lp Imaging reconstruction system and method
US11705238B2 (en) 2018-07-26 2023-07-18 Covidien Lp Systems and methods for providing assistance during surgery
US11701179B2 (en) 2018-07-26 2023-07-18 Covidien Lp Modeling a collapsed lung using CT data
US11944388B2 (en) 2018-09-28 2024-04-02 Covidien Lp Systems and methods for magnetic interference correction
US11877806B2 (en) 2018-12-06 2024-01-23 Covidien Lp Deformable registration of computer-generated airway models to airway trees
US11793402B2 (en) 2018-12-10 2023-10-24 Covidien Lp System and method for generating a three-dimensional model of a surgical site
US11617493B2 (en) 2018-12-13 2023-04-04 Covidien Lp Thoracic imaging, distance measuring, surgical awareness, and notification system and method
US11801113B2 (en) 2018-12-13 2023-10-31 Covidien Lp Thoracic imaging, distance measuring, and notification system and method
US11793390B2 (en) 2019-01-10 2023-10-24 Covidien Lp Endoscopic imaging with augmented parallax
US11357593B2 (en) 2019-01-10 2022-06-14 Covidien Lp Endoscopic imaging with augmented parallax
US11625825B2 (en) 2019-01-30 2023-04-11 Covidien Lp Method for displaying tumor location within endoscopic images
US11564751B2 (en) 2019-02-01 2023-01-31 Covidien Lp Systems and methods for visualizing navigation of medical devices relative to targets
US11925333B2 (en) 2019-02-01 2024-03-12 Covidien Lp System for fluoroscopic tracking of a catheter to update the relative position of a target and the catheter in a 3D model of a luminal network
US11744643B2 (en) 2019-02-04 2023-09-05 Covidien Lp Systems and methods facilitating pre-operative prediction of post-operative tissue function
US11819285B2 (en) 2019-04-05 2023-11-21 Covidien Lp Magnetic interference detection systems and methods
US11269173B2 (en) 2019-08-19 2022-03-08 Covidien Lp Systems and methods for displaying medical video images and/or medical 3D models
US11864935B2 (en) 2019-09-09 2024-01-09 Covidien Lp Systems and methods for pose estimation of a fluoroscopic imaging device and for three-dimensional imaging of body structures
US11931111B2 (en) 2019-09-09 2024-03-19 Covidien Lp Systems and methods for providing surgical guidance
US11627924B2 (en) 2019-09-24 2023-04-18 Covidien Lp Systems and methods for image-guided navigation of percutaneously-inserted devices
CN111006708A (en) * 2019-11-28 2020-04-14 北京航天控制仪器研究所 Measuring point positioning error compensation method for distributed optical fiber sensor
US11380060B2 (en) 2020-01-24 2022-07-05 Covidien Lp System and method for linking a segmentation graph to volumetric data
US11847730B2 (en) 2020-01-24 2023-12-19 Covidien Lp Orientation detection in fluoroscopic images
US11950950B2 (en) 2020-07-24 2024-04-09 Covidien Lp Zoom detection and fluoroscope movement detection for target overlay

Also Published As

Publication number Publication date
US20130345719A1 (en) 2013-12-26
US9429696B2 (en) 2016-08-30

Similar Documents

Publication Publication Date Title
US9429696B2 (en) Systems and methods for reducing measurement error in optical fiber shape sensors
US10772485B2 (en) Systems and methods for reducing measurement error using optical fiber shape sensors
US11737682B2 (en) Systems and methods for registration of a medical device using a reduced search space
US11471066B2 (en) Systems and methods for configuring components in a minimally invasive instrument
US11607107B2 (en) Systems and methods for medical instrument force sensing
US10376178B2 (en) Systems and methods for registration of a medical device using rapid pose search
US20190343424A1 (en) Systems and methods for reducing measurement error using optical fiber shape sensors
US9918659B2 (en) Shape sensor systems for tracking interventional instruments and methods of use

Legal Events

Date Code Title Description
STCB Information on status: application discontinuation

Free format text: ABANDONED -- FAILURE TO RESPOND TO AN OFFICE ACTION