WO2015094726A1 - Marker-based tool tracking - Google Patents

Marker-based tool tracking

Info

Publication number
WO2015094726A1
Authority
WO
WIPO (PCT)
Prior art keywords
marker
surgical tool
ophthalmic surgical
image
eye
Prior art date
Application number
PCT/US2014/068899
Other languages
French (fr)
Inventor
Hugang REN
Lingfeng Yu
Original Assignee
Novartis Ag
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Novartis Ag filed Critical Novartis Ag
Priority to CN201480069020.2A (CN105828703B)
Priority to AU2014366911A (AU2014366911B2)
Priority to EP14872305.9A (EP3082569B1)
Priority to ES14872305T (ES2726895T3)
Priority to CA2932895A (CA2932895C)
Priority to JP2016541315A (JP6302070B2)
Publication of WO2015094726A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/102 For optical coherence tomography [OCT]
    • A61B 3/13 Ophthalmic microscopes
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/0059 Using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B 5/06 Devices, other than using radiation, for detecting or locating foreign bodies; determining position of probes within or on the body of the patient
    • A61B 5/061 Determining position of a probe within the body employing means separate from the probe, e.g. sensing internal probe position employing impedance electrodes on the surface of the body
    • A61B 5/064 Using markers
    • A61B 5/065 Determining position of the probe employing exclusively positioning means located on or in the probe, e.g. using position sensors arranged on the probe
    • A61B 5/066 Superposing sensor position on an image of the patient, e.g. obtained by ultrasound or x-ray imaging
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 Specially adapted to be attached to or worn on the body surface
    • A61B 5/6813 Specially adapted to be attached to a specific body part
    • A61B 5/6814 Head
    • A61B 5/6821 Eye
    • A61B 34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B 34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B 34/25 User interfaces for surgical systems
    • A61B 2034/2046 Tracking techniques
    • A61B 2034/2065 Tracking using image or pattern recognition
    • A61B 90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B 90/30 Devices for illuminating a surgical field, the devices having an interrelation with other surgical devices or with a surgical procedure
    • A61B 90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B 90/37 Surgical systems with images on a monitor during operation
    • A61B 90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B 2090/364 Correlation of different images or relation of image positions in respect to the body
    • A61B 2090/365 Augmented reality, i.e. correlating a live optical image with another image
    • A61B 2090/368 Changing the image on a display according to the operator's position
    • A61B 2090/373 Surgical systems with images on a monitor during operation using light, e.g. by using optical scanners
    • A61B 2090/3735 Optical coherence tomography [OCT]
    • A61B 2090/3937 Visible markers
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 9/00 Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F 9/007 Methods or devices for eye surgery
    • A61F 9/00736 Instruments for removal of intra-ocular material or intra-ocular injection, e.g. cataract instruments
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 21/00 Microscopes
    • G02B 21/0004 Microscopes specially adapted for specific applications
    • G02B 21/0012 Surgical microscopes

Definitions

  • the image processor 126 may display and overlay indicators to indicate the distal tip 146 of the ophthalmic surgical tool 130 or other surgical data for surgical guidance.
  • the image processor 126 may generate an indicator, such as a box, a circle, a star, or an arrow, and overlay the indicator into the image frame at the position of the distal tip 146 of the ophthalmic surgical tool 130.
  • the indicator may indicate an orientation, e.g., a pointing angle, of the ophthalmic surgical tool 130.
  • an arrow may be used as the indicator to indicate the pointing direction of the ophthalmic surgical tool 130.
  • the indicator may also include an image, such as an OCT image of a region of the retina 112, or a surgical setting parameter, such as a cutting speed of a vitrectomy probe.
  • the display 128 may display the image frame overlaid with the indicators.
  • Fig. 5 illustrates a fundus image 500 during an ophthalmic surgery.
  • the fundus image 500 may be displayed or processed and displayed on the display 128.
  • the distal portion 144 of the ophthalmic surgical tool 130 is inserted into the eye 101.
  • the distal portion 144 may have a marker 114.
  • once the image processor 126 extracts the marker image and determines the position and orientation of the marker 114, it may generate and overlay indicators 502, 504, and 506 onto the fundus image.
  • Indicator 502 may be a dot positioned over the marker 114 or at the position of the surgical tool 130 in the fundus image 500. Thus, the indicator 502 may indicate the position of the marker 114 or the position of the surgical tool 130 in real time. Indicator 504 may be an arrow positioned on or adjacent to the marker 114. The arrow of indicator 504 may indicate an orientation or pointing direction of the ophthalmic surgical tool 130. Indicator 506 may be text or an image positioned at or adjacent to the marker 114. The indicator may include a name or description of the ophthalmic surgical tool, and may also include diagnosis information, surgical status, warnings, or other information, for example the brand name, identification, and/or a functional description. In the example shown in Fig. 5, indicator 506 has the text "Alcon" to indicate the brand name of the ophthalmic surgical tool (see the overlay-drawing sketch after this list).
  • operating parameters such as temperature, flow rate, revolutions per minute (RPM) speed, pressure, and the like, may be included in the indicator 506 to provide additional information to the user.
  • images of the tissue under examination from other imaging modalities such as optical coherence tomography (OCT), fluorescein angiography (FA), and the like, may be included in the indicator 506 to provide additional guidance to the user.
  • the image processor 126 may perform the method 400 for each image frame as the image frame is displayed to continuously track the position and orientation of the distal tip 146 of the ophthalmic surgical tool 130 in real time.
  • the display 128 may display the indicators 502, 504, and 506 in a real-time video of the fundus to track the position and movement of the distal tip 146 of the ophthalmic surgical tool 130. Accordingly, a surgeon may view the display 128 to visualize the movement of the ophthalmic surgical tool 130 in the eye 101 during a surgery.
  • the above systems and methods may be applied to track an infrared marker 114 using an imaging device 124 configured to capture images in the infrared spectrum. Further, an OCT-enabled surgical tool 130 may use the above systems and methods to track the area of the eye 101 being imaged by the OCT surgical probe 130 around the area of the distal tip 146 of the OCT surgical probe 130.
  • the indicator may include identification or parameters of the ophthalmic surgical tool 130.
  • the indicator may identify that the ophthalmic surgical tool 130 is an irrigation tool with a certain liquid flow rate, pressure, and temperature. By displaying these parameters, a surgeon may keep track of the operation of the ophthalmic surgical tool 130 without looking away from the display 128.
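
The overlay step described by indicators 502, 504, and 506 can be sketched with basic drawing primitives. The example below is only illustrative: the disclosure names no library, so OpenCV is assumed, and the colors, sizes, and default label text are placeholder choices; any surgical data (tool name, operating parameters, OCT/FA thumbnails) could be rendered the same way.

```python
import cv2

def draw_indicators(frame, marker_center, tip_xy, label="Alcon"):
    """Sketch of the overlay: dot (502), arrow (504), and text (506)."""
    cx, cy = int(marker_center[0]), int(marker_center[1])
    # Indicator 502: a dot at the marker / tool position.
    cv2.circle(frame, (cx, cy), 6, (0, 0, 255), -1)
    # Indicator 504: an arrow from the marker toward the distal tip,
    # showing the pointing direction of the tool.
    cv2.arrowedLine(frame, (cx, cy), tip_xy, (0, 255, 255), 2, tipLength=0.2)
    # Indicator 506: text adjacent to the marker (name, parameters, status).
    cv2.putText(frame, label, (cx + 10, cy - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 0.6, (255, 255, 255), 2)
    return frame
```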

Abstract

An ophthalmic surgical tool has a marker positioned at a distal portion of the ophthalmic surgical tool. The distal portion of the ophthalmic surgical tool is inserted into an eye along with the marker to perform surgeries in the eye. An imaging device captures images of the fundus of the eye including the distal portion of the ophthalmic surgical tool. An image processor processes the captured images to identify and extract the marker from the captured images. The marker has a high-contrast feature in a visible light or infrared light range or other spectral ranges. Thus, the image processor may identify and extract the marker from the captured images. Indicators may be generated based on the marker, overlaid with the captured or processed images of the fundus, and displayed to indicate surgical information to the user.

Description

MARKER-BASED TOOL TRACKING
BACKGROUND
The devices, systems, and methods disclosed herein relate generally to marker-based tool tracking, and more particularly, to devices, systems, and methods that are configured to perform marker-based tool tracking in ophthalmic surgeries.
Surgical tools, such as surgical imaging probes, surgical forceps, surgical scissors, surgical vitrectomy probes, and the like, may be inserted into an eye during an ophthalmic surgery to perform various surgeries in the eye. Typically, a distal portion of a surgical tool is inserted into the eye during the ophthalmic surgery. Thus, the area of the eye surrounding the distal tip of the surgical tool is a region of interest to a surgeon. To achieve guided surgical interventions, such as intra-operative Optical Coherence Tomography (OCT) for Internal Limiting Membrane (ILM) peeling, automatic tool tip tracking is used to efficiently close a feedback loop to allow the OCT engine to locate the scanning target area. Further, to provide real time feedback during surgery, useful surgical data may be overlaid onto the surgeon's current area of interest. When a surgeon moves the distal portion of the surgical tool inserted into the eye, the area of interest may shift accordingly. Thus, automatic tool tracking may be used to locate the area of interest to adjust the surgical data overlay so the surgeon may visualize it without looking away from the current area of interest.
There are three conventional techniques for general object tracking. The first technique is motion-based object tracking. Motion-based object tracking may be used for automated surveillance. Motion-based object tracking may use image processing algorithms, such as background subtraction, frame difference, and optical flow, to track an object. Nevertheless, motion-based object tracking algorithms require a quasi-stationary background and may not be suitable for tool tracking in an ophthalmic surgery, in which the background may vary constantly.
The second technique for general object tracking is region-based object tracking. Region-based object tracking may be used for tracking simple objects. In region-based object tracking, an object template is preselected offline or during a first frame. For the subsequent frames, the template is searched across the whole field of view and the location with the greatest similarity to the template is identified as the object. Nevertheless, region-based object tracking is sensitive to object pose variations and local illumination changes and may not be suitable for tool tracking in an ophthalmic surgery, in which illumination and orientation of the tool vary greatly.
The third technique for general object tracking is feature-based object tracking. Feature-based object tracking may extract and search for unique features of an object, such as contour, edge, shape, color, corner/interest point and the like, across the entire field of view for object detection. Nevertheless, the feature-based tracking algorithm requires a high-contrast feature that is not sensitive to environmental and object pose changes and is unique to the object. Since most surgical tools do not possess high-contrast intrinsic features, feature-based object tracking may not provide suitable results.
In a vitreo-retinal surgery, illumination conditions may be challenging for tool tracking. An endo illuminator may be inserted into the eye for illumination. Because the endo illuminator may move during a surgery, the illumination condition may vary greatly from image frame to image frame and the images of the fundus area being illuminated may change greatly over time. Motion-based and region-based object tracking techniques may be difficult to implement under inconsistent illumination conditions. Further, with a single illuminator illuminating from one side, shadow artifacts and specular reflection from the surgical tool may increase complexity for tool tracking. Moreover, in order to capture a fundus image through a video camera, a beam path of imaging light from the eye may pass through multiple optical elements and media, such as the vitreous body, an aged crystalline lens, the cornea, and Binocular Indirect Ophthalmomicroscope (BIOM) lenses. These optical elements in the beam path of imaging light may further degrade the image quality and reduce contrast. Thus, it may be difficult to extract an intrinsic feature of various surgical tools to achieve real time tool tracking.
The present disclosure is directed to devices, systems, and methods that address one or more of the disadvantages of the prior art.
SUMMARY
In an exemplary aspect, the present disclosure is directed to an ophthalmic surgical tool tracking system. The ophthalmic surgical tool tracking system may include an ophthalmic surgical tool with a distal portion configured to be inserted into an eye, a marker positioned at the distal portion of the ophthalmic surgical tool, and a tool position detection system configured to determine a position of the distal portion of the ophthalmic surgical tool in the eye by detecting the marker.
The tool position detection system may include a light source configured to introduce light into the eye, an imaging device configured to capture an image of the eye along with the distal portion of the ophthalmic surgical tool inserted into the eye, a processor, and a display. The processor is configured to process the captured image to identify the marker, generate indicators indicating surgical data, such as a position of a surgical tool, an orientation of a surgical tool, an image, or a surgical setting parameter, and overlay the indicators on the captured image or a processed image. The display is configured to display the captured image or a processed image with the overlaid indicators.
In another exemplary aspect, the present disclosure is directed to a method for tracking an ophthalmic surgical tool inserted into an eye. The method may include introducing an imaging light into an eye, capturing an image of the eye along with a distal portion of an ophthalmic surgical tool inserted into the eye, and determining a position of the distal portion of the ophthalmic surgical tool in the eye by detecting a marker positioned at the distal portion of the ophthalmic surgical tool.
In some aspects, determining a position of the distal portion may include processing the captured image to identify the marker, generating indicators indicating surgical data, such as a position of a surgical tool, an orientation of a surgical tool, an image, or a surgical setting parameter, overlaying the indicators on the captured image or a processed image, and displaying the captured image or a processed image with the overlaid indicators on a display.
It is to be understood that both the foregoing general description and the following detailed description are exemplary and explanatory in nature and are intended to provide an understanding of the present disclosure without limiting the scope of the present disclosure. In that regard, additional aspects, features, and advantages of the present disclosure will be apparent to one skilled in the art from the following detailed description.
BRIEF DESCRIPTION OF THE DRAWINGS
The accompanying drawings illustrate embodiments of the devices and methods disclosed herein and together with the description, serve to explain the principles of the present disclosure.
Fig. 1 illustrates a schematic diagram of an exemplary ophthalmic surgical tool tracking system according to an aspect consistent with the principles of the present disclosure.
Fig. 2 illustrates a distal portion of an exemplary surgical tool according to an aspect consistent with the principles of the present disclosure.
Fig. 3 illustrates a perspective view of a marker and various types of markers according to an aspect consistent with the principles of the present disclosure.
Fig. 4 is a flow chart illustrating a method for tool-tracking according to an aspect consistent with the principles of the present disclosure.
Fig. 5 illustrates an image of a fundus during an ophthalmic surgery according to an aspect consistent with the principles of the present disclosure.
DETAILED DESCRIPTION
For the purposes of promoting an understanding of the principles of the present disclosure, reference will now be made to the embodiments illustrated in the drawings, and specific language will be used to describe the same. It will nevertheless be understood that no limitation of the scope of the disclosure is intended. Any alterations and further modifications to the described systems, devices, and methods, and any further application of the principles of the present disclosure are fully contemplated as would normally occur to one skilled in the art to which the disclosure relates. In particular, it is fully contemplated that the systems, devices, and/or methods described with respect to one embodiment may be combined with the features, components, and/or steps described with respect to other embodiments of the present disclosure. For the sake of brevity, however, the numerous iterations of these combinations will not be described separately. For simplicity, in some instances the same reference numbers are used throughout the drawings to refer to the same or like parts.
The devices, systems, and methods described herein provide an ophthalmic surgical tool tracking system that includes an ophthalmic surgical tool with a marker positioned at a distal portion of the ophthalmic surgical tool. The system tracks the location of the marker when the distal portion of the ophthalmic surgical tool, along with the marker, is inserted into an eye to perform surgeries in the eye. A light source, e.g., an endo illuminator, may introduce light into a fundus of the eye. The light may be reflected off the fundus of the eye and the distal portion of the ophthalmic surgical tool. An imaging device of the system may receive the reflected light to capture images of the fundus of the eye and the distal portion of the ophthalmic surgical tool.
The system also may include an image processor which may process the captured images to identify and extract the marker from the captured images. The marker may have a high-contrast feature in a visible light or infrared light range or other spectral ranges. Thus, the image processor may identify and extract the marker from the captured images. The location of the marker may be used to generate indicators indicating surgical data, such as a position of a surgical tool, an orientation of a surgical tool, an image, or a surgical setting parameter. The indicators are then overlaid with the captured images of the fundus or processed images and displayed to a user. By using the above marker-based ophthalmic surgical tool tracking system, robust tool tracking may be implemented during the ophthalmic surgery, because the high-contrast marker is less sensitive to illumination changes or shadow artifacts that frequently occur in ophthalmic surgeries. Further, the system may indicate surgical data, such as a presence, position, and orientation of the surgical tool, an image, or a surgical setting parameter, in real time to assist a surgeon during an ophthalmic surgical operation in the eye.
Fig. 1 illustrates an exemplary ophthalmic surgical tool tracking system, generally designated 100, disposed relative to an eye 101 under treatment. The eye 101 includes a sclera 102, a cornea 104, an anterior chamber 106, and a posterior chamber 108. A capsular bag 110 is illustrated in the posterior chamber 108. The eye 101 further includes a retina 112. An ophthalmic surgical tool 130 may be used to perform surgery in the eye 101. The ophthalmic surgical tool 130 may be sized and shaped to be handled by a surgeon and to protrude into the eye 101 of the patient.
The ophthalmic surgical tool 130 may include a proximal portion 148 and a distal portion 144. The proximal portion 148 may be sized and shaped for handheld grasping by a user. For example, the proximal portion 148 may define a handle which is sized and shaped for grasping by a single hand of the user. In use, the user may control the position of the distal portion 144 by maneuvering the proximal portion 148. The distal portion 144 of the ophthalmic surgical tool 130 may include a marker 114. The marker may have a high contrast feature in the visible light or infrared spectrum or other spectral ranges detectable by an imaging device 124 of the ophthalmic surgical tool tracking system 100.
The ophthalmic surgical tool tracking system 100 also may include a light source 122, e.g., an endo illuminator. The light source 122 may have a distal portion that is configured to be inserted into the eye 101. A distal tip of the light source 122 may emit an imaging light that may illuminate a fundus of the eye 101. The fundus is an interior surface of the eye 101 and may include the retina 112. The imaging light from the light source 122 may be reflected from the fundus and the distal portion 144 of the ophthalmic surgical tool 130. The reflected imaging light may pass through the capsular bag 110, the anterior chamber 106, the cornea 104, and be received by the imaging device 124, which is configured to capture fundus images of the eye 101. Lenses 132 and 134 may be provided between the eye 101 and the imaging device 124 to receive the reflected imaging light from the fundus and direct the imaging light to the imaging device 124.
In some embodiments, the imaging device 124 may include one or more video cameras configured to capture images of the fundus. The video camera may capture images in the visible spectrum, the infrared spectrum, or other spectral ranges. For example, imaging device 124 may include either or both a video camera that captures images of the fundus in the visible spectrum and a video camera that captures infrared images of an infrared marker 114 near the fundus in the infrared spectrum.
The ophthalmic surgical tool tracking system 100 also may include an image processor 126. The image processor 126 may receive image frames captured by the imaging device 124 and perform various image processing on the image frames. In particular, the image processor 126 may perform image analysis on the image frames to identify and extract the image of the marker 114 from the image frames. Further, the image processor 126 may generate indicators and overlay the indicators on the image of the fundus or a processed image. The indicators may include surgical data, such as the position and orientation of the marker 114 in the image of the fundus, the position and orientation of a distal tip 146 of the ophthalmic surgical tool 130, an image, or a surgical setting parameter. The overlaid image may then be displayed by a display 128 to the user.
The imaging device 124, the image processor 126, and the display 128 may be implemented in separate housings communicatively coupled to one another or within a common console or housing. A user interface 136 may be associated with the display 128 and/or the image processor 126. It may include, for example, a keyboard, a mouse, a joystick, a touchscreen, an eye tracking device, a speech recognition device, a gesture control module, dials, and/or buttons, among other input devices. A user may enter desired instructions or parameters at the user interface 136 to control the imaging device 124 for taking images of the eye 101. During an ophthalmic surgery, a surgeon may review the images of the fundus and/or the overlaid indicators on the display 128 to visualize the operation and the relative position of the distal tip 146 of the ophthalmic surgical tool 130 within various portions of the fundus.
Fig. 2 is an enlarged view of the distal portion 144 of the ophthalmic surgical tool 130. One or more markers 114 may be positioned at the distal portion 144. In particular, the markers 114 may wrap around the distal portion of the ophthalmic surgical tool 130, such that markers 114 may be visible in all directions even when the ophthalmic surgical tool 130 is rotated. The markers 114 may be labels or printed decals bonded to the ophthalmic surgical tool 130. The markers 114 may be formed with synthetic material, such as plastic. Thus, the markers 114 may not deteriorate in biological tissue. Further, the markers 114 may be bio-compatible and may not interfere with or react with biological tissues. In some embodiments, the markers 114 may be inscribed on an exterior surface of the ophthalmic surgical tool 130. Thus, the markers 114 may be a layer of paint inscribed on the exterior surface of the ophthalmic surgical tool 130. In some embodiments, the markers 114 may be embedded into the wall of the ophthalmic surgical tool 130. In addition, the markers 114 may have a high-contrast feature in the visible, infrared, or other spectra. A high-contrast feature may be a color or a pattern that is distinguishable from colors or patterns in the fundus. For example, a high-contrast color may be a green color, which typically does not appear in a fundus of an eye. It is noteworthy that adding markers 114 should not increase the size of the surgical tools.
Fig. 3 illustrates various examples of the markers 114. The marker 114 may have a ring or ribbon shape configured to wrap around the distal portion 144 of the ophthalmic surgical tool 130. The marker 114 may have an inner surface 116 and an outer surface 118. The inner surface 116 may have adhesives and be configured to adhere or bond to an exterior surface of the ophthalmic surgical tool 130. The exterior surface of the distal portion 144 of the ophthalmic surgical tool 130 may have a circumferential groove configured to accommodate the ring- or ribbon-shaped marker 114. Thus, the marker 114 may fit securely in the circumferential groove. The outer surface 118 of the marker 114 may have colors or patterns configured to distinguish the marker 114 from other elements in the fundus image.
One or more markers 114 may be used for the ophthalmic surgical tool 130.
The marker 114 may be formed of bio-compatible and/or synthetic materials, such as sterile plastic. In some embodiments, the marker 114 may be a layer of paint inscribed on an exterior surface of the distal portion 144 of the ophthalmic surgical tool 130. The markers 114 may overlap each other or be separated from each other. The markers 114 may have one or more high-contrast colors. For example, the markers 114 may have a green color, which does not appear in a typical fundus image. Thus, the green markers 114 may be distinguished from other elements in the fundus image. The markers 114 may have various color, texture, or spectral contrasts. In particular, the markers 114 may include patterns that may identify an orientation and angle of the ophthalmic surgical tool 130. For example, as shown in Fig. 3, marker 114a may have a solid high-contrast color. When the ring- or ribbon-shaped marker 114a is cut open, the marker 114a may be a ribbon in a solid color. In another example, marker 114b may have a texture pattern that may distinguish the marker 114b from the background fundus image. Exemplary marker 114c may include an infrared color configured to reflect or emit infrared light. Markers 114 with various spectral absorption/emission also may be used.
The markers 114 may include letters, numbers, bar codes, patterns, symbols, or pictures. Exemplary marker 114d may include letters. As shown in Fig. 3, assuming that the marker 114d wraps 360 degrees around the distal portion 144 of the ophthalmic surgical tool 130, a letter "A" may be positioned near the zero degree position and the letter "E" may be positioned near the 360 degree position. Letters "B," "C," and "D" may be positioned in between "A" and "E" at respective positions. Thus, based on the orientation of the letters, the rotational position of the marker 114d and indirectly the rotational position of the ophthalmic surgical tool 130 may be determined. Exemplary marker 114e may include numbers "1" to "5." Similarly, the numbers may indicate a rotational position of the ophthalmic surgical tool 130. Further, the orientation of the letters or numbers also may indicate a tilting angle of the ophthalmic surgical tool 130. For example, the numbers or letters may be oriented relative to the distal tip 146 of the ophthalmic surgical tool 130 such that the bottoms of the numbers or letters face toward the distal tip 146. Thus, based on the orientation of the numbers or letters, the tilting angle of the distal tip 146 may be determined.
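Purely as an illustration of how a character marker such as 114d or 114e could be decoded, the minimal Python sketch below maps a recognized letter to a roll angle and reads the tilt from the measured in-image rotation of the glyph. The even angular spacing of the letters and the character recognizer itself are assumptions not stated in the disclosure.

```python
# Hypothetical layout: letters "A".."E" assumed evenly spaced around the
# circumference of marker 114d (the text only says "at respective positions").
LETTER_ANGLES_DEG = {"A": 0, "B": 90, "C": 180, "D": 270, "E": 360}

def decode_roll_and_tilt(visible_letter, glyph_rotation_deg):
    """Return (roll, tilt) in degrees implied by a recognized character.

    `visible_letter` is the character facing the camera; `glyph_rotation_deg`
    is its measured in-image rotation relative to an upright glyph. Because
    the letter bottoms face the distal tip 146, that rotation approximates
    the tool's tilting angle in the image plane.
    """
    roll_deg = LETTER_ANGLES_DEG[visible_letter]   # rotational position of the tool
    tilt_deg = glyph_rotation_deg                  # tilting angle toward the distal tip
    return roll_deg, tilt_deg
```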
Exemplary marker 114f may include barcodes or stripes. The direction of the stripes may indicate a tilting angle of the ophthalmic surgical tool 130. Further, the number of stripes may vary to indicate a rotational position of the marker 114f and, indirectly, the rotational position of the ophthalmic surgical tool 130. Marker 114g may have various dot patterns. The number of dots may indicate the rotational position of the marker 114g, and the alignment of the dots may indicate a tilting angle of the marker 114g. Other symbols also may be used on the markers 114. For example, various symbols, such as shapes or non-character symbols, may be used at different rotational positions of the markers 114h and 114i to indicate rotational positions. In addition, a picture may be used to indicate rotational and tilt positions of the marker 114j. Other patterns or symbols that may indicate an orientation and position of the ophthalmic surgical tool 130 also may be used on the markers 114.
Fig. 4 is a flow chart illustrating a method 400 for tracking an ophthalmic surgical tool 130 inserted in an eye 101. As noted above, a light source 122 may introduce imaging light into the fundus of the eye 101. The reflected imaging light from the fundus may be guided by lenses 132 and 134 and received by imaging device 124. At 402, the imaging device 124 may capture images of the fundus. In particular, the imaging device 124 may capture frames of images to form a video. Each image frame may be forwarded to the image processor 126 to be processed and analyzed.
At 404, the image processor 126 may perform contrast and feature enhancement processing on the image frame. For example, the image processor 126 may receive the image frame in Red-Green-Blue (RGB) format. At 404, the image processor 126 may convert the RGB image frame into a Hue-Saturation-Value (HSV) space. At 406, after the image frame has been enhanced to bring out the contrast and features, the image processor 126 may determine a first-order estimation mask of the marker 114. For example, based on a predetermined color of the marker 114, the image processor 126 may apply criteria to the hue and saturation channels of the HSV image frame to separate the marker 114 from the background and estimate the image of the marker 114.
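The HSV-based mask estimation described above can be illustrated with a short sketch. The following Python fragment is only an illustration of steps 404 and 406, assuming the OpenCV and NumPy libraries and a green marker; the function name, hue band, and saturation threshold are hypothetical values and are not part of this disclosure.

```python
import cv2
import numpy as np

def estimate_marker_mask(frame_rgb, hue_range=(40, 85), min_saturation=80):
    """First-order estimation mask of a green marker (steps 404/406).

    Converts an RGB frame to HSV and keeps pixels whose hue falls in the
    assumed green band with sufficient saturation.
    """
    hsv = cv2.cvtColor(frame_rgb, cv2.COLOR_RGB2HSV)
    lower = np.array([hue_range[0], min_saturation, 0], dtype=np.uint8)
    upper = np.array([hue_range[1], 255, 255], dtype=np.uint8)
    mask = cv2.inRange(hsv, lower, upper)
    # Morphological opening suppresses isolated pixels unlikely to belong to the marker.
    kernel = cv2.getStructuringElement(cv2.MORPH_ELLIPSE, (5, 5))
    return cv2.morphologyEx(mask, cv2.MORPH_OPEN, kernel)
```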
At 408, the image processor 126 may extract the image of the marker 114 from the image frame. For example, the image processor 126 may implement a blob detection process to detect a boundary of the marker 114 in the image frame. A blob may be a region of the image frame where some properties, such as color and brightness, are approximately constant. The image processor 126 may search for regions of approximately constant properties in the image frame to detect blobs. Thus, the image processor 126 may find the boundary of the marker 114 and extract the marker 114 from the image frame.
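A minimal sketch of the blob extraction in step 408, assuming OpenCV contour detection is used to find the region of approximately constant properties; the helper name and the minimum-area threshold are illustrative assumptions rather than the disclosed implementation.

```python
import cv2

def extract_marker(frame_rgb, mask, min_area=50):
    """Find the largest blob in the estimation mask and return its contour,
    bounding box, and the cropped marker image (step 408)."""
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    blobs = [c for c in contours if cv2.contourArea(c) >= min_area]
    if not blobs:
        return None  # no marker-like region found in this frame
    marker_contour = max(blobs, key=cv2.contourArea)
    x, y, w, h = cv2.boundingRect(marker_contour)
    return marker_contour, (x, y, w, h), frame_rgb[y:y + h, x:x + w]
```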
At 410, the image processor 126 may analyze the shape and orientation of the marker 114 extracted from the image frame. Based on a predetermined pattern and color, the image processor 126 may determine the orientation of the marker 114 in the image frame. For example, if the marker 114 has stripes, the image processor 126 may determine the orientation of the marker 114 based on the orientation and direction of the stripes.
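For a striped marker such as 114f, the orientation analysis of step 410 could be sketched as follows, assuming OpenCV edge detection and a probabilistic Hough transform; the parameters are illustrative, and the dominant line angle stands in for the stripe direction.

```python
import cv2
import numpy as np

def estimate_stripe_orientation(marker_gray):
    """Return the dominant stripe angle in degrees within the extracted marker image."""
    edges = cv2.Canny(marker_gray, 50, 150)
    lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=20,
                            minLineLength=10, maxLineGap=3)
    if lines is None:
        return None
    angles = [np.degrees(np.arctan2(y2 - y1, x2 - x1))
              for x1, y1, x2, y2 in lines[:, 0]]
    return float(np.median(angles))  # median is robust to a few spurious lines
```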
At 412, the image processor 126 may determine the position and orientation of the distal tip 146 of the ophthalmic surgical tool 130 based on the position and orientation of the marker 114. For example, the marker 114 may be positioned a predetermined distance from the distal tip 146 of the ophthalmic surgical tool 130 and may have a pattern that indicates a pointing direction of the ophthalmic surgical tool 130, e.g., a stripe or an arrow. Thus, based on the position and the pattern of the marker 114, the image processor 126 may determine the position of the distal tip 146 of the ophthalmic surgical tool 130 and the pointing direction or orientation of the ophthalmic surgical tool 130.
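The geometric step from marker to distal tip can be written compactly: project from the marker position along the pattern-derived pointing direction by the predetermined marker-to-tip distance. The sketch below assumes image-plane coordinates and an offset expressed in pixels; the function and parameter names are hypothetical.

```python
import numpy as np

def estimate_tip_position(marker_centroid_px, pointing_angle_deg, tip_offset_px):
    """Project from the marker centroid along the tool's pointing direction
    to the expected image position of the distal tip (step 412)."""
    theta = np.radians(pointing_angle_deg)
    direction = np.array([np.cos(theta), np.sin(theta)])
    return np.asarray(marker_centroid_px, dtype=float) + tip_offset_px * direction
```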
At 414, the image processor 126 may overlay and display indicators to indicate the distal tip 146 of the ophthalmic surgical tool 130 or other surgical data for surgical guidance. For example, the image processor 126 may generate an indicator, such as a box, a circle, a star, or an arrow, and overlay the indicator into the image frame at the position of the distal tip 146 of the ophthalmic surgical tool 130. Further, the indicator may indicate an orientation, e.g., a pointing angle, of the ophthalmic surgical tool 130. For example, an arrow may be used as the indicator to indicate the pointing direction of the ophthalmic surgical tool 130. Further, the indicator may also include an image, such as an OCT image of a region of the retina 112, or a surgical setting parameter, such as a cutting speed of a vitrectomy probe. The display 128 may display the image frame overlaid with the indicators.
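A sketch of the overlay in step 414, assuming OpenCV drawing primitives; the dot, arrow, and text roughly correspond to the indicators 502, 504, and 506 described below, and the colors and label are illustrative choices.

```python
import cv2

def overlay_indicators(frame, marker_centroid, tip_position, label):
    """Overlay a tip dot, a pointing arrow, and a text label onto the frame (step 414)."""
    out = frame.copy()
    tip = tuple(int(v) for v in tip_position)
    base = tuple(int(v) for v in marker_centroid)
    cv2.circle(out, tip, 4, (255, 255, 0), -1)        # dot at the distal tip
    cv2.arrowedLine(out, base, tip, (0, 255, 0), 2)   # pointing direction of the tool
    cv2.putText(out, label, (base[0] + 10, base[1] - 10),
                cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 1)
    return out
```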
Fig. 5 illustrates a fundus image 500 during an ophthalmic surgery. The fundus image 500 may be displayed or processed and displayed on the display 128. As shown in the fundus image 500, the distal portion 144 of the ophthalmic surgical tool 130 is inserted into the eye 101. The distal portion 144 may have a marker 114. After the image processor 126 extracts the marker image and determines the position and orientation of the marker 114, the image processor 126 may generate and overlay indicators 502, 504, and 506 into the fundus image.
Indicator 502 may be a dot positioned over the marker 114 or at a position of the surgical tool 130 in the fundus image 500. Thus, the indicator 502 may indicate the position of the marker 114 or the position of the surgical tool 130 in real time. Indicator 504 may be an arrow positioned on or adjacent to the marker 114. The arrow of indicator 504 may indicate an orientation or pointing direction of the ophthalmic surgical tool 130. Indicator 506 may be text or an image positioned at or adjacent to the marker 114. The indicator may include a name or description of the ophthalmic surgical tool. The indicator also may include diagnosis information, surgical status, warnings, or other information. For example, the indicator may include the brand name, identification, and/or functional description. In the example shown in Fig. 5, indicator 506 includes the text "Alcon" to indicate the brand name of the ophthalmic surgical tool. In some embodiments, operating parameters, such as temperature, flow rate, revolutions per minute (RPM) speed, pressure, and the like, may be included in the indicator 506 to provide additional information to the user. In other embodiments, images of the tissue under examination from other imaging modalities, such as optical coherence tomography (OCT), fluorescein angiography (FA), and the like, may be included in the indicator 506 to provide additional guidance to the user.
The image processor 126 may perform the method 400 for each image frame as the image frame is displayed to continuously track the position and orientation of the distal tip 146 of the ophthalmic surgical tool 130 in real time. Thus, the display 128 may display the indicators 502, 504, and 506 in a real-time video of the fundus to track the position and movement of the distal tip 146 of the ophthalmic surgical tool 130. Accordingly, a surgeon may view the display 128 to visualize the movement of the ophthalmic surgical tool 130 in the eye 101 during a surgery.
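Putting the steps together, the per-frame tracking loop of method 400 might look like the sketch below. It chains the hypothetical helpers from the earlier sketches, uses cv2.VideoCapture as a stand-in for the imaging device 124, and assumes a fixed marker-to-tip offset of 40 pixels; none of these specifics come from the disclosure.

```python
import cv2

cap = cv2.VideoCapture(0)  # stand-in for the fundus video stream from imaging device 124
while cap.isOpened():
    ok, frame_bgr = cap.read()
    if not ok:
        break
    frame_rgb = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2RGB)  # description assumes RGB frames
    mask = estimate_marker_mask(frame_rgb)
    found = extract_marker(frame_rgb, mask)
    if found is not None:
        contour, (x, y, w, h), crop = found
        angle = estimate_stripe_orientation(cv2.cvtColor(crop, cv2.COLOR_RGB2GRAY))
        if angle is not None:
            centroid = (x + w / 2.0, y + h / 2.0)
            tip = estimate_tip_position(centroid, angle, tip_offset_px=40)
            frame_rgb = overlay_indicators(frame_rgb, centroid, tip, "Alcon")
    cv2.imshow("fundus", cv2.cvtColor(frame_rgb, cv2.COLOR_RGB2BGR))
    if cv2.waitKey(1) & 0xFF == 27:  # Esc exits the loop
        break
cap.release()
cv2.destroyAllWindows()
```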
The above systems and methods may be applied to track an infrared marker 114 using an imaging device 124 configured to capture images in the infrared spectrum. Further, an OCT-enabled surgical tool 130 may use the above systems and methods to track the area of the eye 101 being imaged by the OCT surgical probe 130 near the distal tip 146 of the OCT surgical probe 130.
In some embodiments, the indicator may include identification or parameters of the ophthalmic surgical tool 130. For example, the indicator may identify that the ophthalmic surgical tool 130 is an irrigation tool with a certain liquid flow rate, pressure, and temperature. By displaying these parameters, a surgeon may keep track of the operation of the ophthalmic surgical tool 130 without looking away from the display 128.
Persons of ordinary skill in the art will appreciate that the embodiments encompassed by the present disclosure are not limited to the particular exemplary embodiments described above. In that regard, although illustrative embodiments have been shown and described, a wide range of modification, change, and substitution is contemplated in the foregoing disclosure. It is understood that such variations may be made to the foregoing without departing from the scope of the present disclosure. Accordingly, it is appropriate that the appended claims be construed broadly and in a manner consistent with the present disclosure.

Claims

CLAIMS
We claim:
1. An ophthalmic surgical tool tracking system comprising:
an ophthalmic surgical tool comprising a distal portion configured to be inserted into an eye;
a marker positioned at the distal portion of the ophthalmic surgical tool; and
a tool position detection system comprising:
a light source configured to introduce an imaging light into a fundus of the eye;
an imaging device configured to receive the imaging light reflected from the fundus and capture an image of the fundus along with the distal portion of the ophthalmic surgical tool inserted into the eye; and
a processor configured to determine a position of the marker in the image of the fundus.
2. The ophthalmic surgical tool tracking system of claim 1, wherein the processor is configured to:
process the captured image to identify the marker;
generate an indicator indicating surgical data; and
overlay the indicator on a display.
3. The ophthalmic surgical tool tracking system of claim 1, wherein the marker has one or more high-contrast features in the visible light range.
4. The ophthalmic surgical tool tracking system of claim 1, wherein the marker is a decal adhered to the ophthalmic surgical tool.
5. The ophthalmic surgical tool tracking system of claim 1, wherein the marker is inscribed on a surface or embedded into a wall of the ophthalmic surgical tool.
6. The ophthalmic surgical tool tracking system of claim 1, wherein the marker includes a pattern configured to indicate an orientation of the marker in the captured image.
7. The ophthalmic surgical tool tracking system of claim 1, wherein the marker has a high-contrast feature in an infrared range or other spectral ranges.
8. The ophthalmic surgical tool tracking system of claim 2, wherein the processor is further configured to identify the marker in the captured image by:
performing a contrast and feature enhancement process on the captured image;
estimating an image of the marker in the enhanced image;
extracting the image of the marker from the enhanced image; and
determining a shape, a position, and an orientation of the marker from the image of the marker.
9. The ophthalmic surgical tool tracking system of claim 2, wherein the surgical data includes one or more of: a position of the ophthalmic surgical tool, an orientation of the ophthalmic surgical tool, an image, and a surgical setting parameter.
10. The ophthalmic surgical tool tracking system of claim 2,
wherein the imaging device is configured to capture a video of the eye, and
wherein the processor is configured to continuously generate and overlay the indicator in real time.
11. An ophthalmic surgical tool comprising:
a distal portion configured to be inserted into an eye; and
a marker disposed at the distal portion and configured to be captured by an imaging device, wherein the marker includes a pattern configured to indicate an orientation and a position of a distal tip of the ophthalmic surgical tool.
12. The ophthalmic surgical tool of claim 11, wherein the marker has one or more high-contrast features in the visible light range.
13. The ophthalmic surgical tool of claim 11, wherein the marker is a decal attached to the ophthalmic surgical tool.
14. The ophthalmic surgical tool of claim 11, wherein the marker is inscribed on a surface or embedded into a wall of the ophthalmic surgical tool.
15. The ophthalmic surgical tool of claim 11, wherein the marker has a high-contrast feature in an infrared range or other spectral ranges.
16. A method for tracking an ophthalmic surgical tool inserted in an eye, the method comprising:
introducing an imaging light into a fundus of an eye;
receiving, by an imaging device, the imaging light reflected from the fundus;
capturing, by the imaging device, an image of the fundus along with a distal portion of an ophthalmic surgical tool inserted into the eye; and
determining a position of a marker attached to the distal portion of the ophthalmic surgical tool.
17. The method of claim 16, wherein the determining a position of the marker comprises:
processing the captured image to identify the marker;
generating an indicator indicating surgical data;
overlaying the indicator on the captured image or a processed image; and
displaying the captured image or the processed image with the overlaid indicator on a display.
18. The method of claim 17, wherein the processing the captured image comprises:
performing a contrast and feature enhancement process on the captured image;
estimating an image of the marker in the enhanced image;
extracting the image of the marker from the enhanced image; and
determining a shape, a position and an orientation of the marker from the image of the marker.
19. The method of claim 17, wherein the surgical data includes one or more of: a position of the ophthalmic surgical tool, an orientation of the ophthalmic surgical tool, an image, and a surgical setting parameter.
20. The method of claim 17, further comprising capturing a video of the eye and continuously generating and overlaying the indicator in real time.
PCT/US2014/068899 2013-12-19 2014-12-05 Marker-based tool tracking WO2015094726A1 (en)

Priority Applications (6)

Application Number Priority Date Filing Date Title
CN201480069020.2A CN105828703B (en) 2013-12-19 2014-12-05 Tool tracking based on label
AU2014366911A AU2014366911B2 (en) 2013-12-19 2014-12-05 Marker-based tool tracking
EP14872305.9A EP3082569B1 (en) 2013-12-19 2014-12-05 Marker-based tool tracking
ES14872305T ES2726895T3 (en) 2013-12-19 2014-12-05 Instrument tracking based on a marker
CA2932895A CA2932895C (en) 2013-12-19 2014-12-05 Marker-based tool tracking
JP2016541315A JP6302070B2 (en) 2013-12-19 2014-12-05 Instrument tracking using markers

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US14/134,237 2013-12-19
US14/134,237 US9597009B2 (en) 2013-12-19 2013-12-19 Marker-based tool tracking

Publications (1)

Publication Number Publication Date
WO2015094726A1 (en)

Family

ID=53398779

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2014/068899 WO2015094726A1 (en) 2013-12-19 2014-12-05 Marker-based tool tracking

Country Status (8)

Country Link
US (1) US9597009B2 (en)
EP (1) EP3082569B1 (en)
JP (1) JP6302070B2 (en)
CN (1) CN105828703B (en)
AU (1) AU2014366911B2 (en)
CA (1) CA2932895C (en)
ES (1) ES2726895T3 (en)
WO (1) WO2015094726A1 (en)

Families Citing this family (28)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9597009B2 (en) 2013-12-19 2017-03-21 Novartis Ag Marker-based tool tracking
US9999350B2 (en) 2014-09-25 2018-06-19 Novartis Ag Reduced glare surgical microscope and associated devices, systems, and methods
US9645379B2 (en) * 2014-12-29 2017-05-09 Novartis Ag Magnification in ophthalmic procedures and associated devices, systems, and methods
US9639917B2 (en) * 2015-05-19 2017-05-02 Novartis Ag OCT image modification
US20170035287A1 (en) * 2015-08-04 2017-02-09 Novartis Ag Dynamic surgical data overlay
US9826900B2 (en) * 2015-08-17 2017-11-28 Novartis Ag Surgical microscope with integrated optical coherence tomography and display systems
US20170172667A1 (en) * 2015-12-16 2017-06-22 Novartis Ag Cannula with optical sensing
US11484363B2 (en) 2015-12-28 2022-11-01 Elbit Systems Ltd. System and method for determining the position and orientation of a tool tip relative to eye tissue of interest
IL243384A (en) * 2015-12-28 2017-05-29 Schneider Ron System and method for determining the position and orientation of a tool tip relative to eye tissue of interest
WO2017169823A1 (en) * 2016-03-30 2017-10-05 ソニー株式会社 Image processing device and method, surgery system, and surgical member
US11071449B2 (en) * 2016-03-31 2021-07-27 Alcon Inc. Visualization system for ophthalmic surgery
IL245560B1 (en) 2016-05-09 2024-01-01 Elbit Systems Ltd Localized optical coherence tomography images for ophthalmological surgical procedures
US10460457B2 (en) * 2016-07-12 2019-10-29 Novartis Ag Adaptive adjustment of overlay image parameters
US10409051B2 (en) 2016-08-02 2019-09-10 Novartis Ag Extraction of microscope zoom level using object tracking
EP3285107B2 (en) * 2016-08-16 2024-02-28 Leica Instruments (Singapore) Pte. Ltd. Surgical microscope with gesture control and method for a gesture control of a surgical microscope
US10973585B2 (en) 2016-09-21 2021-04-13 Alcon Inc. Systems and methods for tracking the orientation of surgical tools
MX2019007116A (en) 2016-12-15 2019-09-16 Novartis Ag Adaptive image registration for ophthalmic surgery.
US10918445B2 (en) * 2016-12-19 2021-02-16 Ethicon Llc Surgical system with augmented reality display
WO2018193932A1 (en) * 2017-04-21 2018-10-25 ソニー株式会社 Information processing device, surgical tool, information processing method, and program
JP2018175790A (en) * 2017-04-21 2018-11-15 ソニー株式会社 Information processing device, information processing method and program
JP2020535890A (en) * 2017-10-06 2020-12-10 アルコン インコーポレイティド Tracking eye movements within tracking range
US11517474B2 (en) 2017-12-19 2022-12-06 Alcon Inc. Methods and systems for eye illumination
US11259960B2 (en) 2018-02-22 2022-03-01 Alcon Inc. Surgical instrument using detected light
EP3696593A1 (en) * 2019-02-12 2020-08-19 Leica Instruments (Singapore) Pte. Ltd. A controller for a microscope, a corresponding method and a microscope system
KR102390116B1 (en) * 2019-11-19 2022-04-25 주식회사 마이크로트 Implant device for lowering intraocular pressure with easy and safe method
JP7421787B2 (en) 2019-12-11 2024-01-25 リバーフィールド株式会社 Vitreous surgery lighting device
EP3881793A1 (en) * 2020-03-17 2021-09-22 CHU de NICE Surgical instrument and computer-implemented method for determining the position and orientation of such surgical instrument
DE102021202384B3 (en) 2021-03-11 2022-07-14 Carl Zeiss Meditec Ag Microscope system, medical instrument and calibration method

Family Cites Families (18)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09149876A (en) * 1995-11-30 1997-06-10 Olympus Optical Co Ltd Endoscope device
JPH10118076A (en) * 1996-10-24 1998-05-12 Olympus Optical Co Ltd Device for surgical operation under endoscope
US7867186B2 (en) * 2002-04-08 2011-01-11 Glaukos Corporation Devices and methods for treatment of ocular disorders
US8403828B2 (en) * 2003-07-21 2013-03-26 Vanderbilt University Ophthalmic orbital surgery apparatus and method and image-guide navigation system
JP4095044B2 (en) * 2004-04-19 2008-06-04 国弘 武蔵 Endoscopic endoscope
DE102004049258B4 (en) 2004-10-04 2007-04-26 Universität Tübingen Device, method for controlling operation-supporting medical information systems and digital storage medium
US9867669B2 (en) * 2008-12-31 2018-01-16 Intuitive Surgical Operations, Inc. Configuration marker design and detection for instrument tracking
US8506558B2 (en) * 2008-01-11 2013-08-13 Oraya Therapeutics, Inc. System and method for performing an ocular irradiation procedure
JP5477800B2 (en) * 2008-02-27 2014-04-23 株式会社日立製作所 Method of operating rotation state detection device and rotation state detection device
ATE509568T1 (en) * 2008-10-22 2011-06-15 Sensomotoric Instr Ges Fuer Innovative Sensorik Mbh METHOD AND DEVICE FOR IMAGE PROCESSING FOR COMPUTER-ASSISTED EYE OPERATIONS
US8903476B2 (en) * 2009-03-08 2014-12-02 Oprobe, Llc Multi-function optical probe system for medical and veterinary applications
US8996173B2 (en) 2010-09-21 2015-03-31 Intuitive Surgical Operations, Inc. Method and apparatus for hand gesture control in a minimally invasive surgical system
US9542001B2 (en) 2010-01-14 2017-01-10 Brainlab Ag Controlling a surgical navigation system
US8414564B2 (en) * 2010-02-18 2013-04-09 Alcon Lensx, Inc. Optical coherence tomographic system for ophthalmic surgery
US10085633B2 (en) * 2012-04-19 2018-10-02 Novartis Ag Direct visualization system for glaucoma treatment
US9492065B2 (en) 2012-06-27 2016-11-15 Camplex, Inc. Surgical retractor with video cameras
EP2986419B1 (en) 2013-04-16 2017-06-07 Atlas Copco Industrial Technique AB Power tool
US9597009B2 (en) 2013-12-19 2017-03-21 Novartis Ag Marker-based tool tracking

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030169603A1 (en) * 2002-03-05 2003-09-11 Luloh K. Peter Apparatus and method for illuminating a field of view within an eye
WO2005107845A1 (en) 2004-04-29 2005-11-17 Iscience Interventional Corporation Apparatus and method for ocular treatment
US20100208202A1 (en) * 2009-02-16 2010-08-19 Canon Kabushiki Kaisha Fundus camera
US20110282331A1 (en) * 2010-05-13 2011-11-17 Oprobe, Llc Optical coherence tomography with multiple imaging instruments
US20130038836A1 (en) * 2011-08-12 2013-02-14 Ronald T. Smith Portable pattern-generating ophthalmic probe

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3082569A4

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US11806092B2 (en) 2018-04-25 2023-11-07 Carl Zeiss Meditec Ag Microscopy system and method for operating the microscopy system

Also Published As

Publication number Publication date
EP3082569A4 (en) 2017-11-01
CN105828703B (en) 2019-01-01
EP3082569B1 (en) 2019-02-27
CN105828703A (en) 2016-08-03
JP2017501802A (en) 2017-01-19
US9597009B2 (en) 2017-03-21
US20150173644A1 (en) 2015-06-25
CA2932895C (en) 2019-03-19
EP3082569A1 (en) 2016-10-26
CA2932895A1 (en) 2015-06-25
AU2014366911B2 (en) 2017-03-30
JP6302070B2 (en) 2018-03-28
ES2726895T3 (en) 2019-10-10
AU2014366911A1 (en) 2016-07-07

Similar Documents

Publication Publication Date Title
CA2932895C (en) Marker-based tool tracking
US11071449B2 (en) Visualization system for ophthalmic surgery
US10973585B2 (en) Systems and methods for tracking the orientation of surgical tools
US9788906B2 (en) Context aware surgical systems for intraoperatively configuring imaging devices
US20180125333A1 (en) Efficient and interactive bleeding detection in a surgical system
Richa et al. Visual tracking of surgical tools for proximity detection in retinal surgery
Richa et al. Vision-based proximity detection in retinal surgery
EP2931161A1 (en) Markerless tracking of robotic surgical tools
US20170035287A1 (en) Dynamic surgical data overlay
EP3554383B1 (en) System for providing images for guiding surgery
Speidel et al. Automatic classification of minimally invasive instruments based on endoscopic image sequences
JP2011508618A (en) Method for detecting and / or tracking the location of characteristic eye components
CN111491549B (en) Method and system for eye illumination
WO2022206436A1 (en) Dynamic position identification and prompt system and method
WO2018109227A1 (en) System providing images guiding surgery
EP3881793A1 (en) Surgical instrument and computer-implemented method for determining the position and orientation of such surgical instrument
JP2019058494A (en) Laser treatment device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 14872305; Country of ref document: EP; Kind code of ref document: A1)
ENP Entry into the national phase (Ref document number: 2932895; Country of ref document: CA)
ENP Entry into the national phase (Ref document number: 2016541315; Country of ref document: JP; Kind code of ref document: A)
NENP Non-entry into the national phase (Ref country code: DE)
ENP Entry into the national phase (Ref document number: 2014366911; Country of ref document: AU; Date of ref document: 20141205; Kind code of ref document: A)
REEP Request for entry into the european phase (Ref document number: 2014872305; Country of ref document: EP)
WWE Wipo information: entry into national phase (Ref document number: 2014872305; Country of ref document: EP)