WO2016149155A1 - Methods and systems for registration using a microscope insert - Google Patents

Methods and systems for registration using a microscope insert

Info

Publication number
WO2016149155A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
microscope
processing unit
insert
eye
Prior art date
Application number
PCT/US2016/022236
Other languages
French (fr)
Inventor
Richard Awdeh
Original Assignee
Richard Awdeh
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Richard Awdeh filed Critical Richard Awdeh
Priority to EP16765529.9A priority Critical patent/EP3267892A4/en
Priority to US15/558,086 priority patent/US20180049840A1/en
Publication of WO2016149155A1 publication Critical patent/WO2016149155A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/20 Surgical microscopes characterised by non-optical aspects
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/13 Ophthalmic microscopes
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B6/00 Apparatus or devices for radiation diagnosis; Apparatus or devices for radiation diagnosis combined with radiation therapy equipment
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/0004 Microscopes specially adapted for specific applications
    • G02B21/0012 Surgical microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/18 Arrangements with more than one light path, e.g. for comparing two specimens
    • G02B21/20 Binocular arrangements
    • G02B21/22 Stereoscopic arrangements
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B21/00 Microscopes
    • G02B21/36 Microscopes arranged for photographic purposes or projection purposes or digital imaging or video purposes including associated control and data processing arrangements
    • G02B21/365 Control or image processing arrangements for digital or video microscopes
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B27/00 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
    • G02B27/0093 Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2055 Optical tracking systems
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B34/00 Computer-aided surgery; Manipulators or robots specially adapted for use in surgery
    • A61B34/20 Surgical navigation systems; Devices for tracking or guiding surgical instruments, e.g. for frameless stereotaxis
    • A61B2034/2046 Tracking techniques
    • A61B2034/2065 Tracking using image or pattern recognition
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/36 Image-producing devices or illumination devices not otherwise provided for
    • A61B2090/363 Use of fiducial points
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B90/00 Instruments, implements or accessories specially adapted for surgery or diagnosis and not covered by any of the groups A61B1/00 - A61B50/00, e.g. for luxation treatment or for protecting wound edges
    • A61B90/39 Markers, e.g. radio-opaque or breast lesions markers
    • A61B2090/3937 Visible markers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14 Arrangements specially adapted for eye photography
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F9/00 Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F9/007 Methods or devices for eye surgery
    • A61F9/008 Methods or devices for eye surgery using laser
    • A61F2009/00844 Feedback systems
    • A61F2009/00846 Eyetracking

Definitions

  • This disclosure relates generally to surgical microscopes and, in particular, to registration using a microscope insert for surgical microscopes.
  • a microscope insert includes a camera, a display device, a beam splitter, and a processing unit.
  • the camera is configured to receive a first portion of first light through a microscope from an object and generate a signal representing an image of the object.
  • the display device is configured to generate a graphical representation of information relevant to the object and project second light representing the graphical representation.
  • the beam splitter is configured to direct a second portion of the first light from the object and a first portion of the second light to a viewing device for simultaneously viewing the object and the information by a user.
  • the processing unit is configured to track motions of the object based on the image of the object and control the display device to adjust the graphical representation according to the motions of the object.
  • a method for tracking and registering an object in a microscope includes receiving first light from an object through a microscope; generating, based on a first portion of the first light, a first signal representing an image of the object; generating, according to the image of the object, a graphical representation of information relevant to the object; projecting second light corresponding to the graphical representation of the information; directing a second portion of the first light from the object and a first portion of the second light to a viewing device for simultaneously viewing the object and the information by a user; tracking the object based on the image of the object; and adjusting the graphical representation according to the tracking of the object.
  • Figure 1 is a schematic diagram of a microscope insert according to an embodiment
  • Figure 2 illustrates a surgical system including the microscope insert according to an embodiment
  • Figure 3 illustrates electronic connections between the microscope insert and an external computer system according to an embodiment
  • Figure 4 illustrates a microscope system including a microscope insert according to an embodiment
  • Figure 5 illustrates light paths within a microscope insert according to an embodiment
  • Figure 6A is a side view of various components assembled in a microscope insert according to an embodiment
  • Figure 6B is a top view of the various components assembled in the microscope insert according to an embodiment
  • Figures 7A and 7B are perspective views of a microscope insert having various components installed therein according to an embodiment
  • Figure 8 is a schematic diagram of a microscope insert according to an embodiment
  • Figure 9 is a schematic diagram of an insert driver circuit board for a microscope insert according to an embodiment
  • Figure 10 illustrates graphical information generated by a microscope insert according to an embodiment
  • Figure 11 illustrates a process for correcting a field of view of the microscope insert according to an embodiment
  • Figure 12 illustrates a process for generating an overlaid image in a microscope according to an embodiment
  • Figure 13 illustrates a process for marker-based registration and tracking according to an embodiment
  • Figure 14 illustrates a process for anatomical feature-based registration and tracking according to an embodiment
  • Figure 15A is an image of an eye with fiducial markers captured by a camera according to an embodiment
  • Figure 15B is an enhanced image of an eye with fiducial markers generated based on the image of Figure 15A;
  • Figure 15C is a binary mask generated based on the enhanced image of Figure 15B;
  • Figures 16A-16D illustrate a K-means clustering process for segmenting the image captured by a camera of the disclosed microscope insert according to an embodiment
  • Figure 17 is a mask image generated by a K-means clustered image according to the process of Figures 16A-16D;
  • Figure 18A is a reference image used for feature identification according to an embodiment
  • Figure 18B is a test image captured by a camera for feature identification according to an embodiment;
  • Figure 19 illustrates the mapping between the reference image of Figure 18A and the test image of Figure 18B according to an embodiment
  • Figure 20 illustrates a sequence diagram for interactions between a tracker engine configured to track the motions of an eye and a torsion engine configured to generate graphical representations for the guidance or prompts;
  • Figure 21 illustrates an exemplary tracking system for tracking the eye of a patient and generating the graphical representations for the guidance or prompts;
  • Figure 22 illustrates an exemplary process for carrying out the tracking by the tracker engine of Figure 21;
  • Figure 23 illustrates an exemplary process for reference image processing and generating a Hausdorff distance look-up table
  • Figure 24 illustrates an exemplary process for processing the sense image and computation of the minimum Hausdorff distance between the reference image template ROI and the ROI in the sense image
  • Figures 25-32 depict a process for estimation of ocular torsion from sclera features
  • a microscope insert 100 includes a projection system 104 and an imaging system 106.
  • Projection system 104 includes one or more display devices 110A and 110B and one or more sets of tube lenses 112A and 112B for projecting images from the display devices 110A and 110B.
  • Imaging system 106 includes one or more cameras 118A and 118B and one or more sets of tube lenses 112C and 112D for focusing images to cameras 118A and 118B.
  • Microscope insert 100 further includes one or more polarizing beam splitters (PBS) 120A and 120B, which will be further described below.
  • the above components of insert 100 form individual optical channels that generate respective images for left and right eyes of a user.
  • Each optical channel includes a display device 110A/110B, a camera 118A/118B, a polarizing beam splitter 120A/120B, and corresponding tube lenses 112A/112B and 112C/112D.
  • a polarizer element 114 may be disposed between tube lenses 112A/112B and polarizing beam splitters 120A/120B.
  • polarizer element 114 may include different pieces for respective optical channels.
  • Figure 1 shows two optical channels for microscope insert 100
  • insert 100 may have any number of optical channels, each having a structure similar to those depicted in Figure 1.
  • when microscope insert 100 includes two or more optical channels, the videos/images generated by the optical channels are configured so as to provide a user with stereoscopic rendering.
  • cameras 118A and 118B are digital imaging devices, such as the Point Grey FL3-U3-13S2C-CS manufactured by Point Grey Research. However, a number of different cameras may be used, providing different features, such as a CMOS or CCD based sensor, a global or rolling shutter, and a range of resolutions at about 20 FPS or higher.
  • Display devices 110A and 110B may be LCOS (Liquid Crystal on Silicon) microdisplay devices, each of which has pixels that can be individually adjusted to match or exceed the brightness of the microscope.
  • Other display technologies may also be used, such as OLED, DLP, T-OLED, MEMS, and LCD-based displays.
  • Insert 100 also includes a display driver circuit 102 to control display devices 110A and 110B and/or other system elements or features.
  • Display driver circuit 102 may generate video/image data that are suitable for rendering by display devices 110A and 110B.
  • Insert 100 is connected to a processing unit 108 via standard communication protocols.
  • Processing unit 108 may or may not be disposed within insert 100.
  • Processing unit 108 receives video/image signals from cameras 118A and 118B and sends the video/image signals to driver circuit 102 for rendering the videos/images on display devices 110A and 110B.
  • Processing unit 108 may apply additional processing on video/image data received from cameras 118A and 118B. For example, processing unit 108 may perform image processing techniques, such as image registration, pattern recognition, image filtering, image enhancement, and the like.
  • Processing unit 108 may also be connected to other peripherals to collect data to be used by microscope insert 100, to generate visual guidance for navigation during a surgical procedure, or to provide alternative graphical user interfaces on external display devices to supplement the display through microscope insert 100.
  • Figure 2 illustrates a surgical system 200 including a microscope insert 228 according to a further embodiment.
  • Surgical system 200 includes a microscope 226 coupled to microscope insert 228.
  • Microscope insert 228 generally corresponds to microscope insert 100 of Figure 1.
  • Insert 228 communicates with a processing unit 230, which corresponds to processing unit 108 of Figure 1.
  • Microscope 226 receives light or optical signals reflected from an object through its lens system and the polarized beam splitters (e.g., PBS's 120A and 120B), which pass the optical signals to the cameras (e.g., cameras 118A and 118B) of microscope insert 228.
  • the cameras of microscope insert 228 convert the optical signals to digital data representing videos/images of the object and transmit the digital data to processing unit 230.
  • Processing unit 230 performs image processing on the digital data and sends processed data and relevant commands to the driver circuit (e.g., driver circuit 102 of Figure 1) of microscope insert 228.
  • Based on the processed data and the commands from the driver circuit, display devices (e.g., display devices 110A and 110B) of microscope insert 228 generate optical signals representing processed videos/images of the object and project the optical signals to polarized beam splitters 120A and 120B. Polarized beam splitters 120A and 120B pass the optical signals to the eye pieces of microscope 226 for viewing by a user.
  • the driver circuit may also control, for example, the brightness or contrast of display devices 110A and 110B.
  • Processing unit 230 may also communicate with additional input devices, such as a QR code reader 202, a foot pedal 204, a USB switch 206, a power supply 208, and one or more external storage devices providing surgical planning data 210 or calibration and software update data 212. Additionally, processing unit 230 may be further connected to a surgical support system 224 that is suitable for the underlying surgery.
  • surgical support system 224 may be the Stellaris system manufactured by Bausch & Lomb Incorporated and suitable for ophthalmic procedures. Surgical support system 224 may collect the demographic and biological data of a patient and provide the data to processing unit 230.
  • system 200 may include various output devices, such as speakers 218, an external display device 220, and a remote display device 222.
  • External display device 220 and remote display device 222 may be high-resolution monitors that provide additional monitoring capability outside of insert 228.
  • Display devices 220 and 222 may be located in the same operating room as microscope 226 or at a remote location.
  • System 200 may further include one or more storage media for storing post-operation data 214 and system diagnostics data 216.
  • other system components shown in Figure 2 may also be located in different locations and connected to processing unit 230 through, for example, Ethernet, Internet, USB connections, Bluetooth connections, infrared connections, cellular connections, Wi-Fi connections, and the like.
  • FIG. 3 illustrates a surgical system 300 including a microscope insert 314 according to an alternative embodiment.
  • Microscope insert 314 generally corresponds to microscope insert 100 of Figure 1 and is configured to generate stereoscopic images as described herein.
  • insert 314 may include two imaging cameras, two display devices, a driver circuit, and other imaging and projection optics for left and right eyes of a user.
  • System 300 further includes a medical stand 302, an external monitor 312, a foot pedal 308, and a surgical support system 310.
  • Medical stand 302 may include a QR image scanner 304 configured to scan QR codes to provide scanned information to processing unit 306.
  • Medical stand 302 also includes a processing unit 306, which generally corresponds to processing unit 108 of Figure 1.
  • Processing unit 306 may include a motherboard with interfaces, such as USB 2.0, USB 3.0, Ethernet, etc.
  • Processing unit 306 may include a central processing unit (CPU) with heat sinks, a RAM, a video card, a power supply, a webcam, etc.
  • Processing unit 306 is connected to other system components through its communication interfaces, such as USB ports, Ethernet ports, Internet ports, HDMI interfaces, etc.
  • processing unit 306 may be connected to microscope insert 314 and external monitor 312 through HDMI interfaces to provide high resolution video/image data to the driver circuit of insert 314 and monitor 312.
  • processing unit 306 may also be connected to insert 314 and monitor 312 through USB ports to provide video/image data and control signals.
  • Processing unit 306 may be connected to the camera of insert 314 through USB ports to receive video/image data from the camera.
  • Foot pedal 308 and other user input devices may be connected to processing unit 306 through one or more USB ports. Foot pedal 308 may be operated by a user to provide user input during a surgery. For example, when the user presses foot pedal 308, foot pedal 308 may generate an electronic signal.
  • processing unit 306 may control insert 314 accordingly.
  • processing unit 306 may control insert 314 to change the videos/images generated by the display devices of insert 314. With each pressing of foot pedal 308, insert 314 may toggle between two sets of videos/images. Alternatively, insert 314 may cycle through a series of videos/images when foot pedal 308 is pressed. Still alternatively, pedal 308 may have a position sensor that generates a position signal indicating a position of pedal 308 when the user partially presses pedal 308. Upon receiving the position signal from pedal 308, processing unit 306 may determine the current position of pedal 308 and control insert 314 accordingly. Processing unit 306 may control insert 314 to generate a different set of videos/images corresponding to each position of pedal 308.
  • processing unit 306 controls insert 314 to generate a first set of videos/images.
  • processing unit 306 controls insert 314 to generate a second set of videos/images.
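As an illustration of this pedal-driven switching, the following is a minimal Python sketch; the class and method names are hypothetical, since the patent describes behavior rather than an API.

```python
class OverlayController:
    """Hypothetical controller that switches the insert's overlay sets
    in response to foot pedal events, as described above."""

    def __init__(self, overlay_sets):
        self.overlay_sets = overlay_sets  # e.g., lists of renderable graphics
        self.index = 0

    def on_full_press(self):
        # Each full press cycles to the next set of videos/images,
        # wrapping around; toggling is the two-set special case.
        self.index = (self.index + 1) % len(self.overlay_sets)
        return self.overlay_sets[self.index]

    def on_position(self, fraction):
        # Partial press with a position sensor: map the pedal position
        # (0.0 = released, 1.0 = fully pressed) to an overlay set.
        n = len(self.overlay_sets)
        self.index = min(int(fraction * n), n - 1)
        return self.overlay_sets[self.index]
```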
  • Surgical support system 310 may include an external data source and other surgical systems, such as a Bausch & Lomb Stellaris surgical system.
  • Surgical support system 310 may include biological sensors that collect biological or physiological data of the patient, including, for example, heart rate, blood pressure, electrocardiogram, etc.
  • Surgical support system 310 may further include a database that stores information of the patient, including the patient's medical history and healthcare record.
  • the database may also include information of the underlying surgical procedure such as pre-operation analysis and planning performed by a physician, data collected during the surgical procedure, and additional procedures recommended for post-operation follow-ups.
  • the database may also include information of the operating physician including his or her identification, association, qualification, etc.
  • Surgical support system 310 may be further connected to additional medical devices (not shown) such as an ultrasound imager, a magnetic resonance imaging device, a computed tomography device, etc., to collect additional image data of the patient.
  • Processing unit 306 may receive the information and data from surgical support system 310 and control insert 314 to generate images based on the information and data. For example, processing unit 306 may transmit the additional image data (e.g., ultrasound data, MRI data, CT data, etc.) received from system 310 to the driver circuit of insert 314 and control the driver circuit of insert 314 to render the additional image, through the display devices, along with the microscopic images of the patient provided by the microscope. Processing unit 306 may also generate additional image data representing the biological or physiological data collected from the patient and control insert 314 to render the additional image data through the display devices of insert 314.
  • Figures 4 and 5 illustrate the operation of a microscope insert according to an embodiment using insert 100 as an example.
  • microscope insert 100 may be integrated with a microscope 400 that is suitable for various purposes.
  • microscope 400 may be a stereoscopic, infinity-corrected, tube microscope.
  • microscope insert 100 may be adapted for use in other microscope layouts and stereoscopic devices known in the art.
  • Microscope 400 may include a viewing device 402 that allows a user to view images of an object 406 placed under the microscope. Viewing device 402 may be a heads-up device including one or more eye pieces, through which the images of the object are presented to the user. Microscope 400 further includes a set of lens elements 404 that receive light reflected from the object and form microscopic images of the object based on the reflected light. Lens elements 404 transmit the microscopic images of the object to tubes 406A and 406B of microscope 400. Tubes 406A and 406B form light transmission paths (i.e., light paths) that direct the microscopic image of the object toward viewing device 402. The microscopic image may be an analog image in an embodiment.
  • the polarizing beam splitters 120A and 120B are disposed in the respective light paths between lens elements 404 and viewing device 402 of the microscope, intercepting light coming from respective tubes 406A and 406B.
  • the beam splitters 120A and 120B may also be placed at other locations within the microscope as one of ordinary skill in the art will appreciate.
  • beam splitters 120A and 120B may serve two functions in insert 100. First, they may direct a first component of the light signals coming from the object to respective cameras 118A and 118B so that cameras 118A and 118B capture images of the object. Second, they may merge a second component of the light signals coming from the object that is passed through to viewing device 402 with light signals projected from the display devices 110A and 110B.
  • Beam splitter 120A/120B splits the light coming up from the object into two portions, directing a first portion (i.e., an S-polarized component S1) towards camera 118A/118B and a second portion (i.e., a P-polarized component P1) towards viewing device 402 of the microscope.
  • Lens 112C/112D between beam splitter 120A/120B and camera 118A/118B is used to focus the S-polarized component S1 exiting beam splitter 120A/120B onto the imaging sensor of camera 118A/118B.
  • polarizing beam splitter 120A/120B receives light signals representing a microscopic image of the object from lens elements 404 through tubes 406A and 406B.
  • Each of polarizing beam splitters 120A and 120B splits incident light signals by allowing one polarized component S1 to reflect and the other polarized component P1 to pass through.
  • the polarized component P1 that passes through beam splitter 120A/120B reaches viewing device 402 and provides the user with the microscopic image of the object for viewing.
  • the polarized component S1 is reflected by beam splitter 120A/120B toward respective camera 118A/118B through respective tube lens 112C/112D.
  • Camera 118A/118B receives the polarized component S1 reflected from beam splitter 120A/120B and converts the optical signals to electronic image data corresponding to the microscopic image of the object.
  • Camera 118A/118B may then transmit the electronic image data to processing unit 108 for further processing.
  • Beam splitter 120A/120B operates in a similar manner on the display device side.
  • display device 110A/110B renders images under the control of the driver circuit and projects light signals corresponding to the images to beam splitter 120A/120B through lens 112A/112B.
  • Lens 112A/112B between beam splitter 120A/120B and respective display device 110A/110B converts the light signals projected from display devices 110A/110B to parallel light rays to match the upward parallel light rays coming from tube 406A/406B.
  • Beam splitter 120A/120B splits the incident light signals coming from display devices 110A/110B, reflecting the S-polarized component S2 of the incident light signals originating from display devices 110A/110B and passing the P-polarized component P2 through to camera 118A/118B.
  • the reflected S-polarized component S2 from display devices 110A/110B is then merged or combined with the P-polarized component P1 passed through beam splitter 120A/120B from tube 406A/406B.
  • the images of the object provided by the P-polarized component P1 and the images from display device 110A/110B provided by the S-polarized component S2 may be simultaneously viewed by the user through viewing device 402.
  • the images generated by display devices 110A/110B appear as overlaid images on the images of the object formed by lens element 404.
  • Polarizing element 114 placed between lens 112A/112B and beam splitter 120A/120B is configured to adjust the polarization of those projected parallel rays from lens 112A/112B so as to adjust the ratio of the light component (i.e., the S2 component) reflected by beam splitter 120A/120B to the light component (i.e., the P2 component) passed through to camera 118A/118B.
  • the intensity of the S-polarized component S2 may be adjusted relative to the intensity of the P-polarized component P2.
  • the intensity of the S-polarized component S2 may be substantially equal to the intensity of the P-polarized component P2 so that the light signals projected from display devices 110A/110B are equally split by beam splitter 120A/120B.
  • the intensity of the S-polarized component S2 may also be adjusted relative to the intensity of the P-polarized component P1.
  • the images on the display device 110A/110B may be adjusted to be brighter or dimmer with respect to the images of the object when viewed through viewing device 402.
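One plausible way to quantify this adjustment, assuming an ideal linear polarizer obeying Malus's law and lossless optics (the patent states the effect but not the underlying math): if polarizer element 114 is rotated by an angle θ relative to the beam splitter's transmission (P) axis, the projected intensity I₀ splits as

```latex
\begin{aligned}
I_{P2} &= I_0 \cos^2\theta && \text{(passed through to camera 118A/118B)}\\
I_{S2} &= I_0 \sin^2\theta && \text{(reflected toward viewing device 402)}\\
\frac{I_{S2}}{I_{P2}} &= \tan^2\theta, && \theta = 45^\circ \Rightarrow I_{S2} = I_{P2}
\end{aligned}
```

so rotating the polarizer trades overlay brightness at the eyepiece against the display signal reaching the camera.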
  • the user of microscope 400 may view a combined image including the microscopic image of the object and the overlaid image generated by display device 110A/110B.
  • the optical components of the microscope insert may be adjusted so that the overlaid image may appear at a projection image plane 410 that substantially overlaps the focal plane of microscope 400 and is located within the depth of field 408 of microscope 400.
  • the microscope insert for a stereoscopic microscope includes a set of imaging and projection hardware for each of the right and left tubes of the microscope so as to generate stereoscopic images.
  • the insert includes four lenses 112A-112D, with lenses 112C and 112D configured to focus the images of the object onto left and right cameras 118A and 118B, and lenses 112A and 112B configured to project the images generated by left and right display devices 110A and 110B to beam splitters 120A and 120B.
  • these lenses may be incorporated in a lens set.
  • the microscope insert may include additional optical components, such as mirrors, prisms, or lenses, in the optical paths between the beam splitters and the cameras or between the beam splitter and the display devices to modify the directions of the light rays.
  • the modified light rays may allow the optical components of the insert to be more freely arranged or repositioned so as to fit into a desired mechanical or industrial form.
  • Figures 6A and 6B illustrate an embodiment of a microscope insert 600 including additional optical components to steer light rays.
  • Figures 6A and 6B show, respectively, a side view and a top view of major optical elements of microscope insert 600.
  • Microscope insert 600 includes two optical channels for rendering images, respectively, for left and right eyes of the user. Although only one optical channel is described here, one of ordinary skill in the art will appreciate that the optical channels include similar elements and operate in a similar manner.
  • Each optical channel of microscope insert 600 includes a polarizing beam splitter 624 disposed in the corresponding light pathway of the microscope and coupled to the tube of the microscope, from which light reflected by an object enters microscope insert 600.
  • a portion (i.e., the S-polarized component S1) of the incident light is diverted to a turning prism 625, which directs the S1 component through imaging lenses 627 on to a camera 604.
  • beam splitter 624 may include a polarizer element configured to adjust the ratio of the light component diverted to camera 604 to the light component passed through to the eyepiece.
  • the ratio may be, for example, 1:1, 1:2, 1:3, or other desired value.
  • the images generated by the processing unit and to be overlaid on the microscopic images of the object are rendered by a projection LCOS display panel 622 illuminated by an RGB LED light source 621.
  • the S-polarized light component S2 of the light generated by LED light source 621 is passed through a set of display illumination optics 620 including illumination lenses and a turning prism. From illumination optics 620, the S-polarized light component S2 is reflected at the hypotenuse of a polarizing beam splitter 623 to LCOS display panel 622.
  • LCOS display panel 622 acts as an active polarizer.
  • the P-polarized light component P2 passes through a projection lens module 628 and a polarizing wave plate 626 to tube polarizing beam splitter 624.
  • the P-polarized light component P2 is then directed to camera 604 by tube polarizing beam splitter 624 and steering prism 625.
  • the S-polarized light component S2 is diverted and reflected by tube polarizing beam splitter 624 to the eyepiece of the microscope, through which the user views the microscopic images of the object and the images generated by display panel 622. When viewed through the eyepiece, the images generated by display panel 622 are overlaid on the microscopic images of the object.
  • polarizing wave plate 626 may be omitted. Accordingly, the light from LCOS display panel 622 passes through tube polarizing beam splitter 624 without being reflected to the eyepiece. Instead, the light from LCOS display panel 622 is directed to turning prism 625 and, in turn, to imaging lens 627 and camera 604. The benefit of this configuration is that wave plate 626 can be removed to perform a calibration between display panel 622 and camera 604. Based on the calibration, the system may confirm that images generated by display panel 622 are aligned to the image space being measured by camera 604.
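A sketch of how such a calibration might be computed, assuming a chessboard-style pattern rendered on display panel 622 and a planar homography model; the pattern, grid dimensions, and file name below are illustrative assumptions, not taken from the patent.

```python
import numpy as np
import cv2

# With wave plate 626 removed, light from display panel 622 reaches camera
# 604 directly, so a known pattern drawn on the display can be located in a
# captured frame and a display-to-camera mapping fitted.
cols, rows, pitch = 9, 6, 40  # hypothetical inner-corner grid and pixel pitch
display_pts = np.array([[100 + x * pitch, 100 + y * pitch]
                        for y in range(rows) for x in range(cols)],
                       dtype=np.float32)  # corner positions drawn on the panel

frame = cv2.imread("calibration_frame.png", cv2.IMREAD_GRAYSCALE)
found, camera_pts = cv2.findChessboardCorners(frame, (cols, rows))
if found:
    # Assuming consistent corner ordering, H maps display-panel coordinates
    # into camera-sensor coordinates; its inverse places camera-measured
    # features into display space, confirming overlay/camera alignment.
    H, _ = cv2.findHomography(display_pts, camera_pts.reshape(-1, 2),
                              cv2.RANSAC)
```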
  • Figures 7A and 7B illustrate an embodiment of a microscope insert 700 that is similar to microscope insert 600 described above.
  • the components of microscope insert 700 are packaged and assembled on a base plate 711 so that microscope insert 700 is ready to be installed on a microscope.
  • insert 700 includes one or more optical channels, each including components similar to those of insert 600 illustrated in Figures 6A and 6B.
  • Each optical channel includes a camera 704 disposed in a camera housing affixed to base plate 711, a set of imaging lenses disposed in a lens tube 705, an imaging steering prism secured to the base plate by a prism bracket 706, a set of illumination optics disposed in an illumination optics housing 709, and a set of projection lenses disposed in a lens tube 710.
  • a focus mechanism is provided in imaging lens tube 705 and allows for fine adjustment of the relative position of the imaging lenses therein, for focusing.
  • a focus mechanism is also provided in display lens tube 710 and allows for fine adjustment of the position of the projection lenses for focusing.
  • Each optical channel further includes an RGB LED light source and a display panel mounted to base plate 711 through a display and RGB LED mounting bracket 714.
  • Microscope insert 700 further includes a driver circuit board 707 mounted to base plate 711 through a driver board bracket 708.
  • Microscope insert 700 further includes mounting components for mounting onto a microscope.
  • insert 700 includes a top mount 701 that may be coupled to the eyepieces of the microscope.
  • Top mount 701 may include features that allow the eyepieces to be secured thereon.
  • Top mount 701 is secured to base plate 711 through one or more top mount braces.
  • Top mount 701 includes one or more microscope tube openings that allow light to pass through from the polarizing beam splitters to the eye pieces of the microscope.
  • Top mount 701 further includes a wave plate slot 712 for disposing and securing the wave plate. The wave plate may be easily inserted into wave plate slot 712 or removed therefrom as desired.
  • Microscope insert 700 further includes a bottom mount flange 702 that may be coupled and secured to the microscope tube within the body of the microscope.
  • Figure 8 illustrates a microscope insert 800 according to another embodiment.
  • light reflected from the object under the microscope (not shown) is directed from the tubes (e.g., 406A and 406B of Figure 4) to respective cameras 818A and 818B by a group of reflective mirrors and prisms 820C, 820D, 822C, and 822D.
  • the images generated by display devices 810A and 810B are projected back to beam splitters 816A and 816B by another group of mirrors and prisms 820A, 820B, 822A, and 822B.
  • the arrangement in this embodiment allows the components to be disposed on a relatively small base plate that has a relatively small footprint, thereby easing integration in a variety of microscopic systems.
  • a polarizer element 814A/814B may be disposed in the light path between display device 810A/810B and beam splitter 816A/816B and is used to vary the amount of light passed through to camera 818A/818B from display device 810A/810B.
  • Polarizing element 814A/814B may be a set of polarizers, wave plates, or variable retarders, depending on the output polarization of display devices 810A and 810B.
  • display device 810A/810B outputs an S-polarized component, which is then rotated by a 1/2-lambda wave plate in polarizing element 814A/814B so as to be reflected upwardly to the eyepiece for viewing by the user.
  • the microscope inserts disclosed herein may create a stereoscopic image.
  • the inserts may create separate images for the left and right eyes of the user. The images are shifted with respect to each other to provide the perception of different convergence, resulting in stereoscopic rendering.
  • FIG. 9 is a schematic diagram of a display driver circuit 900 according to an embodiment.
  • Display driver circuit 900 generally corresponds to driver circuit 102 of Figure 1.
  • Driver circuit 900 provides communication interfaces between processing unit 108 and display devices 110A and 110B.
  • the functions of driver circuit 900 may include, for example:
  • transmitting video/image data from processing unit 108 to display devices 110A and 110B; and
  • providing a USB interface for communication with processing unit 108, which supports, for example, firmware updates, control of brightness, gamma, color channel gain of each display device, display focus, and status indication (e.g., power indication, insignia illumination, etc.).
  • processing unit 108 analyzes image data provided by cameras 118A and 118B and provides inputs to display driver circuit 102 for generating overlaid images through the display devices 110A and 110B.
  • processing unit 108 may analyze the image data for registration, tracking, or modeling the object under the microscope. Information derived from the analysis of the image data may then be used to generate and adjust the overlaid images generated by display devices 110A and 110B.
  • the microscope insert disclosed herein may be integrated in a microscope for ophthalmic procedures, such as cataract surgery.
  • the microscope insert may generate images representing surgery-related information to assist a surgeon to navigate during a cataract surgery.
  • the images may be displayed to the user overlaid with the real-time microscopic image of the patient's eye. As a result, the surgeon is able to simultaneously view the image of the eye and the overlaid images through the microscope.
  • Figure 10 illustrates an exemplary composite image 1000 rendered by a microscope having a microscope insert described herein, according to an embodiment.
  • Image 1000 includes a real-time microscopic image 1020 of a patient's eye as viewed through the microscope and images generated by the microscope insert overlaid on the real-time eye images.
  • Microscopic image 1020 of the patient's eye may be an analog image formed by the zoom lens elements of the microscope.
  • the overlaid images generated by the microscope insert include graphical representations of information related to the surgical procedure.
  • the overlaid images may include prompts or instructions to guide the surgeon during the surgery.
  • the overlaid images may include image features indicating an axis of interest 1002 and incision points 1006 and 1008 to guide the surgeon to carry out incision and placement of the artificial lens.
  • the overlaid images may also present information including parameters related to the surgery, such as the current operation stage 1012, ultrasound power 1014, vacuum suction 1016, current time, and the like.
  • the information may be presented in an image area 1010 near the area of operation.
  • Image area 1010 may have a shape that generally conforms to the shape of the patient's eye.
  • the processing unit of the microscope insert is configured to track and determine the position, size, and rotation of the patient's eye as it is viewed through the microscope and adjust the position, size, and orientation of the overlaid images accordingly so that the overlaid images remain registered with the patient's eye.
  • the microscope insert described here may also receive external data from external data sources and user inputs from user input devices during a surgical procedure, and adjust the overlaid images accordingly.
  • the processing unit may receive, from the external data source, demographic information, bio-information, and medical history of the patient.
  • the external data source may include a monitoring system that monitors status of surgical equipment or status of the patient, such as heart rate, respiratory rate, blood pressure, eye pressure, and the like, during the surgery.
  • the processing unit may receive, from the monitoring system, the external data including real-time information representing the status of the patient and the equipment and present the external data as part of the overlaid image displayed to the operating surgeon through the microscope insert.
  • the processing unit may receive user inputs from the surgeon through the input devices, such as a joystick, a foot pedal, a keyboard, a mouse, etc.
  • the user inputs may instruct the processing unit to adjust the information displayed in the overlaid images. For example, based on the user inputs, the processing unit may select portions of the external data for display as part of the overlaid images.
  • the processing unit may also display prompts or navigation guidance during a surgical procedure.
  • for example, the processing unit may control the microscope insert to modify the overlaid images so as to display prompts or instructions for the next step.
  • the prompts or instructions may include text or graphical information indicating the next step and may further include data or parameters relevant to the next step.
  • the processing unit may also control the microscope insert to generate a warning to alert the surgeon if there are abnormalities during a surgical procedure.
  • the warning may be a visual representation such as a warning sign generated by the display devices as part of the overlaid image.
  • the warning may also be other visual, audio, or haptic feedback, such as a warning sound or a vibration.
  • Figure 11 illustrates a process 1100 for correcting the field of view provided by the display devices and matching it with the field of view of the microscope.
  • the microscope generates a microscopic image 1132 having a field of view 1152.
  • the microscope insert generates an overlaid image 1134 having a field of view 1154.
  • fields of view 1152 and 1154 may each have a circular shape.
  • Field of view 1152 may have a diameter D1
  • field of view 1154 may have a diameter D2.
  • overlaid image 1134 generated by the microscope insert and microscopic image 1132 generated by the microscope are displayed to the user through the eyepiece.
  • microscopic image 1132 and overlaid image 1134 are combined or overlaid.
  • image features of overlaid image 1134 may obscure important image features of microscopic image 1132 or may appear to be disproportional to the image features of microscopic image 1132.
  • overlaid image 1134 must be adjusted according to the field of view of microscopic image 1132.
  • polarization imposed by polarizing element 114 on light signals projected by display device 110A/110B allows a portion (i.e., the P-polarized component P2) of the light signals to pass through polarizing beam splitter 120A/120B.
  • the passed-through light from display device 110A/110B is received by camera 118A/118B, which captures overlaid image 1134.
  • camera 118A/118B receives light (i.e., the S-polarized component S1) from the object, which is reflected by beam splitter 120A/120B, and captures microscopic image 1132 generated by the microscope.
  • the processing unit (e.g., processing unit 108 of Figure 1) then compares overlaid image 1134 with microscopic image 1132 to determine image transformations necessary to match field of view 1154 of overlaid image 1134 with field of view 1152 of microscopic image 1132.
  • Process 1100 may be used to correct any optical misalignment during manufacturing or slight damages from handling.
  • the image transformations used by the processing unit may be affine transformations. Typical transformations may include translation, scaling, skewing, rotation, and the like.
  • the processing unit may determine a scaling factor for scaling overlaid image 1134 based on a ratio between the diameter D1 of field of view 1152 and the diameter D2 of field of view 1154.
  • the processing unit may also determine translation parameters (Δx and Δy) necessary to align the microscopic image and the overlaid image based on the distance between the circular centers of fields of view 1152 and 1154.
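A minimal sketch combining the scaling and translation into a single affine warp, assuming the centers and diameters of both circular fields of view have already been measured (e.g., by circle fitting); the function name and the use of OpenCV are assumptions for illustration.

```python
import numpy as np
import cv2

def fov_correction(center_1152, d1, center_1154, d2):
    """Return a 2x3 affine matrix mapping overlay pixels onto the
    microscope's field of view."""
    s = d1 / d2                                # scaling factor from D1/D2
    tx = center_1152[0] - s * center_1154[0]   # translation dx
    ty = center_1152[1] - s * center_1154[1]   # translation dy
    return np.float32([[s, 0.0, tx],
                       [0.0, s, ty]])

# Usage: warp the overlay frame before sending it to the display driver,
# e.g. corrected = cv2.warpAffine(overlay, M, (width, height)).
```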
  • the microscope insert may provide more precisely placed overlaid images over the microscopic images when viewed through the eyepiece of the microscope.
  • the processing unit may monitor changes in the field of view of the microscopic image (i.e., based on the S-polarized component S1) during operation and adjust the overlaid image in such a way to track or follow the field of view of the microscopic image.
  • the processing unit may track an anatomical feature of the patient under the microscope and adjust the field of view of the overlaid image to follow the anatomical feature.
  • the camera (e.g., camera 118A/118B of Figure 1) is configured such that field of view 1152 of the microscope is entirely captured by the camera sensor.
  • likewise, the overlaid image generated by the display device (e.g., display device 110A/110B) is configured to cover the full field of view of the microscope.
  • the camera sensor and the display device are configured to provide oversampling so as to provide sufficient resolutions over the image area that covers the field of view of the microscope.
  • Figure 12 illustrates a process 1200 for generating an overlaid image over a microscopic image, according to an embodiment.
  • Process 1200 may be implemented on the microscope insert (i.e., microscope insert 100) disclosed herein.
  • the microscope insert receives a first light signal from a microscope (i.e., microscope 400).
  • the first light signal represents a first image corresponding to an object (i.e., object 406) placed under the microscope.
  • the first light signal may be received from the zoom lens elements of the microscope through the tube within the body of the microscope.
  • the first image may be an analog microscopic image of the object.
  • the microscope insert directs a first portion (i.e., the P-polarized component P1) of the first light signal to a viewing device (i.e., viewing device 402) and a second portion (i.e., the S-polarized component S1) of the first light signal to a camera (i.e., camera 118A/118B).
  • the first light signal may be split by the polarizing beam splitter (i.e., PBS 120A/120B) of the microscope insert into the first portion and the second portion.
  • the polarizing beam splitter may be configured to allow the first portion of the first light signal to pass through to the viewing device and reflect the second portion of the first light signal to the camera within the microscope insert.
  • the microscope insert may further include a tube lens (i.e., lens 112C/112D) to focus the second portion of the first light signal onto the camera sensor and/or additional light steering components (i.e., mirrors and prisms) to direct or redirect the second portion of the first light signal to the location of the camera.
  • a display device (e.g., display device 110A/110B) of the microscope insert generates a second image (i.e., the overlaid image) to be overlaid on the first image.
  • the second image includes graphical representations indicating information relevant to the object.
  • when the object is a patient's eye and a surgical procedure (e.g., a cataract surgery) is carried out on the object, the second image may include, for example, prompts, instructions, parameters, and data relevant to the underlying surgical procedure.
  • the display device produces a second light signal representing the second image.
  • the microscope insert directs a first portion (i.e., the P-polarized component P2) of the second light signal to the camera and a second portion (i.e., the S-polarized component S2) of the second light signal to the viewing device.
  • the second light signal may be split again by the polarizing beam splitter into the first portion and the second portion.
  • the polarizing beam splitter may allow the first portion to pass through to the camera and reflect the second portion to the viewing device.
  • the microscope insert may further include a tube lens (i.e., lens 112A/112B) between the display device and the polarizing beam splitter to alter (i.e., expand) the second light signal projected by the display device.
  • the microscope insert may also include additional light steering components (i.e., mirrors and prisms) to direct the second light signal from the display device to the location of the polarizing beam splitter.
  • the microscope insert may also include a polarizer element (i.e., polarizer element 114) between the display device and the polarizing beam splitter.
  • the polarizer element may impose polarization on the second light signal so as to adjust the ratio between the first portion of the second light signal, which is passed through to the camera, and the second portion of the second light signal, which is reflected to the viewing device.
  • the first portion of the first light signal and the second portion of the second light signal are combined to form a composite image, including the first image corresponding to the object and the second image generated by the display device.
  • the second image, when viewed through the viewing device, is rendered over the first image, so that the user of the microscope (e.g., the surgeon) can simultaneously view the object and the overlaid information.
  • the microscope insert may detect any mismatch between a field of the view of the first image and a field of view of the second image.
  • the microscope insert may detect the mismatch based on the second portion of the first light signal and the first portion of the second light signal received by the camera. If there is a mismatch, the microscope insert may adjust the second image according to the image transformations described herein so as to match the field of view of the second image with the field of view of the first image.
  • a microscope insert may apply image registration to the images of the object viewed under a microscope. Based on the registration, the processing unit of the microscope insert may render graphical elements, such as tags, labels, and the like, through the display device, providing instructions, prompts, or other surgery-related information to the operating surgeon. When the surgeon views the object through the eyepiece of the microscope, the graphical elements are overlaid on the images of the object.
  • the overlaid graphical elements may identify and track anatomical features that are of interest and are spatially associated with the identified anatomical features, thereby providing the surgeon with visual guidance and facilitating navigation through the surgical site.
  • the processing unit may be configured to perform two-dimensional (2D) or three-dimensional (3D) registration and tracking.
  • images generated by the cameras may be analyzed by the processing unit independently to provide 2D registration and tracking.
  • the images generated by the cameras may be analyzed together to provide a 3D registration to a known or assumed model.
  • the disclosed system provides the benefits of improved 3D registration by using two cameras for the 3D registration and two display devices for the 3D overlays.
  • the 3D registration is significantly improved using two cameras, compared with existing systems with one camera, and allows for improved registration and tracking, particularly for anatomical features that are at different depths within an operative site and move with respect to each other.
  • FIG. 13 illustrates an exemplary process 1300 for the marker-based registration.
  • the processing unit of the microscope insert applies image enhancement to the images of the object captured by the camera.
  • the image enhancement may use any known technique, such as digital filtering, sharpening, and the like.
  • the processing unit may detect and identify fiducial markers disposed on the object.
  • the fiducial markers may be detected based on spatial or spectral analysis of the images of the object. Alternatively, the fiducial markers may be determined based on a predetermined shape or color.
  • the processing unit may further identify the fiducial markers and associate the fiducial markers with respective identifications.
  • the processing unit may perform pose estimation.
  • the processing unit may further perform registration on the images of the object based on the identified markers.
  • the processing unit may determine a movement or orientation of the object based on the identified marker and calculate a coordinate transformation corresponding to the movement or orientation.
  • the transformation mathematically represents translations, rotations, or other affine transformations of the object.
  • the processing unit may adjust the graphical elements of the overlaid images generated by the display device according to the registration.
  • the processing unit may apply the coordinate transformation to the graphical elements so that the graphical elements experience similar translations, rotations, and the like.
  • the registration is carried out in real-time when the images of the object are captured by the camera.
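One way steps 1304 through 1310 could be realized, sketched with OpenCV: fiducial centroids detected in a reference frame and in the current frame (matched by their identifications) feed a RANSAC-backed similarity-transform estimator, and the resulting matrix is applied to the overlay's graphical elements. The specific estimator is an assumption; the patent speaks only of translations, rotations, and other affine transformations.

```python
import numpy as np
import cv2

def register_overlay(ref_markers, cur_markers, overlay_points):
    """Estimate the eye's motion from matched fiducial centroids and
    move the overlay's graphical elements accordingly."""
    ref = np.float32(ref_markers).reshape(-1, 1, 2)
    cur = np.float32(cur_markers).reshape(-1, 1, 2)
    # Similarity transform (rotation, uniform scale, translation) with
    # RANSAC to reject mis-detected markers.
    M, inliers = cv2.estimateAffinePartial2D(ref, cur, method=cv2.RANSAC)
    if M is None:
        return overlay_points  # estimation failed: keep the overlay in place
    pts = np.float32(overlay_points).reshape(-1, 1, 2)
    return cv2.transform(pts, M).reshape(-1, 2)
```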
  • FIG 14 illustrates an exemplary process 1400 for the anatomical feature-based registration.
  • the processing unit performs image enhancement on the images of the object captured by the camera, similar to step 1302 discussed above.
  • the processing unit performs image segmentation on the enhanced images of the object.
  • the image segmentation may be performed on an entire image or on regions of the image.
  • the image segmentation may be based on any known technique, such as clustering, histogram, or thresholding techniques.
  • image pixels are grouped according to the image features to which they belong.
  • the image features are detected based on the segmented images of the object. Individual image features that are of interest may be extracted from the segmented images. The detected image features may correspond to known anatomical structures of the object. The detected image features may then be matched with the known anatomical structures based on, for example, their shapes, sizes, locations, colors, and the like.
  • the processing unit may perform pose estimation similar to step 1306. For example, the processing unit may use a random sampling technique to calculate the pose of the object based on the detected image features. The processing unit may determine a coordinate transformation corresponding to the pose of the object. In addition, the processing unit may apply the coordinate transformation to the image elements of the overlaid image generated by the display device. As a result, the overlaid image tracks the anatomical features of the object when viewed through the microscope.
  • a 2D or 3D registration may be achieved.
  • the processing unit determines, based on the processed image, a set of coordinate transformation data including, for example, X, Y, R, and Theta.
  • X and Y represent the position, in pixel space, of the center of the patient's eye.
  • R represents the radius, in pixel space, of the limbus of the patient's eye.
  • Theta represents an angle of rotation of the patient's eye.
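  • For illustration only, and not as the disclosed implementation, the following Python sketch shows one way such an (X, Y, R, Theta) tuple could be turned into a transform for the overlay graphics; the reference-pose parameters (x0, y0, r0) and all function names are hypothetical.

```python
import numpy as np

def overlay_transform(x, y, r, theta, x0, y0, r0):
    """Build a 2x3 similarity transform mapping overlay points drawn for
    a reference eye pose (center (x0, y0), limbus radius r0) onto the
    current pose (center (x, y), radius r, rotation theta in radians)."""
    s = r / r0                                # scale from limbus radii
    c, si = s * np.cos(theta), s * np.sin(theta)
    # Rotate and scale about the reference center, then move to (x, y).
    return np.array([[c, -si, x - (c * x0 - si * y0)],
                     [si,  c, y - (si * x0 + c * y0)]])

def apply_to_overlay(m, points):
    """Apply the 2x3 transform to an (N, 2) array of overlay points."""
    homogeneous = np.hstack([points, np.ones((len(points), 1))])
    return homogeneous @ m.T
```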
  • the processing unit may use information from the images captured by the left and right cameras to solve for the coordinate transformation data of the object in six degrees of freedom. Additionally, the processing unit may also use a hybrid registration technique that combines elements of the fiducial marker-based registration and the anatomical feature-based registration to determine the position and orientation of the eye of a patient.
  • the processing unit first performs an image enhancement on the images of the object acquired by the camera.
  • a range of image enhancement techniques may be used.
  • the processing unit may apply a histogram equalization technique to an image of an eye acquired by the camera (Figure 15A).
  • a known contrast-limited adaptive histogram equalization may be used to equalize the histogram by broadening the contrast range of the image intensities.
  • the processing unit first converts the image into an L-a-b color space representation, in order for the histogram equalization to be performed on the luminosity channel of the converted image.
  • Figure 15B shows a resulting enhanced image after the histogram equalization.
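  • A minimal OpenCV sketch of this enhancement step, assuming a BGR input frame; the CLAHE clip limit and tile size are illustrative defaults, not values taken from the disclosure.

```python
import cv2

def enhance_eye_image(bgr):
    """CLAHE applied to the luminosity channel of an L-a-b conversion,
    so contrast is equalized without disturbing the color channels."""
    lab = cv2.cvtColor(bgr, cv2.COLOR_BGR2LAB)
    l, a, b = cv2.split(lab)
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    enhanced = cv2.merge([clahe.apply(l), a, b])
    return cv2.cvtColor(enhanced, cv2.COLOR_LAB2BGR)
```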
  • the processing unit extracts the fiducial markers by segmenting the enhanced image. For example, in the fiducial marker-based registration, the processing unit may select a saturation channel in an HSV color space representation and apply binary thresholding to the enhanced image.
  • Figure 15C shows a binary image produced by binary thresholding.
  • the binary image may be further digitally filtered by the processing unit to eliminate those regions that have a smaller number of pixels than a preset value. The further filtering may eliminate those regions that do not correspond to any markers.
  • the processing unit may also remove highly eccentric regions having a large difference between major and minor diameters. The centroid of the remaining pixel clusters may then be computed and stored as the correct locations of the fiducial markers.
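  • The marker-extraction steps above might look roughly as follows in OpenCV; the area and eccentricity cutoffs are hypothetical placeholders rather than disclosed values.

```python
import cv2
import numpy as np

def fiducial_centroids(enhanced_bgr, min_area=50, max_ecc=0.9):
    """Threshold the saturation channel, drop small or highly eccentric
    regions, and return centroids of the surviving marker candidates."""
    hsv = cv2.cvtColor(enhanced_bgr, cv2.COLOR_BGR2HSV)
    _, mask = cv2.threshold(hsv[:, :, 1], 0, 255,
                            cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        if cv2.contourArea(c) < min_area or len(c) < 5:
            continue  # too few pixels to be a marker (or to fit an ellipse)
        _, axes, _ = cv2.fitEllipse(c)
        ecc = np.sqrt(1.0 - (min(axes) / max(axes)) ** 2)
        if ecc > max_ecc:
            continue  # large major/minor spread: not a round marker
        m = cv2.moments(c)
        centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids
```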
  • the processing unit may further perform a contrast-limited adaptive histogram equalization (CLAHE) on the image acquired by the camera to enhance the contrast in the image.
  • the processing unit may then apply a Gaussian filtering on the enhanced image and then segment the filtered image into regions based on color similarities in the a-b space of the L-a-b color space representation.
  • the processing unit may then apply a K-means clustering technique known in the art, as shown in Figures 16A-16D, to achieve color segmentation on a real-time scale.
  • the K-means clustering technique applies vector quantization on the image pixels and partitions the pixels into k clusters, in which each pixel belongs to the cluster with the nearest mean values.
  • the K-means clustering technique starts from an initial guess (Figure 16A) and gradually refines the clusters (Figures 16B-16D) so that the image pixels are properly grouped in the feature space.
  • the resulting image may then be thresholded to define a binary mask, which may later be used to filter out or remove feature points inside the iris of the eye as shown in Figure 17. This is desired because the size and shape of the iris may change throughout the surgery, making any feature points within the iris unsuitable for tracking.
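  • A sketch of the clustering and masking steps, assuming OpenCV's built-in K-means; the cluster count and the assumption that the image center falls inside the iris are illustrative choices.

```python
import cv2
import numpy as np

def iris_mask_by_kmeans(bgr, k=3):
    """Gaussian-filter, segment on the a-b chroma channels with K-means,
    and return a binary mask for the cluster at a seed pixel assumed to
    lie inside the iris (here, the image center)."""
    lab = cv2.cvtColor(cv2.GaussianBlur(bgr, (5, 5), 0), cv2.COLOR_BGR2LAB)
    ab = lab[:, :, 1:3].reshape(-1, 2).astype(np.float32)
    criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
    _, labels, _ = cv2.kmeans(ab, k, None, criteria, 5,
                              cv2.KMEANS_PP_CENTERS)
    labels = labels.reshape(bgr.shape[:2])
    h, w = labels.shape
    return np.uint8(labels == labels[h // 2, w // 2]) * 255
```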
  • As shown in Figures 18A, 18B, and 19, feature points of an eye, along with their describing geometries (known as feature descriptors), are identified in a reference image (Figure 18A) and a test image (Figure 18B).
  • a known real-time feature point detection technique or scale-invariant technique may be used to classify these features. These techniques include, for example, SIFT (Scale-Invariant Feature Transform), SURF (Speeded-Up Robust Features), STAR, FAST, GFTT (Good Features to Track), and MSER (Maximally Stable Extremal Regions).
  • a number of feature points are detected and classified in the reference image (Figure 18A) and the test image (Figure 18B), respectively.
  • Each circle in the image represents a feature point, and the size of the circle represents a scale of the identified point.
  • Feature points are identified in scale-space utilizing the determinant of the Hessian (DoH).
  • a mask determined from color segmentation may be used to filter out any SURF features detected inside the iris of the eye.
  • the remaining features may be matched by means of a random sampling method, such as the random sample consensus method (RANSAC), to tolerate a large number of outliers.
  • the RANSAC method computes a homography from a minimal subset of feature points and randomly samples additional features to find the solution that encompasses the largest number of feature points.
  • the resulting homography may then be transformed into a camera position and orientation, as shown in Figure 19, and compared to the known input transformation.
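  • The feature-matching stage could be sketched as below. Since SURF ships only with opencv-contrib, ORB is used here as a freely available stand-in; for simplicity the same iris mask is reused for both frames, which is only an approximation.

```python
import cv2
import numpy as np

def register_by_features(ref_gray, test_gray, iris_mask=None):
    """Detect and match feature points, then estimate a homography with
    RANSAC, which fits on minimal subsets and keeps the model supported
    by the largest inlier set, tolerating many outliers."""
    detector = cv2.ORB_create(nfeatures=1500)
    mask = cv2.bitwise_not(iris_mask) if iris_mask is not None else None
    k1, d1 = detector.detectAndCompute(ref_gray, mask)
    k2, d2 = detector.detectAndCompute(test_gray, mask)
    matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
    matches = matcher.match(d1, d2)
    src = np.float32([k1[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
    dst = np.float32([k2[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)
    H, inliers = cv2.findHomography(src, dst, cv2.RANSAC, 3.0)
    return H, inliers
```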
  • the processing unit may analyze 2D images generated by the cameras based on parameters of the cameras and a spatial relationship between the cameras. The processing unit may then compare the 2D analyses or use a 3D registration to calculate the position of the microscope focal plane relative to a plane of interest with respect to the real object. For example, the processing unit may determine the distance between the microscope focal plane and the plane of interest. The processing unit may then use that distance to adjust the focus of the projected images generated by the insert so as to match the plane of interest with respect to the real object. This technique relieves eye strain for the surgeon when viewing the projected images and the analog image of the object at the same time, by focusing the projected images to the plane of interest that the user is visualizing.
  • the processing unit may then control the display devices to adjust the overlaid images to track the changes or motions of the object viewed under the microscope, to display information relevant to the motions of the object, or perform other functions accordingly.
  • the processing unit may cause the insert to render tags and labels associated with individual layers or features of the eye, such as the sclera, limbus, pupil, or iris, or other very small layers of the eye.
  • the processing unit may identify different anatomical layers or features of the eye and generate graphical elements in the overlaid images according to the identified anatomical layers or features.
  • Figure 20 illustrates a sequence diagram for interactions between a tracker engine configured to track the motions of an eye and a torsion engine configured to generate graphical representations for the guidance or prompts.
  • Figure 21 illustrates an exemplary tracking system for tracking the eye of a patient and generating the graphical representations for the guidance or prompts.
  • Figure 22 illustrates an exemplary process for carrying out the tracking by the tracker engine of Figure 21.
  • the main processing blocks are similar for both the reference and sense images, and are illustrated in Figure 22. These are detailed as follows:
  • Figure 23 illustrates an exemplary process for reference image processing and generating a Hausdorff Distance look-up table. The main processing blocks are illustrated in Figure 23 and are detailed as follows:
  • Image pre-processing - this processing step includes histogram equalization.
  • Gabor filtering and Skeletonization including extracting the vasculatures from the sclera using four orientations of a 2-D Gabor filter, skeletonizing the four feature images, combining (logical OR) the four skeletonized images, and skeletonizing the final image.
  • template ROI in reference annulus - a template ROI is extracted from the Gabor filtered and skeletonized reference annulus.
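  • A rough rendering of the Gabor-and-skeletonize pipeline, assuming OpenCV plus scikit-image; the kernel size, sigma, and wavelength are illustrative guesses, not disclosed parameters.

```python
import cv2
import numpy as np
from skimage.morphology import skeletonize

def vasculature_skeleton(sclera_gray):
    """Filter with a 2-D Gabor kernel at four orientations, skeletonize
    each response, OR the four skeletons, and skeletonize the result."""
    combined = np.zeros(sclera_gray.shape, dtype=bool)
    for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
        kern = cv2.getGaborKernel((21, 21), sigma=4.0, theta=theta,
                                  lambd=10.0, gamma=0.5, psi=0)
        resp = cv2.filter2D(sclera_gray, cv2.CV_32F, kern)
        vessels = resp > resp.mean() + resp.std()  # crude vessel threshold
        combined |= skeletonize(vessels)           # skeletonize, then OR
    return skeletonize(combined)                   # skeletonize final image
```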
  • Figure 24 illustrates an exemplary process for processing the sense image and computation of the minimum-Hausdorff Distance between the reference image template ROI and ROI in the sense image.
  • the main processing blocks are illustrated in Figure 24 and are detailed as follows:
  • Image pre-processing - this processing step includes histogram equalization.
  • Gabor filtering and Skeletonization including extracting the vasculatures from the sclera using four orientations of a 2-D Gabor filter, skeletonizing the four feature images, combining (logical OR) the four skeletonized images, and skeletonizing the final image.
  • ROI in sense annulus - an ROI is extracted from the Gabor filtered and skeletonized sense image annulus.
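  • For reference, the Hausdorff Distance between two skeleton point sets can be written out directly (SciPy's scipy.spatial.distance.directed_hausdorff is an equivalent built-in); the brute-force version below is only meant to make the metric concrete.

```python
import numpy as np

def directed_hausdorff(a, b):
    """h(A, B): the largest distance from a point in A to its nearest
    neighbor in B, for (N, 2) arrays of skeleton pixel coordinates."""
    d = np.sqrt(((a[:, None, :] - b[None, :, :]) ** 2).sum(-1))
    return d.min(axis=1).max()

def hausdorff(a, b):
    """Symmetric Hausdorff Distance H(A, B) = max(h(A, B), h(B, A));
    the sense-image search keeps the ROI minimizing this value."""
    return max(directed_hausdorff(a, b), directed_hausdorff(b, a))
```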
  • Figures 25-32 depict a process for estimation of ocular torsion from sclera features.
  • the ocular torsion estimation is based on the following:
  • the torsion angle is deterministic.
  • the computed angle is corrupted with additive white Gaussian noise (AWGN) and can be modeled as a sinusoid with AWGN.
  • the system captures an exemplar frame from which to compute the reference feature vector.
  • a reference image is captured with the subject (i.e., the patient) looking straight ahead. This is the exemplar image from which the feature template is generated.
  • the pupil radius of the patient under dilation is non-deterministic.
  • the system may use a-priori knowledge of the pupil center and radius to determine an accurate estimate of the iris radius.
  • the system may also preclude instabilities that may arise from extreme pupil dilation. Extreme pupil dilation precludes a safe guess as to the annular region which contains the outer iris boundary.
  • the system also computes iris radius. Iris boundary is deformable under incident forces. Absent any deforming incident forces, the radius is constant within bounds.
  • the system first computes the pupil radius, which is non-deterministic under dilation and precludes a "guess" for the annular ring that encloses the elliptical boundary of the iris. The system then computes the iris radius.
  • Figure 26 illustrates an exemplary process for iris boundary estimation, including frame capture, down sampling, histogram equalization, grey scaling, Gaussian kernel filtering, pupil center and radius calculation, iris center and radius calculation, and definition of annular region in sclera.
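  • Loosely following those Figure 26 steps, a pupil estimate might be obtained as below; the Hough parameters and the iris-radius heuristic are placeholders, and in the described system the iris radius would instead be refined using a-priori knowledge of the pupil center and radius.

```python
import cv2

def estimate_pupil_and_iris(frame_bgr):
    """Downsample, grey-scale, equalize, Gaussian-filter, then fit the
    pupil circle; returns coordinates in the downsampled frame."""
    small = cv2.pyrDown(frame_bgr)
    gray = cv2.GaussianBlur(
        cv2.equalizeHist(cv2.cvtColor(small, cv2.COLOR_BGR2GRAY)), (9, 9), 2)
    circles = cv2.HoughCircles(gray, cv2.HOUGH_GRADIENT, dp=2,
                               minDist=gray.shape[0], param1=100, param2=30,
                               minRadius=10, maxRadius=gray.shape[0] // 2)
    if circles is None:
        return None
    cx, cy, r_pupil = circles[0][0]
    r_iris = 2.5 * r_pupil  # placeholder; refine within an annulus about (cx, cy)
    return (cx, cy), r_pupil, r_iris
```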
  • Figure 27 illustrates an exemplary process for feature extraction from an annular ring in an exemplar image, including pre-definition of the annular region in the sclera, mapping the annulus to a rectangular region, sclera feature detection within (0, 2π), and post processing.
  • the sclera feature detection may be based on simple edge detection, such as a Sobel or Canny filter, a Gabor wavelet, or a wavelet using the lifting scheme.
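  • The annulus-to-rectangle mapping of Figures 27 and 30 amounts to a polar unwrap, sketched below with cv2.remap (cv2.warpPolar is a built-in alternative); the sampling resolutions are arbitrary choices for illustration.

```python
import cv2
import numpy as np

def unwrap_annulus(gray, center, r_inner, r_outer, n_theta=720):
    """Map the scleral annulus between r_inner and r_outer to a rectangle
    whose x axis is the angle in [0, 2*pi) and whose y axis is the radius,
    so angular shifts (torsion) become horizontal translations."""
    thetas = np.linspace(0, 2 * np.pi, n_theta, endpoint=False)
    radii = np.arange(int(r_inner), int(r_outer))
    xs = center[0] + radii[:, None] * np.cos(thetas)[None, :]
    ys = center[1] + radii[:, None] * np.sin(thetas)[None, :]
    return cv2.remap(gray, xs.astype(np.float32), ys.astype(np.float32),
                     cv2.INTER_LINEAR)
```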
  • Figure 28 illustrates an exemplary feature extraction from segment of annular ring in test image, including segmentation from pre-defined annular region in sclera, mapping annular segment to rectangular region, sclera feature detection, and post processing.
  • the pre-determined segment is taken from the pre-defined annular ring.
  • the annular ring may be previously defined in the exemplar image.
  • the segment is bounded by two angles θ1, θ2 ∈ [0, 2π].
  • Figure 29 illustrates an exemplary annular ring constructed in scleral region of an exemplar image.
  • Figure 30 illustrates an exemplary mapping of annulus in sclera to a rectangular region.
  • Figure 31 illustrates an exemplary segmentation in annular region in a test image.
  • Figure 32 illustrates an exemplary template matching process.
  • the estimation of the ocular torsion may be based on the city-block distance metric (i.e., the l1-norm).
  • the system may calculate the l1-norm between feature vectors p and q as d(p, q) = Σi |pi − qi|.
  • the l1-norm metric is computationally more efficient than the Euclidean distance and cross-correlation techniques.
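  • To make the metric concrete, a template match over the unwrapped annulus using the l1-norm might look like this; the function names and the column-per-angle convention are assumptions for illustration.

```python
import numpy as np

def l1_distance(p, q):
    """City-block (l1) distance: sum of absolute differences, with no
    squares or square roots, hence cheaper than the Euclidean distance."""
    return np.abs(p - q).sum()

def best_torsion_shift(template, unwrapped):
    """Slide the template along the angular axis of the unwrapped annulus
    and return the column shift with the minimum l1 distance; the shift
    maps directly to a torsion angle."""
    w, cols = template.shape[1], unwrapped.shape[1]
    strip = np.concatenate([unwrapped, unwrapped[:, :w]], axis=1)  # wrap 2*pi
    dists = [l1_distance(template, strip[:, s:s + w]) for s in range(cols)]
    return int(np.argmin(dists))
```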
  • the segments of the annular ring will, at times, be occluded during surgery.
  • the system may choose a segment that is not occluded, as segment occlusion will lead to template match failure.
  • the system may mitigate the occlusion by locating the occluded segment.
  • the occluded segment may be non-deterministic if a-priori knowledge of the location of the instruments is unavailable.
  • the computational complexity may also be high.
  • the system may also define a number of segments using a-priori knowledge of the instruments' location (if known) so as to preclude a choice of an occluded segment.
  • the system may choose a segment randomly and compute the distance metric. The system will recognize a failure without needing to search the entire exemplar, since the torsion angle between neighboring (successive) frames is constrained: typically Δθ < 2°.
  • using a-priori knowledge of the torsion frequency allows the system to compute the next torsion angle gradient and permits some limited adaptation. It also allows estimation of the torsion frequency when the pupil is dilated and the patient is prone, which is the preamble to the actual surgery.
  • the system may fit a best-fit sinusoid to the torsion angles.
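  • Under the sinusoid-plus-AWGN model stated above, a least-squares fit is a natural reading; this SciPy sketch uses illustrative initial guesses and is not the disclosed estimator.

```python
import numpy as np
from scipy.optimize import curve_fit

def fit_torsion_sinusoid(t, angles):
    """Least-squares fit of a sinusoid to measured torsion angles,
    treating the angle as deterministic and the measurements as
    corrupted by additive white Gaussian noise."""
    def model(t, amp, freq, phase, offset):
        return amp * np.sin(2 * np.pi * freq * t + phase) + offset

    p0 = [np.std(angles) * np.sqrt(2), 1.0, 0.0, np.mean(angles)]
    params, _ = curve_fit(model, t, angles, p0=p0)
    return params  # amplitude, frequency, phase, offset
```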
  • the microscope insert allows the processing unit to acquire depth information along the z axis of any anatomical features, which is not available in existing systems.
  • the processing unit may then use the depth information to provide 3D guidance and assist the surgeon in navigating during the surgical procedure.
  • although the microscope insert is described above in the context of cataract surgery, one of ordinary skill in the art will appreciate that the microscope insert may be integrated in other surgical systems configured to carry out a wide variety of surgical procedures, such as spinal surgery; ear, nose, and throat (ENT) surgery; neurosurgery; plastic and reconstructive surgery; gynecology; oncology; etc.
  • the insert may be used for registration, tracking, and image recognition and to generate customized stereoscopic overlaid information relevant to the procedure and a particular patient's anatomy that is not limited to what is disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.


Abstract

A microscope insert includes a camera, a display device, a beam splitter, and a processing unit. The camera is configured to receive a first portion of first light through a microscope from an object and generate a signal representing an image of the object. The display device is configured to generate a graphical representation of information relevant to the object and project second light representing the graphical representation. The beam splitter is configured to direct a second portion of the first light from the object and a first portion of the second light to a viewing device for simultaneously viewing the object and the information by a user. The processing unit is configured to track motions of the object based on the image of the object and control the display device to adjust the graphical representation according to the motions of the object.

Description

METHODS AND SYSTEMS FOR REGISTRATION USING A MICROSCOPE INSERT
DESCRIPTION
Cross Reference to Related Applications
[001] This application claims the benefit of priority to U.S. Provisional Application No. 62/133,182, filed March 13, 2015.
[002] This disclosure is related in general to surgical microscopes and in particular to registration using a microscope insert for surgical microscopes.
Background
[003] Surgery carried out through a microscope, such as cataract surgery, presents special challenges for the surgeon and the microscope. Not only must each procedure and step be carried out accurately, but parameters of the surgery and biological data of the patient must be monitored closely to achieve desired results and ensure safety of the patient. Existing surgical systems, such as ophthalmology microscopes, do not have the ability to display the surgical site and related data within the same field of view. As a result, the surgeon must move away from the eye pieces of the microscope to an external display device in order to view the related data and then move back to the eye pieces in order to continue the surgery. This is not only inconvenient, but may also cause patient safety issues. In addition, existing surgical systems do not provide sufficient prompts or guidance to the surgeon to ensure a correct procedure is carried out. It is desired to provide system-generated prompts for the surgeon during the surgery.
[004] According to an embodiment, a microscope insert includes a camera, a display device, a beam splitter, and a processing unit. The camera is configured to receive a first portion of first light through a microscope from an object and generate a signal representing an image of the object. The display device is configured to generate a graphical representation of information relevant to the object and project second light representing the graphical representation. The beam splitter is configured to direct a second portion of the first light from the object and a first portion of the second light to a viewing device for simultaneously viewing the object and the information by a user. The processing unit is configured to track motions of the object based on the image of the object and control the display device to adjust the graphical representation according to the motions of the object.
[005] According to another embodiment, a method for tracking and registering an object in a microscope is disclosed. The method includes receiving first light from an object through a microscope; generating, based on a first portion of the first light, a first signal representing an image of the object; generating, according to the image of the object, a graphical representation of information relevant to the object; projecting second light corresponding to the graphical representation of the information; directing a second portion of the first light from the object and a first portion of the second light to a viewing device for simultaneously viewing the object and the information by a user; tracking the object based on the image of the object; and adjusting the graphical representation according to the tracking of the object.
BRIEF DESCRIPTION OF THE DRAWINGS
[006] Figure 1 is a schematic diagram of a microscope insert according to an embodiment;
[007] Figure 2 illustrates a surgical system including the microscope insert according to an embodiment;
[008] Figure 3 illustrates electronic connections between the microscope insert and an external computer system according to an embodiment;
[009] Figure 4 illustrates a microscope system including a microscope insert according to an embodiment;
[010] Figure 5 illustrates light paths within a microscope insert according to an embodiment;
[011] Figure 6A is a side view of various components assembled in a microscope insert according to an embodiment;
[012] Figure 6B is a top view of the various components assembled in the microscope insert according to an embodiment;
[013] Figures 7A and 7B are perspective views of a microscope insert having various components installed therein according to an embodiment;
[014] Figure 8 is a schematic diagram of a microscope insert according to an embodiment;
[015] Figure 9 is a schematic diagram of an insert driver circuit board for a microscope insert according to an embodiment;
[016] Figure 10 illustrates graphical information generated by a microscope insert according to an embodiment;
[017] Figure 11 illustrates a process for correcting a field of view of the microscope insert according to an embodiment;
[018] Figure 12 illustrates a process for generating an overlaid image in a microscope according to an embodiment;
[019] Figure 13 illustrates a process for marker-based registration and tracking according to an embodiment;
[020] Figure 14 illustrates a process for anatomical feature-based registration and tracking according to an embodiment;
[021] Figure 15A is an image of an eye with fiducial markers captured by a camera according to an embodiment;
[022] Figure 15B is an enhanced image of an eye with fiducial markers generated based on the image of Figure 15A;
[023] Figure 15C is a binary mask generated based on the enhanced image of Figure 15B;
[024] Figures 16A-16D illustrate a K-means clustering process for segmenting the image captured by a camera of the disclosed microscope insert according to an embodiment;
[025] Figure 17 is a mask image generated by a K-means clustered image according to the process of Figures 16A-16D;
[026] Figure 18A is a reference image used for feature identification according to an embodiment;
[027] Figure 18B is a test image captured by a camera for feature identification according to an embodiment;
[028] Figure 19 illustrates the mapping between the reference image of Figure 18A and the test image of Figure 18B according to an embodiment;
[029] Figure 20 illustrates a sequence diagram for interactions between a tracker engine configured to track the motions of an eye and a torsion engine configured to generate graphical representations for the guidance or prompts;
[030] Figure 21 illustrates an exemplary tracking system for tracking the eye of a patient and generating the graphical representations for the guidance or prompts;
[031] Figure 22 illustrates an exemplary process for carrying out the tracking by the tracker engine of Figure 21;
[032] Figure 23 illustrates an exemplary process for reference image processing and generating a Hausdorff Distance look-up table;
[033] Figure 24 illustrates an exemplary process for processing the sense image and computation of the minimum-Hausdorff Distance between the reference image template ROI and ROI in the sense image; and
[034] Figures 25-32 depict a process for estimation of ocular torsion from sclera features.
DESCRIPTION OF THE EMBODIMENTS
[035] Reference will now be made in detail to exemplary embodiments of the disclosure, examples of which are illustrated in the accompanying drawings.
Wherever possible, the same reference numbers will be used throughout the drawings to refer to the same or like parts.
[036] As shown in Figure 1, a microscope insert 100 includes a projection system 104 and an imaging system 106. Projection system 104 includes one or more display devices 110A and 110B and one or more sets of tube lenses 112A and 112B for projecting images from the display devices 110A and 110B. Imaging system 106 includes one or more cameras 118A and 118B and one or more sets of tube lenses 112C and 112D for focusing images to cameras 118A and 118B.
Microscope insert 100 further includes one or more polarizing beam splitters (PBS) 120A and 120B, which will be further described below. [037] The above components of insert 100 form individual optical channels that generate respective images for left and right eyes of a user. Each optical channel includes a display device 11 OA/11 OB, a camera 118A/118B, a polarizing beam splitter 120A/120B, and corresponding tube lenses 112A/112B and
112C/112D. In a further embodiment, a polarizer element 114 may be disposed between tube lenses 112A/112B and polarizing beam splitters 120A/120B.
Alternatively, polarizer element 114 may include different pieces for respective optical channels.
[038] Although Figure 1 shows two optical channels for microscope insert 100, one of ordinary skill in the art would recognize that insert 100 may have any number of optical channels, each having a structure similar to those depicted in Figure 1. When insert 100 includes two or more optical channels, videos/images generated by the optical channels are configured so as to provide a user with stereoscopic rendering.
[039] In an embodiment, cameras 118A and 118B are digital imaging devices, such as the Point Grey FL3-U3-13S2C-CS manufactured by Point Grey Research. However, a number of different cameras may be used, providing different features, such as a CMOS or CCD based sensor, a global or rolling shutter, and a range of resolutions at about 20 FPS or higher.
[040] In an embodiment, display devices 110A and 110B may be LCOS (Liquid Crystal on Silicon) microdisplay devices, each of which has pixels that can be individually adjusted to match or exceed the brightness of the microscope. Other display technologies may also be used, such as OLED, DLP, T-OLED, MEMS, and LCD-based displays.
[041] Insert 100 also includes a display driver circuit 102 to control display devices 110A and 110B and/or other system elements or features. Display driver circuit 102 may generate video/image data that are suitable for rendering by display devices 110A and 110B.
[042] Insert 100 is connected to a processing unit 108 via standard communication protocols. Processing unit 108 may or may not be disposed within insert 100. Processing unit 108 receives video/image signals from cameras 118A and 118B and sends the video/image signals to driver circuit 102 for rendering the videos/images on display devices 110A and 110B. Processing unit 108 may apply additional processing on video/image data received from cameras 118A and 118B. For example, processing unit 108 may perform image processing techniques, such as image registration, pattern recognition, image filtering, image enhancement, and the like.
[043] Processing unit 108 may also be connected to other peripherals to collect data to be used by microscope insert 100, to generate visual guidance for navigation during a surgical procedure, or to provide alternative graphical user interfaces on external display devices to supplement the display through microscope insert 100.
[044] Figure 2 illustrates a surgical system 200 including a microscope insert 228 according to a further embodiment. Surgical system 200 includes a microscope 226 coupled to microscope insert 228. Microscope insert 228 generally corresponds to microscope insert 100 of Figure 1. Insert 228 communicates with a processing unit 230, which corresponds to processing unit 108 of Figure 1.
[045] Microscope 226 receives light or optical signals reflected from an object through its lens system and the polarized beam splitters (e.g., PBS's 120A and 120B), which pass the optical signals to the cameras (e.g., cameras 118A and 118B) of microscope insert 228. The cameras of microscope insert 228 convert the optical signals to digital data representing videos/images of the object and transmit the digital data to processing unit 230.
[046] Processing unit 230 performs image processing on the digital data and sends processed data and relevant commands to the driver circuit (e.g., driver circuit 102) of microscope insert 228. Based on the processed data and the commands from the driver circuit, display devices (e.g., display devices 110A and 110B) of microscope insert 228 generate optical signals representing processed videos/images of the object and project the optical signals to polarized beam splitters 120A and 120B. Polarized beam splitters 120A and 120B pass the optical signals to the eye pieces of microscope 226 for viewing by a user. The driver circuit may also control, for example, the brightness or contrast of display devices 110A and 110B.
[047] Processing unit 230 may also communicate with additional input devices, such as a QR code reader 202, a foot pedal 204, a USB switch 206, a power supply 208, and one or more external storage devices providing surgical planning data 210 or calibration and software update data 212. Additionally, processing unit 230 may be further connected to a surgical support system 224 that is suitable for the underlying surgery. For example, surgical support system 224 may be the Stellaris system manufactured by Bausch & Lomb Incorporated and suitable for ophthalmic procedures. Surgical support system 224 may collect the demographical and biological data of a patient and provide the data to processing unit 230.
[048] Still additionally, system 200 may include various output devices, such as speakers 218, an external display device 220, and a remote display device 222. External display device 220 and remote display device 222 may be high-resolution monitors that provide additional monitoring capability outside of insert 228. Display devices 220 and 222 may be located in the same operating room as microscope 226 or at a remote location. System 200 may further include one or more storage media for storing post-operation data 214 and system diagnostics data 216. Similarly, other system components shown in Figure 2 may also be located in different locations and connected to processing unit 230 through, for example, Ethernet, Internet, USB connections, Bluetooth connections, infrared connections, cellular connections, Wi-Fi connections, and the like.
[049] Figure 3 illustrates a surgical system 300 including a microscope insert 314 according to an alternative embodiment. Microscope insert 314 generally corresponds to microscope insert 100 of Figure 1 and is configured to generate stereoscopic images as described herein. For example, insert 314 may include two imaging cameras, two display devices, a driver circuit, and other imaging and projection optics for left and right eyes of a user.
[050] System 300 further includes a medical stand 302, an external monitor 312, a foot pedal 308, and a surgical support system 310. Medical stand 302 may include a QR image scanner 304 configured to scan QR codes to provide information encoded in the codes. Medical stand 302 also includes a processing unit 306, which generally corresponds to processing unit 108 of Figure 1. Processing unit 306 may include a motherboard with interfaces, such as USB 2.0, USB 3.0, Ethernet, etc. Processing unit 306 may include a central processing unit (CPU) with heat sinks, a RAM, a video card, a power supply, a webcam, etc. Processing unit 306 is connected to other system components through its communication interfaces, such as USB ports, Ethernet ports, Internet ports, HDMI interfaces, etc. For example, processing unit 306 may be connected to microscope insert 314 and external monitor 312 through HDMI interfaces to provide high resolution video/image data to the driver circuit of insert 314 and monitor 312. Alternatively, processing unit 306 may also be connected to insert 314 and monitor 312 through USB ports to provide video/image data and control signals. Processing unit 306 may be connected to the camera of insert 314 through USB ports to receive video/image data from the camera.
[051] Foot pedal 308 and other user input devices may be connected to processing unit 306 through one or more USB ports. Foot pedal 308 may be operated by a user to provide user input during a surgery. For example, when the user presses foot pedal 308, foot pedal 308 may generate an electronic signal.
Upon receiving the electronic signal from foot pedal 308, processing unit 306 may control insert 314 accordingly.
[052] For example, when the user presses foot pedal 308, processing unit 306 may control insert 314 to change the videos/images generated by the display devices of insert 314. With each pressing of foot pedal 308, insert 314 may toggle between two sets of videos/images. Alternatively, insert 314 may cycle through a series of videos/images when foot pedal 308 is pressed. Still alternatively, pedal 308 may have a position sensor that generates a position signal indicating a position of pedal 308 when the user partially presses pedal 308. Upon receiving the position signal from pedal 308, processing unit 306 may determine the current position of pedal 308 and control insert 314 accordingly. Processing unit 306 may control insert 314 to generate a different set of videos/images corresponding to each position of pedal 308. For example, when the user presses pedal 308 to a first position, processing unit 306 controls insert 314 to generate a first set of videos/images. When the user presses pedal 308 to a second position, processing unit 306 controls insert 314 to generate a second set of videos/images.
[053] Surgical support system 310 may include an external data source and other surgical systems, such as a Bausch & Lomb Stellaris surgical system. Surgical support system 310 may include biological sensors that collect biological or physiological data of the patient, including, for example, heart rate, blood pressure, electrocardiogram, etc. Surgical support system 310 may further include a database that stores information of the patient, including the patient's medical history and healthcare record. The database may also include information of the underlying surgical procedure such as pre-operation analysis and planning performed by a physician, data collected during the surgical procedure, and additional procedures recommended for post-operation follow-ups. The database may also include information of the operating physician including his or her identification, association, qualification, etc. Surgical support system 310 may be further connected to additional medical devices (not shown) such as an ultrasound imager, a magnetic resonance imaging device, a computed tomography device, etc., to collect additional image data of the patient.
[054] Processing unit 306 may receive the information and data from surgical support system 310 and control insert 314 to generate images based on the information and data. For example, processing unit 306 may transmit the additional image data (i.e., ultrasound data, MRI data, CT data, etc.) received from system 310 to the driver circuit of insert 314 and control the driver circuit of insert 314 to render the additional image, through the display devices, along with the microscopic images of the patient provided by the microscope. Processing unit 306 may also generate additional image data representing the biological or physiological data collected from the patient and control insert 314 to render the additional image data through the display devices of insert 314.
[055] Figures 4 and 5 illustrate the operation of a microscope insert according to an embodiment using insert 100 as an example. As shown in Figure 4, microscope insert 100 may be integrated with a microscope 400 that is suitable for various purposes. In an embodiment, microscope 400 may be a stereoscopic, infinity-corrected, tube microscope. Alternatively, microscope insert 100 may be adapted for use in other microscope layouts and stereoscopic devices known in the art.
[056] Microscope 400 may include a viewing device 402 that allows a user to view images of an object 406 placed under the microscope. Viewing device 402 may be a heads-up device including one or more eye pieces, through which the images of the object are presented to the user. Microscope 400 further includes a set of lens elements 404 that receive light reflected from the object and form microscopic images of the object based on the reflected light. Lens elements 404 transmit the microscopic images of the object to tubes 406A and 406B of microscope 400. Tubes 406A and 406B form light transmission paths (i.e., light paths) that direct the microscopic image of the object toward viewing device 402. The microscopic image may be an analog image in an embodiment.
[057] As further shown in Figures 4 and 5, when insert 100 is installed in microscope 400, the polarizing beam splitters 120A and 120B are disposed in the respective light paths between lens elements 404 and viewing device 402 of the microscope, intercepting light coming from respective tubes 406A and 406B. The beam splitters 120A and 120B may also be placed at other locations within the microscope as one of ordinary skill in the art will appreciate. As further described below, beam splitters 120A and 120B may serve two functions in insert 100. First, they may direct a first component of the light signals coming from the object to respective cameras 118A and 118B so that cameras 118A and 118B capture images of the object. Second, they may merge a second component of the light signals coming from the object that is passed through to viewing device 402 with light signals projected from the display devices 110A and 110B.
[058] In particular, in an infinity-corrected tube microscope, for example, light rays passing through the tube are generally parallel, similar to those from a source infinitely far away. Beam splitter 120A/120B splits the light coming up from the object into two portions, directing a first portion (i.e., an S-polarized component S1) towards camera 118A/118B and a second portion (i.e., a P-polarized component P1) towards viewing device 402 of the microscope. Lens 112C/112D between beam splitter 120A/120B and camera 118A/118B is used to focus the S-polarized component S1 exiting beam splitter 120A/120B onto the imaging sensor of camera 118A/118B.
[059] More particularly, polarizing beam splitter 120A/120B receives light signals representing a microscopic image of the object from lens elements 404 through tubes 406A and 406B. Each of polarizing beam splitters 120A and 120B splits incident light signals by allowing one polarized component S1 to reflect and the other polarized component P1 to pass through. The polarized component P1 that passes through beam splitter 120A/120B reaches viewing device 402 and provides the user with the microscopic image of the object for viewing.
[060] The polarized component S1 is reflected by beam splitter 120A/120B toward respective camera 118A/118B through respective tube lens 112C/112D. Camera 118A/118B receives the polarized component S1 reflected from beam splitter 120A/120B and converts the optical signals to electronic image data corresponding to the microscopic image of the object. Camera 118A/118B may then transmit the electronic image data to processing unit 108 for further processing.
[061] Beam splitter 120A/120B operates in a similar manner on the display device side. In particular, display device 110A/110B renders images under the control of the driver circuit and projects light signals corresponding to the images to beam splitter 120A/120B through lens 112A/112B. Lens 112A/112B between beam splitter 120A/120B and respective display device 110A/110B converts the light signals projected from display devices 110A/110B to parallel light rays to match the upward parallel light rays coming from tube 406A/406B. Beam splitter 120A/120B splits the incident light signals coming from display devices 110A/110B, reflecting the S-polarized component S2 of the incident light signals originating from display devices 110A/110B and passing the P-polarized component P2 through to camera 118A/118B.
[062] At viewing device 402, the reflected S-polarized component S2 from display devices 110A/110B is then merged or combined with the P-polarized component P1 passed through beam splitter 120A/120B from tube 406A/406B. As a result, the images of the object provided by the P-polarized component P1 and the images from display device 110A/110B provided by the S-polarized component S2 may be simultaneously viewed by the user through viewing device 402. In other words, when viewed through viewing device 402, the images generated by display devices 110A/110B appear as overlaid images on the images of the object formed by lens element 404.
[063] Polarizing element 114 placed between lens 112A/112B and beam splitter 120A/120B is configured to adjust the polarization of those projected parallel rays from lens 112A/112B so as to adjust the ratio of the light component (i.e., the S2 component) reflected by beam splitter 120A/120B to the light component (i.e., the P2 component) passed through to camera 118A/118B. Accordingly, the intensity of the S-polarized component S2 may be adjusted relative to the intensity of the P-polarized component P2. In an embodiment, the intensity of the S-polarized component S2 may be substantially equal to that of the P-polarized component P2 so that the light signals projected from display devices 110A/110B are equally split by beam splitter 120A/120B.
[064] Additionally, by adjusting the polarization imposed by polarizing element 114, the intensity of the S-polarized component S2 may also be adjusted relative to the intensity of the P-polarized component P1. As a result, the images on the display device 110A/110B may be adjusted to be brighter or dimmer with respect to the images of the object when viewed through viewing device 402.
[065] According to a further embodiment, when the P-polarized component P1 and the S-polarized component S2 are combined by beam splitter 120A/120B, the user of microscope 400 may view a combined image including the microscopic image of the object and the overlaid image generated by display device 110A/110B. The optical components of the microscope insert may be adjusted so that the overlaid image may appear at a projection image plane 410 that substantially overlaps the focal plane of microscope 400 and is located within the depth of field 408 of microscope 400.
[066] The microscope insert for a stereoscopic microscope, as shown in Figures 1-5, includes a set of imaging and projection hardware for each of the right and left tubes of the microscope so as to generate stereoscopic images. As a result, the insert includes four lenses 112A-112D: lenses 112C and 112D configured to focus the images of the object to left and right cameras 118A and 118B, and lenses 112A and 112B configured to project the images generated by left and right display devices 110A and 110B to beam splitters 120A and 120B. In order to maximize optical efficiency and reduce aberrations, these lenses may be incorporated in a lens set.
[067] In alternative embodiments, the microscope insert may include additional optical components, such as mirrors, prisms, or lenses, in the optical paths between the beam splitters and the cameras or between the beam splitter and the display devices to modify the directions of the light rays. The modified light rays may allow the optical components of the insert to be more freely arranged or repositioned so as to fit into a desired mechanical or industrial form.
[068] Figures 6A and 6B illustrate an embodiment of a microscope insert 600 including additional optical components to steer light rays. Figures 6A and 6B show, respectively, a side view and a top view of major optical elements of microscope insert 600. Microscope insert 600 includes two optical channels for rendering images, respectively, for left and right eyes of the user. Although only one optical channel is described here, one of ordinary skill in the art will appreciate that the optical channels include similar elements and operate in a similar manner.
[069] Each optical channel of microscope insert 600 includes a polarizing beam splitter 624 disposed in the corresponding light pathway of the microscope and coupled to the tube of the microscope, from which light reflected by an object enters microscope insert 600. A portion (i.e., the S-polarized component S1) of the incident light is diverted to a turning prism 625, which directs the S1 component through imaging lenses 627 on to a camera 604.
[070] The other portion (i.e., the P-polarized component P1) of the incident light passes through a polarizing beam splitter 624 and reaches the eyepiece of the microscope to provide a microscopic image of the object that is placed under the microscope. In an additional embodiment, beam splitter 624 may include a polarizer element configured to adjust the ratio of the light component diverted to camera 604 to the light component passed through to the eyepiece. The ratio may be, for example, 1:1, 1:2, 1:3, or another desired value.
[071] The images generated by the processing unit and to be overlaid on the microscopic images of the object are rendered by a projection LCOS display panel 622 illuminated by an RGB LED light source 621. The S-polarized light component S2 of the light generated by LED light source 621 is passed through a set of display illumination optics 620 including illumination lenses and a turning prism. From illumination optics 620, the S-polarized light component S2 is reflected at the hypotenuse of a polarizing beam splitter 623 to LCOS display panel 622. LCOS display panel 622 acts as an active polarizer. The P-polarized light component P2 passes through a projection lens module 628 and a polarizing wave plate 626 to tube polarizing beam splitter 624. The P-polarized light component P2 is then directed to camera 604 by tube polarizing beam splitter 624 and steering prism 625. The S-polarized light component S2 is diverted and reflected by tube polarizing beam splitter 624 to the eyepiece of the microscope, which then visualizes the microscopic images of the object and the images generated by display panel 622. When viewed through the eyepiece, the images generated by display panel 622 are overlaid on the microscopic images of the object.
[072] Alternatively, polarizing wave plate 626 may be omitted. Accordingly, the light from LCOS display panel 622 passes through tube polarizing beam splitter 624 without being reflected to the eye piece. Instead, the light from LCOS display panel 622 is directed to turning prism 625 and, in turn, to imaging lens 627 and camera 604. The benefit of this configuration is that wave plate 626 can be removed to perform a calibration between display panel 622 and camera 604. Based on calibration, the system may confirm that images generated by display panel 622 are aligned to the image space being measured by camera 604.
[073] Figures 7A and 7B illustrate an embodiment of a microscope insert 700 that is similar to microscope insert 600 described above. The components of microscope insert 700 are packaged and assembled on a base plate 711 so that microscope insert 700 is ready to be installed on a microscope. In particular, insert 700 includes one or more optical channels, each including components similar to those of insert 600 illustrated in Figures 6A and 6B.
[074] Each optical channel includes a camera 704 disposed in a camera housing affixed to base plate 711, a set of imaging lenses disposed in a lens tube 705, an imaging steering prism secured to the base plate by prism bracket 706, a set of illumination optics disposed in an illumination optics housing 709, and a set of projection lenses disposed in a lens tube 710. A focus mechanism is provided in imaging lens tube 705 and allows for fine adjustment of the relative position of the imaging lenses therein, for focusing. Likewise, a focus mechanism is also provided in display lens tube 710 and allows for fine adjustment of the position of the projection lenses for focusing.
[075] Each optical channel further includes an RGB LED light source and a display panel mounted to base plate 711 through a display and RGB LED mounting bracket 714. Microscope insert 700 further includes a driver circuit board 707 mounted to base plate 711 through a driver board bracket 708.
[076] Microscope insert 700 further includes mounting components for mounting onto a microscope. For example, insert 700 includes a top mount 701 that may be coupled to the eyepieces of the microscope. Top mount 701 may include features that allow the eyepieces to be secured thereon. Top mount 701 is secured to base plate 711 through one or more top mount braces. Top mount 701 includes one or more microscope tube openings that allow light to pass through from the polarizing beam splitters to the eye pieces of the microscope. Top mount 701 further includes a wave plate slot 712 for disposing and securing the wave plate. The wave plate may be easily inserted into the wave plate slot or removed therefrom as desired. Microscope insert 700 further includes a bottom mount flange 702 that may be coupled and secured to the microscope tube within the body of the microscope.
[077] Figure 8 illustrates a microscope insert 800 according to another embodiment. In this embodiment, light reflected from the object under the microscope (not shown) is directed from the tubes (e.g., 406A and 406B of Figure 4) to respective cameras 818A and 818B by a group of reflective mirrors and prisms 820C, 820D, 822C, and 822D. Similarly, the images generated by display devices 810A and 810B are projected back to beam splitters 816A and 816B by another group of mirrors and prisms 820A, 820B, 822A, and 822B. The arrangement in this embodiment allows the components to be disposed on a relatively small base plate that has a relatively small footprint, thereby easing integration in a variety of microscopic systems.
[078] As further shown in Figure 8, a polarizer element 814A/814B may be disposed in the light path between display device 810A/810B and beam splitter 816A/816B and is used to vary the amount of light passed through to camera 818A/818B from display device 810A/810B. Polarizing element 814A/814B may be a set of polarizers, wave plates, or variable retarders, depending on the output polarization of display devices 810A and 810B. In an embodiment, display device 810A/810B outputs an S-polarized component, which is then rotated by a 1/2-lambda wave plate in polarizing element 814A/814B so as to be reflected upwardly to the eyepiece for viewing by the user.
[079] The microscope inserts disclosed herein may create a stereoscopic image. In particular, the inserts may create separate images for the left and right eyes of the user. The images are shifted with respect to each other to provide the perception of different convergence, resulting in stereoscopic rendering.
[080] Figure 9 is a schematic diagram of a display driver circuit 900 according to an embodiment. Display driver circuit 900 generally corresponds to driver circuit 102 of Figure 1. Driver circuit 900 provides communication interfaces between processing unit 108 and display devices 110A and 110B. The functions of driver circuit 900 may include, for example:
[081] Communicating customized resolution HDMI video signals from processing unit 108 to display devices 110A and 110B;
[082] Generating image frames of a desired resolution (i.e., 1976x960), including a side by side (SBS) layout of the left and right images to be displayed to the user;
[083] Using line phasing to split the SBS image frames into left and right image signals;
[084] Directing the image data to each display device 110A/110B; and
[085] Providing a USB interface for communication with processing unit 108, which supports, for example, firmware updates, control of brightness, gamma, color channel gain of each display device, display focus, and status indication (i.e., power indication, insignia illumination, etc.).
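As a concrete illustration of the side-by-side layout described above, the following sketch divides an SBS frame into the two per-eye images. Note that the driver hardware is described as using line phasing; in software, the equivalent crop for an SBS layout is simply the two halves of the frame, and the function name here is hypothetical.

```python
import numpy as np

def split_sbs_frame(frame):
    """Split a side-by-side (SBS) frame, e.g. 1976x960, into the left and
    right images destined for the two display devices."""
    half = frame.shape[1] // 2
    return frame[:, :half], frame[:, half:]
```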
[086] According to an embodiment, processing unit 108 analyzes image data provided by cameras 118A and 118B and provides inputs to display driver circuit 102 for generating overlaid images through the display devices 110A and 110B. For example, processing unit 108 may analyze the image data for registration, tracking, or modeling the object under the microscope. Information derived from the analysis of the image data may then be used to generate and adjust the overlaid images generated by display devices 110A and 110B.
[087] In a further embodiment, the microscope insert disclosed herein may be integrated in a microscope for ophthalmic procedures, such as cataract surgery. The microscope insert may generate images representing surgery-related information to assist a surgeon to navigate during a cataract surgery. The images may be displayed to the user overlaid with the real-time microscopic image of the patient's eye. As a result, the surgeon is able to simultaneously view the image of the eye and the overlaid images through the microscope.
[088] Figure 10 illustrates an exemplary composite image 1000 rendered by a microscope having a microscope insert described herein, according to an embodiment. Image 1000 includes a real-time microscopic image 1020 of a patient's eye as viewed through the microscope and images generated by the microscope insert overlaid on the real-time eye images. Microscopic image 1020 of the patient's eye may be an analog image formed by the zoom lens elements of the microscope. The overlaid images generated by the microscope insert include graphical representations of information related to the surgical procedure. The overlaid images may include prompts or instructions to guide the surgeon during the surgery.
[089] For example, the overlaid images may include image features indicating an axis of interest 1002 and incision points 1006 and 1008 to guide the surgeon to carry out incision and placement of the artificial lens. The overlaid images may also present information including parameters related to the surgery, such as the current operation stage 1012, ultrasound power 1014, vacuum suction 1016, current time, and the like. The information may be presented in an image area 1010 near the area of operation. Image area 1010 may have a shape that generally conforms to the shape of the patient's eye. The processing unit of the microscope insert is configured to track and determine the position, size, and rotation of the patient's eye as it is viewed through the microscope and adjust the position, size, and orientation of the overlaid images accordingly so that the overlaid images remain registered with the patient's eye.
[090] The microscope insert described here may also receive external data from external data sources and user inputs from user input devices during a surgical procedure, and adjust the overlaid images accordingly. For example, during a cataract surgery, the processing unit may receive, from the external data source, demographic information, bio-information, and medical history of the patient. The external data source may include a monitoring system that monitors status of surgical equipment or status of the patient, such as heart rate, respiratory rate, blood pressure, eye pressure, and the like, during the surgery. The processing unit may receive, from the monitoring system, the external data including real-time information representing the status of the patient and the equipment and present the external data as part of the overlaid image displayed to the operating surgeon through the microscope insert.
[091] Additionally, the processing unit may receive user inputs from the surgeon through the input devices, such as a joy stick, a foot pedal, a keyboard, a mouse, etc. The user inputs may instruct the processing unit to adjust the information displayed in the overlaid images. For example, based on the user inputs, the processing unit may select portions of the external data for display as part of the overlaid images.
[092] The processing unit may also display prompts or navigation instructions related to the surgical procedure according to the user inputs. For example, when the surgeon completes a step of a surgical procedure and presses the foot pedal, the processing unit may control the microscope insert to modify the overlaid images so as to display prompts or instructions for the next step. The prompts or instructions may include text or graphical information indicating the next step and may further include data or parameters relevant to the next step.
[093] The processing unit may also control the microscope insert to generate a warning to alert the surgeon if there are abnormalities during a surgical procedure. The warning may be a visual representation such as a warning sign generated by the display devices as part of the overlaid image. The warning may also be other visual, audio, or haptic feedback, such as a warning sound or a vibration.
[094] During the operation of the microscope insert, the field of view provided by the display device of the insert may be different from the field of view of the microscope. Figure 11 illustrates a process 1100 for correcting the field of view provided by the display devices and matching it with the field of view of the microscope.
[095] According to process 1100, at step 1102, the microscope generates a microscopic image 1132 having a field of view 1152. At step 1104, the microscope insert generates an overlaid image 1134 having a field of view 1154. In an embodiment, fields of view 1152 and 1154 may each have a circular shape. Field of view 1152 may have a diameter D1, and field of view 1154 may have a diameter D2.
[096] At step 1106, overlaid image 1134 generated by the microscope insert and microscopic image 1132 generated by the microscope are displayed to the user through the eyepiece. When viewed through the eyepiece, microscopic image 1132 and overlaid image 1134 are combined or overlaid. However, due to a mismatch between the fields of view of the two images, image features of overlaid image 1134 may obscure important image features of microscopic image 1132 or may appear disproportionate to the image features of microscopic image 1132.
[097] In order to align the fields of view of the two images, overlaid image 1134 must be adjusted according to the field of view of microscopic image 1132. As discussed above with reference to Figure 5, polarization imposed by polarizing element 114 on light signals projected by display device 110A/110B allows a portion (i.e., the P-polarized component P2) of the light signals to pass through polarizing beam splitter 120A/120B. The passed-through light from display device 110A/110B is received by camera 118A/118B, which captures overlaid image 1134. On the other hand, camera 118A/118B receives light (i.e., the S-polarized component S1) from the object, which is reflected by beam splitter 120A/120B, and captures microscopic image 1132 generated by the microscope. The processing unit (i.e., processing unit 108 of Figure 1) then compares overlaid image 1134 with microscopic image 1132 to determine the image transformations necessary to match field of view 1154 of overlaid image 1134 with field of view 1152 of microscopic image 1132.
[098] At step 1108, the processing unit then applies the image transformations to overlaid image 1134 generated by the display device and controls the display device to generate an adjusted overlaid image 1138. As a result, the field of view provided by the display device is properly aligned with the field of view of the microscope at step 1110.
[099] Process 1100 may be used to correct any optical misalignment during manufacturing or slight damage from handling. The image transformations used by the processing unit may be affine transformations. Typical transformations may include translation, scaling, skewing, rotation, and the like. For example, the processing unit may determine a scaling factor for scaling overlaid image 1134 based on a ratio between the diameter D1 of field of view 1152 and the diameter D2 of field of view 1154. The processing unit may also determine translation parameters (Δx and Δy) necessary to align the microscopic image and the overlaid image based on the distance between the circular centers of fields of view 1152 and 1154. Using process 1100, the microscope insert may provide more precisely placed overlaid images over the microscopic images when viewed through the eyepiece of the microscope.
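For illustration only, the following is a minimal Python/OpenCV sketch of such an affine field-of-view correction, assuming the circle centers and diameters D1 and D2 have already been measured from the camera images; the function name and all numeric values are hypothetical, not taken from the disclosed system.

```python
import numpy as np
import cv2

def fov_correction(center_micro, d1, center_overlay, d2):
    """Build a 2x3 affine matrix that scales and shifts the overlay so
    its circular field of view matches the microscope's (hypothetical)."""
    s = d1 / d2                                   # scaling factor from D1/D2
    dx = center_micro[0] - s * center_overlay[0]  # translation delta-x
    dy = center_micro[1] - s * center_overlay[1]  # translation delta-y
    return np.float32([[s, 0, dx],
                       [0, s, dy]])

# Example: warp the overlay frame before it is sent to the display device.
overlay = np.zeros((1024, 1280, 3), np.uint8)     # placeholder overlay image
M = fov_correction((640, 512), 980.0, (612, 498), 1024.0)
adjusted = cv2.warpAffine(overlay, M, (overlay.shape[1], overlay.shape[0]))
```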
[0100] According to additional embodiments, the processing unit may monitor changes in the field of view of the microscopic image (i.e., based on the S-polarized component S1) during operation and adjust the overlaid image in such a way as to track or follow the field of view of the microscopic image. Alternatively, the processing unit may track an anatomical feature of the patient under the microscope and adjust the field of view of the overlaid image to follow the anatomical feature.
[0101] According to another embodiment, the camera (i.e., camera 118A/118B of Figure 1) is configured such that field of view 1152 of the microscope is entirely captured by the camera sensor. Similarly, the overlaid image generated by the display device (i.e., display device 110A/110B) is configured to entirely cover field of view 1152 of the microscope. The camera sensor and the display device are configured to provide oversampling so as to provide sufficient resolution over the image area that covers the field of view of the microscope.
[0102] Figure 12 illustrates a process 1200 for generating an overlaid image over a microscopic image, according to an embodiment. Process 1200 may be implemented on the microscope insert (i.e., microscope insert 100) disclosed herein.
[0103] According to process 1200, at step 1202, the microscope insert receives a first light signal from a microscope (i.e., microscope 400). The first light signal represents a first image corresponding to an object (i.e., object 406) placed under the microscope. As shown in Figures 4 and 5, the first light signal may be received from the zoom lens elements of the microscope through the tube within the body of the microscope. The first image may be an analog microscopic image of the object.
[0104] At step 1204, the microscope insert directs a first portion (i.e., the P- polarized component P1) of the first light signal to a viewing device (i.e., viewing device 402) and a second portion (i.e., the S-polarized component S1) of the first light signal to a camera (i.e., camera 118A/118B). More particularly, the first light signal may be split by the polarizing beam splitter (i.e., PBS 120A/120B) of the microscope insert into the first portion and the second portion. The polarizing beam splitter may be configured to allow the first portion of the first light signal to pass through to the viewing device and reflect the second portion of the first light signal to the camera within the microscope insert. The microscope insert may further include a tube lens (i.e., lens 112C/112D) to focus the second portion of the first light signal onto the camera sensor and/or additional light steering components (i.e., mirrors and prisms) to direct or redirect the second portion of the first light signal to the location of the camera.
[0105] At step 1206, a display device (i.e., display device 110A/110B) of the microscope insert generates a second image to be overlaid on the first image. The second image (i.e., the overlaid image) includes graphical representations indicating information relevant to the object. For example, when the object is a patient's eye and a surgical procedure (i.e., a cataract surgery) is carried out on the object, the second image may include, for example, prompts, instructions, parameters, and data relevant to the underlying surgical procedure. By displaying the second image, the display device produces a second light signal representing the second image.
[0106] At step 1208, the microscope insert directs a first portion (i.e., the P- polarized component P2) of the second light signal to the camera and a second portion (i.e., the S-polarized component S2) of the second light signal to the viewing device. The second light signal may be split again by the polarizing beam splitter into the first portion and the second portion. The polarizing beam splitter may allow the first portion to pass through to the camera and reflect the second portion to the viewing device. The microscope insert may further include a tube lens (i.e., lens 112A/112B) between the display device and the polarizing beam splitter to alter (i.e., expand) the second light signal projected by the display device. The microscope insert may also include additional light steering components (i.e., mirrors and prisms) to direct the second light signal from the display device to the location of the polarizing beam splitter. The microscope insert may also include a polarizer element (i.e., polarizer element 114) between the display device and the polarizing beam splitter. The polarizer element may impose polarization on the second light signal so as to adjust the ratio between the first portion of the second light signal, which is passed through to the camera, and the second portion of the second light signal, which is reflected to the viewing device.
[0107] At step 1210, the first portion of the first light signal and the second portion of the second light signal are combined to form a composite image, including the first image corresponding to the object and the second image generated by the display device. The second image, when viewed through the viewing device, is rendered over the first image. As a result, the user of the microscope (i.e., the surgeon) may simultaneously view the first image (i.e., the microscopic image of the patient's eye) and the second image (i.e., the overlaid image) through the viewing device (i.e., the eyepiece) of the microscope.
[0108] Additionally, at step 1210, the microscope insert may detect any mismatch between a field of view of the first image and a field of view of the second image. The microscope insert may detect the mismatch based on the second portion of the first light signal and the first portion of the second light signal received by the camera. If there is a mismatch, the microscope insert may adjust the second image according to the image transformations described herein so as to match the field of view of the second image with the field of view of the first image.
[0109] According to an embodiment, during operation, a microscope insert may apply image registration to the images of the object viewed under a microscope. Based on the registration, the processing unit of the microscope insert may render graphical elements, such as tags, labels, and the like, through the display device, providing instructions, prompts, or other surgery-related information to the operating surgeon. When the surgeon views the object through the eyepiece of the microscope, the graphical elements are overlaid on the images of the object. The overlaid graphical elements may identify and track anatomical features that are of interest and are spatially associated with the identified anatomical features, thereby providing the surgeon with visual guidance and facilitating navigation through the surgical site.
[0110] According to some embodiments, the processing unit may be configured to perform two-dimensional (2D) or three-dimensional (3D) registration and tracking. In one embodiment, images generated by the cameras may be analyzed by the processing unit independently to provide 2D registration and tracking. Alternatively, the images generated by the cameras may be analyzed together to provide a 3D registration to a known or assumed model. The disclosed system provides the benefits of improved 3D registration by using two cameras for the 3D registration and two display devices for the 3D overlays. For example, the 3D registration is significantly improved using two cameras, compared with existing systems with one camera, and allows for improved registration and tracking, particularly for anatomical features that are at different depths within an operative site and move with respect to each other.
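To illustrate how two calibrated cameras enable depth recovery, the following Python/OpenCV sketch triangulates matched feature points from the left and right views; the projection matrices, baseline, and pixel coordinates are illustrative placeholders, not values from the disclosed system.

```python
import numpy as np
import cv2

# Each camera is assumed calibrated to a 3x4 projection matrix.
P_left = np.hstack([np.eye(3), np.zeros((3, 1))]).astype(np.float64)
P_right = np.array([[1, 0, 0, -0.06],   # hypothetical ~6 cm baseline
                    [0, 1, 0, 0],
                    [0, 0, 1, 0]], dtype=np.float64)

# Matched feature locations (pixels) in the left and right images.
pts_left = np.array([[320.0, 240.0], [400.0, 260.0]]).T   # shape (2, N)
pts_right = np.array([[310.0, 240.0], [388.0, 260.0]]).T

points_4d = cv2.triangulatePoints(P_left, P_right, pts_left, pts_right)
points_3d = (points_4d[:3] / points_4d[3]).T   # homogeneous -> Euclidean
print(points_3d)   # the z column gives depth along the optical axis
```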
[0111] The tracking and registration of the movements of an object, such as a patient's eye, may be performed based on fiducial markers placed on the object or on anatomical features of the object. Figure 13 illustrates an exemplary process 1300 for the marker-based registration. As shown in Figure 13, at step 1302, the processing unit of the microscope insert applies image enhancement to the images of the object captured by the camera. The image enhancement may use any known technique, such as digital filtering, sharpening, and the like.
[0112] At step 1304, the processing unit may detect and identify fiducial markers disposed on the object. The fiducial markers may be detected based on spatial or spectral analysis of the images of the object. Alternatively, the fiducial markers may be determined based on a predetermined shape or color. The processing unit may further identify the fiducial markers and associate the fiducial markers with respective identifications.
[0113] At step 1306, the processing unit may perform pose estimation. In particular, the processing unit may perform registration on the images of the object based on the identified markers. For example, the processing unit may determine a movement or orientation of the object based on the identified markers and calculate a coordinate transformation corresponding to the movement or orientation. The transformation mathematically represents translations, rotations, or other affine transformations of the object. In addition, the processing unit may adjust the graphical elements of the overlaid images generated by the display device according to the registration. The processing unit may apply the coordinate transformation to the graphical elements so that the graphical elements experience similar translations, rotations, and the like. In an embodiment, the registration is carried out in real-time as the images of the object are captured by the camera.
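As a hedged illustration of this step, the Python/OpenCV sketch below estimates a 2D similarity transform (rotation, uniform scale, translation) from reference marker positions to current positions and applies the same transform to overlay graphics; all coordinates are illustrative, not taken from the disclosure.

```python
import numpy as np
import cv2

# Illustrative fiducial marker positions in a reference and current frame.
ref_markers = np.float32([[100, 100], [300, 120], [210, 310]])
cur_markers = np.float32([[112, 96], [311, 123], [218, 309]])

# Fit rotation + uniform scale + translation (a 2D similarity transform).
M, inliers = cv2.estimateAffinePartial2D(ref_markers, cur_markers)

# Re-position the overlay's graphical elements with the same transform.
overlay_pts = np.float32([[150, 150], [250, 200]]).reshape(-1, 1, 2)
moved_pts = cv2.transform(overlay_pts, M)
```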
[0114] Figure 14 illustrates an exemplary process 1400 for the anatomical feature-based registration. According to process 1400, at step 1402, the processing unit performs image enhancement on the images of the object captured by the camera, similar to step 1302 discussed above. At step 1404, the processing unit performs image segmentation on the enhanced images of the object. The image segmentation may be performed on an entire image or on regions of the image. The image segmentation may be based on any known technique, such as clustering, histogram, and thresholding techniques. During the image segmentation, image pixels are grouped according to the image features to which they belong.
[0115] At step 1406, the image features are detected based on the segmented images of the object. Individual image features that are of interest may be extracted from the segmented images. The detected image features may correspond to known anatomical structures of the object. The detected image features may then be matched with the known anatomical structures based on, for example, their shapes, sizes, locations, colors, and the like.
[0116] At step 1408, the processing unit may perform pose estimation similar to step 1306. For example, the processing unit may use a random sampling technique to calculate the pose of the object based on the detected image features. The processing unit may determine a coordinate transformation corresponding to the pose of the object. In addition, the processing unit may apply the coordinate transformation to the image elements of the overlaid image generated by the display device. As a result, the overlaid image tracks the anatomical features of the object when viewed through the microscope.
[0117] In either the fiducial marker-based process or the anatomical feature-based process discussed above, a 2D or 3D registration may be achieved. In two-dimensional registration, the processing unit determines, based on the processed image, a set of coordinate transformation data including, for example, X, Y, R, and Theta. X and Y represent the position, in pixel space, of the center of the patient's eye. R represents the radius, in pixel space, of the limbus of the patient's eye. Theta represents an angle of rotation of the patient's eye.
[0118] In three-dimensional registration, the processing unit may use information from the images captured by the left and right cameras to solve for the coordinate transformation data of the object on six degrees of freedom. Additionally, the processing unit may also use a hybrid registration technique that combines elements of the fiducial marker-based registration and the anatomical feature-based registration to determine the position and orientation of the eye of a patient.
[0119] As shown in Figures 13 and 14, in both the fiducial marker-based method and the anatomical feature-based method, the processing unit first performs an image enhancement on the images of the object acquired by the camera. A range of image enhancement techniques may be used. For example, as shown in Figures 15A-15C, the processing unit may apply a histogram equalization technique to an image of an eye acquired by the camera (Figure 15A). A known contrast-limited adaptive histogram equalization may be used to equalize the histogram by broadening the range of the contrast of image intensities. In a further embodiment, the processing unit first converts the image into an L-a-b color space representation, in order for the histogram equalization to be performed on the luminosity channel of the converted image. Figure 15B shows the resulting enhanced image after the histogram equalization.
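A minimal Python/OpenCV sketch of the described enhancement follows: convert to L-a-b and run contrast-limited adaptive histogram equalization on the luminosity channel only. The file name and CLAHE parameters are illustrative assumptions.

```python
import cv2

# Equalize only the luminosity channel of the L-a-b representation.
img = cv2.imread("eye_frame.png")                 # placeholder input frame
lab = cv2.cvtColor(img, cv2.COLOR_BGR2LAB)
l, a, b = cv2.split(lab)

clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
l_eq = clahe.apply(l)                             # CLAHE on luminosity only

enhanced = cv2.cvtColor(cv2.merge([l_eq, a, b]), cv2.COLOR_LAB2BGR)
```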
[0120] Following the contrast enhancement, the processing unit extracts the fiducial markers by segmenting the enhanced image. For example, in the fiducial marker-based registration, the processing unit may select the saturation channel in an HSV color space representation and apply binary thresholding to the enhanced image. Figure 15C shows a binary image produced by the binary thresholding. The binary image may be further digitally filtered by the processing unit to eliminate those regions that have a smaller number of pixels than a preset value. The further filtering may eliminate those regions that do not correspond to any markers. The processing unit may also remove highly eccentric regions having a large difference between major and minor diameters. The centroids of the remaining pixel clusters may then be computed and stored as the locations of the fiducial markers.
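The Python/OpenCV sketch below illustrates these marker-extraction steps under stated assumptions: the file name, area cutoff, and eccentricity limit are hypothetical placeholders, not values from the disclosure.

```python
import cv2

# Threshold the HSV saturation channel, drop small or highly eccentric
# regions, and keep the centroids of the remaining clusters.
enhanced = cv2.imread("enhanced_eye.png")         # output of the CLAHE step
sat = cv2.cvtColor(enhanced, cv2.COLOR_BGR2HSV)[:, :, 1]
_, binary = cv2.threshold(sat, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                               cv2.CHAIN_APPROX_SIMPLE)
marker_centroids = []
for c in contours:
    if cv2.contourArea(c) < 50:                   # fewer pixels than preset
        continue
    if len(c) >= 5:
        _, axes, _ = cv2.fitEllipse(c)
        if max(axes) > 3 * min(axes):             # highly eccentric, discard
            continue
    m = cv2.moments(c)
    marker_centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
```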
[0121] According to another embodiment, in the anatomical feature-based registration for an eye, the processing unit may further perform a contrast-limited adaptive histogram equalization (CLAHE) on the image acquired by the camera to enhance the contrast in the image. The processing unit may then apply Gaussian filtering on the enhanced image and then segment the filtered image into regions based on color similarities in the a-b space of the L-a-b color space representation. The processing unit may then apply a K-means clustering technique known in the art, as shown in Figures 16A-16D, to achieve color segmentation on a real-time scale. The K-means clustering technique applies vector quantization to the image pixels and partitions the pixels into k clusters, in which each pixel belongs to the cluster with the nearest mean. The K-means clustering technique starts from an initial guess (Figure 16A) and gradually refines the clusters (Figures 16B-16D) so that the image pixels are properly grouped in the feature space.
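For illustration, a minimal Python/OpenCV sketch of K-means color segmentation on the a-b channels follows; the file name, kernel size, and k = 3 are example assumptions.

```python
import cv2
import numpy as np

# Gaussian pre-filtering, then K-means clustering in the a-b color space.
img = cv2.imread("eye_frame.png")                 # placeholder input frame
blur = cv2.GaussianBlur(img, (5, 5), 0)
lab = cv2.cvtColor(blur, cv2.COLOR_BGR2LAB)
ab = lab[:, :, 1:3].reshape(-1, 2).astype(np.float32)

criteria = (cv2.TERM_CRITERIA_EPS + cv2.TERM_CRITERIA_MAX_ITER, 20, 1.0)
_, labels, centers = cv2.kmeans(ab, 3, None, criteria, 5,
                                cv2.KMEANS_PP_CENTERS)
segmented = labels.reshape(lab.shape[:2])         # per-pixel cluster index
```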
[0122] The resulting image may then be thresholded to define a binary mask, which may later be used to filter out or remove feature points inside the iris of the eye as shown in Figure 17. This is desired because the size and shape of the iris may change throughout the surgery, making any feature points within the iris unsuitable for tracking.
[0123] As further shown in Figures 18A, 18B, and 19, feature points of an eye, along with their describing geometries, known as feature descriptors, are identified in a reference image (Figure 18A) and a test image (Figure 18B). A known real-time feature point detection technique or scale-invariant technique may be used to classify these features. These techniques include, for example, SIFT (Scale-Invariant Feature Transform), SURF (Speeded-Up Robust Features), STAR, FAST, GFTT (Good Features to Track), and MSER (Maximally Stable Extremal Regions).
[0124] A number of feature points are detected and classified in the reference image (Figure 18A) and the test image (Figure 18B), respectively. Each circle in the image represents a feature point, and the size of the circle represents the scale of the identified point. Feature points are identified in scale-space utilizing the determinant of the Hessian (DoH). A mask determined from the color segmentation may be used to filter out any SURF features detected inside the iris of the eye. The remaining features may be matched by means of a random sampling method, such as the random sample consensus method (RANSAC), to tolerate a large number of outliers. The RANSAC method computes a homography from a minimal subset of feature points and randomly adds features to find the solution that encompasses the largest number of feature points. The resulting homography may then be transformed into a camera position and orientation, as shown in Figure 19, and compared to the known input transformation.
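A hedged Python/OpenCV sketch of this feature-matching step follows, using SIFT (one of the detectors listed above; SURF requires OpenCV's non-free build). The file names are placeholders, and `iris_mask` stands in for the binary mask from the color segmentation, zero inside the iris so iris features are ignored.

```python
import cv2
import numpy as np

ref_gray = cv2.imread("reference_eye.png", cv2.IMREAD_GRAYSCALE)
test_gray = cv2.imread("test_eye.png", cv2.IMREAD_GRAYSCALE)
iris_mask = cv2.imread("iris_mask.png", cv2.IMREAD_GRAYSCALE)

# Detect and describe features outside the iris mask.
sift = cv2.SIFT_create()
kp_ref, des_ref = sift.detectAndCompute(ref_gray, iris_mask)
kp_test, des_test = sift.detectAndCompute(test_gray, iris_mask)

matches = cv2.BFMatcher(cv2.NORM_L2, crossCheck=True).match(des_ref, des_test)
src = np.float32([kp_ref[m.queryIdx].pt for m in matches]).reshape(-1, 1, 2)
dst = np.float32([kp_test[m.trainIdx].pt for m in matches]).reshape(-1, 1, 2)

# RANSAC fits a homography while tolerating a large share of outliers.
H, inlier_mask = cv2.findHomography(src, dst, cv2.RANSAC, 5.0)
```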
[0125] In another embodiment, the processing unit may analyze 2D images generated by the cameras based on parameters of the cameras and a spatial relationship between the cameras. The processing unit may then compare the 2D analyses or use a 3D registration to calculate the position of the microscope focal plane relative to a plane of interest with respect to the real object. For example, the processing unit may determine the distance between the microscope focal plane and the plane of interest. The processing unit may then use that distance to adjust the focus of the projected images generated by the insert so as to match the plane of interest with respect to the real object. This technique provides the benefit of relieving eye strain of the surgeon when viewing the projected images and the analog image of the object at the same time, by focusing the projected images to the plane of interest that the user is visualizing.
[0126] Based on the registration and tracking discussed above, the processing unit may then control the display devices to adjust the overlaid images to track the changes or motions of the object viewed under the microscope, to display information relevant to the motions of the object, or to perform other functions accordingly. For example, in cataract surgery, once a patient's eye has been registered, the processing unit may cause the insert to render tags and labels associated with individual layers or features of the eye, such as the sclera, limbus, pupil, or iris, or other very small layers of the eye. The processing unit may identify different anatomical layers or features of the eye and generate graphical elements in the overlaid images according to the identified anatomical layers or features.
[0127] Figure 20 illustrates a sequence diagram for interactions between a tracker engine configured to track the motions of an eye and a torsion engine configured to generate graphical representations for the guidance or prompts.
Figure 21 illustrates an exemplary tracking system for tracking the eye of a patient and generating the graphical representations for the guidance or prompts.
[0128] Figure 22 illustrates an exemplary process for carrying out the tracking by the tracker engine of Figure 21. The main processing blocks are similar for both the reference and sense images, and are illustrated in Figure 22. These are detailed as follows:
[0129] Image Preprocessing - pre-filtering the reference and sense images prior to edge detection.
[0130] Edge Detection - Canny edge detector.
[0131] Generalized Hough Transform - a modified Hough Transform is used to segment the limbic boundary.
[0132] Define and unwrap Annulus Region - the center and radius of the previously detected limbic boundary are used to define an annulus in the scleral region. The annulus is then mapped from the Cartesian plane to the polar plane. The effect of this transformation is to map all pixels within the annulus to pixels in a rectangular region, i.e., f: (x, y) → (ρ, θ). The transformed annulus is sent to the Torsion Engine, where the Ocular Torsion angle is computed.
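A hedged Python/OpenCV sketch of this unwrap follows, assuming the limbic center and radius from the Hough step; the center, radius, band, and file name are all illustrative.

```python
import cv2

# Map a polar neighborhood of the limbic center to a rectangle,
# realizing f: (x, y) -> (rho, theta).
gray = cv2.imread("eye_frame.png", cv2.IMREAD_GRAYSCALE)
center = (320.0, 240.0)              # limbic center from the Hough step
r_outer = 180.0                      # outer radius of the scleral annulus

polar = cv2.warpPolar(gray, (256, 720), center, r_outer,
                      cv2.WARP_POLAR_LINEAR)
# Rows now sweep theta over [0, 2*pi); columns sweep rho toward r_outer.
annulus_strip = polar[:, 140:220]    # keep only the scleral annulus band
```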
[0133] Finally, the tracker engine passes the computed Ocular Torsion angle, along with the radius and center of the limbic boundary, to the Render Engine.
[0134] Figure 23 illustrates an exemplary process for reference image processing and generating a Hausdorff Distance look-up table. The main processing blocks are illustrated in Figure 23 and are detailed as follows:
[0135] Image pre-processing - this processing step includes histogram equalization.
[0136] Gabor filtering and Skeletonization, including extracting the vasculatures from the sclera using four orientations of a 2-D Gabor filter, skeletonizing the four feature images, combining (logical OR) the four skeletonized images, and skeletonizing the final image.
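A minimal Python sketch of this Gabor-and-skeletonize step follows, assuming OpenCV and scikit-image; kernel parameters, the threshold, and the file name are illustrative assumptions.

```python
import cv2
import numpy as np
from skimage.morphology import skeletonize

# 2-D Gabor filters at four orientations; skeletonize each response,
# OR-combine the four skeletons, and skeletonize the result.
strip = cv2.imread("annulus_strip.png", cv2.IMREAD_GRAYSCALE)
combined = np.zeros(strip.shape, dtype=bool)
for theta in (0, np.pi / 4, np.pi / 2, 3 * np.pi / 4):
    kernel = cv2.getGaborKernel((21, 21), 4.0, theta, 10.0, 0.5, 0)
    response = cv2.filter2D(strip, cv2.CV_32F, kernel)
    features = response > response.mean() + response.std()  # crude threshold
    combined |= skeletonize(features)            # skeletonize each image

vessel_skeleton = skeletonize(combined)          # final skeletonization
```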
[0137] Define template ROI in reference annulus - a template ROI is extracted from the Gabor filtered and skeletonized reference annulus.
[0138] Compute table of all possible Hausdorff Distances - a look-up table is created at this step. It is predicated on the assumption that all pixels in the sense image ROI belong to a feature. This facilitates the computation of all possible Hausdorff Distances between the sense image ROI and the previously defined reference annulus template ROI.
[0139] Cache look-up table.
[0140] Figure 24 illustrates an exemplary process for processing the sense image and computing the minimum Hausdorff Distance between the reference image template ROI and the ROI in the sense image. The main processing blocks are illustrated in Figure 24 and are detailed as follows:
[0141] Image pre-processing - this processing step includes histogram equalization.
[0142] Gabor filtering and Skeletonization, including extracting the vasculatures from the sclera using four orientations of a 2-D Gabor filter, skeletonizing the four feature images, combining (logical OR) the four skeletonized images, and skeletonizing the final image.
[0143] Define ROI in sense annulus - an ROI is extracted from the Gabor-filtered and skeletonized sense image annulus.
[0144] Compute Hausdorff Distances - the minimum Hausdorff Distance between the sense image ROI and the previously defined reference annulus template ROI is computed.
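As a hedged illustration of this distance step, the Python/SciPy sketch below computes a symmetric Hausdorff distance between two binary skeleton ROIs; the placeholder arrays stand in for the skeletonized template and sense ROIs from the previous steps.

```python
import numpy as np
from scipy.spatial.distance import directed_hausdorff

rng = np.random.default_rng(0)
template_roi = rng.random((64, 80)) > 0.95   # placeholder binary skeletons
sense_roi = rng.random((64, 80)) > 0.95

ref_pts = np.argwhere(template_roi)          # (row, col) feature coordinates
sense_pts = np.argwhere(sense_roi)

# Symmetric Hausdorff distance: max of the two directed distances.
d = max(directed_hausdorff(ref_pts, sense_pts)[0],
        directed_hausdorff(sense_pts, ref_pts)[0])
```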
[0145] Cache look-up table.
[0146] Compute the Ocular Torsion angle.
[0147] Figures 25-32 depict a process for estimation of ocular torsion from sclera features. The ocular torsion estimation is based on the following assumptions:
[0148] The Ocular Torsion angle θ_OT is bounded.
[0149] Between neighboring (successive) frames: θ_OT < 2° typically.
[0150] As shown in Figure 25, the torsion angle is deterministic. The computed angle is corrupted with additive white Gaussian noise (AWGN) and can be modeled as a sinusoid with AWGN.
[0151] During the estimation, at startup (i.e., the first frame), θ_OT is not known a-priori. The worst-case search scenario is that the entire exemplar may need to be searched, with high computational complexity. It is further assumed that the torsional rotation between subsequent frames is constrained, so that only a small window of the exemplar is needed to compute the distance metric, and, hence, the computational complexity decreases.
[0152] During initialization of the estimation, the system captures an exemplar frame from which to compute the reference feature vector. A reference image is captured with the subject (i.e., the patient) looking straight ahead. This is the exemplar image from which the feature template is generated.
[0153] According to an embodiment, the pupil radius of the patient under dilation is non-deterministic. The system may use a-priori knowledge of the pupil center and radius to determine an accurate estimate of the iris radius. The system may also preclude instabilities that may arise from extreme pupil dilation. Extreme pupil dilation precludes a safe guess as to the annular region which contains the outer iris boundary.
[0154] According to an embodiment, the system also computes the iris radius. The iris boundary is deformable under incident forces. Absent any deforming incident forces, the radius is constant within bounds. The system first computes the pupil radius, which is non-deterministic under dilation, and precludes a "guess" for the annular ring which encloses the elliptical boundary of the iris. The system then computes the iris radius.
[0155] During the iris radius computation, the system may relax the stringent requirements that are necessary for biometric applications. The iris is enclosed by the limbus. The system may use an empirical determination of the maximum thickness of the limbus, or the best-fit circle for the iris. The inner radius, r_t, of the annulus is computed as follows: r_t = r_iris + Δ, where Δ is the maximum possible thickness of the limbus.
[0156] Figure 26 illustrates an exemplary process for iris boundary estimation, including frame capture, down sampling, histogram equalization, grey scaling, Gaussian kernel filtering, pupil center and radius calculation, iris center and radius calculation, and definition of annular region in sclera.
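A hedged Python/OpenCV sketch of this pipeline follows: downsample, equalize, Gaussian-filter, then a circular Hough transform for the pupil and a second pass in a larger radius band for the iris. All parameters and the file name are illustrative, not values from the disclosure.

```python
import cv2

gray = cv2.cvtColor(cv2.imread("eye_frame.png"), cv2.COLOR_BGR2GRAY)
small = cv2.pyrDown(gray)                        # down sampling
eq = cv2.equalizeHist(small)                     # histogram equalization
blur = cv2.GaussianBlur(eq, (9, 9), 2)           # Gaussian kernel filtering

pupil = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1, minDist=200,
                         param1=100, param2=30, minRadius=20, maxRadius=80)
if pupil is not None:
    px, py, pr = pupil[0, 0]
    # Use the pupil center/radius as a-priori knowledge for the iris search.
    iris = cv2.HoughCircles(blur, cv2.HOUGH_GRADIENT, dp=1, minDist=200,
                            param1=100, param2=30,
                            minRadius=int(pr * 1.5), maxRadius=int(pr * 4))
```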
[0157] Figure 27 illustrates an exemplary process for feature extraction from the annular ring in the exemplar image, including pre-definition of the annular region in the sclera, mapping the annulus to a rectangular region, sclera feature detection within (0 - 2π), and post-processing. The sclera feature detection may be based on simple edge detection, such as a Sobel filter, Canny filter, etc., a Gabor wavelet, or a wavelet using the lifting scheme.
[0158] Figure 28 illustrates an exemplary feature extraction from a segment of the annular ring in a test image, including segmentation from the pre-defined annular region in the sclera, mapping the annular segment to a rectangular region, sclera feature detection, and post-processing. The pre-determined segment is taken from the pre-defined annular ring. The annular ring may be previously defined in the exemplar image. The segment is delimited by two angles θ1, θ2 ∈ [0, 2π].
[0159] Figure 29 illustrates an exemplary annular ring constructed in scleral region of an exemplar image. Figure 30 illustrates an exemplary mapping of annulus in sclera to a rectangular region.
[0160] Figure 31 illustrates an exemplary segmentation in annular region in a test image. Figure 32 illustrates an exemplary template matching process.
[0161] According to an embodiment, the estimation of the ocular torsion may be based on the city-block distance metric (i.e., the l1-norm). In particular, for an n-dimensional vector space, the system may calculate the l1-norm between p and q by:

d(p, q) = |p_1 − q_1| + |p_2 − q_2| + ... + |p_n − q_n|

[0164] The l1-norm metric is computationally more efficient than the Euclidean distance and the cross-correlation techniques.
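For illustration, a minimal Python sketch of a city-block template match over the constrained torsion window follows; the helper names, placeholder arrays, and window size are assumptions, not part of the disclosed system.

```python
import numpy as np

def l1_distance(p, q):
    return np.sum(np.abs(p - q))     # city-block (l1-norm) distance

def estimate_torsion(ref_strip, segment, prev_row, window):
    """Slide the sense segment along the unwrapped reference annulus
    within a small window around the previous match (hypothetical)."""
    rows = ref_strip.shape[0]
    best_row, best_d = None, np.inf
    # Only a small window is searched, since the inter-frame torsion is
    # constrained (< 2 degrees typically).
    for shift in range(prev_row - window, prev_row + window + 1):
        idx = np.arange(shift, shift + segment.shape[0])
        candidate = np.take(ref_strip, idx, axis=0, mode='wrap')
        d = l1_distance(candidate.astype(np.float32),
                        segment.astype(np.float32))
        if d < best_d:
            best_row, best_d = shift, d
    return best_row * 360.0 / rows   # row shift -> torsion angle (degrees)

ref_strip = np.random.rand(720, 80)  # placeholder unwrapped annulus
segment = ref_strip[100:160].copy()  # placeholder sense segment
print(estimate_torsion(ref_strip, segment, prev_row=98, window=4))
```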
[0165] According to an embodiment, the segments of the annular ring will, at times, be occluded during surgery. The system may choose a segment that is not occluded, as segment occlusion will lead to template match failure.
[0166] The system may mitigate the occlusion by locating the occluded segment. The occluded segment may be non-deterministic absent a-priori knowledge of the location of the instruments. The computational complexity may also be high.
[0167] The system may also define a number of segments using a-priori knowledge of the instruments' location (if known) so as to preclude the choice of an occluded segment. Alternatively, the system may choose a segment randomly and compute the distance metric. The system will know if the match fails, without the need to search the entire exemplar, since the torsion angle between neighboring (successive) frames is constrained: θ_OT < 2° typically.
[0168] Using a-priori knowledge of the torsion frequency will allow the system to compute the next torsion angle gradient and will allow some limited adaptation. It will also allow estimation of the torsion frequency when the pupil is dilated and the patient is prone, which is the preamble to actual surgery. The system may fit a best-fit sinusoid to the torsion angles.
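A hedged Python/SciPy sketch of such a best-fit sinusoid follows; the sample rate, frequency, and noise level are synthetic placeholders used only to exercise the fit.

```python
import numpy as np
from scipy.optimize import curve_fit

def sinusoid(t, amp, freq, phase, offset):
    return amp * np.sin(2 * np.pi * freq * t + phase) + offset

# Synthetic torsion-angle history: a sinusoid plus AWGN (illustrative).
t = np.arange(0, 10, 1 / 30)                     # 30 fps frame times
angles = 1.5 * np.sin(2 * np.pi * 0.25 * t) + np.random.normal(0, 0.1, t.size)

params, _ = curve_fit(sinusoid, t, angles, p0=[1.0, 0.2, 0.0, 0.0])
next_angle = sinusoid(t[-1] + 1 / 30, *params)   # predicted next frame
```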
[0169] One of the benefits of using 3D registration based on two cameras in the disclosed microscope insert is that the microscope insert allows the processing unit to acquire depth information along the z axis of any anatomical feature, which is not available in existing systems. The processing unit may then use the depth information to provide 3D guidance and assist the surgeon in navigating during the surgical procedure.
[0170] This disclosure is not limited to the particular implementations listed above. Other display techniques, protocols, formats, and signals may also be used without deviating from the principles of this disclosure. Other embodiments of the invention will be apparent to those skilled in the art from consideration of the specification and practice of the invention disclosed herein. Although the microscope insert is described above in the context of a cataract surgery, one of ordinary skill in the art will appreciate that the microscope insert may be integrated in other surgical systems configured to carry out a wide variety of surgical procedures, such as spinal surgery; ear, nose, and throat (ENT) surgery; neurosurgery; plastic and reconstructive surgery; gynecology; oncology; etc. For these procedures, the insert may be used for registration, tracking, and image recognition and to generate customized stereoscopic overlaid information relevant to the procedure and a particular patient's anatomy that is not limited to what is disclosed herein. It is intended that the specification and examples be considered as exemplary only, with a true scope and spirit of the invention being indicated by the following claims.

Claims

1. A method for registering and tracking an object with a microscope, comprising:
capturing an image of an eye with a camera;
analyzing the image with a processing unit, wherein the processing unit performs the steps of:
detecting a limbic boundary;
defining a plurality of segments in the image relative to the limbic boundary;
detecting features in the plurality of segments; and
determining at least one of a position, size, or rotation of the eye.
2. The method of claim 1, further comprising generating a graphical representation of information related to the eye.
3. The method of claim 1, wherein defining the plurality of segments includes performing a modified Hough Transform.
4. The method of claim 1, wherein the limbic boundary is annular.
5. The method of claim 1, wherein detecting the limbic boundary includes detecting with a Canny edge detector.
6. The method of claim 1, wherein the number of segments is predetermined.
7. The method of claim 1, wherein the features are anatomical.
8. The method of claim 2, further comprising tracking a position of the eye by: determining a coordinate transformation corresponding to the position of the eye; and
applying the coordinate transformation to the graphical representation.
9. A microscope system for registering and tracking an object with a microscope, comprising:
a camera configured to capture an image of an eye;
a processing unit configured to analyze the image of the eye, wherein the processing unit is configured to:
detect a limbic boundary;
define a plurality of segments in the image relative to the limbic boundary;
detect features in the plurality of segments; and
determine at least one of a position, size, or rotation of the eye.
10. The microscope system of claim 9, wherein the processing unit is further configured to generate a graphical representation of information related to the eye.
11. The microscope system of claim 9, wherein the processing unit performs a modified Hough Transform to define the plurality of segments.
12. The microscope system of claim 11, wherein the limbic boundary is annular.
13. The microscope system of claim 12, wherein the processing unit is configured to detect the limbic boundary using a Canny edge detector.
14. The microscope system of claim 9, wherein the number of segments is predetermined.
15. The microscope system of claim 9, wherein the features are anatomical.
16. The microscope system of claim 10, wherein the processing unit is further configured to:
determine a coordinate transformation corresponding to the position of the eye; and
apply the coordinate transformation to the graphical representation.