EP2903500A1 - Apparatus and method for operating a real-time sequential wavefront sensor with a large diopter range - Google Patents

Apparatus and method for operating a real-time sequential wavefront sensor with a large diopter range

Info

Publication number
EP2903500A1
EP2903500A1 (application EP13792222.5A)
Authority
EP
European Patent Office
Prior art keywords
wavefront
eye
sld
image
lens
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Withdrawn
Application number
EP13792222.5A
Other languages
German (de)
English (en)
Inventor
Yan Zhou
Bradford Chew
William Shea
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Clarity Medical Systems Inc
Original Assignee
Clarity Medical Systems Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Clarity Medical Systems Inc filed Critical Clarity Medical Systems Inc
Publication of EP2903500A1

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/1015 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for wavefront analysis
    • A61F FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F 9/00 Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F 9/007 Methods or devices for eye surgery
    • A61F 9/008 Methods or devices for eye surgery using laser
    • A61F 9/00825 Methods or devices for eye surgery using laser for photodisruption
    • A61F 2009/00844 Feedback systems
    • A61F 2009/00846 Eyetracking
    • A61F 2009/00848 Feedback systems based on wavefront
    • A61F 2009/00851 Optical coherence topography [OCT]
    • A61F 2009/00853 Laser thermal keratoplasty or radial keratotomy
    • A61F 2009/00861 Methods or devices for eye surgery using laser adapted for treatment at a particular location
    • A61F 2009/0087 Lens
    • A61F 2009/00872 Cornea
    • F MECHANICAL ENGINEERING; LIGHTING; HEATING; WEAPONS; BLASTING
    • F04 POSITIVE-DISPLACEMENT MACHINES FOR LIQUIDS; PUMPS FOR LIQUIDS OR ELASTIC FLUIDS
    • F04C ROTARY-PISTON, OR OSCILLATING-PISTON, POSITIVE-DISPLACEMENT MACHINES FOR LIQUIDS; ROTARY-PISTON, OR OSCILLATING-PISTON, POSITIVE-DISPLACEMENT PUMPS
    • F04C 2270/00 Control; Monitoring or safety arrangements
    • F04C 2270/04 Force
    • F04C 2270/041 Controlled or regulated

Definitions

  • One or more embodiments of the present invention relate generally to wavefront sensors for use in vision correction procedures.
  • More specifically, the invention relates to the electronics and algorithms for driving and controlling a real-time sequential wavefront sensor, processing its data, and operating the other subassemblies associated with the wavefront sensor.
  • Conventional wavefront sensors for human-eye wavefront characterization are generally designed to take a snapshot, or several snapshots, of a patient's eye wavefront with the room lighting turned down or off.
  • These wavefront sensors generally use a CCD or CMOS image sensor to capture the wavefront data and need relatively complicated data-processing algorithms to determine the wavefront aberrations.
  • When these wavefront sensors are integrated with an ophthalmic device such as a surgical microscope, they generally cannot provide accurate, repeatable real-time wavefront aberration measurement, especially with the microscope's illumination light turned on.
  • Fig. 9C shows a number of representative cases of planar wavefront, defocus and astigmatism, the associated image spot position on a quad-detector behind a subwavefront focusing lens, as well as the sequential movement of the corresponding centroid positions when displayed as a 2D data point pattern on a monitor.
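The centroid positions plotted in Fig. 9C derive from the four photocurrents of the quad-detector behind the sub-wavefront focusing lens. As a rough sketch of that computation (the quadrant layout and sign conventions below are assumptions, not taken from the patent):

```python
def quad_centroid(a, b, c, d):
    """Normalized (x, y) centroid of a light spot on a quad-detector.

    a, b, c, d are the photocurrents of the four quadrants, assumed
    laid out as:   a | b
                   --+--
                   c | d
    """
    total = a + b + c + d
    if total <= 0:
        raise ValueError("no light on detector")
    x = ((b + d) - (a + c)) / total   # left-right imbalance
    y = ((a + b) - (c + d)) / total   # top-bottom imbalance
    return x, y
```

For a planar wavefront every sequentially sampled sub-wavefront focuses to the same centered spot, so successive (x, y) values cluster at the origin; defocus and astigmatism trace out the circular and elliptical 2D patterns the figure illustrates.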
  • Fig. 13B shows the case in which the wavefront is shifted leftward as the SLD pulse is fired, so that the aperture samples a portion at the right of the circular wavefront section.
  • Fig. 13C shows the case in which the wavefront is shifted upward as the SLD pulse is fired, so that the aperture samples a portion at the bottom of the circular wavefront section.
  • A lock-in detection electronics system, associated with related algorithms for achieving high-precision wavefront measurement, obtains its electronic signal from an opto-electronic position sensing device/detector; it amplifies the analog signal with a composite trans-impedance amplifier, converts the analog signal to a digital signal via an A/D converter, amplifies the digital signal via a digital amplifier, and processes the data via a data processing unit.
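Once the PSD signal is digitized, lock-in detection can be done numerically. The sketch below illustrates the principle only, under the assumptions of a sinusoidal reference at the SLD pulse repetition frequency and simple averaging over an integer number of cycles as the low-pass stage; the function name and interface are hypothetical:

```python
import numpy as np

def lockin_demodulate(samples, fs, f_ref):
    """Recover the amplitude of the PSD signal component at the SLD pulse
    frequency f_ref, rejecting DC and low-frequency 1/f noise.

    samples: digitized PSD channel; fs: sampling rate in Hz.
    Averaging over an integer number of reference cycles plays the role
    of the low-pass filter in a conventional lock-in amplifier.
    """
    t = np.arange(len(samples)) / fs
    # Mix with in-phase and quadrature references at f_ref.
    i = 2.0 * np.mean(samples * np.cos(2 * np.pi * f_ref * t))
    q = 2.0 * np.mean(samples * np.sin(2 * np.pi * f_ref * t))
    # The magnitude is insensitive to the unknown phase of the pulses.
    return np.hypot(i, q)
```

Because mixing shifts only the component at f_ref down to DC, a large static offset (e.g. from microscope illumination light) averages away rather than swamping the measurement.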
  • the electronics system is connected to some or all of those electronically active devices of the wavefront sensor module to achieve different functionalities.
  • a fixation/imaging beam splitter 166/266 directs the image of a fixation target 164/264, formed by a lens or set of lenses 170/270 together with the first lens 104/204, along a reverse path to the patient eye.
  • the lens 168/268 in front of the image sensor 162/262 can be designed to work with the first lens 104/204 to provide a desired optical magnification for the live image of the anterior or posterior of the patient eye on a display (not shown in Fig. 1 and 2) and be used to adjust focus either manually or automatically if needed to ensure that the image sensor plane is conjugate with, for example, the eye pupil plane so that a clear eye pupil image can be obtained.
  • a second function of the LEDs (135/235) is to create specular reflection image spots returned from the optical interfaces of the cornea and/or the eye lens (natural or artificial) so that Purkinje images of the LEDs (135/235) can be captured by the image sensor (162/262).
  • the transverse position of the patient eye can be determined.
  • the top and/or bottom surface profile or the topograph of the cornea and/or the eye lens (natural or artificial) can be figured out in the same way as a corneal topographer and/or a keratometer/keratoscope does. This information obtained can be used to determine change(s) in the cornea shape or even some other eye biometric/anatomic parameters. The measured change can then be used to set a targeted or expected refraction during or right after the refractive surgery so that when the incision or wound made in the cornea of the eye is completely healed, the final refraction of the eye will be as desired.
  • the transverse position of the eye or the pupil can be determined using the live eye image or other means.
  • the limbus can provide a reference to where eye is; the border between the pupil and the iris can also provide the reference to where the eye is.
  • specularly reflected flood illumination light from the cornea anterior surface captured by the live eye camera as bright light spots or detected by additional position sensing detectors can also be used to provide the information on the transverse position of the eye.
  • specularly reflected SLD light from the cornea anterior surface can also be captured by the live eye camera as bright light spots or detected by additional position sensing detectors to determine the transverse position of the eye.
  • the SLD beam can also be scanned in two dimensions to search for the strongest cornea apex specular reflection and to determine the eye transverse position.
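The two-dimensional search for the strongest corneal-apex specular reflection can be as simple as an argmax over the scanned grid. A minimal sketch, assuming the scan results are available as a mapping from scan angles to detected intensity (the interface is hypothetical):

```python
def find_cornea_apex(scan_signal):
    """Return the (x_angle, y_angle) scan position with the strongest
    specular-reflection return; this position tracks the corneal apex
    and hence the transverse position of the eye."""
    if not scan_signal:
        raise ValueError("empty scan")
    return max(scan_signal, key=scan_signal.get)
```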
  • In another embodiment of the present disclosure, the wavefront beam scanner can be operated with a DC offset so that the final wavefront image 482, when relayed to the final wavefront sampling image plane, is transversely displaced to be re-centered with respect to the wavefront sampling aperture 458.
  • Although the SLD beam 498 would still enter the eye off-centered, and the wavefront at the cornea plane, as the object to be relayed by the 8-F relay, is off-axis when passing through the first, second and third lenses, after the wavefront scanner the relay is corrected and is on-axis. Accordingly, further angular rotational scanning of the wavefront beam scanner relative to this DC offset angle would result in the sampling of a radially or rotationally symmetric annular ring 494 with respect to the center of the eye.
  • In the left column of Fig. 5, three emmetropic eyes are shown: the top one 504 moved farther away from the wavefront sensor, the middle one 506 at the designed axial location of the wavefront sensor, and the bottom one 508 moved toward the wavefront sensor.
  • Because the wavefront emerging from an emmetropic eye is planar, at the designed object plane 502, from which the wavefront will be relayed to the final wavefront sampling plane, the wavefronts 514, 516 and 518 are planar in all three cases. Therefore, when the eye is emmetropic, if the eye is slightly displaced axially from the designed position, the wavefront measurement result will not be affected.
  • the radius of curvature of the wavefront 538 at the object plane of the wavefront sensor is now larger than the wavefront 536 at the corneal plane.
  • the measured wavefront result at the wavefront object plane will again be different from that at the corneal plane of the eye.
  • the wavefront emerging from the eye will be divergent and by extending the divergent light rays backward, one can find a virtual focus point (555, 557, 559) from which the light rays originate.
  • the hyperopic dioptric value of the wavefront at the corneal plane is determined by the distance from the corneal plane of the eye to the virtual focus point.
  • the wavefront 554 at the object plane 542 of the wavefront sensor is again not the same as the wavefront 556 at the corneal plane of the eye.
  • the divergent radius of curvature of the wavefront 554 at the object plane of the wavefront sensor is now larger than the divergent radius of curvature of the wavefront 556 at the corneal plane. Therefore, when this wavefront 554 at the object plane of the wavefront sensor is measured by the wavefront sensor, the measured result will again be different from the wavefront 556 at the corneal plane.
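The change in radius of curvature between the corneal plane and the sensor's object plane follows the standard vergence-propagation relation V' = V / (1 − d·V). The sketch below is a textbook free-space model only; the patent relies on calibration rather than this closed form:

```python
def propagate_vergence(v_cornea_d, d_m):
    """Vergence (diopters) of the eye wavefront after free-space
    propagation by d_m meters from the corneal plane toward the sensor's
    designed object plane.

    Convention: convergent (myopic) wavefronts have V > 0, divergent
    (hyperopic) wavefronts V < 0. A planar (emmetropic) wavefront,
    V = 0, is unchanged by d, matching the text.
    """
    return v_cornea_d / (1.0 - d_m * v_cornea_d)
```

For a divergent wavefront, the virtual focus recedes as the wavefront propagates, so its radius of curvature grows and its dioptric magnitude shrinks, which is why the value measured at the object plane differs from that at the corneal plane.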
  • A surgical microscope, when fully zoomed out, can generally present to a surgeon a relatively sharply focused view of the patient eye within an axial range on the order of about ±2.5 mm. So when the surgeon focuses on a patient eye under a surgical microscope, the variation in the patient eye's axial position should be within a range of about ±2.5 mm. Therefore, the calibration can be done over such a range, and the look-up table can be established over the same range.
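A calibration table established over such a ±2.5 mm range can then be consulted at measurement time. A minimal sketch with hypothetical table values, using linear interpolation between calibration points:

```python
import bisect

def lookup_correction(axial_offset_mm, table):
    """Linearly interpolate a diopter correction from a calibration
    look-up table covering the calibrated axial range.

    table: list of (axial_offset_mm, correction_diopters) pairs sorted
    by offset; offsets outside the calibrated range are rejected.
    """
    xs = [x for x, _ in table]
    ys = [y for _, y in table]
    if not xs[0] <= axial_offset_mm <= xs[-1]:
        raise ValueError("offset outside calibrated range")
    i = bisect.bisect_left(xs, axial_offset_mm)
    if xs[i] == axial_offset_mm:
        return ys[i]
    x0, x1, y0, y1 = xs[i - 1], xs[i], ys[i - 1], ys[i]
    return y0 + (y1 - y0) * (axial_offset_mm - x0) / (x1 - x0)
```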
  • the wavefront data can be abandoned/filtered to exclude the "dark" or "bright" data and at the same time, the SLD 172/272 can be turned off.
  • The wavefront sensor can be used to determine whether the eye is dry, and a reminder in the form of a video or audio signal can be sent to the surgeon or clinician to indicate when to irrigate the eye.
  • The signal from the image sensor 162/262 can also be used to identify whether the patient eye is in a phakic, aphakic, or pseudo-phakic state and, accordingly, the SLD pulses can be turned on only during the needed period.
  • These approaches can reduce the patient's overall exposure time to the SLD beam and thus possibly allow higher peak power or longer on-duration SLD pulses to be used to increase the wavefront measurement signal to noise ratio.
  • an algorithm can be applied to the resultant eye image to determine optimal distance to the eye through the effective blurriness of the resultant image, and/or in tandem with triangulation fiducials.
  • the SLD beam can be pre-shaped or manipulated so that when the beam enters the eye at the cornea plane, it can be either collimated or focused or partially defocused (either divergently or convergently) at the cornea plane.
  • the SLD beam lands on the retina as either a relatively small light spot or a somewhat extended light spot, it will be scattered over a relatively large angular range, and the returned beam thus generated will have both the original polarization and an orthogonal polarization.
  • the orthogonal polarization component of the wavefront relay beam is used for eye wavefront measurement.
  • a band pass filter 176/276 is arranged in the wavefront relay beam path to reject any visible light and/or ambient background light, and to only allow the desired relatively narrow spectrum of the wavefront relay beam light that the SLD generates to enter the rest of the wavefront sensor module.
  • the SLD beam can also be scanned to land over a small scanned area on the retina with the control from the electronics system which includes the front end electronic processor and the host computer.
  • A scan mirror 180/280 for scanning the SLD beam, as shown in Fig. 1 and 2, can be positioned at the back focal plane of the first wavefront relay lens 104/204.
  • the scanning of the SLD beam over a small area on the retina can provide several benefits; one is to reduce speckle effects resulting from having the SLD beam always landing on the same retina spot area, especially if the spot size is very small; another benefit is to divert the optical energy over a slightly larger retinal area so that a higher peak power or longer on-duration pulsed SLD beam can be launched to the eye to increase the signal to noise ratio for optical wavefront measurement; and still another benefit is to enable the wavefront measurement to be averaged over a slightly larger retinal area so that wavefront measurement errors resulting from retinal topographical non-uniformity can be averaged out or detected and/or quantified.
  • the SLD beam spot size on the retina can also be controlled to achieve similar goals.
  • An internal calibration target 199/299 can be moved into the wavefront relay beam path when a calibration/verification is to be made.
  • the SLD beam can be directed to be coaxial with the wavefront relay optical beam path axis when the internal calibration target is moved in place.
  • the calibration target can be made from a material that will scatter light in a way similar to an eye retina with maybe some desired attenuation so that a reference wavefront can be generated and measured by the sequential wavefront sensor for calibration/verification purpose.
  • The generated reference wavefront can be a nearly planar wavefront, a typical aphakic wavefront, or a divergent or convergent wavefront of any other degree of divergence/convergence.
  • The returned light waves that pass through the PBS 174/274 are collected with a low coherence fiber optic interferometer, as is typically employed for optical low coherence interferometry (OLCI) or optical coherence tomography (OCT) measurements.
  • the SLD output fiber 188/288 can be single mode (SM) (and polarization maintaining (PM) if desired) and can be connected to a normal single mode (SM) fiber (or a polarization maintaining (PM) single mode optical fiber) coupler so that one portion of the SLD light is sent to the wavefront sensor and another portion of the SLD light is sent to a reference arm 192/292.
  • the same fiber coupler 190/290 is used for both splitting and recombining the light waves in a Michelson type of optical interferometer configuration
  • Other well-known fiber optic interferometer configurations can also be used; one example is a Mach-Zehnder type configuration using two fiber couplers, with a fiber circulator in the sample arm to efficiently direct the light wave returned from the sample arm to the recombining fiber coupler.
  • Various OLCI/OCT configurations and detection schemes including spectral domain, swept source, time domain, and balanced detection, can be employed.
  • the detection module 194/294, the reference arm 192/292 (including the reference mirror plus the fiber loop), and even the SLD 172/272 and the fiber coupler 190/290 can be located outside the wavefront sensor enclosure.
  • the reason for doing this is that the detection module 194/294 and/or the reference arm 192/292 and/or the SLD source 172/272 can be bulky depending on the scheme being used for the OLCI/OCT operation.
  • the electronics for operating the OLCI/OCT sub-assembly can be located either inside the wavefront sensor enclosure or outside the wavefront sensor enclosure.
  • the SLD beam can be scanned across the anterior segment of the eye or across a certain volume of the retina and biometric or anatomic structure measurement of the various parts of the eye can be made.
  • One particularly useful measurement is the cornea surface and thickness profile.
  • the beam scanner 112/212 used for shifting/scanning the wavefront and those (180/280, 182/282) used for scanning the SLD beam can also have a dynamic DC offset to bring additional benefits to the present disclosure.
  • The scanner 112/212 used for shifting and/or scanning the wavefront can be utilized to compensate for potential misalignment of the optical elements resulting from environmental changes such as temperature, to ensure that wavefront sampling remains rotationally symmetric with respect to the center of the eye pupil.
  • the reference point on the position sensing device/detector (PSD) can also be adjusted if needed per the compensated image spot locations through a calibration.
  • the scanner 180/280 used for scanning the SLD beam can be employed to follow eye transverse movement within a certain range through a feedback signal from the image sensor 162/262.
  • the returned wavefront beam from the eye will be transversely displaced relative to the optical axis of the wavefront sensor module.
  • the relayed wavefront at the wavefront sampling image plane will also be transversely displaced.
  • the DC offset of the scanner 112/212 used for shifting the wavefront can be employed to compensate for this displacement and still make the scanned wavefront beam rotationally symmetric with respect to the wavefront sampling aperture 118/218.
  • With the combination of information provided by the image sensor, the wavefront sensor, the specular reflection detector and/or the low coherence interferometer, it is possible to combine some or all of the information to realize an automatic selection of the correct calibration curve and/or the correct data processing algorithm. Meanwhile, a data integrity indicator, a confidence indicator, a cataract opacity degree indicator, or an indicator of the presence of optical bubbles can be shown to the surgeon or clinician through audio, video or other means, or connected to other instruments to provide feedback. The combined information can also be used for intraocular pressure (IOP) detection, measurement and/or calibration.
  • An intraocular pressure change in the anterior chamber of the eye, generated by the patient's heartbeat or by an external acoustic wave, can be detected by the wavefront sensor and/or the low coherence interferometer in synchronization with an oximeter that monitors the patient's heartbeat signal.
  • a pressure gauge equipped syringe can be used to inject viscoelastic gel into the eye to inflate the eye and also measure the intraocular pressure.
  • the combined information can also be used to detect and/or confirm the centering and/or tilt of an implanted intraocular lens (IOL) such as a multi-focal intraocular lens.
  • the combined information can also be used for the detection of the eye status, including phakia, aphakia and pseudophakia.
  • the wavefront sensor signal can be combined with the OLCI/OCT signal to measure and indicate the degree of optical scattering and/or opacity of the eye lens or the optical media of the ocular system.
  • the wavefront sensor signal can also be combined with the OLCI/OCT signal to measure tear film distribution over the cornea of the patient eye.
  • One requirement for a real-time ophthalmic wavefront sensor is the large diopter measurement dynamic range that can be encountered during cataract surgery, such as when the natural eye lens is removed and the eye is aphakic.
  • Although the optical wavefront relay configuration has been designed to cover a large diopter measurement dynamic range, the sequential nature has eliminated the cross-talk issue, and the lock-in detection technique can filter out DC and low-frequency 1/f noise, the dynamic range can still be limited by the position sensing device/detector (PSD).
  • The optics is optimally designed so that, over the desired diopter coverage range, the image/light spot size on the PSD is always within a certain range such that its centroid can be sensed by the PSD.
  • a positive lens can be dropped into the wavefront relay beam path at the 4-F wavefront image plane to offset the spherical defocus component of the wavefront and therefore to bring the image/light spot landing on the PSD to within the range such that the PSD can sense/measure the centroid of sequentially sampled sub-wavefronts.
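Choosing which positive lens to drop in amounts to minimizing the residual spherical defocus that reaches the PSD. In a thin-lens, coincident-plane approximation (a deliberate simplification; the lens powers and sign convention below are illustrative, not from the patent):

```python
def pick_offset_lens(eye_defocus_d, available_lens_powers_d):
    """Select the drop-in offset lens power (diopters) that leaves the
    smallest residual spherical defocus, residual = eye defocus + lens
    power, so the image spot on the PSD stays within its sensing range."""
    return min(available_lens_powers_d, key=lambda p: abs(eye_defocus_d + p))

# E.g. an aphakic eye presenting about -12 D of divergent defocus:
# pick_offset_lens(-12.0, [0.0, 5.0, 10.0, 15.0]) selects the +10 D lens,
# leaving a -2 D residual for the PSD to measure.
```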
  • the wavefront/defocus offsetting device 178/278 can be scanned and deliberate offsetting can be applied to one or more particular aberration component(s) in a dynamic manner.
  • some lower order aberrations can be offset and information on other particular higher order wavefront aberrations can be highlighted to reveal those clinically important feature(s) of the remaining wavefront aberrations that need to be further corrected.
  • the vision correction practitioner or surgeon can fine tune the vision correction procedure and minimize the remaining wavefront aberration(s) in real time.
  • Fig. 6 shows an overall block diagram of one example embodiment of the electronics system 600 that controls and drives the sequential wavefront sensor and other associated active devices as shown in Figures 1 and 2.
  • a power module 605 converts AC power to DC power for the entire electronics system 600.
  • the wavefront data and the images/movies of the eye can be captured and/or recorded in synchronization in a stream manner.
  • The host computer & display module 610 provides back-end processing that includes synchronizing a live eye image with the wavefront measurement result, and a visible display to the user with the wavefront information overlaid on, or displayed side-by-side with, the live image of the patient eye.
  • the host computer & display module 610 can also convert the wavefront data into computer graphics which are synchronized and blended with the digital images/movies of the eye to form a composite movie and display the composite movie on the display that is synchronized to real-time activity performed during a vision correction procedure.
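One simple way to realize the synchronization described above is to timestamp both streams and pair each wavefront result with the nearest live-eye video frame. A minimal sketch; the data layout and skew tolerance are assumptions, not specified by the patent:

```python
def synchronize(frames, wavefront_results, max_skew_s):
    """Pair each wavefront measurement with the nearest-in-time video
    frame so overlaid graphics stay in step with the live image.

    frames, wavefront_results: lists of (timestamp_s, payload) tuples;
    pairs whose timestamps differ by more than max_skew_s are dropped.
    """
    pairs = []
    for t_w, wf in wavefront_results:
        t_f, frame = min(frames, key=lambda fr: abs(fr[0] - t_w))
        if abs(t_f - t_w) <= max_skew_s:
            pairs.append((frame, wf))
    return pairs
```

Since the wavefront sensor runs at kHz rates while the camera runs near 30 fps, many wavefront results map to each frame; averaging them per frame would be a natural refinement.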
  • the host computer & display module 610 also provides power and communicates with the sequential wavefront sensor module 615 through serial or parallel data link(s) 620.
  • the optics as shown in Fig. 1 and 2 reside together with some front-end electronics in the sequential wavefront sensor module 615.
  • the host computer & display module 610 and sequential wavefront sensor module 615 communicate through a USB connection 620.
  • any convenient serial, parallel, or wireless data communication protocol will work.
  • the host computer & display module 610 can also include an optional connection 625 such as Ethernet to allow downloading of wavefront, video, and other data processed or raw onto an external network (not shown in Fig. 6) for other purposes such as later data analysis or playback.
  • the display should not be limited to a single display shown as combined with the host computer.
  • the display can be a built-in heads up display, a semi-transparent micro display in the ocular path of a surgical microscope, a back-projection display that can project information to overlay on the live microscopic view as seen by a surgeon/clinician, or a number of monitors mutually linked among one another.
  • the wavefront measurement result (as well as the other measurement results such as those from the image sensor and the low coherence interferometer) can also be displayed adjacently on different display windows of the same screen or separately on different displays/monitors.
  • the present electronics system is different in that the host computer & display module 610 is configured to provide back-end processing that includes synchronizing a live eye image with the sequential wavefront measurement data and at the same time, displays the synchronized information by overlaying the wavefront information on the live eye image or displaying the wavefront information side-by-side next to the live eye image.
  • the front-end electronics (as will be discussed shortly) inside the sequential wavefront sensor module 615 operates the sequential real time ophthalmic wavefront sensor in lock-in mode, and is configured to send the front-end processed wavefront data to be synchronized with the live eye image data to the host computer and display module 610.
  • Fig. 7 shows a block diagram of one example embodiment of the front-end electronic processing system 700 that resides within the wavefront sensor module 615 shown in Fig.6.
  • a live imaging camera module 705 (such as a CCD or CMOS image sensor/camera) provides a live image of the patient eye, the data of which is sent to the host computer and display module 610 as shown in Fig. 6 so that the wavefront data can be overlaid on the live image of the patient's eye.
  • a front-end processing system 710 is electronically coupled to the SLD drive and control circuit 715 (which, in addition to pulsing the SLD, may also perform SLD beam focusing and SLD beam steering as discussed earlier).
  • the presently disclosed front-end electronic processing system has a number of features that when combined in one way or another make it different and also advantageous for real time ophthalmic wavefront measurement and display, especially during eye refractive cataract surgery.
  • the light source used for creating the wavefront from the eye is operated in pulse and/or burst mode.
  • the pulse repetition rate or frequency (typically in or above the kHz range) is higher than the frame rate of a standard two dimensional CCD/CMOS image sensor, which is typically about 25 to 30 frames per second.
  • the position sensing detector is two dimensional with a high enough temporal frequency response so that it can be operated in lock-in detection mode in synchronization with the pulsed light source at a frequency above the 1/f noise frequency range.
  • the front-end processing system 710 is at least electronically coupled to the SLD drive and control circuit 715, the wavefront scanner driving circuit 720, and the position sensing detector circuit 725.
  • the front-end electronics is configured to phase-lock the operation of the light source, the wavefront scanner, and the position sensing detector.
  • the front-end processing system 710 can also be electronically coupled to an internal fixation and LEDs driving circuit 730, and an internal calibration target positioning circuit 735.
  • the LEDs driving circuit 730 can include multiple LED drivers and be used to drive other LEDs, including indicator LEDs, flood illumination LEDs for the eye live imaging camera, as well as LEDs for triangulation based eye distance ranging.
  • the internal calibration target positioning circuit 735 can be used to activate the generation of a reference wavefront to be measured by the sequential wavefront sensor for calibration/verification purpose.
  • the front-end and back-end electronic processing systems include one or more digital processors and non-transitory computer readable memory for storing executable program code and data.
  • the various control and driving circuits 715-735 may be implemented as hard-wired circuitry, digital processing systems, or combinations thereof, as is known in the art.
  • Fig. 8 shows an example internal calibration and/or verification target
  • the internal calibration and/or verification target comprises a lens (such as an aspheric lens) 804 and a diffusely reflective or scattering material such as a piece of spectralon 806.
  • the spectralon 806 can be positioned a short distance in front of or beyond the back focal plane of the aspheric lens 804.
  • the aspheric lens 804 can be anti-reflection coated to substantially reduce any specular reflection from the lens itself.
  • When the internal calibration and/or verification target 802 is moved into the wavefront relay beam path, it is stopped by, for example, a magnetic stopper (not shown), such that the aspheric lens 804 is centered on and coaxial with the wavefront relay optical axis.
  • the SLD beam would then be intercepted by the aspheric lens with minimum specular reflection and the SLD beam would be focused, at least to some extent, by the aspheric lens to land on the spectralon 806 as a light spot.
  • the returned light from the spectralon will be in the form of a divergent cone 812 and after travelling backward through the aspheric lens, it will become a slightly divergent or convergent beam of light 814.
  • the position of the internal calibration target as shown in Figs. 1 and 2 is somewhere between the first lens 104/204 and the polarization beam splitter 174/274; therefore, a slightly divergent or convergent beam propagating backward there would be equivalent to a beam coming from a point source in front of or behind the object plane of the first lens 104/204.
  • the internal calibration and/or verification target created reference wavefront is equivalent to a convergent or divergent wavefront coming from an eye under test.
  • the actual axial position of the spectralon relative to the aspheric lens can be designed such that the reference wavefront can be made to resemble that from an aphakic eye.
  • the actual axial position of the spectralon can be designed such that the reference wavefront thus created can be made to resemble that from an emmetropic or a myopic eye.
  • instead of an aspheric lens here, a spherical lens or any other type of lens, including a cylindrical-plus-spherical lens or even a tilted spherical lens, can be used to create a reference wavefront with certain intended wavefront aberrations for calibration and/or verification.
  • the position of the spectralon relative to the aspheric lens can also be continuously varied so that the internally created wavefront can have continuously variable diopter values to enable a complete calibration of the wavefront sensor over the designed diopter measurement range.
  • the internal calibration target can simply be a bare piece of spectralon 836.
  • the requirement on the stop position of the piece of spectralon 836 can be lessened as any part of a flat spectralon surface, when moved into the wavefront relay beam path, can intercept the SLD beam to generate substantially the same reference wavefront assuming that the topographic property of the spectralon surface is substantially the same.
  • the emitted beam from the bare piece of spectralon will be a divergent beam 838.
  • the internal calibration and/or verification target comprises both a bare piece of spectralon 866 and also a structure with an aspheric lens 854 and a piece of spectralon 856, where the spectralon (866 and 856) can be a single piece.
  • the mechanism to move the internal calibration and/or verification target 852 into the wavefront relay beam path can have two stops: an intermediate stop that does not need to be very repeatable and a final magnetic stopping position that is highly repeatable.
  • the intermediate stopping position can be used to enable the bare piece of spectralon to intercept the SLD beam and the highly repeatable stopping position can be used to position the aspheric lens plus spectralon structure so that the aspheric lens is centered well and coaxial with the wavefront relay beam optical axis.
  • an optical attenuation means such as a neutral density filter and/or a polarizer, can be included in the internal calibration and/or verification target and be disposed either in front of or behind the aspheric lens to attenuate the light so that it is about the same as that from a real eye.
  • the thickness of the spectralon can be properly selected to only enable a desired amount of light to be diffusely back scattered and/or reflected and the transmitted light can be absorbed by a light absorbing material (not shown in Fig.8).
  • One embodiment of the present invention is to interface the front-end processing system 710 with the position sensing detector circuit 725 and the SLD driver and control circuit 715.
  • the position sensing detector is likely a parallel multiple-channel device in order for it to have a high enough temporal frequency response; it can be a quadrant detector/sensor, a lateral effect position sensing detector, a small parallel 2D array of photodiodes, or others.
  • the front-end processing system computes ratio-metric X and Y values based on signal amplitudes from each of the 4 channels (A, B, C and D) as will be discussed later.
  • the front-end processing system can (upon user discretion) automatically adjust SLD output and the gain of the variable gain amplifier either independently for each of the channels or together for all the channels so that the final amplified outputs of the A, B, C and D values for all sequentially sampled sub-wavefront image spots landing on the position sensing detector are optimized for optimal signal-to-noise ratio.
  • This is needed because the optical signal returned from a patient eye can vary depending on the refractive state (myopic, emmetropic and hyperopic), the surgical state (phakic, aphakic and pseudo-phakic), and degree of cataract of the eye.
  • Figs. 9A and 9B show an embodiment of an electronics block diagram that accomplishes the task of automatic SLD output and digital gain control through a servo mechanism in order to optimize the signal-to-noise ratio.
  • Fig. 10 shows an example embodiment in the form of a process flow block diagram.
  • the microprocessor 901 is coupled to a memory unit 905 that has code and data stored in it.
  • the microprocessor 901 is also coupled to the SLD 911 via a SLD driver and control circuit with digital to analog conversion 915, the MEMS scanner 921 via a MEMS scanner driving circuit with digital to analog conversion 925, and the PSD 931 via a composite transimpedance amplifier 933, an analog to digital converter 935 and a variable gain digital amplifier 937.
  • the PSD in this example is a quadrant detector with four channels that lead to four final amplified digital outputs A, B, C, and D, so correspondingly, there are four composite transimpedance amplifiers, four analog to digital converters and four variable gain digital amplifiers, although in Fig.9A only one of each is drawn.
  • the image spot on the quad-detector will then be formed away from the center (moved towards the right-upper quadrant as shown by the image spot 938).
  • the SLD is initially set to an output level as high as is allowed per the eye safety document requirement.
  • the gain of the variable gain digital amplifier 937 at this moment can be initially set at a value determined at the last session or at an intermediate value as would normally be selected.
  • the gain of the variable gain digital amplifier can be decreased as shown by step 1020 and the final outputs are checked as shown by step 1022. If all of the final outputs are within the desired range, the gain can be set as shown by step 1024 at a value slightly lower than the current value to overcome fluctuation induced signal variations that can cause the final outputs to go outside the desired range again.
  • If any of the final outputs is still above the desired signal strength range and the gain has not reached its minimum as checked at step 1026, the process of decreasing the gain per step 1020 and checking the final outputs per step 1022 can be repeated until the final outputs all fall within the range and the gain is set as shown by step 1024.
  • if the gain has reached its minimum when checked at step 1026 and one or more of the final outputs A, B, C and D is(are) still above the desired signal strength range, the gain is kept at its minimum as shown at step 1028 and the SLD output can be decreased as shown by step 1030.
  • the final outputs A, B, C and D are checked at step 1032 after the SLD output is decreased and if it is found that the final A, B, C and D outputs are within the desired range, the SLD output is then set as shown by step 1034 at a level slightly lower than the current level to overcome fluctuation induced signal variations that can cause the final outputs to go outside the desired range again.
  • the process of decreasing the SLD output as shown by step 1030 and checking the final A, B, C and D outputs as shown by step 1032 can be repeated until they reach the desired range and the SLD output is set as shown by step 1034.
  • the only exception is when the SLD output has reached zero and one or more of the final A, B, C and D outputs is(are) still above the desired range. This means that even with no SLD output there is still a strong wavefront signal, which can only happen when there is either electronic or optical interference or cross talk. The SLD output can be kept at zero as shown by step 1038 and a message sent to the end user that there is a strong interference signal so the data is invalid, as shown by step 1040.
  • the end user can also manually control the SLD output and the gain of the variable gain digital amplifier until he/she feels that the real wavefront measurement result is satisfactory.
  • Fig.10 is only one of many possible ways to achieve the same goal of improving the signal to noise ratio, so it should be considered as illustrating the concept.
  • the SLD output can be initially set at any arbitrary level and then adjusted together with amplifier gain until the final outputs A, B, C and D fall within the desired range.
  • the advantage of setting the SLD output initially to a relatively high level is that in the optics or photonics domain, the optical signal to noise ratio before any opto-electronic conversion can be maximized. However, this does not mean that other choices would not work.
  • the SLD output can even be initially set at zero and gradually increased together with the adjustment of the amplifier gain until the final A, B, C and D outputs fall within the desired range. In this case, there will be a corresponding change to the sequence and details of the process flow. These variations should be considered as within the scope and spirit of the present disclosure.
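The gain-first, SLD-second adjustment flow of Fig. 10 can be sketched as follows; the function and parameter names are illustrative and not part of the disclosed device, and `read_outputs` stands in for reading the final amplified A, B, C and D channel values:

```python
def servo_adjust(read_outputs, desired_max, gain_min, gain_init, sld_max,
                 gain_step=0.1, sld_step=0.05):
    """Sketch of the Fig. 10 servo loop (illustrative names).

    read_outputs(sld, gain) -> (A, B, C, D) final amplified outputs.
    Lower the digital gain first (step 1020), then the SLD output
    (step 1030), until all outputs fall within the desired range.
    """
    sld, gain = sld_max, gain_init
    while max(read_outputs(sld, gain)) > desired_max:
        if gain > gain_min:                 # step 1026: gain not yet at minimum
            gain = max(gain_min, gain - gain_step)   # step 1020
        elif sld > 0.0:
            sld = max(0.0, sld - sld_step)           # step 1030
        else:                               # steps 1038/1040: signal with SLD off
            return sld, gain, "invalid: interference or cross talk"
    return sld, gain, "ok"                  # steps 1024/1034: set and hold
```

Lowering the digital gain before the SLD output keeps the optical signal level, and hence the optical signal-to-noise ratio before opto-electronic conversion, as high as possible for as long as possible, matching the rationale given above.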
  • FIG. 11 shows one example embodiment of a composite transimpedance amplifier that can be used to amplify the signal from any one quadrant (for example, D1) of the four quadrant photodiodes of a quadrant detector.
  • the circuit is used in the position sensing detector circuit as shown in Fig. 9A.
  • the current-to-voltage conversion ratio is determined by the value of the feedback resistor R1 (which, for example, can be 22 megohms) and is matched by resistor R2 to balance the inputs of the op-amp U1A.
  • the shunt capacitors C1 and C2 could be either parasitic capacitance of resistors R1 and R2 or small capacitors added to the feedback loop.
  • the transimpedance amplifier's stability and high-frequency noise reduction comes from the low-pass filter formed by resistor R3, capacitor C3 and op-amp U2A inside the feedback loop 1150.
  • +Vref is some positive reference voltage between ground and +Vcc. Since the output signal (Output A) is proportional to R1, but the noise is proportional to the square root of R1 (being dominated by the Johnson noise of R1), the signal-to-noise ratio increases proportionally to the square root of R1.
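The square-root scaling claimed above can be checked numerically from the standard Johnson (thermal) noise formula v_n = sqrt(4·kB·T·R·B); the photocurrent, resistance, and bandwidth values below are generic illustrations, not the circuit's actual operating point:

```python
import math

K_B = 1.380649e-23   # Boltzmann constant, J/K


def johnson_noise_vrms(r_ohms, bandwidth_hz, temp_k=300.0):
    """RMS Johnson noise voltage of a resistor: sqrt(4 k T R B)."""
    return math.sqrt(4.0 * K_B * temp_k * r_ohms * bandwidth_hz)


def transimpedance_snr(photocurrent_a, r_ohms, bandwidth_hz, temp_k=300.0):
    """Signal grows as I*R while Johnson noise grows as sqrt(R),
    so the SNR scales as sqrt(R)."""
    return (photocurrent_a * r_ohms) / johnson_noise_vrms(r_ohms, bandwidth_hz, temp_k)
```

Quadrupling the feedback resistance (e.g. from 5.5 megohms to 22 megohms) therefore doubles the Johnson-noise-limited SNR.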
  • prior art high-bandwidth wavefront sensors generally only use standard transimpedance amplifier(s) rather than composite transimpedance amplifier(s) (see, for example, S. Abado, et al., "Two-dimensional High-Bandwidth Shack-Hartmann Wavefront Sensor: Design Guidelines and Evaluation Testing", Optical Engineering, 49(6), 064403, June 2010).
  • prior art wavefront sensors are not purely sequential but parallel in one way or another. Furthermore, they do not face the same weak but synchronized and pulsed optical signal challenge as the present sequential ophthalmic wavefront sensor faces.
  • the selected feedback resistor value of R1, which is substantially matched by resistor R2, is very high;
  • the two shunt capacitors C1 and C2 have very low capacitance values;
  • the low-pass filter formed by R3, C3 and U2A inside the feedback loop substantially improves the stability and also substantially reduces the high-frequency noise of the transimpedance amplifier;
  • the positive reference voltage +Vref is a properly scaled DC signal phase-locked to the drive signal of the SLD and the MEMS scanner, and it is between ground and +Vcc.
  • a quadrant sensor with minimal terminal capacitance is preferably selected
  • the optical signal converted to an analog current signal by the position sensing detector can also be AC coupled to and amplified by a conventional transimpedance amplifier, and then combined with a standard lock-in detection circuit to recover small signals which would otherwise be obscured by noise that can be much larger than the signal of interest.
  • Fig. 12 shows one example embodiment of such a combination.
  • the output signal from the transimpedance amplifier 1295 is mixed at a mixer 1296 with (i.e. multiplied by) the output of a phase-locked loop 1297 which is locked to the reference signal that drives and pulses the SLD.
  • the output of the mixer 1296 is passed through a low-pass filter 1298 to remove the sum frequency component of the mixed signal and the time constant of the low-pass filter is selected to reduce the equivalent noise bandwidth.
  • the low-pass filtered signal can be further amplified by another amplifier 1299 for analog to digital (A/D) conversion further down the signal path.
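The mixer, phase-locked reference, and low-pass filter chain of Fig. 12 amounts to standard lock-in detection, which can be sketched digitally as follows; this is a simplified model in which the low-pass filter is an average over an integer number of reference periods, and the names are illustrative:

```python
import math


def lock_in_amplitude(signal, fs, f_ref, phase=0.0):
    """Recover the amplitude of a component at f_ref buried in interference
    by mixing with a phase-locked reference and low-pass filtering (here a
    simple mean over an integer number of reference periods)."""
    n = len(signal)
    i_sum = q_sum = 0.0
    for k, s in enumerate(signal):
        t = k / fs
        i_sum += s * math.cos(2 * math.pi * f_ref * t + phase)  # in-phase mix
        q_sum += s * math.sin(2 * math.pi * f_ref * t + phase)  # quadrature mix
    i_avg, q_avg = 2 * i_sum / n, 2 * q_sum / n
    return math.hypot(i_avg, q_avg)   # amplitude, insensitive to signal phase
```

Using both in-phase and quadrature mixing makes the recovered amplitude independent of the unknown phase offset between the SLD drive reference and the detected signal.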
  • the wavefront scanner/shifter is an electromagnetic MEMS (Micro-Electro-Mechanical System) analog steering mirror driven by four D/A converters controlled by the microprocessor.
  • MEMS Micro-Electro-Mechanical System
  • two channels of D/A converters output sinusoids 90 degrees apart in phase, and the other two channels output X and Y DC-offset voltages to steer the center of the wavefront sampling annular ring.
  • the amplitude of the sine and cosine electronic waveforms determines the diameter of the wavefront sampling annular ring, which can be varied to accommodate various eye pupil diameters as well as to deliberately sample around one or more annular ring(s) of the wavefront with a desired diameter within the eye pupil area.
  • the aspect ratio of the X and Y amplitude can also be controlled to ensure that a circular scanning is done when the mirror reflects the wavefront beam sideways.
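The drive-waveform generation described above, two sinusoids 90 degrees apart in phase plus X and Y DC offsets, can be sketched as follows (illustrative names; the D/A conversion itself is omitted):

```python
import math


def mems_drive(n_samples, radius_x, radius_y, x_offset, y_offset, phase0=0.0):
    """Two sinusoids 90 degrees apart plus DC offsets trace an annular
    wavefront-sampling ring (circular when radius_x == radius_y); the
    radii set the ring diameter and the offsets steer its center."""
    out = []
    for k in range(n_samples):
        t = 2 * math.pi * k / n_samples + phase0
        out.append((x_offset + radius_x * math.cos(t),
                    y_offset + radius_y * math.sin(t)))
    return out
```

Setting `radius_x` and `radius_y` to different values corresponds to adjusting the X/Y amplitude aspect ratio so that the scan remains circular when the mirror reflects the wavefront beam sideways.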
  • in Fig. 13A the MEMS 1312 is oriented so that the entire wavefront is shifted downward when the SLD pulse is fired.
  • the aperture 1332 samples a portion at the top of the circular wavefront section.
  • in Fig. 13B the wavefront is shifted leftward so that the aperture samples a portion at the right of the circular wavefront section
  • in Fig. 13C the wavefront is shifted upward so that the aperture samples a portion at the bottom of the circular wavefront section
  • in Fig. 13D the wavefront is shifted rightward so that the aperture samples a portion at the left of the circular wavefront section.
  • Fig. 13E depicts the equivalence of the sequential scanning sequence of four pulses per cycle to sampling the wavefront section with four detectors arranged in a ring.
  • the number of SLD pulses does not need to be restricted to 8 and can be any number, the SLD pulses do not need to be equally spaced in time, and they do not have to be aligned with the X and Y axes of the MEMS scanner.
  • Fig. 14 shows an example in which the 8 wavefront sampling positions are shifted 15° away from those shown in Fig. 13F by slightly delaying the SLD pulses.
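The effect of delaying the SLD pulses on the angular sampling positions can be sketched as follows; the names, ring frequency, and delay values are illustrative examples only:

```python
def sample_angles(n_pulses, ring_freq_hz, pulse_delay_s=0.0):
    """Angular positions on the sampling ring for equally spaced SLD pulses.

    Delaying every pulse by pulse_delay_s rotates each sampling position by
    360 * ring_freq_hz * pulse_delay_s degrees, e.g. the 15-degree shift of
    Fig. 14 relative to Fig. 13F.
    """
    offset_deg = 360.0 * ring_freq_hz * pulse_delay_s
    return [(360.0 * k / n_pulses + offset_deg) % 360.0 for k in range(n_pulses)]
```

The same expression shows that unequally spaced pulses, or any other pulse count, simply map to a different set of angles on the ring.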
  • a quadrant detector or lateral-effect position sensing detector is used as the PSD and its X-Y axis is aligned in orientation to that of the MEMS scanner so that they have the same X and Y axis, although this is not absolutely required.
  • the ratiometric X and Y values of a sequentially sampled sub-wavefront image spot can be expressed based on the signal strength from each of the four quadrants, A, B, C, and D.
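The ratiometric expressions themselves are not reproduced in this excerpt, but for a quadrant detector they conventionally take the sum-and-difference form sketched below; the quadrant labeling (A upper-right, B upper-left, C lower-left, D lower-right) is an assumption for illustration and may differ from the patent's figure:

```python
def ratiometric_xy(a, b, c, d):
    """Ratiometric spot position from quadrant signals, normalized by the
    total so the result is insensitive to overall signal strength.

    Assumed layout: A upper-right, B upper-left, C lower-left, D lower-right.
    """
    total = a + b + c + d
    if total == 0:
        raise ValueError("no signal on the quadrant detector")
    x = ((a + d) - (b + c)) / total   # right half minus left half
    y = ((a + b) - (c + d)) / total   # upper half minus lower half
    return x, y
```

The normalization by the total signal is what makes the values "ratiometric": they estimate spot position largely independently of returned optical power.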
  • ratiometric values of X and Y do not directly give highly accurate transverse displacement or position of the centroids, because the response of, for example, a quadrant detector is also a function of gap distance, the image spot size which is dependent on a number of factors, including the local average tilt and the local divergence/convergence of the sampled sub-wavefront, as well as the sub-wavefront sampling aperture shape and size.
  • One embodiment of the present invention is to modify the relationship or equation so that the sampled sub-wavefront tilt can be more precisely determined.
  • the relationship between the ratiometric measurement result and the actual centroid displacement is theoretically and/or experimentally determined and the ratiometric expression is modified to more accurately reflect the centroid position.
  • Fig. 16 shows one example of a theoretically determined relationship between the ratiometric estimate and the actual centroid displacement or position along either the X or the Y axis.
  • A′ and B′ are constants.
  • an experimentally determined relationship in the form of a data matrix or matrices between the quadrant detector reported ratiometric result in terms of (X, Y) and the actual centroid position (X′, Y′) can be established, and a reversed relationship can be established to convert each (X, Y) data point to a new centroid (X′, Y′) data point.
  • the calibrated wavefront tilt and hence dioptric value versus the centroid data point position can be obtained.
  • a measurement can be made of a real eye and the obtained relationship can be used to determine the centroid position and hence the sampled sub-wavefront tilts from the real eye.
  • the determined centroid position or tilt of the sampled sub-wavefront can be used to determine the wavefront aberration or refractive errors of the real eye.
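The calibrate-then-measure procedure described above can be sketched as a one-dimensional lookup built from calibration pairs; this is a simplification of the full (X, Y) to (X′, Y′) matrix relationship of Fig. 16, and all names are illustrative:

```python
def make_corrector(ratiometric_pts, actual_pts):
    """Build a 1-D piecewise-linear map from the quadrant detector's
    ratiometric reading to the calibrated centroid displacement, given
    monotonically increasing calibration pairs (a sketch of the
    theoretically/experimentally determined relationship of Fig. 16)."""
    pairs = sorted(zip(ratiometric_pts, actual_pts))

    def correct(x):
        if x <= pairs[0][0]:
            return pairs[0][1]          # clamp below the calibrated range
        for (x0, y0), (x1, y1) in zip(pairs, pairs[1:]):
            if x <= x1:
                return y0 + (y1 - y0) * (x - x0) / (x1 - x0)
        return pairs[-1][1]             # clamp above the calibrated range

    return correct
```

In use, the calibration pairs would come from the internal or external calibration target, and the returned function would be applied to each sequentially sampled centroid before computing sub-wavefront tilts.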
  • first and second calibration related steps can be executed once for each built wavefront sensor system and the third and fourth steps can be repeated for as many real eye measurements as one likes. However, this does not mean that the calibration steps should be done only once. In fact it is beneficial to periodically repeat the calibration steps.
  • the calibration steps or a partial calibration can be repeated as often as the manufacturer or an end user prefers using an internal calibration target driven by the microprocessor as shown in Fig.9A.
  • an internal calibration target can be moved into the optical wavefront relay beam path temporarily every time the system is powered up or even before each real eye measurement automatically or manually as desired by the end user.
  • the internal calibration does not need to provide all the data points that a more comprehensive calibration would provide. Instead, the internal calibration target only needs to provide some data points. With these data points, one can experimentally confirm whether the optical alignment of the wavefront sensor is intact or whether any environmental factor such as a temperature change and/or mechanical impact has disturbed the optical alignment of the wavefront sensor.
  • the reference wavefront aberration measured using an internal calibration target can be used to determine the inherent optical system aberration of the wavefront sensor, and the real eye wavefront aberration can be determined by subtracting the optical-system-induced wavefront aberration from the measured overall wavefront aberration.
  • a calibration target (internal or external) can also be used to determine the initial time delay between the SLD firing pulse and the MEMS mirror scanning position, or the offset angle between the sub-wavefront sampling position and the MEMS mirror scanning position along a certain wavefront sampling annular ring.
  • the same calibration steps can also be used to determine if the SLD firing time is accurate enough with respect to the MEMS scan mirror position, and if there is any discrepancy from a certain desired accuracy, either an electronics hardware based correction or a pure software based correction can then be implemented to fine tune the SLD firing time or the MEMS scanning drive signal.
  • the ellipse would represent a divergent spherical wavefront where the degree of divergence is the same for the horizontal and vertical directions. Assuming a value of t0 with 0 < t0 < π/2, the point (U(t0), V(t0)) will be in the first quadrant of the U-V Cartesian coordinate system.
  • the sequential centroid data points expected from a convergent spherical wavefront sampled at the plane represented by the dashed line as shown in Fig. 23 will be a clockwise circle with the resulting data point position and polarity as indicated by the numbers and the arrows in Fig.23. Note the swapping of the numbered data points from the original position in Fig.22 to the opposite position in Fig.23 when the sampled wavefront changes from being divergent to being convergent.
  • One embodiment of the present disclosure is to use a calibration (internal or external) to determine the initial offset angle of the data point vector(s) relative to the Xtr or Ytr axes.
  • This algorithm enables real time high precision measurement of eye wavefront over a large dynamic range.
  • the orientation of the ellipse indicates the axis of astigmatism.
  • the magnitudes of a and b indicate the relative magnitudes of the divergent and convergent astigmatic components and the direction of rotation helps identify which component is divergent and which component is convergent. As a result, real time titration of a surgical vision correction procedure can be performed.
  • Fig. 25 shows a special case of Fig. 24, the result of coordinate rotation transformation and 8 centroid data points on the U-V coordinate, with the left side corresponding to a divergent spherical wavefront having equal positive major and minor axes, and with the right side corresponding to a convergent spherical wavefront, having equal negative major and minor axes. Note again the swapping of the numbered data points from the original position to the opposite position when the sampled wavefront changes from being divergent to being convergent.
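The polarity convention described above, a counterclockwise centroid traversal for a divergent wavefront and a clockwise traversal for a convergent one (per Figs. 22-23), can be checked with a standard signed-area computation over the sequential centroid data points; the function name and label strings are illustrative:

```python
def rotation_direction(points):
    """Signed-area (shoelace) test on sequential centroid data points:
    a positive sum means counterclockwise traversal (divergent wavefront
    in the convention of Figs. 22-23), negative means clockwise
    (convergent)."""
    s = 0.0
    n = len(points)
    for i in range(n):
        (u0, v0) = points[i]
        (u1, v1) = points[(i + 1) % n]      # wrap around to close the loop
        s += u0 * v1 - u1 * v0              # cross product of successive points
    return "divergent (ccw)" if s > 0 else "convergent (cw)"
```

Because the test uses only the ordering of the sequentially acquired centroids, it captures exactly the data-point swapping noted when the sampled wavefront changes from divergent to convergent.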
  • the MEMS scanning mirror can be operated to sample sub-wavefronts in a spiral pattern or concentric rings of varying radii, allowing the detection of higher-order aberrations.
  • Zernike decomposition can be performed to extract all the wavefront aberration coefficients, including high order aberrations such as trefoil, coma, and spherical aberration.
  • coma can be determined by detecting a lateral shift of the wavefront as the scan radius is increased or decreased. If the number of samples per annular ring is evenly divisible by 3, then trefoil can be detected when the dots form a triangular pattern that inverts when the scan radius is increased or decreased.
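One plausible way, not spelled out in the source, to quantify the triangular trefoil pattern is the third angular harmonic of the radial centroid displacements sampled evenly around one annular ring; its phase flips by 180° when the pattern inverts, matching the described inversion with scan radius. All names are illustrative:

```python
import math


def third_harmonic(radial_displacements):
    """Amplitude and phase of the 3rd angular harmonic of radial centroid
    displacements sampled evenly around one annular ring; a triangular
    (trefoil-like) pattern shows up as a strong 3rd harmonic, and its
    phase flips by 180 degrees when the pattern inverts."""
    n = len(radial_displacements)
    c = sum(r * math.cos(3 * 2 * math.pi * k / n)
            for k, r in enumerate(radial_displacements)) * 2 / n
    s = sum(r * math.sin(3 * 2 * math.pi * k / n)
            for k, r in enumerate(radial_displacements)) * 2 / n
    return math.hypot(c, s), math.atan2(s, c)
```

This presumes the number of samples per ring is evenly divisible by 3, consistent with the condition stated above for detecting trefoil.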
  • the effective spacing between any two wavefront sampling points can be controlled by controlling the SLD firing time and the drive signal amplitude of the MEMS scan mirror.
  • higher spatial precision/resolution sampling of the wavefront can also be achieved by precisely controlling the SLD firing time and also reducing the SLD pulse width as well as increasing the precision in the control of the MEMS scan mirror amplitude or position.
  • the MEMS scan mirror can be operated in closed-loop servo mode with the MEMS mirror scan angle monitor signal being fed back to the microprocessor and/or the electronics control system to control the scan angle drive signal to achieve better scan angle control precision.
  • another embodiment of the present disclosure is to use the electronics to control the SLD and the wavefront shifter/scanner to achieve either higher precision/resolution in spatial wavefront sampling or more averaging in spatial wavefront sampling.
  • Higher precision/resolution spatial wavefront sampling is desired for high order aberration measurement and more averaged spatial wavefront sampling is desired for measuring the refractive errors of the wavefront in terms of the spherical and cylindrical dioptric values and the axis of cylinder or astigmatism.
  • a wavefront from a patient eye can contain higher order aberrations in addition to sphere and cylinder refractive errors.
  • sphere and cylinder refractive errors are corrected for most vision correction procedures such as cataract refractive surgery. Therefore, averaging is desired so that the best sphere and cylinder correction dioptric values and cylinder axis angle can be found and prescribed.
  • the present disclosure is extremely suitable for such an application: by averaging and correlating the centroid trace(s) to one or more ellipse(s) over one or more annular rings, with the polarity of the major and minor axes taken into consideration when correlating the centroid data points to the ellipse(s), the resultant prescription given in terms of the sphere and cylinder dioptric values as well as the cylinder axis already includes averaging of the effect of higher order aberrations.
  • the algorithm and data processing can also tell the end user how much higher order aberration there is in the wavefront by calculating how close the correlation of the centroid data points to the ellipse(s) is.
  • step 2620 is to change or adjust the offset angle(s), which can be achieved by changing the SLD pulse delay or the initial phase of the sinusoidal and co-sinusoidal drive signals sent to the MEMS scan mirror.
  • the offset angle can be adjusted such that one of the centroid data points is aligned with the X or Y axis, in which case there is no need to further conduct the coordinate rotation transformation. This can reduce the burden on data processing.
  • the qualitative display can also be in the form of an ellipse with either the major or the minor axis length representing the sphere dioptric value, with the difference in major and minor axis length (polarity considered) representing the cylinder dioptric value, and with the ellipse orientation angle representing the cylinder axis angle.
  • the sign of the sphere and cylinder dioptric value can be shown using, for example, a different color or a different line pattern for the circle-plus-straight-line representation or for the ellipse representation.
  • One embodiment of the present disclosure is to allow user selection of an ellipse or a circle-plus-straight-line to represent the refractive errors of a patient eye.
  • the representation can also be an ellipse with its major axis proportional to one independent cylinder diopter value and its minor axis proportional to another independent and perpendicular cylinder diopter value.
  • the axis angle representing one cylinder or the other cylinder angle can be the original angle or shifted by 90°, as the cylinder axis angle can be either the major axis angle or the minor axis angle depending on whether the end user prefers a positive or negative cylinder prescription.
  • the representation can also be two orthogonal straight lines with one straight line length proportional to one independent cylinder dioptric value and the other orthogonal straight line length proportional to the other independent and perpendicular cylinder dioptric value.
  • one embodiment of the present disclosure is the overlay, on the live video image of the patient's eye, of the wavefront measurement result in a qualitative and/or quantitative way.
  • the displayed ellipse or straight-line angle can also be dependent on the orientation of the surgeon/clinician relative to the patient's eye (superior or temporal), and if temporal, which of the patient's eyes is being imaged (right or left).
  • the cylinder axis presented to a cataract surgeon is aligned with the steeper axis of the cornea so that the surgeon can conduct LRI (Limbal Relaxing Incision) based on the presented axis direction.
  • the live eye image can be processed with a pattern recognition algorithm to achieve eye registration for supine or vertical patient position and/or to determine the axis of an implanted toric IOL referenced to iris landmarks such as crypts.
  • the live image can also be used to identify particular lens (natural or artificial) registrations for alignment and/or comparison of optical signals (from, for example, wavefront and/or OLCI/OCT measurement) to physical features of the eye lens or iris.
  • the conversion from the correlated ellipse major and minor axis lengths to diopter values can be done in different ways depending on the preference of the end user. As is well known to those skilled in the art, there are three ways to represent the same refractive error prescription. The first is to represent it as two independent perpendicular cylinders, the second is to represent it as a sphere and a positive cylinder, and the third is to represent it as a sphere and a negative cylinder. In addition, the representation can be with respect to either the prescription or the actual wavefront. Our correlated ellipse directly provides the dioptric values of the two independent perpendicular cylinders.
  • one embodiment of the present disclosure is the use of both positive and negative values to represent the major and minor axes of the correlated ellipse, and the calibration approach that correlates the major and minor axis lengths, which can be either positive or negative, to the two independent perpendicular cylinder dioptric values, which can also be positive or negative.
  • optometrists, ophthalmologists, and optical engineers may represent the same wavefront at the cornea or pupil plane of a patient eye in different ways.
  • an optometrist generally prefers a prescription representation, i.e. the lens(es) to be used to cancel out the wavefront bending and make it planar or flat;
  • an ophthalmologist tends to prefer a direct representation, i.e. what the wavefront at the eye cornea plane is in terms of sphere and cylinder dioptric values and cylinder axis; while an optical engineer would generally not use dioptric values but a wavefront map that shows the 2D deviation of the real wavefront from a perfect planar or flat wavefront, or a representation using Zernike polynomial coefficients.
  • One embodiment of the present disclosure is the mutual conversion between these different representations that can be carried out by the end user as the algorithm has been built in the device to do such conversion, so it is up to the end user to select the format of the representation.
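As a concrete illustration of such conversions, the sketch below (Python; function names and conventions are our assumptions, not the patent's) combines two independent perpendicular plano-cylinders into a sphere/cylinder/axis form and transposes between positive- and negative-cylinder prescriptions:

```python
def crossed_cyls_to_sphcyl(c1, c2, axis1):
    """Combine two independent perpendicular plano-cylinders into
    sphere/cylinder/axis form, using the classical identity:
    cyl C1 x A combined with cyl C2 x (A+90) = sphere C2, cyl (C1-C2) x A.
    (Illustrative helper; names and sign conventions are assumptions.)"""
    return c2, c1 - c2, axis1 % 180.0

def transpose(sphere, cyl, axis):
    """Convert between the plus-cylinder and minus-cylinder forms of the
    same prescription: (S, C, A) <-> (S+C, -C, A+90)."""
    return sphere + cyl, -cyl, (axis + 90.0) % 180.0

# Two crossed cylinders of +2.00 D (axis 0) and +1.00 D (axis 90):
s, c, a = crossed_cyls_to_sphcyl(2.0, 1.0, 0.0)  # +1.00 sphere, +1.00 cyl x 0
print(transpose(s, c, a))                        # minus-cyl form: (2.0, -1.0, 90.0)
```

Applying `transpose` twice returns the original triple, which is a quick self-check that the two forms describe the same refractive error.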
  • the ellipse or circle-plus-straight-line correlation can be done for one frame (or set) of data points or multiple frames (or sets) of data points.
  • the obtained sphere and cylinder dioptric values as well as the cylinder axis angle can be averaged over multiple captures.
  • the averaging can be accomplished simply by adding respectively a given number of sphere and cylinder dioptric values of multiple measurements and dividing by the given number.
  • the cylinder angle can also be averaged although it can be more involved because of the wrap-around problem near 0°, as we report angles from 0° to 180°. As one approach, we use trigonometric functions to resolve this wrap-around issue.
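One standard trigonometric way to handle the 0°/180° wrap-around (a sketch of the idea, not necessarily the patent's exact algorithm) is to double each axis angle, take a vector mean on the full circle, and halve the result:

```python
import math

def average_axis_angles(angles_deg):
    """Average cylinder axis angles reported on [0, 180) degrees.
    Doubling maps the 180-degree-periodic axes onto a full circle,
    where a vector (sin/cos) mean is wrap-around safe; halving maps back."""
    s = sum(math.sin(math.radians(2.0 * a)) for a in angles_deg)
    c = sum(math.cos(math.radians(2.0 * a)) for a in angles_deg)
    return (math.degrees(math.atan2(s, c)) / 2.0) % 180.0

# 178 deg and 2 deg straddle the wrap-around: the naive arithmetic mean
# would be 90 deg, but the vector mean correctly lands at about 0/180 deg.
print(average_axis_angles([178.0, 2.0]))
```

The same function also reproduces the ordinary mean away from the wrap-around (e.g. 30° and 60° average to 45°).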
  • Fig. 27 shows an example process flow diagram of an eye tracking algorithm.
  • the steps involved include step 2705 of estimating the position of the eye pupil using either the eye pupil position information from the live eye pupil or iris image or other means such as detecting specular reflection from the cornea apex by scanning the SLD beam in two dimensions; step 2710 of adjusting the SLD beam scanner to follow the eye movement; step 2715 of offsetting the DC drive component of the wavefront scanner/shifter in proportion to the SLD beam adjustment to compensate the eye pupil movement so that the same intended portions of the wavefront from the eye are always sampled regardless of the eye movement; and as an option, step 2720 of correcting the measurement of wavefront aberration.
  • the live image camera provides a visual estimate of either (a) the center of the iris, or (b) the center of the corneal limbus.
  • the SLD can be directed to the same position on the cornea.
  • this position is slightly off the axis or apex of the cornea so that specular reflection of the SLD beam will generally not be directly returned to the position sensing detector/device of the wavefront sensor.
  • the center of the iris or the center of the limbus can be used as a reference point for directing the SLD beam.
  • a unique feature of the presently disclosed algorithm is the step of offsetting the DC drive component of the wavefront scanner/shifter in proportion to the SLD beam adjustment. This is a critical step as it can ensure that the same portions of the wavefront (such as the same annular ring of the wavefront) from the eye are sampled. Without this step, as the eye is transversely moved, different portions of the wavefront from the eye will be sampled and this can cause significant wavefront measurement errors.
  • the reason why the last step of correcting the measurement of wavefront aberration is optional is that, with the compensation provided by the wavefront scanner/shifter in proportion to the SLD beam adjustment, the consequence for the wavefront measurement is that astigmatism and/or prismatic tilt and/or other known aberration components are added to all the sampled portions of the wavefront, and these additions can be pre-determined and taken into consideration.
  • our refractive error decoding algorithm can automatically average the aberration to figure out compromised sphere and cylinder and to filter out the prismatic tilt through coordinate translation, so for refractive error measurements, there is no additional need for prismatic tilt correction.
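The Fig. 27 steps can be sketched as a single tracking iteration. The hardware-facing callables and gain constants below are hypothetical stand-ins for the device's actual interfaces:

```python
def track_eye_once(estimate_pupil_center, steer_sld_beam,
                   offset_shifter_dc, k_scanner=1.0, k_shifter=1.0):
    """One iteration of an eye-tracking loop in the spirit of Fig. 27.
    Step 2705: estimate the transverse pupil position (from the live image
    or corneal specular reflection). Step 2710: steer the SLD beam to follow.
    Step 2715: offset the wavefront scanner/shifter DC drive in proportion,
    so the same annular portion of the wavefront is sampled despite eye
    movement. Optional step 2720 (residual aberration correction) omitted."""
    x, y = estimate_pupil_center()                   # step 2705
    steer_sld_beam(k_scanner * x, k_scanner * y)     # step 2710
    offset_shifter_dc(k_shifter * x, k_shifter * y)  # step 2715
    return x, y

# Minimal dry run with stub hardware callables:
log = {}
track_eye_once(lambda: (0.2, -0.1),
               lambda x, y: log.update(sld=(x, y)),
               lambda x, y: log.update(dc=(x, y)))
print(log)  # both offsets follow the measured pupil displacement
```

The key property, per the disclosure, is that the shifter DC offset is always driven in proportion to the SLD steering so the sampled annulus stays fixed on the wavefront.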
  • Another embodiment of the present disclosure is in adaptively selecting the diameter of the wavefront sampling annular ring so that, while wavefront sampling is only performed within the eye pupil area, the slope sensitivity of the response curve as a function of the annular ring diameter can also be exploited to provide higher measurement sensitivity and/or resolution.
  • the sphere dioptric value generally requires the largest coverage range as it can vary a lot among different eyes as well as during a cataract surgery when the natural eye lens is removed (i.e. the eye is aphakic).
  • the wavefront from the eye should be close to planar as the pseudo-phakic eye should in general be close to emmetropia.
  • the wavefront from only the 3 mm diameter central area of the eye pupil is generally sampled.
  • a wavefront sensor can therefore be designed to provide enough diopter measurement resolution (e.g. 0.1D) as well as enough diopter coverage range (e.g. -30D to +30D), over an effective wavefront sampling annular ring area that covers for example, a diameter range from 1mm to 3mm.
  • in order to confirm emmetropia with higher sensitivity and/or wavefront measurement resolution, we can expand the wavefront sampling annular ring to a diameter of, for example, 5 mm near the end of a cataract refractive surgery, as long as the pupil size is large enough, to more accurately measure the wavefront or refractive errors of a pseudo-phakic eye.
  • Fig. 28 shows an embodiment flow diagram of an algorithm that can implement this concept.
  • the steps involved include the step 2805 of using the eye pupil information obtained from the live eye image to estimate the eye pupil size, the step 2810 of using the eye pupil size information to determine the maximum diameter of the wavefront sampling annular ring, and the step 2815 of increasing the annular ring diameter up to the maximum diameter as determined by step 2810 for pseudo-phakic measurement to achieve better diopter resolution.
  • This “zoom in” feature could be user-selectable or automatic.
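A minimal sketch of the Fig. 28 logic follows; the parameter names, the 0.5 mm safety margin, and the default/zoom diameters are illustrative assumptions consistent with the examples above:

```python
def sampling_ring_diameter(pupil_diameter_mm, pseudo_phakic,
                           default_mm=3.0, margin_mm=0.5, zoom_mm=5.0):
    """Choose the wavefront sampling annular ring diameter.
    Steps 2805/2810: the live-image pupil size caps the ring so sampling
    stays inside the pupil (with a small safety margin). Step 2815: once
    the eye is pseudo-phakic, 'zoom out' toward a larger ring diameter
    for better diopter resolution."""
    cap = max(0.0, pupil_diameter_mm - margin_mm)  # stay inside the pupil
    target = zoom_mm if pseudo_phakic else default_mm
    return min(target, cap)

print(sampling_ring_diameter(7.0, pseudo_phakic=True))   # -> 5.0 (full zoom)
print(sampling_ring_diameter(4.0, pseudo_phakic=True))   # -> 3.5 (pupil-limited)
```

The pupil cap always wins over the zoom target, which encodes the constraint that sampling is only performed within the eye pupil area.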
  • One feature of the present disclosure is to combine the live eye image, with or without a pattern recognition algorithm, with the wavefront measurement data, to detect the presence of eye lids/lashes, iris, facial skin, surgical tool(s), surgeon's hand, irrigation water or the moving away of the eye from the designed range. In doing so, “dark” or “bright” data can be excluded and the SLD can be smartly turned on and off to save exposure time, which can enable higher SLD power to be delivered to the eye to increase the optical or photonic signal to noise ratio.
  • Fig. 29 shows an example process flow diagram illustrating such a concept.
  • the steps involved include the step 2905 of using the live eye image and/or the wavefront sensor signal to detect the presence of an unintended object in the wavefront relay beam path or the moving away of the eye from a desired position and/or range, the step 2910 of abandoning the erroneous “bright” or “dark” wavefront data, the step 2915 of turning the SLD off when the wavefront data is erroneous, and an optional step 2920 of informing the end user that the wavefront data is erroneous or invalid.
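The Fig. 29 decision logic can be sketched as a simple gate; the intensity thresholds and flag names below are assumptions for illustration, not values from the disclosure:

```python
def gate_sld(mean_intensity, dark_thresh, bright_thresh,
             eye_in_range, obstruction_detected):
    """Decide whether the current wavefront data is valid and whether the
    SLD should stay on. 'Dark' or 'bright' frames, an occluding object, or
    an out-of-range eye invalidate the data (step 2910); the SLD is then
    turned off (step 2915) to save exposure budget, and the user can be
    informed (optional step 2920)."""
    valid = (dark_thresh <= mean_intensity <= bright_thresh
             and eye_in_range and not obstruction_detected)
    return {"keep_data": valid, "sld_on": valid, "notify_user": not valid}

print(gate_sld(120, 20, 240, eye_in_range=True, obstruction_detected=False))
```

Gating the SLD this way is what frees up exposure budget for higher SLD power during the frames that are actually kept.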
  • the SLD beam spot size and/or shape on the retina can also be monitored using, for example, the same live eye image sensor by adjusting its focus or a different image sensor solely dedicated to monitoring the SLD beam spot on the retina of an eye.
  • the static or scanned pattern of the SLD spot on the retina can be controlled.
  • Still another embodiment of the present disclosure is to include a laser as a surgery light source whose beam can be combined with the SLD beam, launched through the same optical fiber or combined via a free-space light beam combiner, and scanned with the same SLD beam scanner or a different scanner, for performing refractive correction of the eye such as LRI (limbal relaxing incision).
  • the same laser or a different laser can also be used to “mark” the eye or “guide” the surgeon, i.e. to “overlay” marks on the eye so that the surgeon can see the laser mark(s) through the surgical microscope.
  • the combined information can also be used to detect the aphakic state of the eye and to calculate the IOL prescription needed for target refraction in real time in the operating room (OR) either on demand or right before the IOL is implanted, and/or to confirm the refraction, and/or to find out the effective lens position right after the IOL is implanted.
  • the combined information can also be used to determine the alignment of the patient head, i.e. to determine if the eye of the patient is normal to the optical axis of the wavefront sensor module.
  • the combined information can also be used to perform dry eye detection and to inform the surgeon when to irrigate the eye.
  • the image spot can also be made to always land at or near the center of the quadrant detector.
  • the drive signal for the wavefront compensating or defocus offsetting device, the wavefront shifter and the sub-wavefront focusing lens can be used to precisely determine the wavefront tilt of each sampled sub-wavefront.
  • the presently disclosed apparatus can accomplish a large number of additional tasks depending on the configuration of the host computer that processes the wavefront data, the eye image data, the eye distance data, the low coherence interferometer data, etc.
  • the host computer can be configured to analyze the wavefront data to obtain metrics such as refractive errors, to display the metrics qualitatively and/or quantitatively on the display, and to allow the surgeon/clinician to select the manner in which the qualitative and/or quantitative metrics are to be displayed.
  • the end user can opt for display of wavefront aberration versus refraction versus prescription, and/or positive cylinder versus negative cylinder, and/or end point indicator(s) such as emmetropia.
  • the host computer can also be configured to allow the surgeon/clinician to flip or rotate the live patient eye image/movie to a preferred orientation.
  • the surgeon/clinician can also rewind and replay desired recorded segments of a composite movie that may include the eye image, the wavefront measurement result and even the low coherence interferometry measurement results, on demand during or after the surgery.
  • the present disclosure can guide a surgeon to titrate the vision correction procedure in real time to optimize the vision correction procedure outcome. For example, it can guide a surgeon in adjusting the IOL position in the eye in terms of centration, tilt and circumferential angular orientation positioning until the measurement confirms optimal placement of the IOL. Moreover, it can guide a surgeon in rotating an implanted toric intraocular lens (IOL) to correct/neutralize astigmatism. It can also guide a surgeon in conducting limbal/corneal relaxing incision or intrastromal lenticule laser (Flexi) to titrate and thus neutralize astigmatism.
  • a real time guide can be provided on how a vision correction procedure should proceed in order to facilitate removal of remaining aberration(s), confirm the results, and document the value and sense of the aberrations.
  • the real time information displayed can also be digitally “zoomed out” or “zoomed in” automatically or manually to alert a surgeon or vision correction practitioner that the correction procedure is going in the wrong or right direction.
  • the displayed information can turn into a highlighted form in terms of, for example, font size, boldness, style or color, to confirm intra-operatively that a refractive endpoint goal for a patient such as emmetropia has been reached.
  • audio feedback can also be used solely or in combination with video feedback.
  • audio information can be provided with or without video/graphic information to indicate which direction to move an IOL for proper alignment or which direction to rotate a toric lens to correct/neutralize astigmatism.
  • a real-time audio signal can be generated to indicate the type of refractive error, magnitude of error, and change in error.
  • the pitch, tone and loudness of the real-time audio signal can be varied to indicate improvement or worsening of applied corrections during the vision correction procedure.
  • a specific pitch of the real-time audio signal can be created to identify the error as, for example, cylinder with a tone that indicates the magnitude of the cylinder error.
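One way such an audio mapping could look (every constant here is illustrative, not from the disclosure): a base pitch identifies the error type, and the pitch rises with the error magnitude, so a falling tone signals convergence toward emmetropia:

```python
def error_to_tone_hz(error_kind, magnitude_diopters):
    """Map a refractive error to an audio frequency: the base pitch encodes
    the error type (sphere vs cylinder) and the pitch scales with the error
    magnitude, clamped at 10 D. All constants are illustrative assumptions."""
    base_hz = {"sphere": 330.0, "cylinder": 440.0}.get(error_kind, 262.0)
    scale = 1.0 + min(abs(magnitude_diopters), 10.0) / 10.0
    return base_hz * scale

print(error_to_tone_hz("cylinder", 0.0))  # -> 440.0 (on target)
print(error_to_tone_hz("sphere", 10.0))   # -> 660.0 (large residual sphere)
```

A real implementation could additionally modulate loudness or repetition rate to encode the direction of change, as the surrounding bullets suggest.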
  • One very important application of the present disclosure is in helping a cataract surgeon in determining, at the aphakic state of a patient's eye, if the pre-surgery selected IOL power is correct or not.
  • the real time aphakic wavefront measurement (preferably together with eye biometry measurements such as those provided by a built-in low coherence interferometer) can more accurately determine the IOL power needed and thus confirm whether the pre-surgically selected IOL power is correct, especially for patients who have previously undergone corneal refractive procedures, for whom the pre-surgery IOL selection formulas do not deliver consistent results.
  • Another important application of the present disclosure is in monitoring and recording of the changes in the cornea shape and other eye biometric/anatomic parameters during the whole session of a cataract surgery while the wavefront from the patient eye is measured.
  • the changes can be measured before, during, and after a cataract surgery in the OR (operating room) and can include corneal topography and thickness (as measured with keratometry and pachymetry), anterior chamber depth, and lens position and thickness, as a result of various factors that can cause a change in the wavefront from the patient eye.
  • the right-before-surgery measurement can be done in the OR when the patient is in the supine position, before and after topical anesthesia is applied, and before and after an eye lid speculum is engaged to keep the eye lids open.
  • the during-surgery measurements can be done in the OR after incision(s) is(are) made in the cornea, after the cataract lens is removed and the anterior chamber is filled with a certain gel (OVD, Ophthalmic Viscosurgical Device) before an artificial intraocular lens is implanted, and after an IOL is implanted but before the incision wound is sealed.
  • the right-after-surgery measurement can be done in the OR as well when the patient is still in the supine position right after the surgeon has sealed the incision/wound but before the eye lid speculum is removed, and after the eye lid speculum is removed.
  • the data thus obtained on the changes in the cornea shape and other eye biometric/anatomic parameters can be combined with the ocular wavefront measurement data and saved in a database. Another round of measurements can be done after the incision(s)/wound has/have completely healed, weeks or months after the surgery, and the difference or change in the ocular wavefront, the cornea shape and/or the eye biometry parameters can also be collected.
  • a nominal database can therefore be established and processed to figure out the target refraction right after a cataract surgery that needs to be set in order to result in a final desired vision correction outcome after the wound has completely healed. In this way, all the effects, including even surgeon-induced aberrations such as astigmatism resulting, for example, from a particular personalized cornea incision habit, would have been taken into consideration.
  • the presently disclosed wavefront sensor can be combined with a variety of other ophthalmic instruments for a wide range of applications.
  • it can be integrated with a femtosecond laser or an excimer laser for LASIK, for eye lens fracturing, for alignment and/or guidance of an “incision”, or for closed-loop ablation of eye tissues.
  • the live eye image, OLCI/OCT data, and the wavefront data can be combined to indicate if optical bubble(s) is/are present in the eye lens or anterior chamber before, during and after an eye surgical operation.
  • the wavefront sensor can also be integrated with or adapted to a slit lamp bio-microscope.
  • the present invention can also be integrated or combined with an adaptive optics system.
  • a deformable mirror or LC (liquid crystal) based transmissive wavefront compensator can be used to do real time wavefront manipulation to compensate some or all of the wavefront errors partially or fully.
  • the presently disclosed wavefront sensor can also be combined with any other type of intra-ocular pressure (IOP) measurement means.
  • it can even be directly used to detect IOP by measuring the eye wavefront change as a function of a patient's heart beat. It can also be directly used for calibrating the IOP.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Ophthalmology & Optometry (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physics & Mathematics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Optics & Photonics (AREA)
  • Vascular Medicine (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Medical Informatics (AREA)
  • Eye Examination Apparatus (AREA)
  • Human Computer Interaction (AREA)
  • Radiology & Medical Imaging (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Microscopes, Condensers (AREA)
  • Measurement Of Optical Distance (AREA)
  • Optical Radar Systems And Details Thereof (AREA)

Abstract

The present invention relates to a wavefront sensor comprising a wavefront scanning module (615) configured to output wavefront tilt measurements of a wavefront beam returned from a subject's eye, a biometric/anatomic measurement device (197) configured to output biometric/anatomic measurements of the subject's eye, and a processing system, coupled to the scanning module (615) and to the biometric/anatomic measurement device (197), configured to process the biometric/anatomic measurements output during a surgical procedure to determine ocular status information and to simultaneously output the ocular status information and wavefront tilt information during the surgical procedure.
EP13792222.5A 2012-11-07 2013-11-06 Apparatus and method for operating a real time large diopter range sequential wavefront sensor Withdrawn EP2903500A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201261723531P 2012-11-07 2012-11-07
PCT/US2013/068676 WO2014074573A1 (fr) 2012-11-07 2013-11-06 Apparatus and method for operating a real time large diopter range sequential wavefront sensor

Publications (1)

Publication Number Publication Date
EP2903500A1 true EP2903500A1 (fr) 2015-08-12

Family

ID=49585675

Family Applications (7)

Application Number Title Priority Date Filing Date
EP13792227.4A Withdrawn EP2903502A1 (fr) 2012-11-07 2013-11-06 Apparatus and method for operating a real time large diopter range sequential wavefront sensor
EP13792229.0A Withdrawn EP2903503A1 (fr) 2012-11-07 2013-11-06 Apparatus and method for operating a real time large diopter range sequential wavefront sensor
EP13792226.6A Withdrawn EP2916713A1 (fr) 2012-11-07 2013-11-06 Apparatus and method for operating a real time large diopter range sequential wavefront sensor
EP13792225.8A Withdrawn EP2903501A1 (fr) 2012-11-07 2013-11-06 Apparatus and method for operating a real time large diopter range sequential wavefront sensor
EP13795096.0A Withdrawn EP2903504A1 (fr) 2012-11-07 2013-11-06 Apparatus and method for operating a real time large diopter range sequential wavefront sensor
EP13792222.5A Withdrawn EP2903500A1 (fr) 2012-11-07 2013-11-06 Apparatus and method for operating a real time large diopter range sequential wavefront sensor
EP13792221.7A Withdrawn EP2903499A1 (fr) 2012-11-07 2013-11-06 Apparatus and method for operating a real time large diopter range sequential wavefront sensor

Family Applications Before (5)

Application Number Title Priority Date Filing Date
EP13792227.4A Withdrawn EP2903502A1 (fr) 2012-11-07 2013-11-06 Appareil et procédé pour le fonctionnement d'un capteur de fronts d'onde séquentiel en temps réel à large plage de dioptries
EP13792229.0A Withdrawn EP2903503A1 (fr) 2012-11-07 2013-11-06 Appareil et procédé pour le fonctionnement d'un capteur de fronts d'onde séquentiel en temps réel à large plage de dioptries
EP13792226.6A Withdrawn EP2916713A1 (fr) 2012-11-07 2013-11-06 Appareil et procédé pour le fonctionnement d'un capteur de fronts d'onde séquentiel en temps réel à large plage de dioptries
EP13792225.8A Withdrawn EP2903501A1 (fr) 2012-11-07 2013-11-06 Appareil et procédé pour le fonctionnement d'un capteur de fronts d'onde séquentiel en temps réel à large plage de dioptries
EP13795096.0A Withdrawn EP2903504A1 (fr) 2012-11-07 2013-11-06 Appareil et procédé pour le fonctionnement d'un capteur de fronts d'onde séquentiel en temps réel à large plage de dioptries

Family Applications After (1)

Application Number Title Priority Date Filing Date
EP13792221.7A Withdrawn EP2903499A1 (fr) 2012-11-07 2013-11-06 Apparatus and method for operating a real time large diopter range sequential wavefront sensor

Country Status (9)

Country Link
EP (7) EP2903502A1 (fr)
JP (7) JP2016501044A (fr)
KR (7) KR20150083902A (fr)
CN (7) CN104883955B (fr)
AU (8) AU2013341263B2 (fr)
CA (7) CA2890629A1 (fr)
RU (7) RU2015121708A (fr)
TW (7) TWI563964B (fr)
WO (7) WO2014074572A1 (fr)


Family Cites Families (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0618363A (ja) * 1992-06-30 1994-01-25 Canon Inc レンズメータ
US5651600A (en) * 1992-09-28 1997-07-29 The Boeing Company Method for controlling projection of optical layup template utilizing cooperative targets
US5345281A (en) * 1992-12-17 1994-09-06 John Taboada Eye tracking system and method
US5457310A (en) * 1993-10-20 1995-10-10 Varo Inc. Method and system for automatically correcting boresight errors in a laser beam guidance system
UA46833C2 (uk) * 1998-10-07 2002-06-17 Інститут Біомедичної Техніки Та Технологій Академії Технологічних Наук України Вимірювач абераційної рефракції ока
PT1210003E (pt) * 2000-05-08 2004-11-30 Alcon Inc Medicao objectiva e correccao de sistemas opticos que utilizam a analise da frente de onda
US6460997B1 (en) * 2000-05-08 2002-10-08 Alcon Universal Ltd. Apparatus and method for objective measurements of optical systems using wavefront analysis
US6616279B1 (en) * 2000-10-02 2003-09-09 Johnson & Johnson Vision Care, Inc. Method and apparatus for measuring wavefront aberrations
US6694169B2 (en) * 2001-02-22 2004-02-17 Minrad Inc. Targeting system and method of targeting
FR2823968B1 (fr) * 2001-04-27 2005-01-14 Ge Med Sys Global Tech Co Llc Procede d'etalonnage d'un systeme d'imagerie, support de memoire et dispositif associe
AU2002353960A1 (en) * 2001-11-09 2003-05-26 Wavefront Sciences, Inc. System and method for perfoming optical corrective procedure with real-time feedback
US6637884B2 (en) * 2001-12-14 2003-10-28 Bausch & Lomb Incorporated Aberrometer calibration
AU2003210974A1 (en) * 2002-02-11 2003-09-04 Visx, Inc. Method and device for calibrating an optical wavefront system
US7248374B2 (en) * 2002-02-22 2007-07-24 Faro Laser Trackers Llc Spherically mounted light source with angle measuring device, tracking system, and method for determining coordinates
US7284862B1 (en) * 2003-11-13 2007-10-23 Md Lasers & Instruments, Inc. Ophthalmic adaptive-optics device with a fast eye tracker and a slow deformable mirror
US20050122473A1 (en) * 2003-11-24 2005-06-09 Curatu Eugene O. Method and apparatus for aberroscope calibration and discrete compensation
US20060126018A1 (en) * 2004-12-10 2006-06-15 Junzhong Liang Methods and apparatus for wavefront sensing of human eyes
EP1894072B1 (fr) * 2005-05-31 2018-11-21 BorgWarner, Inc. Procédé de commande d'actionneur
US8820929B2 (en) * 2006-01-20 2014-09-02 Clarity Medical Systems, Inc. Real-time measurement/display/record/playback of wavefront data for use in vision correction procedures
US7445335B2 (en) * 2006-01-20 2008-11-04 Clarity Medical Systems, Inc. Sequential wavefront sensor
US8777413B2 (en) * 2006-01-20 2014-07-15 Clarity Medical Systems, Inc. Ophthalmic wavefront sensor operating in parallel sampling and lock-in detection mode
US8100530B2 (en) * 2006-01-20 2012-01-24 Clarity Medical Systems, Inc. Optimizing vision correction procedures
US8356900B2 (en) 2006-01-20 2013-01-22 Clarity Medical Systems, Inc. Large diopter range real time sequential wavefront sensor
US7758189B2 (en) * 2006-04-24 2010-07-20 Physical Sciences, Inc. Stabilized retinal imaging with adaptive optics
US7665844B2 (en) * 2006-10-18 2010-02-23 Lawrence Livermore National Security Llc High-resolution adaptive optics scanning laser ophthalmoscope with multiple deformable mirrors
GB2450075A (en) * 2007-03-08 2008-12-17 Selex Sensors & Airborne Sys Tracking device for guiding a flight vehicle towards a target
US20090149840A1 (en) * 2007-09-06 2009-06-11 Kurtz Ronald M Photodisruptive Treatment of Crystalline Lens
US7654672B2 (en) 2007-10-31 2010-02-02 Abbott Medical Optics Inc. Systems and software for wavefront data processing, vision correction, and other applications
DE102008014294A1 (de) * 2008-03-14 2009-09-17 Bausch & Lomb Inc. Fast algorithm for wavefront data stream
DE102008047400B9 (de) * 2008-09-16 2011-01-05 Carl Zeiss Surgical Gmbh Eye surgery measurement system
US8459795B2 (en) * 2008-09-16 2013-06-11 Carl Zeiss Meditec Ag Measuring system for ophthalmic surgery
FR2952784B1 (fr) * 2009-11-16 2012-03-23 Alcatel Lucent Method and system for saving energy in a mobile terminal
US9492322B2 (en) * 2009-11-16 2016-11-15 Alcon Lensx, Inc. Imaging surgical target tissue by nonlinear scanning
PL2563206T3 (pl) * 2010-04-29 2018-12-31 Massachusetts Institute Of Technology Method and apparatus for motion correction and image enhancement in optical coherence tomography

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See references of WO2014074573A1 *

Also Published As

Publication number Publication date
EP2916713A1 (fr) 2015-09-16
JP2016504062A (ja) 2016-02-12
TW201422201A (zh) 2014-06-16
AU2013341264A1 (en) 2015-05-14
CA2890629A1 (fr) 2014-05-15
RU2015121708A (ru) 2016-12-27
TW201422199A (zh) 2014-06-16
AU2013341230A1 (en) 2015-05-14
EP2903501A1 (fr) 2015-08-12
RU2015121412A (ru) 2016-12-27
AU2013341264B2 (en) 2015-12-03
CA2890623C (fr) 2017-01-10
AU2013341243B2 (en) 2015-09-17
CN104883954A (zh) 2015-09-02
EP2903502A1 (fr) 2015-08-12
JP5996120B2 (ja) 2016-09-21
TWI520712B (zh) 2016-02-11
RU2015121705A (ru) 2017-01-10
CN104883955B (zh) 2016-12-28
KR20150083903A (ko) 2015-07-20
JP2016504061A (ja) 2016-02-12
KR20150082566A (ko) 2015-07-15
CA2890616C (fr) 2017-05-02
JP2016504927A (ja) 2016-02-18
TW201422198A (zh) 2014-06-16
CA2890608A1 (fr) 2014-05-15
WO2014074598A1 (fr) 2014-05-15
RU2015121415A (ru) 2016-12-27
CN104883955A (zh) 2015-09-02
AU2013341289B2 (en) 2015-09-17
CA2890634A1 (fr) 2014-05-15
AU2013341263A8 (en) 2015-06-18
TWI520711B (zh) 2016-02-11
CN104883956B (zh) 2016-12-07
CN104883958A (zh) 2015-09-02
CN104883959B (zh) 2016-11-09
AU2013341263A1 (en) 2015-05-14
WO2014074623A1 (fr) 2014-05-15
JP2016502425A (ja) 2016-01-28
AU2013341281A1 (en) 2015-05-14
AU2013341230B2 (en) 2016-04-28
CA2890634C (fr) 2017-05-02
WO2014074595A1 (fr) 2014-05-15
CN106539555A (zh) 2017-03-29
WO2014074573A1 (fr) 2014-05-15
WO2014074572A1 (fr) 2014-05-15
RU2015121346A (ru) 2016-12-27
CA2890646A1 (fr) 2014-05-15
KR20150082567A (ko) 2015-07-15
CA2890651A1 (fr) 2014-05-15
TW201422196A (zh) 2014-06-16
AU2013341243A1 (en) 2015-05-14
AU2013341286B2 (en) 2016-04-28
CA2890651C (fr) 2017-01-03
EP2903499A1 (fr) 2015-08-12
CN104883957A (zh) 2015-09-02
TWI520710B (zh) 2016-02-11
KR20150084914A (ko) 2015-07-22
RU2015121378A (ru) 2016-12-27
CN104883959A (zh) 2015-09-02
CA2890608C (fr) 2017-01-10
JP2016505287A (ja) 2016-02-25
WO2014074590A1 (fr) 2014-05-15
RU2015121427A (ru) 2016-12-27
TW201422200A (zh) 2014-06-16
TWI520713B (zh) 2016-02-11
KR20150083902A (ko) 2015-07-20
TW201422202A (zh) 2014-06-16
AU2013341289A1 (en) 2015-05-14
EP2903504A1 (fr) 2015-08-12
CA2890646C (fr) 2016-04-19
TWI599342B (zh) 2017-09-21
CA2890616A1 (fr) 2014-05-15
CN104883956A (zh) 2015-09-02
KR20150084916A (ko) 2015-07-22
CA2890623A1 (fr) 2014-05-15
CN104883954B (zh) 2016-11-09
CN104883960A (zh) 2015-09-02
JP2016501044A (ja) 2016-01-18
AU2013341263B2 (en) 2016-03-10
TWI563964B (zh) 2017-01-01
AU2013341286A1 (en) 2015-05-14
WO2014074636A1 (fr) 2014-05-15
EP2903503A1 (fr) 2015-08-12
TWI538656B (zh) 2016-06-21
JP2016501045A (ja) 2016-01-18
CN104883958B (zh) 2017-08-25
KR20150084915A (ko) 2015-07-22
AU2013341281B2 (en) 2016-03-10
AU2016208287A1 (en) 2016-08-11
TW201422197A (zh) 2014-06-16

Similar Documents

Publication Publication Date Title
US9585553B2 (en) Apparatus and method for operating a real time large diopter range sequential wavefront sensor
CA2890616C (fr) Apparatus and method for operating a real time large diopter range sequential wavefront sensor
US9101292B2 (en) Apparatus and method for operating a real time large diopter range sequential wavefront sensor

Legal Events

Date Code Title Description
PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

17P Request for examination filed

Effective date: 20150507

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

AX Request for extension of the european patent

Extension state: BA ME

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE APPLICATION IS DEEMED TO BE WITHDRAWN

18D Application deemed to be withdrawn

Effective date: 20180602