WO2015003086A1 - Apparatus and method of determining an eye prescription

Apparatus and method of determining an eye prescription

Info

Publication number
WO2015003086A1
WO2015003086A1 (PCT/US2014/045305)
Authority
WO
WIPO (PCT)
Prior art keywords
eye
data sets
wavefront
optical
optical apparatus
Application number
PCT/US2014/045305
Other languages
French (fr)
Inventor
Nicholas James DURR
Eduardo Lage NEGRO
Shivang R. DAVE
Original Assignee
Massachusetts Institute Of Technology
Application filed by Massachusetts Institute Of Technology filed Critical Massachusetts Institute Of Technology
Priority to US14/900,695 (US9854965B2)
Priority to CN201480046065.8A (CN105473056B)
Priority to EP18175052.2A (EP3387985B1)
Priority to EP14820078.5A (EP3016576A4)
Priority to KR1020167002229A (KR101995878B1)
Priority to JP2016524353A (JP6470746B2)
Publication of WO2015003086A1
Priority to US15/858,415 (US10349830B2)

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/103 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining refraction, e.g. refractometers, skiascopes
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0016 - Operational features thereof
    • A61B3/0025 - Operational features thereof characterised by electronic signal processing, e.g. eye models
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0016 - Operational features thereof
    • A61B3/0041 - Operational features thereof characterised by display arrangements
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0091 - Fixation targets for viewing direction
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/1015 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for wavefront analysis
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 - Details of notification to user or communication with user or patient; user input means
    • A61B5/7405 - Details of notification to user or communication with user or patient; user input means using sound
    • A61B5/7415 - Sound rendering of measured values, e.g. by pitch or volume variation
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/74 - Details of notification to user or communication with user or patient; user input means
    • A61B5/7455 - Details of notification to user or communication with user or patient; user input means characterised by tactile indication, e.g. vibration or electrical stimulation
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61F - FILTERS IMPLANTABLE INTO BLOOD VESSELS; PROSTHESES; DEVICES PROVIDING PATENCY TO, OR PREVENTING COLLAPSING OF, TUBULAR STRUCTURES OF THE BODY, e.g. STENTS; ORTHOPAEDIC, NURSING OR CONTRACEPTIVE DEVICES; FOMENTATION; TREATMENT OR PROTECTION OF EYES OR EARS; BANDAGES, DRESSINGS OR ABSORBENT PADS; FIRST-AID KITS
    • A61F9/00 - Methods or devices for treatment of the eyes; Devices for putting-in contact lenses; Devices to correct squinting; Apparatus to guide the blind; Protective devices for the eyes, carried on the body or in the hand
    • A61F9/007 - Methods or devices for eye surgery
    • A61F9/008 - Methods or devices for eye surgery using laser
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B2560/00 - Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B2560/04 - Constructional details of apparatus
    • A61B2560/0406 - Constructional details of apparatus specially shaped apparatus housings
    • A61B2560/0425 - Ergonomically shaped housings
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14 - Arrangements specially adapted for eye photography
    • A61B3/145 - Arrangements specially adapted for eye photography by video means
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/14 - Arrangements specially adapted for eye photography
    • A61B3/15 - Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection; with means for relaxing
    • A61B3/152 - Arrangements specially adapted for eye photography with means for aligning, spacing or blocking spurious reflection; with means for relaxing for aligning
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/18 - Arrangement of plural eye-testing or -examining apparatus

Definitions

  • the invention generally relates to optical or ophthalmologic methods and apparatus and, more particularly, the invention relates to methods and devices for facilitating the process of determining optical properties of an eye.
  • Refractive errors are low-order aberrations of an eye, such as a human eye.
  • “refractive prescription” is a prescription for corrective lenses (eyeglasses) that correct refractive errors. As described in more detail herein, eyes may also or instead suffer from higher-order aberrations.
  • While widely used in the United States and Europe for many years, autorefractors have a number of drawbacks. For example, autorefractors typically are quite expensive, often costing more than ten thousand dollars. In addition, autorefractors generally are large and immobile, and they require extensive assistance by an ophthalmologist, optometrist or her trained staff. Accordingly, for these and other related reasons, autorefractors are used much less frequently in low-resource settings, such as parts of Africa, Asia and even rural portions of the United States. Wavefront aberrometers are a complex and expensive type of autorefractor. Wavefront aberrometers are also used to guide laser surgery, such as for cataracts and vision correction.
  • Prescriptions may be expressed in optometric notation or in power-vector notation; the two notations are equivalent, as illustrated in the sketch below.
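As a concrete illustration of that equivalence, the sketch below converts a conventional sphero-cylindrical prescription (sphere S, cylinder C, axis) into power-vector components (M, J0, J45). These are standard optometric relations, not code or formulas quoted from the patent; the function name and example values are illustrative.

```python
import math

def sphero_cyl_to_power_vector(sphere_d, cyl_d, axis_deg):
    """Convert a sphero-cylindrical prescription to power-vector form.

    Standard relations: M = S + C/2, J0 = -(C/2)*cos(2*axis),
    J45 = -(C/2)*sin(2*axis); all powers in diopters."""
    axis = math.radians(axis_deg)
    m = sphere_d + cyl_d / 2.0                   # spherical equivalent
    j0 = -(cyl_d / 2.0) * math.cos(2.0 * axis)   # 0/90-degree astigmatism component
    j45 = -(cyl_d / 2.0) * math.sin(2.0 * axis)  # oblique (45/135-degree) component
    return m, j0, j45

# Example: -2.00 DS / -1.00 DC x 180 gives M = -2.50 D, J0 = +0.50 D, J45 = 0.00 D
print(sphero_cyl_to_power_vector(-2.00, -1.00, 180))
```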
  • An embodiment of the present invention provides a method of determining an optical property of an eye of a living being.
  • the method includes providing an optical apparatus that has a proximal port and a distal port.
  • the proximal port and the distal port together form a visual channel.
  • the eye is aligned with the proximal port.
  • Target indicia are produced at effective infinity.
  • the target indicia are viewable through the visual channel.
  • the eye is focused on the target indicia.
  • Accommodation of the eye is determined, as the eye views the target indicia.
  • An optical property for the eye is calculated, as a function of the determined accommodation.
  • Data may be gathered relating to accommodation of the eye.
  • Calculating the optical property may include using the data relating to accommodation to identify when the eye is accommodating and when the eye is not accommodating.
  • Calculating the optical property may include selecting data relating to when the eye is not accommodating to calculate the optical property.
  • Calculating the optical property may include discarding the data relating to when the eye is accommodating.
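The selection of non-accommodating data described in the preceding paragraphs can be illustrated with a minimal sketch. It assumes each sequential wavefront image has already been reduced to a time stamp and a spherical-equivalent value; the heuristic, threshold and function names are hypothetical and are not the patent's algorithm.

```python
def select_unaccommodated(measurements, window_d=0.25):
    """Keep measurements taken while the eye is (approximately) not accommodating.

    `measurements` is a list of (time_s, spherical_equivalent_d) tuples from
    sequential wavefront images.  Accommodation makes the eye more myopic
    (more negative spherical equivalent), so frames whose spherical equivalent
    lies within `window_d` diopters of the least-myopic value observed are
    treated as unaccommodated; the rest are discarded."""
    if not measurements:
        return []
    least_myopic = max(m for _, m in measurements)   # most positive value seen
    return [(t, m) for t, m in measurements if m >= least_myopic - window_d]

def prescription_from_unaccommodated(measurements):
    """Average the retained (unaccommodated) frames to estimate a distance prescription."""
    kept = select_unaccommodated(measurements)
    return sum(m for _, m in kept) / len(kept) if kept else None
```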
  • the method may also include generating a target light beam by a target light source coupled to the apparatus and producing the target indicia with the target light beam.
  • Determining accommodation may include obtaining a plurality of sequential images of a light wavefront from the eye, as the eye focuses on the target indicia.
  • Calculating the optical property may include calculating the optical property as a function of timing of the sequential images.
  • Determining accommodation may include tracking changes in the optical aberrations of the eye using measurements from a plurality of sequential images of a light wavefront from the eye, as the eye focuses on the target indicia.
  • Determining accommodation of the eye may include filtering one or more images from the plurality of sequential images.
  • the filtering may be based on physiological parameters of the eye, including a rate of change in accommodation of the eye.
  • Calculating the optical property for the eye may include using a wavefront aberrometer to calculate the optical property.
  • Focusing the eye on the target indicia may include focusing the eye on the target indicia while the target indicia are at least about 10 feet from the apparatus.
  • Calculating the optical property may include calculating a prescription for the eye.
  • Calculating the optical property may include calculating eyeglass prescriptions for distant and near vision.
  • Another embodiment of the present invention provides an optical apparatus that includes a proximal port and a distal port that together form a visual channel.
  • a target light source is configured to produce target indicia at effective infinity.
  • the target indicia are viewable through the visual channel.
  • Determining logic is configured to determine accommodation of an eye, as the eye views the target indicia.
  • the apparatus may also include a body forming the proximal and distal ports.
  • the body may further contain the determining logic.
  • the determining logic may be configured to calculate a prescription for the eye, as a function of the determined accommodation of the eye.
  • the optical apparatus may further include a wavefront image sensor operatively coupled with the determining logic.
  • the image sensor may be configured to capture a plurality of sequential images of wavefronts, as the eye focuses on the target indicia.
  • the logic for determining accommodation may be configured to calculate the prescription, as a function of timing of the sequential images.
  • the determining logic may use as input a spherical prescription for the eye, as a function of the timing of the sequential images.
  • the determining logic may use as input a spherical equivalent (M) prescription for the eye, as a function of the timing of the sequential images.
  • a filter may be operably coupled with the determining logic. The filter may be configured to filter one or more images from the plurality of sequential images.
  • Yet another embodiment of the present invention provides an optical apparatus that includes a proximal port configured to receive an eye.
  • An array of primary light sensors is configured to receive a wavefront passing through the proximal port.
  • the array of primary light sensors has a perimeter.
  • At least one secondary light sensor is positioned outside the perimeter of the array of primary light sensors.
  • a circuit is configured to determine a parameter of the eye using wavefront data from the array of primary light sensors.
  • the optical apparatus may further include a non-stationary body.
  • the non-stationary body has the proximal port and a distal port.
  • the proximal port and the distal port form a visual channel from the proximal port through the distal port.
  • the visual channel may be open view to enable the eye to see target indicia external to and spaced away from the body.
  • a retinal light source may be configured to direct an illumination beam toward the proximal port to produce the wavefront.
  • a cue generator may be operatively coupled with the at least one secondary light sensor.
  • the cue generator may be configured to generate a cue as a function of receipt of the wavefront by the at least one secondary light sensor.
  • the cue generator may be configured to generate a visual cue, an acoustic cue and/or a mechanical cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
  • the array of primary light sensors may have a first sensitivity to the wavefront, and the at least one secondary light sensor may have a second sensitivity to the wavefront.
  • the first sensitivity may be greater than the second sensitivity.
  • the array of primary light sensors may include a CCD, and the at least one secondary light sensor may include a quadrant sensor.
  • the distal port may at least in part define an optical axis.
  • the at least one secondary light sensor may be configured to receive the wavefront, as a function of the orientation of the eye relative to the optical axis.
  • the at least one secondary light sensor may substantially circumscribe the perimeter of the array of primary light sensors.
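A rough sketch of how an alignment cue might be derived from a quadrant-style secondary sensor surrounding the primary array is given below. The quadrant names, directional conventions and threshold are assumptions for illustration only, not the patent's cue generator.

```python
def alignment_cue(q_top_left, q_top_right, q_bottom_left, q_bottom_right, threshold=0.05):
    """Derive a coarse alignment cue from four secondary-sensor quadrants.

    The arguments are light intensities measured outside the perimeter of the
    primary sensor array.  Normalized left/right and up/down imbalances suggest
    which way the eye (or device) should move so the wavefront lands on the
    primary array.  Direction labels here are an arbitrary convention."""
    total = q_top_left + q_top_right + q_bottom_left + q_bottom_right
    if total == 0:
        return "no signal: eye grossly misaligned"
    dx = ((q_top_right + q_bottom_right) - (q_top_left + q_bottom_left)) / total
    dy = ((q_top_left + q_top_right) - (q_bottom_left + q_bottom_right)) / total
    cues = []
    if abs(dx) > threshold:
        cues.append("move " + ("left" if dx > 0 else "right"))
    if abs(dy) > threshold:
        cues.append("move " + ("down" if dy > 0 else "up"))
    return ", ".join(cues) if cues else "aligned"
```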
  • An embodiment of the present invention provides an optical method that includes providing an optical apparatus.
  • the optical apparatus has a proximal port and a distal port that together form a visual channel from the proximal port through the distal port.
  • the apparatus further includes an array of primary light sensors having a perimeter.
  • the apparatus further includes at least one secondary light sensor positioned outside the perimeter of the array of primary light sensors.
  • a living being's eye is aligned with the proximal port.
  • the eye views through the distal port to target indicia exterior of the apparatus.
  • the eye is illuminated to produce a wavefront through the proximal port.
  • the amount of the wavefront sensed by the at least one secondary light sensor is determined.
  • a cue is generated, as a function of the amount of the wavefront sensed by the at least one secondary light sensor.
  • An eye parameter such as a prescription for the eye, may be determined.
  • the distal port may at least in part define an optical axis.
  • the method may further include moving the eye toward the optical axis in response to the cue.
  • the distal port may at least in part define an optical axis.
  • the at least one secondary light sensor may be configured to receive the wavefront, as a function of the orientation of the eye, relative to the optical axis.
  • the wavefront may be split into a primary path toward the array of primary light sensors, and the wavefront may be further split into a secondary path toward the at least one secondary light sensor.
  • Another embodiment of the present invention provides an optical apparatus that includes a proximal port configured to receive an eye and a distal port.
  • the apparatus includes a visual channel from the proximal port through the distal port.
  • An array of primary light sensors is configured to receive a wavefront passing through the proximal port.
  • the apparatus also includes at least one secondary light sensor. Optics within the visual channel are configured to split the wavefront into a primary path toward the array of primary light sensors and a secondary path toward the at least one secondary light sensor.
  • the primary light sensors may be adjacent the at least one secondary light sensor.
  • a lens may be adjacent the at least one secondary light sensor.
  • the lens may be positioned so the secondary path passes through the lens.
  • a retinal light source may be configured to direct an illumination beam toward the proximal port to produce the wavefront.
  • a cue generator may be operatively coupled with the at least one secondary light sensor.
  • the cue generator may be configured to generate a cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
  • the cue generator may be configured to generate a visual cue, an acoustic cue and/or a mechanical cue.
  • the cue generator generates the at least one cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
  • the array of primary light sensors may have a first sensitivity to the wavefront, and the at least one secondary light sensor may have a second sensitivity to the wavefront.
  • the first sensitivity may be greater than the second sensitivity.
  • the array of primary light sensors may include a CCD, and the at least one secondary light sensor may include a quadrant sensor.
  • Yet another embodiment of the present invention provides an optical method.
  • the method includes providing an optical apparatus that has a proximal port and a distal port. Together, the proximal port and the distal port form a visual channel from the proximal port through the distal port.
  • the apparatus also has an array of primary light sensors and at least one secondary light sensor.
  • a living being's eye is aligned with the proximal port. The eye views through the distal port to target indicia exterior of the apparatus. The eye is illuminated to produce a wavefront through the proximal port. The wavefront is split into a primary path toward the array of primary light sensors and a secondary path toward the at least one secondary light sensor.
  • the method may include passing the secondary path of the wavefront through a lens to focus the split portion of the wavefront along the secondary path.
  • a light beam may be directed toward the proximal port to reflect off the eye to produce the wavefront.
  • a cue may be generated, as a function of receipt of the wavefront by the at least one secondary light sensor.
  • Generating the cue may include generating a visual cue, an acoustic cue and/or a mechanical cue.
  • the cue may be generated as a function of receipt of the wavefront by the at least one secondary light sensor.
  • the at least one secondary light sensor may include a quadrant sensor.
  • An embodiment of the present invention provides a method of determining an optical property of an eye of a living being.
  • The method includes providing an optical apparatus having a proximal port and a distal port that together form a visual channel.
  • the eye is aligned with the proximal port.
  • Light is directed into the eye to produce a wavefront.
  • the wavefront is received via the proximal port.
  • a plurality of sequential, time spaced-apart data sets of the wavefront is captured.
  • the data sets include temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured.
  • An optical property of the eye is determined, as a function of the temporal information.
  • the plurality of sequential data sets may include images of the wavefront.
  • the method may include filtering at least one data set of the plurality of data sets.
  • Determining the optical property may include determining the optical property as a function of the order of the plurality of data sets and the contents of the plurality of data sets.
  • Each data set of the plurality of data sets may include wavefront aberration information.
  • Determining the optical property may include analyzing the plurality of data sets for trends in the data.
  • the plurality of data sets may include information relating to accommodation of the eye.
  • the method may include weighting certain data sets of the plurality of data sets, as a function of a signal-to-noise ratio.
  • the plurality of data sets may include a video of the wavefront.
  • the optical property may include a prescription for the eye.
  • Another embodiment of the present invention provides an optical apparatus for determining an optical property of an eye of a living being. The apparatus includes a proximal port and a distal port that together form a visual channel.
  • An illumination light source is configured to direct light into the eye to produce a wavefront that is received through the proximal port.
  • An image capture sensor is operatively coupled with the visual channel. The sensor is configured to capture a plurality of sequential, time spaced-apart data sets of the wavefront. The data sets include temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured.
  • Optical property logic is operatively coupled to the image capture sensor. The optical property logic is configured to determine an optical property of the eye, as a function of the temporal information.
  • the plurality of data sets may include images of the wavefront.
  • a filter may be configured to filter at least one data set of the plurality of data sets.
  • the optical property logic may include logic configured for determining an optical property, as a function of the order of the plurality of data sets and contents of the plurality of data sets.
  • the optical property may be a prescription for the eye.
  • Each data set of the plurality of data sets may include wavefront aberration information.
  • the optical property may include a spherical component and a cylindrical component.
  • the optical property logic may be configured to determine the cylindrical component after determining the spherical component.
  • the optical property logic may be configured to analyze the plurality of data sets for trends in the data.
  • the plurality of data sets may include information relating to accommodation of the eye.
  • the optical property logic may be configured to weigh at least one data set of the plurality of data sets, as a function of a signal-to-noise ratio.
  • the sequential data sets may include a video of the wavefront.
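The signal-to-noise weighting mentioned above can be sketched as a simple weighted average. The data layout and function name are illustrative assumptions, not the patent's implementation.

```python
def weighted_prescription(estimates):
    """Combine per-frame prescription estimates, weighting by signal-to-noise ratio.

    `estimates` is a list of (value_d, snr) pairs, e.g. spherical-equivalent
    values computed from individual wavefront images.  Frames with stronger,
    cleaner spot diagrams (higher SNR) contribute more to the result."""
    total_weight = sum(snr for _, snr in estimates)
    if total_weight == 0:
        return None
    return sum(value * snr for value, snr in estimates) / total_weight
```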
  • Yet another embodiment of the present invention provides a method of determining an optical property of an eye of a living being.
  • the method includes providing an optical apparatus having a proximal port and a distal port that together form a visual channel.
  • the eye is aligned with the proximal port.
  • Light is directed into the eye to produce a wavefront.
  • the wavefront is received via the proximal port.
  • a plurality of sequential, time spaced-apart data sets of the wavefront is captured.
  • the data sets include temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured.
  • the data sets include high-frequency noise.
  • Data sets of the plurality of data sets are registered. An optical property of the eye is determined, as a function of the registered data sets.
  • Registering the data sets may include mitigating the high-frequency noise.
  • Registering the data sets may include increasing a signal-to-noise ratio.
  • Registering the data sets may include registering consecutive data sets.
  • the plurality of data sets may include images of the wavefront.
  • At least one data set of the plurality of data sets may be filtered before registering the data sets.
  • Each data set may include wavefront aberration information.
  • the plurality of data sets may include a video of the wavefront.
  • Registering the plurality of data sets may include registering consecutive data sets and combining the registered consecutive data sets to mitigate noise.
  • Registering the plurality of data sets may include selecting data sets that were acquired close enough together in time to avoid data sets that span a change in the optical property of the eye due to accommodation. Registering the plurality of data sets may also include registering the selected data sets. The method may further include combining the registered data sets to mitigate noise.
  • Registering the plurality of data sets may include registering data sets with similar, within a predetermined range, wavefront aberration information and combining the registered data sets to mitigate noise.
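The registration-and-combination idea described above might look roughly like the following numpy sketch, which registers consecutive spot-diagram frames by an integer-pixel correlation search and then averages them. Real systems would likely use sub-pixel registration; nothing here is taken from the patent's implementation.

```python
import numpy as np

def register_and_average(frames, max_shift=5):
    """Register consecutive spot-diagram frames and average them to suppress
    high-frequency noise.

    `frames` is a list of 2-D numpy arrays assumed to have been acquired close
    enough together in time that the eye's accommodation did not change.  Each
    frame is registered to the first by the integer-pixel shift that maximizes
    correlation, then the registered frames are averaged."""
    reference = frames[0].astype(float)
    accumulator = reference.copy()
    for frame in frames[1:]:
        best_shift, best_score = (0, 0), -np.inf
        for dy in range(-max_shift, max_shift + 1):
            for dx in range(-max_shift, max_shift + 1):
                shifted = np.roll(np.roll(frame, dy, axis=0), dx, axis=1)
                score = float(np.sum(shifted * reference))
                if score > best_score:
                    best_score, best_shift = score, (dy, dx)
        accumulator += np.roll(np.roll(frame, best_shift[0], axis=0),
                               best_shift[1], axis=1).astype(float)
    return accumulator / len(frames)
```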
  • An embodiment of the present invention provides an optical apparatus for determining an optical property of an eye of a living being.
  • the apparatus includes a proximal port and a distal port that together form a visual channel.
  • An illumination light source is configured to direct light into the eye to produce a wavefront that is received through the proximal port.
  • An image capture sensor is operatively coupled with the visual channel.
  • the sensor is configured to capture a plurality of sequential, time spaced-apart data sets of the wavefront.
  • the data sets include temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured.
  • the data sets include high-frequency noise.
  • Optical property logic is operatively coupled with the sensor.
  • the optical property logic is configured to register consecutive data sets of the plurality of data sets to mitigate the high-frequency noise.
  • the optical property logic is configured to also determine an optical property of the eye, as a function of the registered data sets.
  • the plurality of data sets may include images of the wavefront.
  • a filter may be configured to filter at least one data set of the plurality of data sets before registering.
  • Each data set of the plurality of data sets may include wavefront aberration information.
  • the plurality of data sets may include a video of the wavefront.
  • the optical property logic may be configured to combine or average the registered data sets to mitigate noise.
  • the optical property logic may be configured to register consecutive data sets.
  • the optical property logic may be configured to select data sets that were acquired close enough together in time to avoid data sets that span a change in the optical property of the eye due to accommodation and register the selected data sets.
  • the optical property may be an eye prescription.
  • Illustrative embodiments of the invention may be implemented as a computer program product having a computer usable medium with computer readable program code thereon.
  • the computer readable code may be read and utilized by a computer system in accordance with conventional processes.
  • Fig. 1 is a schematic cross-sectional diagram of an emmetropic human eye imaging a distant object.
  • Fig. 2 is a schematic cross-sectional diagram of an emmetropic human eye imaging a close object.
  • Fig. 3 is a schematic cross-sectional diagram of a hyperopic human eye imaging a close object.
  • Fig. 4 is a schematic cross-sectional diagram of a myopic human eye imaging a distant object.
  • Fig. 5 schematically illustrates a corrective lens disposed in front of the myopic eye of Fig. 4 to correct the myopia.
  • Fig. 6 schematically illustrates a Hartmann-Shack wavefront aberrometer adjacent an emmetropic human eye, according to the prior art.
  • Fig. 7 schematically illustrates wavefronts from a virtual light source exiting the eye of Fig. 6 and received by the Hartmann-Shack wavefront aberrometer, as well as a hypothetical spot diagram generated by the Hartmann-Shack wavefront aberrometer, according to the prior art.
  • Fig. 8 schematically illustrates wavefronts from a virtual light source exiting a non- emmetropic eye and received by a Hartmann-Shack wavefront aberrometer, as well as a hypothetical spot diagram generated by the Hartmann-Shack wavefront aberrometer, according to the prior art.
  • Fig. 9 is schematically illustrates a hypothetical wavefront from a non-emmetropic eye impinging on an array of lenslets of a Hartmann-Shack wavefront aberrometer and resulting illumination of an optical sensor of the aberrometer and a three-dimensional graph representing geographic distribution of intensity of the illumination, as well as an enlarged view of one lens of the array of lenslets, according to the prior art.
  • Fig. 10 provides perspective views of surface shapes defined by 1st to 4th order Zernike polynomials.
  • Figs. 11, 12 and 13 contain right, front and left side views of a lightweight portable hand-held automatic device that includes a Hartmann-Shack wavefront aberrometer, according to an embodiment of the present invention.
  • Fig. 14 illustrates the device of Figs. 11-13 in use by a patient.
  • Fig. 15 is a schematic block diagram of the device of Figs. 11-14, showing its internal components, according to an embodiment of the present invention.
  • Fig. 15-1 is a schematic diagram illustrating an eye properly aligned with the device of Figs. 11-15, as well as a view as seen by the eye through the device, according to an embodiment of the present invention.
  • Fig. 15-2 is a schematic diagram illustrating an eye slightly misaligned with the device of Figs. 11-15, as well as a hypothetical view as seen by the eye through the device, according to an embodiment of the present invention.
  • Fig. 15-3 is a schematic diagram illustrating an eye grossly misaligned with the device of Figs. 11-15, as well as a view as seen by the eye through the device, according to an embodiment of the present invention.
  • Fig. 16 is a schematic block diagram of the device of Figs. 11-14, showing its internal components, according to another embodiment of the present invention.
  • Fig. 17 illustrates a view through the device of Fig. 16, as seen by a patient using the device when the patient's eye is properly aligned with the device.
  • Fig. 18 illustrates a hypothetical view through the device of Fig. 16, as seen by a patient using the device when the patient's eye is not properly aligned with the device.
  • Fig. 19 is a schematic block diagram of the device of Figs. 11-14, showing its internal components, according to yet another embodiment of the present invention.
  • Fig. 20 is a schematic block diagram of the device of Fig. 19 in use.
  • Fig. 21 is a schematic block diagram of an alternative embodiment of the device of
  • Fig. 22 is a front view of an embodiment of the present invention in use by a patient, in which the device is tilted, with respect to the patient's interpupillary axis.
  • Fig. 23 illustrates a binocular lightweight portable hand-held automatic device that includes a Hartmann-Shack wavefront aberrometer, according to an embodiment of the present invention.
  • Fig. 23-1 illustrates a binocular lightweight portable hand-held automatic device that includes a Hartmann-Shack wavefront aberrometer, according to another embodiment of the present invention.
  • Fig. 24 schematically illustrates a dovetail slide of the device of Fig. 23 for adjusting spacing between two eyepieces of the device, according to an embodiment of the present invention.
  • Fig. 25 schematically illustrates a pivot joint of the device of Fig. 23 for adjusting the spacing between two eyepieces of the device, according to another embodiment of the present invention.
  • Fig. 26 is a schematic block diagram of hardware components of an analysis unit that may be included in, for example, the device of Fig. 15, according to an embodiment of the present invention.
  • Fig. 27 is a schematic diagram of a hypothetical spot diagram not centered on an optical sensor of the device of Figs. 11-15.
  • Fig. 28 is a schematic block diagram of an alignment feedback module, according to several embodiments of the present invention.
  • Fig. 29 is a schematic diagram of a hypothetical spot diagram that falls only partially on the optical sensor of the device of Figs. 11-15.
  • Fig. 30 is a schematic diagram of a display indicating a location of a hypothetical centroid of a spot diagram, relative to vertical and horizontal axes, according to several embodiments of the present invention.
  • Fig. 31 is a plan view of an array of light sensors around the optical sensor of the device of Figs. 11-15, according to several embodiments of the present invention.
  • Fig. 32 is a schematic block diagram of the device of Figs. 11-14, showing its internal components, according to another embodiment of the present invention.
  • Fig. 33 is a plan view of a quadrant photodiode detector of the device of Fig. 32, including a hypothetical spot diagram projected thereon, according to an embodiment of the present invention.
  • Fig. 34 is a schematic plan view of an exemplary array of visible light sources for projecting a visible spot on a distant object, according to an embodiment of the present invention.
  • Fig. 35 is a schematic plan view of an exemplary array of light sources for projecting a virtual light source onto a retina of a patient's eye, according to an embodiment of the present invention.
  • Fig. 36 is a schematic block diagram of an unaccommodation detector, according to an embodiment of the present invention.
  • Fig. 37 contains a graph of spherical and cylindrical power candidate prescriptions calculated from a hypothetical patient, according to an embodiment of the present invention.
  • Fig. 38 is a schematic block diagram of processing modules and interconnections among these modules, according to an embodiment of the present invention.
  • Fig. 39 is a schematic diagram of a complete spot diagram captured by a prototype instrument as described herein and according to an embodiment of the present invention.
  • Fig. 40 is a schematic diagram of a partial spot diagram, i.e., a spot diagram in which a portion of the spot diagram falls off the optical sensor, captured by a prototype instrument as described herein and according to an embodiment of the present invention.
  • Fig. 41 is a schematic diagram of a frame from the optical sensor of Fig. 15, with no spot diagram.
  • Fig. 42 is a schematic diagram of a frame from the optical sensor 1532 containing a corneal reflection, captured by a prototype instrument as described herein and according to an embodiment of the present invention.
  • Figs. 43-46 are schematic diagrams of a set of frames from the optical sensor of Fig. 15.
  • Fig. 47 is a schematic diagram of a hypothetical frame from the optical sensor of Fig. 15 containing a complete spot diagram, according to an embodiment of the present invention.
  • Fig. 48 is a schematic illustration of a portion of a lenslet array and a portion of a hypothetical aberrated wavefront, showing displacement calculations.
  • Fig. 49 is a graph showing mean, maximum and minimum amounts by which a normal human eye can accommodate, plotted against age.
  • Figs. 50 and 51 are graphs of sets of M, JO and J45 prescriptions for two different patients, calculated by a prototype instrument as described herein and according to an embodiment of the present invention.
  • Fig. 52 is a hypothetical histogram of spherical prescriptions calculated according to an embodiment of the present invention.
  • Fig. 53 is a schematic diagram illustrating combining frames to yield a second set of frames, according to an embodiment of the present invention.
  • Fig. 54 is a schematic diagram illustrating calculation of an estimated confidence region for a final astigmatism prescription, according to an embodiment of the present invention.
  • Fig. 55 is a schematic diagram illustrating an eye properly aligned with a device, as well as a view as seen by the eye through the device, according to another embodiment of the present invention.
  • Fig. 56 is a schematic diagram illustrating an eye slightly misaligned with the device of Fig 55, as well as a hypothetical view as seen by the eye through the device, according to an embodiment of the present invention.
  • Fig. 57 is a schematic diagram illustrating an eye grossly misaligned with the device of Fig. 55, as well as a view as seen by the eye through the device, according to an embodiment of the present invention.
  • Illustrative embodiments of the invention provide methods and apparatus for calculating a prescription to correct refractive errors with a relatively inexpensive, light-weight, portable instrument that does not require a professional clinician, cycloplegic agent, fogging or virtual images. Some embodiments also calculate prescriptions to correct higher-order aberrations in an eye and/or additional optical properties of the eye. Some embodiments may be used to calculate prescriptions for corrective lenses (eyeglasses) and/or to check whether an existing pair of eyeglasses has the correct prescription for a patient.
  • Fig. 1 is a schematic cross-sectional diagram of a normal, emmetropic human eye 100.
  • Emmetropia describes a state of vision where an object at infinity is in sharp focus, with the eye's crystalline lens 102 in a neutral (relaxed or "unaccommodated") state.
  • This condition of the normal eye 100 is achieved when the refractive optical power of the cornea 104 and lens 102 balance the axial length 106 of the eye 100, thereby focusing rays 108 from a distant object (not shown) exactly on the retina 110, resulting in perfect vision.
  • As used herein, “distant” means more than 20 feet (6 meters) away.
  • An eye in a state of emmetropia requires no correction.
  • Figs. 1 and 2 illustrate normal eyes.
  • various imperfections in the shape or composition of the lens 102, cornea 104, retina 110 or the eye 100 in general can prevent the eye 100 from perfectly focusing the rays 108 or 208 on the retina 110, even in young people.
  • These imperfections prevent the eye 100 from bending (refracting) light rays as a normal eye would, thereby causing "refractive errors.”
  • Fig. 3 schematically illustrates a hyperopic (farsighted) eye 300, in which light rays 308 from a close object 310 are too divergent to focus on the retina 110, leading to blurry vision.
  • Fig. 4 schematically illustrates a myopic (nearsighted) eye 400, in which light rays 408 from a distant object (not shown) focus in front of the retina 110, causing the distant object to appear blurry.
  • the lens 402 of a myopic eye has too much optical power, relative to the axial length 406 of the eye 400.
  • Myopic eyes can, however, focus well on near objects. In both myopia and hyperopia, an inability to create a sharp image of an object on the retina is referred to as a "defocus error.” Imperfections in eyes can be congenital or result from other factors such as an injury or a disease.
  • Fig. 5 schematically illustrates a corrective lens 500 disposed in front of the myopic eye 400 of Fig. 4 to correct the myopia.
  • the lens 500 is disposed in a "spectacle plane" 502 located a small distance away from the eye 400.
  • the spectacle plane 502 defines where eyeglasses are worn, relative to the eye 400.
  • the spectacle plane is close to the outer layer of the cornea.
  • a lens to correct myopia has a negative optical power, i.e., it has a net concave effect, which counteracts the excessive positive optical power of the myopic eye.
  • the following descriptions refer to eyeglasses or spectacles, although they also apply to contact lenses.
  • a prescription for corrective eyeglasses specifies all aspects of the lenses of the eyeglasses. Some eye imperfections are simpler to correct than others. For example, if an eye is only hyperopic or only myopic, a spherical lens can be used to correct the defocus errors of the eye. A spherical lens includes a surface that is a portion of a sphere. However, if the crystalline lens 102 (Fig. 1), the cornea 104, the retina 110 or the eye 100 in general is not properly shaped, for example if the focusing power of the eye is different along different axes, a simple spherical lens cannot fully correct the eye.
  • the eye is referred to as having "astigmatism.”
  • Corrective eyeglasses that have a spherical and a cylindrical component are used to correct astigmatism.
  • Spherical and cylindrical imperfections account for most, but not all, of the eye's imperfections.
  • Spherical and cylindrical imperfections are referred to as low-order aberrations.
  • most prescriptions include a spherical component and a cylindrical component to correct low-order aberrations.
  • the spherical component corrects the defocus error and is described in terms of the optical power, positive or negative, of the corrective lens, typically expressed as a number of diopters.
  • a diopter is a unit of measurement of optical power of a lens, which is equal to a reciprocal of the focal length (f) of the lens measured in meters, i.e., 1/f.
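  • For example, a lens with a focal length of 0.5 meters has an optical power of 1/0.5 = 2 diopters; likewise, a 4-diopter lens has a focal length of 1/4 = 0.25 meters.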
  • the cylindrical component is described in terms of power and axis of a cylindrical lens. Typically, one or two axes are specified, corresponding to one or two cylindrical lenses. Each axis is specified as an angle.
  • the resulting corrective lens has a compound surface shape that includes spherical and cylindrical components, as described by the prescription, to compensate for the defocus and astigmatism imperfections in the eye.
  • An "aberration" is a departure of the optical performance of an eye from a perfect eye.
  • defocus and astigmatism imperfections are examples of aberrations.
  • eyes may suffer from more complex imperfections, which are commonly referred to as “higher-order aberrations.”
  • higher-order aberrations include coma and spherical aberration (not to be confused with the low-order spherical imperfections that cause defocus errors, as described above).
  • Coma causes an off-axis point source to appear distorted, appearing to have a tail.
  • Spherical aberrations cause collimated rays entering the eye far from the optical axis to focus at a different position than collimated rays entering the eye close to the optical axis.
  • Some prescriptions at least partially correct for higher order aberrations, although determining these prescriptions requires large, heavy, expensive, fixed (such as to a desk) diagnostic equipment and highly skilled clinicians.
  • Optical professionals use various tools and methods to generate eyeglass prescriptions. Some methods are subjective, others are objective. For example, a phoropter allows a clinician to position various combinations of lenses, at various angles, in front of a patient and ask the patient whether one combination is better than a different combination for visualizing a target. Based on reports from the patient, a skilled clinician can achieve progressively better combinations, eventually arriving at a good, although not necessarily perfect, prescription. However, the accuracy of the prescription depends in large part on the patient's reporting accuracy. Phoropters are relatively inexpensive, but the above-described process is time consuming.
  • An aberrometer objectively measures how light is changed by an eye, thereby identifying and quantifying refractive errors caused by the eye.
  • Aberrometers are usually classified into three types: (1) outgoing wavefront aberrometers, such as a Hartmann-Shack sensor; (2) ingoing retinal imaging aberrometers, such as a cross-cylinder aberrometer or Tscherning aberrometer or as used in a sequential retinal ray tracing method; and (3) ingoing feedback aberrometers, such as a spatially-resolved refractometer or as used in an optical path difference method.
  • a Hartmann-Shack wavefront aberrometer includes an array 600 of lenses (“lenslets”), exemplified by lenslets 602, 604 and 606. All the lenslets 602-606 have identical sizes and focal lengths, within some manufacturing tolerances.
  • the lenslet array 600 is disposed optically between an eye 608 and an optical sensor 610, such as a pixelated charge-coupled device (CCD), pixelated complementary metal oxide semiconductor (CMOS) device or an array of quadrant photodiode detectors.
  • Each lenslet 602-606 may, but need not, be focused on the center of a respective pixel of a pixelated CCD array or on the center of a respective quadrant sensor.
  • the optical sensor 610 is configured to have sufficient spatial resolution to enable a circuit or processor to measure displacement of each spot of the array of spots from a position directly in line with the center of the corresponding lenslet, as described in more detail below.
  • a point 618 within the eye 608 is illuminated by shining a light, typically from a laser or a superluminescent diode (SLED or SLD), into the eye 608, thereby creating a "virtual point light source" within the eye 608.
  • “Virtual light source” is a term of art used in wavefront aberrometry; as used herein, the term means a place where light appears to emanate, although no light is actually generated there. In the case of point 618, the laser or SLED creates the virtual light source. As used herein, unless context indicates otherwise, “virtual” should not be confused with that term as used in optics, where “virtual” means a physical source that is imaged to another location.
  • wavefronts 702, 704, 706, and 708 represent the exiting light.
  • Each lenslet of the array of lenslets 600 focuses a respective portion of each wavefront 702-708 onto a corresponding portion of the optical sensor 610, creating a circular array of spots.
  • a hypothetical array of spots 710 (also referred to herein as a "spot diagram") is shown, although the array of lenslets 600 may include more or fewer lenslets than are shown and, therefore, the spot diagram 710 may include more or fewer spots than are shown.
  • the wavefronts 702-708 are planar, and the spots of the spot diagram 710 are equally displaced from the center of each individual lenslet.
  • the outer perimeter of the spot diagram is a projection of the pupil of the eye 608, thus the diameter of the outer perimeter of the spot diagram indicates the pupil diameter.
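Because the spot-diagram perimeter is a projection of the pupil, pupil size can be estimated from the extent of the illuminated spots. A minimal sketch, assuming the lenslet plane is conjugate with the pupil at unit magnification (an assumption made only for this illustration):

```python
def pupil_diameter_mm(spot_xy_mm):
    """Estimate pupil diameter from the extent of the spot diagram.

    `spot_xy_mm` is a list of (x, y) spot-centroid positions in millimetres at
    the lenslet plane.  With the lenslet array conjugate to the pupil at unit
    magnification, the width of the illuminated region approximates the pupil
    diameter."""
    xs = [x for x, _ in spot_xy_mm]
    ys = [y for _, y in spot_xy_mm]
    return max(max(xs) - min(xs), max(ys) - min(ys))
```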
  • Fig. 9 schematically illustrates wavefront 808 conceptually divided into square regions, exemplified by regions 900, 902 and 904. Each region 900-904 impinges on the lenslet array 600 along a direction substantially perpendicular to the region, as indicated by respective arrows 906, 908 and 910. Thus, the spots of the spot diagram 810 (Fig. 8) are displaced from where they would be if the wavefront 808 were planar.
  • One such displaced spot 912 is shown in an enlarged portion of Fig. 9.
  • If the region of the wavefront 808 contributing to the spot 912 had been parallel to the lenslet array 600, the region would have traveled through the lenslet 914 and impinged on the optical sensor 610 along a line 916 normal to the optical sensor 610 and created a spot at location 918.
  • the spot 912 is displaced an x and a y distance from the location 918.
  • centroid-finding methods may be used to analyze data from the optical sensor 610 to calculate the x and y displacements and corresponding tilt angles for each lenslet, often with sub-pixel resolution.
  • a local tilt of the wavefront 808 across each lenslet can be calculated from the position of the spot on the optical sensor 610 generated by that lenslet. Any phase aberration can be approximated by a set of discrete tilts.
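The displacement-to-tilt relationship for a Hartmann-Shack sensor is commonly written as slope = displacement / lenslet focal length. A minimal sketch of that conversion follows; the units and names are illustrative, not the patent's code.

```python
def local_slopes(displacements_um, lenslet_focal_length_mm):
    """Convert spot displacements into local wavefront slopes.

    `displacements_um` is a list of (dx, dy) spot displacements in micrometres,
    one per lenslet; the lenslet focal length is in millimetres.  The returned
    (sx, sy) slopes are in radians (small-angle approximation), following the
    standard Hartmann-Shack relation slope = displacement / focal_length."""
    f_um = lenslet_focal_length_mm * 1000.0
    return [(dx / f_um, dy / f_um) for dx, dy in displacements_um]
```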
  • the wavefront data can then be used to characterize the eye 800 (Fig. 8) as an optical system.
  • the shape of the wavefront 808 can be expressed as a weighted sum of a set of pre-determined three-dimensional surface shapes or basis functions.
  • Each shape of the set is usually defined by an independent polynomial function which represents a specific aberration term.
  • Fig. 10 illustrates the shapes defined by the 0th through 4th orders (modes) of the Zernike polynomials.
  • the views in Fig. 10 are perspective. However, often these shapes are shown in top view, using color gradients to represent powers of the aberrations.
  • the shapes become increasingly complex with increased order, and these shapes can be combined to precisely describe a surface that fits as well as possible to a measured wavefront.
  • Each order describes a surface shape that corresponds with an ocular aberration.
  • 0th order has one term (Z0) that represents a constant.
  • the 1st order has two terms (Z1 and Z2) that represent tilt for the x and y axes.
  • the 2nd order includes three terms that represent defocus and regular astigmatism in two directions.
  • the 3rd order has four terms that represent coma and trefoil.
  • the 4th order has five terms that represent tetrafoil, secondary astigmatism and spherical aberration.
  • the 5th order (not shown) has six terms that represent pentafoil aberration.
  • the polynomials can be expanded up to an arbitrary order, if a sufficient number of measurements are made for the calculations and the optical sensor provides sufficient spatial resolution.
  • The weight applied to each Zernike mode when computing this sum is called a Zernike coefficient and is usually expressed in microns.
  • the weighted sum of the Zernike polynomials equals a description of all the aberrations, i.e., a total refractive error, of an eye.
  • a Zernike analysis includes a finite number of modes. Once the total refractive error of an eye has been ascertained to a desired accuracy, i.e., using a desired number of Zernike modes, a corrective lens prescription can be calculated to compensate for the refractive error in a well-known manner. Thus, a spot diagram can be used to calculate a prescription.
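One widely published way to obtain a low-order prescription from the second-order Zernike coefficients is through the power-vector relations sketched below. These are standard wavefront-optics formulas, not formulas quoted from the patent, and the variable names are assumptions.

```python
import math

def low_order_prescription(c20_um, c22_um, c2m2_um, pupil_radius_mm):
    """Convert second-order Zernike coefficients into a power-vector prescription.

    Uses the widely published relations between the Zernike defocus and
    astigmatism coefficients (in micrometres, over a pupil of radius r in
    millimetres) and the power-vector components M, J0, J45 in diopters:
        M   = -4*sqrt(3) * c(2, 0) / r^2
        J0  = -2*sqrt(6) * c(2, 2) / r^2
        J45 = -2*sqrt(6) * c(2,-2) / r^2"""
    r2 = pupil_radius_mm ** 2
    m = -4.0 * math.sqrt(3.0) * c20_um / r2
    j0 = -2.0 * math.sqrt(6.0) * c22_um / r2
    j45 = -2.0 * math.sqrt(6.0) * c2m2_um / r2
    return m, j0, j45
```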
  • the Zernike coefficients can be used somewhat analogously to a fingerprint to uniquely identify an individual eye and, therefore, an individual person.
  • Optical properties of an eye include: scattering (which may be used to determine if a patient has cataracts), wavefront (which may be used to measure refraction, low-order aberrations, high-order aberrations, accommodation, keratoconus, which is a high-order spherical aberration, and the like) and pupil size.
  • Accommodation introduces an uncontrolled variable into the measurement process.
  • Fogging refers to temporarily disposing a lens with positive spherical power in front of a patient's eye in an attempt to control accommodation.
  • the goal of fogging is to move the focal point in front of the retina, regardless of the distance to the object.
  • the patient is temporarily made artificially myopic.
  • the eye accommodates by changing the shape of the lens to increase its optical power in order to see close objects more clearly.
  • vision becomes blurrier, not clearer, regardless of the distance to the object, thus discouraging accommodation.
  • Virtual images are images created within a diagnostic instrument that optically appear to be located at least 20 feet (6 meters) from the patient.
  • the patient intuitively knows the viewed object is not 20 feet away and, therefore, the patient tends to accommodate. This phenomenon is sometimes referred to as "instrument-induced myopia," and it is difficult to avoid, even with fogging techniques.
  • ophthalmological diagnostic equipment is large, heavy and mechanically complex, at least in part because the equipment is designed to hold a patient's head steady and align it, and thereby align the patient's eye, with certain optical elements within the diagnostic equipment. Consequently, this equipment is typically attached to a table and includes heavy-duty structural members, forehead and chin rests and rack and pinion alignment mechanisms.
  • Figs. 11, 12 and 13 contain various views of a lightweight, portable, hand-held, self- contained, automatic optical or ophthalmologic apparatus 1100 that includes a Hartmann-Shack wavefront aberrometer, according to an embodiment of the present invention.
  • Fig. 14 shows the apparatus 1100 in use by a patient 1400.
  • the apparatus 1100 solves many of the problems associated with the prior art. For example, the apparatus 1100 provides feedback to the patient 1400, enabling the patient 1400 to correctly align the apparatus 1100 to the patient's eye, without the cumbersome mechanical paraphernalia required by prior art devices.
  • the apparatus 1100 is of an "open view" design; it is therefore configured to inherently encourage the patient 1400 not to accommodate, without any cycloplegic agents, fogging or virtual images.
  • the apparatus 1100 automatically determines when the patient 1400 is not accommodating, and uses data acquired during a period of non-accommodation to automatically calculate an eyeglass prescription.
  • the apparatus 1100 can measure the optical properties of an eye that is focused at a known, non-infinite distance, and these optical properties can be used to calculate the patient's eye's optical properties if the patient were to focus at infinity.
  • the apparatus 1100 includes an eyepiece 1102, into which the patient 1400 looks with one eye.
  • the eyepiece 1102 may include an eyecup configured to be pressed against the patient's face, thereby blocking ambient light.
  • the eyecup may be sized and shaped differently to fit well against various facial geometries and anatomical configurations, such as young and old patients.
  • the apparatus 1100 also defines an exit port 1104, through which the patient 1400 can see.
  • the apparatus 1100 has an "open view" configuration.
  • Fig. 15 is a schematic block diagram of the apparatus 1100 showing its internal components, within a body 1500.
  • Two beamsplitters 1501 and 1502 are disposed along an optical axis 1504 between the eyepiece 1102 and the exit port 1104.
  • the patient looking into the eyepiece 1102 along the optical axis 1504 can see an external object 1506 that is aligned with the optical axis 1504.
  • a view, as seen by the patient, is shown in an insert of Fig. 15.
  • a visible light source 1508, such as a laser diode or light-emitting diode (LED), emits a beam of light 1510, which the beamsplitter 1502 reflects along the optical axis 1504 out the exit port 1104, as indicated by arrow 1512.
  • the beam 1512 can be used to create a spot of light on a distant wall or other object 1514.
  • the object 1506 is assumed to be the spot of light created on the wall 1514 by the beam 1512.
  • the visible light source 1508 is fixed, relative to the body 1500 and the optical components within the apparatus 1100. Thus, the beam 1512 is always coincident with the optical axis 1504.
  • the distant wall 1514 should be at least 20 feet (6 meters) from the apparatus 1100, so when the patient looks at the spot 1506, the patient's eye 1516 is substantially unaccommodated.
  • An ultrasonic or other range sensor 1517 may be used to measure the distance between the apparatus 1100 and the wall 1514.
  • the apparatus 1100 may provide an audible, visual, haptic or other warning if the distance is inappropriate.
  • a return beam 1518 from the spot 1506 enters the exit port 1104, passes through the two beamsplitters 1502 and 1501 along the optical axis 1504 and enters the patient's eye 1516 via the eyepiece 1102. This enables the patient to see the spot 1506.
  • the optical axis 1504 and the two beams 1512 and 1518 are shown spaced apart; however, the axis and the two beams are coincident.
  • the target can be an arbitrary, but known, distance from the patient. For example, if the target is projected 10 feet (3 meters) from the instrument, the amount of accommodation necessary for the patient to focus on the target is calculated, and then a prescription is calculated that compensates for the accommodation.
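  • As an illustrative calculation (not taken from the source, and assuming the patient is accurately focused on a target at a known distance d, in meters): the target imposes an accommodative demand of 1/d diopters, so the distance (unaccommodated) spherical equivalent can be estimated by adding that vergence back to the measured value:

$$ M_{\infty} \approx M_{\text{measured}} + \frac{1}{d} $$

For the 10-foot (3-meter) example above, 1/d is approximately 0.33 diopter.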
  • the eyepiece 1102 may also be referred to as a proximal port, and the exit port 1104 may also be referred to as a distal port.
  • the body 1500 forms a visual channel between the eyepiece 1102 and the exit port 1104.
  • the beam 1512 may also be referred to as a target beam
  • the wall or other object 1514 may be referred to as a target
  • the spot 1506 may also be referred to as target indicia.
  • the visual channel between the eyepiece 1102 and the exit port 1104 may have a conic shape, i.e., the shape may be a portion of a cone.
  • the visual channel is configured such that a vertex of the conic shape is toward the eyepiece 1102 and a base of the conic shape toward the exit port 1104.
  • a pinhole constrains where a user can position her eye and see through the pinhole.
  • a pinhole does not, however, constrain the angle along which the user can see through the pinhole.
  • a tubular or conic visual channel does, however, constrain the view angle.
  • a conic visual channel with a pinhole, which may be implemented as a small hole at or near the vertex of the cone, constrains both the position of the user's eye and the angle along which the eye sees.
  • Another light source 1520, such as another laser diode, projects a beam of light 1522.
  • the beam splitter 1502 reflects the beam 1522 toward the eyepiece 1102 along the optical axis 1504, as indicated by arrow 1524.
  • the beam 1524 illuminates a spot 1525 on the back of the eye 1516, thereby essentially creating a virtual point light source within the eye 1516.
  • This virtual light source 1525 corresponds to the spot 618 described above, with respect to Fig. 8.
  • return wavefronts travel along a beam 1526 from the eye 1516.
  • the beamsplitter 1501 reflects the beam 1526, and resulting beam 1528 passes through a lenslet array 1530 and impinges on an optical sensor 1532.
  • Optional optics 1534, such as a relay lens system to make the lenslet array 1530 optically conjugate with the patient's spectacle plane and a band-pass and/or neutral density filter, may be disposed in the path of beam 1528.
  • the optical axis 1504 and the two beams 1524 and 1526 are shown spaced apart; however, the axis and the two beams are generally coincident.
  • although Hartmann-Shack wavefront aberrometry using a lenslet array is described, other methods for wavefront sensing can be used.
  • Other embodiments use pinhole arrays or arrays of sensors for defocus imaging.
  • time-of-flight cameras, interferometric techniques or partitioned aperture wavefront imaging systems are used. Partitioned aperture wavefront imaging systems are well known to those of skill in the art, as evidenced by information available at http://biomicroscopy.bu.edu/research/partioned-aperture-wavefront-imaging.
  • An analysis unit 1536 is electronically coupled to the optical sensor 1532.
  • the analysis unit 1536 includes appropriate interface electronics, a processor, memory and associated circuits configured to analyze signals from the optical sensor 1532 to calculate x and y displacements of spots in a spot diagram from where they would be if the eye 1516 were normal. From this data, the analysis unit 1536 calculates a set of Zernike coefficients and calculates a corrective lens prescription. Additional details about these analyses and calculations are provided below.
  • An internal battery 1538 powers the analysis unit 1536, the two light sources 1508 and 1520, the optical sensor 1532 and other components of the apparatus 1100.
  • a handle portion 1539 of the housing 1500 may house the battery 1538. All electronic components of the apparatus 1100 are powered by the battery 1538, and all calculations necessary to ascertain the prescription are performed by the analysis unit 1536.
  • the apparatus 1100 is completely self-contained, i.e., all components, apart from the wall 1514 and the eye 1516, necessary to perform its functions are included within the housing 1500.
  • the apparatus 1100 is small and light enough for a typical patient to hold it in place with one hand long enough to perform the described measurement.
  • the light source 1520 that creates the virtual light source 1525 within the eye 1516 is a near infrared (NIR) light source.
  • the wavelength of the light source 1520 is selected such that the patient perceives a red dot, although the bulk of the energy of the beam 1524 is not within the spectrum visible to the patient.
  • the visible light source 1508 is selected to have a perceptively different color, such as green, than the red perceived by the patient from the NIR light source 1520.
  • the patient may be instructed to orient the apparatus 1100, relative to the patient's eye, so as to maximize the perceived brightness of the red dot.
  • As schematically illustrated in Fig. 15-1, if the patient's eye 1516 is properly aligned with the eyepiece 1102, such that the eye's center of vision 1590 is aligned with the optical axis 1504 of the apparatus 1100, the patient perceives two coincident dots 1592 and 1594, one red and the other green, as illustrated on the left side of Fig. 15-1, or a single dot that is both red and green.
  • the patient can be instructed to reorient the apparatus 1100 until she perceives the two coincident dots or one dual-colored dot. The patient can then easily hold the apparatus 1100 in the proper alignment for the short time required to collect data for generating a prescription.
  • As schematically illustrated in Fig. 15-2, if the patient's eye 1516 is improperly aligned with the eyepiece 1102, such that the eye's center of vision 1590 is parallel to, but slightly displaced from, the optical axis 1504 of the apparatus 1100, the patient sees the dots 1592 and 1594 off-center within the field of view afforded by the eyepiece, as exemplified on the left side of Fig. 15-2. However, as schematically illustrated in Fig. 15-3, if the patient's eye 1516 is grossly misaligned, the patient does not see any dots within the field of view afforded by the eyepiece, as shown on the left side of Fig. 15-3.
  • the simple design of the apparatus 1100 enables easy alignment of a patient's eye with optics of the apparatus 1100, without a chin rest or other complex heavy mechanical alignment apparatus required by the prior art. Furthermore, the open view design of the apparatus 1100 encourages the patient not to accommodate, without any cycloplegic agents, fogging or virtual images.
  • wavelengths may be used by the two light sources 1508 and 1520.
  • visible wavelengths are used for both of the light sources 1508 and 1520.
  • identical or similar wavelengths are used by both of the light sources 1508 and 1520, but one or both of the light sources 1508 and 1520 blink, so the patient can distinguish between the two resulting dots. If both light sources 1508 and 1520 blink, they should alternate being on.
  • the apparatus 1100 may include additional optical elements, such as an adjustable iris diaphragm 1542 (Fig. 15).
  • the beamsplitter 1502 may include a "hot mirror," which passes visible light entering the exit port 1104, within a range of about 375 nm to about 725 nm, so the patient can see the spot 1506 through the eyepiece 1102.
  • the components of the apparatus 1100 may be displaced along a Y axis, so as to offset the beams 1504 and 1527 by about 1-2 mm to reduce specular reflection from the eye 1516. This specular reflection constitutes noise to the optical sensor 1532.
  • the amount of optical power that can safely be delivered by the light source 1520 to the eye 1516 is limited.
  • Ambient light that enters the apparatus 1100 and impinges on the optical sensor 1532 constitutes noise. Under high ambient light conditions, this noise may reach unacceptably high levels. In addition, the ambient light may overwhelm the light from light source 1520, thereby preventing the patient from seeing a spot from this light source.
  • a neutral density filter 1544 may be disposed along the light path between the exit port 1104 and the beam splitter 1502. The neutral density filter may be selected or adjusted to admit any appropriate amount of light, such as about 1%.
  • an optical or ophthalmologic apparatus 1600 includes components as described above, with respect to Fig. 15, and further includes a cross-hair 1602 disposed along the optical path between the eyepiece 1102 and the exit port 1104, such that the center of the cross-hair 1602 coincides with the optical axis 1504.
  • the cross-hair 1602 is visible in the field of view of the eye 1516, as shown in Figs. 17 and 18. If the patient sights down the center of the optical path between the eyepiece 1102 and the exit port 1104, thereby aligning his eye 1516 with the optical axis 1504, the spot 1506 appears at the intersection of the cross-hair 1602, as shown in Fig. 17.
  • the spot 1506 does not appear at the intersection of the cross-hair 1602, for example as shown in Fig. 18.
  • the patient can be instructed to reorient the apparatus 1600 until he sees the spot 1506 at the center of the cross-hair 1602.
  • the light source 1520 need not generate a beam 1522 that is perceived at all by the patient.
  • the cross-hair 1602 should be disposed a distance away from the eye 1516, so as not to require the eye 1516 to accommodate and still have the cross-hair 1602 reasonably well focused. This may require the cross-hair 1602 to be held a distance from most of the housing 1604, such as by an outrigger 1608, as shown in the insert of Fig. 16.
  • Other aspects of the apparatus 1600 are similar to those of the apparatus 1100; however, some reference numerals are omitted from Fig. 16 in the interest of clarity.
  • an optical or ophthalmologic apparatus 1900 includes components as described above, with respect to Fig. 15, except the visible light source 1902 projects a spot on the wall 1514, without the visible beam 1904 passing through the beamsplitter 1502. As shown schematically in Fig. 20, the beam 1904 is not parallel to the optical axis 1504. Thus, the beam 1904 intersects the optical axis 1504 at a distance 2000 from the apparatus 1900.
  • An angle 2002 of the light source 1902, relative to the optical axis 1504, is selected such that the distance 2000 is at least 20 feet (6 meters).
  • the ultrasonic or other range sensor 1517 or a simple tape measure may be used to selectively dispose the apparatus 1900 at a desired distance from the wall 1514.
  • an apparatus 2100 is similar to the apparatus shown in Figs. 19 and 20, except the visible light source 1902 is aligned parallel to, but spaced apart from, the optical axis 1504. If the distance between optical axes of the projected light source 1902 and the internal light source 1520 is sufficiently small, then when the patient aligns the device so that images of the two sources are coincident, the eye is sufficiently aligned for an accurate wavefront measurement.
  • the axis of the visible light source 1902 or 1508 (Fig. 15) is offset about 20 mm from the axis of the internal light source 1520.
  • the apparatus does not include a visible light source, such as light source 1508 or 1902 (Figs. 15 and 19).
  • the patient is instructed to look into the apparatus and maintain her gaze in the center of the field of view provided by the eyepiece 1102 and exit port 1104 of the apparatus.
  • the eye 1516 is properly aligned with the optical axis 1504 of the device.
  • a hypothetical view, as seen by the eye 1516, is shown on the left in Fig. 55.
  • the eye 1516 is slightly misaligned with the optical axis 1504 of the device.
  • a hypothetical view, as seen by the eye 1516 is shown on the left in Fig. 56.
  • the eye 1516 is grossly misaligned with the optical axis 1504 of the device.
  • a hypothetical view, as seen by the eye 1516 is shown on the left in Fig. 57.
  • such embodiments may provide a relatively small field of view, such as by closing the iris diaphragm 1542 (Fig. 15) to a smaller aperture than in the embodiments described above with respect to Figs. 15, 16 and 19.
  • Processing of the spot diagram generated from the patient's eye 1516 may be used to ascertain whether the patient's eye 1516 is properly aligned with the optical axis 1504 and, if not, generate a feedback instructional signal to the patient, as described in more detail below.
  • Binocular aberrometer
  • Fig. 22 illustrates a possible source of error in measurements made by the devices described thus far. If a patient holds the apparatus 2200 at an angle 2202 other than perpendicular to the patient's interpupillary axis 2204, the cylindrical axis components of a prescription generated by the apparatus 2200 may be incorrect.
  • One solution to this problem involves including an accelerometer (not shown) in the apparatus 2200 to detect if the apparatus 2200 is oriented other than vertical and, if so, warn the user.
  • Another solution is to use the measured angle from the accelerometer to offset the measured cylindrical axis by the appropriate amount.
  • these approaches have limitations. For example, the patient may not be positioned with her head vertical, thereby making a vertical orientation of the apparatus an incorrect orientation.
  • any embodiment described herein may be configured as a binocular instrument, as exemplified by an optical or ophthalmologic apparatus 2300 illustrated in Fig. 23.
  • An alternative binocular instrument 2350 is illustrated in Fig 23-1.
  • the binocular instrument 2300 may be held by a patient using two hands, thereby providing more stability than one hand holding a monocular instrument, at least in part because using two hands reduces the number of degrees of freedom of movement of the instrument 2300. Because the binocular instrument 2300 is more likely to be held by the patient so the instrument axis between the two eyepieces is parallel to the patient's interpupillary axis, prescriptions to correct astigmatism are more likely to include accurate angles of the cylinder axis.
  • one side 2302 of the instrument 2300 includes the components described above, such as with respect to Fig. 15, and the other side 2304 of the instrument is essentially merely a hollow tube.
  • a patient is very likely to hold a binocular instrument to her face in a manner such that a vertical axis 2306 of the instrument is perpendicular to the patient's interpupillary axis, even if the patient leans her head to one side.
  • the side 2302 of the binocular instrument 2300 that includes the aberrometer may include a neutral density filter 1544 (Fig. 15) to reduce the amount of ambient light admitted into the instrument, as discussed above. Even without a neutral density filter 1544 in the "business" side 2302 of the binocular instrument 2300, the beamsplitter 1501, band-pass filter 1534, etc. attenuate light. Therefore, the other side 2304 of the binocular instrument 2300 should include a neutral density filter, so both eyes receive approximately equal amounts of light. Once the instrument 2300 has been used to measure one eye, the instrument 2300 can be turned upside down to measure the other eye.
  • the binocular instrument 2300 shown in Fig. 23 may alternatively be configured such that both sides 2302 and 2304 include most of the components described above, such as with respect to Fig. 15.
  • Such an embodiment can measure both eyes substantially simultaneously, without requiring the device to be turned up-side-down.
  • additional beamsplitters can be incorporated into a secondary channel to route the measuring light and wavefront sensor field of view to simultaneously image spot diagrams from both eyes.
  • the two portions 2302 and 2304 of the binocular instrument 2300 may be adjustably attached to each other by a dovetail or other sliding rail 2400, enabling distance 2402 between centers of the two eyepieces 1102 and 2404 to be adjusted to match a patient's interpupillary distance.
  • the interocular distance can be read from a scale 2406 using a pointer 2408.
  • a linear encoder 2410 and indicia 2412 are used to electronically measure the distance 2402.
  • the distance 2402 can be used as a parameter for constructing a pair of eyeglasses for the patient.
  • a worm gear or other suitable linear, angular or other adjustable link may be used.
  • the two portions 2302 and 2304 may be adjustably attached to each other by a pivot joint 2500.
  • a scale 2508 and pointer 2510 may be calibrated to indicate the distance 2402 or the angle 2502.
  • the distance 2402 can be calculated from the angle 2502 and known geometry of the instrument.
  • an angular encoder 2510 (shown in dashed line) is included in the pivot joint 2500.
  • the optical or ophthalmologic apparatus 1100 includes an analysis unit 1536 configured to analyze signals from the optical sensor 1532 to calculate x and y displacements of spots in a spot diagram, calculate a set of Zernike coefficients from the displacements and calculate a corrective lens prescription from the coefficients.
  • the analysis unit 1536 also controls operation of various components of the apparatus 1100.
  • Fig. 26 is a schematic block diagram of the analysis unit 1536. Similar analysis units may be used in other embodiments of the present invention.
  • the analysis unit 1536 includes a processor 2600 coupled to a memory 2602 via a computer bus 2604.
  • the processor 2600 executes instructions stored in the memory 2602. In so doing, the processor 2600 also fetches and stores data from and to the memory 2602.
  • Also connected to the computer bus 2604 are: an optical sensor interface 2606, a light source interface 2608, an iris/neutral density filter interface 2610, a computer network interface 2612, a range finder interface 2614, an audio/visual/haptic user interface 2616 and a user input interface 2618.
  • These interfaces 2606-2618 are controlled by the processor 2600 via the computer bus 2604, enabling the processor to send and/or receive data to and/or from respective components coupled to the interfaces 2606-2618, as well as control their operations.
  • the optical sensor interface 2606 is coupled to the optical sensor 1532 (Fig. 15) to receive data from the pixels, quadrant sectors or other elements of the optical sensor 1532.
  • the optical sensor 1532 is pixelated.
  • the optical sensor 1532 includes a rectangular array of quadrant sensors. In either case, the optical sensor 1532 provides data indicating intensity of illumination impinging on portions of the optical sensor 1532.
  • the processor 2600 uses this information to calculate locations of centroids of spots of a spot diagram and to calculate displacements of the centroids from locations where a perfect eye would cause the centroids to impinge on the optical sensor 1532.
  • Some optical or ophthalmologic apparatus embodiments, described below, include other or additional optical sensors, which are also coupled to the optical sensor interface 2606.
  • the light source interface 2608 is coupled to the visible light source 1508 (Fig. 15) and to the light source 1520 to control their operations, such as turning the light sources on and off and, in some embodiments, controlling intensities of light emitted by the light sources 1508 and 1520.
  • the light sources 1508 and 1520 include a respective array of individual light sources.
  • the light source interface 2608 may enable the processor to control each of the individual light sources separately.
  • the iris/neutral density filter interface 2610 is coupled to the adjustable iris diaphragm 1542 and/or the neutral density filter 1544 (Fig. 15) to enable the processor 2600 to control their operations.
  • the processor 2600 may send signals, via the interface 2610, to command the iris diaphragm 1542 to open or close to a specified size.
  • if the neutral density filter 1544 is adjustable, the processor 2600 may send signals, via the interface 2610, to command the neutral density filter 1544 to admit a specified portion of light.
  • the network interface 2612 includes a wired or wireless interface, such as a universal serial bus (USB) interface, a wired Ethernet interface, a Bluetooth interface, a wireless infrared (IR) interface, a wireless local area network (WLAN) interface or a wireless cellular data interface, by which the processor 2600 may communicate with another suitably equipped external device, such as a printer, a personal computer, a cell phone or smartphone, an automated lens grinder or an eyeglass order processing system.
  • the processor 2600 sends a prescription it has calculated to the external device, either directly or via a network, such as a local area network or a cellular carrier network.
  • the processor receives patient data, program updates, configuration information, etc.
  • the processor sends raw data from the optical sensor 1532, calculated spot diagram information, Zernike coefficients or other intermediate information to the external device, and the external device calculates the prescription.
  • the range finder interface 2614 is coupled to any range sensor 1517 in the ophthalmologic apparatus.
  • the audio/visual/haptic interface 2616 is coupled to any audio, visual and/or haptic output devices in the ophthalmologic apparatus.
  • the apparatus 1100 may provide an audible, visual, haptic or other warning if the distance 2000 (Fig. 20) between the apparatus 1900 and the wall 1514 is inappropriate.
  • this interface 2616 can be used to provide feedback about the alignment between the patient's eye and the optical axis of the device.
  • Suitable audio devices include beepers, loudspeakers, piezoelectric devices, etc.
  • Suitable visual devices include lights, liquid crystal display (LCD) screens, etc.
  • Suitable haptic devices include vibrators, refreshable braille displays, etc.
  • the user input interface 2618 is coupled to any user input devices in the ophthalmologic apparatus. Such input devices may, for example, be used to initiate a measurement of a patient's eye. Suitable user input devices include buttons, keys, triggers, touchscreens, tactile sensors, etc. An exemplary user interface 2352 is shown in Fig. 23-1.
  • One or more of the interfaces 2606-2618, the processor 2600, the memory 2602 and the computer bus 2604, or any portion thereof, may be replaced or augmented by a suitably programmed device such as a programmable logic device (PLD) 2626, field-programmable gate array (FPGA) 2620, digital signal processor (DSP) 2622, application-specific integrated circuit (ASIC) 2624, discrete logic or suitable circuit.
  • an array of spots (a spot diagram 710) is projected on the optical sensor 610. If the eye 1516 is aligned with the optical axis 1504 of the device 1100 as shown in Fig. 15, the spot diagram is centered on the optical sensor 1532. However, as schematically exemplified in Fig. 27, if the eye 1516 is slightly misaligned with the optical axis 1504, the spot diagram 2700 is not centered on the optical sensor 1532.
  • Fig. 28 is a schematic block diagram of an alignment feedback module 2800, according to several embodiments of the present invention.
  • the term "module” refers to one or more interconnected hardware components, one or more interconnected software components or a combination thereof.
  • the alignment feedback module 2800 may be implemented by any of the components discussed above, with respect to Fig. 26.
  • As can be seen in Fig. 9, the spots of the spot diagram 810 are generally not all of equal intensity.
  • intensity of each spot is schematically indicated by the diameter of the spot.
  • spot intensity decreases with radial distance from the center of the spot diagram 810.
  • Spot intensity distribution within the spot diagram is represented by a three- dimensional surface graph 920.
  • a spot diagram centroid and size calculator 2802 is coupled to the optical sensor 1532 to receive signals therefrom, such as the intensity of light detected by each pixel or each quadrant.
  • the spot diagram centroid and size calculator 2802 calculates size and location of the centroid of the entire spot diagram, such as its x and y or polar coordinates and the spot diagram diameter, on the optical sensor 1532.
  • the spot diagram centroid calculator 2802 may use any appropriate algorithm or method for determining the centroid and size. Many such algorithms and methods are well known.
  • an illumination-weighted average of the coordinates of the illuminated pixels is calculated, where each pixel's coordinates are weighted by the illumination level detected by the pixel. This information can also be used to determine the size of the spot diagram, e.g., the diameter of the spot diagram.
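  • A minimal sketch of such a centroid-and-size calculation is given below; the array handling, background threshold and diameter estimate are illustrative assumptions rather than the exact method used by the calculator 2802.

```python
import numpy as np

def spot_diagram_centroid_and_size(frame, threshold=10):
    """Estimate the centroid and approximate diameter of a spot diagram.

    frame     -- 2-D array of pixel intensities from the optical sensor
    threshold -- intensities at or below this value are treated as background
    """
    ys, xs = np.nonzero(frame > threshold)        # coordinates of illuminated pixels
    weights = frame[ys, xs].astype(float)         # weight each pixel by its intensity
    if weights.size == 0 or weights.sum() == 0:
        return None                               # no spot diagram detected in this frame
    cx = np.sum(xs * weights) / weights.sum()     # intensity-weighted x coordinate
    cy = np.sum(ys * weights) / weights.sum()     # intensity-weighted y coordinate
    # Crude size estimate (assumption): span of the illuminated region
    diameter = max(xs.max() - xs.min(), ys.max() - ys.min())
    return (cx, cy), diameter
```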
  • the spot diagram centroid calculator 2802 may use the portion 2900 of the spots to calculate a location within the spots 2900 and provide this location as the centroid of the spot diagram.
  • the shape of the portion of the spot diagram falling on the optical sensor 1532 can be also used to estimate the size of the spot diagram.
  • the curvature of the portion of the spot diagram falling on the optical sensor 1532 may be used to estimate the diameter of the spot diagram.
  • the curvature of the portion of the spot diagram falling on the optical sensor 1532 may be used to estimate the true center of the spot diagram, even if the center is not within the optical sensor 1532.
  • the spot diagram centroid calculator 2802 may generate an additional signal to indicate the true centroid of the spot diagram is off the optical sensor 1532.
  • a difference calculator 2804 calculates a difference between the location of the centroid of the spot diagram and the center location 2806 of the optical sensor 1532.
  • An output of the difference calculator 2804 represents a magnitude and direction 2808 of the displacement of the centroid of the spot diagram 2700 (Fig. 27) from the center of the optical sensor 1532. This magnitude and direction 2808 is fed to a feedback signal generator 2810.
  • the feedback signal generator 2810 generates an audio, visual, haptic and/or other output to the patient and/or an optional operator. Some embodiments include a loudspeaker, as exemplified by a loudspeaker 1546 (Fig. 15), and the feedback signal generator 2810 is coupled to the loudspeaker 1546. In some embodiments, the feedback signal generator 2810 generates audio signals, via the loudspeaker 1546, to indicate to the patient an extent of misalignment and/or a direction of the misalignment. In some such embodiments, a pitch or volume of a sound or a frequency of ticks (somewhat like a sound emitted by a Geiger counter) may represent how closely the eye is aligned to the optical axis.
  • a particular sound such as a beep
  • the feedback signal generator 2810 may include a speech synthesizer to generate synthetic speech that instructs the patient how to improve or maintain the alignment of the eye, for example, "Move the instrument up a little,” “Look a little to the left” or "Perfect. Don't move your eyes.”
  • the loudspeaker may also be used to play instructions for using the device. One important instruction is to ask the patient to blink. A fresh tear film is important for good measurement of the optical properties of the eye.
  • Some embodiments include visual indicators, such as arrows illuminated by LEDs, located in the eyepiece 1102, exit port 1104 or elsewhere in the instrument 1100.
  • Exemplary visual indicators 1548 and 1550 are shown in the eyepiece 1102 in Fig. 15.
  • the feedback signal generator 2810 may selectively illuminate one or more of these indicators 1548 and 1550 to represent a magnitude and direction in which the patient should adjust his gaze to better align his eye with the optical axis.
  • the housing 1500 includes an LCD display, and the feedback signal generator 2810 generates a display, schematically exemplified by display 3000 in Fig. 30.
  • the housing 1500 may include lights, such as LEDs, coupled to the feedback signal generator 2810 to indicate a relative direction, and optionally a relative distance, in which the instrument 1100 should be moved to improve alignment of the eye with the optical axis.
  • Some embodiments include haptic output devices that signal a patient with vibration along an axis to indicate the patient should shift his gaze or move the instrument 1100 in a direction along the axis of vibration.
  • the frequency of vibration may indicate an extent to which the patient should shift his gaze or move the instrument 1100.
  • the spot diagram centroid calculator 2802 calculates a centroid location only if at least a portion of the spot diagram falls on the optical sensor 1532; if the spot diagram falls entirely off the optical sensor 1532, no centroid can be calculated.
  • Some embodiments solve this problem by including an array of light sensors around the optical sensor 1532, as shown schematically in Fig. 31.
  • an array 3100 of light sensors, exemplified by light sensors 3102, 3104 and 3106, is arranged to largely surround the optical sensor 1532.
  • the light sensors 3102-3106 are shown the same size as the optical sensor 1532. However, the light sensors 3102-3106 may be smaller or larger than the optical sensor 1532. Each light sensor 3102-3106 has a single light-sensitive area. Thus, the light sensors 3102-3106 may be less expensive than the optical sensor 1532.
  • the light sensors 3102-3106 are coupled to the spot diagram centroid calculator 2802.
  • spot diagram centroid calculator 2802 may use signals from the array of light sensors 3100 to calculate at least an approximate location of the spot diagram 3108 and provide this as a simulated location of the centroid of the spot diagram.
  • the spot diagram centroid calculator 2802 simply returns one of several directions from the center of the optical sensor 1532, in which the spot diagram 3108 has fallen.
  • the number of possible directions may be equal to the number of light sensors 3102- 3106 in the array 3100.
  • the number of possible directions may be greater than the number of light sensors 3102-3106.
  • the spot diagram centroid calculator 2802 may calculate a direction by taking a weighted sum of the signals from the light sensors.
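  • The weighted-sum direction estimate might look like the following sketch; the sensor names, positions and example readings are hypothetical and only illustrate deriving a coarse direction from a ring of single-element sensors.

```python
import math

# Hypothetical positions (in sensor-plane units) of single-element light sensors
# arranged in a ring around the main optical sensor; values are illustrative only.
SENSOR_POSITIONS = {
    "top":    (0.0,  1.0),
    "right":  (1.0,  0.0),
    "bottom": (0.0, -1.0),
    "left":   (-1.0, 0.0),
}

def estimate_offset_direction(readings):
    """Weighted sum of peripheral sensor signals giving an approximate
    direction from the center of the optical sensor toward the spot diagram."""
    total = sum(readings.values())
    if total == 0:
        return None                       # spot diagram not detected by the ring
    x = sum(SENSOR_POSITIONS[name][0] * value for name, value in readings.items()) / total
    y = sum(SENSOR_POSITIONS[name][1] * value for name, value in readings.items()) / total
    return math.degrees(math.atan2(y, x))  # direction, in degrees, of the displacement

# Example: mostly the right-hand sensor is illuminated, so the spot diagram
# has fallen off the right edge of the optical sensor.
print(estimate_offset_direction({"top": 5, "right": 80, "bottom": 3, "left": 1}))
```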
  • the spot diagram centroid calculator 2802 may generate an additional signal to indicate the true centroid of the spot diagram is off the optical sensor 1532.
  • the light sensors 3102-3106 are used in one square ring around the optical sensor 1532.
  • other numbers of light sensors and/or other number of concentric rings and/or other shaped rings may be used.
  • the number of light sensors and/or rings may be selected based on a desired resolution of the direction and/or distance to the spot diagram.
  • an additional beamsplitter 3200 directs a portion of the optical signal 1526 from the eye 1516 to a quadrant photodiode detector 3202.
  • the quadrant photodiode detector 3202 is coupled to the spot diagram centroid calculator 2802.
  • Fig. 33 is a plan view of the quadrant photodiode detector 3202, including a hypothetical spot diagram 3300 projected thereon.
  • the quadrant photodiode detector 3202 can be any size, relative to the optical sensor 1532.
  • a demagnifying lens 3204 interposed between the beamsplitter 3200 and the quadrant photodiode detector 3202 enables using a relatively small and inexpensive detector to detect locations of the spot diagrams over a relatively large area. Operation of the spot diagram centroid calculator 2802 in such embodiments is similar to the operation described above, with respect to Fig. 31.
  • any other suitable sensor may be used, such as a position-sensitive detector (PSD) or a multi-element camera array.
  • a detector with another number of sectors may be used. The number of sectors may be selected based on a desired resolution with which the location of the spot diagram is to be ascertained.
  • feedback to the patient about misalignment of the patient's eye to the optical axis is provided by changing the location where the spot 1506 (Fig. 15) is projected on the wall 1514.
  • the visible light source 1508 is steerable, such as by a pan and tilt head (not shown) driven by the light source interface 2608 (Fig. 26) or by an array of visible light sources driven by the light source interface 2608. If the patient's eye is not properly aligned with the optical axis 1504 of the instrument 1100, the location of the spot 1506 is changed in a direction and by a distance that correspond to the direction and magnitude of the misalignment. Note that consequently the spot 1506 may no longer be along the optical axis 1504.
  • the patient is subtly directed to redirect her gaze toward the new location of the spot 1506, thereby improving alignment of her eye with the optical axis 1504.
  • the optical axis 1504 of the instrument 1100 is not changed. Only the location where the spot 1506 is projected changes.
  • Fig. 34 is a schematic plan view of an exemplary array 3400 of visible light sources exemplified by visible light sources 3402, 3404 and 3406.
  • Each of the visible light sources 3402- 3406 is disposed so as to project the beam of light 1510 (Fig. 15) along a slightly different axis, thereby illuminating the spot 1506 on a slightly different location on the wall 1514.
  • the embodiment shown in Fig. 34 includes 25 visible light sources 3402-3406. However, other numbers of visible light sources and their spacings may be used, depending on a desired granularity and range of control over location of the spot 1506 on the wall 1514.
  • the feedback signal generator 2810 sends a signal to the light source interface 2608 to control which of the individual visible light sources 3402-3406 projects the spot 1506.
  • a central visible light source 3408 is disposed where a single visible light source 1508 would otherwise be disposed, so as to project the spot 1506 along the optical axis 1504.
  • This light source 3408 is used to initially illuminate the spot 1506 on the wall 1514.
  • the spot diagram centroid location calculator 2802 ascertains that the patient's eye is not aligned with the optical axis 1504
  • the magnitude and direction of misalignment signal 2808 causes the feedback signal generator 2810 to extinguish the visible light source 3408 and illuminate a different light source of the array of visible light sources 3400.
  • the feedback signal generator 2810 selects one of the visible light sources 3402-3406 located a direction and distance from the central visible light source 3408 corresponding to the direction and magnitude signal 2808.
  • the location of the virtual light source 1525 (Fig. 15) within the patient's eye is changed so as to automatically generate a spot diagram that is better centered on the optical sensor 1532.
  • the light source 1520 is steerable, such as by a pan and tilt head (not shown) driven by the light source interface 2608 (Fig. 26) or by an array of light sources driven by the light source interface 2608.
  • the location of the virtual light source 1525 is changed in a direction and by a distance that correspond to the direction and magnitude of the misalignment. Note that consequently the virtual light source 1525 may no longer be along the optical axis 1504. As a result, the spot diagram falls on a different location on the optical sensor 1532, closer to the center of the optical sensor 1532, without any action by the patient. The optical axis 1504 of the instrument 1100 is not changed. Only the location where the spot diagram falls on the optical sensor 1532 changes.
  • Fig. 35 is a schematic plan view of an exemplary array 3500 of light sources exemplified by light sources 3502, 3504 and 3506.
  • Each of the light sources 3502-3506 is disposed so as to project the beam of light 1522 (Fig. 15) along a slightly different axis, thereby creating the virtual light source 1525 at a slightly different location on the retina of the eye 1516.
  • the embodiment shown in Fig. 35 includes 25 light sources 3502-3506. However, other numbers of light sources and their spacings may be used, depending on a desired granularity and range of control over location of the virtual light source 1525 on the retina of the eye 1516.
  • the feedback signal generator 2810 sends a signal to the light source interface 2608 to control which of the individual light sources 3502-3506 projects the virtual light source 1525.
  • a central light source 3508 is disposed where a single light source 1520 would otherwise be disposed, so as to project the virtual light source 1525 along the optical axis 1504. This light source 3508 is used to initially illuminate the virtual light source 1525 on the retina of the eye 1516.
  • the magnitude and direction of misalignment signal 2808 causes the feedback signal generator 2810 to extinguish the light source 3508 and illuminate a different light source of the array of light sources 3500.
  • the feedback signal generator 2810 selects one of the light sources 3502-3506 located a direction and distance from the central light source 3508 corresponding to the direction and magnitude signal 2808.
  • Automatically determine whether an eye is accommodating
  • the open view design described herein encourages patients not to accommodate, at least in part because the patients know a spot being projected on a wall is far away. Nevertheless, a patient may at times accommodate while her eye is being measured. Accommodation introduces an uncontrolled variable into the prescription measurement process, because a corrective eyeglass prescription should be calculated based on wavefronts emanating from an unaccommodated eye. To avoid this problem, embodiments of the present invention automatically ascertain when a patient is not accommodating and use wavefront data from such periods to calculate a prescription.
  • a spot diagram generated by wavefront aberrometry can be used to calculate a corrective lens prescription.
  • embodiments of the present invention capture video data, i.e., a series of time spaced-apart frames, rather than one or a small number of single arbitrarily-timed images.
  • the video frame rate may be constant or variable.
  • the frame rate may be adjusted in real time, from frame to frame, based on characteristics of the spot diagram imaged by the optical sensor 1532 (Fig. 15), such as overall illumination and percent of saturated pixels in a given frame.
  • the frame rate may vary from about 6 frames per second to more than 15 frames per second.
  • the inter-frame time is relatively short, on the order of about 1/10 second; thus, we refer to the video frames as being "continuous."
  • the video data is captured from the optical sensor 1532 (Fig. 15) and stored in the memory 2602 (Fig. 26) for processing.
  • Each frame of the video includes an image captured by the optical sensor 1532, an associated frame number and, if the frame rate is not constant, an associated time at which the frame was captured.
  • a prescription can be calculated from each frame.
  • An aberration profile which may be described by a refractive prescription, a set of Zernike coefficients, or some other representation calculated from a frame is referred to herein as a "candidate prescription," because some frames include noise, incomplete spot diagrams, no spot diagram or are otherwise undesirable for prescription calculation.
  • a candidate prescription is calculated for each frame and stored in the memory 2602.
  • the candidate prescription calculations may be performed after the last frame of the video has been captured, or the calculations may overlap in time with the video capture and storage. If sufficient computing power is available, a candidate prescription may be calculated for each frame, after the frame has been captured, but before the successive frame is captured. In the latter case, in some embodiments, the raw video data is not stored in the memory 2602.
  • Fig. 36 is a schematic block diagram of an unaccommodation detector module 3600, according to an embodiment of the present invention.
  • a prescription calculator 3602 receives signals from the optical sensor 1532 and calculates a candidate prescription from the signals, as described herein.
  • a prescription typically includes at least a spherical component and one or two cylindrical components.
  • the spherical component is described in terms of the optical power, positive or negative, and the cylindrical component is described in terms of powers and axes or equivalent terms (e.g., power vector notation).
  • a prescription may also include additional lens specifications to correct higher order aberrations.
  • the prescription calculator 3602 outputs a set of individual lens specifications, such as sphere 3604, cylinder-1 3605, axis-1 3606, etc.
  • other information 3607 such as spot size, is also output.
  • the outputs are collectively referred to as a candidate prescription 3612.
  • Each candidate prescription 3612 is stored in the memory 2602 (Fig. 26), along with an identification of from which video frame the candidate prescription was calculated or the relative time 3608 at which the frame was taken by the optical sensor 1532.
  • the sphere 3604, cylinder-1 3605, axis-1 3606, etc. can be prescription data calculated using various Zernike modes, such as M, J0 and J45, obtained using various orders of Zernike information.
  • a normally-open switch 3620 closes after all the frames have been acquired.
  • the spot diagram size is related to a quality metric, and it gives some information about the prescription. Assuming a constant pupil size, if the eye is emmetropic, the spot diagram size is equal to the pupil size. However, if the eye is myopic, the spot diagram is smaller than the pupil size. The higher the myopia, the smaller the spot diagram. On the other hand, if the eye is hyperopic, the spot diagram diameter is bigger than the pupil size. The more hyperopic the eye, the bigger the spot diagram.
  • the spot diagram size also changes with accommodation.
  • the instrument can detect changes in spot diagram size.
  • the spot diagram size is related to pupil size, and pupil size is related to amount of light received by the eye. In darker environments, the pupil automatically becomes larger.
  • the instrument can use pupil size, as estimated from spot diagram size, to track external conditions, such as a change in ambient light in the room while the patient was being measured.
  • the size of the spot diagram can be related to a quality metric.
  • spot diagram size can be used to calculate the pupil size.
  • the pupil size is important to measure the aberrations, because the aberration profile is associated with a specific aperture diameter.
  • Fig. 37 contains a graph 3700 of spherical and cylindrical power candidate prescriptions calculated from a hypothetical patient. Open circles represent spherical candidate prescriptions, and crossed circles represent cylindrical candidate prescriptions. The vertical axis indicates power in diopters (D), and the horizontal axis indicates time at which the candidate prescriptions were calculated. The frames were captured at approximately 10 frames per second.
  • Fig. 37 shows the patient's candidate spherical prescription varying over time, starting at about -2.25 D at time 1. Starting at about time 175, the candidate spherical prescription increases from about -1.7 D to about +0.4 D at about time 240. After about time 240, the candidate spherical prescription decreases.
  • a candidate spherical prescription calculated while the eye is unaccommodated should be different from any candidate spherical prescription calculated while the eye is accommodated, because an unaccommodated crystalline lens provides different optical power than an accommodated crystalline lens and, therefore, requires a different correction.
  • cylindrical correction does not vary significantly with the amount of accommodation, so if variations in the cylindrical components of the prescriptions are found, these are in general indicative of undesirable movements of the patient during the test.
  • candidate spherical prescriptions around 0 D (such as 3702) should be closer to the correct prescription for the patient, because those are the greatest candidate spherical prescriptions calculated.
  • embodiments of the present invention do not necessarily accept the greatest candidate spherical prescription as the correct prescription, because a candidate prescription may be a result of noise, and the magnitude and direction of the accommodation may depend on other factors, such as the actual refractive error of the patient.
  • in the example of Fig. 37, neither candidate spherical prescription 3702 nor 3704 is accepted, because an eye cannot change accommodation quickly enough to have yielded either candidate spherical prescription 3702 or 3704.
  • the literature reports a maximum accommodation rate of about 1-2 diopters per second in a human eye.
  • a bar 3706 indicates an approximate amount of time required by an eye to change accommodation from a close to a distant object.
  • candidate spherical prescriptions 3702 and 3704 would have required the eye to change accommodations much more quickly than is physiologically common. Furthermore, an eye changes accommodation continuously, as the crystalline lens changes shape. Thus, candidate prescriptions with no nearby candidate prescriptions are very likely a result of noise. For instance, in the case of a myopic eye with a small pupil, there will be very few spots composing the spot diagram. The errors in determining the centroids of these spots from specular noise can then cause large errors in the calculation of the Zernike coefficients corresponding to the defocus of the eye.
  • the normally-open switch 3620 is closed, and the candidate spherical prescription 3604 is fed to a low-pass filter 3614 in an accommodation filter 3622 (accommodation filter 3810 in Fig. 38) to remove candidate spherical prescriptions that are radically different than surrounding candidate spherical prescriptions, i.e., where the absolute slope of the candidate spherical prescription is greater than a predetermined value.
  • an instantaneous slope in the candidate spherical prescription signal 3604 greater than about ±1 diopter per second triggers rejection of a candidate spherical prescription.
  • Smoothed candidate spherical prescriptions, i.e., candidate spherical prescriptions that pass through the low-pass filter 3614, are processed according to accommodation correction rules 3624.
  • the rules 3624 select the greatest candidate spherical prescription. In the graph of Fig. 37, candidate spherical prescription 3708 would be selected by the rules 3624.
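  • A simple sketch of this filtering and selection, assuming a rejection slope of about 1 diopter per second and selection of the greatest surviving candidate, is shown below; it illustrates the idea rather than the exact behavior of the filter 3614 and rules 3624.

```python
def select_unaccommodated_sphere(times, spheres, max_slope=1.0):
    """Reject candidate spherical prescriptions that change faster than a
    physiologically plausible rate (about 1 diopter per second), then return
    the greatest remaining candidate as the unaccommodated value.

    times   -- acquisition time of each frame, in seconds
    spheres -- candidate spherical prescription of each frame, in diopters
    """
    kept = []
    for i in range(len(spheres)):
        plausible = True
        for j in (i - 1, i + 1):                      # compare with neighboring frames
            if 0 <= j < len(spheres):
                dt = abs(times[i] - times[j]) or 1e-9
                if abs(spheres[i] - spheres[j]) / dt > max_slope:
                    plausible = False                 # jump too fast for a real eye: noise
        if plausible:
            kept.append(i)
    if not kept:
        return None
    best = max(kept, key=lambda i: spheres[i])        # greatest candidate, e.g. 3708 in Fig. 37
    return best, spheres[best]

# Hypothetical usage: frame times in seconds and per-frame spherical candidates
print(select_unaccommodated_sphere([0.0, 0.1, 0.2, 0.3], [-2.2, -2.1, 0.5, -2.0]))
```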
  • other selection criteria, machine learning or other mechanism may be used to process the candidate prescriptions to arrive at a prescription.
  • other portions of the candidate prescription 3612 or other information such as spot diagram size as a function of time, may also be used.
  • the frame number or time associated with the candidate spherical prescription selected by the rules 3624 is used to select a candidate prescription stored in the memory 2602, i.e., the other candidate prescription parameters calculated from the same frame as the candidate spherical prescription detected by the rules 3624.
  • the selected candidate prescription 3618 is reported as a prescription for the patient or fed to another module.
  • more than one candidate spherical prescription may be deemed to have been calculated from frames captured while the patient's eye was not accommodated. For example, all candidate spherical prescriptions within a predetermined range of the candidate spherical prescription detected by the rules 3624, as described above, may be deemed to have been calculated from unaccommodated eye data.
  • the rules 3624 store information in the memory 2602 identifying candidate prescriptions that were calculated from unaccommodated eye data.
  • Some embodiments provide feedback to the patient when a peak candidate spherical prescription has been detected or when no such peak has been detected within a predetermined amount of time after commencing collecting data from the optical sensor 1532.
  • This feedback may be in the form of audio, visual, haptic or other feedback, along the lines described above, with respect to Fig. 28.
  • In a data acquisition module 3800, data is acquired from the image sensor 1532 (Fig. 15). Each frame is acquired according to image sensor settings, including exposure time and frame rate. These settings may be adjusted on a frame-by-frame basis, with a goal of acquiring frames with good signal-to-noise ratios. In general, frames with bright spots in their spot diagrams have better signal-to-noise than frames with dim spots, although a large number of spots that are saturated is undesirable. "Saturated" means a brightness value of a pixel is equal to the maximum value possible for the pixel. Alternatively, module 3800 may process frames that were acquired earlier and are stored in memory 2602.
  • if more than a predetermined fraction of the pixels in a frame are saturated, the exposure time of the next frame is reduced.
  • the fraction may be expressed as a percentage, for instance 0.1% of all the pixels in the sensor should be saturated. This fraction can vary based on the size of the pupil and the average size of the spots comprising the spot diagram. In addition, this fraction may be set based on characteristics of the image sensor 1532 and the light source 1520.
  • if fewer than the predetermined fraction of the pixels are saturated, the exposure time of the next frame is increased. However, the exposure time should not be increased to a value that might cause motion blur as a result of the eye moving. Thus, a maximum exposure time can be ascertained, based on the size of the optical sensor 1532 and the number of pixels or quadrants it contains.
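  • One way such per-frame exposure control might be expressed is sketched below; the saturation value, target fraction, adjustment step and exposure cap are illustrative assumptions, not values from the source.

```python
import numpy as np

def adjust_exposure(frame, exposure_ms, saturated_value=255,
                    target_fraction=0.001, step=1.25, max_exposure_ms=50.0):
    """Per-frame exposure adjustment sketch.

    If more than `target_fraction` of pixels are saturated, shorten the next
    exposure; if essentially none are saturated, lengthen it, but never beyond
    `max_exposure_ms` (to limit motion blur from eye movement).
    """
    saturated_fraction = np.mean(frame >= saturated_value)
    if saturated_fraction > target_fraction:
        exposure_ms /= step                      # too many saturated pixels: reduce exposure
    elif saturated_fraction == 0:
        exposure_ms = min(exposure_ms * step, max_exposure_ms)  # dim frame: increase exposure
    return exposure_ms
```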
  • Outputs from the data acquisition module 3800 are summarized in Table 1.
  • Table 1 — Image sensor settings: exposure time and frame rate
  • the patient may be instructed to adjust the position of the instrument 1100, relative to the patient's eye, so the patient perceives a red dot at maximum brightness.
  • the instrument 1100 (Fig. 15) is well oriented, relative to the patient's eye socket.
  • the patient's eye can still move within the eye socket. That is, the patient can look up, down, left and right.
  • the center of the eye's field of view may not be aligned with the optical axis 1504 of the instrument 1100, and the spot diagram may not be centered on the optical sensor 1532, or the spot diagram may be completely off the optical sensor 1532.
  • the patient might blink.
  • the signal reaching the optical sensor 1532 may be from a reflection from the eye's cornea, rather than from the virtual light source on the eye's retina. Thus, some frames may not contain useful information.
  • a frame selector 3802 retains only frames that may contain useful information.
  • An objective of the frame selector 3802 is to ensure raw data used to calculate a prescription is as good as possible.
  • the frame selector 3802 may discard frames, as summarized in Table 2. For example, successive frames in which the diameter of the spot diagram varies from frame to frame by more than a predetermined amount may be discarded.
  • the frame selector 3802 tags the frames, such as "valid," "incomplete" or possibly "discarded."
  • the tags may be represented by codes stored in the memory 2602 in association with data representing brightness values of the pixels of the frames or prescriptions calculated from the frames.
  • Table 2 — Frames discarded by the frame selector module: patient blinked (no spot diagram)
  • Fig. 39 is a schematic diagram of a complete spot diagram.
  • Fig. 40 is a schematic diagram of a partial spot diagram, i.e., a spot diagram in which a portion of the spot diagram falls off the optical sensor 1532.
  • the spot diagrams were captured by a prototype instrument as described herein.
  • the frame selector 3802 may distinguish these two types of frames from each other by various techniques. For example, the frame selector 3802 may ascertain a shape of the spot diagram. If the spot diagram is approximately circular or elliptical and complete, the frame may be deemed to contain a complete spot diagram, and the frame may be accepted and tagged as "valid." The frame selector 3802 may also calculate the location of the center of the spot diagram.
  • the frame selector 3802 may also calculate or estimate what fraction of the spot diagram falls on the optical sensor 1532. As will be discussed below, incomplete spot diagrams may be used in some prescription calculations.
  • Fig. 41 is a schematic diagram of a frame from the optical sensor 1532 with no spot diagram, such as a result of a patient blink or gross misalignment of the patient's eye with the optical axis 1504 of the instrument 1100.
  • the frame selector 3802 may detect this type of frame by summing or integrating all the pixels of the frame. If the sum or integral is less than a predetermined value, indicating few or no spots of a spot diagram are present in the frame, the frame selector 3802 may discard the frame.
  • Fig. 42 is a schematic diagram of a frame from the optical sensor 1532 containing a corneal reflection.
  • the frame was captured by a prototype instrument as described herein.
  • the frame selector 3802 may identify such a frame, based on several factors. For example, if the image contains more spots than lenses in the lenslet array 1530, the frame selector 3802 may discard the frame.
  • the frame selector 3802 may sum or integrate all the pixels of the frame. If the sum or integral is greater than a predetermined value, indicating too many spots for a spot diagram are present in the frame, the frame selector 3802 may discard the frame. Frames discarded by the frame selector 3802 may be stored in the memory, but tagged "discarded.”
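  • A rough sketch of these discard rules follows; the thresholds and the edge-based completeness test are assumptions chosen only to illustrate the blink, corneal-reflection, valid and incomplete cases.

```python
import numpy as np

def classify_frame(frame, min_total, max_total, threshold=10):
    """Rough frame-selector sketch; thresholds are illustrative assumptions.

    Returns "discarded" for blink frames (almost no light) or frames dominated
    by corneal reflections (far too much light), "incomplete" when the spot
    diagram appears to run off the sensor, and "valid" otherwise.
    """
    total = int(frame.sum(dtype=np.int64))
    if total < min_total:
        return "discarded"            # patient blinked: no spot diagram
    if total > max_total:
        return "discarded"            # far too much light: likely a corneal reflection
    lit = frame > threshold
    # Crude completeness test (assumption): if illuminated pixels touch the sensor
    # border, part of the spot diagram has probably fallen off the sensor.
    touches_edge = lit[0, :].any() or lit[-1, :].any() or lit[:, 0].any() or lit[:, -1].any()
    return "incomplete" if touches_edge else "valid"
```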
  • Table 3 summarizes outputs from the frame selector 3802 module.
  • Table 3 — Image sensor settings: exposure time and frame rate
  • several consecutive frames may be combined to obtain a single frame with a better signal-to-noise ratio than each of the consecutive frames.
  • a low-cost light source 1520 (Fig. 15) is used to create the virtual light source 1525 in the patient's eye 1516
  • the images acquired by the optical sensor 1532 may include significant speckle noise. Speckle noise may result from path length differences between points within the virtual light source 1525 and the optical sensor 1532. These path length differences cause random variations in intensity due to mutual interference from several wavefronts emanating from the points within the virtual light source 1525.
  • intraocular fluid such as vitreous humor
  • flow of the vitreous humor may randomize path lengths on the time scale of the frames and, therefore, reduce speckle noise.
  • combining several frames can improve the signal-to-noise by averaging the speckle noise.
  • a frame combiner 3804 receives output from the frame selector module 3802, and optionally from the prescription calculator 3806, and outputs a single combined frame.
  • the frame combiner 3804 may combine only consecutive frames that are tagged "valid.”
  • the frame combiner 3804 may combine consecutive frames that are tagged "valid” or "incomplete.”
  • the frame combiner 3804 may combine non-consecutive frames, based on the prescription information provided by the prescription calculator 3806.
  • the frame combiner 3804 registers the frames that are to be combined, so corresponding spots of the spot diagram register with each other.
  • a non-deforming (rigid) registration process should be used, so as not to alter the shape of the spot diagram.
  • the frames may be summed or averaged. That is, the intensities recorded by corresponding pixels in each summed frame are added or averaged.
  • the exposure time for the spot diagram should be revised by summing the exposure times of the frames that were combined. It should also be taken into account at this stage that only frames that are close in time (i.e., consecutive frames in which the eye had no time to accommodate) may be combined, since accommodation can cause the combination of frames with different prescriptions, leading to incorrect results.
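  • A minimal sketch of such frame combination, assuming integer-pixel rigid registration derived from per-frame spot diagram centroids, is shown below; the registration method and parameter names are illustrative rather than the frame combiner's exact algorithm.

```python
import numpy as np

def combine_frames(frames, centroids, exposures_ms):
    """Rigidly register and combine consecutive frames to reduce speckle noise.

    frames       -- list of 2-D intensity arrays judged suitable for combining
    centroids    -- per-frame spot-diagram centroid (x, y); used here to derive
                    integer translations (an illustrative registration choice)
    exposures_ms -- exposure time of each frame, in milliseconds
    """
    ref_x, ref_y = centroids[0]
    registered = []
    for frame, (cx, cy) in zip(frames, centroids):
        dx = int(round(ref_x - cx))
        dy = int(round(ref_y - cy))
        # Rigid (non-deforming) registration: translate the whole frame so the
        # spot diagrams overlap; the spot diagram's shape is not altered.
        registered.append(np.roll(np.roll(frame, dy, axis=0), dx, axis=1))
    combined = np.sum(registered, axis=0)            # sum (or average) corresponding pixels
    effective_exposure = float(sum(exposures_ms))    # revised exposure time of the result
    return combined, effective_exposure
```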
  • Figs. 43- 46 are schematic diagrams of a set of frames from the optical sensor 1532 containing a sequence of images acquired as an eye slowly moved, creating a set of spot diagrams that move from left to right. The frames were captured by a prototype instrument as described herein. The spot diagrams in Figs. 43-45 are tagged incomplete, and the spot diagram in Fig. 46 is tagged valid. Essentially the same procedure as described above for combining frames may be used for combining the frames represented by Figs. 43-46.
  • some spots in the resulting combined spot diagram are formed by adding or averaging a different number of contributing spots than other combined spots.
  • some spots are not included in the spot diagram of Fig. 43, because these spots fall off the left side of the optical sensor 1532. These spots appear in subsequent frames, as the spot diagram moves to the right. Therefore, these spots have fewer contributions to their sum or average. Thus, these spots likely have worse signal-to-noise ratios than spots that appear in each of Figs. 43-46.
  • a low-pass filter may be used to smooth each frame that is to be combined, in order to calculate registration parameters, such as displacements to apply to the frame images to register them to a target reference.
  • the low-pass filter is used to calculate the registration parameters. Once the registration parameters have been calculated, the registration displacements are applied to the original frames, not to the filtered frames.
  • Characteristics of the low-pass filter may be determined empirically, given characteristics of the light source 1520 (Fig. 15) and characteristics of the lenslet array 1530. Characteristics of the low-pass filter relate to size of the speckle, which is related to the diffraction limit of the lenslet array 1530. Calibrations related to misalignment of different components within the device 1100 should be applied before the registration process. Outputs from the frame combiner 3804 are summarized in Table 4.
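A minimal sketch of this combining step is shown below, assuming NumPy/SciPy, an FFT cross-correlation for the rigid registration, and Gaussian smoothing standing in for the empirically determined low-pass filter; all parameter values and function names are placeholders rather than details of the device 1100.

    import numpy as np
    from scipy import ndimage

    def estimate_shift(reference, frame, smooth_sigma=3.0):
        """Estimate the rigid (dy, dx) translation that registers `frame` to `reference`.
        Both images are low-pass filtered before the estimate so speckle does not
        dominate; the shift is later applied to the original, unfiltered frame."""
        ref = ndimage.gaussian_filter(reference.astype(float), smooth_sigma)
        img = ndimage.gaussian_filter(frame.astype(float), smooth_sigma)
        corr = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img)))
        peak = list(np.unravel_index(np.argmax(np.abs(corr)), corr.shape))
        for axis, size in enumerate(corr.shape):
            if peak[axis] > size // 2:      # unwrap to negative shifts
                peak[axis] -= size
        return peak

    def combine_frames(frames, exposure_times):
        """Register consecutive frames to the first one, average them, and return
        the combined frame together with the summed exposure time."""
        reference = frames[0].astype(float)
        accum = reference.copy()
        for frame in frames[1:]:
            dy, dx = estimate_shift(reference, frame)
            accum += ndimage.shift(frame.astype(float), (dy, dx), order=1)
        return accum / len(frames), sum(exposure_times)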
  • a prescription calculator module 3806 calculates a prescription from each frame.
  • Fig. 47 is a schematic diagram of a hypothetical frame from the optical sensor 1532 containing a complete spot diagram. An "X" indicates the centroid of the spot diagram. Crosses indicate centroid locations for spots, where they would appear for a perfect eye. As evident from the figure, many spots of the spot diagram are displaced from these crosses.
  • the spot diagram is generated when a wavefront impinges on an array of lenslets.
  • a slope of the wavefront at each sample point (lens of the lenslet array) is calculated.
  • a displacement (Δx and Δy) of each spot of the spot diagram is calculated, relative to the location of a spot from a perfect eye, as exemplified in Fig. 48. Given the focal length of the lenslet array, the slopes can be calculated from the displacements.
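For example, with a known lenslet focal length and detector pixel pitch (the numeric defaults below are placeholders, not the instrument's actual values), the local slopes could be recovered from the measured displacements along these lines:

    def wavefront_slopes(displacements_px, pixel_pitch_mm=0.005, focal_length_mm=5.0):
        """Convert per-lenslet spot displacements (dx, dy) in pixels into local
        wavefront slopes: slope = displacement / lenslet focal length."""
        slopes = []
        for dx_px, dy_px in displacements_px:
            dx_mm = dx_px * pixel_pitch_mm
            dy_mm = dy_px * pixel_pitch_mm
            slopes.append((dx_mm / focal_length_mm, dy_mm / focal_length_mm))
        return slopes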
  • W_j is the coefficient of the Z_j mode in the expansion.
  • W_j is equal to the RMS wavefront error for that mode.
  • the Zernike coefficients are used to calculate a prescription. Because the Zernike expansion employs an orthonormal set of basis functions, the least-squares solution is given by the second-order Zernike coefficients, regardless of the value of the other coefficients. These second-order Zernike coefficients can be converted to a sphero-cylindrical prescription in power vector notation using the following or other well-known equations:
  • c_n^m is the nth order Zernike coefficient, where n is the radial order and m is the azimuthal frequency
  • r is pupil radius
  • the power vector notation is a cross-cylinder convention that is easily transposed into conventional formats used by clinicians.
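A sketch of this conversion is shown below, using the widely published power-vector relations; it assumes the second-order coefficients are expressed in micrometres and the pupil radius in millimetres (so the results come out in diopters), and the function and argument names are illustrative only.

    import math

    def power_vector_from_zernike(c2_m2, c2_0, c2_p2, pupil_radius_mm):
        """Second-order Zernike coefficients (OSA/ANSI normalization, micrometres)
        converted to the power vector (M, J0, J45) in diopters."""
        r2 = pupil_radius_mm ** 2
        M = -4.0 * math.sqrt(3.0) * c2_0 / r2      # spherical equivalent
        J0 = -2.0 * math.sqrt(6.0) * c2_p2 / r2    # with/against-the-rule astigmatism
        J45 = -2.0 * math.sqrt(6.0) * c2_m2 / r2   # oblique astigmatism
        return M, J0, J45

    def sphero_cylinder(M, J0, J45):
        """Power vector converted to conventional sphere / cylinder / axis
        (minus-cylinder convention)."""
        cyl = -2.0 * math.sqrt(J0 ** 2 + J45 ** 2)
        sph = M - cyl / 2.0
        axis = 0.5 * math.degrees(math.atan2(J45, J0))
        if axis < 0:
            axis += 180.0
        return sph, cyl, axis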
  • Image sensor settings: exposure time and frame rate
  • One or several prescriptions in the power vector domain (PWV) (M, J0 and J45), or in another domain such as optometric, for each spot diagram (frame)
  • the system can provide more than one prescription.
  • one prescription may be calculated with two Zernike orders, i.e., with no high-order aberrations, and other prescriptions may be calculated using high-order aberrations, such as Zernike orders 4 or 6.
  • information about the prescription may be provided by the prescription calculator 3806 to the frame combiner 3804.
  • the frame combiner 3804 may use this information to determine how to combine frames.
  • quality metrics may be calculated for each calculated prescription by a quality metric calculator 3808.
  • the quality metrics may be used to weight the prescription calculated from each frame or frame combination to calculate a final prescription.
  • the quality metrics may be as simple as a binary value, for example "0" for "bad” and "1" for "good.” More complex quality metrics may fall within a range, such as a real number between 0.0 and 1.0.
  • the quality metrics may be based on, for example, the number of frames, signal-to-noise ratio of the spot diagram, number of spots in the spot diagram, sharpness of the points in the spot diagram and absence, or small values, of high-order Zernike coefficients, or combinations thereof.
  • the signal-to-noise ratio of a frame may, for example, be calculated by dividing the mean pixel value of spots in the spot diagram by the mean pixel value of background, i.e., an area outside the spot diagram.
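One possible way to turn that ratio into a bounded quality metric is sketched below; the mask construction and the floor/ceiling values are assumptions for illustration.

    import numpy as np

    def snr_quality(frame, spot_mask, snr_floor=1.5, snr_ceiling=10.0):
        """Quality metric in [0.0, 1.0] derived from the spot-diagram signal-to-noise
        ratio.  `spot_mask` is a boolean array marking pixels that belong to spots."""
        signal = frame[spot_mask].mean()
        background = frame[~spot_mask].mean()
        snr = signal / max(background, 1e-9)
        # Linear ramp between an unusable SNR and an excellent one.
        return float(np.clip((snr - snr_floor) / (snr_ceiling - snr_floor), 0.0, 1.0))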
  • Image sensor settings: exposure time and frame rate
  • an accommodation filter module 3810 selects frames captured when the patient is not accommodating.
  • the amount by which a human eye can accommodate varies with age of a patient, as summarized in a graph in Fig. 49.
  • Embodiments of the present invention input the age of each patient, such as via a numeric keyboard or up/down arrow buttons coupled to a numeric display that increase or decrease a displayed age value as the arrow buttons are pressed.
  • the accommodation filter 3810 discards frames that evidence changes in accommodation faster than the patient should be able to accommodate, given the patient's age.
  • the accommodation filter 3810 includes a variable low-pass filter whose characteristics are controlled by the expected maximum accommodation rate. The low-pass filter operates on the M (spherical error) portion of the prescription.
  • Other embodiments employ fixed accommodation rate limits, such as about 1 to 2 diopters per second, independent of the patient's age.
  • a change in the calculated defocus term (or M in PWV notation) that occurs faster than the fixed accommodation rate limit is considered noise and is not included in determining the final prescription.
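A minimal sketch of such a rate-limit check is given below, assuming per-frame timestamps and M values are already available; the 1.5 diopter-per-second default is just a placeholder within the range mentioned above.

    def accommodation_filter(m_values, timestamps, max_rate_d_per_s=1.5):
        """Return a parallel list of booleans: True means the frame is kept for the
        spherical (M) calculation; False means the M change from the previous frame
        was faster than a plausible accommodation rate and is treated as noise."""
        keep = [True]
        for i in range(1, len(m_values)):
            dt = max(timestamps[i] - timestamps[i - 1], 1e-6)
            rate = abs(m_values[i] - m_values[i - 1]) / dt
            keep.append(rate <= max_rate_d_per_s)
        return keep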
  • Fig. 50 is a graph of a set of M, J0 and J45 prescriptions calculated by a prototype instrument, as described herein.
  • a dark line is added to show M values after processing by the accommodation filter 3810.
  • the accommodation filter 3810 selects frames acquired during these times and discards other frames, for the purpose of calculating spherical terms for the prescription. Because astigmatism and other terms of the prescription do not vary with accommodation, the frames discarded by the accommodation filter 3810 may be used to calculate these other terms. If variations in astigmatism as a function of time are found, they can be used as indicators of patient movement during the test and, thus, used to tag frames as invalid.
  • Fig. 51 is a graph of a set of M, J0 and J45 prescriptions calculated by a prototype instrument for a different patient.
  • the M values 5100 do not vary significantly throughout the graph. It can, therefore, be assumed that the patient did not accommodate throughout the time period represented by the graph. In this case, the accommodation filter 3810 selects all frames represented by the graph; no frames are discarded.
  • Image sensor settings: exposure time and frame rate
  • Groups of frames may yield similar prescriptions. For example, as shown in the graph of Fig. 50, two groups of frames 5000 and 5002 yield similar M (spherical) prescriptions.
  • a frame grouper module identifies groups of frames that yield similar prescriptions, such as prescriptions within a predetermined range of values. Two such frame grouper modules 3812 and 3814 are shown in Fig. 38.
  • One frame grouper 3812 groups frames that yield similar Zernike coefficients, such as coefficients within about a 5% difference of each other. In some embodiments, the frame grouper 3812 considers only the first six Zernike coefficients, although other numbers of coefficients may be used.
  • the other frame grouper 3814 groups frames that yield similar prescriptions, for example, values of M, J0 and/or J45 that fall within about ±0.125 diopters or within about ±0.25 diopters. Frame groupers that group frames based on other similarities may also be used.
  • Separate groups of frames may be defined for each term of the prescription. Thus, one group of frames may be selected for having similar M values, and a different, possibly overlapping, group of frames may be selected for having similar JO values. If some frames were discarded by the accommodation filter 3810, a different pool of frames may be available to the frame grouper 3814 for selecting frames based on similarity of M values than for selecting frames based on similarity of JO values. Similarly, different pools of frames may be available to the other frame grouper 3812.
  • the frame grouper 3814 may operate by generating a histogram for each term of the prescription.
  • a hypothetical histogram for spherical prescriptions is shown in Fig. 52.
  • the horizontal axis represents spherical prescription values or M values in the power vector domain, and the vertical axis represents the number of frames that yielded a given spherical prescription. Note that frames containing low-quality raw data, such as due to low signal-to-noise, were discarded by other modules. Thus, some prescription values may not have been calculated from any accepted frames.
  • the prescription value 5200 yielded from the greatest number of frames, and a range 5202 of prescription values around this value, are selected by the frame grouper 3814.
  • the frame grouper 3814 operates similarly for the other prescription terms.
  • the other frame grouper 3812 operates similarly, generating a histogram for each Zernike coefficient it considers.
  • the frame grouper 3814 may use the sum of the quality metrics for the frames.
  • the quality metric values are between 0.0 and 1.0
  • the histogram represents the sum of the quality metrics for the frames that yielded that prescription.
  • the frame groupers 3812 and 3814 may use other selection operations, other than or in addition to, histograms.
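For illustration, a quality-weighted histogram selection over one prescription term might look like the following; the 0.125-diopter bin width and the one-bin spread are example values, not parameters taken from the instrument.

    import numpy as np

    def group_frames(values_d, quality, bin_width_d=0.125, spread_bins=1):
        """Select indices of frames whose prescription term (e.g. M, in diopters)
        falls in and around the most heavily weighted histogram bin.  Each frame
        contributes its quality metric (0.0-1.0) instead of a simple count."""
        values = np.asarray(values_d, dtype=float)
        weights = np.asarray(quality, dtype=float)
        edges = np.arange(values.min(), values.max() + bin_width_d, bin_width_d)
        if len(edges) < 2:
            return list(range(len(values)))      # all values fall within a single bin
        hist, edges = np.histogram(values, bins=edges, weights=weights)
        best = int(np.argmax(hist))
        lo = edges[max(best - spread_bins, 0)]
        hi = edges[min(best + spread_bins + 1, len(edges) - 1)]
        return [i for i, v in enumerate(values) if lo <= v <= hi]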
  • Image sensor settings: exposure time and frame rate
  • frames that yield similar prescriptions or Zernike coefficients may be combined to yield frames with better signal-to-noise, and prescriptions can be calculated from the combined frames.
  • a frame restorer 3816 combines the frames output by one or both of the frame groupers 3812 and/or 3814. The frame restorer 3816 combines these frames in a manner similar to that described above, with respect to the frame combiner 3804. All frames available from the frame grouper(s) 3812 and/or 3814 may be combined into a single frame. Alternatively, all the frames may be combined on a per prescription term basis. That is, all frames with similar M and J values may be combined to generate a single combined frame.
  • the frames may be combined so as to yield a new set of frames in which each frame is a combination of all preceding frames in the input set of frames, as graphically illustrated in Fig. 53.
  • Output frame 1 is generated by registering and summing or averaging input frames 1 and 2.
  • Output frame 2 is generated by registering and summing or averaging input frames 1, 2 and 3.
  • Output frame N is generated by registering and summing or averaging input frames 1, 2, 3, ... N.
  • the quality metrics of each generated frame may be adjusted. In general, combining frames improves signal-to-noise.
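The cumulative combination of Fig. 53 could be sketched as follows, with a generic combine callback standing in for the register-and-sum (or register-and-average) step described above for the frame combiner 3804; the function names are illustrative.

    def cumulative_combine(frames, combine):
        """Output frame k is the combination of input frames 1..k+1 (Fig. 53).
        `combine(a, b)` registers b to a and returns their sum or average."""
        outputs = []
        running = frames[0]
        for nxt in frames[1:]:
            running = combine(running, nxt)
            outputs.append(running)
        return outputs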
  • Outputs of the frame restorer 3816 are summarized in Table 9.
  • Image sensor settings: exposure time and frame rate
  • a second prescription calculator 3818 calculates prescriptions from the frames generated by the frame restorer 3816.
  • the second prescription calculator 3818 operates largely as described above, with respect to the first prescription calculator 3806, except the input dataset is different.
  • Outputs from the second prescription calculator 3818 are essentially the same as described in Table 5.
  • a final prescription calculator 3820 accepts inputs from the frame grouper 3812, the frame grouper 3814 and/or the second prescription calculator 3818.
  • the final prescription calculator 3820 calculates a single final prescription from its inputs using one or more statistical calculations.
  • the final prescription calculator 3820 calculates the final M, J0 and J45 prescriptions as a mean, mode or median of its input M, J0 and J45 prescriptions, after weighting each frame's prescriptions by the frame's quality metrics.
  • higher-order prescription terms are calculated in the same manner as the M, JO and J45 prescriptions are calculated.
  • the final prescription calculator 3820 also calculates an estimated error value for each final calculated prescription.
  • the M error is estimated to be the standard deviation of the final calculated M prescription, within the M input data to the final prescription calculator 3820. In some embodiments, the error is estimated to be twice the standard deviation, according to preferences of some clinicians (95% confidence interval). Other embodiments may estimate the error using other statistical formulas. This error may be communicated to the user of the device by a confidence value in the prescription, for instance, indicating a strong confidence in the measured prescription, or a weak confidence in the measured prescription and suggesting to run the test again.
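A quality-weighted mean with a k-sigma error estimate could look like the sketch below (k = 2 approximating the 95% confidence interval mentioned above); the dictionary return format is an arbitrary choice for illustration.

    import numpy as np

    def final_prescription(m, j0, j45, quality, k_sigma=2.0):
        """Weighted final M, J0 and J45 values plus an error estimate for each term."""
        w = np.asarray(quality, dtype=float)
        w = w / max(w.sum(), 1e-9)
        result = {}
        for name, vals in (("M", m), ("J0", j0), ("J45", j45)):
            vals = np.asarray(vals, dtype=float)
            mean = float(np.sum(w * vals))
            std = float(np.sqrt(np.sum(w * (vals - mean) ** 2)))
            result[name] = (mean, k_sigma * std)
        return result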
  • Some embodiments estimate a confidence region for the final astigmatism prescription.
  • This confidence region may be an ellipse computed for the bivariate distribution of J0 and J45.
  • the precision of the astigmatism prescription is deemed to be the geometric mean of the major and minor axes of the 95% confidence ellipse, as exemplified in Fig. 54.
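One way to compute that figure, assuming the per-frame J0 and J45 values are available as arrays, is sketched below; 5.991 is the 95th-percentile chi-square value for two degrees of freedom.

    import numpy as np

    def astigmatism_precision(j0, j45, chi2_95=5.991):
        """Geometric mean of the major and minor axes of the 95% confidence ellipse
        of the bivariate (J0, J45) distribution."""
        cov = np.cov(np.vstack([np.asarray(j0, float), np.asarray(j45, float)]))
        eigvals = np.linalg.eigvalsh(cov)           # ascending eigenvalues
        axes = 2.0 * np.sqrt(chi2_95 * eigvals)     # full axis lengths of the ellipse
        return float(np.sqrt(axes[0] * axes[1]))    # geometric mean of the two axes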
  • Embodiments of the present invention are not necessarily limited to calculating prescriptions for living beings. Some embodiments may be used on a model eye ball to evaluate a person's spectacle prescription. For example, these embodiments may be used to evaluate a person's spectacles and automatically determine if they are appropriate for the person by checking the person without his spectacles and either checking the person with his spectacles on (as indicated in phantom at 1552 in Fig. 15) or checking the spectacles on a model eye. Optionally or alternatively, embodiments may be used to evaluate a person's spectacles and automatically determine if they are appropriate for the person by checking the person with his spectacles on and determining if the returned wavefronts indicate correct vision, at least within a predetermined range.
  • the patient's aberrations may be measured when the patient is looking through an embodiment at a target located closer than 20 feet (6 meters) away, and an accommodative offset is then calculated, so as to estimate a prescription for the patient at infinity.
  • in another embodiment, a monocular aberrometer includes an accelerometer to enable the device to ascertain which direction is up and, therefore, automatically ascertain which eye (left or right) is being measured. The device is turned upside down to measure the opposite eye.
  • Some embodiments also track undesired movements of a patient by tracking how astigmatism components of the prescription change, as a function of time.
  • each block, or a combination of blocks may be combined, separated into separate operations or performed in other orders. All or a portion of each block, or a combination of blocks, may be implemented as computer program instructions (such as software), hardware (such as combinatorial logic, Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs) or other hardware), firmware or combinations thereof.
  • Embodiments may be implemented by a processor executing, or controlled by, instructions stored in a memory.
  • the memory may be random access memory (RAM), read-only memory (ROM), flash memory or any other memory, or combination thereof, suitable for storing control software or other instructions and data.
  • Instructions defining the functions of the present invention may be delivered to a processor in many forms, including, but not limited to, information permanently stored on tangible non-writable storage media (e.g., read-only memory devices within a computer, such as ROM, or devices readable by a computer I/O attachment, such as CD-ROM or DVD disks), information alterably stored on tangible writable storage media (e.g., floppy disks, removable flash memory and hard drives) or information conveyed to a computer through a communication medium, including wired or wireless computer networks.

Abstract

Eye prescriptions may be determined by providing a simple, easy to use, portable device with a specially configured targeting light source that aligns the eye, mitigates accommodation, and provides accurate results. Unlike stationary, closed view autorefractors, this device typically is portable, self-usable and relatively inexpensive, enabling more widespread use across the world.

Description

Apparatus and Method of Determining an Eye Prescription
CROSS REFERENCE TO RELATED APPLICATIONS
[0001] This application claims the benefit of U.S. Provisional Patent Application No.
61/842,190, filed 7/2/2013, titled "System and Method for Optical Alignment of an Eye with a Device for Measurement of Optical Properties of the Eye," U.S. Provisional Patent Application No. 61/972,058, filed 3/28/2014, titled 'Apparatus and Method for Determining an Eye Prescription" and U.S. Provisional Patent Application No. 61/972,191, filed 3/28/2014, titled 'Apparatus and Method for Determining an Eye Prescription," the entire contents of all of which are hereby incorporated by reference herein, for all purposes.
TECHNICAL FIELD
[0002] The invention generally relates to optical or ophthalmologic methods and apparatus and, more particularly, the invention relates to methods and devices for facilitating the process of determining optical properties of an eye.
BACKGROUND ART
[0003] "Refractive errors" are low-order aberrations, such as in an eye of a human. A
"refractive prescription" is a prescription for corrective lenses (eyeglasses) that correct refractive errors. As described in more detail herein, eyes may also or instead suffer from higher-order aberrations.
[0004] Autorefractors automatically estimate a refractive prescription for a patient's eyes.
While widely used in the United States and Europe for many years, autorefractors have a number of drawbacks. For example, autorefractors typically are quite expensive, often costing more than ten thousand dollars. In addition, autorefractors generally are large and immobile, and they require extensive assistance by an ophthalmologist, optometrist or her trained staff. Accordingly, for these and other related reasons, autorefractors are used much less frequently in low-resource settings, such as parts of Africa, Asia and even rural portions of the United States. Wavefront aberrometers are a complex and expensive type of autorefractor. Wavefront aberrometers are also used to guide laser surgery, such as for cataracts and vision correction.
[0005] Prescriptions may be expressed in optometric notation, power vector notation and their equivalents.
SUMMARY OF EMBODIMENTS
[0006] An embodiment of the present invention provides a method of determining an optical property of an eye of a living being. The method includes providing an optical apparatus that has a proximal port and a distal port. The proximal port and the distal port together form a visual channel. The eye is aligned with the proximal port. Target indicia are produced at effective infinity. The target indicia are viewable through the visual channel. The eye is focused on the target indicia. Accommodation of the eye is determined, as the eye views the target indicia. An optical property for the eye is calculated, as a function of the determined accommodation.
[0007] Data may be gathered relating to accommodation of the eye. Calculating the optical property may include using the data relating to accommodation to identify when the eye is accommodating and when the eye is not accommodating. Calculating the optical property may include selecting data relating to when the eye is not accommodating to calculate the optical property.
[0008] Calculating the optical property may include discarding the data relating to when the eye is accommodating.
[0009] The method may also include generating a target light beam by a target light source coupled to the apparatus and producing the target indicia with the target light beam.
[0010] Determining accommodation may include obtaining a plurality of sequential images of a light wavefront from the eye, as the eye focuses on the target indicia.
[0011] Calculating the optical property may include calculating the optical property as a function of timing of the sequential images.
[0012] Determining accommodation may include tracking changes in the optical aberrations of the eye using measurements from a plurality of sequential images of a light wavefront from the eye, as the eye focuses on the target indicia. [0013] Determining accommodation of the eye may include filtering one or more images from the plurality of sequential images.
[0014] The filtering may be based on physiological parameters of the eye, including a rate of change in accommodation of the eye.
[0015] Calculating the optical property for the eye may include using a wavefront aberrometer to calculate the optical property.
[0016] Focusing the eye on the target indicia may include focusing the eye on the target indicia while the target indicia are at least about 10 feet from the apparatus.
[0017] Calculating the optical property may include calculating a prescription for the eye.
[0018] Calculating the optical property may include calculating eyeglass prescriptions for distant and near vision.
[0019] Another embodiment of the present invention provides an optical apparatus that includes a proximal port and a distal port that together form a visual channel. A target light source is configured to produce target indicia at effective infinity. The target indicia are viewable through the visual channel. Determining logic is configured to determine accommodation of an eye, as the eye views the target indicia.
[0020] The apparatus may also include a body forming the proximal and distal ports. The body may further contain the determining logic.
[0021] The determining logic may be configured to calculate a prescription for the eye, as a function of the determined accommodation of the eye.
[0022] The optical apparatus may further include a wavefront image sensor operatively coupled with the determining logic. The image sensor may be configured to capture a plurality of sequential images of wavefronts, as the eye focuses on the target indicia.
[0023] The logic for determining accommodation may be configured to calculate the prescription, as a function of timing of the sequential images.
[0024] The determining logic may use as input a spherical prescription for the eye, as a function of the timing of the sequential images.
[0025] The determining logic may use as input a spherical equivalent (M) prescription for the eye, as a function of the timing of the sequential images. [0026] A filter may be operably coupled with the determining logic. The filter may be configured to filter one or more images from the plurality of sequential images.
[0027] Yet another embodiment of the present invention provides an optical apparatus that includes a proximal port configured to receive an eye. An array of primary light sensors is configured to receive a wavefront passing through the proximal port. The array of primary light sensors has a perimeter. At least one secondary light sensor is positioned outside the perimeter of the array of primary light sensors. A circuit is configured to determine a parameter of the eye using wavefront data from the array of primary light sensors.
[0028] The optical apparatus may further include a non-stationary body. The non-stationary body has the proximal port and a distal port. The proximal port and the distal port form a visual channel from the proximal port through the distal port. The visual channel may be open view to enable the eye to see target indicia external to and spaced away from the body.
[0029] A retinal light source may be configured to direct an illumination beam toward the proximal port to produce the wavefront.
[0030] A cue generator may be operatively coupled with the at least one secondary light sensor. The cue generator may be configured to generate a cue as a function of receipt of the wavefront by the at least one secondary light sensor.
[0031] The cue generator may be configured to generate a visual cue, an acoustic cue and/or a mechanical cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
[0032] The array of primary light sensors may have a first sensitivity to the wavefront, and the at least one secondary light sensor may have a second sensitivity to the wavefront. The first sensitivity may be greater than the second sensitivity.
[0033] The array of primary light sensors may include a CCD, and the at least one secondary light sensor may include a quadrant sensor.
[0034] The distal port may at least in part define an optical axis. The at least one secondary light sensor may be configured to receive the wavefront, as a function of the orientation of the eye relative to the optical axis.
[0035] The at least one secondary light sensor may substantially circumscribe the perimeter of the array of primary light sensors. [0036] An embodiment of the present invention provides an optical method that includes providing an optical apparatus. The optical apparatus has a proximal port and a distal port that together form a visual channel from the proximal port through the distal port. The apparatus further includes an array of primary light sensors having a perimeter. The apparatus further includes at least one secondary light sensor positioned outside the perimeter of the array of primary light sensors. A living being's eye is aligned with the proximal port. The eye views through the distal port to target indicia exterior of the apparatus. The eye is illuminated to produce a wavefront through the proximal port. The amount of the wavefront sensed by the at least one secondary light sensor is determined. A cue is generated, as a function of the amount of the wavefront sensed by the at least one secondary light sensor.
[0037] An eye parameter, such as a prescription for the eye, may be determined.
[0038] The distal port may at least in part define an optical axis. The method may further include moving the eye toward the optical axis in response to the cue.
[0039] The distal port may at least in part define an optical axis. The at least one secondary light sensor may be configured to receive the wavefront, as a function of the orientation of the eye, relative to the optical axis.
[0040] The wavefront may be split into a primary path toward the array of primary light sensors, and the wavefront may be further split into a secondary path toward the at least one secondary light sensor.
[0041] Another embodiment of the present invention provides an optical apparatus that includes a proximal port configured to receive an eye and a distal port. The apparatus includes a visual channel from the proximal port through the distal port. An array of primary light sensors is configured to receive a wavefront passing through the proximal port. The apparatus also includes at least one secondary light sensor. Optics within the visual channel are configured to split the wavefront into a primary path toward the array of primary light sensors and a secondary path toward the at least one secondary light sensor.
[0042] The primary light sensors may be adjacent the at least one secondary light sensor.
[0043] A lens may be adjacent the at least one secondary light sensor. The lens may be positioned so the secondary path passes through the lens. [0044] A retinal light source may be configured to direct an illumination beam toward the proximal port to produce the wavefront.
[0045] A cue generator may be operatively coupled with the at least one secondary light sensor. The cue generator may be configured to generate a cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
[0046] The cue generator may be configured to generate a visual cue, an acoustic cue and/or a mechanical cue. The cue generator generates the at least one cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
[0047] The array of primary light sensors may have a first sensitivity to the wavefront, and the at least one secondary light sensor may have a second sensitivity to the wavefront. The first sensitivity may be greater than the second sensitivity.
[0048] The array of primary light sensors may include a CCD, and the at least one secondary light sensor may include a quadrant sensor.
[0049] Yet another embodiment of the present invention provides an optical method. The method includes providing an optical apparatus that has a proximal port and a distal port. Together, the proximal port and the distal port form a visual channel from the proximal port through the distal port. The apparatus also has an array of primary light sensors and at least one secondary light sensor. A living being's eye is aligned with the proximal port. The eye views through the distal port to target indicia exterior of the apparatus. The eye is illuminated to produce a wavefront through the proximal port. The wavefront is split into a primary path toward the array of primary light sensors and a secondary path toward the at least one secondary light sensor.
[0050] The method may include passing the secondary path of the wavefront through a lens to focus the split portion of the wavefront along the secondary path.
[0051] A light beam may be directed toward the proximal port to reflect off the eye to produce the wavefront.
[0052] A cue may be generated, as a function of receipt of the wavefront by the at least one secondary light sensor.
[0053] Generating the cue may include generating a visual cue, an acoustic cue and/or a mechanical cue. The cue may be generated as a function of receipt of the wavefront by the at least one secondary light sensor. [0054] The at least one secondary light sensor may include a quadrant sensor.
[0055] An embodiment of the present invention provides a method of determining an optical property of an eye of a living being. The method includes providing an optical apparatus having a proximal port and a distal port that together form a visual channel. The eye is aligned with the proximal port. Light is directed into the eye to produce a wavefront. The wavefront is received via the proximal port. A plurality of sequential, time spaced-apart data sets of the wavefront is captured. The data sets include temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured. An optical property of the eye is determined, as a function of the temporal information.
[0056] The plurality of sequential data sets may include images of the wavefront.
[0057] The method may include filtering at least one data set of the plurality of data sets.
[0058] Determining the optical property may include determining the optical property as a function of the order of the plurality of data sets and the contents of the plurality of data sets.
[0059] Each data set of the plurality of data sets may include wavefront aberration information.
[0060] The optical property may include a spherical component and a cylindrical component. Determining the optical property may include determining the cylindrical component after determining the spherical component.
[0061] Determining the optical property may include analyzing the plurality of data sets for trends in the data.
[0062] The plurality of data sets may include information relating to accommodation of the eye.
[0063] The method may include weighting certain data sets of the plurality of data sets, as a function of a signal-to-noise ratio.
[0064] The plurality of data sets may include a video of the wavefront.
[0065] The optical property may include a prescription for the eye.
[0066] Another embodiment of the present invention provides an apparatus for determining an optical property of an eye of a living being. The apparatus includes a proximal port and a distal port that together form a visual channel. An illumination light source is configured to direct light into the eye to produce a wavefront that is received through the proximal port. An image capture sensor is operatively coupled with the visual channel. The sensor is configured to capture a plurality of sequential, time spaced-apart data sets of the wavefront. The data sets include temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured. Optical property logic is operatively coupled to the image capture sensor. The optical property logic is configured to determine an optical property of the eye, as a function of the temporal information.
[0067] The plurality of data sets may include images of the wavefront.
[0068] A filter may be configured to filter at least one data set of the plurality of data sets.
[0069] The optical property logic may include logic configured for determining an optical property, as a function of the order of the plurality of data sets and contents of the plurality of data sets.
[0070] The optical property may be a prescription for the eye.
[0071] Each data set of the plurality of data sets may include wavefront aberration information.
[0072] The optical property may include a spherical component and a cylindrical component. The optical property logic may be configured to determine the cylindrical component after determining the spherical component.
[0073] The optical property logic may be configured to analyze the plurality of data sets for trends in the data.
[0074] The plurality of data sets may include information relating to accommodation of the eye.
[0075] The optical property logic may be configured to weight at least one data set of the plurality of data sets, as a function of a signal-to-noise ratio.
[0076] The sequential data sets may include a video of the wavefront.
[0077] Yet another embodiment of the present invention provides a method of determining an optical property of an eye of a living being. The method includes providing an optical apparatus having a proximal port and a distal port that together form a visual channel. The eye is aligned with the proximal port. Light is directed into the eye to produce a wavefront. The wavefront is received via the proximal port. A plurality of sequential, time spaced-apart data sets of the wavefront is captured. The data sets include temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured. The data sets include high-frequency noise. Data sets of the plurality of data sets are registered. An optical property of the eye is determined, as a function of the registered data sets.
[0078] Registering the data sets may include mitigating the high frequency noise.
[0079] Registering the data sets may include increasing a signal-to-noise ratio.
[0080] Registering the data sets may include registering consecutive data sets.
[0081] The plurality of data sets may include images of the wavefront.
[0082] At least one data set of the plurality of data sets may be filtered before registering the data sets.
[0083] Each data set may include wavefront aberration information.
[0084] The plurality of data sets may include a video of the wavefront.
[0085] Registering the plurality of data sets may include registering consecutive data sets and combining the registered consecutive data sets to mitigate noise.
[0086] Registering the plurality of data sets may include selecting data sets that were acquired close enough together in time to avoid data sets that span a change in the optical property of the eye due to accommodation. Registering the plurality of data sets may also include registering the selected data sets. The method may further include combining the registered data sets to mitigate noise.
[0087] Registering the plurality of data sets may include registering data sets with similar, within a predetermined range, wavefront aberration information and combining the registered data sets to mitigate noise.
[0088] An embodiment of the present invention provides an optical apparatus for determining an optical property of an eye of a living being. The apparatus includes a proximal port and a distal port that together form a visual channel. An illumination light source is configured to direct light into the eye to produce a wavefront that is received through the proximal port. An image capture sensor is operatively coupled with the visual channel. The sensor is configured to capture a plurality of sequential, time spaced-apart data sets of the wavefront. The data sets include temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured. The data sets include high-frequency noise. Optical property logic is operatively coupled with the sensor. The optical property logic is configured to register consecutive data sets of the plurality of data sets to mitigate the high frequency noise. The optical property logic is configured to also determine an optical property of the eye, as a function of the registered data sets.
[0089] The plurality of data sets may include images of the wavefront.
[0090] A filter may be configured to filter at least one data set of the plurality of data sets before registering.
[0091] Each data set of the plurality of data sets may include wavefront aberration information.
[0092] The plurality of data sets may include a video of the wavefront.
[0093] The optical property logic may be configured to combine or average the registered data sets to mitigate noise.
[0094] The optical property logic may be configured to register consecutive data sets.
[0095] The optical property logic may be configured to select data sets that were acquired close enough together in time to avoid data sets that span a change in the optical property of the eye due to accommodation and register the selected data sets.
[0096] The optical property may be an eye prescription.
[0097] Illustrative embodiments of the invention may be implemented as a computer program product having a computer usable medium with computer readable program code thereon. The computer readable code may be read and utilized by a computer system in accordance with conventional processes.
BRIEF DESCRIPTION OF THE DRAWINGS
[0098] The invention will be more fully understood by referring to the following Detailed
Description of Specific Embodiments in conjunction with the Drawings, of which:
[0099] Fig. 1 is a schematic cross-sectional diagram of an emmetropic human eye imaging a distant object.
[00100] Fig. 2 is a schematic cross-sectional diagram of an emmetropic human eye imaging a close object.
[00101] Fig. 3 is a schematic cross-sectional diagram of a hyperopic human eye imaging a close object. [00102] Fig. 4 is a schematic cross-sectional diagram of a myopic human eye imaging a distant object.
[00103] Fig. 5 schematically illustrates a corrective lens disposed in front of the myopic eye of Fig. 4 to correct the myopia.
[00104] Fig. 6 schematically illustrates a Hartmann-Shack wavefront aberrometer adjacent an emmetropic human eye, according to the prior art.
[00105] Fig. 7 schematically illustrates wavefronts from a virtual light source exiting the eye of Fig. 6 and received by the Hartmann-Shack wavefront aberrometer, as well as a hypothetical spot diagram generated by the Hartmann-Shack wavefront aberrometer, according to the prior art.
[00106] Fig. 8 schematically illustrates wavefronts from a virtual light source exiting a non- emmetropic eye and received by a Hartmann-Shack wavefront aberrometer, as well as a hypothetical spot diagram generated by the Hartmann-Shack wavefront aberrometer, according to the prior art.
[00107] Fig. 9 schematically illustrates a hypothetical wavefront from a non-emmetropic eye impinging on an array of lenslets of a Hartmann-Shack wavefront aberrometer and resulting illumination of an optical sensor of the aberrometer and a three-dimensional graph representing geographic distribution of intensity of the illumination, as well as an enlarged view of one lens of the array of lenslets, according to the prior art.
[00108] Fig. 10 provides perspective views of surface shapes defined by 1st to 4th order
Zernike polynomials, according to the prior art.
[00109] Figs. 11, 12 and 13 contain right, front and left side views of a lightweight portable hand-held automatic device that includes a Hartmann-Shack wavefront aberrometer, according to an embodiment of the present invention.
[00110] Fig. 14 illustrates the device of Figs. 11-13 in use by a patient.
[00111] Fig. 15 is a schematic block diagram of the device of Figs. 11-14, showing its internal components, according to an embodiment of the present invention.
[00112] Fig. 15-1 is a schematic diagram illustrating an eye properly aligned with the device of Figs. 11-15, as well as a view as seen by the eye through the device, according to an embodiment of the present invention. [00113] Fig. 15-2 is a schematic diagram illustrating an eye slightly misaligned with the device of Figs. 11-15, as well as a hypothetical view as seen by the eye through the device, according to an embodiment of the present invention.
[00114] Fig. 15-3 is a schematic diagram illustrating an eye grossly misaligned with the device of Figs. 11-15, as well as a view as seen by the eye through the device, according to an embodiment of the present invention.
[00115] Fig. 16 is a schematic block diagram of the device of Figs. 11-14, showing its internal components, according to another embodiment of the present invention.
[00116] Fig. 17 illustrates a view through the device of Fig. 16, as seen by a patient using the device when the patient's eye is properly aligned with the device.
[00117] Fig. 18 illustrates a hypothetical view through the device of Fig. 16, as seen by a patient using the device when the patient's eye is not properly aligned with the device.
[00118] Fig. 19 is a schematic block diagram of the device of Figs. 11-14, showing its internal components, according to yet another embodiment of the present invention.
[00119] Fig. 20 is a schematic block diagram of the device of Fig. 19 in use.
[00120] Fig. 21 is a schematic block diagram of an alternative embodiment of the device of
Figs. 19 and 20 in use.
[00121] Fig. 22 is a front view of an embodiment of the present invention in use by a patient, in which the device is tilted, with respect to the patient's interpupillary axis.
[00122] Fig. 23 illustrates a binocular lightweight portable hand-held automatic device that includes a Hartmann-Shack wavefront aberrometer, according to an embodiment of the present invention.
[00123] Fig. 23-1 illustrates a binocular lightweight portable hand-held automatic device that includes a Hartmann-Shack wavefront aberrometer, according to another embodiment of the present invention.
[00124] Fig. 24 schematically illustrates a dovetail slide of the device of Fig. 23 for adjusting spacing between two eyepieces of the device, according to an embodiment of the present invention.
[00125] Fig. 25 schematically illustrates a pivot joint of the device of Fig. 23 for adjusting the spacing between two eyepieces of the device, according to another embodiment of the present invention. [00126] Fig. 26 is a schematic block diagram of hardware components of an analysis unit that may be included in, for example, the device of Fig. 15, according to an embodiment of the present invention.
[00127] Fig. 27 is a schematic diagram of a hypothetical spot diagram not centered on an optical sensor of the device of Figs. 11-15.
[00128] Fig. 28 is a schematic block diagram of an alignment feedback module, according to several embodiments of the present invention.
[00129] Fig. 29 is a schematic diagram of a hypothetical spot diagram that falls only partially on the optical sensor of the device of Figs. 11-15.
[00130] Fig. 30 is a schematic diagram of a display indicating a location of a hypothetical centroid of a spot diagram, relative to vertical and horizontal axes, according to several embodiments of the present invention.
[00131] Fig. 31 is a plan view of an array of light sensors around the optical sensor of the device of Figs. 11-15, according to several embodiments of the present invention.
[00132] Fig. 32 is a schematic block diagram of the device of Figs. 11-14, showing its internal components, according to another embodiment of the present invention.
[00133] Fig. 33 is a plan view of a quadrant photodiode detector of the device of Fig. 32, including a hypothetical spot diagram projected thereon, according to an embodiment of the present invention.
[00134] Fig. 34 is a schematic plan view of an exemplary array of visible light sources for projecting a visible spot on a distant object, according to an embodiment of the present invention.
[00135] Fig. 35 is a schematic plan view of an exemplary array of light sources for projecting a virtual light source onto a retina of a patient's eye, according to an embodiment of the present invention.
[00136] Fig. 36 is a schematic block diagram of an unaccommodation detector, according to an embodiment of the present invention.
[00137] Fig. 37 contains a graph of spherical and cylindrical power candidate prescriptions calculated from a hypothetical patient, according to an embodiment of the present invention.
[00138] Fig. 38 is a schematic block diagram of processing modules and interconnections among these modules, according to an embodiment of the present invention. [00139] Fig. 39 is schematic diagram of a complete spot diagram captured by a prototype instrument as described herein and according to an embodiment of the present invention.
[00140] Fig. 40 is a schematic diagram of a partial spot diagram, i.e., a spot diagram in which a portion of the spot diagram falls off the optical sensor, captured by a prototype instrument as described herein and according to an embodiment of the present invention.
[00141] Fig. 41 is a schematic diagram of a frame from the optical sensor of Fig. 15, with no spot diagram.
[00142] Fig. 42 is a schematic diagram of a frame from the optical sensor 1532 containing a corneal reflection, captured by a prototype instrument as described herein and according to an embodiment of the present invention.
[00143] Figs. 43-46 are schematic diagrams of a set of frames from the optical sensor of Fig.
15 containing a sequence of images acquired as an eye moves, creating a set of spot diagrams that move from left to right, captured by a prototype instrument as described herein and according to an embodiment of the present invention.
[00144] Fig. 47 is a schematic diagram of a hypothetical frame from the optical sensor of
Fig. 15 containing a complete spot diagram, according to an embodiment of the present invention.
[00145] Fig. 48 is a schematic illustration of a portion of a lenslet array and a portion of a hypothetical aberrated wavefront, showing displacement calculations.
[00146] Fig. 49 is a graph showing mean, maximum and minimum amounts by which a normal human eye can accommodate, plotted against age.
[00147] Figs. 50 and 51 are graphs of sets of M, J0 and J45 prescriptions for two different patients, calculated by a prototype instrument as described herein and according to an embodiment of the present invention.
[00148] Fig. 52 is a hypothetical histogram of spherical prescriptions calculated according to an embodiment of the present invention.
[00149] Fig. 53 is a schematic diagram illustrating combining frames to yield a second set of frames, according to an embodiment of the present invention.
[00150] Fig. 54 is a schematic diagram illustrating calculation of an estimated confidence region for a final astigmatism prescription, according to an embodiment of the present invention. [00151] Fig. 55 is a schematic diagram illustrating an eye properly aligned with a device, as well as a view as seen by the eye through the device, according to another embodiment of the present invention.
[00152] Fig. 56 is a schematic diagram illustrating an eye slightly misaligned with the device of Fig 55, as well as a hypothetical view as seen by the eye through the device, according to an embodiment of the present invention.
[00153] Fig. 57 is a schematic diagram illustrating an eye grossly misaligned with the device of Fig. 55, as well as a view as seen by the eye through the device, according to an embodiment of the present invention.
DETAILED DESCRIPTION OF SPECIFIC EMBODIMENTS
[00154] In accordance with preferred embodiments of the present invention, methods and apparatus are disclosed for calculating a prescription to correct refractive errors with a relatively inexpensive, light-weight, portable instrument that does not require a professional clinician, cycloplegic agent, fogging or virtual images. Some embodiments also calculate prescriptions to correct higher-order aberrations in an eye and/or additional optical properties of the eye. Some embodiments may be used to calculate prescriptions for corrective lenses (eyeglasses) and/or to check whether an existing eyeglass has a correct prescription of a patient.
Introduction
[00155] Fig. 1 is a schematic cross-sectional diagram of a normal emmetropic human eye
100. Emmetropia describes a state of vision where an object at infinity is in sharp focus, with the eye's crystalline lens 102 in a neutral (relaxed or "unaccommodated") state. This condition of the normal eye 100 is achieved when the refractive optical power of the cornea 104 and lens 102 balance the axial length 106 of the eye 100, thereby focusing rays 108 from a distant object (not shown) exactly on the retina 110, resulting in perfect vision. Here "distant" means more than 20 feet (6 meters) away. An eye in a state of emmetropia requires no correction.
[00156] If gaze shifts to a close object 200, as shown schematically in Fig. 2, ciliary muscles
(not shown) change the shape of the lens 102, thickening it, thereby increasing its optical power, so the eye 100 focuses the rays 208 on the retina 110. This process is referred to as "accommodation." Thus, absent effort by the ciliary muscles, the eye 100 automatically focuses on objects in the distance. However, focusing on close objects requires effort. Humans naturally, and typically unconsciously, automatically focus on objects of interest. However, with age, the lens 102 becomes increasingly stiff and the ciliary muscles lose some degree of contractility, thereby making it progressively more difficult to focus on close objects. Typically by age 45 to 50 it becomes impossible to focus on objects at book-reading distance, thereby requiring reading glasses.
[00157] Figs. 1 and 2 illustrate normal eyes. However, various imperfections in the shape or composition of the lens 102, cornea 104, retina 110 or the eye 100 in general can prevent the eye 100 from perfectly focusing the rays 108 or 208 on the retina 110, even in young people. These imperfections prevent the eye 100 from bending (refracting) light rays as a normal eye would, thereby causing "refractive errors." For example, Fig. 3 schematically illustrates a hyperopic (farsighted) eye 300, in which light rays 308 from a close object 310 are too divergent to focus on the retina 110, leading to blurry vision. Similarly, Fig. 4 schematically illustrates a myopic (nearsighted) eye 400, in which light rays 408 from a distant object (not shown) focus in front of the retina 110, causing the distant object to appear blurry. Essentially, the lens 402 of a myopic eye has too much optical power, relative to the axial length 406 of the eye 400. Myopic eyes can, however, focus well on near objects. In both myopia and hyperopia, an inability to create a sharp image of an object on the retina is referred to as a "defocus error." Imperfections in eyes can be congenital or result from other factors such as an injury or a disease.
[00158] These and other imperfections in eyes can be treated by prescribing eyeglasses
("spectacles") or contact lenses, which introduce corrective lenses in front of the eyes. Fig. 5 schematically illustrates a corrective lens 500 disposed in front of the myopic eye 400 of Fig. 4 to correct the myopia. The lens 500 is disposed in a "spectacle plane" 502 located a small distance away from the eye 400. The spectacle plane 502 defines where eyeglasses are worn, relative to the eye 400. In the case of a contact lens, the spectacle plan is close to the outer layer of the cornea. A lens to correct myopia has a negative optical power, i.e., it has a net concave effect, which counteracts the excessive positive optical power of the myopic eye. For simplicity, the following descriptions refer to eyeglasses or spectacles, although they also apply to contact lenses.
[00159] A prescription for corrective eyeglasses specifies all aspects of the lenses of the eyeglasses. Some eye imperfections are simpler to correct than others. For example, if an eye is only hyperopic or only myopic, a spherical lens can be used to correct the defocus errors of the eye. A spherical lens includes a surface that is a portion of a sphere. However, if the crystalline lens 102 (Fig. 1), the cornea 104, the retina 110 or the eye 100 in general is not properly shaped, for example if the focusing power of the eye is different along different axes, a simple spherical lens cannot fully correct the eye. In this case, the eye is referred to as having "astigmatism." Corrective eyeglasses that have a spherical and a cylindrical component are used to correct astigmatism. Spherical and cylindrical imperfections account for most, but not all, of the eye's imperfections. Spherical and cylindrical imperfections are referred to as low-order aberrations.
[00160] Thus, most prescriptions include a spherical component and a cylindrical component to correct low-order aberrations. The spherical component corrects the defocus error and is described in terms of the optical power, positive or negative, of the corrective lens, typically expressed as a number of diopters. A diopter is a unit of measurement of optical power of a lens, which is equal to a reciprocal of the focal length (f) of the lens measured in meters, i.e., 1/f. The cylindrical component is described in terms of power and axis of a cylindrical lens. Typically, one or two axes are specified, corresponding to one or two cylindrical lenses. Each axis is specified as an angle. The resulting corrective lens has a compound surface shape that includes spherical and cylindrical components, as described by the prescription, to compensate for the defocus and astigmatism imperfections in the eye.
[00161] An "aberration" is a departure of the optical performance of an eye from a perfect eye. Thus, defocus and astigmatism imperfections are examples of aberrations. However, eyes may suffer from more complex imperfections, which are commonly referred to as "higher-order aberrations." Examples of higher-order aberrations include coma and spherical aberration (not to be confused with the low-order spherical imperfections that cause defocus errors, as described above). Coma causes an off-axis point source to appear distorted, appearing to have a tail. Spherical aberrations cause collimated rays entering the eye far from the optical axis to focus at a different position than collimated rays entering the eye close to the optical axis. Some prescriptions at least partially correct for higher order aberrations, although determining these prescriptions requires large, heavy, expensive, fixed (such as to a desk) diagnostic equipment and highly skilled clinicians.
[00162] Optical professionals use various tools and methods to generate eyeglass prescriptions. Some methods are subjective; others are objective. For example, a phoropter allows a clinician to position various combinations of lenses, at various angles, in front of a patient and ask the patient whether one combination is better than a different combination for visualizing a target. Based on reports from the patient, a skilled clinician can achieve progressively better combinations, eventually arriving at a good, although not necessarily perfect, prescription. However, the accuracy of the prescription depends in large part on the patient's reporting accuracy. Phoropters are relatively inexpensive, but the above-described process is time consuming.
[00163] An aberrometer (wavefront sensor) objectively measures how light is changed by an eye, thereby identifying and quantifying refractive errors caused by the eye. Aberrometers are usually classified into three types: (1) outgoing wavefront aberrometers, such as a Hartmann-Shack sensor; (2) ingoing retinal imaging aberrometers, such as a cross-cylinder aberrometer or Tscherning aberrometer or as used in a sequential retinal ray tracing method; and (3) ingoing feedback aberrometers, such as a spatially-resolved refractometer or as used in an optical path difference method.
[00164] As schematically illustrated in Fig. 6, a Hartmann-Shack wavefront aberrometer includes an array 600 of lenses ("lenslets"), exemplified by lenslets 602, 604 and 606. All the lenslets 602-606 have identical sizes and focal lengths, within some manufacturing tolerances. The lenslet array 600 is disposed optically between an eye 608 and an optical sensor 610, such as a pixelated charge-coupled device (CCD), pixelated complementary metal oxide semiconductor (CMOS) device or an array of quadrant photodiode detectors. Each lenslet 602-606 is focused onto a portion of the optical sensor 610. Thus, light from a single point source is focused by the lenslet array 600 onto the optical sensor 610 to create an array of spots of light.
[00165] Each lenslet 602-606 may, but need not, be focused on the center of a respective pixel of a pixelated CCD array or on the center of a respective quadrant sensor. The optical sensor 610 is configured to have sufficient spatial resolution to enable a circuit or processor to measure displacement of each spot of the array of spots from a position directly in line with the center of the corresponding lenslet, as described in more detail below. A point 618 within the eye 608 is illuminated by shining a light, typically from a laser or a superluminescent diode (SLED or SLD), into the eye 608, thereby creating a "virtual point light source" within the eye 608. The term "virtual light source" is a term of art used in wavefront aberrometry, and as used herein, the term means a place where light appears to emanate, although no light is actually generated there. In the case of point 618, the laser or SLED creates the virtual light source. As used herein, unless context indicates otherwise, "virtual" should not be confused with that term as used in optics, where "virtual" means a physical source that is imaged to another location.
[00166] As schematically illustrated in Fig. 7, light reflects from the point 618 and exits the eye 608. Wavefronts 702, 704, 706, and 708 represent the exiting light. Each lenslet of the array of lenslets 600 focuses a respective portion of each wavefront 702-708 onto a corresponding portion of the optical sensor 610, creating a circular array of spots. A hypothetical array of spots 710 (also referred to herein as a "spot diagram") is shown, although the array of lenslets 600 may include more or fewer lenslets than are shown and, therefore, the spot diagram 710 may include more or fewer spots than are shown. If the eye 608 is perfectly shaped (emmetropic) and unaccommodated, the wavefronts 702-708 are planar, and the spots of the spot diagram 710 form a regular grid, each spot falling directly in line with the center of its corresponding lenslet. The outer perimeter of the spot diagram is a projection of the pupil of the eye 608, thus the diameter of the outer perimeter of the spot diagram indicates the pupil diameter.
[00167] However, as schematically illustrated in Fig. 8, if the eye 800 is aberrated, the wavefronts 806-808 exiting the eye 800 are non-planar. The shape of the wavefronts 806-808 is determined by the lower-order and higher-order aberration(s) of the eye 800. Fig. 9 schematically illustrates wavefront 908 conceptually divided into square regions, exemplified by regions 900, 902 and 904. Each region 900-904 impinges on the lenslet array 600 along a direction substantially perpendicular to the region, as indicated by respective arrows 906, 908 and 910. Thus, the spots of the spot diagram 810 (Fig. 8) are displaced from where they would be if the wavefront 808 were planar.
[00168] One such displaced spot 912 is shown in an enlarged portion of Fig. 9. Here, if the region of the wavefront 808 contributing to the spot 912 had been parallel to the lenslet array 600, the region would have traveled through the lenslet 914 and impinged on the optical sensor 610 along a line 916 normal to the optical sensor 610 and created a spot at location 918. However, due to the tilt of the wavefront region caused by the aberrated eye, the spot 912 is displaced by an x and a y distance from the location 918.
[00169] Conventional centroid finding methods may be used to analyze data from the optical sensor 610 to calculate the x and y displacements and angles β for each lenslet, often with sub-pixel resolution. Thus, a local tilt of the wavefront 908 across each lenslet can be calculated from the position of the spot on the optical sensor 610 generated by the lenslet. Any phase aberration can be approximated by a set of discrete tilts. By sampling signals from the elements of the optical sensor 610, all these tilts can be measured, and the whole wavefront can be reconstructed and characterized as numerical wavefront data. The wavefront data can then be used to characterize the eye 800 (Fig. 8) as an optical system.
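One way to picture the displacement-to-tilt step is that the local wavefront slope over a lenslet is approximately the centroid displacement divided by the lenslet focal length. The following is a minimal sketch under assumed values for lenslet focal length and pixel pitch; it is not asserted to be the exact computation performed by any particular aberrometer.

```python
import numpy as np

# Minimal sketch of the spot-displacement-to-tilt step in a Hartmann-Shack
# sensor. The lenslet focal length and pixel pitch are illustrative assumptions.
LENSLET_FOCAL_LENGTH_MM = 5.0   # assumed
PIXEL_PITCH_MM = 0.005          # assumed (5 micron pixels)

def local_slopes(spot_xy_px, reference_xy_px):
    """Approximate local wavefront slopes (dW/dx, dW/dy), one row per lenslet,
    from measured and reference spot centroids given in pixels."""
    displacement_mm = (np.asarray(spot_xy_px, dtype=float)
                       - np.asarray(reference_xy_px, dtype=float)) * PIXEL_PITCH_MM
    return displacement_mm / LENSLET_FOCAL_LENGTH_MM

# Example: one spot displaced 3 pixels in x and 1 pixel in y from its reference.
print(local_slopes([[3.0, 1.0]], [[0.0, 0.0]]))   # [[0.003 0.001]]
```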
[00170] Using the displacements of each spot, it is possible to reconstruct an analytical representation of the wavefront. For example, the shape of the wavefront 808 can be expressed as a weighted sum of a set of pre-determined three-dimensional surface shapes or basis functions. Each shape of the set is usually defined by an independent polynomial function which represents a specific aberration term. Among all the possible sets of basis functions, it is common to use the Zernike polynomials. The Zernike polynomials are appropriate for describing very complex shapes, such as wavefront aberrations, because they are orthonormal over circular pupils and, more importantly, because they are constructed in such a way that higher-order polynomials are "balanced" by lower-order polynomials so that the image intensity at the focal plane can be optimized when the amount of aberration is low. Fig. 10 illustrates the shapes defined by the 0th through 4th orders (modes) of the Zernike polynomials. The views in Fig. 10 are perspective. However, often these shapes are shown in top view, using color gradients to represent powers of the aberrations. The shapes become increasingly complex with increased order, and these shapes can be combined to precisely describe a surface that fits a measured wavefront as closely as possible.
[00171] Each order describes a surface shape that corresponds with an ocular aberration. The 0th order has one term (Z0) that represents a constant. The 1st order has two terms that represent tilt for the x and y axes. The 2nd order includes three terms that represent defocus and regular astigmatism in two directions. The 3rd order has four terms that represent coma and trefoil. The 4th order has five terms that represent tetrafoil, secondary astigmatism and spherical aberration. The 5th order (not shown) has six terms that represent pentafoil aberration. The polynomials can be expanded up to an arbitrary order, if a sufficient number of measurements are made for the calculations and the optical sensor provides sufficient spatial resolution.
[00172] Thus, Zernike analysis describes a wavefront mathematically as a weighted sum of
Zernike polynomials. The weight applied to each mode when computing this sum is called a Zernike coefficient and is usually expressed in microns. The weighted sum of the Zernike polynomials equals a description of all the aberrations, i.e., a total refractive error, of an eye. In practice, a Zernike analysis includes a finite number of modes. Once the total refractive error of an eye has been ascertained to a desired accuracy, i.e., using a desired number of Zernike modes, a corrective lens prescription can be calculated to compensate for the refractive error in a well-known manner. Thus, a spot diagram can be used to calculate a prescription.
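The step from second-order Zernike coefficients to a sphero-cylindrical prescription is commonly performed with published power-vector formulas. The sketch below uses that published conversion (coefficients in microns, pupil radius in millimeters, results in diopters); it is illustrative only and is not asserted to be the exact computation used by the analysis described herein.

```python
import math

def low_order_prescription(c2_minus2, c2_0, c2_plus2, pupil_radius_mm):
    """Convert second-order Zernike coefficients (microns, OSA convention:
    oblique astigmatism, defocus, with/against-the-rule astigmatism) over a
    pupil of the given radius (mm) to sphere/cylinder/axis in diopters.
    Published power-vector conversion; illustrative only."""
    r2 = pupil_radius_mm ** 2
    M = -c2_0 * 4.0 * math.sqrt(3.0) / r2          # spherical equivalent (D)
    J0 = -c2_plus2 * 2.0 * math.sqrt(6.0) / r2     # astigmatism at 0/90 degrees (D)
    J45 = -c2_minus2 * 2.0 * math.sqrt(6.0) / r2   # astigmatism at 45/135 degrees (D)
    cylinder = -2.0 * math.hypot(J0, J45)          # negative-cylinder convention
    sphere = M - cylinder / 2.0
    axis_deg = math.degrees(0.5 * math.atan2(J45, J0)) % 180.0
    return {"sphere": sphere, "cylinder": cylinder, "axis": axis_deg}

# Example: pure defocus of +1 micron over a 2 mm pupil radius (about -1.73 D sphere).
print(low_order_prescription(0.0, 1.0, 0.0, 2.0))
```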
[00173] Because no two eyes yield identical sets of Zernike coefficients (assuming a sufficient number of Zernike modes), the Zernike coefficients can be used somewhat analogously to a fingerprint to uniquely identify an individual eye and, therefore, an individual person.
[00174] Optical properties of an eye include: scattering (which may be used to determine if a patient has cataracts), wavefront (which may be used to measure refraction, low-order aberrations, high-order aberrations, accommodation, keratoconus, which is a high-order spherical aberration, and the like) and pupil size.
[00175] All prior art methods and apparatus for determining eyeglass prescriptions have associated problems. For example, phoropters require skilled clinicians and rely on subjective reports from patients. Hartmann-Shack wavefront aberrometers require ciliary muscles to be temporarily paralyzed by a cycloplegic agent, the eye to be "fogged" or the patient to be shown a virtual image at infinity, so as to prevent accommodation while the eye is measured.
[00176] Accommodation introduces an uncontrolled variable into the measurement process.
Fogging refers to temporarily disposing a lens with positive spherical power in front of a patient's eye in an attempt to control accommodation. The goal of fogging is to move the focal point in front of the retina, regardless of the distance to the object. Essentially, the patient is temporarily made artificially myopic. As noted, the eye accommodates by changing the shape of the lens to increase its optical power in order to see close objects more clearly. However, if an eye is fogged, and the eye accommodates, vision becomes blurrier, not clearer, regardless of the distance to the object, thus discouraging accommodation. Some patients do not respond well to fogging.
[00177] Virtual images are images created within a diagnostic instrument but that optically are located at least 20 feet (6 meters) from the patient. However, when a patient looks into a relatively small (compared to 20 feet) instrument, the patient intuitively knows the viewed object is not 20 feet away and, therefore, the patient tends to accommodate. This phenomenon is sometimes referred to as "instrument-induced myopia," and it is difficult to avoid, even with fogging techniques.
[00178] Most ophthalmological diagnostic equipment is large, heavy and mechanically complex, at least in part because the equipment is designed to hold a patient's head steady and align it, and thereby align the patient's eye, with certain optical elements within the diagnostic equipment. Consequently, this equipment is typically attached to a table and includes heavy-duty structural members, forehead and chin rests and rack and pinion alignment mechanisms.
Lightweight portable automatic Hartmann-Shack wavefront aberrometer
[00179] Figs. 11, 12 and 13 contain various views of a lightweight, portable, hand-held, self-contained, automatic optical or ophthalmologic apparatus 1100 that includes a Hartmann-Shack wavefront aberrometer, according to an embodiment of the present invention. Fig. 14 shows the apparatus 1100 in use by a patient 1400. The apparatus 1100 solves many of the problems associated with the prior art. For example, the apparatus 1100 provides feedback to the patient 1400, enabling the patient 1400 to correctly align the apparatus 1100 to the patient's eye, without the cumbersome mechanical paraphernalia required by prior art devices. Furthermore, the apparatus 1100 is of an "open view" design, therefore it is configured to inherently encourage the patient 1400 not to accommodate, without any cycloplegic agents, fogging or virtual images. The apparatus 1100 automatically determines when the patient 1400 is not accommodating, and uses data acquired during a period of non-accommodation to automatically calculate an eyeglass prescription. Alternatively, the apparatus 1100 can measure the optical properties of an eye that is focused at a known, non-infinite distance, and these optical properties can be used to calculate the patient's eye's optical properties if the patient were to focus at infinity.
[00180] The apparatus 1100 includes an eyepiece 1102, into which the patient 1400 looks with one eye. The eyepiece 1102 may include an eyecup configured to be pressed against the patient's face, thereby blocking ambient light. The eyecup may be sized and shaped differently to fit well against various facial geometries and anatomical configurations, such as young and old patients. The apparatus 1100 also defines an exit port 1104, through which the patient 1400 can see. Thus, the apparatus 1100 has an "open view" configuration. [00181] Fig. 15 is a schematic block diagram of the apparatus 1100 showing its internal components, within a body 1500. Two beamsplitters 1501 and 1502 are disposed along an optical axis 1504 between the eyepiece 1102 and the exit port 1104. The patient looking into the eyepiece 1102 along the optical axis 1504 can see an external object 1506 that is aligned with the optical axis 1504. A view, as seen by the patient, is shown in an insert of Fig. 15.
[00182] In one embodiment, a visible light source 1508, such as a laser diode or light emitting diode (LED), emits a beam of light 1510, which the beamsplitter 1502 reflects along the optical axis 1504 out the exit port 1104, as indicated by arrow 1512. The beam 1512 can be used to create a spot of light on a distant wall or other object 1514. In this description, the object 1506 is assumed to be the spot of light created on the wall 1514 by the beam 1512. The visible light source 1508 is fixed, relative to the body 1500 and the optical components within the apparatus 1100. Thus, the beam 1512 is always coincident with the optical axis 1504.
[00183] The distant wall 1514 should be at least 20 feet (6 meters) from the apparatus 1100, so when the patient looks at the spot 1506, the patient's eye 1516 is substantially unaccommodated. An ultrasonic or other range sensor 1517 may be used to measure the distance between the apparatus 1100 and the wall 1514. The apparatus 1100 may provide an audible, visual, haptic or other warning if the distance is inappropriate. A return beam 1518 from the spot 1506 enters the exit port 1104, passes through the two beamsplitters 1502 and 1501 along the optical axis 1504 and enters the patient's eye 1516 via the eyepiece 1102. This enables the patient to see the spot 1506. For clarity, the optical axis 1504 and the two beams 1512 and 1518 are shown spaced apart; however, the axis and the two beams are coincident.
[00184] In another embodiment, the target can be an arbitrary, but known, distance from the patient. For example, if the target is projected 10 feet (3 meters) from the instrument, the amount of accommodation necessary for the patient to focus on the target is calculated, and then a prescription is calculated that compensates for the accommodation.
[00185] The eyepiece 1102 may also be referred to as a proximal port, and the exit port 1104 may also be referred to as a distal port. The body 1500 forms a visual channel between the eyepiece 1102 and the exit port 1104. The beam 1512 may also be referred to as a target beam, the wall or other object 1514 may be referred to as a target and the spot 1506 may also be referred to as target indicia. [00186] Optionally, the visual channel between the eyepiece 1102 and the exit port 1104 may have a conic shape, i.e., the shape may be a portion of a cone. In such an embodiment, the visual channel is configured such that a vertex of the conic shape is toward the eyepiece 1102 and a base of the conic shape is toward the exit port 1104. A pinhole constrains where a user can position her eye and see through the pinhole. A pinhole does not, however, constrain the angle along which the user can see through the pinhole. A tubular or conic visual channel does, however, constrain the view angle. Thus, a conic visual channel, with pinhole, which may be implemented as a small hole at or near the vertex of the cone, constrains both the position of the user's eye and the angle along which the eye sees.
[00187] Another light source 1520, such as another laser diode, projects a beam of light
1522. The beam splitter 1502 reflects the beam 1522 toward the eyepiece 1102 along the optical axis 1504, as indicated by arrow 1524. The beam 1524 illuminates a spot 1525 on the back of the eye 1516, thereby essentially creating a virtual point light source within the eye 1516. This virtual light source 1525 corresponds to the spot 618 described above, with respect to Fig. 8. As discussed above, with respect to Hartmann-Shack wavefront aberrometry, return wavefronts travel along a beam 1526 from the eye 1516. The beamsplitter 1501 reflects the beam 1526, and resulting beam 1528 passes through a lenslet array 1530 and impinges on an optical sensor 1532. Optional optics 1534, such as a relay lens system to make the lenslet array 1530 optically conjugate with the patient's spectacle plane and a band-pass and/or neutral density filter, may be disposed in the path of beam 1528. For clarity, the optical axis 1504 and the two beams 1524 and 1526 are shown spaced apart; however, the axis and the two beams are generally coincident.
[00188] Although embodiments using Hartmann-Shack wavefront aberrometry using a lenslet array are described, other methods for wavefront sensing can be used. Other embodiments use pinhole arrays or arrays of sensors for defocus imaging. In some embodiments, time-of-flight cameras, interferometric techniques or partitioned aperture wavefront imaging systems are used. Partitioned aperture wavefront imaging systems are well known to those of skill in the art, as evidenced by information available at http://biomicroscopy.bu.edu/research/partioned-aperture-wavefront-imaging.
[00189] An analysis unit 1536 is electronically coupled to the optical sensor 1532. The analysis unit 1536 includes appropriate interface electronics, a processor, memory and associated circuits configured to analyze signals from the optical sensor 1532 to calculate x and y displacements of spots in a spot diagram from where they would be if the eye 1516 were normal. From this data, the analysis unit 1536 calculates a set of Zernike coefficients and calculates a corrective lens prescription. Additional details about these analyses and calculations are provided below.
[00190] An internal battery 1538 powers the analysis unit 1536, the two light sources 1508 and 1520, the optical sensor 1532 and other components of the apparatus 1100. A handle portion 1539 of the housing 1500 may house the battery 1538. All electronic components of the apparatus 1100 are powered by the battery 1538, and all calculations necessary to ascertain the prescription are performed by the analysis unit 1536. Thus, the apparatus 1100 is completely self-contained, i.e., all components, apart from the wall 1514 and the eye 1516, necessary to perform its functions are included within the housing 1500. The apparatus 1100 is small and lightweight enough that a typical patient can hold it in place with one hand for long enough to perform the described measurement.
[00191] In one embodiment of the apparatus 1100, the light source 1520 that creates the virtual light source 1525 within the eye 1516 is a near infrared (NIR) light source. The wavelength of the light source 1520 is selected such that the patient perceives a red dot, although the bulk of the energy of the beam 1524 is not within the spectrum visible to the patient. On the other hand, the visible light source 1508 is selected to have a perceptively different color, such as green, than the red perceived by the patient from the NIR light source 1520. The patient may be instructed to orient the apparatus 1100, relative to the patient's eye, so as to maximize the perceived brightness of the red dot.
[00192] Thus, as schematically illustrated in Fig. 15-1, if the patient's eye 1516 is properly aligned with the eyepiece 1102, such that the eye's center of vision 1590 is aligned with the optical axis 1504 of the apparatus 1100, the patient perceives two coincident dots 1592 and 1594, one red and the other green, as illustrated on the left side of Fig. 15-1, or a single dot that is both red and green. Thus, the patient can be instructed to reorient the apparatus 1100 until she perceives the two coincident dots or one dual-colored dot. The patient can then easily hold the apparatus 1100 in the proper alignment for the short time required to collect data for generating a prescription.
[00193] As schematically illustrated in Fig. 15-2, if the patient's eye 1516 is improperly aligned with the eyepiece 1102, such that the eye's center of vision 1590 is parallel to, but slightly displaced from, the optical axis 1504 of the apparatus 1100, the patient sees the dots 1592 and 1594 off-center within the field of view afforded by the eyepiece, as exemplified on the left side of Fig. 15-2. However, as schematically illustrated in Fig. 15-3, if the patient's eye 1516 is grossly misaligned, the patient does not see any dots within the field of view afforded by the eyepiece, as shown on the left side of Fig. 15-3.
[00194] Thus, the simple design of the apparatus 1100 enables easy alignment of a patient's eye with optics of the apparatus 1100, without a chin rest or other complex heavy mechanical alignment apparatus required by the prior art. Furthermore, the open view design of the apparatus 1100 encourages the patient not to accommodate, without any cycloplegic agents, fogging or virtual images.
[00195] In other embodiments, other wavelengths may be used by the two light sources 1508 and 1520. In some embodiments, visible wavelengths are used for both of the light sources 1508 and 1520. In some embodiments, identical or similar wavelengths are used by both of the light sources 1508 and 1520, but one or both of the light sources 1508 and 1520 blink, so the patient can distinguish between the two resulting dots. If both light sources 1508 and 1520 blink, they should alternate being on.
[00196] The apparatus 1100 may include additional optical elements, such as a diaphragm
1540 to define the beam 1522 and align it with the beamsplitter 1502. An adjustable iris diaphragm 1542 may be used to define the exit port 1104. In one embodiment, the diaphragm 1542 has a maximum diameter of about 7 mm, and the beamsplitter 1500 has a 4:1 ratio of reflectance to transmittance at the operational wavelength of the light source 1520. The light source 1520 may generate an approximately 3 mW beam of about 2 mm in diameter at a wavelength of about 850 nm. The beamsplitter 1502 may include a "hot mirror," which passes visible light entering the exit port 1104, within a range of about 375 nm to about 725 nm, so the patient can see the spot 1506 through the eyepiece 1102. Optionally, the components of the apparatus 1100 may be displaced along a Y axis, so as to offset the beams 1504 and 1527 by about 1-2 mm to reduce specular reflection from the eye 1516. This specular reflection constitutes noise to the optical sensor 1532.
[00197] The amount of optical power that can safely be delivered by the light source 1520 to the eye 1516 is limited. Ambient light that enters the apparatus 1100 and impinges on the optical sensor 1532 constitutes noise. Under high ambient light conditions, this noise may reach unacceptably high levels. In addition, the ambient light may overwhelm the light from light source 1520, thereby preventing the patient from seeing a spot from this light source. Optionally, to reduce the level of this noise and reduce the level of ambient light seen by the patient, a neutral density filter 1544 may be disposed along the light path between the exit port 1104 and the beam splitter 1502. The neutral density filter may be selected or adjusted to admit any appropriate amount of light, such as about 1%.
[00198] In another embodiment shown schematically in Fig. 16, an optical or ophthalmologic apparatus 1600 includes components as described above, with respect to Fig. 15, and further includes a cross-hair 1602 disposed along the optical path between the eyepiece 1102 and the exit port 1104, such that the center of the cross-hair 1602 coincides with the optical axis 1504. Thus, the cross-hair 1602 is visible in the field of view of the eye 1516, as shown in Figs. 17 and 18. If the patient sights down the center of the optical path between the eyepiece 1102 and the exit port 1104, thereby aligning his eye 1516 with the optical axis 1504, the spot 1506 appears at the intersection of the cross-hair 1602, as shown in Fig. 17. However, if the patient does not properly align his eye 1516 with the optical axis 1504, the spot 1506 does not appear at the intersection of the cross-hair 1602, for example as shown in Fig. 18. The patient can be instructed to reorient the apparatus 1600 until he sees the spot 1506 at the center of the cross-hair 1602. In this embodiment, the light source 1520 need not generate a beam 1522 that is perceived at all by the patient.
[00199] The cross-hair 1602 should be disposed a distance away from the eye 1516, so as not to require the eye 1516 to accommodate and still have the cross-hair 1602 reasonably well focused. This may require the cross-hair 1602 to be held a distance from most of the housing 1604, such as by an outrigger 1608, as shown in the insert of Fig. 16.
[00200] Other aspects of the apparatus 1600 are similar to the apparatus 1100; however, some reference numerals are omitted from Fig. 16 in the interest of clarity.
[00201] In yet another embodiment shown schematically in Fig. 19, an optical or ophthalmologic apparatus 1900 includes components as described above, with respect to Fig. 15, except the visible light source 1902 projects a spot on the wall 1514, without the visible beam 1904 passing through the beamsplitter 1502. As shown schematically in Fig. 20, the beam 1904 is not parallel to the optical axis 1504. Thus, the beam 1904 intersects the optical axis 1504 at a distance 2000 from the apparatus 1900. An angle 2002 of the light source 1902, relative to the optical axis 1504, is selected such that the distance 2000 is at least 20 feet (6 meters). The ultrasonic or other range sensor 1517 or a simple tape measure may be used to selectively dispose the apparatus 1900 at a desired distance from the wall 1514.
[00202] In another embodiment shown schematically in Fig. 21, an apparatus 2100 is similar to the apparatus shown in Figs. 19 and 20, except the visible light source 1902 is aligned parallel to, but spaced apart from, the optical axis 1504. If the distance between optical axes of the projected light source 1902 and the internal light source 1520 is sufficiently small, then when the patient aligns the device so that images of the two sources are coincident, the eye is sufficiently aligned for an accurate wavefront measurement. In some embodiments, the axis of the visible light source 1902 or 1508 (Fig. 15) is offset about 20 mm from the axis of the internal light source 1520. However, with a target distance of about 20 feet (6 meters), this imperfect alignment does not substantially affect operation of the instrument or the prescriptions or measurements taken by the instrument. Thus, in embodiments in which the visible light source 1902 or 1508, internal light source 1520 and/or optical axis 1504 misalignment is less than about 0.5%, we refer to these components as being "substantially aligned."
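To put the quoted numbers in perspective, a 20 mm lateral offset viewed from a 6 meter target corresponds to roughly a 0.33% (about 0.19 degree) misalignment, consistent with the 0.5% criterion above. A minimal sketch of the arithmetic:

```python
import math

# Minimal sketch: misalignment implied by a 20 mm lateral source offset when
# the target is 6 meters (about 20 feet) away.
offset_m = 0.020           # about 20 mm offset, as described above
target_distance_m = 6.0    # at least 20 feet (6 meters)

fractional_misalignment = offset_m / target_distance_m              # ~0.0033, i.e. ~0.33%
angle_deg = math.degrees(math.atan2(offset_m, target_distance_m))   # ~0.19 degrees
print(f"{fractional_misalignment:.2%} misalignment, {angle_deg:.2f} degrees")
```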
[00203] In another embodiment schematically illustrated in Figs. 55-57, the apparatus does not include a visible light source, such as light source 1508 or 1902 (Figs. 15 and 19). Instead, the patient is instructed to look into the apparatus and maintain her gaze in the center of the field of view provided by the eyepiece 1102 and exit port 1104 of the apparatus. In Fig. 55, the eye 1516 is properly aligned with the optical axis 1504 of the device. A hypothetical view, as seen by the eye 1516, is shown on the left in Fig. 55. In Fig. 56, the eye 1516 is slightly misaligned with the optical axis 1504 of the device. A hypothetical view, as seen by the eye 1516, is shown on the left in Fig. 56. In Fig. 57, the eye 1516 is grossly misaligned with the optical axis 1504 of the device. A hypothetical view, as seen by the eye 1516, is shown on the left in Fig. 57.
[00204] In such embodiments, it may be advantageous to provide a relatively small field of view, such as by closing the iris diaphragm 1542 (Fig. 15) smaller than in the embodiments described above with respect to Figs. 15, 16 and 19. Processing of the spot diagram generated from the patient's eye 1516 may be used to ascertain whether the patient's eye 1516 is properly aligned with the optical axis 1504 and, if not, generate a feedback instructional signal to the patient, as described in more detail below.
Binocular aberrometer
[00205] Fig. 22 illustrates a possible source of error in measurements made by the devices described thus far. If a patient holds the apparatus 2200 at an angle 2202 other than perpendicular to the patient's interpupillary axis 2204, the cylindrical axis components of a prescription generated by the apparatus 2200 may be incorrect. One solution to this problem involves including an accelerometer (not shown) in the apparatus 2200 to detect if the apparatus 2200 is oriented other than vertical and, if so, warn the user. Another solution is to use the measured angle from the accelerometer to offset the measured cylindrical axis by the appropriate amount. However, these approaches have limitations. For example, the patient may not be positioned with her head vertical, thereby making a vertical orientation of the apparatus an incorrect orientation.
[00206] To overcome this problem, optionally, any embodiment described herein may be configured as a binocular instrument, as exemplified by an optical or ophthalmologic apparatus 2300 illustrated in Fig. 23. An alternative binocular instrument 2350 is illustrated in Fig. 23-1. The binocular instrument 2300 may be held by a patient using two hands, thereby providing more stability than one hand holding a monocular instrument, at least in part because using two hands reduces the number of degrees of freedom of movement of the instrument 2300. Because the binocular instrument 2300 is more likely to be held by the patient so the instrument axis between the two eyepieces is parallel to the patient's interpupillary axis, prescriptions to correct astigmatism are more likely to include accurate angles of the cylinder axis.
[00207] In the binocular instrument 2300, one side 2302 of the instrument 2300 includes the components described above, such as with respect to Fig. 15, and the other side 2304 of the instrument is essentially merely a hollow tube. A patient is very likely to hold a binocular instrument to her face in a manner such that a vertical axis 2306 of the instrument is perpendicular to the patient's interpupillary axis, even if the patient leans her head to one side.
[00208] The side 2302 of the binocular instrument 2300 that includes the aberrometer may include a neutral density filter 1544 (Fig. 15) to reduce the amount of ambient light admitted into the instrument, as discussed above. Even without a neutral density filter 1544 in the "business" side 2302 of the binocular instrument 2300, the beamsplitter 1501, band pass filter 1534, etc. attenuate light. Therefore, the other side 2304 of the binocular instrument 2300 should include a neutral density filter, so both eyes receive approximately equal amounts of light. [00209] Once the instrument 2300 has been used to measure one eye, the instrument 2300 can be turned upside down to measure the other eye. The binocular instrument 2300 shown in Fig. 23 includes two handles 2308 and 2310, making the instrument equally easy to hold right-side up and upside down. Alternatively, both sides 2302 and 2304 may include most of the components described above, such as with respect to Fig. 15. Such an embodiment can measure both eyes substantially simultaneously, without requiring the device to be turned upside down. Alternatively, additional beamsplitters can be incorporated into a secondary channel to route the measuring light and wavefront sensor field of view to simultaneously image spot diagrams from both eyes.
[00210] As shown schematically in Fig. 24, the two portions 2302 and 2304 of the binocular instrument 2300 may be adjustably attached to each other by a dovetail or other sliding rail 2400, enabling distance 2402 between centers of the two eyepieces 1102 and 2404 to be adjusted to match a patient's interpupillary distance. Once the separation between the two eyepieces 1102 and 2404 has been adjusted so the eyecups comfortably fit contours of the patient's face, the interocular distance can be read from a scale 2406 using a pointer 2408. Optionally or alternatively, a linear encoder 2410 and indicia 2412 are used to electronically measure the distance 2402. The distance 2402 can be used as a parameter for constructing a pair of eyeglasses for the patient.
[00211] Rather than a sliding rail 2400, a worm gear or other suitable linear, angular or other adjustable link may be used. For example, as shown schematically in Fig. 25, the two portions 2302 and 2304 may be adjustably attached to each other by a pivot joint 2500. As an angle 2502 defined by two connecting members 2504 and 2506 changes, the distance 2402 between the centers of the two eyepieces 1102 and 2404 changes. A scale 2508 and pointer 2510 may be calibrated to indicate the distance 2402 or the angle 2502. Of course, the distance 2402 can be calculated from the angle 2502 and known geometry of the instrument. Optionally or alternatively, an angular encoder 2510 (shown in dashed line) is included in the pivot joint 2500.
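As one illustration of computing the distance 2402 from the angle 2502, if the two connecting members are assumed to have equal, known length, the eyepiece separation is the chord of an isosceles triangle. The member length below is an illustrative assumption, not a dimension taken from the disclosure.

```python
import math

# Minimal sketch: if the two connecting members have equal length L and meet at
# the pivot with angle theta, the eyepiece centers are separated by the chord
# 2 * L * sin(theta / 2). The member length is an illustrative assumption.
MEMBER_LENGTH_MM = 80.0   # assumed

def eyepiece_separation_mm(pivot_angle_deg: float) -> float:
    """Distance between eyepiece centers for a given pivot angle (degrees)."""
    return 2.0 * MEMBER_LENGTH_MM * math.sin(math.radians(pivot_angle_deg) / 2.0)

print(eyepiece_separation_mm(45.0))   # about 61 mm
```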
Analysis unit
[00212] As noted with respect to Fig. 15, the optical or ophthalmologic apparatus 1100 includes an analysis unit 1536 configured to analyze signals from the optical sensor 1532 to calculate x and y displacements of spots in a spot diagram, calculate a set of Zernike coefficients from the displacements and calculate a corrective lens prescription from the coefficients. The analysis unit 1536 also controls operation of various components of the apparatus 1100. Fig. 26 is a schematic block diagram of the analysis unit 1536. Similar analysis units may be used in other embodiments of the present invention.
[00213] The analysis unit 1536 includes a processor 2600 coupled to a memory 2602 via a computer bus 2604. The processor 2600 executes instructions stored in the memory 2602. In so doing, the processor 2600 also fetches and stores data from and to the memory 2602.
[00214] Also connected to the computer bus 2604 are: an optical sensor interface 2606, a light source interface 2608, an iris/neutral density filter interface 2610, a computer network interface 2612, a range finder interface 2614, an audio/visual/haptic user interface 2616 and a user input interface 2618. These interfaces 2606-2618 are controlled by the processor 2600 via the computer bus 2604, enabling the processor to send and/or receive data to and/or from respective components coupled to the interfaces 2606-2618, as well as control their operations.
[00215] The optical sensor interface 2606 is coupled to the optical sensor 1532 (Fig. 15) to receive data from the pixels, quadrant sectors or other elements of the optical sensor 1532. As noted, in some embodiments, the optical sensor 1532 is pixelated. In some embodiments, the optical sensor 1532 includes a rectangular array of quadrant sensors. In either case, the optical sensor 1532 provides data indicating intensity of illumination impinging on portions of the optical sensor 1532. The processor 2600 uses this information to calculate locations of centroids of spots of a spot diagram and to calculate displacements of the centroids from locations where a perfect eye would cause the centroids to impinge on the optical sensor 1532. Some optical or ophthalmologic apparatus embodiments, described below, include other or additional optical sensors, which are also coupled to the optical sensor interface 2606.
[00216] The light source interface 2608 is coupled to the visible light source 1508 (Fig. 15) and to the light source 1520 to control their operations, such as turning the light sources on and off and, in some embodiments, controlling intensities of light emitted by the light sources 1508 and 1520. In some embodiments, described below, one or both of the light sources 1508 and 1520 include a respective array of individual light sources. In these cases, the light source interface 2608 may enable the processor to control each of the individual light sources separately.
[00217] The iris/neutral density filter interface 2610 is coupled to the adjustable iris diaphragm 1542 and/or the neutral density filter 1544 (Fig. 15) to enable the processor 2600 to control their operations. For example, the processor 2600 may send signals, via the interface 2610, to command the iris diaphragm 1542 to open or close to a specified size. Similarly, if the neutral density filter 1544 is adjustable, the processor 2600 may send signals, via the interface 2610, to command the neutral density filter 1544 to admit a specified portion of light.
[00218] The network interface 2612 includes a wired or wireless interface, such as a universal serial bus (USB) interface, a wired Ethernet interface, a Bluetooth interface, a wireless infrared (IR) interface, a wireless local area network (WLAN) interface or a wireless cellular data interface, by which the processor 2600 may communicate with another suitably equipped external device, such as a printer, a personal computer, a cell phone or smartphone, an automated lens grinder or an eyeglass order processing system. In some embodiments, the processor 2600 sends a prescription it has calculated to the external device, either directly or via a network, such as a local area network or a cellular carrier network. In some embodiments, the processor receives patient data, program updates, configuration information, etc. from an external device via the network interface 2612. Although embodiments have been described in which all Zernike and prescription calculations are performed within the apparatus, in other embodiments the processor sends raw data from the optical sensor 1532, calculated spot diagram information, Zernike coefficients or other intermediate information to the external device, and the external device calculates the prescription.
[00219] The range finder interface 2614 is coupled to any range sensor 1517 in the ophthalmologic apparatus.
[00220] The audio/visual/haptic interface 2616 is coupled to any audio, visual and/or haptic output devices in the ophthalmologic apparatus. For example, as noted, the apparatus 1100 may provide an audible, visual, haptic or other warning if the distance 2000 (Fig. 20) between the apparatus 1900 and the wall 1514 is inappropriate. Alternatively, as indicated below, this interface 2616 can be used to provide feedback about the alignment between the patient's eye and the optical axis of the device. Suitable audio devices include beepers, loudspeakers, piezoelectric devices, etc. Suitable visual devices include lights, liquid crystal display (LCD) screens, etc. Suitable haptic devices include vibrators, refreshable braille displays, etc.
[00221] The user input interface 2618 is coupled to any user input devices in the ophthalmologic apparatus. Such input devices may, for example, be used to initiate a measurement of a patient's eye. Suitable user input devices include buttons, keys, triggers, touchscreens, tactile sensors, etc. An exemplary user interface 2352 is shown in Fig. 23-1.
[00222] One or more of the interfaces 2606-2618, the processor 2600, the memory 2602 and the computer bus 2604, or any portion thereof, may be replaced or augmented by a suitably programmed device such as a programmable logic device (PLD) 2626, field-programmable gate array (FPGA) 2620, digital signal processor (DSP) 2622, application-specific integrated circuit (ASIC) 2624, discrete logic or suitable circuit. The components connected to the interfaces 2606- 2618, the interfaces themselves, the processor 2600, the memory 2602 and computer bus 2604, together with the optical and mechanical elements described herein, collectively perform the functions described herein, under control of the processor 2600 and/or the PLD 2626, FPGA 2620, DSP 2622 and/or ASIC 2624.
Automatically determine whether eye is aligned with optical axis
and provide feedback to patient
[00223] As noted with respect to Fig. 7, an array of spots (a spot diagram 710) is projected on the optical sensor 610. If the eye 1516 is aligned with the optical axis 1504 of the device 1100 as shown in Fig. 15, the spot diagram is centered on the optical sensor 1532. However, as schematically exemplified in Fig. 27, if the eye 1516 is slightly misaligned with the optical axis 1504, the spot diagram 2700 is not centered on the optical sensor 1532. It should be noted that, even if the eyecup remains firmly pressed against a patient's face and the device 1100 does not move relative to the patient's head, the patient's eye can move within its eye socket and, therefore, become unaligned with the optical axis 1504.
[00224] Various approaches are available for automatically detecting when the patient has not aligned her eye 1516 with the optical axis 1504 and for providing feedback to the patient that notifies the patient of the misalignment. In some embodiments, the feedback indicates to the patient an extent and/or direction of the misalignment to provide guidance for self-correction. Several of these approaches will now be described.
[00225] Fig. 28 is a schematic block diagram of an alignment feedback module 2800, according to several embodiments of the present invention. As used herein, the term "module" refers to one or more interconnected hardware components, one or more interconnected software components or a combination thereof. Thus, the alignment feedback module 2800 may be implemented by any of the components discussed above, with respect to Fig. 26.
[00226] In Fig. 9, it can be seen that all the spots of the spot diagram 810 are generally not of equal intensity. On the sensor 610, intensity of each spot is schematically indicated by the diameter of the spot. In general, spot intensity decreases with radial distance from the center of the spot diagram 810. Spot intensity distribution within the spot diagram is represented by a three-dimensional surface graph 920.
[00227] Returning to Fig. 28, a spot diagram centroid and size calculator 2802 is coupled to the optical sensor 1532 to receive signals therefrom, such as the intensity of light detected by each pixel or each quadrant. The spot diagram centroid and size calculator 2802 calculates size and location of the centroid of the entire spot diagram, such as its x and y or polar coordinates and the spot diagram diameter, on the optical sensor 1532. The spot diagram centroid calculator 2802 may use any appropriate algorithm or method for determining the centroid and size. Many such algorithms and methods are well known. In some embodiments, a weighted sum of the coordinates of the illuminated pixels is calculated, where each pixel's coordinates are weighted by the illumination level detected by the pixel. This information can also be used to determine the size of the spot diagram, e.g., the diameter of the spot diagram.
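An intensity-weighted centroid and a simple size estimate of the kind described above can be sketched as follows. The threshold and the use of the illuminated-region extent as a diameter estimate are assumptions for illustration, not the disclosed algorithm.

```python
import numpy as np

def spot_diagram_centroid_and_size(frame, threshold_fraction=0.1):
    """Estimate the intensity-weighted centroid (x, y, in pixels) and an
    approximate diameter of the whole spot diagram from a 2-D intensity image.
    Threshold and diameter heuristic are illustrative assumptions."""
    img = np.asarray(frame, dtype=float)
    mask = img > threshold_fraction * img.max()   # keep only illuminated pixels
    ys, xs = np.nonzero(mask)
    weights = img[ys, xs]
    cx = np.average(xs, weights=weights)          # weighted sum of x coordinates
    cy = np.average(ys, weights=weights)          # weighted sum of y coordinates
    diameter_px = max(xs.max() - xs.min(), ys.max() - ys.min())
    return cx, cy, diameter_px

# Example with a synthetic frame: a bright block centered at row 20, column 32.
frame = np.zeros((64, 64))
frame[15:26, 27:38] = 1.0
print(spot_diagram_centroid_and_size(frame))   # approximately (32.0, 20.0, 10)
```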
[00228] Even if only a portion 2900 of the spots of a spot diagram falls on the optical sensor
1532, such that the true centroid of the spot diagram falls completely off the optical sensor 1532, as schematically exemplified in Fig. 29, the spot diagram centroid calculator 2802 may use the portion 2900 of the spots to calculate a location within the spots 2900 and provide this location as the centroid of the spot diagram. Furthermore, the shape of the portion of the spot diagram falling on the optical sensor 1532 can also be used to estimate the size of the spot diagram. The curvature of the portion of the spot diagram falling on the optical sensor 1532 may be used to estimate the diameter of the spot diagram. Similarly, the curvature of the portion of the spot diagram falling on the optical sensor 1532 may be used to estimate the true center of the spot diagram, even if the center is not within the optical sensor 1532. Optionally, the spot diagram centroid calculator 2802 may generate an additional signal to indicate the true centroid of the spot diagram is off the optical sensor 1532. [00229] A difference calculator 2804 calculates a difference between the location of the centroid of the spot diagram and the center location 2806 of the optical sensor 1532. An output of the difference calculator 2804 represents a magnitude and direction 2808 of the displacement of the centroid of the spot diagram 2700 (Fig. 27) from the center of the optical sensor 1532. This magnitude and direction 2808 is fed to a feedback signal generator 2810.
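The difference calculation itself is simple vector arithmetic. A minimal sketch, with illustrative names, of deriving a misalignment magnitude and direction from the spot diagram centroid and the sensor center:

```python
import math

def misalignment(centroid_xy, sensor_center_xy):
    """Magnitude (pixels) and direction (degrees, counter-clockwise from +x) of
    the spot-diagram centroid displacement from the sensor center."""
    dx = centroid_xy[0] - sensor_center_xy[0]
    dy = centroid_xy[1] - sensor_center_xy[1]
    return math.hypot(dx, dy), math.degrees(math.atan2(dy, dx))

# Example: centroid at (512, 400) on a sensor whose center is (480, 480).
magnitude_px, direction_deg = misalignment((512.0, 400.0), (480.0, 480.0))
print(magnitude_px, direction_deg)   # about 86 pixels at about -68 degrees
```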
[00230] The feedback signal generator 2810 generates an audio, visual, haptic and/or other output to the patient and/or an optional operator. Some embodiments include a loudspeaker, as exemplified by a loudspeaker 1546 (Fig. 15), and the feedback signal generator 2810 is coupled to the loudspeaker 1546. In some embodiments, the feedback signal generator 2810 generates audio signals, via the loudspeaker 1546, to indicate to the patient an extent of misalignment and/or a direction of the misalignment. In some such embodiments, a pitch or volume of a sound or a frequency of ticks (somewhat like a sound emitted by a Geiger counter) may represent how closely the eye is aligned to the optical axis. In some embodiments, a particular sound, such as a beep, is played when or whenever the eye is properly aligned. The feedback signal generator 2810 may include a speech synthesizer to generate synthetic speech that instructs the patient how to improve or maintain the alignment of the eye, for example, "Move the instrument up a little," "Look a little to the left" or "Perfect. Don't move your eyes." The loudspeaker may also be used to play instructions for using the device. One important instruction is to ask the patient to blink. A fresh tear film is important for good measurement of the optical properties of the eye.
[00231] Some embodiments include visual indicators, such as arrows illuminated by LEDs, located in the eyepiece 1102, exit port 1104 or elsewhere in the instrument 1100. Exemplary visual indicators 1548 and 1550 are shown in the eyepiece 1102 in Fig. 15. The feedback signal generator 2810 may selectively illuminate one or more of these indicators 1548 and 1550 to represent a magnitude and direction in which the patient should adjust his gaze to better align his eye with the optical axis. Optionally or alternatively, the housing 1500 (Fig. 15) includes an LCD display, and the feedback signal generator 2810 generates a display, schematically exemplified by display 3000 in Fig. 30, indicating the location of the centroid 3002 of the spot diagram, relative to vertical and horizontal axes 3004 and 3006 that intersect at the center of the optical sensor. Such a display 3000 may be used by an operator who coaches the patient. Another embodiment of the display 3000 is indicated at 2351 in Fig. 23-1. Optionally or alternatively, the housing 1500 may include lights, such as LEDs, coupled to the feedback signal generator 2810 to indicate a relative direction, and optionally a relative distance, in which the instrument 1100 should be moved to improve alignment of the eye with the optical axis.
[00232] Some embodiments include haptic output devices that signal a patient with vibration along an axis to indicate the patient should shift his gaze or move the instrument 1100 in a direction along the axis of vibration. The frequency of vibration may indicate an extent to which the patient should shift his gaze or move the instrument 1100.
[00233] Thus far, it has been assumed that at least a portion of the spot diagram falls on the optical sensor. However, if the eye is grossly misaligned with the optical axis, none of the spot diagram falls on the optical sensor, or an insufficient portion of the spot diagram falls on the optical sensor for the spot diagram centroid calculator 2802 to calculate a centroid location. Some embodiments solve this problem by including an array of light sensors around the optical sensor 1532, as shown schematically in Fig. 31. Here, an array 3100 of light sensors, exemplified by light sensors 3102, 3104 and 3106, is arranged to largely surround the optical sensor 1532. The light sensors 3102-3106 are shown the same size as the optical sensor 1532. However, the light sensors 3102-3106 may be smaller or larger than the optical sensor 1532. Each light sensor 3102-3106 has a single light-sensitive area. Thus, the light sensors 3102-3106 may be less expensive than the optical sensor 1532.
[00234] The light sensors 3102-3106 are coupled to the spot diagram centroid calculator
2802. If the spot diagram, here exemplified by a spot diagram 3108, falls off the optical sensor 1532, a signal from one or more of the light sensors 3102-3106 indicates to the spot diagram centroid calculator 2802 at least a direction from the center of the optical sensor 1532 to the spot diagram 3108. As in the case where only a portion of the spot diagram falls on the optical sensor 1532, the spot diagram centroid calculator 2802 may use signals from the array of light sensors 3100 to calculate at least an approximate location of the spot diagram 3108 and provide this as a simulated location of the centroid of the spot diagram.
[00235] Optionally or alternatively, the spot diagram centroid calculator 2802 simply returns one of several directions from the center of the optical sensor 1532, in which the spot diagram 3108 has fallen. The number of possible directions may be equal to the number of light sensors 3102- 3106 in the array 3100. The number of possible directions may be greater than the number of light sensors 3102-3106. For example, with three or more light sensors, the spot diagram centroid calculator 2802 may calculate a direction by taking a weighted sum of the signals from the light sensors. Optionally, the spot diagram centroid calculator 2802 may generate an additional signal to indicate the true centroid of the spot diagram is off the optical sensor 1532.
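The weighted-sum direction estimate mentioned above can be sketched as a circular mean of the ring-sensor signals, assuming each surrounding sensor's angular position around the optical sensor is known. The angles and signal values below are illustrative assumptions.

```python
import math

# Minimal sketch: estimate the direction to an off-sensor spot diagram from a
# ring of single-element light sensors. The angular positions are assumed.
SENSOR_ANGLES_DEG = [0, 45, 90, 135, 180, 225, 270, 315]   # eight sensors in a ring

def spot_direction_deg(signals):
    """Signal-weighted circular mean of the ring-sensor positions (degrees)."""
    x = sum(s * math.cos(math.radians(a)) for s, a in zip(signals, SENSOR_ANGLES_DEG))
    y = sum(s * math.sin(math.radians(a)) for s, a in zip(signals, SENSOR_ANGLES_DEG))
    return math.degrees(math.atan2(y, x)) % 360.0

# Example: most of the light falls on the sensors at 45 and 90 degrees.
print(spot_direction_deg([0.1, 0.8, 0.6, 0.1, 0.0, 0.0, 0.0, 0.0]))   # ~64 degrees
```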
[00236] In the embodiment shown in Fig. 31, eight light sensors 3102-3106 are used in one square ring around the optical sensor 1532. However, in other embodiments, other numbers of light sensors and/or other number of concentric rings and/or other shaped rings may be used. The number of light sensors and/or rings may be selected based on a desired resolution of the direction and/or distance to the spot diagram.
[00237] In yet another embodiment shown schematically in Fig. 32, an additional beamsplitter 3200 directs a portion of the optical signal 1526 from the eye 1516 to a quadrant photodiode detector 3202. The quadrant photodiode detector 3202 is coupled to the spot diagram centroid calculator 2802. Fig. 33 is a plan view of the quadrant photodiode detector 3202, including a hypothetical spot diagram 3300 projected thereon. The quadrant photodiode detector 3202 can be any size, relative to the optical sensor 1532. However, a demagnifying lens 3204 interposed between the beamsplitter 3200 and the quadrant photodiode detector 3202 enables using a relatively small and inexpensive detector to detect locations of the spot diagrams over a relatively large area. Operation of the spot diagram centroid calculator 2802 in such embodiments is similar to the operation described above, with respect to Fig. 31. Alternatively, instead of a quadrant photodiode detector 3202, any other suitable sensor may be used, such as a position-sensitive detector (PSD) or a multi-element camera array. Alternatively, instead of a quadrant detector, a detector with another number of sectors may be used. The number of sectors may be selected based on a desired resolution with which the location of the spot diagram is to be ascertained.
[00238] Optionally or alternatively, feedback to the patient about misalignment of the patient's eye to the optical axis is provided by changing the location where the spot 1506 (Fig. 15) is projected on the wall 1514. In such embodiments, the visible light source 1508 is steerable, such as by a pan and tilt head (not shown) driven by the light source interface 2608 (Fig. 26) or by an array of visible light sources driven by the light source interface 2608. If the patient's eye is not properly aligned with the optical axis 1504 of the instrument 1100, the location of the spot 1506 is changed in a direction and by a distance that correspond to the direction and magnitude of the misalignment. Note that consequently the spot 1506 may no longer be along the optical axis 1504. As a result, the patient is subtly directed to redirect her gaze toward the new location of the spot 1506, thereby improving alignment of her eye with the optical axis 1504. The optical axis 1504 of the instrument 1100 is not changed. Only the location where the spot 1506 is projected changes.
[00239] Fig. 34 is a schematic plan view of an exemplary array 3400 of visible light sources exemplified by visible light sources 3402, 3404 and 3406. Each of the visible light sources 3402-3406 is disposed so as to project the beam of light 1510 (Fig. 15) along a slightly different axis, thereby illuminating the spot 1506 on a slightly different location on the wall 1514. The embodiment shown in Fig. 34 includes 25 visible light sources 3402-3406. However, other numbers of visible light sources and their spacings may be used, depending on a desired granularity and range of control over location of the spot 1506 on the wall 1514.
[00240] As shown in Fig. 28, the feedback signal generator 2810 sends a signal to the light source interface 2608 to control which of the individual visible light sources 3402-3406 projects the spot 1506. A central visible light source 3408 is disposed where a single visible light source 1508 would otherwise be disposed, so as to project the spot 1506 along the optical axis 1504. This light source 3408 is used to initially illuminate the spot 1506 on the wall 1514. However, if the spot diagram centroid location calculator 2802 ascertains that the patient's eye is not aligned with the optical axis 1504, the magnitude and direction of misalignment signal 2808 causes the feedback signal generator 2810 to extinguish the visible light source 3408 and illuminate a different light source of the array of visible light sources 3400. The feedback signal generator 2810 selects one of the visible light sources 3402-3406 located a direction and distance from the central visible light source 3408 corresponding to the direction and magnitude signal 2808.
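Selecting the replacement source amounts to mapping the misalignment vector onto the nearest element of the source grid. The sketch below assumes a 5 x 5 array, an illustrative pixels-per-grid-step scale factor, and a sign convention in which the projected spot is moved in the same direction as the measured misalignment; none of these values are taken from the disclosure.

```python
import math

# Minimal sketch: map a misalignment vector to the nearest element of a 5 x 5
# source array whose central element (2, 2) lies on the optical axis. The
# pixels-per-grid-step scale and the sign convention are illustrative assumptions.
GRID_SIZE = 5
PIXELS_PER_STEP = 40.0   # assumed scale between sensor pixels and source spacing

def select_source(magnitude_px, direction_deg):
    """Return the (row, col) of the visible light source to illuminate."""
    dx = magnitude_px * math.cos(math.radians(direction_deg)) / PIXELS_PER_STEP
    dy = magnitude_px * math.sin(math.radians(direction_deg)) / PIXELS_PER_STEP
    col = min(max(int(round(2 + dx)), 0), GRID_SIZE - 1)
    row = min(max(int(round(2 + dy)), 0), GRID_SIZE - 1)
    return row, col

# Example: centroid displaced 85 pixels to the left selects a source left of center.
print(select_source(85.0, 180.0))   # (2, 0)
```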
Automatically adjust location of virtual light source to better
center spot diagram on optical sensor
[00241] Optionally or alternatively, if the patient's eye is not properly aligned with the optical axis 1504 of the instrument 1100, the location of the virtual light source 1525 (Fig. 15) within the patient's eye is changed so as to automatically generate a spot diagram that is better centered on the optical sensor 1532. In such embodiments, the light source 1520 is steerable, such as by a pan and tilt head (not shown) driven by the light source interface 2608 (Fig. 26) or by an array of light sources driven by the light source interface 2608. If the patient's eye is not properly aligned with the optical axis 1504 of the instrument 1100, the location of the virtual light source 1525 is changed in a direction and by a distance that correspond to the direction and magnitude of the misalignment. Note that consequently the virtual light source 1525 may no longer be along the optical axis 1504. As a result, the spot diagram falls on a different location on the optical sensor 1532, closer to the center of the optical sensor 1532, without any action by the patient. The optical axis 1504 of the instrument 1100 is not changed. Only the location where the spot diagram falls on the optical sensor 1532 changes.
[00242] Fig. 35 is a schematic plan view of an exemplary array 3500 of light sources exemplified by light sources 3502, 3504 and 3506. Each of the light sources 3502-3506 is disposed so as to project the beam of light 1522 (Fig. 15) along a slightly different axis, thereby creating the virtual light source 1525 at a slightly different location on the retina of the eye 1516. The embodiment shown in Fig. 35 includes 25 light sources 3502-3506. However, other numbers of light sources and their spacings may be used, depending on a desired granularity and range of control over location of the virtual light source 1525 on the retina of the eye 1516.
[00243] As shown in Fig. 28, the feedback signal generator 2810 sends a signal to the light source interface 2608 to control which of the individual light sources 3502-3506 projects the virtual light source 1525. A central light source 3508 is disposed where a single light source 1520 would otherwise be disposed, so as to project the virtual light source 1525 along the optical axis 1504. This light source 3508 is used to initially illuminate the virtual light source 1525 on the retina of the eye 1516. However, if the spot diagram centroid location calculator 2802 ascertains that the patient's eye is not aligned with the optical axis 1504, the magnitude and direction of misalignment signal 2808 causes the feedback signal generator 2810 to extinguish the light source 3508 and illuminate a different light source of the array of light sources 3500. The feedback signal generator 2810 selects one of the light sources 3502-3506 located a direction and distance from the central light source 3508 corresponding to the direction and magnitude signal 2808.
Automatically determine whether an eye is accommodating and provide feedback to patient
[00244] The open view design described herein encourages patients not to accommodate, at least in part because the patients know a spot being projected on a wall is far away. Nevertheless, a patient may at times accommodate while her eye is being measured. Accommodation introduces an uncontrolled variable into the prescription measurement process, because a corrective eyeglass prescription should be calculated based on wavefronts emanating from an unaccommodated eye. To avoid this problem, embodiments of the present invention automatically ascertain when a patient is not accommodating and use wavefront data from such periods to calculate a prescription.
[00245] As noted, a spot diagram generated by wavefront aberrometry can be used to calculate a corrective lens prescription. However, unlike the prior art, embodiments of the present invention capture video data, i.e., a series of time spaced-apart frames, rather than one or a small number of single arbitrarily-timed images. The video frame rate may be constant or variable. The frame rate may be adjusted in real time, from frame to frame, based on characteristics of the spot diagram imaged by the optical sensor 1532 (Fig. 15), such as overall illumination and percent of saturated pixels in a given frame. In some embodiments, the frame rate may vary from about 6 frames per second to more than 15 frames per second. Nevertheless, the inter-frame time is relatively short, on the order of about 1/10 second, thus we refer to the video frames as being "continuous." The video data is captured from the optical sensor 1532 (Fig. 15) and stored in the memory 2602 (Fig. 26) for processing. Each frame of the video includes an image captured by the optical sensor 1532, an associated frame number and, if the frame rate is not constant, an associated time at which the frame was captured. Thus, a prescription can be calculated from each frame. An aberration profile, which may be described by a refractive prescription, a set of Zernike coefficients, or some other representation calculated from a frame is referred to herein as a "candidate prescription," because some frames include noise, incomplete spot diagrams, no spot diagram or are otherwise undesirable for prescription calculation.
[00246] A candidate prescription is calculated for each frame and stored in the memory
2602. The candidate prescription calculations may be performed after the last frame of the video has been captured, or the calculations may overlap in time with the video capture and storage. If sufficient computing power is available, a candidate prescription may be calculated for each frame, after the frame has been captured, but before the successive frame is captured. In the latter case, in some embodiments, the raw video data is not stored in the memory 2602.
[00247] Fig. 36 is a schematic block diagram of an unaccommodation detector module 3600, according to an embodiment of the present invention. A prescription calculator 3602 receives signals from the optical sensor 1532 and calculates a candidate prescription from the signals, as described herein. As noted, a prescription typically includes at least a spherical component and one or two cylindrical components. The spherical component is described in terms of the optical power, positive or negative, and the cylindrical component is described in terms of powers and axes or equivalent terms (e.g., power vector notation). A prescription may also include additional lens specifications to correct higher order aberrations.
[00248] The prescription calculator 3602 outputs a set of individual lens specifications, such as sphere 3604, cylinder-1 3605, axis-1 3606, etc. Optionally, other information 3607, such as spot size, is also output. The outputs are collectively referred to as a candidate prescription 3612. Each candidate prescription 3612 is stored in the memory 2602 (Fig. 26), along with an identification of which video frame the candidate prescription was calculated from or the relative time 3608 at which the frame was taken by the optical sensor 1532. The sphere 3604, cylinder-1 3605, axis-1 3606, etc. can be prescription data calculated using various Zernike modes, such as M, J0 and J45, obtained using various order Zernike information.
[00249] A normally-open switch 3620 closes after all the frames have been acquired.
[00250] The spot diagram size is related to a quality metric, and it gives some information about the prescription. Assuming a constant pupil size, if the eye is emmetropic, the spot diagram size is equal to the pupil size. However, if the eye is myopic, the spot diagram is smaller than the pupil size. The higher the myopia, the smaller the spot diagram. On the other hand, if the eye is hyperopic, the spot diagram diameter is bigger than the pupil size. The more hyperopic, the bigger the spot diagram.
[00251] The spot diagram size also changes with accommodation. Thus, if the patient is accommodating, the instrument can detect changes in spot diagram size.
[00252] The spot diagram size is related to pupil size, and pupil size is related to the amount of light received by the eye. In darker environments, the pupil automatically becomes larger. Thus, the instrument can use pupil size, as estimated from spot diagram size, to track external conditions, such as a change in ambient light in the room while the patient was being measured.
[00253] In addition, the size of the spot diagram can be related to a quality metric. Using the defocus aberration and propagation algorithms, spot diagram size can be used to calculate the pupil size. The pupil size is important for measuring the aberrations, because the aberration profile is associated with a specific aperture diameter.
[00254] Fig. 37 contains a graph 3700 of spherical and cylindrical power candidate prescriptions calculated from a hypothetical patient. Open circles represent spherical candidate prescriptions, and crossed circles represent cylindrical candidate prescriptions. The vertical axis indicates power in diopters (D), and the horizontal axis indicates time at which the candidate prescriptions were calculated. The frames were captured at approximately 10 frames per second.
[00255] Fig. 37 shows the patient's candidate spherical prescription varying over time, starting at about -2.25 D at time 1. Starting at about time 175, the candidate spherical prescription increases from about -1.7 D to about +0.4 D at about time 240. After about time 240, the candidate spherical prescription decreases.
[00256] When an eye is unaccommodated, the candidate spherical prescription should be different than any candidate spherical prescription calculated while the eye is accommodated, because an unaccommodated crystalline lens provides different optical power than an accommodated crystalline lens and, therefore, requires a different correction than an accommodated lens. On the other hand, cylindrical correction does not vary significantly with the amount of accommodation, so if variations in the cylindrical components of the prescriptions are found, these are generally indicative of undesirable movements of the patient during the test. Thus, from the graph in Fig. 37, it would appear as though candidate spherical prescriptions around 0 D (such as 3702) should be closest to the correct prescription for the patient, because those are the greatest candidate spherical prescriptions calculated.
[00257] However, embodiments of the present invention do not necessarily accept the greatest candidate spherical prescription as the correct prescription, because a candidate prescription may be a result of noise, and the magnitude and direction of the accommodation may depend on other factors, such as the actual refractive error of the patient. For example, in the case illustrated by Fig. 37, neither candidate spherical prescription 3702 nor 3704 is accepted, because an eye cannot change accommodation quickly enough to have yielded either candidate spherical prescription 3702 or 3704. The literature reports a maximum accommodation rate of about 1-2 diopters per second in a human eye. A bar 3706 indicates an approximate amount of time required by an eye to change accommodation from a close to a distant object. In contrast, candidate spherical prescriptions 3702 and 3704 would have required the eye to change accommodation much more quickly than is physiologically plausible. Furthermore, an eye changes accommodation continuously, as the crystalline lens changes shape. Thus, candidate prescriptions with no nearby candidate prescriptions are very likely a result of noise. For instance, in the case of a myopic eye with a small pupil, there will be very few spots composing the spot diagram. The errors in determining the centroids of these spots caused by speckle noise can then cause large errors in the calculation of the Zernike coefficients corresponding to the defocus of the eye.
[00258] Returning to Fig. 36, after all the frames have been acquired, the normally-open switch 3620 is closed, and the candidate spherical prescription 3604 is fed to a low-pass filter 3614 in an accommodation filter 3622 (accommodation filter 3810 in Fig. 38) to remove candidate spherical prescriptions that are radically different from surrounding candidate spherical prescriptions, i.e., where the absolute slope of the candidate spherical prescription is greater than a predetermined value. In some embodiments, an instantaneous slope in the candidate spherical prescription signal 3604 greater than about ±1 diopter per second triggers rejection of a candidate spherical prescription. Smoothed candidate spherical prescriptions, i.e., candidate spherical prescriptions that pass through the low-pass filter 3614, are processed according to accommodation correction rules 3624. In some embodiments, the rules 3624 select the greatest candidate spherical prescription. In the graph of Fig. 37, candidate spherical prescription 3708 would be selected by the rules 3624. However, in other embodiments, other selection criteria, machine learning or other mechanisms may be used to process the candidate prescriptions to arrive at a prescription. In some embodiments, other portions of the candidate prescription 3612 or other information, such as spot diagram size as a function of time, may also be used.
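The slope-based rejection and greatest-value selection described above can be sketched as follows. This is a minimal sketch only; the function and variable names are hypothetical, and the 1 diopter-per-second threshold is the example value given above rather than a fixed design parameter.

```python
def filter_and_select_sphere(times, spheres, max_rate_d_per_s=1.0):
    """Reject candidate spherical values whose rate of change versus a
    temporal neighbor exceeds a plausible accommodation rate, then return
    the index of the greatest surviving candidate (the least accommodated
    state), or None if nothing survives.

    times   : frame timestamps in seconds
    spheres : candidate spherical powers in diopters, same length as times
    """
    keep = []
    for i in range(len(spheres)):
        plausible = True
        for j in (i - 1, i + 1):        # compare against both neighbors
            if 0 <= j < len(spheres):
                dt = abs(times[j] - times[i])
                if dt > 0 and abs(spheres[j] - spheres[i]) / dt > max_rate_d_per_s:
                    plausible = False
        if plausible:
            keep.append(i)
    if not keep:
        return None
    # Analog of the rules 3624: select the greatest candidate spherical value.
    return max(keep, key=lambda i: spheres[i])
```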
[00259] The frame number or time associated with the candidate spherical prescription selected by the rules 3624 is used to select a candidate prescription stored in the memory 2602, i.e., the other candidate prescription parameters calculated from the same frame as the candidate spherical prescription detected by the rules 3624. The selected candidate prescription 3618 is reported as a prescription for the patient or fed to another module. Thus, embodiments of the present invention automatically ascertain when a patient is not accommodating and use wavefront data from such periods to calculate a prescription.
[00260] In some embodiments, more than one candidate spherical prescription may be deemed to have been calculated from frames captured while the patient's eye was not accommodated. For example, all candidate spherical prescriptions within a predetermined range of the candidate spherical prescription detected by the rules 3624, as described above, may be deemed to have been calculated from unaccommodated eye data. The rules 3624 store information in the memory 2602 identifying candidate prescriptions that were calculated from unaccommodated eye data.
[00261] Some embodiments provide feedback to the patient when a peak candidate spherical prescription has been detected or when no such peak has been detected within a predetermined amount of time after commencing collecting data from the optical sensor 1532. This feedback may be in the form of audio, visual, haptic or other feedback, along the lines described above, with respect to Fig. 28.
Combining multiple frames to improve signal-to-noise ratio
[00262] Although individual frames from the optical sensor 1532 that include a spot diagram may be used to calculate prescriptions, in some embodiments multiple frames are combined to calculate a single prescription. Combining multiple frames can improve signal-to-noise (S/N) ratios, such as by averaging noise. Several embodiments that combine frames will now be described, along with additional details that pertain to these embodiments and to some embodiments that do not combine frames. Several processing modules will be described. The processing modules and interconnections among these modules are summarized in Fig. 38.
[00263] In module 3800, data is acquired from the image sensor 1532 (Fig. 15). Each frame is acquired according to image sensor settings, including exposure time and frame rate. These settings may be adjusted on a frame-by-frame basis, with a goal of acquiring frames with good signal-to-noise ratios. In general, frames with bright spots in their spot diagrams have better signal-to-noise than frames with dim spots, although a large number of spots that are saturated is undesirable. "Saturated" means a brightness value of a pixel is equal to the maximum value possible for the pixel. Alternatively, module 3800 may process frames that were acquired earlier and are stored in memory 2602.
[00264] In one embodiment, if more than a first predetermined fraction of pixels of a frame are saturated, the exposure time of the next frame is reduced. The fraction may be expressed as a percentage, for instance 0.1% of all the pixels in the sensor should be saturated. This fraction can vary based on the size of the pupil and the average size of the spots comprising the spot diagram. In addition, this fraction may be set based on characteristics of the image sensor 1532 and the light source 1520. Conversely, if fewer than a second predetermined fraction of the pixels of the frame are saturated, the exposure time of the next frame is increased. However, the exposure time should not be increased to a value that might cause motion blur as a result of the eye moving. Thus, a maximum exposure time can be ascertained, based on the size of the optical sensor 1532 and the number of pixels or quadrants it contains. Outputs from the data acquisition module 3800 are summarized in Table 1.
Table 1: Outputs from data acquisition module
A set of frames
A timestamp for each of the frames
Image sensor settings (exposure time and frame rate) for each frame
Fraction of pixels that are saturated in each frame
How well aligned the spot diagram is on the image sensor 1532, such as based on information from the array of light sensors 3100 (optional)
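As a minimal sketch of the saturation-driven exposure control described in paragraph [00264], the following routine shortens the next exposure when too many pixels saturate and lengthens it when too few do, up to a ceiling chosen to avoid motion blur. The names, thresholds and step factor are hypothetical examples, not values taken from the instrument.

```python
def next_exposure_ms(frame_pixels, current_ms, sat_value=255,
                     high_frac=0.001, low_frac=0.0001,
                     step=1.25, max_ms=30.0, min_ms=0.5):
    """Adjust the exposure time for the next frame based on the fraction
    of saturated pixels in the current frame.

    frame_pixels : flat iterable of pixel values from the image sensor
    current_ms   : exposure time used for the current frame
    high_frac    : saturated fraction above which exposure is reduced
    low_frac     : saturated fraction below which exposure is increased
    max_ms       : ceiling chosen so eye motion does not blur the spots
    """
    n = 0
    saturated = 0
    for p in frame_pixels:
        n += 1
        if p >= sat_value:
            saturated += 1
    frac = saturated / n if n else 0.0
    if frac > high_frac:
        return max(min_ms, current_ms / step)   # too many saturated pixels
    if frac < low_frac:
        return min(max_ms, current_ms * step)   # too few saturated pixels
    return current_ms
```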
[00265] As noted, the patient may be instructed to adjust the position of the instrument 1100, relative to the patient's eye, so the patient perceives a red dot at maximum brightness. At this position, the instrument 1100 (Fig. 15) is well oriented, relative to the patient's eye socket. However, the patient's eye can still move within the eye socket. That is, the patient can look up, down, left and right. Thus, the center of the eye's field of view may not be aligned with the optical axis 1504 of the instrument 1100, and the spot diagram may not be centered on the optical sensor 1532, or the spot diagram may be completely off the optical sensor 1532. In addition, the patient might blink. Furthermore, in some frames, the signal reaching the optical sensor 1532 may be from a reflection from the eye's cornea, rather than from the virtual light source on the eye's retina. Thus, some frames may not contain useful information.
[00266] A frame selector 3802 retains only frames that may contain useful information. An objective of the frame selector 3802 is to ensure raw data used to calculate a prescription is as good as possible. The frame selector 3802 may discard frames, as summarized in Table 2. For example, successive frames in which the diameter of the spot diagram varies from frame to frame by more than a predetermined amount may be discarded.
[00267] The frame selector 3802 tags the frames, such as "valid," "incomplete" or possibly
"discarded." The tags may be represented by codes stored in the memory 2602 in association with data representing brightness values of the pixels of the frames or prescriptions calculated from the frames.
Table 2: Frames discarded by frame selector module
Patient blinked (no spot diagram)
Eye grossly misaligned with optical axis (no spot diagram)
Corneal reflection (too many spots in spot diagram and of high intensity)
Rapidly changing spot diagram diameter
Too much time has passed since the last patient blink; tear film may be compromised, so the frame should be discarded (optional)
Partial spot diagram (optional)
[00268] Fig. 39 is a schematic diagram of a complete spot diagram, and Fig. 40 is a schematic diagram of a partial spot diagram, i.e., a spot diagram in which a portion of the spot diagram falls off the optical sensor 1532. The spot diagrams were captured by a prototype instrument as described herein. The frame selector 3802 may distinguish these two types of frames from each other by various techniques. For example, the frame selector 3802 may ascertain a shape of the spot diagram. If the spot diagram is approximately circular or elliptical and complete, the frame may be deemed to contain a complete spot diagram, and the frame may be accepted and tagged as "valid." The frame selector 3802 may also calculate the location of the center of the spot diagram. On the other hand, if only a portion of the shape is circular, and spots of the spot diagram are adjacent edges of the optical sensor 1532, the frame may be deemed to contain a partial spot diagram and tagged "incomplete." For frames tagged as incomplete, the frame selector 3802 may also calculate or estimate what fraction of the spot diagram falls on the optical sensor 1532. As will be discussed below, incomplete spot diagrams may be used in some prescription calculations.
[00269] Fig. 41 is a schematic diagram of a frame from the optical sensor 1532 with no spot diagram, such as a result of a patient blink or gross misalignment of the patient's eye with the optical axis 1504 of the instrument 1100. The frame selector 3802 may detect this type of frame by summing or integrating all the pixels of the frame. If the sum or integral is less than a predetermined value, indicating few or no spots of a spot diagram are present in the frame, the frame selector 3802 may discard the frame.
[00270] Fig. 42 is a schematic diagram of a frame from the optical sensor 1532 containing a corneal reflection. The frame was captured by a prototype instrument as described herein. The frame selector 3802 may identify such a frame, based on several factors. For example, if the image contains more spots than lenses in the lenslet array 1530, the frame selector 3802 may discard the frame. The frame selector 3802 may sum or integrate all the pixels of the frame. If the sum or integral is greater than a predetermined value, indicating too many spots for a spot diagram are present in the frame, the frame selector 3802 may discard the frame. Frames discarded by the frame selector 3802 may be stored in the memory, but tagged "discarded."
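A minimal sketch of the pixel-sum tests of paragraphs [00269] and [00270] follows: too little integrated intensity suggests a blink or gross misalignment, while too much intensity, or more spots than lenslets, suggests a corneal reflection. The thresholds and names are hypothetical; an actual frame selector would also apply the shape-based "valid"/"incomplete" tagging described above.

```python
def classify_frame(frame, spot_count, n_lenslets,
                   low_sum=1.0e4, high_sum=5.0e6):
    """Return 'discarded' or 'candidate' for a single sensor frame.

    frame      : 2-D array-like of pixel intensities
    spot_count : number of spots detected in the frame
    n_lenslets : number of lenses in the lenslet array (1530)
    low_sum    : below this integrated intensity, assume blink/misalignment
    high_sum   : above this integrated intensity, assume corneal reflection
    """
    total = sum(sum(row) for row in frame)
    if total < low_sum:
        return "discarded"      # no spot diagram present
    if total > high_sum or spot_count > n_lenslets:
        return "discarded"      # corneal reflection: too many or too-bright spots
    return "candidate"          # still subject to "valid"/"incomplete" tagging
```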
[00271] Table 3 summarizes outputs from the frame selector 3802 module.
Table 3: Outputs from frame selector module
A set of consecutive frames, each containing a spot diagram (Note that some intermediate frames may have been discarded. Nevertheless, the remaining frames are referred to herein as being "consecutive.")
A timestamp for each frame
A tag for each frame, ex. "valid" or "incomplete"
Coordinates for each spot diagram's center
A diameter of each spot diagram (projected pupil size)
Image sensor settings (exposure time and frame rate) for each frame
A fraction of the pixels in each frame, or alternatively each spot diagram, that are saturated
[00272] Optionally, several consecutive frames may be combined to obtain a single frame with a better signal-to-noise ratio than each of the consecutive frames. If a low-cost light source 1520 (Fig. 15) is used to create the virtual light source 1525 in the patient's eye 1516, the images acquired by the optical sensor 1532 may include significant speckle noise. Speckle noise may result from path length differences between points within the virtual light source 1525 and the optical sensor 1532. These path length differences cause random variations in intensity due to mutual interference from several wavefronts emanating from the points within the virtual light source 1525. Furthermore, even if the patient's eye does not move, intraocular fluid, such as vitreous humor, may flow, causing optical interference. On the other hand, flow of the vitreous humor may randomize path lengths on the time scale of the frames and, therefore, reduce speckle noise. In any case, combining several frames can improve the signal-to-noise by averaging the speckle noise.
[00273] A frame combiner 3804 receives output from the frame selector module 3802, and optionally from the prescription calculator 3806, and outputs a single combined frame. The frame combiner 3804 may combine only consecutive frames that are tagged "valid." Optionally or alternatively, the frame combiner 3804 may combine consecutive frames that are tagged "valid" or "incomplete." Optionally, the frame combiner 3804 may combine non-consecutive frames, based on the prescription information provided by the prescription calculator 3806.
[00274] In combining frames, the frame combiner 3804 registers the frames that are to be combined, so corresponding spots of the spot diagram register with each other. A non-deforming (rigid) registration process should be used, so as not to alter the shape of the spot diagram. Once the frames are registered, they may be summed or averaged. That is, the intensities recorded by corresponding pixels in each summed frame are added or averaged. In addition, the exposure time for the spot diagram should be revised by summing the exposure times of the frames that were combined. It should also be taken into account at this stage that only frames which are close in time (i.e., consecutive frames in which the eye had no time to accommodate) should be combined, since accommodation can cause frames with different prescriptions to be combined, leading to incorrect results.
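A minimal sketch of the rigid registration and averaging described in paragraph [00274], assuming NumPy and integer-pixel shifts; the names are hypothetical, and a practical implementation would pad rather than wrap pixels at the sensor edges and would use the low-pass-filtered images only to estimate the shifts.

```python
import numpy as np

def combine_frames(frames, centers, exposures_ms):
    """Rigidly register frames on their spot-diagram centers and average them.

    frames       : list of 2-D numpy arrays of equal shape
    centers      : list of (row, col) spot-diagram centers, one per frame
    exposures_ms : list of exposure times; the combined frame's effective
                   exposure is their sum
    Returns (combined_frame, combined_exposure_ms).
    """
    ref_row, ref_col = centers[0]
    acc = np.zeros(frames[0].shape, dtype=float)
    for frame, (row, col) in zip(frames, centers):
        # Integer-pixel rigid shift keeps the spot pattern undeformed.
        # Note: np.roll wraps pixels around the edges; a real implementation
        # would pad instead of wrapping.
        shifted = np.roll(frame, shift=(int(round(ref_row - row)),
                                        int(round(ref_col - col))), axis=(0, 1))
        acc += shifted
    combined = acc / len(frames)   # averaging reduces speckle noise
    return combined, sum(exposures_ms)
```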
[00275] In some embodiments, only frames tagged as "valid" are combined. In some embodiments, frames tagged as "valid" and frames tagged as "incomplete" are combined. Figs. 43-46 are schematic diagrams of a set of frames from the optical sensor 1532 containing a sequence of images acquired as an eye slowly moved, creating a set of spot diagrams that move from left to right. The frames were captured by a prototype instrument as described herein. The spot diagrams in Figs. 43-45 are tagged incomplete, and the spot diagram in Fig. 46 is tagged valid. Essentially the same procedure as described above for combining frames may be used for combining the frames represented by Figs. 43-46. However, some spots in the resulting combined spot diagram result from adding or averaging a different number of spots than other resulting combined spots. For example, some spots are not included in the spot diagram of Fig. 43, because these spots fall off the left side of the optical sensor 1532. These spots appear in subsequent frames, as the spot diagram moves to the right. Therefore, these spots have fewer contributions to their sum or average. Thus, these spots likely have worse signal-to-noise ratios than spots that appear in each of Figs. 43-46.
[00276] In either case, a low-pass filter may be used to smooth each frame that is to be combined, in order to calculate registration parameters, such as displacements to apply to the frame images to register them to a target reference. The low-pass filter is used to calculate the registration parameters. Once the registration parameters have been calculated, the registration displacements are applied to the original frames, not to the filtered frames. Characteristics of the low-pass filter may be determined empirically, given characteristics of the light source 1520 (Fig. 15) and characteristics of the lenslet array 1530. Characteristics of the low-pass filter relate to size of the speckle, which is related to the diffraction limit of the lenslet array 1530. Calibrations related to misalignment of different components within the device 1100 should be applied before the registration process. Outputs from the frame combiner 3804 are summarized in Table 4.
Table 4: Outputs from frame combiner module
A set of consecutive frames, each containing a spot diagram
A timestamp for each frame
A tag for each frame, ex. "valid" or "incomplete"
Coordinates for each spot diagram's center
A diameter of each spot diagram (projected pupil size)
Revised image sensor settings (exposure time and frame rate) for each frame
A fraction of the pixels in each frame, or alternatively each spot diagram, that are saturated
[00277] A prescription calculator module 3806 calculates a prescription from each frame.
For each frame, the prescription calculator 3806 calculates centroid coordinates for each spot of the spot diagram. Fig. 47 is a schematic diagram of a hypothetical frame from the optical sensor 1532 containing a complete spot diagram. An "X" indicates the centroid of the spot diagram. Crosses indicate centroid locations for spots, where they would appear for a perfect eye. As evident from the figure, many spots of the spot diagram are displaced from these crosses.
[00278] As noted, the spot diagram is generated when a wavefront impinges on an array of lenslets. A slope of the wavefront at each sample point (lens of the lenslet array) is calculated. A displacement (Δx and Δy) of each spot of the spot diagram is calculated, relative to the location of a spot from a perfect eye, as exemplified in Fig. 48. Given the focal length of the lenslet array, the slopes can be calculated from the displacements.
[00279] The displacement data is fitted to a Zernike polynomial expansion, where the expansion coefficients are determined using least-squares estimation, as summarized in the following equations:
∂W(x, y)/∂x = Δx(x, y)/f
∂W(x, y)/∂y = Δy(x, y)/f
W(x, y) = Σj Wj Zj(x, y)
Wj is the coefficient of the Zj mode in the expansion.
Wj is equal to the RMS wavefront error for that mode.
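A minimal sketch of the least-squares fit just described, assuming NumPy; the function name and argument layout are hypothetical. It stacks the x and y slope measurements (spot displacements divided by the lenslet focal length) and solves for the coefficients Wj against the analytic derivatives of each Zernike mode evaluated at the lenslet positions.

```python
import numpy as np

def fit_zernike_coefficients(dx, dy, dZdx, dZdy, focal_length):
    """Least-squares fit of Zernike coefficients from spot displacements.

    dx, dy       : arrays of spot displacements (one entry per lenslet), in
                   the same length units as focal_length
    dZdx, dZdy   : arrays of shape (n_lenslets, n_modes) holding the x and y
                   partial derivatives of each Zernike mode Z_j evaluated at
                   the corresponding lenslet location
    focal_length : focal length f of the lenslet array
    Returns the coefficient vector W such that W(x, y) = sum_j W_j * Z_j(x, y).
    """
    # Measured slopes: dW/dx = dx/f and dW/dy = dy/f at each lenslet.
    slopes = np.concatenate([np.asarray(dx) / focal_length,
                             np.asarray(dy) / focal_length])
    design = np.vstack([dZdx, dZdy])
    coeffs, *_ = np.linalg.lstsq(design, slopes, rcond=None)
    return coeffs
```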
[00280] The Zernike coefficients are used to calculate a prescription. Because the Zernike expansion employs an orthonormal set of basis functions, the least-squares solution is given by the second order Zernike coefficients, regardless of the value of the other coefficients. These second- order Zernike coefficients can be converted to a sphero-cylindrical prescription in power vector notation using the following or other well-known equations:
M = -4√3 C_2^0 / r^2
J0 = -2√6 C_2^2 / r^2
J45 = -2√6 C_2^-2 / r^2
[00281] Where C_n^m is the Zernike coefficient of radial order n and azimuthal frequency m, and r is the pupil radius. It is also possible to compute a prescription using more Zernike coefficients, i.e., for higher order aberrations, as indicated, for example, in the following equations:
M = (-4√3 C_2^0 + 12√5 C_4^0 - 24√7 C_6^0 + ...) / r^2
J0 = (-2√6 C_2^2 + 6√10 C_4^2 - 12√14 C_6^2 + ...) / r^2
J45 = (-2√6 C_2^-2 + 6√10 C_4^-2 - 12√14 C_6^-2 + ...) / r^2
[00282] The power vector notation is a cross-cylinder convention that is easily transposed into conventional formats used by clinicians.
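As a minimal sketch of the second-order conversion above and of the transposition mentioned in paragraph [00282], the following computes M, J0 and J45 from the second-order Zernike coefficients and pupil radius, and then a sphere/cylinder/axis form under one common negative-cylinder convention. The names are hypothetical, and the clinical transposition shown is one convention among several.

```python
import math

def power_vectors(c20, c22, c2m2, pupil_radius):
    """Second-order Zernike coefficients -> power vectors (M, J0, J45)."""
    r2 = pupil_radius ** 2
    M = -4.0 * math.sqrt(3.0) * c20 / r2
    J0 = -2.0 * math.sqrt(6.0) * c22 / r2
    J45 = -2.0 * math.sqrt(6.0) * c2m2 / r2
    return M, J0, J45

def to_sphero_cylinder(M, J0, J45):
    """Power vectors -> (sphere, cylinder, axis) in a negative-cylinder form."""
    cyl = -2.0 * math.hypot(J0, J45)
    sph = M - cyl / 2.0
    axis = math.degrees(0.5 * math.atan2(J45, J0)) % 180.0
    return sph, cyl, axis
```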
[00283] While or after the Zernike coefficients are used to calculate a prescription, calibrations can be applied to the Zernike coefficients or to the power vectors to eliminate errors of the device 1100, such as gain, offset, non-linearity or misalignments among the optical components of the system. In the equations shown above, M relates to spherical error (myopia or hyperopia), and J0 and J45 represent astigmatism. As noted, the pupil radius is estimated, based on the size of the spot diagram. Outputs from the prescription calculator 3806 are summarized in Table 5.
Table 5: Outputs from prescription calculator module
A set of consecutive frames, each containing a spot diagram
A timestamp for each frame
A tag for each frame, ex. "valid" or "incomplete"
Coordinates for each spot diagram's center
A diameter of each spot diagram (projected pupil size)
Image sensor settings (exposure time and frame rate) for each frame
Zernike coefficients for each spot diagram (frame)
One or several prescriptions in the power vector domain (PWV) (M, J0 and J45), or in another domain such as optometric, for each spot diagram (frame) (The system can provide more than one prescription. For example, one prescription may be calculated with two Zernike orders, i.e., with no high-order aberrations, and other prescriptions may be calculated using high-order aberrations, such as Zernike orders 4 or 6.)
[00284] Optionally, information about the prescription may be provided by the prescription calculator 3806 to the frame combiner 3804. In this case, the frame combiner 3804 may use this information to determine how to combine frames.
[00285] Optionally, quality metrics may be calculated for each calculated prescription by a quality metric calculator 3808. In a subsequent module, the quality metrics may be used to weight the prescription calculated from each frame or frame combination to calculate a final prescription. The quality metrics may be as simple as a binary value, for example "0" for "bad" and "1" for "good." More complex quality metrics may fall within a range, such as a real number between 0.0 and 1.0. The quality metrics may be based on, for example, the number of frames, signal-to-noise ratio of the spot diagram, number of spots in the spot diagram, sharpness of the points in the spot diagram and absence, or small values, of high-order Zernike coefficients, or combinations thereof. The signal-to-noise ratio of a frame may, for example, be calculated by dividing the mean pixel value of spots in the spot diagram by the mean pixel value of background, i.e., an area outside the spot diagram.
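A minimal sketch of one such quality metric, assuming NumPy; the names and the mapping range are hypothetical. It implements the signal-to-noise definition given above (mean intensity inside detected spots divided by mean background intensity) and maps the result onto a 0.0 to 1.0 quality value.

```python
import numpy as np

def frame_quality(frame, spot_mask, min_snr=1.5, max_snr=20.0):
    """Compute a 0.0-1.0 quality metric for a frame from its spot S/N ratio.

    frame     : 2-D numpy array of pixel intensities
    spot_mask : boolean array of the same shape, True inside detected spots
    """
    spot_mean = frame[spot_mask].mean()
    background_mean = frame[~spot_mask].mean()
    snr = spot_mean / max(background_mean, 1e-9)
    # Linearly map the S/N ratio onto [0, 1]; clip outside the useful range.
    return float(np.clip((snr - min_snr) / (max_snr - min_snr), 0.0, 1.0))
```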
[00286] Outputs from the quality metric calculator 3808 are summarized in Table 6.
Table 6: Outputs from quality metrics module
A set of consecutive frames, each containing a spot diagram
A timestamp for each frame
A tag for each frame, ex. "valid" or "incomplete"
Coordinates for each spot diagram's center
A diameter of each spot diagram (projected pupil size)
Image sensor settings (exposure time and frame rate) for each frame
Zernike coefficients for each spot diagram (frame)
One or several prescriptions in the PWV (M, J0 and J45) domain for each spot diagram (frame)
Quality metrics for each frame
[00287] As noted, accommodation introduces an uncontrolled variable into the measurement process. Therefore, prescriptions calculated from spot diagrams captured while a patient is accommodating are unlikely to be accurate. Optionally, an accommodation filter module 3810 selects frames captured when the patient is not accommodating.
[00288] The amount by which a human eye can accommodate varies with the age of a patient, as summarized in a graph in Fig. 49. Embodiments of the present invention input the age of each patient, such as via a numeric keyboard or up/down arrow buttons coupled to a numeric display that increase or decrease a displayed age value as the arrow buttons are pressed. Using the age of the patient and physiological data existing in the literature about accommodation speed, the accommodation filter 3810 discards frames that evidence changes in accommodation faster than the patient should be able to accommodate, given the patient's age. In one embodiment, the accommodation filter 3810 includes a variable low-pass filter whose characteristics are controlled by the expected maximum accommodation rate. The low-pass filter operates on the M (spherical error) portion of the prescription. Other embodiments employ fixed accommodation rate limits, such as about 1 to 2 diopters per second, independent of the patient's age. In such an embodiment, a change in the calculated defocus term (or M in PWV notation) that occurs faster than the fixed accommodation rate limit is considered noise and is not included in determining the final prescription.
[00289] Fig. 50 is a graph of a set of M, J0 and J45 prescriptions calculated by a prototype instrument, as described herein. A dark line is added to show M values after processing by the accommodation filter 3810. As can be seen from variations of the M value, the patient's accommodation varied. Peaks in the M value, indicated by circles 5000 and 5002, indicate times at which the patient did not accommodate. Therefore, the accommodation filter 3810 selects frames acquired during these times and discards other frames, for the purpose of calculating spherical terms for the prescription. Because astigmatism and other terms of the prescription do not vary with accommodation, the frames discarded by the accommodation filter 3810 may be used to calculate these other terms. If variations in astigmatism as a function of time are found, this can be used as an indicator of patient movements during the test, and thus used to tag frames as invalid.
[00290] Fig. 51 is a graph of a set of M, J0 and J45 prescriptions calculated by a prototype instrument for a different patient. As can be seen, the M values 5100 do not vary significantly throughout the graph. It can, therefore, be assumed that the patient did not accommodate throughout the time period represented by the graph. In this case, the accommodation filter 3810 selects all frames represented by the graph; no frames are discarded.
[00291] Outputs of the accommodation filter are summarized in Table 7.
Table 7: Outputs from accommodation filter module
A set of consecutive frames, each containing a spot diagram
A timestamp for each frame
A tag for each frame, ex. "valid" or "incomplete"
Coordinates for each spot diagram's center
A diameter of each spot diagram (projected pupil size)
Image sensor settings (exposure time and frame rate) for each frame
Zernike coefficients for each spot diagram (frame)
A prescription in the PWV (M, J0 and J45) domain for each spot diagram (frame)
Quality metrics for each frame
A set of not necessarily consecutive frames, each frame containing a spot diagram from which a spherical term may be calculated
[00292] Groups of frames may yield similar prescriptions. For example, as shown in the graph of Fig. 50, two groups of frames 5000 and 5002 yield similar M (spherical) prescriptions. Optionally, a frame grouper module identifies groups of frames that yield similar prescriptions, such as prescriptions within a predetermined range of values. Two such frame grouper modules 3812 and 3814 are shown in Fig. 38.
[00293] One frame grouper 3812 groups frames that yield similar, such as within about a 5% difference, Zernike coefficients. In some embodiments, the frame grouper 3812 considers only the first six Zernike coefficients, although other numbers of coefficients may be used. The other frame grouper 3814 groups frames that yield similar prescriptions, for example, values of M, J0 and/or J45 that fall within about ±0.125 diopters or within about ±0.25 diopters. Frame groupers that group frames based on other similarities may also be used.
[00294] Separate groups of frames may be defined for each term of the prescription. Thus, one group of frames may be selected for having similar M values, and a different, possibly overlapping, group of frames may be selected for having similar J0 values. If some frames were discarded by the accommodation filter 3810, a different pool of frames may be available to the frame grouper 3814 for selecting frames based on similarity of M values than for selecting frames based on similarity of J0 values. Similarly, different pools of frames may be available to the other frame grouper 3812.
[00295] The frame grouper 3814 may operate by generating a histogram for each term of the prescription. A hypothetical histogram for spherical prescriptions is shown in Fig. 52. The horizontal axis represents spherical prescription values or M values in the power vector domain, and the vertical axis represents the number of frames that yielded a given spherical prescription. Note that frames containing low-quality raw data, such as due to low signal-to-noise, were discarded by other modules. Thus, some prescription values may not have been calculated from any accepted frames. The prescription value 5200 yielded from the greatest number of frames, and a range 5202 of prescription values around this value, are selected by the frame grouper 3814. The frame grouper 3814 operates similarly for the other prescription terms. The other frame grouper 3812 operates similarly, generating a histogram for each Zernike coefficient it considers. Alternatively, instead of the histogram representing the number of frames on the vertical axis, the frame grouper 3814 may use the sum of the quality metrics for the frames. Thus, if the quality metric values are between 0.0 and 1.0, instead of the histogram indicating the number of frames that yielded a given prescription, the histogram represents the sum of the quality metrics for the frames that yielded that prescription. Optionally or alternatively, the frame groupers 3812 and 3814 may use other selection operations, other than or in addition to, histograms.
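A minimal sketch of the histogram-based grouping just described, using the quality-weighted histogram variant suggested above; the bin width, range and names are hypothetical examples.

```python
def group_frames_by_m(m_values, qualities, bin_width=0.125, range_bins=1):
    """Return indices of frames whose M value lies near the histogram peak.

    m_values   : list of per-frame spherical-equivalent (M) values in diopters
    qualities  : list of per-frame quality metrics (weights), same length
    bin_width  : histogram bin width in diopters
    range_bins : how many bins on either side of the peak to keep
    """
    bins = {}
    for m, q in zip(m_values, qualities):
        b = round(m / bin_width)
        bins[b] = bins.get(b, 0.0) + q        # weight the histogram by quality
    peak_bin = max(bins, key=bins.get)        # bin with the greatest summed weight
    return [i for i, m in enumerate(m_values)
            if abs(round(m / bin_width) - peak_bin) <= range_bins]
```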
[00296] Outputs from the frame groupers 3812 and 3814 are summarized in Table 8.
Table 8: Outputs from each frame grouper module
A set of consecutive frames, each containing a spot diagram
A timestamp for each frame
A tag for each frame, ex. "valid" or "incomplete"
Coordinates for each spot diagram's center
A diameter of each spot diagram (projected pupil size)
Image sensor settings (exposure time and frame rate) for each frame
Zernike coefficients for each spot diagram (frame)
A prescription in the PWV (M, J0 and J45) domain for each spot diagram (frame)
Quality metrics for each frame
A set of not necessarily consecutive frames, each frame containing a spot diagram from which a spherical term may be calculated
A set of frames yielding similar prescriptions or Zernike coefficients, as the case may be
[00297] Optionally, frames that yield similar prescriptions or Zernike coefficients may be combined to yield frames with better signal-to-noise, and prescriptions can be calculated from the combined frames. A frame restorer 3816 combines the frames output by one or both of the frame groupers 3812 and/or 3814. The frame restorer 3816 combines these frames in a manner similar to that described above, with respect to the frame combiner 3804. All frames available from the frame grouper(s) 3812 and/or 3814 may be combined into a single frame. Alternatively, all the frames may be combined on a per prescription term basis. That is, all frames with similar M and J values may be combined to generate a single combined frame.
[00298] Alternatively, the frames may be combined so as to yield a new set of frames in which each frame is a combination of all preceding frames in the input set of frames, as graphically illustrated in Fig. 53. Output frame 1 is generated by registering and summing or averaging input frames 1 and 2. Output frame 2 is generated by registering and summing or averaging input frames 1, 2 and 3. Output frame N is generated by registering and summing or averaging input frames 1, 2, 3, ... N. Optionally, the quality metrics of each generated frame may be adjusted. In general, combining frames improves signal-to-noise. [00299] Outputs of the frame restorer 3816 are summarized in Table 9.
Table 9: Outputs from frame restorer module
A set of consecutive frames, each containing a spot diagram
A timestamp for each frame
A tag for each frame, ex. "valid" or "incomplete"
Coordinates for each spot diagram's center
A diameter of each spot diagram (projected pupil size)
Image sensor settings (exposure time and frame rate) for each frame
Zernike coefficients for each spot diagram (frame)
A prescription in the PWV (M, J0 and J45) domain for each spot diagram (frame)
Quality metrics for each frame
A set of not necessarily consecutive frames, each frame containing a spot diagram from which a spherical term may be calculated
A set of frames yielding similar prescriptions or Zernike coefficients, as the case may be
A set of combined frames
[00300] Optionally, a second prescription calculator 3818 calculates prescriptions from the frames generated by the frame restorer 3816. The second prescription calculator 3818 operates largely as described above, with respect to the first prescription calculator 3806, except the input dataset is different. Outputs from the second prescription calculator 3818 are essentially the same as described in Table 5.
[00301] A final prescription calculator 3820 accepts inputs from the frame grouper 3812, the frame grouper 3814 and/or the second prescription calculator 3818. The final prescription calculator 3820 calculates a single final prescription from its inputs using one or more statistical calculations. In some embodiments, the final prescription calculator 3820 calculates the final M, J0 and J45 prescriptions as a mean, mode or median of its input M, J0 and J45 prescriptions, after weighting each frame's prescriptions by the frame's quality metrics. In the final prescription calculator 3820, and in other modules described herein, higher-order prescription terms are calculated in the same manner as the M, J0 and J45 prescriptions are calculated.
[00302] Optionally, the final prescription calculator 3820 also calculates an estimated error value for each final calculated prescription. In some embodiments, the M error is estimated to be the standard deviation of the final calculated M prescription, within the M input data to the final prescription calculator 3820. In some embodiments, the error is estimated to be twice the standard deviation, according to preferences of some clinicians (95% confidence interval). Other embodiments may estimate the error using other statistical formulas. This error may be communicated to the user of the device by a confidence value in the prescription, for instance, indicating a strong confidence in the measured prescription, or a weak confidence in the measured prescription and suggesting to run the test again.
[00303] Some embodiments estimate a confidence region for the final astigmatism prescription. This confidence region may be an ellipse computed for the bivariate distribution of JO and J45. In these embodiments, the precision of the astigmatism prescription is deemed to be the geometric mean of the major and minor axes of the 95% confidence ellipse, as exemplified in Fig. 54.
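A minimal sketch of the final combination described in paragraphs [00301] and [00302], assuming NumPy; the names are hypothetical. It forms a quality-weighted mean of the per-frame power-vector terms and reports an uncertainty of one or two standard deviations of the inputs.

```python
import numpy as np

def final_prescription(m, j0, j45, quality, k_sigma=2.0):
    """Combine per-frame power vectors into one prescription with error bars.

    m, j0, j45 : 1-D arrays of per-frame power-vector terms (diopters)
    quality    : per-frame quality metrics used as weights
    k_sigma    : 1.0 for one standard deviation, 2.0 for a ~95% interval
    Returns ((M, J0, J45), (M_err, J0_err, J45_err)).
    """
    w = np.asarray(quality, dtype=float)
    terms = [np.asarray(t, dtype=float) for t in (m, j0, j45)]
    means = tuple(float(np.average(t, weights=w)) for t in terms)   # weighted mean
    errors = tuple(float(k_sigma * np.std(t, ddof=1)) for t in terms)
    return means, errors
```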
[00304] Embodiments of the present invention are not necessarily limited to calculating prescriptions for living beings. Some embodiments may be used on a model eye ball to evaluate a person's spectacle prescription. For example, these embodiments may be used to evaluate a person's spectacles and automatically determine if they are appropriate for the person by checking the person without his spectacles and either checking the person with his spectacles on (as indicated in phantom at 1552 in Fig. 15) or checking the spectacles on a model eye. Optionally or alternatively, embodiments may be used to evaluate a person's spectacles and automatically determine if they are appropriate for the person by checking the person with his spectacles on and determining if the returned wavefronts indicate correct vision, at least within a predetermined range.
[00305] If a patient is known to be able to accommodate well, the patient's aberrations may be measured when the patient is looking through an embodiment at a target located closer than 20 feet (6 meters) away, and an accommodative offset is then calculated, so as to estimate a prescription for the patient at infinity.
[00306] In another embodiment, a monocular aberrometer includes an accelerometer to enable the device to ascertain which direction is up and, therefore, automatically ascertain which eye (left or right) is being measured. The device is turned upside down to measure the opposite eye.
[00307] Some embodiments also track undesired movements of a patient by tracking how astigmatism components of the prescription change, as a function of time.
[00308] While the invention is described through the above-described exemplary embodiments, modifications to, and variations of, the illustrated embodiments may be made without departing from the inventive concepts disclosed herein. For example, embodiments of the present invention may find utility in virtual reality goggles or adaptive correction displays. Furthermore, disclosed aspects, or portions thereof, may be combined in ways not listed above and/or not explicitly claimed. Accordingly, the invention should not be viewed as being limited to the disclosed embodiments.
[00309] Although aspects of embodiments may be described with reference to flowcharts and/or block diagrams, functions, operations, decisions, etc. of all or a portion of each block, or a combination of blocks, may be combined, separated into separate operations or performed in other orders. All or a portion of each block, or a combination of blocks, may be implemented as computer program instructions (such as software), hardware (such as combinatorial logic, Application Specific Integrated Circuits (ASICs), Field-Programmable Gate Arrays (FPGAs) or other hardware), firmware or combinations thereof. Embodiments may be implemented by a processor executing, or controlled by, instructions stored in a memory. The memory may be random access memory (RAM), read-only memory (ROM), flash memory or any other memory, or combination thereof, suitable for storing control software or other instructions and data. Instructions defining the functions of the present invention may be delivered to a processor in many forms, including, but not limited to, information permanently stored on tangible non-writable storage media (e.g., read-only memory devices within a computer, such as ROM, or devices readable by a computer I/O attachment, such as CD-ROM or DVD disks), information alterably stored on tangible writable storage media (e.g., floppy disks, removable flash memory and hard drives) or information conveyed to a computer through a communication medium, including wired or wireless computer networks. Moreover, while embodiments may be described in connection with various illustrative data structures, systems may be embodied using a variety of data structures.

Claims

What is claimed is:
1. A method of determining an optical property of an eye of a living being, the method comprising:
providing an optical apparatus having a proximal port and a distal port that together form a visual channel;
aligning the eye with the proximal port;
producing target indicia at effective infinity, the target indicia being viewable through the visual channel;
focusing the eye on the target indicia;
determining accommodation of the eye, as the eye views the target indicia; and
calculating an optical property for the eye, as a function of the determined accommodation.
2. The method as defined by claim 1, further comprising:
gathering data relating to accommodation of the eye; wherein:
calculating the optical property comprises:
a) using the data relating to accommodation to identify when the eye is accommodating and when the eye is only partially accommodating; and
b) selecting data relating to when the eye is only partially accommodating to calculate the optical property.
3. The method as defined by claim 2, wherein calculating the optical property comprises discarding the data relating to when the eye is accommodating.
4. The method as defined by claim 1, further comprising:
generating a target light beam by a target light source coupled to the apparatus; and producing the target indicia with the target light beam.
5. The method as defined by claim 1, wherein determining accommodation comprises obtaining a plurality of sequential images of a light wavefront from the eye, as the eye focuses on the target indicia.
6. The method as defined by claim 5, wherein calculating the optical property comprises calculating the optical property as a function of timing of the sequential images.
7. The method as defined by claim 5, wherein determining accommodation comprises tracking changes in the optical aberrations of the eye using measurements from a plurality of sequential images of a light wavefront from the eye, as the eye focuses on the target indicia.
8. The method as defined by claim 5, wherein determining accommodation of the eye comprises filtering one or more images from the plurality of sequential images.
9. The method as defined by claim 8, wherein the filtering is based on physiological parameters of the eye, including a rate of change in accommodation of the eye.
10. The method as defined by claim 1, wherein calculating the optical property for the eye comprises using a wavefront aberrometer to calculate the optical property.
11. The method as defined by claim 1, wherein focusing the eye on the target indicia comprises focusing the eye on the target indicia while the target indicia is at least about 10 feet from the apparatus.
12. The method as defined by claim 1, wherein calculating the optical property comprises calculating a prescription for the eye.
13. An optical apparatus comprising:
a proximal port and a distal port that together form a visual channel;
a target light source configured to produce target indicia at effective infinity, the target indicia being viewable through the visual channel; and
determining logic configured to determine accommodation of an eye, as the eye views the target indicia.
14. The optical apparatus as defined by claim 13, further comprising a body forming the proximal and distal ports, the body further containing the determining logic.
15. The optical apparatus as defined by claim 13, wherein the determining logic is configured to calculate a prescription for the eye, as a function of the determined accommodation of the eye.
16. The optical apparatus as defined by claim 13, further comprising a wavefront image sensor operatively coupled with the determining logic, the image sensor being configured to capture a plurality of sequential images of wavefronts, as the eye focuses on the target indicia.
17. The optical apparatus as defined by claim 16, wherein the logic for determining accommodation is configured to calculate the prescription, as a function of timing of the sequential images.
18. The optical apparatus as defined in claim 17, wherein the determining logic uses as input a spherical prescription for the eye, as a function of the timing of the sequential images.
19. The optical apparatus as defined in claim 17, wherein the determining logic uses as input a spherical equivalent (M) prescription for the eye, as a function of the timing of the sequential images.
20. The optical apparatus as defined in claim 16, further comprising a filter operably coupled with the determining logic, the filter being configured to filter one or more images from the plurality of sequential images.
21. An optical apparatus comprising:
a proximal port configured to receive an eye;
an array of primary light sensors configured to receive a wavefront passing through the proximal port, the array of primary light sensors having a perimeter;
at least one secondary light sensor positioned outside the perimeter of the array of primary light sensors; and
a circuit configured to determine a parameter of the eye using wavefront data from the array of primary light sensors.
22. The optical apparatus as defined by claim 21 further comprising a non-stationary body having the proximal port and a distal port and forming a visual channel from the proximal port through the distal port, the visual channel being open view to enable the eye to see target indicia external to and spaced away from the body.
23. The optical apparatus as defined by claim 21, further comprising a retinal light source configured to direct an illumination beam toward the proximal port to produce the wavefront.
24. The optical apparatus as defined by claim 21, further comprising a cue generator operatively coupled with the at least one secondary light sensor, the cue generator being configured to generate a cue as a function of receipt of the wavefront by the at least one secondary light sensor.
25. The optical apparatus as defined by claim 24, wherein the cue generator is configured to generate at least one of a visual cue, an acoustic cue and a mechanical cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
26. The optical apparatus as defined by claim 21, wherein the array of primary light sensors has a first sensitivity to the wavefront, the at least one secondary light sensor has a second sensitivity to the wavefront, the first sensitivity being greater than the second sensitivity.
27. The optical apparatus as defined by claim 21, wherein the array of primary light sensors comprises a CCD, and the at least one secondary light sensor comprises a quadrant sensor.
28. The optical apparatus as defined by claim 21, wherein the distal port at least in part defines an optical axis; and the at least one secondary light sensor is configured to receive the wavefront, as a function of the orientation of the eye relative to the optical axis.
29. The optical apparatus as defined by claim 21, wherein the at least one secondary light sensor substantially circumscribes the perimeter of the array of primary light sensors.
30. An optical method comprising:
providing an optical apparatus having a proximal port and a distal port that together form a visual channel from the proximal port through the distal port, the apparatus further including an array of primary light sensors having a perimeter and at least one secondary light sensor positioned outside the perimeter of the array of primary light sensors;
aligning a living being's eye with the proximal port, the eye viewing through the distal port to target indicia exterior of the apparatus;
illuminating the eye to produce a wavefront through the proximal port;
determining the amount of the wavefront sensed by the at least one secondary light sensor; and
generating a cue, as a function of the amount of the wavefront sensed by the at least one secondary light sensor.
31. The method as defined by claim 30, wherein the distal port at least in part defines an optical axis, the method further comprising moving the eye toward the optical axis in response to the cue.
32. The method as defined by claim 30, wherein the distal port at least in part defines an optical axis and the at least one secondary light sensor is configured to receive the wavefront, as a function of the orientation of the eye, relative to the optical axis.
33. The method as defined by claim 30, further comprising splitting the wavefront into a primary path toward the array of primary light sensors, the method further splitting the wavefront into a secondary path toward the at least one secondary light sensor.
34. An optical apparatus comprising:
a proximal port configured to receive an eye;
a distal port;
a visual channel from the proximal port through the distal port;
an array of primary light sensors configured to receive a wavefront passing through the proximal port;
at least one secondary light sensor; and
optics within the visual channel and configured to split the wavefront into a primary path toward the array of primary light sensors and a secondary path toward the at least one secondary light sensor.
35. The optical apparatus as defined by claim 34, wherein the primary light sensors are adjacent the at least one secondary light sensor.
36. The optical apparatus as defined by claim 34, further comprising a lens adjacent the at least one secondary light sensor, the lens being positioned so the secondary path passes through the lens.
37. The optical apparatus as defined by claim 34, further comprising a retinal light source configured to direct an illumination beam toward the proximal port to produce the wavefront.
38. The optical apparatus as defined by claim 34, further comprising a cue generator operatively coupled with the at least one secondary light sensor, the cue generator being configured to generate a cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
39. The optical apparatus as defined by claim 38, wherein the cue generator is configured to generate at least one of a visual cue, an acoustic cue and a mechanical cue, the cue generator generating the at least one cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
40. The optical apparatus as defined by claim 34, wherein the array of primary light sensors has a first sensitivity to the wavefront, the at least one secondary light sensor has a second sensitivity to the wavefront and the first sensitivity is greater than the second sensitivity.
41. The optical apparatus as defined by claim 34, wherein the array of primary light sensors comprises a CCD, and the at least one secondary light sensor comprises a quadrant sensor.
42. An optical method comprising:
providing an optical apparatus having a proximal port and a distal port that together form a visual channel from the proximal port through the distal port, the apparatus also having an array of primary light sensors and at least one secondary light sensor;
aligning a living being's eye with the proximal port, the eye viewing through the distal port to target indicia exterior of the apparatus;
illuminating the eye to produce a wavefront through the proximal port; and
splitting the wavefront into a primary path toward the array of primary light sensors and a secondary path toward the at least one secondary light sensor.
43. The method as defined by claim 42, further comprising passing the secondary path of the wavefront through a lens to focus the split portion of the wavefront along the secondary path.
44. The method as defined by claim 42, further comprising directing a light beam toward the proximal port to reflect off the eye to produce the wavefront.
45. The method as defined by claim 42, further comprising generating a cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
46. The method as defined by claim 45, wherein generating the cue comprises generating at least one of: a visual cue, an acoustic cue and a mechanical cue, as a function of receipt of the wavefront by the at least one secondary light sensor.
47. The method as defined by claim 42, wherein the at least one secondary light sensor comprises a quadrant sensor.
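Claims 28, 32 and 47 tie the secondary quadrant sensor to the orientation of the eye relative to the optical axis. A minimal sketch of a conventional quadrant-detector read-out follows, assuming the focused secondary-path spot straddles four cells labelled A-D; the cell ordering, normalization and sign conventions are assumptions, not details from the specification.

```python
# Illustrative only: a conventional quadrant-detector read-out, offered as one
# way the secondary path could indicate eye orientation (claims 28, 32, 47).
# Quadrant order (a, b, c, d) and sign conventions are assumptions.

def quadrant_displacement(a, b, c, d):
    """Normalized x/y offset of the focused secondary-path spot.
    a = top-left, b = top-right, c = bottom-left, d = bottom-right."""
    total = a + b + c + d
    if total == 0:
        return None                      # no wavefront reaches the sensor
    x = ((b + d) - (a + c)) / total      # right minus left
    y = ((a + b) - (c + d)) / total      # top minus bottom
    return x, y

print(quadrant_displacement(0.30, 0.10, 0.35, 0.12))  # spot displaced toward the left
print(quadrant_displacement(0.21, 0.20, 0.20, 0.21))  # spot nearly centred
```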
48. A method of determining an optical property of an eye of a living being, the method comprising:
providing an optical apparatus having a proximal port and a distal port that together form a visual channel;
aligning the eye with the proximal port;
directing light into the eye to produce a wavefront;
receiving the wavefront via the proximal port;
capturing a plurality of sequential, time spaced-apart data sets of the wavefront, the data sets including temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured; and
determining an optical property of the eye, as a function of the temporal information.
49. The method as defined by claim 48, wherein the plurality of sequential data sets comprises images of the wavefront.
50. The method as defined by claim 48, further comprising filtering at least one data set of the plurality of data sets.
51. The method as defined by claim 48, wherein determining the optical property comprises determining the optical property as a function of the order of the plurality of data sets and the contents of the plurality of data sets.
52. The method as defined by claim 48, wherein each data set of the plurality of data sets includes wavefront aberration information.
53. The method as defined by claim 48, wherein the optical property includes a spherical component and a cylindrical component, wherein determining the optical property comprises determining the cylindrical component after determining the spherical component.
54. The method as defined by claim 48, wherein determining the optical property comprises analyzing the plurality of data sets for trends in the data.
55. The method as defined by claim 48, wherein the plurality of data sets includes information relating to accommodation of the eye.
56. The method as defined by claim 48, further comprising weighting certain data sets of the plurality of data sets, as a function of a signal-to-noise ratio.
57. The method as defined by claim 48, wherein the plurality of data sets comprises a video of the wavefront.
58. The method as defined by claim 48, wherein the optical property includes a prescription for the eye.
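Claims 48, 54 and 56 recite determining the optical property from time-stamped data sets, analyzing them for trends, and weighting them by signal-to-noise ratio. The sketch below is one plausible reduction under the assumption that each data set has already been distilled to a spherical-equivalent estimate with a timestamp and an SNR figure; the data structure and the numbers are hypothetical.

```python
# Illustrative sketch in the spirit of claims 48, 54 and 56: combine
# time-stamped spherical-equivalent estimates, weighting each by its
# signal-to-noise ratio.  The data-set structure and values are hypothetical.
from dataclasses import dataclass

@dataclass
class WavefrontDataSet:
    t: float         # capture time in seconds (the temporal information)
    sphere_d: float  # spherical-equivalent estimate, dioptres
    snr: float       # signal-to-noise ratio of this data set

def weighted_sphere(data_sets):
    """SNR-weighted estimate of the spherical component."""
    total_w = sum(ds.snr for ds in data_sets)
    return sum(ds.snr * ds.sphere_d for ds in data_sets) / total_w

def accommodation_trend(data_sets):
    """Crude trend check: dioptre drift per second over the capture."""
    first, last = data_sets[0], data_sets[-1]
    return (last.sphere_d - first.sphere_d) / (last.t - first.t)

frames = [WavefrontDataSet(0.0, -2.1, 8.0),
          WavefrontDataSet(0.2, -2.3, 12.0),
          WavefrontDataSet(0.4, -2.2, 10.0)]
print(round(weighted_sphere(frames), 2))       # SNR-weighted sphere estimate
print(round(accommodation_trend(frames), 2))   # dioptres per second of drift
```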
59. An apparatus for determining an optical property of an eye of a living being, the apparatus comprising:
a proximal port and a distal port that together form a visual channel;
an illumination light source configured to direct light into the eye to produce a wavefront that is received through the proximal port;
an image capture sensor operatively coupled with the visual channel, the sensor being configured to capture a plurality of sequential, time spaced-apart data sets of the wavefront, the data sets including temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured; and
optical property logic operatively coupled to the image capture sensor, the optical property logic being configured to determine an optical property of the eye, as a function of the temporal information.
60. The optical apparatus as defined by claim 59, wherein the plurality of data sets comprises images of the wavefront.
61. The optical apparatus as defined by claim 59, further comprising a filter configured to filter at least one data set of the plurality of data sets.
62. The optical apparatus as defined by claim 59, wherein the optical property logic comprises logic configured for determining an optical property, as a function of the order of the plurality of data sets and contents of the plurality of data sets.
63. The optical apparatus as defined by claim 62, wherein the optical property comprises a prescription for the eye.
64. The optical apparatus as defined by claim 59, wherein each data set of the plurality of data sets includes wavefront aberration information.
65. The optical apparatus as defined by claim 59, wherein the optical property includes a spherical component and a cylindrical component and the optical property logic is configured to determine the cylindrical component after determining the spherical component.
66. The optical apparatus as defined by claim 59, wherein the optical property logic is configured to analyze the plurality of data sets for trends in the data.
67. The optical apparatus as defined by claim 59, wherein the plurality of data sets includes information relating to accommodation of the eye.
68. The optical apparatus as defined by claim 59, wherein the optical property logic is configured to weigh at least one data set of the plurality of data sets, as a function of a signal-to-noise ratio.
69. The optical apparatus as defined by claim 59, wherein the sequential data sets comprise a video of the wavefront.
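Claims 53 and 65 recite determining the cylindrical component after the spherical component. One standard route from a measured wavefront to a sphero-cylindrical result is the second-order Zernike to power-vector conversion sketched below; this is textbook wavefront optics offered for orientation only and is not asserted to be the computation used by the claimed apparatus.

```python
# Standard second-order Zernike -> sphero-cylindrical conversion (power
# vectors), shown only to illustrate computing the spherical component before
# the cylindrical one (claims 53/65).  Coefficients in micrometres, pupil
# radius in millimetres; the input values below are made up.
import math

def rx_from_zernike(c20, c22, c2m2, pupil_radius_mm):
    r2 = pupil_radius_mm ** 2
    # Spherical equivalent first ...
    m = -4 * math.sqrt(3) * c20 / r2
    # ... then the astigmatic (cylindrical) part.
    j0 = -2 * math.sqrt(6) * c22 / r2
    j45 = -2 * math.sqrt(6) * c2m2 / r2
    cyl = -2 * math.hypot(j0, j45)               # negative-cylinder convention
    sph = m - cyl / 2
    axis = math.degrees(0.5 * math.atan2(j45, j0)) % 180
    return sph, cyl, axis

print(rx_from_zernike(c20=1.2, c22=-0.3, c2m2=0.1, pupil_radius_mm=2.0))
```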
70. A method of determining an optical property of an eye of a living being, the method comprising:
providing an optical apparatus having a proximal port and a distal port that together form a visual channel;
aligning the eye with the proximal port;
directing light into the eye to produce a wavefront;
receiving the wavefront via the proximal port;
capturing a plurality of sequential, time spaced-apart data sets of the wavefront, the data sets including temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured, the data sets including high-frequency noise;
registering data sets of the plurality of data sets; and
determining an optical property of the eye, as a function of the registered data sets.
71. The method as defined by claim 70, wherein registering the data sets comprises mitigating the high frequency noise.
72. The method as defined by claim 70, wherein registering the data sets comprises increasing a signal-to-noise ratio.
73. The method as defined by claim 70, wherein registering the data sets comprises registering consecutive data sets.
74. The method as defined by claim 70, wherein the plurality of data sets comprises images of the wavefront.
75. The method as defined by claim 70, further comprising filtering at least one data set of the plurality of data sets before registering.
76. The method as defined by claim 70, wherein each data set includes wavefront aberration information.
77. The method as defined by claim 70, wherein the plurality of data sets comprises a video of the wavefront.
78. The method as defined by claim 70, wherein registering the plurality of data sets comprises registering consecutive data sets and combining the registered consecutive data sets to mitigate noise.
79. The method as defined by claim 70, wherein:
registering the plurality of data sets comprises:
selecting data sets that were acquired close enough together in time to avoid data sets that span a change in the optical property of the eye due to accommodation; and
registering the selected data sets; the method further comprising:
combining the registered data sets to mitigate noise.
80. The method as defined by claim 70, wherein registering the plurality of data sets comprises registering data sets with similar, within a predetermined range, wavefront aberration information and combining the registered data sets to mitigate noise.
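Claims 73, 78 and 79 recite registering consecutive data sets, restricting registration to sets acquired close together in time (to avoid spanning an accommodation change), and combining the registered sets to mitigate noise. The sketch below does this with FFT phase correlation and simple frame averaging; the 0.3-second window and the choice of phase correlation are assumptions rather than features drawn from the specification.

```python
# Illustrative sketch in the spirit of claims 73, 78 and 79: register
# consecutive wavefront images by phase correlation, keep only frames close
# together in time, and average them to suppress high-frequency noise.
# The 0.3 s window and the use of phase correlation are assumptions.
import numpy as np

def register_shift(ref, img):
    """Integer-pixel shift that aligns img with ref (phase correlation peak)."""
    cross = np.fft.ifft2(np.fft.fft2(ref) * np.conj(np.fft.fft2(img)))
    dy, dx = np.unravel_index(np.argmax(np.abs(cross)), cross.shape)
    return dy, dx

def register_and_average(frames, times, max_span_s=0.3):
    """Average consecutive frames acquired within max_span_s of the first one."""
    keep = [f for f, t in zip(frames, times) if t - times[0] <= max_span_s]
    ref = keep[0]
    aligned = [ref]
    for img in keep[1:]:
        dy, dx = register_shift(ref, img)
        aligned.append(np.roll(np.roll(img, dy, axis=0), dx, axis=1))
    return np.mean(aligned, axis=0)

# Toy data: three noisy copies of the same spot pattern, one shifted sideways.
rng = np.random.default_rng(0)
base = np.zeros((32, 32))
base[16, 16] = 1.0
frames = [base + 0.05 * rng.standard_normal(base.shape),
          np.roll(base, 2, axis=1) + 0.05 * rng.standard_normal(base.shape),
          base + 0.05 * rng.standard_normal(base.shape)]
print(register_and_average(frames, times=[0.0, 0.1, 0.2]).shape)
```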
81. An optical apparatus for determining an optical property of an eye of a living being, the apparatus comprising:
a proximal port and a distal port that together form a visual channel;
an illumination light source configured to direct light into the eye to produce a wavefront that is received through the proximal port;
an image capture sensor operatively coupled with the visual channel, the sensor being configured to capture a plurality of sequential, time spaced-apart data sets of the wavefront, the data sets including temporal information sufficient to describe a relative time at which each data set of the plurality of data sets is captured, the data sets including high-frequency noise;
optical property logic operatively coupled with the sensor, the optical property logic being configured to register consecutive data sets of the plurality of data sets to mitigate the high frequency noise, the optical property logic being configured to also determine an optical property of the eye, as a function of the registered data sets.
82. The optical apparatus as defined by claim 81, wherein the plurality of data sets comprises images of the wavefront.
83. The optical apparatus as defined by claim 81, further comprising a filter configured to filter at least one data set of the plurality of data sets before registering.
84. The optical apparatus as defined by claim 81, wherein each data set of the plurality of data sets includes wavefront aberration information.
85. The optical apparatus as defined by claim 81, wherein the plurality of data sets comprises a video of the wavefront.
86. The optical apparatus as defined by claim 81, wherein the optical property logic is configured to combine or average the registered data sets to mitigate noise.
87. The optical apparatus as defined by claim 81, wherein the optical property logic is configured to register consecutive data sets.
88. The optical apparatus as defined by claim 81, wherein the optical property logic is configured to select data sets that were acquired close enough together in time to avoid data sets that span a change in the optical property of the eye due to accommodation and register the selected data sets.
89. The optical apparatus as defined by claim 81, wherein the optical property comprises an eye prescription.
90. The method as defined by claim 12, wherein calculating the optical property comprises calculating eyeglass prescriptions for distant and near vision.
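Claim 90 recites calculating eyeglass prescriptions for distant and near vision. Under the assumption that the captured series includes both relaxed and accommodated states, one plausible and purely illustrative reading is to take the distance sphere from the least-accommodated samples and to derive a near addition from the accommodative demand the eye cannot meet; the 2.50 D near demand and the helper names below are hypothetical.

```python
# Hypothetical illustration of claim 90: derive distance and near
# prescriptions from a series of spherical-equivalent measurements, assuming
# the least-accommodated (most positive) readings reflect distance refraction.
# The near working-distance demand (2.50 D at 40 cm) and the accommodation
# estimate are assumptions for illustration only.

def distance_and_near_rx(sphere_samples_d, measured_accommodation_d,
                         near_demand_d=2.50):
    distance_sphere = max(sphere_samples_d)   # least-accommodated sample
    # The near addition covers whatever the eye cannot accommodate itself.
    add = max(0.0, near_demand_d - measured_accommodation_d)
    return distance_sphere, distance_sphere + add

samples = [-2.50, -2.25, -2.75, -2.30]        # dioptres over the capture
print(distance_and_near_rx(samples, measured_accommodation_d=1.00))
# -> (-2.25, -0.75): distance Rx and near Rx with a +1.50 D add
```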
PCT/US2014/045305 2013-07-02 2014-07-02 Apparatus and method of determining an eye prescription WO2015003086A1 (en)

Priority Applications (7)

Application Number Priority Date Filing Date Title
US14/900,695 US9854965B2 (en) 2013-07-02 2014-07-02 Apparatus and method of determining an eye prescription
CN201480046065.8A CN105473056B (en) 2013-07-02 2014-07-02 The device and method for determining eye prescription
EP18175052.2A EP3387985B1 (en) 2013-07-02 2014-07-02 Apparatus and method of determining an eye prescription
EP14820078.5A EP3016576A4 (en) 2013-07-02 2014-07-02 Apparatus and method of determining an eye prescription
KR1020167002229A KR101995878B1 (en) 2013-07-02 2014-07-02 Apparatus and method of determining an eye prescription
JP2016524353A JP6470746B2 (en) 2013-07-02 2014-07-02 Apparatus and method for determining ophthalmic prescription
US15/858,415 US10349830B2 (en) 2013-07-02 2017-12-29 Apparatus and method of determining an eye prescription

Applications Claiming Priority (6)

Application Number Priority Date Filing Date Title
US201361842190P 2013-07-02 2013-07-02
US61/842,190 2013-07-02
US201461972058P 2014-03-28 2014-03-28
US201461972191P 2014-03-28 2014-03-28
US61/972,191 2014-03-28
US61/972,058 2014-03-28

Related Child Applications (2)

Application Number Title Priority Date Filing Date
US14/900,695 A-371-Of-International US9854965B2 (en) 2013-07-02 2014-07-02 Apparatus and method of determining an eye prescription
US15/858,415 Continuation US10349830B2 (en) 2013-07-02 2017-12-29 Apparatus and method of determining an eye prescription

Publications (1)

Publication Number Publication Date
WO2015003086A1 true WO2015003086A1 (en) 2015-01-08

Family

ID=52144195

Family Applications (2)

Application Number Title Priority Date Filing Date
PCT/US2014/045261 WO2015003062A1 (en) 2013-07-02 2014-07-02 Apparatus and method of determining an eye prescription
PCT/US2014/045305 WO2015003086A1 (en) 2013-07-02 2014-07-02 Apparatus and method of determining an eye prescription

Family Applications Before (1)

Application Number Title Priority Date Filing Date
PCT/US2014/045261 WO2015003062A1 (en) 2013-07-02 2014-07-02 Apparatus and method of determining an eye prescription

Country Status (6)

Country Link
US (4) US9854965B2 (en)
EP (4) EP3016575A4 (en)
JP (2) JP6470746B2 (en)
KR (2) KR101995878B1 (en)
CN (2) CN105578947B (en)
WO (2) WO2015003062A1 (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160205298A1 (en) * 2015-01-09 2016-07-14 Smart Vision Labs Portable Fundus Camera
JP2017189617A (en) * 2016-04-14 2017-10-19 キヤノン株式会社 Imaging apparatus control method, computer-readable medium, and controller for controlling imaging apparatus
WO2017218539A1 (en) 2016-06-14 2017-12-21 Plenoptika, Inc. Tunable-lens-based refractive examination
CN108392170A (en) * 2018-02-09 2018-08-14 中北大学 A kind of human eye follow-up mechanism and recognition positioning method for optometry unit
US20220338732A1 (en) * 2019-07-31 2022-10-27 Yoichiro Kobayashi Eyeball imaging device and diagnosis support system
WO2022236333A2 (en) 2021-05-06 2022-11-10 Plenoptika, Inc. Eye examination method and system
WO2023064946A1 (en) 2021-10-15 2023-04-20 Plenoptika, Inc. Dynamic retinal image quality

Families Citing this family (34)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9066683B2 (en) 2013-04-09 2015-06-30 Smart Vision Labs Portable wavefront aberrometer
JP6470746B2 (en) 2013-07-02 2019-02-13 マサチューセッツ インスティテュート オブ テクノロジー Apparatus and method for determining ophthalmic prescription
CA2986033A1 (en) * 2015-05-12 2016-11-17 Ikem C AJAELO Electronic drop dispensing device and method of operation thereof
WO2017025583A1 (en) * 2015-08-12 2017-02-16 Carl Zeiss Meditec, Inc. Alignment improvements for ophthalmic diagnostic systems
CN105496351B (en) * 2015-12-30 2017-11-14 深圳市莫廷影像技术有限公司 A kind of binocular optometry equipment
WO2017138004A1 (en) * 2016-02-12 2017-08-17 Shamir Optical Industry Ltd. Methods and systems for testing of eyeglasses
US10859857B2 (en) * 2016-03-22 2020-12-08 Johnson & Johnson Vision Care, Inc. Pulsed plus lens designs for myopia control, enhanced depth of focus and presbyopia correction
CN105933060B (en) * 2016-06-24 2018-02-13 温州大学 A kind of wavefront reconstruction method based on dynamics Feedback Neural Network
WO2018110731A1 (en) * 2016-12-14 2018-06-21 주식회사 에덴룩스 Vision training device
FR3062476B1 (en) * 2017-01-27 2020-12-25 Imagine Optic METHOD OF EVALUATING THE QUALITY OF MEASUREMENT OF A WAVEFRONT AND SYSTEMS IMPLEMENTING SUCH A METHOD
WO2018147834A1 (en) * 2017-02-07 2018-08-16 Carl Zeiss Vision International Gmbh Prescription determination
JP2018166649A (en) * 2017-03-29 2018-11-01 株式会社トプコン Ophthalmologic apparatus
JP2018166650A (en) * 2017-03-29 2018-11-01 株式会社トプコン Ophthalmologic apparatus
US10789450B2 (en) * 2017-10-20 2020-09-29 Synaptics Incorporated Optical biometric sensor with automatic gain and exposure control
CN111542258B (en) * 2017-11-07 2023-10-20 诺达尔视觉有限公司 Method and system for alignment of ophthalmic imaging devices
US20190223712A1 (en) * 2018-01-24 2019-07-25 Ubiquity Biomedical Corporation Eyelid opener and eyelid opening and sensing device
CN108703739B (en) * 2018-06-12 2021-02-09 钟青松 Ophthalmic laser speckle optometry machine
WO2020010138A1 (en) * 2018-07-06 2020-01-09 The Johns Hopkins University Computational lightfield ophthalmoscope
GB201814071D0 (en) 2018-08-29 2018-10-10 Provost Fellows Found Scholars And The Other Members Of Board Of The College Of The Holy And Undivid Optical device and method
EP3663838A1 (en) * 2018-12-03 2020-06-10 Carl Zeiss Vision International GmbH Spectacle lens, family of spectacle lenses, method for designing a spectacle lens family and method for producing a spectacle lens
TW202034840A (en) * 2018-12-06 2020-10-01 美商愛奎有限公司 Refraction measurement of the human eye with a reverse wavefront sensor
KR102130310B1 (en) * 2019-01-21 2020-07-08 박귀종 Optometer
CA3132556A1 (en) * 2019-03-05 2020-09-10 Gaston Daniel Baudat System and method of wavefront sensing with engineered images
KR102216051B1 (en) * 2019-03-12 2021-02-15 연세대학교 원주산학협력단 Apparatus and method for monitoring ophthalmic diseases
CN114269228A (en) 2019-09-30 2022-04-01 爱尔康公司 Improved ocular aberration measurement recovery system and method
CA3143563A1 (en) * 2019-09-30 2021-04-08 Max Hall Improved ocular aberrometer systems and methods
CN115135227A (en) * 2019-12-19 2022-09-30 奥西奥有限公司 Ophthalmoscope for examining eyes
KR102460451B1 (en) * 2020-02-20 2022-10-28 사회복지법인 삼성생명공익재단 Method for measuring anomalies of refraction using a reflection image of pupil in visible light
CN111281337B (en) * 2020-03-02 2023-03-14 南方科技大学 Image acquisition equipment and fundus retina imaging method based on image acquisition equipment
US11589744B2 (en) * 2020-09-23 2023-02-28 Troy Lee Carter Ophthalmic fixation apparatus and method of use
EP3991634A1 (en) 2020-10-29 2022-05-04 Carl Zeiss Vision International GmbH Method and device for determining a refractive error
KR102480635B1 (en) * 2020-11-20 2022-12-26 주식회사 에덴룩스 Measuring apparatus of vision and measuring system including the same
JP2023067207A (en) * 2021-10-29 2023-05-16 DeepEyeVision株式会社 Eyeground information acquisition method and eyeground information acquisition apparatus
WO2023131981A1 (en) * 2022-01-10 2023-07-13 Remidio Innovative Solutions Pvt. Ltd Autorefractive device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5684561A (en) * 1992-05-26 1997-11-04 Daphne Eye Technologies Device and method for evaluation of refraction of the eye
US20030071969A1 (en) * 2001-08-31 2003-04-17 Levine Bruce M. Ophthalmic instrument with adaptive optic subsystem that measures aberrations (including higher order aberrations) of a human eye and that provides a view of compensation of such aberrations to the human eye
US20040174495A1 (en) * 2001-06-05 2004-09-09 Adaptive Optics Associates, Inc. Method of and system for examining the human eye with a wavefront sensor-based ophthalmic instrument
US20080198331A1 (en) * 2004-07-19 2008-08-21 Massachusetts Eye & Ear Infirmary, A Massachusetts Corporation Ocular wavefront-correction profiling
US20080284979A1 (en) * 2007-05-17 2008-11-20 Visx, Incorporated System and Method for Illumination and Fixation with Ophthalmic Diagnostic Instruments
US20090002632A1 (en) * 2001-11-07 2009-01-01 Carl Zeiss Meditec Ag Method, device and arrangement for measuring the dynamic behavior of an optical system

Family Cites Families (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5963300A (en) * 1998-02-17 1999-10-05 Amt Technologies, Corp. Ocular biometer
US7303281B2 (en) * 1998-10-07 2007-12-04 Tracey Technologies, Llc Method and device for determining refractive components and visual function of the eye for vision correction
DE69941988D1 (en) * 1999-07-27 2010-03-18 Wavetec Vision Systems Inc OKULAR BIOMETER
EP1429652A4 (en) * 2001-07-27 2007-05-02 Tracey Technologies Llc Measuring refractive characteristics of human eyes
JP4718056B2 (en) 2001-08-13 2011-07-06 株式会社アマダエンジニアリングセンター Steel conveying device
US6802605B2 (en) * 2001-12-11 2004-10-12 Bausch And Lomb, Inc. Contact lens and method for fitting and design
US20050174535A1 (en) * 2003-02-13 2005-08-11 Lai Shui T. Apparatus and method for determining subjective responses using objective characterization of vision based on wavefront sensing
US6761454B2 (en) * 2002-02-13 2004-07-13 Ophthonix, Inc. Apparatus and method for determining objective refraction using wavefront sensing
US6988801B2 (en) 2003-03-25 2006-01-24 University Of Rochester Compact portable wavefront sensor
US7131727B2 (en) * 2003-06-30 2006-11-07 Johnson & Johnson Vision Care, Inc. Simultaneous vision emulation for fitting of corrective multifocal contact lenses
US8128606B2 (en) * 2003-07-03 2012-03-06 Hewlett-Packard Development Company, L.P. Ophthalmic apparatus and method for administering agents to the eye
CN100463646C (en) * 2003-11-14 2009-02-25 欧弗搜尼克斯股份有限公司 Ophthalmic binocular wavefront measurement system
US20050105044A1 (en) * 2003-11-14 2005-05-19 Laurence Warden Lensometers and wavefront sensors and methods of measuring aberration
WO2005048829A2 (en) * 2003-11-14 2005-06-02 Ophthonix, Inc. Ophthalmic binocular wavefront measurement system
ES2665536T3 (en) * 2004-04-20 2018-04-26 Alcon Research, Ltd. Integrated surgical microscope and wavefront sensor
US7387387B2 (en) * 2004-06-17 2008-06-17 Amo Manufacturing Usa, Llc Correction of presbyopia using adaptive optics and associated methods
WO2011047214A2 (en) * 2009-10-14 2011-04-21 Optimum Technologies, Inc. Portable retinal camera and image acquisition method
US9504376B2 (en) * 2009-12-22 2016-11-29 Amo Wavefront Sciences, Llc Optical diagnosis using measurement sequence
US8783871B2 (en) * 2010-04-22 2014-07-22 Massachusetts Institute Of Technology Near eye tool for refractive assessment
US10238282B2 (en) * 2011-09-30 2019-03-26 Johnson & Johnson Vision Care, Inc. Method and device for dosage and administration feedback
US20140268037A1 (en) * 2013-03-15 2014-09-18 Neuroptics, Inc. Intelligent headrest and ophthalmic examination and data management system
US9066683B2 (en) 2013-04-09 2015-06-30 Smart Vision Labs Portable wavefront aberrometer
JP6470746B2 (en) 2013-07-02 2019-02-13 マサチューセッツ インスティテュート オブ テクノロジー Apparatus and method for determining ophthalmic prescription
WO2015102703A1 (en) 2013-12-31 2015-07-09 Smart Vision Labs Portable wavefront aberrometer
US9427156B1 (en) 2015-08-27 2016-08-30 Ovitz Corporation Devices and methods for wavefront sensing and corneal topography

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5684561A (en) * 1992-05-26 1997-11-04 Daphne Eye Technologies Device and method for evaluation of refraction of the eye
US20040174495A1 (en) * 2001-06-05 2004-09-09 Adaptive Optics Associates, Inc. Method of and system for examining the human eye with a wavefront sensor-based ophthalmic instrument
US20030071969A1 (en) * 2001-08-31 2003-04-17 Levine Bruce M. Ophthalmic instrument with adaptive optic subsystem that measures aberrations (including higher order aberrations) of a human eye and that provides a view of compensation of such aberrations to the human eye
US20090002632A1 (en) * 2001-11-07 2009-01-01 Carl Zeiss Meditec Ag Method, device and arrangement for measuring the dynamic behavior of an optical system
US20080198331A1 (en) * 2004-07-19 2008-08-21 Massachusetts Eye & Ear Infirmary, A Massachusetts Corporation Ocular wavefront-correction profiling
US20080284979A1 (en) * 2007-05-17 2008-11-20 Visx, Incorporated System and Method for Illumination and Fixation with Ophthalmic Diagnostic Instruments

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
See also references of EP3016576A4 *

Cited By (12)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20160205298A1 (en) * 2015-01-09 2016-07-14 Smart Vision Labs Portable Fundus Camera
US20160198947A1 (en) * 2015-01-09 2016-07-14 Smart Vision Labs Portable Wavefront Aberrometer with Open Field Alignment Channel
JP2018501936A (en) * 2015-01-09 2018-01-25 スマート ヴィジョン ラブズ インコーポレイテッド Portable wavefront aberrometer with open field alignment channel
US10117574B2 (en) * 2015-01-09 2018-11-06 Smart Vision Labs, Inc. Portable wavefront aberrometer with open field alignment channel
US10492679B2 (en) * 2015-01-09 2019-12-03 Smart Vision Labs, Inc. Portable fundus camera
JP2017189617A (en) * 2016-04-14 2017-10-19 キヤノン株式会社 Imaging apparatus control method, computer-readable medium, and controller for controlling imaging apparatus
WO2017218539A1 (en) 2016-06-14 2017-12-21 Plenoptika, Inc. Tunable-lens-based refractive examination
US11096576B2 (en) 2016-06-14 2021-08-24 Plenoptika, Inc. Tunable-lens-based refractive examination
CN108392170A (en) * 2018-02-09 2018-08-14 中北大学 A kind of human eye follow-up mechanism and recognition positioning method for optometry unit
US20220338732A1 (en) * 2019-07-31 2022-10-27 Yoichiro Kobayashi Eyeball imaging device and diagnosis support system
WO2022236333A2 (en) 2021-05-06 2022-11-10 Plenoptika, Inc. Eye examination method and system
WO2023064946A1 (en) 2021-10-15 2023-04-20 Plenoptika, Inc. Dynamic retinal image quality

Also Published As

Publication number Publication date
KR101995877B1 (en) 2019-07-03
CN105578947A (en) 2016-05-11
EP3016575A4 (en) 2017-01-11
EP3539460A1 (en) 2019-09-18
KR20160027044A (en) 2016-03-09
CN105473056A (en) 2016-04-06
JP2016526985A (en) 2016-09-08
JP6448635B2 (en) 2019-01-09
WO2015003062A1 (en) 2015-01-08
EP3387985B1 (en) 2023-12-20
US20160128562A1 (en) 2016-05-12
CN105578947B (en) 2018-10-26
KR20160058748A (en) 2016-05-25
US10349830B2 (en) 2019-07-16
US9854965B2 (en) 2018-01-02
US20180116504A1 (en) 2018-05-03
EP3387985A1 (en) 2018-10-17
EP3016576A4 (en) 2017-01-18
JP6470746B2 (en) 2019-02-13
US10786150B2 (en) 2020-09-29
KR101995878B1 (en) 2019-07-03
US20160128566A1 (en) 2016-05-12
EP3016575A1 (en) 2016-05-11
CN105473056B (en) 2018-10-26
US20180078131A1 (en) 2018-03-22
EP3016576A1 (en) 2016-05-11
JP2016526984A (en) 2016-09-08

Similar Documents

Publication Publication Date Title
US10349830B2 (en) Apparatus and method of determining an eye prescription
US20220007939A1 (en) Apparatus and method for determining an eye property
US6439720B1 (en) Method and apparatus for measuring optical aberrations of the human eye
US10575725B2 (en) System and method for characterising eye-related systems
US20210089118A1 (en) Virtual reality instrument for the automatic measurement of refraction and aberrations of the eye
US11445904B2 (en) Joint determination of accommodation and vergence
WO2022224186A1 (en) Compact autocylinder compensation module for autorefractor and autorefractor with autocylinder compensation module
Davarinia et al. Total aberrations measurement with using the mathematical model of the eye

Legal Events

Date Code Title Description
WWE Wipo information: entry into national phase

Ref document number: 201480046065.8

Country of ref document: CN

121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 14820078

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 14900695

Country of ref document: US

ENP Entry into the national phase

Ref document number: 2016524353

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

WWE Wipo information: entry into national phase

Ref document number: 2014820078

Country of ref document: EP

ENP Entry into the national phase

Ref document number: 20167002229

Country of ref document: KR

Kind code of ref document: A