WO2022150448A1 - System and method to measure aberrations by imaging both the crescent and the halo of the crescent


Info

Publication number
WO2022150448A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
autorefractor
camera
eye
crescent
Prior art date
Application number
PCT/US2022/011400
Other languages
French (fr)
Inventor
David R. Williams
Pablo Artal
Silvestre Manzanera
Original Assignee
University Of Rochester
Universidad De Murcia
Priority date
Filing date
Publication date
Application filed by University Of Rochester, Universidad De Murcia filed Critical University Of Rochester
Publication of WO2022150448A1 publication Critical patent/WO2022150448A1/en

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/103Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining refraction, e.g. refractometers, skiascopes
    • A61B3/1035Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining refraction, e.g. refractometers, skiascopes for measuring astigmatism

Definitions

  • the application relates to objective measurements of an optical system including the human eye, particularly to measurement of aberrations of the optical system.
  • Refractive errors include myopia, hyperopia, and astigmatism.
  • Glasses, contact lenses, and refractive surgery are the methods most extensively used to correct vision.
  • a precise measurement of the refractive error (refraction) is needed before the prescription of the correction. The measurement typically includes a visit to an optician or ophthalmologist who measures the person’s refraction using subjective and/or objective methods. Objective methods usually make use of bulky, expensive instruments and subjective methods are time-consuming.
  • An autorefractor apparatus includes at least one source of flash illumination. At least one camera is mounted on or in a surface of a mobile device. The at least one camera includes at least one focusable lens. The at least one camera is disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination. At least one computer is programmed to run an autorefractor process to transform at least one image of an eye taken by the at least one camera into an eyewear prescription based on a two-dimensional light distribution of a crescent in the at least one image of light returning from a retina of the eye.
  • the two-dimensional image of the crescent in the pupil can be characterized by any of the parameters: baseline intensity, top intensity at the plateau, center of the circle, radius of the circle, radius of the plateau, and combinations thereof.
  • the autorefractor process can be based at least in part on a light intensity distribution where the crescent is substantially not visible in the at least one image and where a correction is less than about 1.5 diopter.
  • the autorefractor process can be based further on an age of a person autorefracted.
  • the autorefractor process can be based further on an ethnicity of a person autorefracted.
  • the autorefractor process can be based further on a skin color of a person autorefracted.
  • the skin color of the person autorefracted can be determined from the at least one image.
  • the color of the back of the eye or the color of the fundus of the eye of the person autorefracted can be determined from the at least one image.
  • the autorefractor apparatus can further include a self-calibration process based on RGB information of the at least one image.
  • the camera includes a smartphone RGB camera.
  • a method of measuring aberrations in an optical system using a smartphone autorefractor includes: providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, the at least one camera including at least one focusable lens, the at least one camera disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; acquiring at least one image of an eye of a person at a distance D; and transforming the at least one image to an eyewear prescription based on a two-dimensional light distribution of a crescent in the at least one image of light returning from a retina of the eye.
  • the two-dimensional image of the crescent in the pupil can be characterized by any of the parameters: baseline intensity, top intensity at the plateau, center of the circle, radius of the circle, radius of the plateau, and combinations thereof.
  • the step of transforming can include transforming the at least one image to an eyewear prescription based at least in part on a light intensity distribution where the crescent is substantially not visible in the at least one image and where a correction is less than about 1.5 diopter.
  • the step of transforming can include transforming the at least one image to an eyewear prescription corrected based on an age of a person autorefracted.
  • the step of transforming can include transforming the at least one image to an eyewear prescription corrected based on an ethnicity of a person autorefracted.
  • the step of transforming can further include transforming the at least one image to an eyewear prescription corrected based on a skin color of a person autorefracted.
  • the step of transforming can include transforming the at least one image to an eyewear prescription corrected based on a color of the back of the eye or a color of the fundus of the eye of a person autorefracted.
  • the method can further include the step of determining the skin color of the person autorefracted from the at least one image.
  • the method can further include the step of performing a self-calibration process based on RGB information of the at least one image.
  • the step of providing can include providing a smartphone RGB camera.
  • the step of acquiring can include taking a selfie with a smartphone camera.
  • the method can further include before the step of acquiring, the step of placing a lens between the eye and the camera to substantially eliminate the dead zone.
  • a method of measuring aberrations in an optical system using a smartphone autorefractor includes: providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, the at least one camera including at least one focusable lens, the at least one camera disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; acquiring at least one image of an eye of a person at a distance D; and transforming the at least one image to an eyewear prescription based at least in part on a provided ethnicity or a skin color read from the at least one image and a crescent in the at least one image of light returning from a retina of the eye.
  • a method of measuring aberrations in an optical system using a smartphone autorefractor includes: providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, the at least one camera including at least one focusable lens, the at least one camera disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; acquiring at least one image of an eye of a person at a distance D; and transforming the at least one image to an eyewear prescription based at least in part on a provided age of a patient and a crescent in the at least one image of light returning from a retina of the eye.
  • FIG. 1A is a drawing showing a person performing an autorefraction by selfie according to the Application by use of a smartphone;
  • FIG. 1B is a drawing showing a person performing an autorefraction by selfie with a different orientation of the smartphone;
  • FIG. 2 is a block diagram of an exemplary apparatus useful for performing a self-autorefraction process;
  • FIG. 3 is a flowchart showing an exemplary self-autorefraction process;
  • FIG. 4A is a drawing that illustrates photorefraction with a conventional photorefractor;
  • FIG. 4B is a drawing that illustrates the equivalence of a smartphone camera;
  • FIG. 5A is a drawing showing the visible crescent over a range of refractive errors of the human eye;
  • FIG. 5B is a drawing of the visible crescent over a range of refractive errors of the human eye, now including the scattered light of the halo of the crescent;
  • FIG. 6A is a drawing that represents a simulation showing how parameters change with the defocus of the prescription;
  • FIG. 6B shows the light intensity in a cross section through the crescent and halo, illustrating the transition zone where the intensity falls to a non-zero light distribution;
  • FIG. 7 is a drawing showing images and graphs of a location of a center of the plateau and defocus and a diameter of a circle suggested by a crescent;
  • FIG. 8A is a drawing in which the central image shows a pupil with a crescent;
  • FIG. 8B is a drawing showing the dashed line which outlines the circle suggested from the crescent on the pupil;
  • FIG. 9A is a drawing showing parameters of a 2D geometric model which can be used to transform at least one image of the pupil of the eye into an eyewear prescription;
  • FIG. 9B is a flow chart showing an exemplary process for transforming at least one image into an eyewear prescription;
  • FIG. 10 is a drawing showing a laboratory demonstration of determining a prescription based on a 2D intensity plot of the crescent;
  • FIG. 11 is a drawing which illustrates the dead zone;
  • FIG. 12 is a drawing showing images of the pupil of an eye where different refractive errors have been induced by means of trial lenses;
  • FIG. 13 is a drawing in which the error in the estimation of the refractive state of the eye is plotted versus the lightness of the iris;
  • FIG. 14 is a graph showing a photorefraction vs. HS-center error correction; and
  • FIG. 15 is a drawing which shows an example of an RGB camera collecting colored images of the pupil.
  • CRESCENT - Eccentric photorefraction measurements are typically based on a geometric shape of light reflected back from an illuminated human eye.
  • the light as reflected by the retina in the complete two-way passage of light through the eye creates a crescent shape in an image of the pupil of the illuminated eye.
  • An arc can be matched to the outer curve of the crescent to create a complete circle.
  • the center of that circle can be converted to a refractive measurement of the eye of a person and ultimately transformed into an optical prescription for eyewear.
  • the location of the crescent in the image moves (typically in a vertical direction with varying refractive aberration) from below the pupil to above the pupil.
  • Near zero refractive error, the circular opening of the pupil obscures the crescent in the image of light returning from the retina. Therefore, it is difficult or impossible to obtain an accurate prescription for a relatively small correction by use of the crescent method of eccentric photorefraction.
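The arc-to-circle step described above can be sketched numerically. In the following illustrative Python sketch (the function names and the calibration constant `K_DIOPTERS_PER_MM` are our assumptions, not values from the application), a circle is fitted by linear least squares to sampled points on the outer edge of the crescent, and the vertical offset of the circle's center from the pupil center is converted to a refraction estimate via a hypothetical, empirically determined slope:

```python
import numpy as np

def fit_circle(points):
    """Least-squares (Kasa) fit of a circle to 2-D points sampled on the
    outer arc of the crescent.  Solves x^2 + y^2 = A*x + B*y + C, so the
    centre is (A/2, B/2) and the radius is sqrt(C + cx^2 + cy^2)."""
    pts = np.asarray(points, dtype=float)
    x, y = pts[:, 0], pts[:, 1]
    A = np.column_stack([x, y, np.ones_like(x)])
    b = x ** 2 + y ** 2
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    cx, cy = sol[0] / 2.0, sol[1] / 2.0
    r = np.sqrt(sol[2] + cx ** 2 + cy ** 2)
    return cx, cy, r

# Hypothetical calibration: diopters per millimetre of vertical offset of
# the fitted circle centre from the pupil centre.  In practice this slope
# would be determined empirically for a given camera and working distance.
K_DIOPTERS_PER_MM = 1.0

def center_to_refraction(cy_circle, cy_pupil, k=K_DIOPTERS_PER_MM):
    """Convert the vertical centre offset (mm) to a refraction estimate (D)."""
    return k * (cy_circle - cy_pupil)
```

Because only an arc of the circle is visible through the pupil, the fit deliberately accepts points covering a partial arc.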
  • HALO OF THE CRESCENT - With the relatively sensitive imaging devices now common in regular smartphone cameras, we believe there may be enough useful light to make autorefraction measurements for a small correction in the crescent dead zone, where previously it was believed that there was no useful information in the absence of a visible crescent (or a large enough portion of a visible crescent).
  • A 2-dimensional (2D) image of the light distribution from the pupil, including shape and size information of the crescent and the halo of the crescent, could provide a complete range of human eyewear prescriptions based on one or more images taken by a smartphone, such as by taking a selfie.
  • INTRODUCTION - Autorefractors can objectively read a person’s prescription based on passing light from a light source through the pupils of a person’s eyes and back through various types of optical components, typically including a relatively complex arrangement of beam splitters, lenses, and light sensors.
  • Lensometers use similar technologies to measure the optical parameters of a lens, as compared to the aberrations of the eyes as measured by an autorefractor. Both autorefractors and lensometers typically cost several thousands of dollars.
  • Autorefractors are typically used as the starting point for providing individual persons with a prescription needed for eyeglasses, contact lenses, or refractive surgery.
  • the estimate provided by the autorefractor is then typically refined with a lengthy subjective procedure in which the practitioner asks the person which of two views of a letter chart produces the clearer view. This second step is time-consuming and not always accurate.
  • the new method and apparatus of the Application can replace the subjective procedure with an automated procedure in which the final refraction is determined by the autorefractor.
  • the new method and apparatus can also optionally include an algorithm that optimizes the refraction based on a learning algorithm, such as, for example, a deep layer learning algorithm.
  • the deep layer learning algorithm can be based on comparing the autorefractor raw data with the results of a large number of subjective refractions and using the latter to self-compute by the learning algorithm what the best estimate of the refraction is. This deep learning method allows the cumulative wisdom of many practitioners to guide the optimization of the autorefraction.
  • a new autorefractor according to the Application can operate with no additional optical components needed in the light path between the camera with lens or lens assembly (e.g. for a camera with detachable or interchangeable lenses) and the face of the person.
  • Suitable cameras include cameras with built-in, embedded, or integrated lens systems (e.g. a smartphone) and lens assemblies which attach to a camera body (e.g. a CCD or CMOS camera body with a lens coupling system).
  • a subject downloads an autorefractor “app” to their smartphone. Then, for example, the person might hold the smartphone at an approximate distance in front of their face so as to capture images of both eyes.
  • On a smartphone (or any other suitable camera), a soft button starts the process, although any suitable trigger can be used, ranging from a self-timer to use of one or more of the mechanical buttons of the smartphone.
  • The computer of the phone (one or more processors of the smartphone) and/or a remote computer accessed via a server performs the new method to determine an eyewear prescription from one or more smartphone camera images.
  • images can be acquired by any suitable imaging device, typically a camera, and any suitable support means, including, for example, by selfie (camera held by the person), physical mount including any suitable camera mount, such as any suitable tripod, mounting bracket, or a camera set on or against a stationary surface, camera held by another person, etc.
  • an autorefractor apparatus uses a camera, and a focusable lens.
  • FIG. 1A is a drawing showing a person performing an autorefraction by selfie according to the Application by use of a smartphone 100.
  • the person performing an autorefraction by selfie holds the camera (here smartphone 100) at any suitable, approximately constant distance D from their face.
  • the autorefractor process, e.g. a smartphone “App” running on smartphone 100, takes one or more images. Where a flash is used, the App also controls timing and firing of the flash. Generally, any manufacturer “red eye” correction mode is turned off while the autorefractor App is running.
  • FIG. 1B is a drawing showing a person performing an autorefraction by selfie with a different orientation of the smartphone.
  • main cameras are preferred at present because of the relationship between the cameras, lenses, and flash on the surface of the smartphone. It may also be possible to use the conventional forward-facing camera and flash. Where the screen is facing away from the person, voice prompts can be used to aid in timing of when the images are taken, and to give guidance on how far away and at what orientation to hold the smartphone.
  • FIG. 2 is a block diagram of an exemplary apparatus useful for performing a self-autorefraction process.
  • the camera 101 includes at least one focusable lens 103, and at least one source of illumination, e.g. flash 105.
  • At least one computer, e.g. processor 109, is programmed to run all or any part of an autorefractor process 107 to transform one or more images of a human eye into an eyeglass or a contact lens prescription based on pupil size and shape information in each of the at least two images.
  • the camera can be a smartphone camera.
  • the images can be taken with an App installed and running on the smartphone by taking a selfie (where the App causes at least two images to be taken at two different focal planes).
  • the autorefractor process can include a human eye pupil refraction model.
  • the autorefractor process can also include a model derived from a deep learning autorefractor or subjective eye measurements for each pupil.
  • Where the camera is a smartphone, there are no additional optical components needed or disposed between the camera and a face of a person undergoing autorefraction.
  • Any part of the autorefractor process can optionally reside on a separate computer, e.g. computer server 201, which can be communicatively coupled to smartphone 100 by any suitable wired or wireless communication path 203 including, for example, via the Internet.
  • FIG. 3 is a flowchart showing an exemplary self-autorefraction process.
  • The method of autorefraction can be performed by a patient themselves with a smartphone taking a selfie.
  • a method of measuring aberrations in an optical system using a smartphone autorefractor includes: A) providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, the at least one camera including at least one focusable lens, the at least one camera disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; B) acquiring at least one image of an eye of a person at a distance D; and C) transforming the at least one image to an eyewear prescription based on a two-dimensional light distribution of a crescent in the at least one image of light returning from a retina of the eye.
  • Eccentric photorefraction can make use of a camera having half of its aperture shielded and a point-like light source located at the plane of the camera and at some distance (eccentricity) from the optical axis created by the eye and the center of the camera.
  • the camera is focused on the eye and the system aperture collects only those rays diffusely reflected from the retina which cross the pupil through certain areas depending on the refractive state of the eye. Then, a “crescent” is observed either at the upper part of the pupil (myopic eye) or the lower part (hyperopic eye).
  • FIG. 4A and FIG. 4B are illustrations of photorefraction for a myopic eye using (FIG. 4A) a conventional photorefractor and (FIG. 4B) a smartphone.
  • FIG. 4A shows the optical configuration for a myopic eye, and FIG. 4B illustrates the equivalence of a smartphone camera.
  • the lines 403 represent light rays from the retina, and only those between the upper part of the pupil and the thick line 405 cross the camera’s aperture, which creates the appearance of the crescent on the pupil.
  • At least one image of the pupil taken with a photorefractor can provide a measurement of the spherical error in the direction defined by the line connecting the light source and the center of the camera’s aperture and, by symmetry, in any other direction.
  • the astigmatic error in power lacks this symmetry, although it can be characterized from the difference in power measured along at least three different meridians.
  • Another alternative with modern smartphones equipped with multiple cameras is to record 3 or more images simultaneously. The relative position of each camera with respect to the flash is different, and thus information on astigmatism is obtained without the need to change the orientation of the smartphone itself. For example, two images taken at two orientations with two lenses at different locations on the smartphone can provide two of the three needed images for an astigmatism measurement, where a third image at a third orientation can be provided by then changing the orientation of the smartphone (rotation) and taking at least one more image.
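As a numerical illustration of recovering astigmatism from three meridional readings, the sketch below (our own; the function name is an assumption) solves the standard optometric power-vector relation P(θ) = M + J0·cos 2θ + J45·sin 2θ for the three power measurements and converts the result to a sphere/cylinder/axis prescription:

```python
import numpy as np

def refraction_from_meridians(powers, angles_deg):
    """Recover power-vector components (M, J0, J45) from photorefraction
    power readings along three (or more) meridians, then convert them to
    a sphere / cylinder (minus-cylinder convention) / axis prescription.
    Uses P(theta) = M + J0*cos(2*theta) + J45*sin(2*theta)."""
    th = np.radians(np.asarray(angles_deg, dtype=float))
    A = np.column_stack([np.ones_like(th), np.cos(2 * th), np.sin(2 * th)])
    M, J0, J45 = np.linalg.lstsq(A, np.asarray(powers, dtype=float),
                                 rcond=None)[0]
    cyl = -2.0 * np.hypot(J0, J45)          # minus-cylinder magnitude
    sph = M - cyl / 2.0                      # spherical equivalent relation
    axis = np.degrees(0.5 * np.arctan2(J45, J0)) % 180.0
    return sph, cyl, axis
```

With more than three meridians the same least-squares solve simply becomes overdetermined, which is one reason multi-camera smartphones are attractive here.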
  • Sensitivity - Sensitivity of our new system and method has been increased to substantially eliminate the “dead zone”, including by optimizing smartphone distance. Sensitivity depends on the working distance: the longer the working distance, the greater the gain. However, increasing the working distance also means a loss in resolution in the image to process. A good compromise for an exemplary iPhone XS in lab testing was found to be a working distance of around 66 cm.
  • FIG. 5A is a drawing showing theoretical modeling of the crescent that is visible in images of the pupil of the human eye, over a range of refractive errors.
  • the dashed line circle represents the opening of the pupil of the eye.
  • The gradient image of reflected light caused by illumination of the eye is also roughly circular. However, as obscured by the opening of the pupil, an image of the pupil shows only the portion of the reflected light which is visible through the pupil.
  • the mask of the pupil combined with the aperture of the camera causes that reflected circle to appear in an image of the pupil of the human eye as a crescent shape (the portion of the reflected circle of light in the dashed line circle represents the visible part of the crescent).
  • FIG. 5A is based on the assumption that the crescent observed within the pupil is only the visible part of a circular plateau limited by an area of decreasing intensity or transition zone to zero intensity.
  • the diameter of the circle suggested by the crescent, its location relative to the pupil center and the width of the transition zone are parameters that depend on the prescription of the patient.
  • In FIG. 5A, by -2 diopters (right side), the circle of reflected light is substantially masked by the iris of the eye.
  • the circle of directly reflected light from the retina is below the opening of the pupil.
  • there is still substantial scattered light (not shown in this -2 Diopter view), largely from the halo of light reflected within the eye and within structures of the eye (e.g. the fundus, lens, cornea, etc. of the eye).
  • FIG. 6A shows the light intensity in a cross section through the crescent, illustrating the transition zone where the intensity falls to zero on the assumption that there is no scattered light inside the eye.
  • FIG. 7 is a drawing showing images and graphs of a location of a center of the plateau and defocus and a diameter of a circle suggested by a crescent. This figure shows that not only the location, but also the diameter of the crescent depends on the refractive error, illustrating the importance of 2D information about its size and shape in optimizing the estimate of the refractive error.
  • FIG. 8A is a drawing in which the central image shows a pupil with a crescent. On the left is shown how only one section (1-D) of the intensity profile is conventionally used. The plot on the right illustrates how the whole 2D information throughout the pupil is used and fitted to a model according to the new system and method of the Application.
  • FIG. 8B is a drawing which shows the dashed line which outlines the circle suggested from the crescent on the pupil.
  • the location of the center and the diameter are related to the refractive error of the eye and can be used to estimate it.
  • the image of the crescent in the pupil is a blurred and vignetted version of a complete, circular image which appears on the retina caused by an external illumination of the eye (e.g. the flash for a smartphone eccentric photorefraction measurement).
  • FIG. 5B is a drawing of the visible crescent over a range of refractive errors of the human eye, now including the scattered light of the halo of the crescent.
  • FIG. 6B shows the light intensity in a cross section through the crescent and halo, illustrating the transition zone where the intensity falls to a non-zero light distribution. This is based on the realization that there is a useful 2D scattered light distribution inside the pupil of the eye which, according to our new work, could be used to read a prescription for eyewear even where the crescent is mostly or substantially completely blocked and not present in the opening of the pupil of the eye.
  • FIG. 9A is a drawing showing parameters of a 2D geometric model which can be used to transform at least one image of the pupil of the eye into an eyewear prescription.
  • the 2D image of the crescent in the pupil can be characterized by the parameters: a: baseline intensity; b: top intensity at the plateau; C: center of the circle; Rc: radius of the circle; Rp: radius of the plateau; and any suitable combinations thereof.
  • FIG. 9B is a flow chart showing an exemplary process for transforming at least one image into an eyewear prescription.
  • the flow chart of FIG. 9B shows exemplary steps to obtain a prescription from at least one image of the pupil.
  • the light intensity distribution from at least one image is combined with the 2D model to obtain the values of the parameters which better fit the measured intensity.
  • the prescription can be estimated in three different ways: a) using parameter C (center of the circle), b) using parameter Rc (radius of the circle), and c) using parameters Rc and Rp (width of the transition), and any combination thereof.
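A minimal numerical sketch of the 2D geometric model is given below. This is our own illustrative implementation: the patent names the parameters (a, b, C, Rc, Rp) but not a functional form, so the linear fall-off across the transition zone between Rp and Rc is an assumption:

```python
import numpy as np

def crescent_model(xx, yy, a, b, cx, cy, Rc, Rp):
    """Radial intensity model of the crescent and its halo:
    a plateau of intensity b out to radius Rp around the centre C=(cx, cy),
    falling linearly to the baseline intensity a at the outer radius Rc
    (the transition zone), and staying at the baseline a beyond Rc."""
    r = np.hypot(xx - cx, yy - cy)
    # frac is 1 inside the plateau, 0 outside the circle, linear in between.
    frac = np.clip((Rc - r) / (Rc - Rp), 0.0, 1.0)
    return a + (b - a) * frac
```

Masking this model image by the pupil opening reproduces the visible crescent; the portion outside the pupil is the part that the 1-D methods discard.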
  • EXAMPLE Prescription based on a 2D intensity plot:
  • FIG. 10 is a drawing showing a laboratory demonstration of determining a prescription based on a 2D intensity plot of the crescent.
  • the laboratory head fixture would not be used when the method is performed by smartphone selfie.
  • a region of interest which includes the eye’s pupil to be measured is extracted (2).
  • the area affected by the reflection of the flash on the cornea was bounded and discarded (3).
  • the pupil should be segmented to obtain the location of the center and the radius (4).
  • all of the intensity values of the pixels within the pupil are taken as the input for the algorithm which tries to fit the data to the 2D shape expected by the geometrical model (5).
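Steps (4)-(5) above can be sketched as a simple grid-search fit. This is our illustrative implementation, not the application's algorithm: for each candidate geometry (centre, plateau radius Rp, circle radius Rc), the baseline and plateau intensities follow in closed form by linear least squares, because the model intensity I = a·(1-f) + b·f is linear in (a, b) once the geometric fraction f is fixed:

```python
import numpy as np
from itertools import product

def fit_crescent(img, mask, centers, radii):
    """Fit the 2-D crescent model to the pixel intensities inside the
    pupil mask.  Grid-searches candidate centres and radii; for each
    geometry the intensities (a, b) are solved by linear least squares.
    Returns the best-fitting parameter set as a dict."""
    ys, xs = np.nonzero(mask)                 # pupil pixel coordinates
    vals = img[ys, xs].astype(float)          # measured intensities
    best = None
    for (cx, cy), Rp, Rc in product(centers, radii, radii):
        if Rc <= Rp:                          # transition zone must be outward
            continue
        r = np.hypot(xs - cx, ys - cy)
        f = np.clip((Rc - r) / (Rc - Rp), 0.0, 1.0)
        A = np.column_stack([1.0 - f, f])     # model is a*(1-f) + b*f
        (a, b), *_ = np.linalg.lstsq(A, vals, rcond=None)
        err = np.sum((A @ np.array([a, b]) - vals) ** 2)
        if best is None or err < best[0]:
            best = (err, dict(a=a, b=b, cx=cx, cy=cy, Rp=Rp, Rc=Rc))
    return best[1]
```

A production fit would refine this with a continuous optimizer, but the coarse search already shows how the whole 2D intensity distribution, not just one cross section, constrains the parameters.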
  • Another way to substantially eliminate the dead zone is to place a trial lens before the eye, which will restore the view of the crescent.
  • FIG. 11 is a drawing which illustrates the dead zone.
  • the black circles represent the pupil, and the lighter shapes within the pupil represent the crescent for different values of defocus of the refractive error. As defocus increases, the size of the crescent becomes larger, but there is a range (the dead zone) where the crescent is not visible.
  • FIG. 12 is a drawing showing images of the pupil of an eye where different refractive errors have been induced by means of trial lenses.
  • the top row of FIG. 12 shows images of a pupil in the dead zone for different values of defocus.
  • the bottom row of FIG. 12 shows images of a pupil in the dead zone for different values of defocus.
  • EFFECT OF AND CORRECTION FOR ETHNICITY - The ethnicity of the subject can influence the refractive error obtained from eccentric photorefraction, though the explanation for this effect has not been clearly demonstrated. We realized that variations in the amount of light reflected from the back of the eye can be attributed to, and corrected for, by factoring in the effects of melanin as related to ethnicity and skin color. Variation by ethnicity is believed to be caused at least in part, if not predominantly, by variations in the spectral reflectance of the fundus of the eye. Ethnicity, skin color, and iris color are likely surrogates for the color of the fundus.
  • the appearance of the crescent in the pupil is a consequence of the back reflection of light onto the retina.
  • the fraction and chromaticity of the reflected light are mostly driven by the concentration of melanin, a compound which absorbs light and also affects the skin and iris color.
  • the relevance of this fact is that, along with the crescent, there is additional light on the image of the pupil which is a consequence of back scattered light from the inner surface of the ocular globe. This scattered light may affect the apparent position of the crescent and the estimation of the refractive error. This effect is more intense as the concentration of melanin decreases.
  • FIG. 13 is a drawing in which the error in the estimation of the refractive state of the eye is plotted versus the lightness of the iris. The correlation found is statistically significant, suggesting that ethnicity might play a role in eccentric photorefraction.
  • the plot in FIG. 13 shows a statistically significant correlation between the error in the estimation of the refractive state of the eye using eccentric photorefraction and the lightness of the iris.
  • We factor ethnicity as well as iris color into our 2-D model of eccentric photorefraction to improve the accuracy of the method.
  • Ethnicity and/or skin color can be factored into our system and method by any suitable technique, including, for example, information provided to the App by a user and/or a determination of skin color from one or more selfie images, including images of the eye and surrounding skin.
  • EXAMPLE Prescription corrected by an error plot:
  • FIG. 14 is a graph showing a photorefraction vs HS - center error correction.
  • a correction may need to be made to take into account other factors not considered in the model or uncertainties in the estimation of the parameters describing the model.
  • This correction can be, for example, obtained after a comparison of the eccentric photorefraction measurements with the Hartmann-Shack wavefront sensor.
  • FIG. 14 illustrates an exemplary comparison based on such a correction in a 1:1 plot, revealing that the correction to apply is simply an offset of 0.7 D. For example, an initial value of -4 D would yield a final prescription of -3.3 D.
  • the error corrections can be based on experientially derived databases or any suitable known and determined corrections for factors such as, for example, aging, ethnicity, skin color, etc.
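The offset correction described above can be sketched as a simple calibration against a reference instrument. The paired readings below are made-up examples, not data from the Application:

```python
# Illustrative sketch of an additive error correction derived from
# paired readings, in the spirit of the FIG. 14 comparison.

def fit_offset(photorefraction, hartmann_shack):
    """Mean difference between reference (Hartmann-Shack) and
    eccentric-photorefraction readings, in diopters."""
    diffs = [hs - pr for pr, hs in zip(photorefraction, hartmann_shack)]
    return sum(diffs) / len(diffs)

def apply_offset(measured_diopters, offset_diopters):
    """Correct a raw photorefraction estimate with the fitted offset."""
    return measured_diopters + offset_diopters
```

A slope could also be fit when the 1:1 comparison shows more than a pure offset; the mean difference suffices for a constant offset such as the 0.7 D of FIG. 14.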
  • PUPIL EDGE SENSING We have previously demonstrated a pupil edge sensor that was able to successfully refract the eye in the benchtop prototype. In this approach, two or more images of the light returning from a flash on the retina are acquired in different focal planes near the pupil of the eye. As the light emerges from the pupil, its spatial distribution evolves in different ways depending on the refractive state of the eye, providing an opportunity to compute the refractive error by comparing images at different focal planes with each other. Pupil edge sensing can optionally be used to supplement eccentric photorefraction by smartphone systems and methods as described hereinabove.
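One simplified way to illustrate the pupil edge comparison is to measure how the spread of the pupil-edge intensity profile changes between focal planes. This is an illustrative sketch of the idea, not the benchtop algorithm:

```python
# Simplified illustration: quantify the pupil-edge spread in a 1-D
# intensity profile and compare it across two focal planes. How the
# spread changes between planes carries information about the
# refractive state of the eye.

def edge_spread(profile):
    """10%-90% rise width, in samples, of a rising edge profile."""
    lo, hi = min(profile), max(profile)
    t10 = lo + 0.1 * (hi - lo)
    t90 = lo + 0.9 * (hi - lo)
    i10 = next(i for i, v in enumerate(profile) if v >= t10)
    i90 = next(i for i, v in enumerate(profile) if v >= t90)
    return i90 - i10

def compare_planes(profile_a, profile_b):
    """Positive result: the edge is more spread out in plane B than in plane A."""
    return edge_spread(profile_b) - edge_spread(profile_a)
```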
  • SELF-CALIBRATION BASED ON RGB SPECTRAL INFORMATION - Chromatic information in the one or more images of the eye can be used to remove some of the individual differences in estimates of refraction by using the three spectral bands (RGB) available in the color image.
  • Subject-to-subject variability in the accuracy of autorefractor measurements of the aberrations of the eye causes errors in the measurement results.
  • the reasons for the variability are not yet well understood. The variability is believed to be related to one of, or more likely a combination of, variations in the physical structure of the fundus of the eye (back of the eye) and individual variations in the chromatic aberration of the eye, more specifically longitudinal chromatic aberration (LCA). That is, there are small variations in the optical power of the individual eye as a function of wavelength.
  • Parameters of the eccentric photorefracted crescent, particularly the location of the crescent, are known to vary as a function of wavelength for each individual eye. There is an average variation of crescent position as a function of wavelength for the human eye, and then variations from that average crescent location as a function of wavelength for each individual eye.
  • While the approach of Appendix IV is to compare the location of the crescent of an individual eye in RGB images, it will be understood by those skilled in the art that other comparisons at different wavelengths can be made, such as, for example, the shape of the crescent curve and the intensity gradient at different wavelengths.
  • the new correction can be used with any autorefractor measurement system and method which can provide images of the crescent in at least two different wavelengths.
  • the at least two different wavelengths are RGB for a mobile device imager.
  • Other wavelengths as available with any suitable imager or imaging technique can also be used.
  • the new correction can be used beyond the limited context of mobile device applications including the specific smartphone autorefractometry examples of the Application.
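As a rough sketch of comparing crescent location at two wavelengths, the intensity-weighted row centroid of each color channel can be compared. The centroid measure and the upstream pupil segmentation are assumptions for illustration, not the specific method of Appendix IV:

```python
# Rough sketch of comparing crescent location between two spectral
# channels via the intensity-weighted row centroid of each channel.

def crescent_row_centroid(channel):
    """Intensity-weighted mean row index of the bright crescent.
    `channel` is a 2-D list of pixel intensities (one color plane,
    already cropped to the pupil)."""
    total = 0.0
    weighted = 0.0
    for row_index, row in enumerate(channel):
        row_sum = sum(row)
        total += row_sum
        weighted += row_index * row_sum
    return weighted / total

def channel_shift(channel_a, channel_b):
    """Vertical crescent displacement (in rows) between two channels."""
    return crescent_row_centroid(channel_b) - crescent_row_centroid(channel_a)
```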
  • the slope of the intensity profile of the illuminated pupil, which is the input used to estimate refraction, depends not only on refraction but also on a number of other parameters, such as the overall brightness of the image or the eccentricity of the light source, to name a few. Some of these factors are subject-dependent, affecting the precision of the method.
  • Isolation of the effect of refraction may be achieved by introducing known amounts of defocus and measuring the effect on the slope, producing a self-calibration.
  • this approach may be less practical because the method uses multiple images and the induction of known amounts of defocus.
  • FIG. 15 is a drawing which shows an example of a camera (e.g. a smartphone camera) collecting colored images of the pupil.
  • This is an example of an implementation of the self-calibration procedure using spectral information. A colored image of the pupil is taken and separated into its R, G, B components. Between each of them there is a refractive error given by the LCA of the eye. Knowing the slopes and the difference in focus, a self-calibration may be performed. (The graph of FIG. 15, which shows the LCA, was adapted from Thibos et al., Applied Optics, 31, 19, 1992.)
  • the spectral information is selected by using the three separate channels (R, G, B) which constitute the RGB color space. Each image provides a slope and the difference of focus between images is given by the standard LCA found in humans.
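A minimal sketch of the self-calibration follows: the slope measured in each channel is regressed against that channel's known LCA defocus, and the zero crossing of the fitted line gives the refraction. The LCA offsets below are placeholder values for illustration, not numbers from the Application or from the standard LCA curve:

```python
# Minimal self-calibration sketch using the three channel slopes and an
# assumed standard LCA. Offsets are placeholders, in diopters vs. green.
LCA_OFFSETS_D = {"R": -0.35, "G": 0.0, "B": 0.55}

def fit_line(xs, ys):
    """Least-squares fit of ys ~ gain * xs + intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    gain = sxy / sxx
    return gain, my - gain * mx

def self_calibrated_refraction(channel_slopes):
    """channel_slopes: {"R": slope, "G": slope, "B": slope}. Returns the
    defocus (in diopters) at which the fitted slope would be zero."""
    xs = [LCA_OFFSETS_D[c] for c in "RGB"]
    ys = [channel_slopes[c] for c in "RGB"]
    gain, intercept = fit_line(xs, ys)
    return -intercept / gain
```

The per-subject gain recovered by the fit is what removes the subject-dependent factors from the slope-to-diopters conversion.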
  • SMARTPHONES INCLUDING TABLETS, NOTEBOOKS, ETC.
  • Most modern computing devices with a camera may be suitable to run the new process according to the Application.
  • considerations include flash, flash intensity, flash rate, distance of the flash from the camera lens, flash color spectrum, the ability for an App to commandeer and control lens focus, and the control or suppression of smart device “red eye” correction modes.
  • Most modern cameras, including many variable focus cameras, can be configured to measure aberrations in an optical system according to the new methods of the Application.
  • Other parameters of interest to the autorefractor processes described hereinabove include burst speeds and/or video speeds of the exemplary smart phone camera.
  • For example, from our initial investigations into running the new process on a smartphone, sufficient camera speed has likely been available since the Samsung Galaxy S7 and Apple iPhone 7 models. Such smart phone camera features have been continuously improved every model year since, up to and including, for example, recent iPhone and Galaxy models.
  • the new methods described hereinabove can be generally used with any suitable type of corrective eyewear including, for example, contact lenses, intraocular lenses, and spectacles. These new methods can also be used with refractive surgery, which can be used to correct the vision of a person’s eyes through any suitable refractive surgery medical procedures. Generally, such surgical procedures are performed on an out-patient basis by a medical professional, where the person undergoing the treatment is a patient.
  • Firmware and/or software for an autorefractor process, a human eye pupil refraction model, and/or a deep learning comparison of pupil edge/sizes in different focal planes and corresponding autorefractor or subjective eye measurements for the pupil as well as any other data, models, and/or processes described hereinabove can be supplied on a computer readable non-transitory storage medium.
  • a computer readable non-transitory storage medium as non-transitory data storage includes any data stored on any suitable media in a non-fleeting manner.
  • Such data storage includes any suitable computer readable non-transitory storage medium, including, but not limited to hard drives, non-volatile RAM, SSD devices, CDs, DVDs, etc.


Abstract

An autorefractor apparatus includes at least one source of flash illumination. At least one camera is mounted on or in a surface of a mobile device. The at least one camera includes at least one focusable lens. The at least one camera is disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination. At least one computer is programmed to run an autorefractor process to transform at least one image of an eye taken by the at least one camera into an eyewear prescription based on a two-dimensional light distribution of a crescent in the at least one image of light returning from a retina of the eye. Methods of measuring aberrations in an optical system smartphone autorefractor are also described.

Description

SYSTEM AND METHOD TO MEASURE ABERRATIONS BY IMAGING BOTH THE CRESCENT AND THE HALO OF THE CRESCENT
CROSS-REFERENCE TO RELATED APPLICATIONS
[0001] This application claims priority to and the benefit of co-pending U.S. provisional patent application Serial No. 63/135,037, MEASURING ABERRATIONS IN AN OPTICAL SYSTEM SMARTPHONE AUTOREFRACTOR, filed January 8, 2021, which application is incorporated herein by reference in its entirety.
FIELD OF THE APPLICATION
[0002] The application relates to objective measurements of an optical system including the human eye, particularly to measurement of aberrations of the optical system.
BACKGROUND
[0003] Measurement of aberrations in optical systems is important across a number of technical fields and applications.
[0004] For example, with regard to human eyesight, refractive errors (myopia, hyperopia and astigmatism) are the main reason for blurred vision in the population. Glasses, contact lenses, and refractive surgery are the methods most extensively used to correct vision. [0005] A precise measurement of the refractive error (refraction) is needed before the prescription of the correction. The measurement typically includes a visit to an optician or ophthalmologist who measures the person’s refraction using subjective and/or objective methods. Objective methods usually make use of bulky, expensive instruments and subjective methods are time-consuming.
[0006] It is also important to measure aberrations as corrections in various types of optical lens systems, including for example, the lenses of eyeglasses.
SUMMARY
[0007] An autorefractor apparatus includes at least one source of flash illumination. At least one camera is mounted on or in a surface of a mobile device. The at least one camera includes at least one focusable lens. The at least one camera is disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination. At least one computer is programmed to run an autorefractor process to transform at least one image of an eye taken by the at least one camera into an eyewear prescription based on a two-dimensional light distribution of a crescent in the at least one image of light returning from a retina of the eye. The two-dimensional image of the crescent in the pupil can be characterized by any of the parameters: baseline intensity, top intensity at the plateau, center of the circle, radius of the circle, radius of the plateau, and combinations thereof.
[0008] The autorefractor process can be based at least in part on a light intensity distribution where the crescent is substantially not visible in the at least one image and where a correction is less than about 1.5 diopter. The autorefractor process can be based further on an age of a person autorefracted. The autorefractor process can be based further on an ethnicity of a person autorefracted. The autorefractor process can be based further on a skin color of a person autorefracted. The skin color of the person autorefracted can be determined from the at least one image. The color of the back of the eye or the color of the fundus of the eye of the person autorefracted can be determined from the at least one image.
[0009] The autorefractor apparatus can further include a self-calibration process based on RGB information of the at least one image.
[0010] The camera includes a smartphone RGB camera.
[0011] A method of measuring aberrations in an optical system smartphone autorefractor includes: providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, the at least one camera including at least one focusable lens, the at least one camera disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; acquiring at least one image of an eye of a person at a distance D; and transforming the at least one image to an eyewear prescription based on a two-dimensional light distribution of a crescent in the at least one image of light returning from a retina of the eye. The two-dimensional image of the crescent in the pupil can be characterized by any of the parameters: baseline intensity, top intensity at the plateau, center of the circle, radius of the circle, radius of the plateau, and combinations thereof.
[0012] The step of transforming can include transforming the at least one image to an eyewear prescription based at least in part on a light intensity distribution where the crescent is substantially not visible in the at least one image and where a correction is less than about 1.5 diopter.
[0013] The step of transforming can include transforming the at least one image to an eyewear prescription corrected based on an age of a person autorefracted. The step of transforming can include transforming the at least one image to an eyewear prescription corrected based on an ethnicity of a person autorefracted. The step of transforming can further include transforming the at least one image to an eyewear prescription corrected based on a skin color of a person autorefracted. The method of transforming can include transforming the at least one image to an eyewear prescription corrected based on a color of the back of the eye or a color of the fundus of the eye of a person autorefracted.
[0014] The method can further include the step of determining the skin color of the person autorefracted is determined from the at least one image.
[0015] The method can further include the step of performing a self-calibration process based on RGB information of the at least one image.
[0016] The step of providing can include providing a smartphone RGB camera.
[0017] The step of acquiring can include taking a selfie with a smartphone camera.
[0018] The method can further include before the step of acquiring, the step of placing a lens between the eye and the camera to substantially eliminate the dead zone.
[0019] A method of measuring aberrations in an optical system smartphone autorefractor includes: providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, the at least one camera including at least one focusable lens, the at least one camera disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; acquiring at least one image of an eye of a person at a distance D; and transforming the at least one image to an eyewear prescription based at least in part on a provided ethnicity or a skin color read from the at least one image and a crescent in the at least one image of light returning from a retina of the eye.
[0020] A method of measuring aberrations in an optical system smartphone autorefractor includes: providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, the at least one camera including at least one focusable lens, the at least one camera disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; acquiring at least one image of an eye of a person at a distance D; and transforming the at least one image to an eyewear prescription based at least in part on a provided age of a patient and a crescent in the at least one image of light returning from a retina of the eye.
[0021] The foregoing and other aspects, features, and advantages of the application will become more apparent from the following description and from the claims.
BRIEF DESCRIPTION OF THE DRAWINGS
[0022] The features of the application can be better understood with reference to the drawings described below, and the claims. The drawings are not necessarily to scale, emphasis instead generally being placed upon illustrating the principles described herein. In the drawings, like numerals are used to indicate like parts throughout the various views.
[0023] FIG. 1A is a drawing showing a person performing an autorefraction by selfie according to the Application by use of a smartphone;
[0024] FIG. IB is a drawing showing a person performing an autorefraction by selfie with a different orientation of the smartphone;
[0025] FIG. 2 is a block diagram of an exemplary apparatus useful for performing a self-autorefraction process;
[0026] FIG. 3 is a flowchart showing an exemplary self-autorefraction process;
[0027] FIG. 4A is a drawing that illustrates photorefraction with a smartphone and the comparison with panel;
[0028] FIG. 4B is a drawing that illustrates the equivalence of a smartphone camera;
[0029] FIG. 5A is a drawing showing the visible crescent over a range of refraction aberrations of the human eye; [0030] FIG. 5B is a drawing of the visible crescent over a range of refractive errors of the human eye, now including the scattered light of the halo of the crescent;
[0031] FIG. 6A is a drawing that represents a simulation showing how parameters change with the defocus of the prescription;
[0032] FIG. 6B shows the light intensity in a cross section through the crescent and halo, illustrating the transition zone where the intensity falls to a non-zero light distribution; [0033] FIG. 7 is a drawing showing images and graphs of a location of a center of the plateau and defocus and a diameter of a circle suggested by a crescent;
[0034] FIG. 8A is a drawing in which the central image shows a pupil with a crescent;
[0035] FIG. 8B is a drawing which shows the dashed line which outlines the circle suggested from the crescent on the pupil;
[0036] FIG. 9A is a drawing showing parameters of a 2D geometric model which can be used to transform at least one image of the pupil of the eye into an eyewear prescription; [0037] FIG. 9B is a flow chart showing an exemplary process for transforming at least one image into an eyewear prescription;
[0038] FIG. 10 is a drawing showing a laboratory demonstration of determining a prescription based on a 2D intensity plot of the crescent;
[0039] FIG. 11 is a drawing which illustrates the dead zone;
[0040] FIG. 12 is a drawing showing images of the pupil of an eye where different refractive errors have been induced by means of trial lenses;
[0041] FIG. 13 is a drawing in which the error in the estimation of the refractive state of the eye is plotted versus the lightness of the iris;
[0042] FIG. 14 is a graph showing a photorefraction vs HS - center error correction;
[0043] and
[0044] FIG. 15 is a drawing which shows an example of a RGB camera collecting colored images of the pupil.
DETAILED DESCRIPTION [0045] DEFINITIONS [0046] CRESCENT - Eccentric photorefraction measurements are typically based on a geometric shape of light reflected back from an illuminated human eye. The light, as reflected by the retina in the complete two-way passage of light through the eye, creates a crescent shape in an image of the pupil of the illuminated eye. An arc can be matched to the outer curve of the crescent to create a complete circle. The center of that circle can be converted to a refractive measurement of the eye of a person and ultimately transformed into an optical prescription for eyewear. The location of the crescent in the image moves (typically in a vertical direction with varying refractive aberration) from below the pupil to above the pupil. Unfortunately, near the center of this range, the crescent is obscured by the circular opening of the pupil. Therefore, it is difficult to impossible to obtain an accurate prescription for a relatively small correction by use of the crescent method of eccentric photorefraction.
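The arc-matching step in the definition above can be illustrated by passing a circle through three points sampled on the outer curve of the crescent, giving the center that is later converted to a refractive measurement. This is a textbook circumcircle construction, shown only as a sketch of the idea:

```python
# Sketch: circle through three edge points sampled on the crescent's
# outer arc; the fitted center is the quantity used downstream.
import math

def circle_through(p1, p2, p3):
    """Return (cx, cy, radius) of the circle through three points."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    d = 2 * (x1 * (y2 - y3) + x2 * (y3 - y1) + x3 * (y1 - y2))
    s1 = x1 * x1 + y1 * y1
    s2 = x2 * x2 + y2 * y2
    s3 = x3 * x3 + y3 * y3
    cx = (s1 * (y2 - y3) + s2 * (y3 - y1) + s3 * (y1 - y2)) / d
    cy = (s1 * (x3 - x2) + s2 * (x1 - x3) + s3 * (x2 - x1)) / d
    return cx, cy, math.hypot(x1 - cx, y1 - cy)
```

With more than three edge points, a least-squares circle fit would reduce sensitivity to noise in any one sample.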
[0047] HALO OF THE CRESCENT - By use of relatively sensitive imaging devices, as are common now in regular smartphone cameras, where previously it was believed that there was no useful information in the absence of a visible crescent or a large enough portion of a visible crescent, we believe there may be enough useful light to make autorefraction measurements for a small correction in the crescent dead zone. As described in more detail hereinbelow, a 2-dimensional (2D) image of the light distribution from the pupil, including shape and size information of the crescent and the halo of the crescent, could provide a complete range of human eyewear prescriptions based on one or more images taken by a smart phone, such as by taking a selfie.
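The 2D light distribution described here can be illustrated with a simple radial forward model using the parameters named in the Summary (baseline intensity, plateau intensity, circle center and radius, plateau radius). The linear transition between plateau and baseline is an assumption for illustration; the Application's actual model may differ:

```python
# Illustrative forward model of the 2D crescent-plus-halo distribution.
import math

def crescent_model(x, y, cx, cy, r_plateau, r_circle, i_base, i_top):
    """Model intensity at pixel (x, y)."""
    r = math.hypot(x - cx, y - cy)
    if r <= r_plateau:
        return i_top        # flat-topped plateau
    if r >= r_circle:
        return i_base       # baseline outside the halo
    t = (r - r_plateau) / (r_circle - r_plateau)
    return i_top + t * (i_base - i_top)  # linear falloff in the transition zone

def model_residual(image, params):
    """Sum of squared differences between an image and the model; a
    fitter would minimize this over the six parameters."""
    cx, cy, r_plateau, r_circle, i_base, i_top = params
    err = 0.0
    for yy, row in enumerate(image):
        for xx, value in enumerate(row):
            err += (value - crescent_model(
                xx, yy, cx, cy, r_plateau, r_circle, i_base, i_top)) ** 2
    return err
```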
[0048] INTRODUCTION - Autorefractors can objectively read a person’s prescription based on passing light from a light source through the pupils of a person’s eyes and back through various types of optical components, typically including a relatively complex arrangement of beam splitters, lenses, and light sensors. Lensometers use similar technologies to measure the optical parameters of a lens, as compared to the aberrations of the eyes as measured by an autorefractor. Both autorefractors and lensometers typically cost several thousands of dollars.
[0049] Autorefractors are typically used as the starting point for providing individual persons with a prescription needed for eyeglasses, contact lenses, or refractive surgery. The estimate provided by the autorefractor is then typically refined with a lengthy subjective procedure in which the practitioner asks the person which of two views of a letter chart produces the clearest view. This second step is time consuming and not always accurate.
[0050] The new method and apparatus of the Application can replace the subjective procedure with an automated procedure in which the final refraction is determined by the autorefractor. The new method and apparatus can also optionally include an algorithm that optimizes the refraction based on a learning algorithm, such as, for example, a deep layer learning algorithm.
[0051] Where optionally used, the deep layer learning algorithm can be based on comparing the autorefractor raw data with the results of a large number of subjective refractions and using the latter to self-compute by the learning algorithm what the best estimate of the refraction is. This deep learning method allows the cumulative wisdom of many practitioners to guide the optimization of the autorefraction.
[0052] As described hereinabove, relatively complex autorefractors are commercially available and in common use. Moreover, there have been some alternative solutions described, such as those based in part on a camera or smartphone camera; however, all of these solutions require additional optical components or optical assemblies disposed in the optical path between the camera and the person’s face.
[0053] A new autorefractor according to the Application can operate with no additional optical components needed in the light path between the camera with lens or lens assembly (e.g. for a camera with detachable or interchangeable lenses) and the face of the person. For brevity, except where there might be a difference in method, the cameras with built in, embedded, or integrated lens systems (e.g. a smartphone) as well as lens assemblies which attach to a camera body (e.g. a CCD or CMOS camera body with a lens coupling system) are referred to herein interchangeably as “lens” or “camera lens”.
[0054] Generally, for example, where a smartphone camera is used, a subject (person) downloads an autorefractor “app” to their smartphone. Then, for example, the person might hold the smartphone at an approximate distance from their face about in front of the face so as to capture images of both eyes. Alternatively, a smartphone (or any other suitable camera) can be placed at any suitable fixed distance from the person on any suitable mount (e.g. a stand, tripod, etc.). Typically, a soft button starts the process, although any suitable trigger can be used ranging from a self-timer to use of one or more of the mechanical buttons of the smartphone. As explained hereinbelow in more detail, either the computer of the phone (one or more processors of the smartphone) and/or a remote computer accessed by a server perform the new method to determine an eyewear prescription from one or more smartphone camera images.
[0055] Images of the pupil of the eye, taken at about the same physical distance from the person’s face - Such images can be acquired by any suitable imaging device, typically a camera, and any suitable support means, including, for example, by selfie (camera held by the person), physical mount including any suitable camera mount, such as any suitable tripod, mounting bracket, or a camera set on or against a stationary surface, camera held by another person, etc.
[0056] It is unimportant what parts of the method are performed in whole or in part near, at, or in the camera. For example, where a smartphone camera is controlled by an app which converts the smartphone into an ophthalmic quality autorefractor, camera control, including flash where used, and focus adjustments are most conveniently controlled by one or more processors of the smart phone itself. At least one image can be processed for information such as eccentric photorefraction crescent information either on the phone or the images can be sent to a remote computer, typically a remote server on the Internet. Finally, the prescription can be returned to and displayed by the App, where the imaged data of the person’s eyes is used to generate eyeglasses or contact lenses which can then be shipped to the person who made the measurement of their eyes by camera according to the new methods of the Application.
[0057] Summarizing our general cases, an autorefractor apparatus uses a camera, and a focusable lens.
[0058] EXEMPLARY SYSTEM - FIG. 1A is a drawing showing a person performing an autorefraction by selfie according to the Application by use of a smartphone 100. The person performing an autorefraction by selfie holds the camera (here smartphone 100) at any suitable, approximately constant distance D from their face. The autorefractor process (e.g. a smartphone “App”) running on smart phone 100 takes one or more images. Where a flash is used, the App also controls timing and firing of the flash. Generally, any manufacturer “red eye” correction mode is turned off while the autorefractor App is running. FIG. IB is a drawing showing a person performing an autorefraction by selfie with a different orientation of the smartphone. Note that the main cameras are preferred at present because of the relationship between the cameras, lenses, and flash on the surface of the smartphone. It may be possible to also use the conventional forward facing camera and flash. Where the screen is facing away from the person, voice prompts can be used to aid in timing of when the images are taken, and guidance on how far away and at what orientation to hold the smartphone.
[0059] FIG. 2 is a block diagram of an exemplary apparatus useful for performing a self-autorefraction process. The camera 101 includes at least one focusable lens 103, and at least one source of illumination, e.g. flash 105. At least one computer, e.g. processor 109, is programmed to run all or any part of an autorefractor process 107 to transform one or more images of a human eye into an eyeglass or a contact lens prescription based on a pupil size and shape information in each of the at least two images. The camera can be a smartphone camera. The images can be taken with an App installed and running on the smartphone by taking a selfie (where the App causes at least two images to be taken at two different focal planes). The autorefractor process can include a human eye pupil refraction model. The autorefractor process can also include a model derived from a deep learning autorefractor or subjective eye measurements for each pupil. Where the camera is a smartphone, there are no additional optical components needed nor disposed between the camera and a face of a person undergoing autorefraction. Any part of the autorefractor process can optionally reside on a separate computer, e.g. computer server 201, which can be communicatively coupled to smartphone 100 by any suitable wired or wireless communication path 203 including, for example, via the Internet.
[0060] EXEMPLARY METHOD - FIG. 3 is a flowchart showing an exemplary self-autorefraction process, a method of autorefraction which can be performed by patients themselves with a smartphone taking a selfie. According to the Application, a method of measuring aberrations in an optical system smartphone autorefractor includes: A) providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, the at least one camera including at least one focusable lens, the at least one camera disposed at a different location on a surface of the mobile device at a same or different distance from the at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; B) acquiring at least one image of an eye of a person at a distance D; and C) transforming the at least one image to an eyewear prescription based on a two-dimensional light distribution of a crescent in the at least one image of light returning from a retina of the eye.
[0061] ECCENTRIC PHOTOREFRACTION BY SMARTPHONE - Description of eccentric photorefraction and explanation of how the smart phone geometry with its offset flash and camera lends itself naturally to this method.
[0062] Eccentric photorefraction can make use of a camera having half of its aperture shielded and a point-like light source located at the plane of the camera and at some distance (eccentricity) from the optical axis created by the eye and the center of the camera. The camera is focused on the eye and the system aperture collects only those rays diffusely reflected on the retina which cross the pupil through certain areas depending on the refractive state of the eye. Then, a “crescent” is observed either at the upper part of the pupil (myopic eye) or the lower part (hyperopic eye). FIG. 4A shows the optical configuration for a myopic eye.
[0063] In a smartphone, the arrangement of the camera and the flash on the back of the device effectively produces the same result as an eccentric photorefraction setup. The role of the shield on half of the camera’s aperture in the case of standard photorefraction is played by the distance between the flash and the camera in the smartphone. In terms of optics, there is no difference at all between the two configurations and the final outcome is equivalent. [0064] FIG. 4A and FIG. 4B are illustrations of photorefraction for a myopic eye using
FIG. 4A a photorefractor, and FIG. 4B a smartphone. FIG. 4A is a drawing which illustrates photorefraction with a smartphone and the comparison with panel. FIG. 4B is a drawing which illustrates the equivalence of a smartphone camera. The lines 403 represent light rays from the retina and only those between the upper part of the pupil and the thick line 405 crosses the camera’ s aperture, which creates the appearance of the crescent on the pupil.
[0065] Spherical error in the eye is the consequence of an excess (myopia) or lack (hyperopia) of optical power in the optics of the eye relative to the position of the retina. This error in power is angularly symmetrical, so a measurement in any one direction is sufficient to assess it. At least one image of the pupil taken with a photorefractor can provide a measurement of the spherical error in the direction defined by the line connecting the light source and the center of the camera's aperture and, by symmetry, in any other direction.
[0066] In astigmatism, on the contrary, the error in power lacks this symmetry, although it can be characterized from the power measured along at least three meridians. One alternative with modern smartphones equipped with multiple cameras is to record three or more images simultaneously. The relative position of each camera with respect to the flash is different, so information on astigmatism is obtained without the need to change the orientation of the smartphone itself. For example, two images taken at two orientations with two lenses at different locations on the smartphone can provide two of the three needed images for an astigmatism measurement, and a third image at a third orientation can be provided by rotating the smartphone and taking at least one more image.
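As a non-limiting sketch, the three (or more) meridian measurements described above can be combined into a sphere/cylinder/axis estimate by fitting the classical power-vector form P(θ) = M + a·cos 2θ + b·sin 2θ. The Python below is illustrative only; function and variable names are not part of the disclosed system.

```python
import math
import numpy as np

def sphere_cyl_axis(meridians_deg, powers_diopters):
    """Fit P(theta) = M + a*cos(2*theta) + b*sin(2*theta) by least squares
    to powers measured along >= 3 meridians, then convert to the usual
    sphere / plus-cylinder / axis notation."""
    t = np.radians(meridians_deg)
    A = np.column_stack([np.ones_like(t), np.cos(2 * t), np.sin(2 * t)])
    (M, a, b), *_ = np.linalg.lstsq(A, np.asarray(powers_diopters, float),
                                    rcond=None)
    half_cyl = math.hypot(a, b)                       # |C| / 2
    cyl = 2.0 * half_cyl                              # plus-cylinder magnitude
    axis = 0.5 * math.degrees(math.atan2(-b, -a)) % 180.0
    sphere = M - half_cyl
    return sphere, cyl, axis
```

For instance, powers of -0.5, -0.5 and +0.5 D along the 0°, 60° and 90° meridians correspond to a sphere of -1 D with a +2 D cylinder at axis 30°.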
[0067] Sensitivity - The sensitivity of the new system and method has been increased to substantially eliminate the "dead zone", including by optimizing the smartphone working distance. Sensitivity depends on the working distance: the longer the working distance, the greater the gain. However, increasing the working distance also means a loss of resolution in the image to be processed. A good compromise for an exemplary iPhone XS in lab testing was found to be a working distance of around 66 cm.
[0068] We realized that the spatial offset of the light flash and the smartphone camera, designed to thwart red eye in photos of people, inadvertently establishes the necessary conditions for eccentric photorefraction. We have completed preliminary work which shows that eccentric photorefraction can be used for smartphone autorefraction. We have confirmed that, using only the phone with no additional instrumentation, it is possible to record the pupil light distribution in every subject. We have also estimated the spherical refractive error based on the slope of the light distribution across the pupil obtained from smartphone images.
[0069] Consistent with the results obtained by earlier practitioners of eccentric photorefraction, the spherical refraction is highly variable, with an accuracy no better than about 0.75 diopters. A review of the literature suggests that no one yet thoroughly understands the sources of this variability, nor has anyone been able to substantially eliminate it. As a consequence, eccentric photorefraction has been superseded by other, more accurate methods.

[0070] However, previous studies have overlooked important sources of information that have the potential to improve the accuracy of eccentric photorefraction, to the point that a smartphone camera and flash can be the basis for smartphone autorefraction. For example, to the best of our knowledge, until now no one had undertaken a complete ray-tracing analysis of the 2D pupil light distribution in eccentric autorefraction. Moreover, no one had attempted to incorporate spectral information into the eccentric photorefraction estimate despite its potential to self-calibrate the refraction as well as to mitigate the effect of individual differences in the spectral reflectance of the fundus. Based on theoretical modeling, we believe that by separating the color image into its RGB components, we can reliably obtain three independent spectral images of the pupil.
[0071] THE GEOMETRIC CRESCENT AS USED IN AUTOREFRACTION
[0072] FIG. 5A is a drawing showing theoretical modeling of the crescent that is visible in images of the pupil of the human eye over a range of refractive errors. The dashed-line circle represents the opening of the pupil of the eye. The image of light reflected back from the retina under flash illumination of the eye is also roughly circular. However, an image of the pupil shows only the portion of the reflected light which is visible through the pupil opening. The mask of the pupil, combined with the aperture of the camera, causes that reflected circle to appear in an image of the pupil of the human eye as a crescent shape (the portion of the reflected circle of light inside the dashed-line circle is the visible part of the crescent). The 2D fitting of the example of FIG. 5A is based on the assumption that the crescent observed within the pupil is only the visible part of a circular plateau limited by an area of decreasing intensity, or transition zone, to zero intensity. The diameter of the circle suggested by the crescent, its location relative to the pupil center, and the width of the transition zone are parameters that depend on the prescription of the patient.
[0073] In FIG. 5A, at -2 diopters (right side), the circle of reflected light is substantially masked by the iris of the eye. The circle of directly reflected light from the retina lies below the opening of the pupil. However, we realized that in this case there is still substantial scattered light (not shown in this -2 diopter view), largely from the halo of light reflected within the eye and within structures of the eye (e.g. the fundus, lens, cornea, etc.).
[0074] The next figure, FIG. 6A, shows the light intensity in a cross section through the crescent, illustrating the transition zone where the intensity falls to zero on the assumption that there is no scattered light inside the eye.
[0075] FIG. 7 is a drawing showing images and graphs of a location of a center of the plateau and defocus and a diameter of a circle suggested by a crescent. This figure shows that not only the location, but also the diameter of the crescent depends on the refractive error, illustrating the importance of 2D information about its size and shape in optimizing the estimate of the refractive error.
[0076] 2D MODEL OF THE CRESCENT - Eccentric photorefraction depends on the analysis of the position of a crescent of light on the retina seen in the eye's pupil. Though the crescent observed at the pupil is a two-dimensional light intensity distribution, the analysis of its position and the corresponding estimate of refraction have traditionally been done using only a one-dimensional section of the crescent. Based on geometrical optics considerations, we have developed and realized a new model which takes into account the intrinsic 2D nature of the crescent.
[0077] FIG. 8A is a drawing in which the central image shows a pupil with a crescent. On the left is shown how only one section (1-D) of the intensity profile is currently used. The plot on the right illustrates how the whole 2D information throughout the pupil is used and fitted to a model according to the new system and method of the Application.
[0078] FIG. 8B is a drawing in which the dashed line outlines the circle suggested by the crescent on the pupil. The location of the center and the diameter are related to the refractive error of the eye and can be used to estimate it.
[0079] The image of the crescent in the pupil is a blurred and vignetted version of a complete, circular image which appears on the retina, caused by an external illumination of the eye (e.g. the flash for a smartphone eccentric photorefraction measurement). Using techniques described herein, we determine the center of the circle from the 2D distribution of light in at least one smartphone image of the crescent.
[0080] FIG. 5B is a drawing of the visible crescent over a range of refractive errors of the human eye, now including the scattered light of the halo of the crescent. FIG. 6B shows the light intensity in a cross section through the crescent and halo, illustrating a transition zone where the intensity falls to a non-zero light distribution. This reflects the realization that there is a useful 2D scattered-light distribution inside the pupil of the eye which, according to our new work, could be used to read a prescription for eyewear even where the crescent is mostly or substantially completely blocked and not present in the opening of the pupil.
[0081] However, the parameters defining this suggested circle, in particular the location of the center, depend on the spherical refractive error of the eye and hence they can be used to estimate the refractive error.
[0082] The full two-dimensional light distribution over the pupil can be used, increasing the amount of data to fit (relative to the 1-D model) and the robustness of the fitting.

[0083] FIG. 9A is a drawing showing parameters of a 2D geometric model which can be used to transform at least one image of the pupil of the eye into an eyewear prescription.
The 2D image of the crescent in the pupil can be characterized by the parameters: a, baseline intensity; b, top intensity at the plateau; C, center of the circle; Rc, radius of the circle; Rp, radius of the plateau; and any suitable combinations thereof.
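The assumed plateau-plus-transition shape can be written down directly. The following Python sketch evaluates such a model (a flat plateau of intensity b inside Rp, a linear fall to the baseline a at Rc); it illustrates the stated parameterization and is not the actual fitting code of the system.

```python
import numpy as np

def crescent_model(xx, yy, a, b, cx, cy, r_plateau, r_circle):
    """Evaluate the assumed 2D intensity model at pixel coordinates (xx, yy):
    intensity b on a flat plateau of radius r_plateau around (cx, cy), a
    linear transition zone down to the baseline a at radius r_circle, and
    baseline a everywhere outside. Requires r_circle > r_plateau."""
    r = np.hypot(xx - cx, yy - cy)
    ramp = np.clip((r_circle - r) / (r_circle - r_plateau), 0.0, 1.0)
    return a + (b - a) * ramp
```

In a fit, the free parameters (a, b, cx, cy, r_plateau, r_circle) would be adjusted until this model best matches the measured pupil intensities.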
[0084] FIG. 9B is a flow chart showing an exemplary process for transforming at least one image into an eyewear prescription. The flow chart of FIG. 9B shows exemplary steps to obtain a prescription from at least one image of the pupil. The light intensity distribution from at least one image is combined with the 2D model to obtain the values of the parameters which best fit the measured intensity. Once they are obtained, the prescription can be estimated in three different ways: a) using parameter C (center of the circle), b) using parameter Rc (radius of the circle), and c) using parameters Rc and Rp (width of the transition zone), and any combination thereof.
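As an illustrative sketch of combining the three routes, each fitted parameter can be mapped to diopters by a linear calibration and the results averaged. All constants below are placeholders, not calibrated values; real values would come from comparison against a reference instrument.

```python
# Hypothetical per-parameter linear calibrations: (slope in D per unit,
# intercept in D). Placeholder values for illustration only.
CAL = {
    "center": (-0.9, 0.0),
    "radius": (0.6, -2.5),
    "transition": (1.8, -1.0),
}

def prescription_from_fit(params, weights=None):
    """Combine the three independent estimation routes -- (a) circle center,
    (b) circle radius Rc, (c) transition-zone width -- into one weighted
    prescription estimate in diopters."""
    if weights is None:
        weights = {"center": 0.5, "radius": 0.3, "transition": 0.2}
    estimates = {k: CAL[k][0] * v + CAL[k][1] for k, v in params.items()}
    return sum(weights[k] * estimates[k] for k in estimates)
```

With the placeholder constants, a fit of center offset 1.0, radius 5.0 and transition width 0.5 (arbitrary units) yields a weighted estimate of about -0.32 D.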
[0085] EXAMPLE: Prescription based on a 2D intensity plot:
[0086] FIG. 10 is a drawing showing a laboratory demonstration of determining a prescription based on a 2D intensity plot of the crescent. After the image of the patient is taken (1) (the laboratory head fixture would not be used when the method is performed by smartphone selfie), a region of interest which includes the eye's pupil to be measured is extracted (2). Next, the area affected by the reflection of the flash on the cornea is bounded and discarded (3). The pupil is then segmented to obtain the location of its center and its radius (4). All of the intensity values of the pixels within the pupil are taken as the input for the algorithm, which fits the data to the 2D shape expected by the geometrical model (5). This assumes that the crescent observed within the pupil is only the visible part of a circular plateau limited by an area of decreasing intensity, or transition zone, to zero intensity. Any of the next three parameters of the fitted 2D circular shape, including combinations thereof, can be used to determine the prescription: center of the circle suggested by the visible crescent relative to the pupil center, radius, and extent of the transition zone (6). In the example, the center was chosen and, using the corresponding relationship, the prescription was obtained (7).

[0087] MEASUREMENTS INTO THE DEAD ZONE - Estimations of the refractive error of the eye in eccentric photorefraction can be based on the size of the crescent appearing in the pupil. One of the drawbacks of this technique is the existence of a range of refractive errors for which the pupil shows no crescent. This range is known as the dead zone, and its location and extent depend on the refractive error and a number of configuration parameters.
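Steps (2) through (7) above can be sketched as follows. This hedged Python illustration assumes the pupil has already been segmented (center and radius known) and substitutes a hypothetical linear calibration `k_d_per_px` for the relationship of step (7).

```python
import numpy as np

def estimate_prescription(gray, pupil_cy, pupil_cx, pupil_r,
                          glint_mask=None, k_d_per_px=0.05, offset_d=0.0):
    """Illustrative pipeline on a grayscale eye image: keep pixels inside
    the segmented pupil, discard the corneal glint, locate the
    intensity-weighted center of the light distribution, and map its
    vertical offset from the pupil center to diopters with a placeholder
    linear calibration."""
    yy, xx = np.indices(gray.shape)
    inside = np.hypot(yy - pupil_cy, xx - pupil_cx) <= pupil_r
    if glint_mask is not None:
        inside &= ~glint_mask                  # step (3): drop the glint
    w = gray[inside].astype(float)
    w /= w.sum()                               # normalize intensity weights
    light_cy = (yy[inside] * w).sum()          # steps (5)-(6): light center
    return k_d_per_px * (light_cy - pupil_cy) + offset_d   # step (7)
```

The sign convention here (bright region above the pupil center giving a negative value) is arbitrary; a real calibration against a reference instrument would fix both sign and scale.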
[0088] Another way to substantially eliminate the dead zone is to place a trial lens before the eye, which will restore the view of the crescent.
[0089] FIG. 11 is a drawing which illustrates the dead zone. The black circles represent the pupil and the lighter shapes within the pupil, the crescent for different values of defocus of the refractive error. As defocus increases the size of the crescent becomes larger but there is a range (dead zone) where the crescent is not visible.
[0090] Hence, according to the common wisdom before our work, no measurements of refractive error can be made in the dead zone based on the crescent. Unfortunately, this dead zone currently precludes estimating the refraction in patients with small refractive errors, corresponding to a substantial portion of the population.
[0091] We believe that there could be sufficient information about the refractive error in the dead zone. We found that there is light, especially scattered light, in the image of the pupil even when the crescent is not visible (i.e. where most of the image of the geometric crescent from the retina is truncated by the pupil). Beyond the most luminous bright crescent, there is actually a broader image that includes a dim, broad halo around the most luminous part of the crescent. That is, even where the primary, relatively bright crescent is substantially completely obscured by the pupil, we believe that there is still useful refraction information in the remaining scattered light of the halo of the crescent.
[0092] FIG. 12 is a drawing showing images of the pupil of an eye where different refractive errors have been induced by means of trial lenses. The top row of FIG. 12 shows images of a pupil within the dead zone for different values of defocus. The bottom row of FIG. 12 shows corresponding graphs of the light intensity distribution. Although the crescent is not visible, the graphs show intensity distributions that change with defocus. We are within the dead zone because there is no visible crescent, but the graphs on the bottom row show that there is still a smooth decay of intensity that might be used to estimate the refractive error. This smooth decay of intensity is almost certainly the result of light scatter within the eye. Presumably because of its low intensity where the crescent is substantially not visible in an image of the pupil, this light in the dead zone has previously been ignored by earlier practitioners of eccentric photorefraction. Smartphones are sensitive enough to image this scattered light, which includes the halo of the crescent and is largely light from the obscured crescent (the crescent is always present on the retina, even when obscured by the pupil). We believe that the remaining scattered light of the halo of the crescent can also be used to calculate the center of a circle representing the crescent even when the crescent is obscured in large part by the pupil, a condition which previously made such autorefraction measurements near the dead zone less accurate or impossible.
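One simple dead-zone feature suggested by the bottom-row graphs is the least-squares slope of intensity versus vertical position over the pupil. A minimal sketch, assuming a grayscale image and a boolean pupil mask (function name and interface are illustrative):

```python
import numpy as np

def halo_slope(image, pupil_mask):
    """Least-squares slope of pixel intensity versus vertical (row)
    position over the pupil -- a candidate feature extracted from the
    halo's smooth intensity decay when no crescent is visible."""
    ys, xs = np.nonzero(pupil_mask)
    slope, _intercept = np.polyfit(ys, image[ys, xs], 1)
    return slope
```

A calibration relating this slope to defocus would still be needed to turn the feature into a refractive-error estimate.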
[0093] An error correction to a 2D light distribution largely derived from the light of the halo of the crescent, can be important to arrive at an accurate prescription.
[0094] EFFECT OF AND CORRECTION FOR ETHNICITY - The ethnicity of the subject can influence the refractive error obtained from eccentric photorefraction, though the explanation for this effect has not been clearly demonstrated. We realized that variations in the amount of light reflected from the back of the eye can be attributed to, and corrected for by, factoring in effects of melanin as related to ethnicity and skin color.

[0095] Variation by ethnicity is believed to be caused at least in part, if not predominantly, by variations in the spectral reflectance of the fundus of the eye. Ethnicity, skin color, and iris color are likely surrogates for the color of the fundus.
[0096] The appearance of the crescent in the pupil is a consequence of the back reflection of light onto the retina. The fraction and chromaticity of the reflected light are mostly driven by the concentration of melanin, a compound which absorbs light and also affects the skin and iris color. The relevance of this fact is that, along with the crescent, there is additional light on the image of the pupil which is a consequence of back scattered light from the inner surface of the ocular globe. This scattered light may affect the apparent position of the crescent and the estimation of the refractive error. This effect is more intense as the concentration of melanin decreases. Because there is a correlation between the amount of melanin in the iris and that in the fundus at the back of the eye, we can use the iris as a surrogate for the fundus reflectance. Of course, within ethnicities, there can be a range of skin colors.
[0097] AGE - As described hereinabove, there can be additional light on the image of the pupil which is a consequence of back scattered light from all over the inner surface of the ocular globe. This scattered light may affect the apparent position of the crescent and the estimation of the refractive error. Age related changes to the physiology of the eye structure also can cause such scattered light, due to, for example, clouding of the cornea and/or the lens of the eye with age.
[0098] DETERMINING ERROR CORRECTION FACTORS - Experience with large populations of persons using the new system and method for measuring aberrations in an optical system smartphone autorefractor according to the Application will help us to determine optimal ways to provide such corrections including, for example, a questionnaire in the process asking ethnicity, reading skin color directly from the selfie image, reading iris color and patterns, etc., and combinations thereof.
[0099] FIG. 13 is a drawing in which the error in the estimation of the refractive state of the eye using eccentric photorefraction is plotted versus the lightness of the iris. The correlation found is statistically significant, suggesting that ethnicity might play a role in eccentric photorefraction. We can incorporate ethnicity as well as iris color into our 2-D model of eccentric photorefraction to improve the accuracy of the method. Ethnicity and/or skin color can be factored into our system and method by any suitable technique, including, for example, information provided to the App by a user and/or a determination of skin color from one or more selfie images, including images of the eye and surrounding skin color.
[00100] EXAMPLE: Prescription corrected by an error plot:
[00101] FIG. 14 is a graph showing a photorefraction vs. Hartmann-Shack (HS) center error correction. After an initial prescription is obtained by fitting the intensity to the 2D model, a correction may need to be made to take into account other factors not considered in the model or uncertainties in the estimation of the parameters describing the model. This correction can, for example, be obtained after a comparison of the eccentric photorefraction measurements with a Hartmann-Shack wavefront sensor. FIG. 14 illustrates an exemplary comparison based on such a correction in a 1:1 plot, revealing that the correction to apply is simply an offset of 0.7 D. For example, an initial value of -1 D would yield a final prescription of -0.3 D.
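The worked correction can be expressed trivially in code; the 0.7 D offset below is the FIG. 14 example value and would in general be device- and population-specific.

```python
def apply_calibration(raw_d, offset_d=0.7):
    """Apply the empirically determined offset (in diopters) obtained from
    a comparison against a Hartmann-Shack wavefront sensor. The default
    0.7 D is the example value from FIG. 14, not a universal constant."""
    return raw_d + offset_d
```

So an uncorrected estimate of -1.0 D becomes a final prescription of -0.3 D.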
[00102] While such comparisons can be made one to one in laboratory testing, when implemented with a new system and method for measuring aberrations in an optical system smartphone autorefractor according to the Application, the error corrections can be based on experientially derived databases or any suitable known and determined corrections for factors such as, for example, aging, ethnicity, skin color, etc.
[00103] PUPIL EDGE SENSING - We have previously demonstrated a pupil edge sensor that was able to successfully refract the eye in the benchtop prototype. In this approach, two or more images of the light returning from a flash on the retina are acquired in different focal planes near the pupil of the eye. As the light emerges from the pupil, its spatial distribution evolves in different ways depending on the refractive state of the eye, providing an opportunity to compute the refractive error by comparing images at different focal planes with each other. Pupil edge sensing can optionally be used to supplement eccentric photorefraction by smartphone systems and methods as described hereinabove.
[00104] SELF-CALIBRATION BASED ON RGB SPECTRAL INFORMATION - Chromatic information in the one or more images of the eye can be used to remove some of the individual differences in estimates of refraction by using the three spectral bands (RGB) available in the color image.
[00105] Variability of the accuracy of measurements of aberrations of the eye from subject to subject in autorefractor measurements causes errors in the measurement results. The reasons for the variability are not yet well understood. Variability is believed to be related to either of, or more likely a combination of, variations in the physical structure of the fundus of the eye (back of the eye) and individual variations in chromatic aberration of the eye, more specifically longitudinal chromatic aberration (LCA). That is, there are small variations in the optical power of the individual eye as a function of wavelength.
[00106] Parameters of the eccentric photorefracted crescent, particularly the location of the crescent, are known to vary as a function of wavelength for each individual eye. There is an average variation of crescent position as a function of wavelength for the human eye, and then variations from that average crescent location as a function of wavelength for each individual eye.
[00107] We realized that in the one or more images acquired using a mobile device according to the systems and methods of the Application, there is wavelength-separable data in the form of RGB data for pixels from the RGB imager of the mobile device. The available RGB data allows individual monochrome R, G, and B images to be extracted. By comparing the physical location of the eccentric photorefracted crescent in each of the R, G, and B images, at least the differences in crescent location can be measured and compared to the average known RGB crescent locations for the human eye, or to values further refined for specific populations as the method is refined and more data are acquired.
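A minimal sketch of the channel comparison: split the color image into monochrome R, G, and B planes and compute the intensity-weighted centroid of the pupil light distribution in each. The centroid shifts between channels carry the chromatic information discussed above; the function name and interface are illustrative.

```python
import numpy as np

def channel_crescent_centroids(rgb_image, pupil_mask):
    """Split an H x W x 3 RGB pupil image into monochrome R, G, B planes and
    return the intensity-weighted centroid (row, col) of the light
    distribution in each channel, restricted to the pupil mask."""
    ys, xs = np.nonzero(pupil_mask)
    centroids = {}
    for name, ch in zip("RGB", np.moveaxis(rgb_image, -1, 0)):
        w = ch[ys, xs].astype(float)
        w = w / w.sum()
        centroids[name] = (float((ys * w).sum()), float((xs * w).sum()))
    return centroids
```

The per-channel centroids could then be differenced and compared to the expected wavelength-dependent crescent positions for an average eye.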
[00108] The crescent in individual R, G, and B images in monochrome color can be compared to expected crescent parameters, such as crescent location at a particular wavelength for an average human eye. Our theoretical modeling indicates that this comparison can be used to further correct and remove individual errors in autorefractor measurements as described hereinabove, providing a more accurate refractive error measurement for each individual human eye.
[00109] While the example of Appendix IV is to compare the location of the crescent of an individual eye in RGB images, it will be understood by those skilled in the art that other comparisons at different wavelengths can be made, such as for example, shape of the crescent curve and intensity gradient at different wavelengths.
[00110] While the new RGB crescent correction method is first described in the context of improving eccentric photorefraction measurements, the new correction can be used with any autorefractor measurement system and method which can provide images of the crescent in at least two different wavelengths. Typically, the at least two different wavelengths are RGB for a mobile device imager. Other wavelengths as available with any suitable imager or imaging technique can also be used. The new correction can be used beyond the limited context of mobile device applications including the specific smartphone autorefractometry examples of the Application.
[00111] Measurement of optical properties of the eye with a smartphone using a chromatic eccentric photorefraction approach - A new way to increase the precision of refraction estimations by eccentric photorefraction is described. The new method uses two or more images of the pupil collected at different wavelengths and is based on the differences between them. A self-calibration of the effect of defocus can thus be performed.
[00112] In eccentric photorefraction, the slope of the intensity profile of the illuminated pupil, which is the input to estimate refraction, not only depends on refraction but also on a number of other parameters such as the overall brightness of the image or the eccentricity of the light source, to name a few. Some of these factors are subject-dependent affecting the precision of the method.
[00113] Isolation of the effect of refraction may be achieved by introducing known amounts of defocus and measuring the effect on the slope, producing a self-calibration. However, this approach may be less practical because it requires multiple images and the induction of known amounts of defocus.
[00114] An alternative approach is to use additional chromatic information. The human eye shows a spectrally dependent difference of focus, very similar across individuals, known as longitudinal chromatic aberration (LCA). By taking images at different wavelengths, this fact can be used to produce the needed self-calibration.
[00115] FIG. 15 is a drawing which shows an example of a camera (e.g. a smartphone camera) collecting colored images of the pupil, as an example of an implementation of the self-calibration procedure using spectral information. A colored image of the pupil is taken and separated into its R, G, B components. Between each of them there is a refractive difference given by the LCA of the eye. Knowing the slopes and the differences in focus, a self-calibration may be performed (the graph of FIG. 15, which shows the LCA, was adapted from Thibos et al., Applied Optics, 31, 19, 1992). In this case, the spectral information is selected by using the three separate channels (R, G, B) which constitute the RGB color space. Each image provides a slope, and the difference of focus between images is given by the standard LCA found in humans.
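Assuming the photorefraction slope is locally linear in defocus, the three channel slopes and the known LCA offsets give a self-calibrated refraction estimate, as sketched below. The LCA offsets used here are illustrative assumptions, not calibrated values.

```python
import numpy as np

# Approximate LCA defocus of each channel relative to green, in diopters.
# Illustrative placeholder values, not measured calibration data.
LCA_OFFSETS_D = {"R": +0.35, "G": 0.0, "B": -0.55}

def self_calibrated_refraction(channel_slopes):
    """Assume slope_ch = k * (D_eye + lca_ch), i.e. the photorefraction
    slope is locally linear in total defocus. Fitting slope versus the LCA
    offset gives the line slope k and intercept k * D_eye, so the eye's
    refraction is intercept / k."""
    lca = np.array([LCA_OFFSETS_D[c] for c in "RGB"])
    s = np.array([channel_slopes[c] for c in "RGB"])
    k, intercept = np.polyfit(lca, s, 1)
    return intercept / k
```

Because both k and the intercept are estimated from the same three images, subject-dependent scale factors that multiply all three slopes equally cancel out of the ratio, which is the point of the self-calibration.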
[00116] SMARTPHONES (INCLUDING TABLETS, NOTEBOOKS, ETC.) - Most modern computing devices with a camera may be suitable to run the new process according to the Application. Generally, considerations include the flash, flash intensity, flash rate, distance of the flash from the camera lens, flash color spectrum, the ability for an App to commandeer and control lens focus, and control or suppression of the smart device's "red eye" correction modes. Most modern cameras, including many variable-focus cameras, can be configured to measure aberrations in an optical system according to the new methods of the Application.
[00117] Other parameters of interest to the autorefractor processes described hereinabove include burst speeds and/or video speeds of the exemplary smartphone camera. For example, from our initial investigations into running the new process on a smartphone, sufficient camera speed has likely been available since the Samsung Galaxy S7 and Apple iPhone 7 models. Such smartphone camera features have been continuously improved every model year since, up to and including, for example, recent iPhone and Galaxy models.
[00118] Current implementations use a flash; however, it is likely possible to extend the techniques of the Application to desktop computer cameras, such as, for example, the Apple iMac camera, as such cameras improve, either with an external flash or with no flash. Also, as watch cameras improve, it is likely that devices such as, for example, an Apple Watch type device can run the new process according to the Application.
[00119] The new methods described hereinabove can generally be used with any suitable type of corrective eyewear including, for example, contact lenses, intraocular lenses, and spectacles. These new methods can also be used with refractive surgery, which can correct the vision of a person's eyes through any suitable refractive surgery medical procedures. Generally, such surgical procedures are performed on an out-patient basis by a medical professional, where the person undergoing the treatment is a patient.
[00120] Firmware and/or software for an autorefractor process, a human eye pupil refraction model, and/or a deep learning comparison of pupil edge/sizes in different focal planes and corresponding autorefractor or subjective eye measurements for the pupil as well as any other data, models, and/or processes described hereinabove can be supplied on a computer readable non-transitory storage medium. A computer readable non-transitory storage medium as non-transitory data storage includes any data stored on any suitable media in a non-fleeting manner. Such data storage includes any suitable computer readable non-transitory storage medium, including, but not limited to hard drives, non-volatile RAM, SSD devices, CDs, DVDs, etc.
[00121] It will be appreciated that variants of the above-disclosed and other features and functions, or alternatives thereof, may be combined into many other different systems or applications. Various presently unforeseen or unanticipated alternatives, modifications, variations, or improvements therein may be subsequently made by those skilled in the art which are also intended to be encompassed by the following claims.
[00122] While embodiments of the present disclosure have been particularly shown and described with reference to certain examples and features, it will be understood by one skilled in the art that various changes in detail may be effected therein without departing from the spirit and scope of the present disclosure as defined by claims that can be supported by the written description and drawings.

Claims

What is claimed is:
1. An autorefractor apparatus comprising: at least one source of flash illumination; at least one camera mounted on or in a surface of a mobile device, said at least one camera comprising at least one focusable lens, said at least one camera disposed at a different location on a surface of said mobile device at a same or different distance from said at least one source of flash illumination; and at least one computer programmed to run an autorefractor process to transform at least one image of an eye taken by said at least one camera into an eyewear prescription based on a two-dimensional light distribution of a crescent in said at least one image of light returning from a retina of the eye.
2. The autorefractor apparatus of claim 1, wherein said two-dimensional image of the crescent in the pupil is characterized by any of the parameters: baseline intensity, top intensity at the plateau, center of the circle, radius of the circle, radius of the plateau, and combinations thereof.
3. The autorefractor apparatus of claim 1, wherein said autorefractor process is based at least in part on a light intensity distribution where said crescent is substantially not visible in the at least one image and where a correction is less than about 1.5 diopter.
4. The autorefractor apparatus of claim 1, wherein said autorefractor process is based further on an age of a person autorefracted.
5. The autorefractor apparatus of claim 1, wherein said autorefractor process is based further on an ethnicity of a person autorefracted.
6. The autorefractor apparatus of claim 1, wherein said autorefractor process is based further on a skin color of a person autorefracted.
7. The autorefractor apparatus of claim 6, wherein said skin color of said person autorefracted is determined from said at least one image.
8. The autorefractor apparatus of claim 6, wherein a color of the back of the eye or of the fundus of the eye of said person autorefracted is determined from said at least one image.
9. The autorefractor apparatus of claim 1, further comprising a self-calibration process based on RGB information of said at least one image.
10. The autorefractor apparatus of claim 1, wherein said camera comprises a smartphone RGB camera.
11. A method of measuring aberrations in an optical system smartphone autorefractor comprising: providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, said at least one camera comprising at least one focusable lens, said at least one camera disposed at a different location on a surface of said mobile device at a same or different distance from said at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; acquiring at least one image of an eye of a person at a distance D; and transforming said at least one image to an eyewear prescription based on a two-dimensional light distribution of a crescent in said at least one image of light returning from a retina of the eye.
12. The method of claim 11, wherein said two-dimensional image of the crescent in the pupil is characterized by any of the parameters: baseline intensity, top intensity at the plateau, center of the circle, radius of the circle, radius of the plateau, and combinations thereof.
13. The method of claim 11, wherein said step of transforming comprises transforming said at least one image to an eyewear prescription based at least in part on a light intensity distribution where said crescent is substantially not visible in the at least one image and where a correction is less than about 1.5 diopter.
14. The method of claim 11, wherein said step of transforming comprises transforming said at least one image to an eyewear prescription corrected based on an age of a person autorefracted.
15. The method of claim 11, wherein said step of transforming comprises transforming said at least one image to an eyewear prescription corrected based on an ethnicity of a person autorefracted.
16. The method of claim 11, wherein said step of transforming comprises transforming said at least one image to an eyewear prescription corrected based on a skin color of a person autorefracted.
17. The method of claim 11, wherein said step of transforming comprises transforming said at least one image to an eyewear prescription corrected based on a color of the back of the eye or a color of the fundus of the eye of a person autorefracted.
18. The method of claim 17, further comprising the step of determining said skin color of said person autorefracted from said at least one image.
19. The method of claim 11, further comprising the step of performing a self-calibration process based on RGB information of said at least one image.
20. The method of claim 11, wherein said step of providing comprises providing a smartphone RGB camera.
21. The method of claim 11, wherein said step of acquiring comprises taking a selfie with a smartphone camera.
22. The method of claim 11, further comprising before the step of acquiring, a step of placing a lens between the eye and the camera to substantially eliminate the dead zone.
23. A method of measuring aberrations in an optical system smartphone autorefractor comprising: providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, said at least one camera comprising at least one focusable lens, said at least one camera disposed at a different location on a surface of said mobile device at a same or different distance from said at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; acquiring at least one image of an eye of a person at a distance D; and transforming said at least one image to an eyewear prescription based at least in part on a provided ethnicity or a skin color read from said at least one image in said at least one image of light returning from a retina of the eye.
24. A method of measuring aberrations in an optical system smartphone autorefractor comprising: providing at least one source of flash illumination, at least one camera mounted on or in a surface of a mobile device, said at least one camera comprising at least one focusable lens, said at least one camera disposed at a different location on a surface of said mobile device at a same or different distance from said at least one source of flash illumination; and at least one computer programmed to run an autorefractor process; acquiring at least one image of an eye of a person at a distance D; and transforming said at least one image to an eyewear prescription based at least in part on a provided age of a patient in said at least one image of light returning from a retina of the eye.
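Claim 2 characterizes the two-dimensional crescent image by a small set of parameters: baseline intensity, top intensity at the plateau, center of the circle, radius of the circle, and radius of the plateau. A minimal sketch of how such a parametric model could be fit to a pupil image by least squares is given below; the piecewise-linear radial profile, the synthetic test image, and all function names are illustrative assumptions (using NumPy and SciPy), not the method actually disclosed in the specification.

```python
import numpy as np
from scipy.optimize import least_squares

def crescent_model(params, xx, yy):
    """Illustrative radial form of the claim-2 parameters: top intensity
    inside the plateau, linear falloff to baseline at the circle edge,
    baseline intensity outside the circle."""
    baseline, top, cx, cy, r_circle, r_plateau = params
    d = np.hypot(xx - cx, yy - cy)          # distance from the circle center
    img = np.full_like(d, baseline)
    img[d <= r_plateau] = top
    ramp = (d > r_plateau) & (d <= r_circle)
    width = max(r_circle - r_plateau, 1e-6)  # guard against degenerate radii
    img[ramp] = top + (baseline - top) * (d[ramp] - r_plateau) / width
    return img

def fit_crescent(image, init):
    """Recover the six crescent parameters from a pupil image by
    nonlinear least squares."""
    ny, nx = image.shape
    yy, xx = np.mgrid[0:ny, 0:nx].astype(float)
    residual = lambda p: (crescent_model(p, xx, yy) - image).ravel()
    return least_squares(residual, init).x

# Synthetic example: render a noisy crescent, then recover its parameters.
rng = np.random.default_rng(0)
true = np.array([10.0, 200.0, 32.0, 30.0, 20.0, 8.0])
yy, xx = np.mgrid[0:64, 0:64].astype(float)
noisy = crescent_model(true, xx, yy) + rng.normal(0.0, 1.0, (64, 64))
recovered = fit_crescent(noisy, [0.0, 150.0, 30.0, 28.0, 25.0, 5.0])
```

In a full autorefractor process, the recovered radii and intensities would then be mapped to an eyewear prescription, e.g. through a calibration involving the distance D and the flash-to-camera eccentricity; that mapping is outside the scope of this sketch.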
PCT/US2022/011400 2021-01-08 2022-01-06 System and method to measure aberrations by imaging both the crescent and the halo of the crescent WO2022150448A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202163135037P 2021-01-08 2021-01-08
US63/135,037 2021-01-08

Publications (1)

Publication Number Publication Date
WO2022150448A1 true WO2022150448A1 (en) 2022-07-14

Family

ID=80050667

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/011400 WO2022150448A1 (en) 2021-01-08 2022-01-06 System and method to measure aberrations by imaging both the crescent and the halo of the crescent

Country Status (1)

Country Link
WO (1) WO2022150448A1 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US10610097B2 (en) * 2016-06-30 2020-04-07 Carl Zeiss Vision International Gmbh System and a method for corrective lens determination
EP3669752A1 (en) * 2018-12-20 2020-06-24 Essilor International Method for determining a refraction feature of an eye of a subject, and associated portable electronic device

Non-Patent Citations (1)

Title
Thibos et al., Applied Optics, vol. 31, 1992, pages 19

Similar Documents

Publication Publication Date Title
US20220007939A1 (en) Apparatus and method for determining an eye property
JP6470746B2 (en) Apparatus and method for determining ophthalmic prescription
JP6212115B2 (en) Apparatus and method for measuring objective eye refraction and at least one geometrical form parameter of a person
US20180263488A1 (en) Variable Lens System for Refractive Measurement
JP6049750B2 (en) Luminance-dependent adjustment of spectacle lenses
EP3626157A1 (en) Eyeglass prescription method and system
EP3291722B1 (en) Improved objective phoropter
JP6279677B2 (en) Universal objective refraction
US12011224B2 (en) Method for determining refractive power of eye using immersive system and electronic device thereof
KR20200008996A (en) Equipment, methods and systems for measuring the impact of ophthalmic lens design
JP2003225205A (en) Fully corrected vision characteristics measuring apparatus and method, contrast sensitivity measuring apparatus and method and contrast sensitivity target presentation device
CN114340472B (en) Joint determination of accommodation and vergence
US20230255473A1 (en) Integrated apparatus for visual function testing and method thereof
WO2022150448A1 (en) System and method to measure aberrations by imaging both the crescent and the halo of the crescent
EP4382031A1 (en) Apparatus and method for determining refraction error of at least an eye of a subject
US20240225441A1 (en) Method, system and computer-program for estimating refraction of an eye of an individual
WO2024003614A1 (en) Systems and methods for retinal spectral imaging calibration phantom
EP4418982A1 (en) Methods and apparatus for ocular examination
CN114513983A (en) Method and system for determining a prescription for an eye of a person
CN118591336A (en) Method, apparatus and computer program product for determining sensitivity of at least one eye of a test object

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22701124

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 22701124

Country of ref document: EP

Kind code of ref document: A1