WO2020146714A1 - Method and device for remote optical monitoring of intraocular pressure - Google Patents


Info

Publication number
WO2020146714A1
Authority
WO
WIPO (PCT)
Prior art keywords
eyewear
iop
eye
user
data
Prior art date
Application number
PCT/US2020/013049
Other languages
French (fr)
Inventor
Aykutlu Dana
Sevda Agaoglu
Ahmet Taylan YAZICI
Murat BADAY
Savas Komban
Original Assignee
Smartlens, Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Smartlens, Inc. filed Critical Smartlens, Inc.
Priority to EP20738775.4A priority Critical patent/EP3908239A4/en
Publication of WO2020146714A1 publication Critical patent/WO2020146714A1/en
Priority to US17/370,735 priority patent/US12114931B2/en

Classifications

    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 11/00 Non-optical adjuncts; Attachment thereof
    • G02C 11/10 Electronic devices other than hearing aids
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/16 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring intraocular pressure, e.g. tonometers
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/0016 Operational features thereof
    • A61B 3/0025 Operational features thereof characterised by electronic signal processing, e.g. eye models
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/107 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining the shape or measuring the curvature of the cornea
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61M DEVICES FOR INTRODUCING MEDIA INTO, OR ONTO, THE BODY; DEVICES FOR TRANSDUCING BODY MEDIA OR FOR TAKING MEDIA FROM THE BODY; DEVICES FOR PRODUCING OR ENDING SLEEP OR STUPOR
    • A61M 35/00 Devices for applying media, e.g. remedies, on the human body
    • A61M 35/10 Wearable devices, e.g. garments, glasses or masks
    • G PHYSICS
    • G02 OPTICS
    • G02B OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
    • G02B 5/00 Optical elements other than lenses
    • G02B 5/30 Polarising elements
    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 11/00 Non-optical adjuncts; Attachment thereof
    • G02C 11/04 Illuminating means
    • G PHYSICS
    • G02 OPTICS
    • G02C SPECTACLES; SUNGLASSES OR GOGGLES INSOFAR AS THEY HAVE THE SAME FEATURES AS SPECTACLES; CONTACT LENSES
    • G02C 7/00 Optical parts
    • G02C 7/02 Lenses; Lens systems; Methods of designing lenses
    • H ELECTRICITY
    • H04 ELECTRIC COMMUNICATION TECHNIQUE
    • H04N PICTORIAL COMMUNICATION, e.g. TELEVISION
    • H04N 23/00 Cameras or camera modules comprising electronic image sensors; Control thereof
    • H04N 23/57 Mechanical or electrical details of cameras or camera modules specially adapted for being embedded in other devices

Definitions

  • the present disclosure is related to a system and methods of using a wearable optical imaging sensor system for measuring intraocular pressure.
  • Glaucoma is the second most common cause of blindness in the world. It is a multifactorial disease with several risk factors, of which intraocular pressure (IOP) is the most important. IOP measurements are used for glaucoma diagnosis and patient monitoring. IOP fluctuates widely over the course of a day and depends on body posture, so the occasional measurements taken by an eye care expert in a clinic can be misleading.
  • IOP intraocular pressure
  • the present disclosure relates to a wearable optical device (such as eyewear, goggles or visors) for measuring the intraocular pressure of an eye.
  • an eyewear device for measuring intraocular pressure (IOP).
  • the device may have a frame, a first lens mounted to the frame such that the lens may be in the field of view of a person wearing the eyewear device.
  • the eyewear device may have a first illumination source positioned to illuminate the eye of a user, a first image sensor positioned to capture images of the eye of the user, a first communication portal being in electronic or signal communication with a computational device and a first drug dispensing device being aligned to deliver a dose of a drug to the eye of the user.
  • there may be a method of training an image processing pipeline that involves collecting personalized ophthalmologic data on a user's anatomy and corneal properties at a known IOP, collecting personalized data from an eyewear device for measuring IOP, and using at least one computational model and ray tracing under one or more geometric configurations to generate at least one set of training data for a neural network component of the pipeline.
  • a system for measuring and treating the IOP of a patient may have: a computational device, a wearable eyewear device for collecting IOP data and being in signal communication with the computational device, the eyewear device having a drug dispensing component.
  • the system may also have a database containing a user profile including personalized ophthalmologic reference data where the database may be accessed by the computational device, and where the database, and the IOP data are used to determine a treatment regimen for a user's eye.
  • a drug delivery component on the eyewear may deliver the drug in response to a signal from the computational device.
  • the computational device may be a cell phone, a tablet or a laptop computer. In still other embodiments, the computational device may be attached to the wearable eyewear device.
  • Devices, systems and methods are described herein using eyewear with one or more illuminators and one or more image sensors.
  • the combination of illuminator(s) and image sensor(s) may operate to eliminate one or more of ambient lighting changes and/or misalignment error, while providing a sensitive and accurate measurement of the cornea radius.
  • a small change of the radius of curvature (as small as 4 micrometers per 1 mmHg change in IOP) may be observed for a typical adult cornea.
  • the optical design may allow image processing and sensor fusion, as well as machine learning to accurately and sensitively measure the radius of curvature changes in the cornea.
  • the measured changes may be used in a calculation using a machine learning program, a learning neural network, an artificial intelligence program, or other analytic computational program to relate the measured changes in radius to the IOP.
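As a first-order illustration of the relationship above, the cited sensitivity of roughly 4 micrometers of radius change per 1 mmHg of IOP change can be inverted to map a measured radius change to an IOP change. This is only a sketch under a linearity assumption; the function name is illustrative, and the disclosure contemplates machine learning models, not a fixed linear map, for this conversion.

```python
def estimate_iop_change(delta_radius_um, sensitivity_um_per_mmhg=4.0):
    """Convert a measured change in corneal radius of curvature (micrometers)
    into an estimated change in IOP (mmHg), assuming the roughly linear
    sensitivity cited for a typical adult cornea (illustrative only)."""
    return delta_radius_um / sensitivity_um_per_mmhg

# e.g. an 8 micrometer radius change would suggest about 2 mmHg of IOP change
```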
  • the method may use a preliminary characterization of the corneal thickness and topography where the radius of curvature at a known IOP reading is acquired by conventional ophthalmologic methods.
  • the personalized data set may then be used as an input into the data processing algorithms, which also use continuous imaging measurements from the eyewear to calculate the IOP.
  • the data may be connected to a computational device such as a cell phone or the cloud, and the eyewear may dispense a drug using a drug dispensing device. The drug may help reduce the IOP of the eye.
  • the present disclosure includes a wearable optical device that measures the IOP through image acquisition from one or more image sensors, and uses the image data along with a reference data for a particular individual to accurately determine the IOP, and may dispense drugs to the eye to control the IOP.
  • FIG. 1 illustrates an optical imaging sensor goggle for measuring the intraocular pressure remotely (IOP goggle) according to an embodiment.
  • FIG. 2 illustrates a top view of an IOP goggle according to an embodiment.
  • FIG. 3 illustrates a goggle according to an embodiment.
  • FIG. 4 illustrates the change in corneal topography when the IOP changes from 15 to 30 mmHg according to an embodiment.
  • FIG. 5 illustrates a schematic ray trace showing optical beams bouncing off a cornea according to an embodiment.
  • FIG. 6 illustrates a schematic ray trace showing optical beams forming images according to an embodiment.
  • FIG. 7 illustrates a schematic ray trace that shows images of the point-sources at the camera's image planes when the corneal radius changes according to an embodiment.
  • FIG. 8 illustrates a coordinate system used to describe the corneal position according to an embodiment.
  • FIG. 9 illustrates a schematic ray trace showing corneal X-position changes according to an embodiment.
  • FIG. 10 illustrates a schematic ray trace showing corneal z-position changes according to an embodiment.
  • FIG. 11 illustrates a schematic ray trace showing image changes when the corneal angular position changes according to an embodiment.
  • FIG. 12 illustrates a video frame capture according to an embodiment.
  • FIG. 13 illustrates an alternative video frame capture according to an embodiment.
  • FIG. 14 illustrates a calculation showing changes in positions of laser points according to an embodiment.
  • FIG. 15 illustrates a video frame capture according to an embodiment.
  • FIG. 16 illustrates example data extracted from video captures according to an embodiment.
  • FIG. 17 illustrates a graph of data using a polynomial fit according to an embodiment.
  • FIG. 18 illustrates a schematic of data processing according to an embodiment.
  • FIG. 19 illustrates a sample logic according to an embodiment.
  • FIG. 20 illustrates a data processing flow chart according to an embodiment.
  • FIG. 21 illustrates an example data processing pipeline according to an embodiment.
  • FIG. 22 illustrates another example data processing pipeline according to an embodiment.
  • the present disclosure describes wearable eyewear, systems and methods for measuring the cornea of an eye, and determining the intraocular pressure of the measured eye based on the curvature of the cornea.
  • the disclosure includes eyewear, a computational device for calculating IOP values based on cornea data collected by the eyewear, methods for calculating the IOP, and dispensing a drug to the eye when needed.
  • the eyewear as described herein may take a variety of forms.
  • the form factor may be one of choice for a user, or one for the user's optometrist or other professional medical person responsible for the user's eye health.
  • the form factor may include a frame and a lens.
  • the frame may be one the user may wear in front of his or her eyes (note: male and female pronouns may be used interchangeably herein; the disclosed technology is not dependent on the gender of the user, and the interchanging use of gender is simply for the convenience of the applicant).
  • the frame may be any sort of frame used for modern eyewear, including frames for sunglasses, vision correction glasses, safety glasses, and goggles of all types.
  • the frame may be suitable for a single lens for one eye, a lens covering two eyes (e.g., a visor), or a single lens and an eye cover (such as for persons with "lazy eye" or who may suffer from the loss of one eye).
  • the lens may be a prescription lens for vision correction, a clear or tinted lens for appearance, or an opaque lens that covers the eye.
  • the lens may have a defined area for the field of view of the user. The field of view may be clear to avoid blocking the vision of the user.
  • the various elements of the eyewear device may be placed on the periphery of the lens, or on the frame.
  • the frame or lens may have flanges or other protrusions or tabs for the attachment of image sensors, light sources, battery, computational devices, drug delivery devices, or any other component suitable for the use with the present disclosure.
  • the wearable eyewear may have one or more image sensors positioned to face the eye(s) of the user so the image sensor may capture an image of the eye.
  • the image sensor may be a camera, a CCD (charge coupled device), CMOS (complementary metal oxide semiconductor), or other image capture technology.
  • the wearable eyewear may have one or more light sources for projecting light at the eye.
  • the light source may be a form of illumination that produces specific wavelengths of light.
  • the light emission may be at a shallow angle to the curvature of the cornea, and projected outside the lens portion of the eye so that the light does not interfere with the user's normal vision.
  • the light source may be a laser.
  • the light source may be an LED (light emitting diode), and in other embodiments the light source may be any light generating technology now known or still to be developed.
  • the light source(s) and image sensor(s) may be positioned so that images captured by the image sensor are able to ignore ambient light, glare or other optical artifacts that might interfere with the accurate reading of the change in cornea curvature.
  • the light source and the image sensor may use one or more polarizing filters to substantially reduce or eliminate light of a particular polarization, wavelength or intensity, so the captured image may have greater reliability and less signal noise.
  • the eyewear may have a light sensor to help regulate when the ambient lighting conditions are appropriate for taking a suitable image of the eye to determine the cornea curvature.
  • the images captured by the image sensors may be stored locally for a period of time, or transmitted to a computational device via a communication portal.
  • the communication portal may be an antenna for wireless transmission of data to a computational device.
  • the communication portal may send and receive information, such as sending image data, and receiving dosing information for a drug delivery device.
  • the computational device may be a cell phone, a tablet computer, a laptop computer, or any other computational device a user may select to carry out program (App) functions for the eyewear device.
  • the computational device may be resident on the eyewear.
  • the communication portal may be a wired connection between the image sensors, the light sources, the computational device, and a power supply for all the electrical components.
  • the communication portal may connect the eyewear to the cloud.
  • the method may use a basic operation pipeline.
  • the pipeline may receive image data from a variety of sources.
  • the image data may come from the eyewear as it is worn by a user.
  • the image data may come from a database having stored ophthalmologic data of the user at a fixed point in time.
  • the images may be anatomic data of a user from a fixed point in time.
  • some or all the available image data may be used in a deep neural network with an image processing front-end.
  • the image processing front-end may derive or calculate an IOP reading.
  • the IOP reading may be updated at video data rates, providing a quasi-real time output.
  • the data pipeline may cause an image sensor to change exposure levels, gain, brightness and contrast in order to capture non-saturated images.
  • the images may be passed through a threshold filter to reduce or eliminate background noise.
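A threshold filter of the kind described above can be sketched in a few lines. The function below is an illustrative stand-in for the disclosed filter, not the actual implementation: pixels below a chosen intensity are zeroed so that only the bright reflection spots survive.

```python
import numpy as np

def threshold_filter(image, threshold):
    """Zero out pixels below `threshold` to suppress background noise,
    keeping only bright reflection spots (a minimal sketch)."""
    filtered = image.copy()          # leave the captured frame untouched
    filtered[filtered < threshold] = 0
    return filtered
```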
  • Some high resolution images may be stored in a temporary memory for rapid processing, while blurred, low resolution copies are formed.
  • the low resolution images may then be passed through a match filter or feature detection filter to pinpoint spots corresponding to particular illumination/light sources in the various captured images.
  • the coarse locations may then be used to segment the high resolution images and perform peak fitting algorithms to individually determine the positions and widths of each peak in the images.
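The per-spot peak fitting step can be illustrated with intensity-weighted moments over a segmented high-resolution patch. This is a simple stand-in for the peak fitting algorithms the disclosure refers to (which might instead be, e.g., full 2-D Gaussian fits); the function name and the moment-based method are assumptions for illustration.

```python
import numpy as np

def fit_peak(segment):
    """Estimate the sub-pixel position and width of a single bright spot
    in a small image segment via intensity-weighted moments (a simple
    stand-in for a full Gaussian peak fit)."""
    seg = segment.astype(float)
    total = seg.sum()                      # assumes a non-empty, lit segment
    ys, xs = np.indices(seg.shape)
    cy = (ys * seg).sum() / total          # intensity-weighted centroid
    cx = (xs * seg).sum() / total
    wy = np.sqrt(((ys - cy) ** 2 * seg).sum() / total)  # RMS widths
    wx = np.sqrt(((xs - cx) ** 2 * seg).sum() / total)
    return (cy, cx), (wy, wx)
```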
  • the results of the peak locations and widths may then be used with the previously trained neural network, which may then be used to estimate cornea coordinate and radius of curvature.
  • a nonlinear equation solver may be used to convert the radius of curvature into an IOP reading.
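One way such a nonlinear solve could look is a bisection search that inverts a calibrated, monotonic radius-vs-IOP model. The `radius_model` callable and the toy linear model below are hypothetical placeholders for a user's personalized calibration, not the disclosed model.

```python
def solve_iop(measured_radius_mm, radius_model, lo=5.0, hi=60.0, tol=1e-6):
    """Invert a monotonic radius-vs-IOP model by bisection to recover the
    IOP (mmHg) that reproduces the measured corneal radius (mm).
    `radius_model` is a hypothetical calibrated IOP -> radius function."""
    for _ in range(100):
        mid = 0.5 * (lo + hi)
        # Radius decreases as IOP rises, so a too-large modeled radius
        # means the trial IOP is too low.
        if radius_model(mid) > measured_radius_mm:
            lo = mid
        else:
            hi = mid
        if hi - lo < tol:
            break
    return 0.5 * (lo + hi)

# Toy calibration: radius shrinks 4 um per mmHg above a 15 mmHg reference.
toy_model = lambda iop: 7.8 - 0.004 * (iop - 15.0)
```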
  • the IOP reading may then be used to determine a drug dose to administer to the eye being monitored.
  • the drug dose information may be relayed back through the communication portal to the eyewear and the drug dispensing device.
  • the drug dispensing device may then administer the proper dose to the eye.
  • the drug delivery device may use an atomizer.
  • the drug delivery device may use eye drops.
  • the computational device may provide an alert to the user to self-administer a drug of a certain dose at a certain time.
  • a wearable eyewear device may be coupled to a computational device to measure the IOP of a user's eye.
  • the user may be a person wearing the eyewear unless the context of the usage clearly indicates otherwise.
  • an eyewear 102 device having a frame 104 and a lens 106 may be provided.
  • the lens 106 may have a first light source 108, and one or more image sensor 112, 114 and 116 placed on it. In other embodiments, any one of the light source 108 or image sensors may be placed on the frame 104. In some embodiments the image sensors and light source 108 may be placed on either the frame 104, the lens 106, or partially on both.
  • the eyewear 102 may also have a drug delivery device 110 positioned to deliver a medication directly to the eye.
  • the drug delivery device 110 may be an atomizer or other aerosol device, a dropper, or any other device for delivering medication to the eye.
  • the drug delivery device 110 may be a mist applicator.
  • the mist applicator may be a MEMS (micro-electro-mechanical systems) atomizer with a cartridge for the dispensed drug that may be replaced when needed.
  • a controller 118 may control the individual image sensors, the light source 108 and the drug delivery device 110.
  • the controller 118 may be connected to the other components via a wire or cable connection, or by using a short range wireless communication protocol to each.
  • each component may have its own power source.
  • a single power source may be wired to each of the components to power all the components as needed.
  • a combination of power sources, local and central may be used.
  • the power supply to the controller and other components may be replaceable.
  • the components are depicted as large blocks for illustration purposes only. The components are not to scale on the eyewear 102 and no interpretation of the size of the components should be assigned to them based on the drawing figure. The drawing figures are for illustration purposes only.
  • there may be an optical design for the eyewear 202 as shown in FIG. 2.
  • the eyewear 202 may be fitted with a side illuminator made up of a planar illuminator 216, a laser diode 212 collimated by a collimator lens 214 and multiplied into a pattern by a hologram 210.
  • the assembly of the planar illuminator 216, laser diode 212 and collimator lens 214 may make up a light source 220.
  • the hologram 210 may be relayed towards the cornea 222 by a mirror 218.
  • the reflections of the hologram 210 off the cornea 222 may be captured by one or more image sensor 204, 206, 208.
  • the planar illuminator 216 may provide wide angle and uniform illumination.
  • the planar illuminator 216 may be turned on to acquire a background image of the cornea, pupil, and iris. It may then be turned off to allow background free image collection from other light sources such as the laser diode 212, or any other light source that may be provided.
  • the laser diode 212 may project a laser beam through the collimator lens 214 and through a hologram 210.
  • the hologram 210 image reflects off the mirror 218 and shines on to the cornea 222.
  • the hologram image reflects to a first image sensor 208 and a second image sensor 206 as shown by the arrows.
  • the side image sensor 204 does not capture any image from the hologram reflection of the cornea 222.
  • an eyewear 302 device may be provided as shown in FIG. 3.
  • the eyewear 302 device may have a frame 306 holding a first lens 326 and a second lens 328.
  • Image sensor 304, 308, 324 may be attached to the inside (facing the eye) of the lenses or the frame.
  • a light source 310 may be positioned near the nose bridge of the eyewear frame 306.
  • a cross section of a lens 326, 328 is shown.
  • the eyewear lens has an array of spherical defects, which may also be sparsely positioned illuminators 320.
  • a side illuminator 316 may project a line into the lens.
  • a low refractive index cladding layer 318 and a linear polarization film 322 form the front layer of the lens. The light from the side illuminator 316 travels through the lens.
  • the eyewear according to an embodiment may be fitted with planar side illuminator 312, 316, as well as an array of sparsely positioned illuminators 320 that may be embedded into the front cover of the lens of the eyewear 302.
  • a linear polarization film 322 may allow one (vertical, horizontal or other planar orientation) polarization from the ambient light into the eyewear 302 to facilitate vision while blocking the other polarization. This relationship may help the eyewear to work without interference of any ambient light at the linear polarization film 322.
  • a front image sensor 308, secondary image sensor 324 and a sideview image sensor 304 may have a crossed polarizer that may block the ambient light admitted by the linear polarization film 322.
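The crossed-polarizer arrangement above follows directly from Malus's law: the intensity transmitted through an analyzer falls off as the squared cosine of the angle to the incoming polarization, so at 90 degrees (crossed) the ambient light admitted by the front film is blocked. The small sketch below is a textbook illustration of that physics, not code from the disclosure.

```python
import math

def transmitted_intensity(i0, angle_deg):
    """Malus's law: intensity transmitted through an analyzer oriented at
    `angle_deg` to the incoming linear polarization. Crossed polarizers
    (90 degrees) block the ambient light admitted by the front film."""
    return i0 * math.cos(math.radians(angle_deg)) ** 2
```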
  • a drug delivery device may be incorporated into the eyewear to dispense drugs for IOP control based on the IOP readings.
  • a waveguide approach to generating a see-through illumination pattern may be seen in the diamond shaped arrows in the cross section image of the lens.
  • the windows of the eyewear have an array of spherical defects 314 and may be illuminated by a side illuminator 316 from within the lens.
  • the lens may be coated with a low refractive index cladding layer 318 to separate the waveguide from the linear polarization film 322.
  • An illustration of the cross section of corneal deflection is shown in FIG. 4.
  • the illustration shows two curves, one raised slightly above the other.
  • the top curve illustrates the corneal displacement at 30 mmHg (30 mm of mercury pressure) and the bottom curve shows the corneal displacement for half that pressure, or 15 mmHg.
  • the illustration provides two examples where the radius and the apex of the cornea may change due to IOP within the eye.
  • An example of a ray trace diagram is now shown in FIG. 5.
  • a point source 502 may project light on to the surface of the cornea 506. The light rays may be reflected off the cornea 506 and form one or more reflection 504.
  • the curvature of the cornea 506 as well as the angle of incidence and angle of reflection may be determined using the known position of the point source 502 relative to the cornea, the known angle of image capture by one or more image sensors, and the dispersion of the light from the point source as seen in the images captured.
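The geometry in FIG. 5 can be reproduced numerically: a ray from a point source is intersected with a sphere standing in for the cornea, and the reflected direction follows the law of reflection (r = d - 2(d·n)n). This is a generic ray-tracing sketch under a spherical-cornea assumption, not the disclosure's ray tracer.

```python
import numpy as np

def reflect_off_sphere(origin, direction, center, radius):
    """Trace a ray to a sphere (a simplified cornea surface) and return the
    hit point and reflected direction, or None if the ray misses."""
    o = np.asarray(origin, float)
    d = np.asarray(direction, float)
    d = d / np.linalg.norm(d)
    c = np.asarray(center, float)
    oc = o - c
    # Quadratic for |o + t*d - c|^2 = radius^2 (leading coefficient is 1).
    b = 2.0 * d.dot(oc)
    cc = oc.dot(oc) - radius ** 2
    disc = b * b - 4.0 * cc
    if disc < 0:
        return None                      # ray misses the cornea
    t = (-b - np.sqrt(disc)) / 2.0       # nearest intersection
    hit = o + t * d
    n = (hit - c) / radius               # outward surface normal
    reflected = d - 2.0 * d.dot(n) * n   # law of reflection
    return hit, reflected
```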
  • the y-axis 508 and x-axis 510 are provided for reference.
  • an example ray trace from multiple point source 602 lighting may be arranged around the cornea 608.
  • the light from each of the many point source 602 lighting may be captured at image sensor 604 and image sensor 612, producing real image 606 and real image 610 respectively.
  • Virtual images 614 may also be formed.
  • an example ray trace illustration for two different cornea radii is shown in FIG. 7.
  • the two example corneal IOP pressures are 15 and 30 mmHg.
  • a series of multiple point sources 702 are arranged around the cornea.
  • a front image sensor 704 and a side image sensor 712 are positioned to capture real image 706 and real image 710 respectively.
  • light from the multiple point sources 702 bounces off the cornea and the reflected light may be captured by the image sensors 704, 712.
  • the 15 mmHg cornea 716 has a lower y-axis projection, or a larger radius of curvature.
  • the 30 mmHg cornea 708 has a higher y-axis projection and a smaller radius of curvature.
  • the two cornea pressures may also cause the creation of two different virtual images, a 15 mmHg virtual image 714 and a 30 mmHg virtual image.
  • the virtual cornea images may be formed below the surface of the cornea.
  • the positions of the spots corresponding to the multiple point sources in the real images may be different for the two IOP values (15 and 30 mmHg), demonstrating the possibility of using such images to calculate IOP values for the eye.
  • An example coordinate system is shown in FIG. 8.
  • the origin of the spherical coordinate system may be the center of vision for the eye, or an arbitrary position along the cornea or inside the eye. Note that in the various embodiments, the orientation of the x-Axis does not reduce generality.
  • the shifting of the cornea in a direction may be detected as shown in FIG. 9.
  • the multiple point sources 902 are arrayed around the cornea.
  • a first image sensor 904 may capture a first real image 906, while a second image sensor 910 may capture a second real image 908.
  • Second real image 908 may vary from one image to another based on the x-axis shift of the cornea over time.
  • a left shift cornea 912 may be slightly shifted from the position of a right shift cornea 914, with corresponding left shift virtual image 918 and right shift virtual image 916 respectively.
  • the shift in the cornea may be imaged, and used to determine the shift in the X-position of the cornea.
  • Image analysis may be used to correlate the image data to produce reliable x-shift information.
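One simple form such image analysis could take is cross-correlating a reference spot profile against a newly captured one: the lag at which the correlation peaks gives the lateral shift in pixels. This 1-D sketch is an assumed illustration of the correlation idea, not the disclosed algorithm.

```python
import numpy as np

def estimate_shift(ref_profile, new_profile):
    """Estimate the lateral (x) shift between two 1-D spot intensity
    profiles via cross-correlation; returns the shift in samples."""
    corr = np.correlate(new_profile, ref_profile, mode="full")
    # In 'full' mode, index len(ref)-1 corresponds to zero lag.
    return np.argmax(corr) - (len(ref_profile) - 1)
```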
  • the z position shift of the cornea may be determined as shown in a ray trace illustration as shown in FIG. 10.
  • multiple point sources 1002 produce light that reflects off the cornea.
  • the reflected light may be captured by image sensor 1004 and side image sensor 1008.
  • Real image 1006 and real image 1010 are collected from the image sensors.
  • the cornea of the eye may shift in a z axis direction.
  • Virtual images may be similarly adjusted, producing a positive virtual image 1016 that may correspond to the z shift positive 1012 and a negative virtual image 1018 that may correspond to the z shift negative 1014 cornea position.
  • the positions of the spots in the real images 1006, 1010 represent different Z positions. The difference may be used to extract the Z position of the cornea through analysis of one or more of the various images.
  • the angular (theta) shift can be determined using ray trace images as shown in FIG. 11.
  • the light may reflect off the cornea and images may be captured in a front image sensor 1104 and side image sensor 1110, each producing real image 1106 and real image 1108 respectively.
  • the cornea theta positive 1112 may represent a positive shift in the theta direction, while a cornea theta negative 1114 may represent a negative theta position shift.
  • a positive virtual image 1116 and negative virtual image 1118 may also be detected.
  • the positions of the spots in real images 1106, 1108 may represent two different theta tilt positions.
  • a side view image of an eye may be seen in FIG. 12, captured through a side facing image sensor (not shown), while the eye is illuminated using a matrix pattern from the front.
  • another example of illumination, using laser energy formed into lines, is shown in FIG. 13.
  • FIG. 13 also shows a computation for laser lines incident on the cornea under two different pressure levels. It can be seen that the lines intercept the cornea at different positions for different IOP values.
  • when the crescent shaped curve images are analyzed along with images of the point sources, the images contain enough information to accurately estimate the eye position with respect to the eyewear position, the corneal radius and the IOP.
  • the intercept positions of a multitude of laser energy may be formed into spots by the hologram, and may be calculated for two different IOP values as shown in FIG. 14.
  • a video frame capture from a front view camera inside the eyewear shows multiple laser energy spots, as shown in FIG. 15.
  • the reflection of the illuminating spots from the cornea, similar to the positions of spots previously described, may be visible by a front camera.
  • a side view of a model cornea under two different pressure settings may be seen in FIG. 16.
  • the left figure represents the curvature and bulge of the cornea model when the model is exposed to 15 mmHg of fluid pressure.
  • the right figure shows a slight increase in the bulge of the model when exposed to 50 mmHg pressure.
  • the curvature of the model cornea may also be changing as the pressure increases or decreases. The curvature and bulge may be measured using the various techniques described herein.
  • the curvature of the cornea may be captured in images, and quantified through analysis as shown in FIG. 17.
  • the images may be processed to extract the interface between the cornea and air and to perform a polynomial fit to the extracted curves.
  • the curvature and peak position may be separately extracted and plotted as shown in the top left and right plots.
  • the changes in applied pressure may be accurately extracted from the fitted curves with a noise level below about 1 mmHg.
  • the fits may be high order polynomials, allowing baseline shifts due to linear positional shifts to be reduced or eliminated.
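The polynomial-fit analysis described above can be sketched as follows: fit a polynomial to the extracted cornea/air interface, locate the apex, and evaluate the curvature there via kappa = |y''| / (1 + y'^2)^(3/2). The function below is an assumed illustration of that workflow (names and the degree are not from the disclosure).

```python
import numpy as np

def fit_corneal_profile(x_mm, y_mm, degree=6):
    """Fit a high-order polynomial to an extracted cornea/air interface,
    then report the apex x-position and the radius of curvature at the
    apex (1/kappa); a sketch of the analysis illustrated in FIG. 17."""
    coeffs = np.polyfit(x_mm, y_mm, degree)
    p = np.poly1d(coeffs)
    xs = np.linspace(x_mm.min(), x_mm.max(), 2001)
    apex_x = xs[np.argmax(p(xs))]          # peak of the fitted profile
    dp, ddp = p.deriv(1), p.deriv(2)
    kappa = abs(ddp(apex_x)) / (1.0 + dp(apex_x) ** 2) ** 1.5
    return apex_x, 1.0 / kappa             # apex position, radius (mm)
```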
  • the image data from the image sensors on the eyewear may be input to a deep neural network that may be composed of image processing components, to reduce the image data to a set of data points.
  • the image processing pipeline may contain trained feature extractors or matched-filtering, edge detection algorithms, filtering algorithms and/or other filters and algorithms.
  • the use of several image sensors may allow determination of the position of the eye with respect to the illumination and eyewear image sensors as well as the head of the user.
  • the algorithms may then be used with neural networks and conventional mathematical fitting methods to extract with high precision the curvature of the cornea.
  • the schematic diagram of the training method for the neural network/deep neural network may involve having the user undergo standard ophthalmologic measurements. These measurements may give accurate values for personal values of cornea thickness, position, and corneal topography in relation to a reference IOP level. The user may also undergo a brief data collection process where the eyewear may be used and reference data may be collected at the given IOP. In this fashion the eyewear may be calibrated to an individual user. All of this may be data collected from a reference system or systems. These measurements may personalize the system for a user with a unique personal corneal topography.
  • the data collected from personalized measurements may then be fed, along with a computational model (“Geometrical Parameter generator” and “Corneal/Anatomical parameter generator”), into a ray tracing system to generate large amounts of image data for a wide variety of parameters.
  • the outputs may then be used with the NN/DNN that contains an image processing pipeline to estimate corneal radius and IOP.
  • the locations of spots in the real images from the various image sensors may be calculated for a variety of cornea positions and tilts, as well as cornea radii using ray tracing simulations.
  • the locations of the spots and the widths of the spots may be extracted from the ray tracing simulations and may be formed into vectors to be input into the neural network training software, and the original cornea positions may be fed in as the desired outputs.
  • the training procedure with a large data set may permit this highly nonlinear problem to be solved by the neural network with sufficient speed and accuracy.
  • the material shown in FIG. 19 may be considered “pseudo-code” that summarizes the steps of data generation from ray-tracing simulations and the formatting of the data to train the neural network.
  • the eyewear may use image sensors, such as cameras, to capture images.
  • the captured images may be combined with the personal ophthalmologic and anatomic data.
  • the images may be fed into the deep neural network (DNN) with an image processing front-end, to achieve an IOP estimate.
  • the IOP estimates may be updated at video rates, providing near real time output.
  • the pipeline for data processing may be seen in greater detail in FIG. 21.
  • the exposure level, gain, brightness and contrast settings of the image sensor may be adjusted rapidly to capture non-saturated images. In some embodiments this adjustment may be done for each light source, even if there may be multiple point sources as described herein.
  • the images may be evaluated for image saturation, and if the image saturation is too high, the gain and exposure of the image sensor may be adjusted, and the image taken again. If the image saturation is acceptable, the images may be passed through a threshold filter, eliminating the non-relevant background signals.
  • High resolution images may be stored in a temporary memory. The high resolution images may be used to create blurred and lower resolution images (which may be useful for faster processing).
  • the low-resolution images may then be passed through a match-filter or feature detection filter to locate point matrix pattern positions and angles.
  • This function may allow the filter to identify each pinpoint of light in the image and match that pinpoint of light to the corresponding multiple point sources of light in each of the real images.
  • the process may then calculate the coarse positions of each point of light in the real images from the image sensors.
  • the process may then produce the appropriate x and y coordinates for each real image.
  • the coarse locations may then be used to segment each point domain and calculate peak position and peak width of each point in the high resolution real images with accuracy.
  • the accurate x and y coordinates for each point in the matrix pattern for each image sensor (camera), as well as the widths of the peaks, may then be produced.
  • the coordinate data, along with the cornea reference properties may then be fed into the neural network or deep neural network.
  • the cornea reference properties may include, by way of
  • the topography of the cornea may be used with the previously trained neural network/DNN to estimate cornea position x, y, z, theta and phi in the image sensor coordinate system and the corneal radius (radius of curvature).
  • a nonlinear equation solver may be used to convert the radius of curvature into an IOP reading.
  • the IOP reading may be used with a lookup table (not shown) to determine a dose of a drug.
  • the drug dose may then be dispensed through the drug delivery device.
  • the pipeline for data processing may be adjusted to include a switching between different illumination sources at the beginning of the pipeline as shown in FIG. 22.
  • the switching between different illumination sources may allow facile separation of image spots in the real images corresponding to different light sources, thereby speeding up the image processing, as well as improving accuracy of data collection.
  • the virtual images generally may not be used themselves in the process.
  • the real images may be formed from the virtual images after the image sensors focus light from the virtual images onto the imaging planes of the various image sensors.
  • the advantages of the present invention include, without limitation, a robust process for making highly sensitive wearable contact lens sensors that have no electrical power or circuits and can be monitored remotely by a simple camera, like one found in a mobile phone.
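The spot-detection steps listed above (saturation check, background thresholding, a low-resolution pass for coarse spot locations, then refinement on the full-resolution frame) can be sketched as follows. The thresholds, downsampling factor, and window size are illustrative assumptions, and the centroid fit stands in for whatever peak-fitting routine the described system actually uses.

```python
import numpy as np

def find_spots(frame, sat_level=255, sat_frac=0.01, thresh=50, win=5):
    """Coarse-to-fine spot localization, as in the pipeline above."""
    if np.mean(frame >= sat_level) > sat_frac:
        return None  # too saturated: caller should adjust gain/exposure and retry
    clean = np.where(frame > thresh, frame, 0).astype(float)
    h, w = clean.shape
    # Low-resolution copy (block maximum) for fast coarse detection.
    coarse = clean[: h // 4 * 4, : w // 4 * 4].reshape(h // 4, 4, w // 4, 4).max(axis=(1, 3))
    spots = []
    for by, bx in zip(*np.nonzero(coarse)):
        cy, cx = by * 4 + 2, bx * 4 + 2  # coarse center in full-res coordinates
        y0, y1 = max(cy - win, 0), min(cy + win + 1, h)
        x0, x1 = max(cx - win, 0), min(cx + win + 1, w)
        patch = clean[y0:y1, x0:x1]
        total = patch.sum()
        if total == 0:
            continue
        gy, gx = np.mgrid[y0:y1, x0:x1]
        # Intensity-weighted centroid; adjacent blocks may report the same
        # spot, and deduplication is omitted in this sketch.
        spots.append((float((gy * patch).sum() / total),
                      float((gx * patch).sum() / total)))
    return spots
```

A saturated frame returns `None`, signaling the caller to adjust exposure and retake the image, mirroring the retry loop described above.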


Abstract

A wearable eyewear device, methods of use and systems are described that allow a person wearing the eyewear device to accurately measure the intraocular pressure of their eye, and dispense a medication to the eye when needed.

Description

METHOD AND DEVICE FOR REMOTE OPTICAL MONITORING OF
INTRAOCULAR PRESSURE
CROSS-REFERENCE TO RELATED APPLICATIONS
[0000] This application claims priority from US Provisional Application 62/790,752 (attorney docket number 48675-707.101) entitled “Method and Device for Remote Optical Monitoring of Intraocular Pressure,” filed January 10, 2019, the contents of which are hereby incorporated by reference in their entirety.
BACKGROUND
[0001] FIELD
[0002] The present disclosure is related to a system and methods of using a wearable optical imaging sensor system for measuring intraocular pressure.
[0003] BACKGROUND
[0004] Glaucoma is the second most common cause of blindness worldwide. It is a multifactorial disease with several risk factors, of which intraocular pressure (IOP) is the most important. IOP measurements are used for glaucoma diagnosis and patient monitoring. IOP has wide diurnal fluctuation and is dependent on body posture, so occasional measurements made by an eye care expert in a clinic can be misleading.
[0005] Previously (US20160015265A1, 2018), an implantable microfluidic device was proposed for intraocular pressure monitoring that can be used for glaucoma diagnosis. Later, a wearable device was demonstrated (Lab Chip, 2018, 18, 3471-3483) to serve the same purpose, without requiring implantation. These previous studies established that increases in intraocular pressure result in bulging of the cornea and consequent changes in the radius of curvature.
[0006] In the literature, it has been shown that IOP changes affect the corneal topography, causing changes in corneal radius and apex height with respect to the corneal periphery. If the corneal topography can be measured accurately, the roughly 4 micrometer change in corneal radius per 1 mmHg change in IOP can be monitored and the IOP value inferred.
[0007] Thus there remains a need for an IOP measuring device that can take multiple measurements of a patient's eye throughout the day as the patient goes through their normal routine.
[0008] There is also a need for a device with sufficient sensitivity to produce reliable data for accurate diagnosis.
[0009] There is still further a need for such a device to operate in a manner that does not interfere with a patient's normal vision and activities.
[0010] There is still further a need for a device that can operate reliably while a patient carries on their normal daily activities, and the device does not require a particular critical position or alignment relative to the patient's eyes. The device should be user friendly.
BRIEF SUMMARY
[0011] These and other objectives may be met using the device, system and methods described herein. In various embodiments, the present disclosure relates to a wearable optical device (such as eyewear, goggles or visors) for measuring the intraocular pressure of an eye.
[0012] In an embodiment, there may be an eyewear device for measuring intraocular pressure (IOP). The device may have a frame and a first lens mounted to the frame such that the lens may be in the field of view of a person wearing the eyewear device. The eyewear device may have a first illumination source positioned to illuminate the eye of a user, a first image sensor positioned to capture images of the eye of the user, a first communication portal in electronic or signal communication with a computational device, and a first drug dispensing device aligned to deliver a dose of a drug to the eye of the user.
[0013] In another embodiment there may be a method of training an image processing pipeline. The method involves collecting personalized ophthalmologic data on a user's anatomy and a user's corneal properties at a known IOP, collecting personalized data from an eyewear device for measuring IOP, and using at least one computational model and ray tracing under one or more geometric configurations to generate at least one set of training data for a neural network component of the pipeline.
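The training idea above, generating many simulated observations over a range of corneal parameters and fitting a model that maps observations back to those parameters, can be sketched as follows. The forward model here is a toy stand-in for the ray tracing system (spot positions simply scale with the convex-mirror focal length R/2 plus sensor noise), and a linear least-squares fit stands in for the neural network; all numeric ranges are assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def simulate_spots(radius_mm, n_spots=8, noise_mm=0.001):
    # Toy forward model standing in for ray tracing: spot positions scale
    # with the corneal convex-mirror focal length (R/2) plus small noise.
    base = np.linspace(0.5, 1.5, n_spots)
    return base * (radius_mm / 2.0) + rng.normal(0, noise_mm, n_spots)

# Generate training data over a plausible range of corneal radii (mm).
radii = rng.uniform(7.4, 8.2, 2000)
X = np.stack([simulate_spots(r) for r in radii])

# Linear least-squares "regressor" for the sketch (a DNN would replace this).
A = np.column_stack([X, np.ones(len(X))])
w, *_ = np.linalg.lstsq(A, radii, rcond=None)

def predict_radius(spots):
    """Map a vector of spot positions back to an estimated corneal radius."""
    return float(np.append(spots, 1.0) @ w)
```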
[0014] In another embodiment there may be a system for measuring and treating the IOP of a patient. The system may have: a computational device, a wearable eyewear device for collecting IOP data and being in signal communication with the computational device, the eyewear device having a drug dispensing component. The system may also have a database containing a user profile including personalized ophthalmologic reference data where the database may be accessed by the computational device, and where the database, and the IOP data are used to determine a treatment regimen for a user's eye. A drug delivery component on the eyewear may deliver the drug in response to a signal from the computational device.
[0015] In various embodiments, the computational device may be a cell phone, a tablet or a laptop computer. In still other embodiments, the computational device may be attached to the wearable eyewear device.
[0016] Devices, systems and methods are described herein using eyewear with one or more illuminators and one or more image sensors. The combination of illuminator(s) and image sensor(s) may operate to eliminate one or more of ambient lighting changes and/or misalignment error, while providing a sensitive and accurate measurement of the corneal radius. A small change of the radius of curvature (as small as 4 micrometers per 1 mmHg change in IOP) may be observed for a typical adult cornea. The optical design may allow image processing and sensor fusion, as well as machine learning, to accurately and sensitively measure radius of curvature changes in the cornea. The measured changes may be used in a calculation using a machine learning program, a learning neural network, an artificial intelligence program, or another analytic computational program to relate the measured changes in radius to the IOP. The method may use a preliminary characterization of the corneal thickness and topography in which the radius of curvature at a known IOP reading is acquired by conventional ophthalmologic methods. The personalized data set may then be used as an input into the data processing algorithms, which also use continuous imaging measurements from the eyewear to calculate the IOP. The data may be connected to a computational device such as a cell phone or the cloud, and the eyewear may dispense a drug using a drug dispensing device. The drug may help reduce the IOP of the eye. The present disclosure includes a wearable optical device that measures the IOP through image acquisition from one or more image sensors, uses the image data along with reference data for a particular individual to accurately determine the IOP, and may dispense drugs to the eye to control the IOP.
BRIEF DESCRIPTION OF THE SEVERAL VIEWS OF THE DRAWINGS
[0017] Reference is now made to the drawings in brief, where like part numbers refer to the same part. Otherwise different part numbers, even if similar to other part numbers, represent different parts of different embodiments. Elements in the illustrations are not shown to scale unless specifically indicated, and may be distorted to some degree to emphasize the element or some characteristic of the element. Not all parts are shown in all embodiments so that the view of the figure does not become unnecessarily distorted.
[0018] FIG. 1 illustrates an optical imaging sensor goggle for measuring the intraocular pressure remotely (IOP goggle) according to an embodiment.
[0019] FIG. 2 illustrates a top view of an IOP goggle according to an embodiment.
[0020] FIG. 3 illustrates a goggle according to an embodiment.
[0021] FIG. 4 illustrates the change in corneal topography when the IOP changes from 15 to 30 mmHg according to an embodiment.
[0022] FIG. 5 illustrates a schematic ray trace showing optical beams bouncing off a cornea according to an embodiment.
[0023] FIG. 6 illustrates a schematic ray trace showing optical beams forming images according to an embodiment.
[0024] FIG. 7 illustrates a schematic ray trace that shows images of the point-sources at the camera's image planes when the corneal radius changes according to an embodiment.
[0025] FIG. 8 illustrates a coordinate system used to describe the corneal position according to an embodiment.
[0026] FIG. 9 illustrates a schematic ray trace showing corneal X-position changes according to an embodiment.
[0027] FIG. 10 illustrates a schematic ray trace showing corneal z-position changes according to an embodiment.
[0028] FIG. 11 illustrates a schematic ray trace showing image changes when the corneal angular position changes according to an embodiment.
[0029] FIG. 12 illustrates a video frame capture according to an embodiment.
[0030] FIG. 13 illustrates an alternative video frame capture according to an embodiment.
[0031] FIG. 14 illustrates a calculation showing changes in positions of laser points according to an embodiment.
[0032] FIG. 15 illustrates a video frame capture according to an embodiment.
[0033] FIG. 16 illustrates example data extracted from video captures according to an embodiment.
[0034] FIG. 17 illustrates a graph of data using a polynomial fit according to an embodiment.
[0035] FIG. 18 illustrates a schematic of data processing according to an embodiment.
[0036] FIG. 19 illustrates a sample logic according to an embodiment.
[0037] FIG. 20 illustrates a data processing flow chart according to an embodiment.
[0038] FIG. 21 illustrates an example data processing pipeline according to an embodiment.
[0039] FIG. 22 illustrates another example data processing pipeline according to an embodiment.
DETAILED DESCRIPTION
[0040] The present disclosure describes wearable eyewear, systems and methods for measuring the cornea of an eye, and determining the intraocular pressure of the measured eye based on the curvature of the cornea. The disclosure includes eyewear, a computational device for calculating IOP values based on cornea data collected by the eyewear, and methods for calculating the IOP and dispensing a drug to the eye when needed.
[0041] The eyewear as described herein may take a variety of forms. The form factor may be one of choice for the user, or for the user's optometrist or other professional medical person responsible for the user's eye health. In some embodiments, the form factor may include a frame and a lens. The frame may be one that the user may wear in front of his eyes (note that male and female pronouns may be distributed herein randomly. The disclosed technology is not dependent on the gender of the user. The interchanging use of the gender of the user or other persons described herein is simply for the convenience of the applicant). The frame may be any sort of eyewear frame used for modern eyewear, including frames for sunglasses, vision correction glasses, safety glasses, and goggles of all types (e.g. swimming, athletic, safety, skiing, and so on). The frame may be suitable for a single lens for one eye, a lens for two eyes (e.g. a visor), or a single lens and an eye cover (such as for persons with “lazy eye” or who may suffer from the loss of one eye). The lens may be a prescription lens for vision correction, a clear or tinted lens for appearance, or an opaque lens that covers the eye. In many embodiments, the lens may have a defined area for the field of view of the user. The field of view may be clear to avoid blocking the vision of the user. The various elements of the eyewear device may be placed on the periphery of the lens, or on the frame. The frame or lens may have flanges or other protrusions or tabs for the attachment of image sensors, light sources, a battery, computational devices, drug delivery devices, or any other component suitable for use with the present disclosure.
[0042] The wearable eyewear may have one or more image sensors positioned to face the eye(s) of the user so the image sensor may capture an image of the eye. The image sensor may be a camera, a CCD (charge coupled device), a CMOS (complementary metal oxide semiconductor) sensor, or other image capture technology. The wearable eyewear may have one or more light sources for projecting light at the eye. In some embodiments, the light source may be a form of illumination that produces specific wavelengths of light. The light emission may be at a shallow angle to the curvature of the cornea, and projected outside the lens portion of the eye so that the light does not interfere with the user's normal vision. In some embodiments the light source may be a laser. In some embodiments the light source may be an LED (light emitting diode), and in other embodiments the light source may be any light generating technology now known or still to be developed.
[0043] In various embodiments, the light source(s) and image sensor(s) may be positioned so that images captured by the image sensor are able to ignore ambient light, glare or other optical artifacts that might interfere with the accurate reading of the change in cornea curvature. The light source and the image sensor may use one or more polarizing filters to substantially reduce or eliminate light of a particular polarization, wavelength or intensity, so the captured image may have greater reliability and less signal noise. In another embodiment the eyewear may have a light sensor to help regulate when the ambient lighting conditions are appropriate for taking a suitable image of the eye to determine the cornea curvature. The images captured by the image sensors may be stored locally for a period of time, or transmitted to a computational device via a communication portal.
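The polarization filtering described above follows Malus's law: the intensity transmitted through an analyzer at angle theta to the light's polarization axis is I = I0 cos^2(theta), so a crossed (90 degree) polarizer pair ideally extinguishes the ambient light admitted by the first film. A small sketch of that relation, with illustrative values:

```python
import math

# Malus's law for the crossed-polarizer arrangement described above.
# Transmitted intensity I = I0 * cos^2(theta); at 90 degrees (crossed),
# the ambient light admitted by the first film is ideally extinguished.
def transmitted_intensity(i0, angle_deg):
    return i0 * math.cos(math.radians(angle_deg)) ** 2
```

In practice, extinction is limited by the finite contrast ratio of real polarizing films, which this ideal formula does not model.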
[0044] In some embodiments, the communication portal may be an antenna for wireless transmission of data to a computational device. The communication portal may send and receive information, such as sending image data, and receiving dosing information for a drug delivery device. In various embodiments, the computational device may be a cell phone, a tablet computer, a laptop computer, or any other computational device a user may select to carry out program (App) functions for the eyewear device. In some embodiments, the computational device may be resident on the eyewear. In some embodiments, the communication portal may be a wired connection between the image sensors, the light sources, the computational device, and a power supply for all the electrical components. In still other embodiments, the communication portal may connect the eyewear to the cloud.
[0045] In an embodiment, there is a method for determining the IOP of an eye. In some embodiments, the method may use a basic operation pipeline. The pipeline may receive image data from a variety of sources. In some embodiments the image data may come from the eyewear as it is worn by a user. In some embodiments the image data may come from a database having stored ophthalmologic data of the user at a fixed point in time. In some embodiments the images may be anatomic data of a user from a fixed point in time. In an embodiment, some or all the available image data may be used in a deep neural network with an image processing front-end. The image processing front-end may derive or calculate an IOP reading. In some embodiments, the IOP reading may be updated at video data rates, providing a quasi-real time output.
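The inference step of the pipeline above, a feature vector in and an IOP estimate out, can be sketched as a minimal forward pass. The network weights here are random placeholders (in the described system they would come from training on ray-traced and personalized reference data), so the output is illustrative only.

```python
import numpy as np

rng = np.random.default_rng(1)

def relu(x):
    return np.maximum(0.0, x)

class TinyIOPNet:
    """Minimal stand-in for the DNN front-end described above."""
    def __init__(self, n_features=16, hidden=32):
        # Placeholder weights; a trained network would load learned values.
        self.W1 = rng.normal(0, 0.1, (n_features, hidden))
        self.b1 = np.zeros(hidden)
        self.W2 = rng.normal(0, 0.1, hidden)
        self.b2 = 15.0  # output bias near a plausible resting IOP (mmHg)

    def __call__(self, features):
        h = relu(features @ self.W1 + self.b1)
        return float(h @ self.W2 + self.b2)
```

Run per frame, a forward pass this small is far cheaper than image capture, which is what makes the quasi-real-time (video rate) update plausible.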
[0046] In another embodiment, the data pipeline may cause an image sensor to change exposure levels, gain, brightness and contrast in order to capture non-saturated images. The images may be passed through a threshold filter to reduce or eliminate background noise. Some high resolution images may be stored in a temporary memory for rapid processing, while blurred, low resolution copies are formed. The low resolution images may then be passed through a match filter or feature detection filter to pinpoint spots corresponding to particular illumination/light sources in the various captured images. The coarse locations may then be used to segment the high resolution images and apply peak fitting algorithms to individually determine the positions and widths of each peak in the images. The resulting peak locations and widths may then be used with the previously trained neural network to estimate cornea coordinates and radius of curvature. A nonlinear equation solver may be used to convert the radius of curvature into an IOP reading.
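The final step above inverts a generally nonlinear radius-versus-IOP relation. A bisection solver is one simple way to do this; the `radius_model` below is a placeholder linear calibration (reusing the ~4 um per mmHg figure cited earlier), assumed monotonically decreasing, not the system's actual model.

```python
def radius_model(iop_mmhg, ref_radius_um=7800.0, ref_iop=15.0):
    # Placeholder calibration: radius shrinks ~4 um per mmHg above reference.
    return ref_radius_um - 4.0 * (iop_mmhg - ref_iop)

def solve_iop(measured_radius_um, lo=5.0, hi=60.0, tol=1e-6):
    """Invert radius_model by bisection over a physiological IOP bracket."""
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if radius_model(mid) > measured_radius_um:
            lo = mid  # modeled radius too large -> true IOP is higher
        else:
            hi = mid
    return 0.5 * (lo + hi)
```

Bisection is chosen for the sketch because it needs only monotonicity, not derivatives; a production system could use any standard nonlinear root finder.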
[0047] In an embodiment, the IOP reading may then be used to determine a drug dose to administer to the eye being monitored. The drug dose information may be relayed back through the communication portal to the eyewear and the drug dispensing device. The drug dispensing device may then administer the proper dose to the eye. In some embodiments, the drug delivery device may use an atomizer. In other embodiments the drug delivery device may use eye drops. In still other embodiments, the computational device may provide an alert to the user to self-administer a drug of a certain dose at a certain time.
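A dose lookup of the kind described above (and the lookup table mentioned later in this disclosure) can be sketched as banded thresholds. The threshold and dose values below are invented placeholders for illustration, not clinical guidance.

```python
import bisect

# Illustrative IOP-to-dose lookup table; values are placeholders only.
IOP_THRESHOLDS = [18.0, 22.0, 28.0]   # mmHg band edges
DOSES_UL = [0.0, 5.0, 10.0, 15.0]     # microliters dispensed per band

def dose_for_iop(iop_mmhg):
    """Return the dose for the band containing the given IOP reading."""
    return DOSES_UL[bisect.bisect_right(IOP_THRESHOLDS, iop_mmhg)]
```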
[0048] As described herein, a wearable eyewear device may be coupled to a computational device to measure the IOP of a user's eye. The user may be the person wearing the eyewear unless the context of the usage clearly indicates otherwise.
[0049] Various aspects, embodiments and examples are described that may be imprecise. In medical technology and treatment, diagnosis, drug prescription and usage, as well as therapy regimens, may not be the same for every person due to nuances in individual biology. Thus various embodiments described herein may use a term such as “generally” or “substantially.” These terms should be understood to mean that, due to variations of people and of eyes, from each other and from one person to the next, there may necessarily be variations in how some embodiments operate in calculations, in communications, in data manipulation and in treatment. We refer to “generally” and “substantially” as including any variation that fits the spirit of the present disclosure.
[0050] Reference is made herein to various components and images. The use of the references are to help guide the reader in a further understanding of the present disclosure. In particular, while the singular version of a noun is often used, it should be understood that the embodiments fully consider plural numbers of components and images to also be within the scope of the disclosure.
[0051] Referring now to FIG. 1, an eyewear 102 device having a frame 104 and a lens 106 may be provided. The lens 106 may have a first light source 108, and one or more image sensors 112, 114 and 116 placed on it. In other embodiments, any one of the light source 108 or image sensors may be placed on the frame 104. In some embodiments the image sensors and light source 108 may be placed on either the frame 104, the lens 106, or partially on both. The eyewear 102 may also have a drug delivery device 110 positioned to deliver a medication directly to the eye. The drug delivery device 110 may be an atomizer or other aerosol device, a dropper or any other device for delivering medication to the eye. In some embodiments, the drug delivery device 110 may be a mist applicator. In some embodiments, the mist applicator may be a MEMS (micro-electro-mechanical systems) atomizer with a cartridge for the dispensed drug that may be replaced when needed. A controller 118 may control the individual image sensors, the light source 108 and the drug delivery device 110. The controller 118 may be connected to the other components via a wire or cable connection, or by using a short range wireless communication protocol to each. In some embodiments, each component may have its own power source. In some embodiments, a single power source may be wired to each of the components to power all the components as needed. In some embodiments, a combination of power sources, local and central, may be used.
[0052] In various embodiments, the power supply to the controller and other components may be replaceable. In some embodiments, there may be a drug reservoir (not shown) associated with the drug delivery device 110, and the drug reservoir may be replaceable or refillable. In the drawing, the components are depicted as large blocks for illustration purposes only. The components are not to scale on the eyewear 102 and no interpretation of the size of the components should be assigned to them based on the drawing figure. The drawing figures are for illustration purposes only.
[0053] In an embodiment, there may be an optical design for the eyewear 202 as shown in FIG. 2. The eyewear 202 may be fitted with a side illuminator made up of a planar illuminator 216, a laser diode 212 collimated by a collimator lens 214 and multiplied into a pattern by a hologram 210. The assembly of the planar illuminator 216, laser diode 212 and collimator lens 214 may make up a light source 220. The hologram 210 may be relayed towards the cornea 222 by a mirror 218. The reflections of the hologram 210 off the cornea 222 may be captured by one or more image sensors 204, 206, 208. In an embodiment, the planar illuminator 216 may provide wide angle and uniform illumination, allowing the image sensors to acquire images of the eye. The planar illuminator 216 may be turned on to acquire a background image of the cornea, pupil, and iris. It may then be turned off to allow background-free image collection from other light sources such as the laser diode 212, or any other light source that may be provided.
[0054] In an embodiment, the laser diode 212 may project a laser beam through the collimator lens 214 and through a hologram 210. The hologram 210 image reflects off the mirror 218 and shines on to the cornea 222. Depending on the curvature of the eye, the hologram image reflects to a first image sensor 208 and a second image sensor 206 as shown by the arrows. In this embodiment, the side image sensor 204 does not capture any image from the hologram reflection off the cornea 222.
[0055] In an embodiment, an eyewear 302 device may be provided as shown in FIG. 3. In an embodiment, the eyewear 302 device may have a frame 306 holding a first lens 326 and a second lens 328. Image sensors 304, 308, 324 may be attached to the inside (facing the eye) of the lenses or the frame. A light source 310 may be positioned near the nose bridge of the eyewear frame 306.
[0056] In an embodiment, a cross section of a lens 326, 328 is shown. The eyewear lens has an array of spherical defects, which may also be sparsely positioned illuminators 320. A side illuminator 316 may project a line into the lens. A low refractive index cladding layer 318 and a linear polarization film 322 form the front layer of the lens. The light from the side illuminator 316 travels through the lens.
[0057] In operation, the eyewear according to an embodiment may be fitted with planar side illuminators 312, 316, as well as an array of sparsely positioned illuminators 320 that may be embedded into the front cover of the lens of the eyewear 302. A linear polarization film 322 may admit one (vertical, horizontal or other planar orientation) polarization from the ambient light into the eyewear 302 to facilitate vision while blocking the other polarization. This relationship may help the eyewear to work without interference from ambient light at the linear polarization film 322. A front image sensor 308, secondary image sensor 324 and a sideview image sensor 304 may have a crossed polarizer that may block the ambient light admitted by the linear polarization film 322. A drug delivery device may be incorporated into the eyewear to dispense drugs for IOP control based on the IOP readings. A waveguide approach to generating a see-through illumination pattern may be seen in the diamond shaped arrows in the cross section image of the lens. The windows of the eyewear have an array of spherical defects 314 and may be illuminated by a side illuminator 316 from within the lens. The lens may be coated with a low refractive index cladding layer 318 to separate the waveguide from the linear polarization film 322.
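The low refractive index cladding described above confines light in the lens by total internal reflection: rays striking the lens/cladding interface beyond the critical angle, arcsin(n_clad / n_core), stay guided. The index values in the example are illustrative placeholders, not materials specified by this disclosure.

```python
import math

# Critical angle for total internal reflection at the lens/cladding
# interface described above. Example indices are placeholders (e.g. a
# lens core of n = 1.50 against a low-index cladding of n = 1.38).
def critical_angle_deg(n_core, n_clad):
    """Angle of incidence (from the normal) beyond which light is guided."""
    return math.degrees(math.asin(n_clad / n_core))
```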
[0058] An illustration of the cross section of corneal deflection is shown in FIG. 4. The illustration shows two curves, one raised slightly above the other. The top curve illustrates the corneal displacement at 30 mm Hg (30 mm of mercury pressure) and the bottom curve shows the corneal displacement at half that pressure, or 15 mmHg. The illustration provides two examples where the radius and the apex of the cornea may change due to IOP within the eye.
[0059] An example of a ray trace diagram is now shown in FIG. 5. In an embodiment, a point source 502 may project light onto the surface of the cornea 506. The light rays may be reflected off the cornea 506 and form one or more reflections 504. The curvature of the cornea 506, as well as the angle of incidence and angle of reflection, may be determined using the known position of the point source 502 relative to the cornea, the known angle of image capture by one or more image sensors, and the dispersion of the light from the point source as seen in the captured images. The y-axis 508 and x-axis 510 are provided for reference.
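The reflection geometry of FIG. 5 can be sketched numerically: intersect a ray from the point source with a spherical cornea, then reflect it about the local surface normal using r' = d - 2(d . n)n. This is a simplified stand-in for a full ray tracer, and the geometry values in the test case are arbitrary.

```python
import numpy as np

def reflect_off_sphere(origin, direction, center, radius):
    """Intersect a ray with a sphere and return (hit point, reflected dir)."""
    d = direction / np.linalg.norm(direction)
    oc = origin - center
    # Quadratic t^2 + b*t + c = 0 for the intersection parameter t.
    b = 2.0 * (d @ oc)
    c = oc @ oc - radius ** 2
    disc = b * b - 4.0 * c
    if disc < 0:
        return None, None  # ray misses the cornea
    t = (-b - np.sqrt(disc)) / 2.0  # nearest intersection
    hit = origin + t * d
    n = (hit - center) / radius     # outward surface normal
    reflected = d - 2.0 * (d @ n) * n
    return hit, reflected
```

Repeating this for each point source and tracing the reflected rays to the sensor planes reproduces, in miniature, the spot patterns discussed in the following figures.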
[0060] In an embodiment, an example ray trace is shown in which multiple point sources 602 may be arranged around the cornea 608. The light from each of the many point sources 602 may be captured at image sensor 604 and image sensor 612, producing real image 606 and real image 610, respectively. Virtual images 614 may also be conceptualized.
[0061] In another embodiment, an example ray trace illustration for two different cornea radii is shown in FIG. 7. The two example corneal IOP pressures are 15 and 30 mmHg. As previously described, a series of multiple point sources 702 are arranged around the cornea. A front image sensor 704 and a side image sensor 712 are positioned to capture real image 706 and real image 710, respectively. In various embodiments, light from the multiple point sources 702 bounces off the cornea and the reflected light may be captured by the image sensors 704, 712. In the case of a low pressure cornea, the 15 mmHg cornea 716 has a lower y-axis projection, or a larger radius of curvature. The 30 mmHg cornea 708 has a higher y-axis projection and a smaller radius of curvature. The two cornea pressures may also cause the creation of two different virtual images, a 15 mmHg virtual image 714 and a 30 mmHg virtual image. The virtual cornea images may be formed below the surface of the cornea. The positions of the spots corresponding to the multiple point sources in the real images may be different for the two IOP values (15 and 30 mmHg), demonstrating the possibility of using such images to calculate IOP values for the eye.
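The virtual images described above arise because the reflective front surface of the cornea acts approximately as a convex mirror with focal length f = -R/2 (the principle underlying keratometry). A sketch of how a small radius change moves the virtual image, using the real-is-positive sign convention and illustrative numbers:

```python
# Convex-mirror sketch of the corneal virtual image: 1/v = 1/f - 1/u with
# f = -R/2 (negative focal length for a convex mirror). Distances in mm;
# a negative image distance means a virtual image behind the surface.
def virtual_image_distance(object_dist_mm, radius_mm):
    f = -radius_mm / 2.0
    return 1.0 / (1.0 / f - 1.0 / object_dist_mm)
```

With an object (point source) 100 mm away, shrinking the radius from 7.80 mm to 7.76 mm (a hypothetical 10 mmHg rise at ~4 um/mmHg) shifts the virtual image by tens of micrometers, which is the kind of spot displacement the image sensors are described as detecting.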
[0062] An example coordinate system is shown in FIG. 8. The origin of the spherical coordinate system may be the center of vision for the eye, or an arbitrary position along the cornea or inside the eye. Note that in the various embodiments, the orientation of the x-axis does not reduce generality. [0063] In an embodiment, the shifting of the cornea in a direction may be detected as shown in FIG. 9. In an embodiment, the multiple point sources 902 are arrayed around the cornea. A first image sensor 904 may capture a first real image 906, while a second image sensor 910 may capture a second real image 908. Second real image 908 may vary from one image to another based on the x-axis shift of the cornea over time. A left shift cornea 912 may be slightly shifted from the position of a right shift cornea 914, with a corresponding left shift virtual image 918 and right shift virtual image 916 respectively. Using the shifted images between a first point in time T1 and a second point in time T2, the shift in the cornea may be imaged and used to determine the shift in the x-position of the cornea. Image analysis may be used to correlate the image data to produce reliable x-shift information.
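As a non-limiting sketch of such a correlation step, the x-shift between two captured frames may be estimated by cross-correlating one-dimensional intensity profiles; the Gaussian spot profiles below are synthetic stand-ins for real sensor rows:

```python
import numpy as np

# Non-limiting sketch: estimate the lateral (x) shift of a corneal glint
# between two frames by cross-correlating 1-D intensity profiles.
def estimate_shift(profile_t1, profile_t2):
    corr = np.correlate(profile_t2, profile_t1, mode="full")
    return np.argmax(corr) - (len(profile_t1) - 1)   # lag in pixels

x = np.arange(200)
t1 = np.exp(-0.5 * ((x - 90) / 4.0) ** 2)    # spot centered at pixel 90 (time T1)
t2 = np.exp(-0.5 * ((x - 97) / 4.0) ** 2)    # same spot shifted +7 px (time T2)
shift = estimate_shift(t1, t2)               # → 7
```

In practice the shift estimate could be refined to sub-pixel accuracy by interpolating around the correlation peak.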
[0064] In another embodiment, the z position shift of the cornea may be determined as shown in a ray trace illustration in FIG. 10. In an embodiment, multiple point sources 1002 produce light that reflects off the cornea. The reflected light may be captured by image sensor 1004 and side image sensor 1008. Real image 1006 and real image 1010 are collected from the image sensors. The cornea of the eye may shift in a z axis direction. In some embodiments, there may be a z shift positive 1012 and a z shift negative 1014 corresponding to the movement of the cornea. Virtual images may be similarly adjusted, producing a positive virtual image 1016 that may correspond to the z shift positive 1012 and a negative virtual image 1018 that may correspond to the z shift negative 1014 cornea position. The positions of the spots in the real images 1006, 1010 may represent two different z positions. The difference may be used to extract the z position of the cornea through analysis of one or more of the various images.
[0065] In another embodiment, the angular (theta) shift can be determined using ray trace images as shown in FIG. 11. In an embodiment there may be multiple point sources 1102 of light. The light may reflect off the cornea, and images may be captured by a front image sensor 1104 and a side image sensor 1110, each producing real image 1106 and real image 1108 respectively. The cornea theta positive 1112 may represent a positive shift in the theta direction, while a cornea theta negative 1114 may represent a negative theta position shift. A positive virtual image 1116 and a negative virtual image 1118 may also be detected. The positions of the spots in real images 1106, 1108 may represent two different theta tilt positions. The difference may be used to extract the angular tilt theta of the cornea through the analysis of the images 1106, 1108. [0066] In an embodiment, FIG. 12 shows a side view image of an eye, captured through a side facing image sensor (not shown), while the eye is illuminated from the front using a matrix pattern.
[0067] In an embodiment, another example of illumination using laser energy formed into lines is shown in FIG. 13. In an embodiment, a computation is shown for laser lines incident on the cornea under two different pressure levels. It can be seen that the lines intercept the cornea at different positions for different IOP values. When the crescent shaped curve images are analyzed along with images of the point sources, the images contain enough information to accurately estimate the eye position with respect to the eyewear position, the corneal radius and the IOP.
[0068] In another embodiment, the intercept positions of a multitude of laser spots, formed by the hologram, may be calculated for two different IOP values as shown in FIG. 14.
[0069] In another embodiment, a video frame capture from a front view camera (from inside the eyewear) with multiple laser energy spots is shown in FIG. 15. The reflection of the illuminating spots from the cornea, similar to the positions of spots previously described, may be visible to a front camera.
[0070] In another embodiment, a side view of a model cornea under two different pressure settings may be seen in FIG. 16. The left figure represents the curvature and bulge of the cornea model when the model is exposed to 15 mmHg of fluid pressure. The right figure shows a slight increase in the bulge of the model when exposed to 50 mmHg pressure. The curvature of the model cornea may also be changing as the pressure increases or decreases. The curvature and bulge may be measured using the various techniques described herein.
[0071] In an embodiment, the curvature of the cornea may be captured in images, and quantified through analysis as shown in FIG. 17. The images may be processed to extract the interface between the cornea and air and to perform a polynomial fit to the extracted curves. The curvature and peak position may be separately extracted and plotted as shown in the top left and right plots. The changes in applied pressure may be accurately extracted from the fitted curves with a noise level below about 1 mmHg. In various embodiments, the fits may be high order polynomials, allowing baseline shifts due to linear positional shifts to be reduced or eliminated. In various embodiments, the image data from the image sensors on the eyewear may be input to a deep neural network that may be composed of image processing components, to reduce the image data to a set of data points. The image processing pipeline may contain trained feature extractors or matched-filtering, edge detection algorithms, filtering algorithms and/or other filters and algorithms. The use of several image sensors may allow determination of the position of the eye with respect to the illumination and eyewear image sensors as well as the head of the user. The algorithms may then be used with neural networks and conventional mathematical fitting methods to extract the curvature of the cornea with high precision.
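As a non-limiting sketch of this fitting step, a polynomial may be fit to an extracted cornea/air interface profile and the apex position and curvature read off from the fit. The corneal radius, sampling grid, and noise-free circular profile below are illustrative assumptions:

```python
import numpy as np

# Non-limiting sketch: fit a polynomial to an extracted cornea/air interface
# and recover the apex position and the curvature at the apex via
# kappa = |y''| / (1 + y'^2)^(3/2).  R_true and the grid are assumed values.
x = np.linspace(-4.0, 4.0, 81)                 # mm across the cornea
R_true = 7.8                                   # assumed corneal radius, mm
y = np.sqrt(R_true**2 - x**2) - R_true         # circular arc, apex at y = 0

coeffs = np.polyfit(x, y, 6)                   # high order fit, per the text
p = np.poly1d(coeffs)
apex_x = x[np.argmax(p(x))]                    # peak (apex) position of the fit
d1, d2 = p.deriv(1)(apex_x), p.deriv(2)(apex_x)
kappa = abs(d2) / (1 + d1**2) ** 1.5           # curvature at the apex
R_est = 1.0 / kappa                            # recovered radius, ≈ R_true
```

Tracking `R_est` and `apex_x` across frames corresponds to the curvature and peak-position traces described for FIG. 17.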
[0072] In an embodiment, there may be a method of training a neural network or deep neural network, as shown in FIG. 18. In an embodiment, the schematic diagram of the training method for the neural network/deep neural network (NN/DNN) may involve having the user undergo standard ophthalmologic measurements. These measurements may give accurate values for personal values of cornea thickness, position, and corneal topography in relation to a reference IOP level. The user may also undergo a brief data collection process where the eyewear may be used and reference data may be collected at the given IOP. In this fashion the eyewear may be calibrated to an individual user. All of this data may be collected from a reference system or systems. These measurements may personalize the system for a user with a unique personal corneal topography. The data collected from personalized measurements may then be fed, along with a computational model ("Geometrical Parameter generator" and "Corneal/Anatomical parameter generator"), into a ray tracing system to generate large amounts of image data for a wide variety of parameters. The outputs may then be used with the NN/DNN that contains an image processing pipeline to estimate corneal radius and IOP.
[0073] In an embodiment, there may be an algorithm for the generation of training data sets for the training of a neural network, or a deep neural network, as shown in FIG. 19. The locations of spots in the real images from the various image sensors may be calculated for a variety of cornea positions and tilts, as well as cornea radii, using ray tracing simulations. The locations and widths of the spots may be extracted from the ray tracing simulations and may be formed into vectors to be input into the neural network training software, and the original cornea positions may be fed as desired outputs. The training procedure with a large data set may permit this highly nonlinear problem to be solved by the neural network with sufficient speed and accuracy. In an embodiment, the material shown in FIG. 19 may be considered as "pseudo-code" that summarizes the steps of data generation from ray-tracing simulations and formatting of the data to train the neural network.
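The data-generation loop described above may be sketched as follows. This is a non-limiting illustration: `simulate_spots` is a toy analytic stand-in for the ray-tracing simulation, and the parameter ranges are assumed values, not the patent's:

```python
import numpy as np

rng = np.random.default_rng(0)

# Stand-in for the ray-tracing simulation: maps a cornea pose and radius to
# a vector of spot coordinates and widths.  Toy model, for illustration only.
def simulate_spots(x, z, theta, radius, n_sources=16):
    i = np.arange(n_sources)
    u = x + radius * np.cos(theta + i)           # toy spot x-coordinates
    v = z + radius * np.sin(theta + i)           # toy spot y-coordinates
    w = np.full(n_sources, 0.5 + 0.01 * radius)  # toy spot widths
    return np.concatenate([u, v, w])

inputs, targets = [], []
for _ in range(1000):                     # sweep cornea poses and radii
    x, z = rng.uniform(-1, 1, 2)          # mm offsets (assumed range)
    theta = rng.uniform(-0.1, 0.1)        # rad tilt (assumed range)
    radius = rng.uniform(7.0, 8.5)        # mm corneal radius (assumed range)
    inputs.append(simulate_spots(x, z, theta, radius))
    targets.append([x, z, theta, radius])

X, Y = np.array(inputs), np.array(targets)  # input vectors / desired outputs
```

The arrays `X` and `Y` correspond to the spot vectors and the original cornea parameters that the text describes feeding into the neural network training software.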
[0074] In an embodiment, the basic operation pipeline of the eyewear during
measurement may be seen in FIG. 20. The eyewear may use image sensors, such as cameras, to capture images. The captured images may be combined with the personal ophthalmologic and anatomic data. The images may be fed into the deep neural network (DNN) with an image processing front-end, to achieve an IOP estimate. The IOP estimates may be updated at video rates, providing near real time output.
[0075] In an embodiment, the pipeline for data processing may be seen in greater detail in FIG. 21. In an embodiment, the exposure level, gain, brightness and contrast settings of the image sensor may be adjusted rapidly to capture non-saturated images. In some embodiments this adjustment may be done for each light source, even if there are multiple point sources as described herein. The images may be evaluated for image saturation, and if the image saturation is too high, the gain and exposure of the image sensor may be adjusted, and the image taken again. If the image saturation is acceptable, the images may be passed through a threshold filter, eliminating the non-relevant background signals. High resolution images may be stored in a temporary memory. The high resolution images may be used to create blurred and lower resolution images (which may be useful for faster processing). The low-resolution images may then be passed through a match-filter or feature detection filter to locate point matrix pattern positions and angles. This function may allow the filter to identify each pinpoint of light in the image and match that pinpoint of light to the corresponding point source of light in each of the real images. The process may then calculate the coarse positions of each point of light in the real images from the image sensors. The process may then produce the appropriate x and y coordinates for each real image. The coarse locations may then be used to segment each point domain and calculate the peak position and peak width of each point in the high resolution real images with accuracy. The accurate x and y coordinates for each point in the matrix pattern for each image sensor (camera), as well as the widths of the peaks, may then be produced. The coordinate data, along with the cornea reference properties, may then be fed into the neural network or deep neural network. The cornea reference properties may include, by way of nonlimiting example, the topography of the cornea, the size, the curvature, and any other measurement taken at the reference IOP. The results of the peak locations and widths, and/or the accurate measurements, may be used with the previously trained neural network/DNN to estimate the cornea position x, y, z, theta and phi in the image sensor coordinate system and the corneal radius (radius of curvature). A nonlinear equation solver may be used to convert the radius of curvature into an IOP reading.
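The saturation check and coarse-to-fine peak localization stages described above may be sketched as follows. The synthetic frame, gain schedule, and threshold are assumptions for illustration; a real sensor driver would replace `capture`:

```python
import numpy as np

# Non-limiting sketch of the saturation-check and peak-localization stages.
def capture(gain):                             # stand-in for the image sensor
    frame = np.zeros((64, 64))
    frame[30:33, 40:43] = 180.0                # one illumination spot
    return np.clip(frame * gain, 0, 255)       # 8-bit saturation model

gain = 4.0
frame = capture(gain)
while frame.max() >= 255 and gain > 0.1:       # reduce gain until unsaturated
    gain *= 0.5
    frame = capture(gain)

mask = frame > 0.5 * frame.max()               # threshold filter on background
ys, xs = np.nonzero(mask)                      # coarse segmentation of the spot
total = frame[ys, xs].sum()                    # intensity-weighted centroid
peak_y = (ys * frame[ys, xs]).sum() / total    # sub-pixel peak y position
peak_x = (xs * frame[ys, xs]).sum() / total    # sub-pixel peak x position
```

The resulting `(peak_x, peak_y)` pairs, one per spot and per image sensor, correspond to the accurate coordinate data that the text describes feeding into the neural network.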
[0076] In another embodiment, the IOP reading may be used with a lookup table (not shown) to determine a dose of a drug. The drug dose may then be dispensed through the drug delivery device.
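A lookup table of this kind may be sketched as follows. The IOP breakpoints and dose values below are hypothetical placeholders for illustration, not clinical guidance and not values from the patent:

```python
import bisect

# Hypothetical dose lookup: IOP bands (mmHg) mapped to drop counts.
# All breakpoints and doses are illustrative assumptions.
iop_breaks = [15, 22, 30]          # upper bounds of IOP bands, mmHg
doses = [0, 1, 2, 3]               # drops dispensed for each band

def dose_for(iop_mmhg):
    return doses[bisect.bisect_right(iop_breaks, iop_mmhg)]

dose_for(12)   # → 0 (below treatment threshold)
dose_for(25)   # → 2
```

The dispensing controller would pass the estimated IOP reading through such a table before commanding the drug delivery device.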
[0077] In another embodiment, the pipeline for data processing may be adjusted to include switching between different illumination sources at the beginning of the pipeline, as shown in FIG. 22. The switching between different illumination sources may allow facile separation of image spots in the real images corresponding to different light sources, thereby speeding up the image processing as well as improving the accuracy of data collection.
[0078] In various embodiments, the virtual images generally may not be used themselves in the process. The real images may be formed after the image sensors focus light from the virtual images onto the imaging planes of the various image sensors.
[0079] The advantages of the present invention include, without limitation, a robust process for making highly sensitive wearable contact lens sensors that have no electrical power or circuits and can be monitored remotely by a simple camera, such as one found in a mobile phone.
[0080] While the foregoing written description of the invention enables one of ordinary skill to make and use what is considered presently to be the best mode thereof, those of ordinary skill will understand and appreciate the existence of variations, combinations, and equivalents of the specific embodiment, method, and examples herein. The invention should therefore not be limited by the above described embodiment, method, and examples, but by all embodiments and methods within the scope and spirit of the invention.

Claims

CLAIMS

What is claimed is:
1. An eyewear device for measuring intraocular pressure, the device comprising:
a frame;
a first lens mounted to the frame, wherein the lens is positioned in the frame so as to be in a user's normal viewing angle;
a first illumination source, the illumination source positioned to illuminate an eye of the user;
a first image sensor, the image sensor positioned to capture images of the eye of the user;
a first communication portal, the communication portal being in electronic communication with a computation device;
a first drug dispensing device, the drug dispensing device being aligned to deliver a dose of a drug to the eye of the user;
wherein the first illumination source, the first image sensor, the first
communication portal and the first drug dispensing device are mounted to the frame or to the lens.
2. The eyewear device of claim 1, wherein the first lens has a width to extend over the normal field of vision of a user who can see with two eyes.
3. The eyewear device of claim 1, wherein a second lens is mounted to the frame, the second lens positioned to be in the field of view of a second eye of the user.
4. The eyewear device of claim 1, wherein the first communication portal may be an antenna for wireless data transmission to a computational device.
5. The eyewear device of claim 4, wherein the computational device is a cellular phone, a tablet computer, or a laptop computer.
6. The eyewear device of claim 4, wherein the computational device is in a cloud based computer.
7. The eyewear device of claim 4, wherein the eyewear is a pair of goggles.
8. The eyewear device of claim 4, wherein the first illumination source further comprises:
a waveguide structure having a plurality of sparse outcouplers, the outcouplers acting as point sources of illumination.
9. The eyewear device of claim 4, wherein the first image sensor further comprises: a polarizing filter, the polarizing filter allowing the image sensor to capture images using only illumination from the illumination source.
10. The eyewear device of claim 1, wherein the image sensor further comprises:
a waveguide structure with at least one sparse outcoupler; and
a polarizing filter.
11. The eyewear device of claim 10, wherein the waveguide structure acts as a point source of light while enabling see through vision of an eye.
12. The eyewear device of claim 10, wherein the polarizing filter blocks the ambient light of one wavelength, allowing the image sensor with cross polarizers in its imaging path to capture only illumination from inside the eyewear.
13. The eyewear device of claim 1, further comprising an array of point sources of microscale dimensions, wherein the array of point sources are integrated with the lens to allow controlled and see-through illumination.
14. The eyewear device of claim 1, wherein the image sensor is a camera.
15. A method for training an image processing pipeline, the method comprising:
collecting personalized ophthalmologic data on a user’s anatomy and a user's corneal properties at a known IOP;
collecting personalized data from an eyewear device for measuring IOP; and using computational models and ray tracing under one or more geometric configurations to generate at least one set of training data for a neural network components pipeline.
16. A system for measuring and treating IOP, the system comprising:
a computational device;
a wearable eyewear device for collecting IOP data, the wearable eyewear device being in signal communication with the computational device, the wearable eyewear device having a drug dispensing component;
a database containing a user profile including personalized ophthalmologic reference data, the database being accessible by the computational device;
a computer implemented program for training an image processing pipeline, the program producing IOP data for the accurate measurement of a user's IOP in at least one eye, the computer implemented program residing on the computational device;
wherein data from the database and the IOP data are used to determine a treatment regimen for a user's eye, and a treatment regimen is communicated to the eyewear device through signal communication, and the drug dispensing component dispenses a treatment drug according to the treatment regimen.
17. The system of claim 16, wherein the computational device is a cell phone, tablet computer or laptop computer.
PCT/US2020/013049 2019-01-10 2020-01-10 Method and device for remote optical monitoring of intraocular pressure WO2020146714A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
EP20738775.4A EP3908239A4 (en) 2019-01-10 2020-01-10 Method and device for remote optical monitoring of intraocular pressure
US17/370,735 US12114931B2 (en) 2019-01-10 2021-07-08 Method and device for remote optical monitoring of intraocular pressure

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201962790752P 2019-01-10 2019-01-10
US62/790,752 2019-01-10

Related Child Applications (1)

Application Number Title Priority Date Filing Date
US17/370,735 Continuation US12114931B2 (en) 2019-01-10 2021-07-08 Method and device for remote optical monitoring of intraocular pressure

Publications (1)

Publication Number Publication Date
WO2020146714A1 true WO2020146714A1 (en) 2020-07-16

Family

ID=71521195

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2020/013049 WO2020146714A1 (en) 2019-01-10 2020-01-10 Method and device for remote optical monitoring of intraocular pressure

Country Status (3)

Country Link
US (1) US12114931B2 (en)
EP (1) EP3908239A4 (en)
WO (1) WO2020146714A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118068594A (en) * 2024-04-02 2024-05-24 东莞市瞳立明企业管理有限公司 Intelligent sensor myopia prevention glasses and control method
US12114931B2 (en) 2019-01-10 2024-10-15 Smartlens, Inc. Method and device for remote optical monitoring of intraocular pressure

Families Citing this family (1)

Publication number Priority date Publication date Assignee Title
US11604368B2 (en) * 2021-04-06 2023-03-14 Innovega, Inc. Contact lens and eyewear frame design using physical landmarks placed on the eye

Citations (9)

Publication number Priority date Publication date Assignee Title
US20100016704A1 (en) 2008-07-16 2010-01-21 Naber John F Method and system for monitoring a condition of an eye
US20130184554A1 (en) * 2010-10-20 2013-07-18 Ahmed Elsheikh Device for monitoring intraocular pressure
US20130253451A1 (en) * 2010-12-03 2013-09-26 Hee Gu Kim Drug carrier device attachable to glasses
US20130278887A1 (en) * 2012-04-19 2013-10-24 Jerome A. Legerton Eye-wear borne electromagnetic radiation refractive therapy
US20140243645A1 (en) * 2011-10-05 2014-08-28 Sensimed Sa Intraocular Pressure Measuring and/or Monitoring Device
US20160015265A1 (en) 2013-03-07 2016-01-21 The Board Of Trustees Of The Leland Stanford Junior University Implantable Micro-Fluidic Device for Monitoring of Intra-Ocular Pressure
US20170000341A1 (en) 2015-03-16 2017-01-05 Magic Leap, Inc. Methods and systems for diagnosing eye conditions such as red reflex using light reflected from the eyes
US20180279870A1 (en) * 2015-09-17 2018-10-04 Envision Diagnostics, Inc. Medical interfaces and other medical devices, systems, and methods for performing eye exams
WO2018221687A1 (en) * 2017-05-31 2018-12-06 株式会社坪田ラボ Moisture mist-spraying device and method

Family Cites Families (19)

Publication number Priority date Publication date Assignee Title
DE4433104C1 (en) 1994-09-16 1996-05-02 Fraunhofer Ges Forschung Device for measuring mechanical properties of biological tissue
IL112264A (en) 1995-01-05 1998-09-24 Lipman Electronic Engineering Applantation tonometry apparatus
AU2013204040B2 (en) 2001-02-23 2016-12-22 Marcio Marc Aurelio Martins Abreu Noninvasive measurements of chemical substances
WO2007136993A1 (en) 2006-05-17 2007-11-29 Mayo Foundation For Medical Education And Research Monitoring intraocular pressure
US7981097B2 (en) 2008-03-27 2011-07-19 Paoli Jr Alexander Delli Medical device for the treatment and prevention of eye and respiratory tract conditions
US10219696B2 (en) 2013-03-07 2019-03-05 The Board Of Trustees Of The Leland Stanford Junior University Implantable pressure sensors for telemetric measurements through bodily tissues
US9217880B2 (en) 2013-05-30 2015-12-22 Johnson & Johnson Vision Care, Inc. Energizable ophthalmic lens device with a programmaable media insert
NO2709641T3 (en) 2014-03-10 2018-05-12
US10085637B2 (en) 2015-03-11 2018-10-02 Smartlens, Inc. Contact lens with a microfluidic channel to monitor radius of curvature of cornea
CN109219428B (en) 2016-03-09 2021-03-12 伊奎诺克斯眼科公司 Therapeutic eye treatment using gas
US20180296390A1 (en) 2017-04-18 2018-10-18 David Hoare Apparatus and method for treating a number of eye conditions of a user including dry eye syndrome and/or burns to the eyes
US10898074B2 (en) 2017-09-09 2021-01-26 Smartlens, Inc. Closed microfluidic network for strain sensing embedded in a contact lens to monitor intraocular pressure
US20190282094A1 (en) 2018-03-14 2019-09-19 Menicon Co. Ltd. Wireless Smart Contact Lens for Intraocular Pressure Measurement
CA3112502A1 (en) 2018-09-20 2020-03-26 Santa Clara University Closed microfluidic network for strain sensing embedded in a contact lens to monitor intraocular pressure
EP3894167A4 (en) 2018-12-14 2022-09-07 Smartlens, Inc. Methods and devices for wearable contact lenses for monitoring intraocular pressure
WO2020146714A1 (en) 2019-01-10 2020-07-16 Smartlens, Inc. Method and device for remote optical monitoring of intraocular pressure
US20220022744A1 (en) 2019-04-10 2022-01-27 Smartlens, Inc. Intraocular pressure monitoring devices and methods of using the same
CN115023175A (en) 2020-01-28 2022-09-06 智能隐形眼镜公司 Wearable device and method for intraocular pressure remote optical monitoring
WO2022182629A1 (en) 2021-02-24 2022-09-01 Smartlens, Inc. Method and device for remote optical monitoring of intraocular pressure


Non-Patent Citations (2)

Title
LAB CHIP, vol. 18, 2018, pages 3471 - 3483
See also references of EP3908239A4


Also Published As

Publication number Publication date
US12114931B2 (en) 2024-10-15
US20210369111A1 (en) 2021-12-02
EP3908239A1 (en) 2021-11-17
EP3908239A4 (en) 2022-10-05

Similar Documents

Publication Publication Date Title
US12114931B2 (en) Method and device for remote optical monitoring of intraocular pressure
EP3371781B1 (en) Systems and methods for generating and using three-dimensional images
CN105828702B (en) Method for calibrating wear-type eye tracking apparatus
JP7021783B2 (en) Optical measurement and scanning system and how to use it
US10775647B2 (en) Systems and methods for obtaining eyewear information
CN103431840B (en) Eye optical parameter detecting system and method
CN103439801B (en) Sight protectio imaging device and method
CN103500331B (en) Based reminding method and device
JP6159263B2 (en) Optical measurement apparatus and method for adjusting illumination characteristics and capturing at least one parameter in at least one eye
CN103501406B (en) Image collecting system and image collecting method
KR101300671B1 (en) Method for measuring parameter for manufacturing spectacle lens, and device for implementing the same
CN103605208A (en) Content projection system and method
EP2898819A1 (en) System for measuring the interpupillary distance using a device equipped with a screen and a camera
KR20150036147A (en) Device and method for measuring objective ocular refraction and at least one geometric-morphological parameter of an individual
CN103595912A (en) Method and device for local zoom imaging
US9730788B2 (en) Device for optically representing intraocular pressure, and a method for same
CN110420008A (en) For determining component, computer program, system and the external member of correcting lens
EP3987380B1 (en) Systems and methods for determining one or more parameters of a user's eye
US20220206573A1 (en) Devices, systems and methods for predicting gaze-related parameters
US20180107021A1 (en) Topology guided ocular lens design
US20230381017A1 (en) Method and device for remote optical monitoring of intraocular pressure
GB2559977A (en) Systems and methods for obtaining information about the face and eyes of a subject
US20210231974A1 (en) Optical measuring and scanning system and methods of use
CN109580176A (en) Method and apparatus for assessing Toric contact lenses rotational stabilization
US20230057524A1 (en) Eyeglass devices and related methods

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 20738775

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2020738775

Country of ref document: EP

Effective date: 20210810