EP4216796A1 - Holographic real space refraction system
- Publication number
- EP4216796A1 (application EP21873337A / EP21873337.6A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- user
- virtual
- holographic
- eye
- test
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/02—Subjective types, i.e. testing apparatus requiring the active assistance of the patient
- A61B3/028—Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters
- A61B3/032—Devices for presenting test symbols or characters, e.g. test chart projectors
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/107—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining the shape or measuring the curvature of the cornea
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/0093—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00 with means for monitoring data relating to the user, e.g. head-tracking, eye-tracking
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B30/00—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images
- G02B30/50—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels
- G02B30/56—Optical systems or apparatus for producing three-dimensional [3D] effects, e.g. stereoscopic images the image being built up from image elements distributed over a 3D volume, e.g. voxels by projecting aerial or floating images
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H50/00—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
- G16H50/20—ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B3/00—Apparatus for testing the eyes; Instruments for examining the eyes
- A61B3/10—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
- A61B3/11—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils
- A61B3/112—Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for measuring interpupillary distance or diameter of pupils for measuring diameter of pupils
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/0101—Head-up displays characterised by optical features
- G02B2027/0132—Head-up displays characterised by optical features comprising binocular systems
- G02B2027/0134—Head-up displays characterised by optical features comprising binocular systems of stereoscopic type
-
- G—PHYSICS
- G02—OPTICS
- G02B—OPTICAL ELEMENTS, SYSTEMS OR APPARATUS
- G02B27/00—Optical systems or apparatus not provided for by any of the groups G02B1/00 - G02B26/00, G02B30/00
- G02B27/01—Head-up displays
- G02B27/017—Head mounted
- G02B27/0172—Head mounted characterised by optical features
- G02B2027/0174—Head mounted characterised by optical features holographic
-
- H—ELECTRICITY
- H04—ELECTRIC COMMUNICATION TECHNIQUE
- H04N—PICTORIAL COMMUNICATION, e.g. TELEVISION
- H04N13/00—Stereoscopic video systems; Multi-view video systems; Details thereof
- H04N13/30—Image reproducers
- H04N13/332—Displays for viewing with the aid of special glasses or head-mounted displays [HMD]
- H04N13/344—Displays for viewing with the aid of special glasses or head-mounted displays [HMD] with head-mounted left-right displays
Definitions
- the clinician would then turn the dial to move different lenses in front of the person’s eyes to find the best subjective refraction to improve distance vision.
- the instrument was then advanced to include prisms that could be used to disassociate images or change the position of the image, enabling the clinician to evaluate muscle ranges and the ability to maintain eye alignment and binocularity. It also permitted assessment of the person’s ability to accommodate or focus at a near range. This was all for the purpose of designing glasses to improve eyesight and visual acuity for both distance and near ranges, as well as to prescribe prisms to correct for imbalance in eye alignment affecting binocularity.
- While the phoropter is an effective instrument and still used today, it limits the peripheral field and cannot assess binocularity in any meridian other than primary gaze, or looking straight ahead. Binocular imbalances can sometimes only be revealed with gaze outside of the primary gaze position. Therefore, the instrument has limited value for these purposes and/or leads the clinician to prescribe lenses and prisms for only one position of the eyes. In addition, the large phoropter blocks the peripheral vision, producing an abnormal environment and restriction of side vision, which frequently affects the intensity of the attentional visual process and causes the refractive correction to be too strong or imbalanced.
- Described herein is a system to evaluate the refractive state of the eye and visual process as well as binocularity in the nine cardinal positions of gaze while in real space by using holographic projection for each eye.
- the refractive state assessment has been designed to enable the eye of the patient to focus on virtual objects in the manner that the refractive imbalance will focus to maintain clear vision. For example, an object is presented in three dimensions. The myopic eye will focus on the near side of the object and see it with clarity. The dimensions and position of the object are then adjusted to refocus the far or distance side of the object, and a calibration is determined as to the power of the eye and the power of the lens required to refocus the eye to best visual acuity at infinity. The same would occur for the hyperopic eye, only the far portion of the three-dimensional object will be in initial focus.
- the patient uses hand movements and/or voice commands to communicate the subjective measurement of the dioptric power to correct the vision to best visual acuity and, advantageously, these objectives are accomplished through manipulation of the object in real space. More particularly, in an exemplary embodiment of the present disclosure, an eye with astigmatism would be presented a three-dimensional object whose perpendicular lines would enable the patient to observe that one of the lines is clear and the other blurred. The object will be rotated to determine the axis of the astigmatism, and then the opposite or blurred side of the object would be shifted in space virtually to bring it into focus. This sequence of operation provides the amount of astigmatism measured in this eye and therefore the predicted amount of cylindrical correction needed to bring clarity. If the patient has both myopia or hyperopia and astigmatism, the object would simultaneously be manipulated to determine myopia or hyperopia while also evaluating the dioptric power of the astigmatism.
- FIG. 1 is a block diagram illustrating a system for the holographic eye testing device according to an exemplary embodiment.
- FIG. 2 is a block diagram illustrating a test for horizontal phoria with a holographic eye testing device according to an exemplary embodiment.
- FIG. 3 is a block diagram illustrating a test for vertical phoria utilizing the holographic eye testing device according to an exemplary embodiment.
- FIG. 4A is a block diagram illustrating a test for astigmatism utilizing the holographic eye testing device according to an exemplary embodiment.
- FIG. 4B is a block diagram illustrating a user’s perspective of the virtual 3D objects depicted in FIG. 4A according to an exemplary embodiment.
- FIG. 5A is a block diagram illustrating a test for astigmatism utilizing the holographic eye testing device according to an exemplary embodiment.
- FIG. 5B is a block diagram illustrating a user’s perspective of the virtual 3D objects depicted in FIG. 5A according to an exemplary embodiment.
- FIG. 5C is a block diagram illustrating another perspective of the virtual 3D objects depicted in FIG. 5A according to an exemplary embodiment.
- FIGs. 6A-6B are diagrams illustrating a test for visual acuity utilizing the holographic eye testing device according to an exemplary embodiment.
- FIGs. 7A-7G are diagrams illustrating tests for horizontal convergent and horizontal divergent, utilizing the holographic eye testing device according to an exemplary embodiment.
- FIGs. 8A-8C are diagrams illustrating refraction testing utilizing the holographic eye testing device according to an exemplary embodiment.
- FIGs. 9A-9B are diagrams illustrating convergence testing utilizing the holographic eye testing device according to an exemplary embodiment.
- FIGs. 10A-10D are block diagrams illustrating methods for utilizing the holographic eye testing device according to an exemplary embodiment.
- FIG. 11 depicts a block diagram of an exemplary computing device in accordance with an exemplary embodiment.
- FIG. 12 illustrates a visual field test rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- FIG. 13 illustrates color vision testing rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- FIG. 14 illustrates an Amsler grid visual field test rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- FIG. 15 illustrates a pupillometry test rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- FIG. 16 illustrates a retinal scan test rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- FIG. 17 illustrates a tear breakup time test via the holographic eye testing device, in accordance with an exemplary embodiment.
- FIG. 18 illustrates a syntonics color therapy test via the holographic eye testing device, in accordance with an exemplary embodiment.
- FIG. 19 illustrates a colorimetry test via the holographic eye testing device, in accordance with an exemplary embodiment.
- FIGs. 20A and 20B illustrate a keratometry test via the holographic eye testing device, in accordance with an exemplary embodiment.
- FIG. 21 illustrates a depth perception eye test rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- Example embodiments provide a device for utilizing holographic virtual projection to perform eye testing, diagnosis, and prescriptive remedy.
- the disclosed holographic eye testing device renders on a head mounted device, one or more three dimensional objects within the holographic display device, wherein the rendering corresponds to a virtual level of depth viewable by a user.
- the holographic display device updates the rendering of the one or more three dimensional objects, wherein the updates include a virtual movement of the one or more three dimensional objects within the virtual level of depth.
- the holographic display device receives input from a user, wherein the input includes an indication of alignment of the one or more three dimensional objects based on the virtual movement.
- the indication of alignment includes a relative position between the one or more three dimensional objects.
- the holographic display device determines a delta between the relative virtual position between the one or more three dimensional objects and an optimal virtual position.
- the holographic display device generates a prescriptive remedy based on the delta, as sketched below.
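The overall flow just summarized (render an object at a virtual depth, move it, accept a user stop signal, then compare its final position against an optimal position) can be illustrated in code. Below is a minimal C++ sketch of that loop under stated assumptions; all type and function names (`VirtualObject`, `translate`, the per-frame step) are hypothetical, since the disclosure does not specify an implementation API.

```cpp
// Minimal sketch of the test flow: render, move, accept a user "stop",
// then convert the remaining positional delta into a prescriptive value.
#include <cmath>
#include <cstdio>

struct Vec3 { double x, y, z; };

// Hypothetical stand-in for a holographic-framework object handle.
struct VirtualObject {
    Vec3 position;                               // virtual-space position (m)
    void translate(const Vec3& d) {
        position = {position.x + d.x, position.y + d.y, position.z + d.z};
    }
};

double distance(const Vec3& a, const Vec3& b) {
    return std::sqrt((a.x - b.x) * (a.x - b.x) +
                     (a.y - b.y) * (a.y - b.y) +
                     (a.z - b.z) * (a.z - b.z));
}

int main() {
    VirtualObject object{{0.10, 0.0, 1.0}};      // rendered 1 m ahead, 10 cm off
    const Vec3 optimal{0.0, 0.0, 1.0};           // ideal (aligned) position
    const Vec3 step{-0.001, 0.0, 0.0};           // per-frame virtual movement

    bool userSaidStop = false;
    int frame = 0;
    while (!userSaidStop) {                      // update loop
        object.translate(step);
        // On the device this flag would come from voice/gesture/clicker input.
        userSaidStop = (++frame == 70);          // simulated stop after 70 frames
    }

    // Delta between where the user stopped the object and the optimal position;
    // a prescriptive remedy would be derived from this delta.
    std::printf("residual delta: %.3f m\n", distance(object.position, optimal));
    return 0;
}
```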
- FIG. 1 is a block diagram illustrating a system 100 for the holographic eye testing device according to an exemplary embodiment.
- the holographic eye testing device can include a head mounted display (HMD) 102.
- the HMD 102 can include a pair of combiner lenses 104A, 104B for rendering three dimensional (3D) images within a user’s field of view (FOV).
- the combiner lenses 104A, 104B can be calibrated to the interpupillary distance of the user’s eyes 106A, 106B.
- a computing system 108 can be connected to the combiner lenses 104A, 104B.
- the holographic eye testing device can be repositioned in any of the nine primary gaze positions as needed.
- the HMD 102 can be connected to an adjustable, cushioned inner headband, which can tilt the combiner lenses 104A, 104B up and down, as well as forward and backward.
- the user fits the HMD 102 on their head, using an adjustment wheel at the back of the headband to secure it around the crown, supporting and distributing the weight of the unit equally for comfort, before tilting the visor and combiner lenses 104A, 104B towards the front of the eyes.
- the computing system 108 can be inclusive to the HMD 102, where the holographic eye testing device is a self-contained apparatus.
- the computing system 108 in the self-contained apparatus can include additional power circuitry to provide electrical current to the parts of the computing system 108.
- the computing system 108 can be external to the HMD 102 and communicatively coupled either through wired or wireless communication channels to the HMD 102.
- Wired communication channels can include digital video transmission formats including High Definition Multimedia Interface (HDMI), DisplayPortTM (DisplayPort is a trademark of VESA of San Jose CA, U.S.A.), or any other transmission format capable of propagating a video signal from the computing system 108 to the combiner lenses 104A, 104B.
- the HMD 102 can include speakers or headphones for the presentation of instructional audio to the user during the holographic eye tests.
- the HMD 102 can include a wireless adapter capable of low latency high bandwidth applications, including but not limited to IEEE 802.11ad.
- the wireless adapter can interface with the computing system 108 for the transmission of low latency video to be displayed upon the combiner lenses 104A, 104B.
- the computing system 108 can include software for the manipulation and rendering of 3D objects within a virtual space.
- the software can include both platform software to support any fundamental functionality of the HMD 102, such as motion tracking, input functionality, and eye tracking.
- Platform software can be implemented in a virtual reality (VR) framework, augmented reality (AR) framework, or mixed reality (MR) framework.
- Platform software to support the fundamental functionality can include but is not limited to the Steam VR® (Steam VR is a registered trademark of the Valve Corporation, Seattle WA, U.S.A.) software development kit (SDK), Oculus® VR SDK (Oculus is a registered trademark of Oculus VR LLC, Irvine CA, U.S.A.), OSVR (Open Source VR) (OSVR is a registered trademark of Razer Asia Pacific Pte. Ltd., Singapore) SDK, and the Microsoft Windows Mixed Reality Computing Platform.
- Application software executing on the computing system 108 with the underlying platform software can be a customized rendering engine, or an off-the-shelf 3D rendering framework, such as Unity® Software (Unity Software is a registered trademark of Unity Technologies of San Francisco CA, U.S.A).
- the rendering framework can provide the basic building blocks of the virtualized environment for the holographic refractive eye test, including 3D objects and manipulation techniques to change the appearance of the 3D objects.
- the rendering framework can provide application programming interfaces (APIs) for the instantiation of 3D objects and well-defined interfaces for the manipulation of the 3D objects within the framework.
- Common software programming language bindings for rendering frameworks include but are not limited to C++, Java, and C#.
- the application software can provide settings to allow a test administrator to adjust actions within the test, such as holographic object speed and object color; a sketch of such an interface follows.
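To make the software stack concrete, here is a C++ sketch of the kind of framework-facing interface the application layer might define: instantiation of 3D objects, well-defined manipulation methods, and administrator-adjustable settings (object speed and color). Every name here (`IRenderingFramework`, `IObject3D`, `TestSettings`) is hypothetical; a real build would bind to an engine such as Unity or a native framework through its own API.

```cpp
#include <cstdio>
#include <memory>
#include <string>

struct Color { float r, g, b; };

// Abstract handle to a framework-managed 3D object.
class IObject3D {
public:
    virtual ~IObject3D() = default;
    virtual void setPosition(float x, float y, float z) = 0;
    virtual void setColor(const Color& c) = 0;
};

// The instantiation/manipulation surface described above.
class IRenderingFramework {
public:
    virtual ~IRenderingFramework() = default;
    virtual std::unique_ptr<IObject3D> instantiate(const std::string& kind) = 0;
};

// Settings a test administrator can adjust (speed, color).
struct TestSettings {
    float speedMetersPerSecond = 0.05f;
    Color color{1.0f, 1.0f, 1.0f};
};

// Trivial console-logging implementation so the sketch runs standalone.
class LoggingObject : public IObject3D {
    std::string kind_;
public:
    explicit LoggingObject(std::string kind) : kind_(std::move(kind)) {}
    void setPosition(float x, float y, float z) override {
        std::printf("%s at (%.2f, %.2f, %.2f)\n", kind_.c_str(), x, y, z);
    }
    void setColor(const Color& c) override {
        std::printf("%s color (%.1f, %.1f, %.1f)\n", kind_.c_str(), c.r, c.g, c.b);
    }
};

class LoggingFramework : public IRenderingFramework {
public:
    std::unique_ptr<IObject3D> instantiate(const std::string& kind) override {
        return std::make_unique<LoggingObject>(kind);
    }
};

int main() {
    TestSettings settings;                      // administrator defaults
    LoggingFramework framework;
    auto cube = framework.instantiate("cube");  // place a test stimulus
    cube->setColor(settings.color);
    cube->setPosition(0.0f, 0.0f, 1.0f);        // 1 m in front of the user
    return 0;
}
```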
- the system 100 can be configured to perform a variety of eye tests, including, but not limited to, acuity testing (near and far), phorias, horizontal divergence, horizontal convergence, and refraction.
- FIG. 2 is a block diagram illustrating a test for horizontal phoria with a holographic eye testing device according to an exemplary embodiment.
- two virtual 3D objects 202A, 202B can be manipulated in a user’s field of view (FOV) 204A, 204B.
- the virtual 3D objects 202A, 202B can be translated within the same horizontal plane until convergence.
- the virtual 3D objects 202A, 202B can have a starting point within the user’s FOV 204A, 204B equidistant within the same horizontal plane from a mid-point of the FOV.
- the virtual 3D objects 202A, 202B are translated and projected on the combiner lenses 104A, 104B to give the appearance that the virtual 3D objects are a set distance from the view of the user’s eyes 106A, 106B.
- the application software can present the virtual 3D objects 202A, 202B via the combiner lenses 104A, 104B so that the virtual 3D objects can appear to be at different distances from the user’s eyes 106A, 106B.
- the presentation of the virtual 3D objects 202A, 202B can correspond to projection of the virtual 3D objects at distances of 16 inches to 20 feet in front of the user’s eyes 106A, 106B.
- the range of distances allows phoria to be measured at different intervals of depth for better confidence in convergence results.
- the user can provide input to the application software or platform software.
- the input can take the form of voice commands, gestures, or input from a “clicker.”
- as the virtual 3D objects 202A, 202B approach each other, they will begin to overlap and converge into a single virtual 3D object.
- the user can provide input to stop any motion or translation of the virtual 3D objects 202A, 202B.
- the application software evaluates a delta between the midpoint of the user’s FOV 204A, 204B and the point at which the virtual 3D objects 202A, 202B were located when the user provided input to stop the motion or translation.
- the delta can be represented as a deviation relative to the virtual distance of the virtual 3D objects 202A, 202B from the patient, as in the sketch below.
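A minimal C++ sketch of that representation follows; the same computation applies to the vertical phoria test of FIG. 3. The patent states only that the delta is expressed relative to the objects’ virtual distance; converting a lateral offset at a known distance into prism diopters (100 × offset / distance) is the conventional optometric way to express such a deviation and is an assumption added here, not quoted text.

```cpp
#include <cstdio>

// Deviation expressed in prism diopters: 100 x lateral offset / distance,
// both in meters (an assumed, conventional representation of the delta).
double deviationPrismDiopters(double lateralOffsetMeters,
                              double virtualDistanceMeters) {
    return 100.0 * lateralOffsetMeters / virtualDistanceMeters;
}

int main() {
    // Objects stopped 2 cm from the FOV midpoint at a 1 m virtual distance.
    std::printf("%.2f prism diopters\n", deviationPrismDiopters(0.02, 1.0));
    return 0;  // prints 2.00 prism diopters
}
```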
- FIG. 3 is a block diagram illustrating a test for vertical phoria utilizing the holographic eye testing device according to an exemplary embodiment.
- two virtual 3D objects 304A, 304B can be manipulated in a user’s FOV 302.
- the virtual 3D objects 304A, 304B can be translated within the same vertical plane until convergence.
- the virtual 3D objects 304A, 304B can have a starting point within the user’s FOV 302 equidistant within the same vertical plane from a mid-point of the FOV.
- the virtual 3D objects 304A, 304B are translated and projected on the combiner lenses 104A, 104B to give the appearance that the virtual 3D objects are a set distance from the view of the user’s eyes 106A, 106B.
- the application software can present the virtual 3D objects 304A, 304B via the combiner lenses 104A, 104B so that the virtual 3D objects can appear to be at different distances from the user’s eyes 106A, 106B.
- the presentation of the virtual 3D objects 304A, 304B can correspond to projection of the virtual 3D objects at distances of 16 inches to 20 feet in front of the user’s eyes 106A, 106B.
- the range of distances allows phoria to be measured at different intervals of depth for better confidence in convergence results.
- the user can provide input to the application software or platform software.
- the input can take the form of voice commands, gestures, or input from a “clicker.”
- as the virtual 3D objects 304A, 304B approach each other, they will begin to overlap and converge into a single visible virtual 3D object.
- the user can provide input to stop any motion or translation of the virtual 3D objects 304A, 304B.
- the application software evaluates a delta between the midpoint of the user’s FOV 302 and the point at which the virtual 3D objects 304A, 304B were located when the user provided input to stop the motion or translation.
- the delta can be represented as a deviation relative to the virtual distance of the virtual 3D objects 304A, 304B from the patient.
- FIG. 4A is a block diagram illustrating a test for astigmatism utilizing the holographic eye testing device according to an exemplary embodiment.
- multiple virtual 3D objects 404A, 406A, 406B can be manipulated in a user’s FOV 402.
- the virtual 3D objects 404A, 406A, 406B start in different planes 408A, 408B parallel to the plane 408C of the combiner lenses.
- the virtual 3D objects 404A can be a set of vertical lines (relative to the user’s eyes 106A, 106B) residing in a plane 408A.
- Additional virtual 3D objects 406A, 406B can be a set of horizontal lines (relative to the user’s eyes 106A, 106B) residing in a plane 408B.
- the virtual 3D objects 404A, 406A, 406B can appear as a hash mark (#) where the virtual 3D objects appear to intersect; however, since they are in different planes 408A, 408B, they do not actually intersect.
- the user can start the test by providing input to the computing system 108.
- the input can take the form of voice commands, including saying key words indicative of beginning the test and/or voice response(s) to automated verbal direction provided by the computing system 108, a hand-held “joy stick”, gestures, or providing input from a “clicker.”
- the user states the word “start” to begin the test.
- Control of the test can take the form of voice commands including “forward” and “backward.”
- a voice command of “forward” translates the plane 408A and associated virtual 3D objects 404A toward the combiner lenses 104A, 104B.
- a voice command of “backward” translates the plane 408A and associated virtual 3D objects 404A away from the combiner lenses 104A, 104B.
- a user can manipulate the virtual 3D objects 404A to where the user believes the respective planes 408A, 408B and associated virtual 3D objects 404A, 406A, 406B are coincident.
- the user can provide a voice command to the computing system 108, such as stating the word “stop” to complete the manipulation portion of the test.
- upon receipt of the “stop” command, the computing system 108 disallows subsequent input commands, such as “forward” and “backward,” and determines a delta distance between the final locations of the planes 408A, 408B, as in the sketch below. In the event the user manipulated the planes 408A, 408B to coincide, the delta would be zero.
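Here is a minimal C++ sketch of that voice-command handling. The command strings (“forward,” “backward,” “stop”) follow the text; the step size, starting distances, and `PlaneTest` structure are assumptions for illustration.

```cpp
#include <cstdio>
#include <string>

struct PlaneTest {
    double planeA = 1.20;            // virtual distance of plane 408A (m)
    const double planeB = 1.00;      // fixed virtual distance of plane 408B (m)
    bool locked = false;

    void onCommand(const std::string& cmd) {
        if (locked) return;          // "stop" disallows subsequent commands
        if (cmd == "forward")       planeA -= 0.01;  // toward the combiner lenses
        else if (cmd == "backward") planeA += 0.01;  // away from the lenses
        else if (cmd == "stop") {
            locked = true;
            // A zero delta means the user made the planes coincide.
            std::printf("delta: %.3f m\n", planeA - planeB);
        }
    }
};

int main() {
    PlaneTest test;
    for (int i = 0; i < 20; ++i) test.onCommand("forward");
    test.onCommand("stop");          // prints a delta of ~0.000 m
    return 0;
}
```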
- FIG. 4B is a block diagram illustrating a user’s perspective of the virtual 3D objects depicted in FIG. 4A.
- Virtual 3D objects 406A, 406B can be implemented as parallel lines residing in the same plane 408B.
- Virtual 3D objects 404A, 404B can be implemented as parallel lines residing in the same plane 408A.
- FIG. 5A is a block diagram illustrating a test for astigmatism utilizing the holographic eye testing device according to an exemplary embodiment.
- FIG. 5B is a block diagram illustrating a user’s perspective of the virtual 3D objects depicted in FIG. 5A.
- multiple virtual 3D objects 504A, 504B can be manipulated in a user’s FOV 402.
- the virtual 3D objects 504A, 504B correspond to concentric opaque rings across the surface of an invisible sphere 510, where the concentric opaque rings traverse the surface of the sphere perpendicular to the plane of the combiner lenses 104A, 104B.
- the virtual 3D objects 504A, 504B can be oriented along coaxial planes 506, 508. In other embodiments, the virtual 3D objects 504A, 504B can be distal or proximal portions of concentric opaque rings across the surface of an invisible sphere 510.
- the user can start the test by providing input to the computing system 108.
- the input can take the form of voice commands, including saying key words indicative of beginning the test, gestures or providing input from a “clicker.”
- the user states the word “start” to begin the test.
- the invisible sphere 510 and accompanying virtual 3D objects are translated toward the combiner lenses 104A, 104B to give the user the appearance that the virtual 3D objects are coming directly at the user’s eyes 106A, 106B.
- the user can provide input to stop the test in the form of a voice command of “stop.”
- the computing system 108 ceases translation of the invisible sphere 510 and calculates a delta distance from the starting point of the invisible sphere to the point where the invisible sphere resides at the end of the test.
- a constant point of reference on the invisible sphere 510 can be utilized to determine a consistent location to determine the delta distance.
- the user can start the test by providing input to the computing system 108.
- the input can take the form of voice commands, including saying key words indicative of beginning the test, gestures or providing input from a “clicker.”
- the user states the word “start” to begin the test.
- the virtual 3D objects 504A, 504B begin the test in a plane parallel to or coincident with a starting plane 506. As the test begins, the invisible sphere 510 and accompanying virtual 3D objects are rotated in a clockwise motion 512 from the user’s perspective.
- the user can provide input to stop the test in the form of a voice command of “stop.”
- the computing system 108 ceases rotation of the invisible sphere 510 and calculates a delta in degrees based on the rotation from the starting point of the invisible sphere to the orientation of the invisible sphere at the end of the test.
- the delta in degrees can be used to determine the axis of the astigmatism, as in the sketch below. This provides the amount of astigmatism measured in this eye and therefore the predicted amount of cylindrical correction needed to bring clarity.
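A short C++ sketch of deriving an axis from the rotation delta: the patent says only that the delta in degrees determines the axis; reducing the accumulated rotation modulo 180 (ophthalmic axes run 0–180 degrees) is an assumption about the bookkeeping, not quoted text.

```cpp
#include <cmath>
#include <cstdio>

double axisFromRotation(double startDegrees, double stopDegrees) {
    double delta = std::fmod(stopDegrees - startDegrees, 180.0);
    if (delta < 0) delta += 180.0;   // normalize into the 0-180 axis range
    return delta;
}

int main() {
    // Sphere rotated from 0 to 200 degrees before the user said "stop".
    std::printf("axis: %.0f degrees\n", axisFromRotation(0.0, 200.0));  // 20
    return 0;
}
```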
- FIG. 5C is a block diagram illustrating another user’s perspective of the virtual 3D objects depicted in FIG. 5A.
- the virtual 3D objects 504A, 504B can be distal portions of concentric opaque rings across the surface of an invisible sphere 510.
- the virtual 3D objects 514A, 514B can be proximal portions of concentric opaque rings across the surface of an invisible sphere 510.
- the distal portions of the concentric opaque rings form a group and the proximal portions form a group. Groups are rotated and translated in the FOV 402 in unison.
- the distal portions can be offset from the proximal portions by a rotation of 45 degrees.
- the user can start the test by providing input to the computing system 108.
- the input can take the form of voice commands, including saying key words indicative of beginning the test, gestures or providing input from a “clicker.”
- the user states the word “start” to begin the test.
- the computing system 108 translates the virtual 3D objects 514A, 514B corresponding to the proximal portions toward the combiner lenses 104A, 104B, where the virtual 3D objects 514A, 514B appear to be coming toward the user’s eyes 106A, 106B.
- the user can provide input to stop the test in the form of a voice command of “stop.”
- the computing system 108 ceases translation of the virtual 3D objects 514A, 514B corresponding to the proximal portions and calculates a delta in distance based on the translation from the starting point to the position at the end of the test.
- FIG. 6A is a block diagram illustrating a test for visual acuity utilizing the holographic eye testing device, according to an exemplary embodiment.
- virtual 2D or 3D letters or images 604 can be displayed and/or manipulated in a user’s FOV 402.
- the virtual letters or images 604 are translated and projected on the combiner lenses 104A, 104B to give the appearance that the virtual letters or images 604 are a set distance from the view of the user’s eyes 106A, 106B.
- the application software can present the virtual letters or images 604 via the combiner lenses 104A, 104B so that the virtual letters or images 604 can appear to be at different distances from the user’s eyes 106A, 106B.
- the presentation of the virtual letters or images 604 can correspond to projection of the virtual letters or images 604 at distances of 16 inches (near visual acuity) to 20 feet (distance visual acuity) in front of the user’s eyes 106A, 106B.
- the visual acuity test is used to determine the smallest letters or images a user can identify from the virtual letters or images 604 at a specified distance away (e.g., 6 meters). The range of distances allows visual acuity to be measured at different intervals of depth.
- the user can start the test by providing input to the computing system 108.
- the input can take the form of voice commands, including saying key words indicative of beginning the test, gestures or providing input from a “clicker.”
- the user states the word “start” to begin the test.
- the virtual letters or images 604 can be moved forward or backwards.
- Control of the test can take the form of voice commands including “forward” and “backward.”
- a voice command of “forward” translates the plane 608, and associated virtual letters or images 604 toward the combiner lenses 104A, 104B.
- a voice command of “backward” translates the plane 608, and associated virtual letters or images 604 away from the combiner lenses 104A, 104B.
- a user can manipulate the virtual letters or images 604 until the user can, or can no longer, identify the virtual letters or images 604.
- the user can provide a voice command to the computing system 108, such as stating the word “stop” to complete the manipulation portion of the test.
- the computing system 108 disallows subsequent input commands, such as “forward” and “backward,” and determines a final distance of the virtual letters or images 604.
- FIG. 6B is a block diagram illustrating a user’s perspective of the virtual letters depicted in FIG. 6A.
- the virtual letters 604 can be implemented as letters residing on the same plane 608. In other embodiments, one or more of the virtual letters 604 can be implemented as letters residing on different planes. A sketch of the acuity scoring follows.
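The text defines the acuity task (the smallest identifiable letters at a specified virtual distance) without giving a scoring rule; mapping the letter’s angular size to a Snellen score via the standard 5-arcminute 20/20 optotype is a conventional assumption added in this C++ sketch, not quoted text.

```cpp
#include <cmath>
#include <cstdio>

const double kPi = 3.141592653589793;

// Visual angle subtended by a letter of a given height at a given distance,
// converted to arcminutes.
double letterArcMinutes(double letterHeightMeters, double distanceMeters) {
    return 2.0 * std::atan(letterHeightMeters / (2.0 * distanceMeters))
           * (180.0 / kPi) * 60.0;
}

int main() {
    // Smallest letter identified: 8.7 mm tall at a 6 m virtual distance.
    double arcmin = letterArcMinutes(0.0087, 6.0);      // ~5 arcminutes
    double snellenDenominator = 20.0 * (arcmin / 5.0);  // 20/20 letter = 5'
    std::printf("acuity ~ 20/%.0f\n", snellenDenominator);  // ~20/20
    return 0;
}
```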
- FIG. 7A is a block diagram illustrating a test for horizontal convergent and horizontal divergent, utilizing the holographic eye testing device according to an exemplary embodiment.
- virtual 2D or 3D shapes 704 can be displayed and/or manipulated in a user’s FOV 402.
- the horizontal convergent and horizontal divergent tests are used to examine eye movement responses to symmetric stimuli.
- the virtual shapes 704 can be manipulated in a user’s field of view (FOV) 402.
- the virtual shapes 704 can have a starting point within the user’s FOV 402 equidistant within the same horizontal plane from a mid-point of the FOV 402.
- the virtual shapes 704 are translated and projected on the combiner lenses 104A, 104B to give the appearance that the virtual shapes 704 are a set distance from the view of the user’s eyes 106A, 106B.
- the application software can present the virtual shapes 704 via the combiner lenses 104A, 104B so that the virtual shapes 704 can appear to be at different distances from the user’s eyes 106A, 106B.
- the presentation of the virtual shapes 704 can correspond to projection of the virtual shapes 704 at distances of 16 inches to 20 feet in front of the user’s eyes 106A, 106B.
- the range of distances allows the horizontal convergent and the horizontal divergent to be measured at different intervals of depth for better confidence in convergence and divergence results.
- the user can start the test by providing input to the computing system 108.
- the input can take the form of voice commands, including saying key words indicative of beginning the test, gestures or providing input from a “clicker.”
- the user states the word “start” to begin the test.
- FIGs. 7B-7C are block diagrams illustrating a horizontal convergent test, utilizing the holographic eye testing device according to an exemplary embodiment.
- the horizontal convergent test is performed in two stages: a break stage and a recovery stage.
- two objects, for example 3D or 2D shapes such as 3D cubes, are projected at a set distance from the user.
- the distance can be changed depending on needs of the user, such as whether the test is being performed for near-sightedness or far-sightedness.
- a first object 710 of the two objects is projected to a right eye and a second object 712 of the two objects is projected to the left eye.
- the first object 710 and the second object 712 begin overlaid on each other and appear as one object, as shown in FIG. 7B.
- the first object 710 and the second object 712 slowly move apart.
- the first object 710 shown to the right eye moves to the left from a center start point
- the second object 712 shown to the left eye moves to the right from the center start point, as shown in FIG. 7C.
- the user reports when the user notices that, instead of viewing a single object, there are now two objects (the first object 710 and the second object 712).
- the user can provide input to the application software or platform software.
- the input can take the form of voice commands, gestures, or input from a “clicker.”
- as the first object 710 moves to the left from the center start point and the second object 712 moves to the right from the center start point, the objects will begin to diverge and appear as separate objects.
- the user can provide input to stop any motion or translation of the first object 710 and the second object 712.
- the application software evaluates a delta between the midpoint of the user’s FOV 402 and the point at which the first object 710 and the second object 712 were located when the user provided input to stop the motion or translation.
- the delta can be represented as a deviation relative to the virtual distance of the first object 710 and the second object 712 from the user.
- in the recovery stage, the first object 710 and the second object 712 start out offset from each other and slowly move together.
- the first object 710 shown to the right eye starts on the left side of the user's view and moves to the center start point
- the second object 712 shown to the left eye starts on the right side of the user's view and moves to the center start point, as shown in FIG. 7D.
- the user reports when the user no longer views two distinct objects, but instead only a single object.
- the user can provide input to the application software or platform software.
- the input can take the form of voice commands, gestures, or input from a “clicker.”
- the user can provide input to stop any motion or translation of the first object 710 and the second object 712.
- the application software evaluates a delta between the midpoint of the user’s FOV 402 and the point at which the first object 710 and the second object 712 were located when the user provided input to stop the motion or translation.
- the delta can be represented as a deviation relative to the virtual distance of the first object 710 and the second object 712 from the user.
- FIGs. 7E-7G are block diagrams illustrating a horizontal divergent test, utilizing the holographic eye testing device according to an exemplary embodiment.
- the horizontal divergent test is performed in two stages: a break stage and a recovery stage.
- two objects, for example 3D or 2D shapes such as 3D cubes, are projected at a set distance from the user.
- the distance can be changed depending on needs of the user, such as whether the test is being performed for near-sightedness or far-sightedness.
- a first object of the two objects is projected to a right eye and a second object of the two objects is projected to the left eye.
- the difference between the horizontal divergent and horizontal convergent tests is which object is shown to which eye, and where the objects move. For the divergent test, the objects are shown to the opposite eyes relative to the convergent test.
- the first object 710 and the second object 712 begin overlaid on each other and appear as one object, as shown in FIG. 7E.
- the first object 710 and the second object 712 slowly move apart.
- the first object 710 shown to the right eye moves to the right from the center start point
- the second object 712 shown to the left eye moves to the left from the center start point, as shown in FIG. 7F.
- the user reports when the user notices that, instead of viewing a single object, there are now two objects (the first object 710 and the second object 712).
- the user can provide input to the application software or platform software.
- the input can take the form of voice commands, gestures, or input from a “clicker.”
- as the first object 710 moves to the right from the center start point and the second object 712 moves to the left from the center start point, the objects will begin to diverge and appear as separate objects.
- the user can provide input to stop any motion or translation of the first object 710 and the second object 712.
- the application software evaluates a delta between the midpoint of the user’s FOV 402 and the point at which the first object 710 and the second object 712 were located when the user provided input to stop the motion or translation.
- the delta can be represented as a deviation relative to the virtual distance of the first object 710 and the second object 712 from the user.
- in the recovery stage, the first object 710 and the second object 712 start out offset from each other and slowly move together, as shown in FIG. 7G.
- the first object 710 shown to the right eye starts on the right side of the user's view and moves to the center start point
- the second object 712 shown to the left eye starts on the left side of the user's view and moves to the center start point.
- the user reports when the user no longer views two distinct objects, but instead only a single object.
- the user can provide input to the application software or platform software.
- the input can take the form of voice commands, gestures, or input from a “clicker.”
- the application software evaluates a delta between the midpoint of the user’s FOV 402 and the point at which the first object 710 and the second object 712 were located when the user provided input to stop the motion or translation.
- the delta can be represented as a deviation relative to the virtual distance of the first object 710 and the second object 712 from the user.
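The break and recovery stages of the convergent and divergent tests each yield a stopping point, and both deltas are expressed relative to the objects’ virtual distance. A minimal C++ sketch of that bookkeeping follows; the stage names come from the text, while the struct layout and units are assumptions.

```cpp
#include <cstdio>

struct VergenceResult {
    double breakDelta;     // deviation when the single fused object split in two
    double recoveryDelta;  // deviation when the two objects fused back into one
};

// Each delta is expressed relative to the virtual distance of the objects
// from the user, as in the phoria tests above.
VergenceResult runStages(double breakStopOffsetMeters,
                         double recoveryStopOffsetMeters,
                         double virtualDistanceMeters) {
    return { breakStopOffsetMeters / virtualDistanceMeters,
             recoveryStopOffsetMeters / virtualDistanceMeters };
}

int main() {
    VergenceResult r = runStages(0.05, 0.02, 1.0);
    std::printf("break %.3f, recovery %.3f\n", r.breakDelta, r.recoveryDelta);
    return 0;
}
```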
- the first object 710 is only shown to the right eye and the second object 712 is only shown to the left eye.
- eye tracking data is important because it allows the system to determine where the user is looking.
- the program may request that the user look at a certain point in order to start the test, or to confirm that they are looking in the right location and see the shapes before the test is started.
- the system may request that a user attempt to gaze at a certain point while the shapes are in motion in order to ensure that the tests return accurate results. If they look away from the point, or directly at the shapes, the test may pause and wait for them to return their gaze to the requested object before continuing.
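A short C++ sketch of the gaze gating just described: the test pauses while the tracked gaze is off the requested fixation point and resumes when the gaze returns. The tolerance value and the shape of the eye-tracking data are assumptions.

```cpp
#include <cmath>
#include <cstdio>

struct Gaze { double x, y; };        // gaze point in the FOV (degrees)

// True when the gaze is within an angular tolerance of the fixation target.
bool onFixation(const Gaze& gaze, const Gaze& target, double toleranceDeg) {
    double dx = gaze.x - target.x, dy = gaze.y - target.y;
    return std::sqrt(dx * dx + dy * dy) <= toleranceDeg;
}

int main() {
    Gaze target{0.0, 0.0};
    Gaze samples[] = {{0.3, 0.1}, {4.0, 1.0}, {0.2, 0.0}};  // tracked frames
    for (const Gaze& g : samples) {
        bool paused = !onFixation(g, target, 2.0);  // 2-degree tolerance (assumed)
        std::printf(paused ? "test paused\n" : "test running\n");
    }
    return 0;
}
```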
- FIGs. 8A-8C are diagrams illustrating refraction testing utilizing the holographic eye testing device according to an exemplary embodiment.
- at least one virtual object 804 and/or a series of virtual lines 810 can be displayed and/or manipulated in a user’s FOV 402.
- the object 804 or the series of lines 810 are translated on a vertical plane 808 and projected on the combiner lenses 104A, 104B to give the appearance that the object 804 or the series of lines 810 is a set distance from the view of the user’s eyes 106A, 106B.
- the application software can present the object 804 or the series of lines 810 via the combiner lenses 104A, 104B so that the object 804 or the series of lines 810 can appear to be at different distances from the user’s eyes 106A, 106B.
- the presentation of the object 804 or the series of lines 810 can correspond to projection of virtual letters, images, or lines at distances of 16 inches (near visual acuity) to 20 feet (distance visual acuity) in front of the user’s eyes 106A, 106B.
- the user can start the test by providing input to the computing system 108.
- the input can take the form of voice commands, including saying key words indicative of beginning the test, gestures, or providing input from a “clicker.”
- the user states the word “start” to begin the test.
- the described systems and methods use depth of field for testing refraction.
- the refraction testing determines the user’s level of hyperopia (farsightedness), myopia (nearsightedness), and astigmatism, and three associated numerical values for sphere, cylinder, and axis, typically needed for an eyeglass prescription.
- the computing system 108 measures the refractive state of the eye by having the user move the object 804 from an initial distance towards or away from the user until a resolution of the object 804 appears clear to the user. The distance at which the object 804 appears clear to the user is labeled as a final distance.
- the computing system 108 determines an initial measurement between the final position of the virtual object and an optimal virtual position. This initial measurement is at the focal length of the refractive spherical equivalent of the eye and is the sphere power.
- the sphere power indicates the amount of lens power, measured in diopters (D), prescribed to correct nearsightedness or farsightedness, as in the sketch below.
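A minimal C++ sketch of the sphere-power arithmetic implied above: the distance at which the object first appears clear is treated as the focal length of the eye’s refractive spherical equivalent, and power in diopters is the reciprocal of that focal length in meters. The sign convention (negative sphere for a myopic far point) is a standard assumption added here, not quoted text.

```cpp
#include <cstdio>

// D = 1 / f, with f in meters; sign chosen by the type of refractive error.
double spherePowerDiopters(double clearDistanceMeters, bool myopic) {
    double magnitude = 1.0 / clearDistanceMeters;
    return myopic ? -magnitude : magnitude;
}

int main() {
    // Object becomes clear when moved in to 0.50 m: a 2.00 D myopic eye,
    // suggesting a -2.00 D sphere correction.
    std::printf("sphere: %+.2f D\n", spherePowerDiopters(0.50, true));
    return 0;
}
```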
- a series of virtual lines 810 is presented at the final distance in a plane parallel to or coincident with the plane 808.
- the series of lines 810 correspond to concentric opaque rings across the surface of an invisible sphere 812 (shown in FIG. 8C), where the concentric opaque rings traverse the surface of the sphere 812 perpendicular to the plane of the combiner lenses 104A, 104B.
- the series of lines 810 includes a predefined number of lines (for example, three lines) in the axis plane.
- the invisible sphere 812 and accompanying series of lines 810 are rotated in a clockwise motion from the user’s perspective.
- the invisible sphere 812 and accompanying series of lines 810 appear to have rotated ninety (90) degrees about an axis from the final position (parallel or coincident to the axis 814 shown in FIG. 8C).
- the user can provide input to stop the test, for example, in the form of a voice command of “stop.”
- the computing system 108 ceases rotation of the invisible sphere 812 and calculates a delta in degrees based on the rotation from the starting point of the invisible sphere 812 to the orientation of the invisible sphere 812 at the end of the test.
- the delta in degrees can be used to determine the axis value.
- the second step provides the axis. This provides the amount of astigmatism measured in this eye and therefore the predicted amount of cylindrical correction needed to bring clarity.
- the cylinder value refers to the amount of astigmatism in the eyes.
- the lines 810 are shifted 90 degrees from the axis 814 and moved further away from the user to display a blurred side of the lines 810.
- the user moves the lines 810 closer to or farther from (plus cylinder or minus cylinder) the user until the lines appear clear to the user. This is the focal length of the cylinder power and corresponds to the difference in power from the sphere.
- the sphere, cylinder, and axis values are determined.
- the above-described sequence provides the amount of hyperopia (farsightedness), myopia (nearsightedness), and astigmatism measured in this eye and therefore the predicted amount of correction needed to bring clarity. If the user has both myopia or hyperopia and astigmatism, the object 804 would be simultaneously manipulated to determine myopia or hyperopia while also evaluating the dioptric power of the astigmatism. A sketch of the combined arithmetic follows.
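Here is a minimal C++ sketch combining the three steps into a sphere/cylinder/axis triple: the lines shifted 90 degrees from the axis come clear at their own focal length, and the cylinder value is the difference in power from the sphere, per the text. The signs and the minus-cylinder convention are assumptions added here.

```cpp
#include <cstdio>

struct Refraction { double sphere, cylinder, axis; };

// Build a prescription from the two clear distances (sphere meridian and
// the meridian 90 degrees from the axis) and the measured axis.
Refraction fromFocalLengths(double sphereClearMeters,
                            double cylClearMeters, double axisDegrees) {
    double spherePower = -1.0 / sphereClearMeters;   // myopic sign convention
    double cylMeridian = -1.0 / cylClearMeters;
    return { spherePower, cylMeridian - spherePower, axisDegrees };
}

int main() {
    // First meridian clear at 0.50 m, second at 0.40 m, axis 20 degrees.
    Refraction rx = fromFocalLengths(0.50, 0.40, 20.0);
    std::printf("%+.2f %+.2f x %.0f\n", rx.sphere, rx.cylinder, rx.axis);
    // prints -2.00 -0.50 x 20
    return 0;
}
```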
- plus (convex) lenses are included in the HMD 102 underneath the lenses 104A, 104B and in front of the user’s eyes 106A, 106B. The reason for this is that the hyperopic eye focuses beyond the fixation object.
- the hyperopic eye is mathematically made to become myopic. An algorithm subtracts the values to determine the dioptric value of the hyperopia.
- a fourth step is included to refine the exact sphere power and then to binocularly balance the prescription for the two eyes.
- the object 804 is shifted further away from the user to create +0.50 diopters in each of the user’s eyes 106A, 106B.
- the object 804 initially appears as one object to the user and then is disassociated or separated until the user can identify two separate objects.
- the user reports which object is clearer.
- the clearer object is then moved away from the user until both objects appear equally as blurred.
- the objects are then merged and the user moves them closer until the resolution appears best to the user.
- the binocular balance has then been completed.
- the object 804 and lines 810 can be moved forward or backwards or rotated.
- Control of the test can take the form of voice commands including “forward,” “backward,” and “rotate.”
- a voice command of “forward” translates the plane 808, and associated object 804 or lines 810 toward the combiner lenses 104A, 104B.
- a voice command of “backward” translates the plane 808, and associated object 804 or lines 810 away from the combiner lenses 104A, 104B.
- a voice command of “rotate” rotates the plane 808 and associated object 804 or lines 810.
- a user can manipulate the object 804 until the user can, or can no longer, identify the object 804 or lines 810.
- the user can provide a voice command to the computing system 108, such as stating the word “stop” to complete the manipulation portion of the test.
- FIG. 8B is a block diagram illustrating a user’s perspective of the virtual object 804 described in FIG. 8A.
- the virtual object 804 can be implemented as one or more virtual 2D or 3D letters or images, such as shapes, pictures, etc., residing on the plane 808.
- FIG. 8C is a block diagram illustrating a user’s perspective of the series of lines 810 described in FIG. 8A.
- the series of lines 810 can be implemented as parallel lines (running vertical and/or horizontal) residing on the plane 808.
- the series of lines 810 correspond to concentric opaque rings across the surface of the invisible sphere 812, where the concentric opaque rings traverse the surface of the sphere 812 perpendicular to the plane of the combiner lenses 104A, 104B.
- FIG. 9A is a block diagram illustrating convergence testing utilizing the holographic eye testing device according to an exemplary embodiment.
- the head mounted display (HMD) 102 incorporates eye tracking. Eye tracking enables the HMD 102 to track where the user’s eyes 106A, 106B are looking in real time.
- at least one virtual object 902 can be displayed and/or manipulated in a user’s FOV 402.
- the object 902 is translated on a vertical plane 908 and projected on the combiner lenses 104A, 104B to give the appearance that the object 902 is a set distance from the view of the user’s eyes 106A, 106B.
- the application software can present the object 902 via the combiner lenses 104A, 104B so that the object 902 can appear to be at different distances from the user’s eyes 106A, 106B.
- the presentation of the object 902 can correspond to projection of virtual letters, images, or lines at distances of 16 inches (near visual acuity) to 20 feet (distance visual acuity) in front of the user’s eyes 106A, 106B.
- the user can start the test by providing input to the computing system 108.
- the input can take the form of voice commands, including saying key words indicative of beginning the test, gestures, or providing input from a “clicker.”
- the user states the word “start” to begin the test.
- the object 902 is presented to each eye 106A, 106B and is moved across the user’s FOV 402.
- the user follows the object 902 from left to right and right to left.
- the object 902 is then moved in a circle clockwise and counter-clockwise.
- the HMD 102, via eye tracking, monitors fixation loss and quality of movement of the user’s eyes 106A, 106B. Points of fixation loss and the quality of movement are recorded.
- Convergence near point is assessed by rendering a first virtual object 910 displayed to a right eye and a second virtual object 912 displayed to a left eye within the holographic display device, wherein the rendering corresponds to the first virtual object and the second virtual object aligned to appear as one virtual object to the user.
- the first virtual object 910 and the second virtual object 912 appear at a distance of 40 centimeters in front of the user’s eyes 106A, 106B.
- the user moves the first virtual object 910 and the second virtual object 912 presented to both eyes 106A, 106B toward the user’s nose.
- the HMD 102 monitors eye alignment and records the distance (in centimeters or inches) at which the eyes 106A, 106B lose alignment on the first virtual object 910 and the second virtual object 912 and the two objects appear as two separate objects. This is recorded as the break point.
- the first virtual object 910 and the second virtual object 912 are then moved away from the user and the HMD 102 records the distance when the eyes 106A, 106B realign and the user fuses the first virtual object 910 and the second virtual object 912 back to appear as one virtual object to the user. This is recorded as the realignment point.
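- A sketch of the break/realignment bookkeeping follows; the alignment check is a placeholder for the HMD’s eye tracking, and the names and sampling scheme are assumptions rather than the system’s actual implementation.

```python
from typing import Callable, List, Optional

def convergence_near_point(distances_cm: List[float],
                           aligned_at: Callable[[float], bool]) -> dict:
    """Walk the fused target toward the nose, then back out, recording the
    break point and the realignment point.

    distances_cm: monotonically decreasing distances presented on approach
                  (e.g., starting at 40 cm).
    aligned_at:   placeholder callable(distance_cm) -> bool standing in for
                  the HMD's eye-tracking alignment check.
    """
    break_cm: Optional[float] = None
    realign_cm: Optional[float] = None
    for d in distances_cm:                    # objects move toward the nose
        if not aligned_at(d):
            break_cm = d                      # eyes lose alignment; the object doubles
            break
    if break_cm is not None:
        for d in reversed(distances_cm):      # objects move back away from the user
            if d > break_cm and aligned_at(d):
                realign_cm = d                # user fuses the objects again
                break
    return {"break_cm": break_cm, "realign_cm": realign_cm}
```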
- FIG. 9B is a block diagram illustrating a user’s perspective of the virtual object 902 and/or the first virtual object 910 and the second virtual object 912 as viewed aligned, as described in FIG. 9A.
- the virtual object 902 (and/or the first virtual object 910 and the second virtual object 912) can be implemented as one or more virtual 2D or 3D letters or images, such as shapes, pictures, etc., residing on the plane 908.
- FIGs. 10A-10D illustrate methods for diagnosis and/or prescription of remedies for visual impairment in accordance with exemplary embodiments.
- FIG. 10A illustrates a method for diagnosis and prescription of remedies for visual impairment in accordance with an exemplary embodiment.
- the holographic display device renders one or more three dimensional objects with the holographic display device.
- the rendering corresponds to a virtual level of depth viewable by a user.
- the holographic display device updates the rendering of the one or more three dimensional objects within the holographic display device.
- the updated rendering includes a virtual movement of the one or more three dimensional objects within the virtual level of depth.
- the virtual movement includes moving the one or more three dimensional objects laterally in the field of view of the user.
- the virtual movement includes moving the one or more three dimensional objects vertically in the field of view of the user.
- the virtual movement includes moving the one or more three dimensional objects from a distal position to a proximal position within the field of view of the user.
- the virtual level of depth corresponds to a simulated distance away from the user.
- the simulated distance can range from sixteen (16) inches to twenty (20) feet from the user.
- the holographic display device receives input from a user.
- the input can include an indication of alignment of the one or more three dimensional objects based on the virtual movement.
- the indication of alignment can include a relative virtual position between the one or more three dimensional objects.
- the input from the user can include hand gestures and voice commands.
- the holographic display device determines a delta between the relative virtual position of the one or more three dimensional objects and an optimal virtual position.
- the holographic display device generates a prescriptive remedy based on the delta between the relative virtual position of the one or more three dimensional objects and the optimal virtual position.
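- Taken together, FIG. 10A reduces to a render/update/measure loop; a minimal sketch is given below, with the mapping from delta to prescriptive remedy left as an explicitly assumed placeholder since the method does not specify it.

```python
from typing import Callable

def run_alignment_test(render: Callable[[], None],
                       update: Callable[[], None],
                       get_relative_position: Callable[[], float],
                       optimal_position: float) -> dict:
    """Skeleton of the FIG. 10A flow. render/update/get_relative_position are
    placeholders for the holographic display calls and the user's input."""
    render()                                     # 3D objects at a virtual level of depth
    update()                                     # lateral/vertical/distal-to-proximal movement
    relative_position = get_relative_position()  # indication of alignment from the user
    delta = relative_position - optimal_position
    # Placeholder: the actual delta-to-remedy mapping is not specified here.
    return {"delta": delta, "prescriptive_remedy": None}
```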
- FIG. 10B illustrates a method for diagnosis and prescription of remedies for visual impairment in accordance with an exemplary embodiment.
- a diagnostic module configured to execute on a computing device communicatively coupled to the head mounted holographic display device renders a first virtual object displayed to the right eye and a second virtual object displayed to the left eye within a head mounted holographic display device.
- the rendering corresponds to the first virtual object and the second virtual object aligned to appear as one virtual object to a user.
- the diagnostic module updates the rendering of the first virtual object and the second virtual object within the holographic display device.
- the update includes a virtual movement of the first virtual object in a first direction and the second virtual object in a second direction opposite the first direction.
- the diagnostic module receives input from a user.
- the input includes an indication of separation of the first virtual object and the second virtual object based on the virtual movement.
- the indication of separation comprises a relative virtual position between the first virtual object and the second virtual object where the user views the first virtual object and the second virtual object as separate objects.
- the diagnostic module determines a first delta between the relative virtual position of the first virtual object and the second virtual object and an optimal virtual position.
- the diagnostic module updates the rendering of the first virtual object and the second virtual object within the holographic display device.
- the update includes a virtual movement of the first virtual object in the second direction and the second virtual object in the first direction.
- the diagnostic module receives a second input from the user.
- the input includes an indication of alignment of the first virtual object and the second virtual object based on the virtual movement.
- the indication of alignment comprises a relative virtual position between the first virtual object and the second virtual object where the user views the first virtual object and the second virtual object as aligned to appear as one virtual object.
- the diagnostic module determines a second delta between the relative virtual position of the first virtual object and the second virtual object and an optimal virtual position.
- FIG. 10C illustrates a method for diagnosis and prescription of remedies for visual impairment in accordance with an exemplary embodiment.
- the diagnostic module renders at least one virtual object within the holographic display device at an initial position.
- the rendering corresponds to a virtual level of depth corresponding to an initial simulated distance away from a user.
- the diagnostic module receives at least one first input from the user.
- the at least one first input comprises an indication to move the at least one virtual object virtually towards or away from the user.
- the diagnostic module updates the rendering of the at least one virtual object within the holographic display device.
- the update includes a virtual movement of the at least one virtual object in a direction towards or away from the user.
- the diagnostic module receives a second input from the user.
- the second input includes an indication that the at least one virtual object appears clear to the user at a final position.
- the rendering corresponds to a virtual level of depth corresponding to a final simulated distance away from the user.
- the diagnostic module determines a measurement between the final position of the virtual object and an optimal virtual position.
- the diagnostic module renders at least one line within the holographic display device at the final position.
- the diagnostic module rotates the at least one line about an axis from the final position.
- the diagnostic module receives a third input from the user.
- the third input comprises an indication that the at least one line appears to the user to have rotated ninety degrees about the axis from the final position.
- the diagnostic module determines a delta in degrees based on the rotation of the at least one line from the final point to an orientation of the at least one line at a time of receiving the third input.
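- The delta in degrees can be read as the difference between the rotation actually applied to the line and the quarter turn the user reports perceiving; a sketch follows, with the angle convention assumed.

```python
def rotation_delta_deg(start_angle_deg: float, angle_at_report_deg: float) -> float:
    """Delta between the line's actual rotation from the final position and the
    90 degrees the user reports perceiving (the third input). A nonzero delta
    suggests the user's perceived orientation is skewed, e.g., by an
    astigmatic axis. The angle convention here is an assumption."""
    actual_rotation = (angle_at_report_deg - start_angle_deg) % 180.0  # a line repeats every 180°
    return actual_rotation - 90.0

# Example: the line started at 0° and the user reported a quarter turn at 75°,
# so the user perceived 90° after only 75° of actual rotation (delta of -15°).
print(rotation_delta_deg(0.0, 75.0))  # -15.0
```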
- FIG. 10D illustrates a method for diagnosis and prescription of remedies for visual impairment in accordance with an exemplary embodiment.
- the diagnostic module renders at least one virtual object within the holographic display device.
- the rendering corresponds to a virtual level of depth viewable by the user.
- the diagnostic module updates the rendering of the at least one virtual object within the holographic display device.
- the update includes a virtual movement of the at least one virtual object within the virtual level of depth.
- the diagnostic module monitors, via eye tracking, at least one of fixation loss or quality of movement of the eyes of the user.
- the diagnostic module renders a first virtual object displayed to a right eye and a second virtual object displayed to a left eye within the holographic display device. The rendering corresponds to the first virtual object and the second virtual object aligned to appear as one virtual object to a user.
- the diagnostic module updates the rendering of the first virtual object and the second virtual object within the holographic display device.
- the update includes a virtual movement of the first virtual object and the second virtual object towards the user.
- the diagnostic module monitors, via the eye tracking, eye alignment as the eyes of the user track the first virtual object and the second virtual object moving towards the user.
- the diagnostic module identifies a distance when the user views the first virtual object and the second virtual object as separate objects.
- the diagnostic module determines a delta between the relative virtual position of the first virtual object and the second virtual object and an optimal virtual position.
- the diagnostic module generates a prescriptive remedy based on the delta between the relative virtual position of the first virtual object and the second virtual object and the optimal virtual position.
- FIG. 11 depicts a block diagram of an exemplary computing device in accordance with an exemplary embodiment.
- Embodiments of the computing device 1100 can implement embodiments of the system including the holographic eye testing device.
- the computing device can be embodied as a portion of the holographic eye testing device and supporting computing devices.
- the computing device 1100 includes one or more non-transitory computer-readable media for storing one or more computer-executable instructions or software for implementing exemplary embodiments.
- the non-transitory computer-readable media may include, but are not limited to, one or more types of hardware memory, non-transitory tangible media (for example, one or more magnetic storage disks, one or more optical disks, one or more flash drives, one or more solid state disks), and the like.
- memory 1106 included in the computing system 108 may store computer-readable and computer-executable instructions or software (e.g., applications 1130 such as a rendering application) for implementing exemplary operations of the computing device 1100.
- the computing system 108 also includes configurable and/or programmable processor 1102 and associated core(s) 1104, and optionally, one or more additional configurable and/or programmable processor(s) 1102’ and associated core(s) 1104’ (for example, in the case of computer systems having multiple processors/cores), for executing computer-readable and computer-executable instructions or software stored in the memory 1106 and other programs for implementing exemplary embodiments of the present disclosure.
- Processor 1102 and processor(s) 1102’ may each be a single core processor or multiple core (1104 and 1104’) processor. Either or both of processor 1102 and processor(s) 1102’ may be configured to execute one or more of the instructions described in connection with computing system 108.
- Virtualization may be employed in the computing system 108 so that infrastructure and resources in the computing system 108 may be shared dynamically.
- a virtual machine 1112 may be provided to handle a process running on multiple processors so that the process appears to be using only one computing resource rather than multiple computing resources. Multiple virtual machines may also be used with one processor.
- Memory 1106 may include a computer system memory or random access memory, such as DRAM, SRAM, EDO RAM, and the like. Memory 1106 may include other types of memory as well, or combinations thereof.
- the computing system 108 can receive data from input/output devices. A user may interact with the computing system 108 through a visual display device 1114, such as combiner lenses 1116, which may display one or more virtual graphical user interfaces, a microphone 1120, and one or more cameras 1118.
- the computing system 108 may also include one or more storage devices 1126, such as a hard-drive, CD-ROM, or other computer readable media, for storing data and computer-readable instructions and/or software that implement exemplary embodiments of the present disclosure.
- exemplary storage device 1126 can include storing information associated with platform software and the application software.
- the computing system 108 can include a network interface 1108 configured to interface via one or more network devices 1124 with one or more networks, for example, Local Area Network (LAN), Wide Area Network (WAN) or the Internet through a variety of connections including, but not limited to, standard telephone lines, LAN or WAN links (for example, 802.11, T1, T3, 56kb, X.25), broadband connections (for example, ISDN, Frame Relay, ATM), wireless connections, controller area network (CAN), or some combination of any or all of the above.
- the computing system can include one or more antennas 1122 to facilitate wireless communication (e.g., via the network interface) between the computing system 108 and a network and/or between the computing system 108 and other computing devices.
- the network interface 1108 may include a built-in network adapter, network interface card, PCMCIA network card, card bus network adapter, wireless network adapter, USB network adapter, modem or any other device suitable for interfacing the computing system 108 to any type of network capable of communication and performing the operations described herein.
- the computing system 108 may run any operating system 1110, such as any of the versions of the Microsoft® Windows® operating systems, the different releases of the Unix and Linux operating systems, any version of the MacOS® for Macintosh computers, any embedded operating system, any real-time operating system, any open source operating system, any proprietary operating system, or any other operating system capable of running on the computing system 108 and performing the operations described herein.
- the operating system 1110 may be run in native mode or emulated mode.
- the operating system 1110 may be run on one or more cloud machine instances.
- FIG. 12 illustrates a visual field test rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- the visual field test may be dynamic and/or static, as explained below.
- a visual field is how wide of an area an eye can see when focused on a central point.
- the visual field testing described herein is a method to measure how much vision there is in an eye, and how much vision loss may have occurred in the eye over time by determining how a user views objects in the user’s field of vision.
- a visual field test can determine if there are blind spot(s) (called scotoma) in a user’s vision and where the blind spot(s) are located.
- a scotoma’s size and shape can show how eye disease or a brain disorder is affecting vision.
- the visual field test is administered to one eye at a time. This may be accomplished by blacking out or turning off one of the combiner lenses (e.g., combiner lenses 104A or 104B).
- the holographic eye testing device may include the user’s lens prescription placed in front of the eye to ensure that the user is seeing as well as possible.
- the user wearing the holographic eye testing device looks at a virtual center target 1202 throughout the test.
- the virtual center target 1202 may appear as a solid black circle.
- small dim virtual lights begin to appear in different places throughout the user’s field of vision; the user indicates each light seen, for example by pressing a button or responding vocally.
- a light may blink in different places throughout the user’s field of vision.
- the computing device 108 tracks which lights the user did not see (e.g., the user does not press the button or vocally indicate seeing the light).
- This process assists in creating a detailed map 1204 of where the user can and cannot see.
- the detailed map 1204 indicates which lights the user sees outside of the user’s central area of vision, such as whether there is peripheral vision loss.
- the cluster or darkened area 1206 indicates that the user did not see some or all of the lights presented in that area of the visual field.
- the holographic eye testing device may present virtual objects, figures, or shapes.
- lights are shown in some areas where the computing device 108 knows that the user cannot see them. This is done deliberately to find a visual threshold, the area where the user cannot see lights half of the time.
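- A sketch of this bookkeeping: record each presented light’s field position and whether the user responded, then estimate the threshold region as the locations seen about half of the time (data structures and names assumed).

```python
from collections import defaultdict
from typing import Dict, Iterable, List, Tuple

Location = Tuple[float, float]  # (x_deg, y_deg) in the visual field

def build_field_map(presentations: Iterable[Tuple[Location, bool]]) -> Dict[Location, float]:
    """presentations: one ((x_deg, y_deg), seen) tuple per presented light.
    Returns per-location detection rates used to build the detailed map 1204."""
    tallies = defaultdict(lambda: [0, 0])            # location -> [seen, shown]
    for loc, seen in presentations:
        tallies[loc][1] += 1
        if seen:
            tallies[loc][0] += 1
    return {loc: seen / shown for loc, (seen, shown) in tallies.items()}

def threshold_locations(field_map: Dict[Location, float], tol: float = 0.1) -> List[Location]:
    """Locations where the user saw the light about half of the time,
    i.e., the visual threshold described above."""
    return [loc for loc, rate in field_map.items() if abs(rate - 0.5) <= tol]
```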
- the dynamic visual field testing is similar to the static visual field testing described above, except that the dynamic visual field testing uses moving virtual lights instead of blinking lights.
- the holographic eye testing device uses an optical illusion to check for vision loss. For example, vertical bars (usually black and white) appear within the user’s field of vision. These bars flicker at varying rates. If the user is not able to see the vertical bars at certain times during the test, it may indicate vision loss in certain parts of the visual field.
- FIG. 13 illustrates color vision testing rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- the color vision testing may include Ishihara type, D-15, and other Farnsworth tests.
- the holographic eye testing device presents to a user a shape 1302, such as a colored number, letter, or figure, within a colored background 1304.
- the shape 1302 and/or the background 1304 may be comprised of colored dots, as shown.
- the shape 1302 and the background 1304 are different colors from each other such that the shape 1302 stands out from the background 1304.
- the colors of the shape and background are purposely pale and are carefully chosen from hues that are difficult for a color-deficient user to distinguish. Users with defective color vision see either no pattern at all or an alternative pattern based on brightness rather than hue.
- the color vision test is administered to one eye at a time. This may be accomplished by blacking out or turning off one of the combiner lenses (e.g., combiner lenses 104A or 104B).
- the holographic eye testing device may include the user’s lens prescription placed in front of the eye to ensure that the user is seeing as well as possible.
- the holographic eye testing device presents the user with a specific number of colored virtual objects, each object including a different hue.
- the hues are more saturated, and they cover the spectrum so that the user will confuse colors for which they have deficient perception (such as red and green).
- the user is asked to virtually arrange the objects in sequence, for example, using a hand to virtually move each object or using a controller.
- the user virtually arranges the colors by similarity in a sequential color series. Errors can be plotted on a simple circular diagram to define the nature of the color deficiency.
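- One way to plot such errors on the circular diagram is sketched below: number the objects by their correct hue order and record the jump between each pair the user placed adjacently; long chords across the circle indicate a confusion axis. This is a simplified scoring convention assumed for illustration, not the full Farnsworth scoring method.

```python
from typing import List

def arrangement_chords(user_order: List[int]) -> List[int]:
    """user_order: the indices 0..n-1 (correct hue sequence) in the order the
    user arranged all n objects. A perfect arrangement yields all 1s; jumps
    across the hue circle yield larger chord lengths."""
    n = len(user_order)                       # assumes the user placed every object
    chords = []
    for a, b in zip(user_order, user_order[1:]):
        step = abs(a - b)
        chords.append(min(step, n - step))    # distance around the hue circle
    return chords

# Example: a perfect series gives [1, 1, 1, ...]; a red-green confusion
# tends to produce repeated long chords across the diagram.
print(arrangement_chords([0, 1, 2, 3, 4]))  # [1, 1, 1, 1]
```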
- FIG. 14 illustrates an Amsler grid visual field test rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- the Amsler grid visual field test is used to detect potential field loss or distortions caused by retinal detachments and macula pathology.
- the holographic eye testing device displays a virtual pattern of straight lines that makes a grid 1402 of many equal squares 1404 and a virtual center focal point 1406 or focal target.
- the center focal point 1406 is a black dot.
- the grid may include 0.5 centimeter (cm) squares that form a larger square of 10 cm by 10 cm. This test evaluates 10 degrees of vision from the focal point in each direction, covering a visual field of 20 degrees overall. The user looks at a dot in the middle of the grid and describes any areas that may appear wavy, blurry, or blank.
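- The grid geometry described above is straightforward to reproduce; a sketch that generates the line coordinates for a 10 cm by 10 cm grid of 0.5 cm squares with a central dot (units in centimeters, rendering itself left to the display layer, names assumed).

```python
from typing import List, Tuple

Point = Tuple[float, float]

def amsler_grid(size_cm: float = 10.0, square_cm: float = 0.5):
    """Return vertical/horizontal line segments for the grid 1402 of equal
    squares 1404, plus the center focal point 1406 at the origin."""
    n_lines = int(size_cm / square_cm) + 1                     # 21 lines each way
    positions = [i * square_cm - size_cm / 2 for i in range(n_lines)]
    vertical: List[Tuple[Point, Point]] = [((x, -size_cm / 2), (x, size_cm / 2))
                                           for x in positions]
    horizontal: List[Tuple[Point, Point]] = [((-size_cm / 2, y), (size_cm / 2, y))
                                             for y in positions]
    center_dot: Point = (0.0, 0.0)
    return vertical, horizontal, center_dot
```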
- the Amsler grid visual field test is administered to one eye at a time. This may be accomplished by blacking out or turning off one of the combiner lenses (e.g., combiner lenses 104A or 104B).
- the holographic eye testing device may include the user’s lens prescription placed in front of the eye to ensure that the user is seeing as well as possible.
- the grid 1402 further includes four diagonal lines to assist the user to focus on the center dot, for example if the user has a central scotoma (blind spot in the middle of the visual field).
- the grid 1402 further includes different color lines, background, and/or dot, such as a black background with red lines and a red dot. This grid is helpful in identifying disorders that have an associated red desaturation such as pituitary tumor causing partial blindness, toxic maculopathy, or toxic optic neuropathy.
- the grid 1402 further includes a black background with a large central white dot with randomly placed smaller dots throughout the grid. There may be no lines in this grid. This grid may be used to differentiate between blind spots and distortions.
- the grid 1402 further includes a black background with white horizontal lines with a white dot in the center.
- the horizontal lines may help determine distortions related to curved sections of the cornea.
- the grid 1402 further includes a white background and black lines. The horizontal lines are closer together near the black dot in the center than further from it. This can be helpful in identifying fine visual distortions near the center of the user’s visual field.
- the grid 1402 further includes another smaller grid at the center around the central dot. This allows identification of disease in half a degree. This is helpful in identification of macular degeneration.
- the user focuses on the center focal point 1406 and identifies, either verbally or using a controller: whether the dot in the center is visible; whether the user can see the four corners and the four sides of the grid while focusing on the dot; whether there are blank or blurry sections of the grid; whether any lines (horizontal or vertical) appear wavy; and whether there are moving lines, shiny sections, and/or vibrations in the grid. If the lines appear distorted or disappear, the user may mark the areas where these were noted. The marks may be made verbally (for example, by noting the number of squares between the dot and the abnormality), using a controller, or virtually using a hand.
- the computing device 108 may store the results in the database for later reference, as changes in the area of distortion can represent a progressive condition, stabilized condition, or improvement in the user’s condition.
- FIG. 15 illustrates a pupillometry test rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- Pupillometry measures the spontaneous variation of the pupil diameter and the pupillary light reflex.
- the holographic eye testing device shines a light 1502 in the user’s eyes 1504 and measures the speed of pupillary response.
- the light 1502 may be a white light presented in the holographic eye testing device.
- the holographic eye testing device may present a beam of light to a pupil and observe the size of the pupil and reaction in the eye that is lit.
- a physical light (e.g., a flashlight, an LED light, a UV light, an incandescent light, etc.) is added to the holographic eye testing device to shine a light 1502 in the user’s eyes 1504.
- One or more cameras 1506 are coupled to the holographic eye testing device and obtain images (e.g., pictures and video) of the pupil before, during, and/or after the light 1502 is presented to the user’s eye.
- the computing device 108 uses the images to determine the sizes of the pupil before, during, and/or after the light 1502 is presented to the user’s eye.
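- A sketch of that size computation, assuming OpenCV is available and that the pupil is the darkest roughly circular region in each frame; the detector choice and all parameters are illustrative guesses, not the device’s actual pipeline.

```python
from typing import Optional

import cv2
import numpy as np

def pupil_diameter_px(gray_frame: np.ndarray) -> Optional[float]:
    """Estimate pupil diameter in pixels from a grayscale eye image.
    Hough-circle detection is one plausible approach; parameters are guesses."""
    blurred = cv2.medianBlur(gray_frame, 5)
    circles = cv2.HoughCircles(blurred, cv2.HOUGH_GRADIENT, dp=1.2, minDist=100,
                               param1=80, param2=30, minRadius=10, maxRadius=80)
    if circles is None:
        return None
    radius = circles[0][0][2]                 # strongest detection: (x, y, r)
    return 2.0 * float(radius)

def constriction_percent(before_px: float, during_px: float) -> float:
    """Extent of the pupillary light reflex once the light 1502 is presented."""
    return 100.0 * (before_px - during_px) / before_px
```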
- the pupillometry test is used to detect neurological disorders such as Parkinson’s or Alzheimer’s, and as a test for concussions.
- the computing device 108 may store the results in the database for later reference, as changes in the sizes of the pupil can represent a progressive condition, stabilized condition, or improvement in the user’s condition.
- FIG. 16 illustrates a retinal scan test rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- An iris biometrics module 1602 can be integrated into the holographic eye testing device and/or computing device 108 to capture the user’s iris and do eye-tracking at the same time.
- the iris biometrics module 1602 uses retinal scans to map the unique patterns of the user’s retina.
- the retinal scan is performed by casting an unperceived beam of low-energy infrared light emitted by the holographic eye testing device into a person’s eye as they look within or through the holographic eye testing device. This beam of light traces a standardized path on the retina.
- specialized software in the iris biometrics module 1602 compiles the unique features of the network of retinal blood vessels into a template.
- the iris biometrics module 1602 may further identify cataract and/or foreign bodies.
- FIG. 17 illustrates a tear breakup time test via the holographic eye testing device, in accordance with an exemplary embodiment.
- Tear breakup time (TBUT) is a clinical test used to assess for evaporative dry eye disease.
- An eye analysis module 1702 is integrated into the holographic eye testing device to capture images of the tear film 1702, upon which specialized software in the eye analysis module 1702 performs analysis to identify dry spots.
- fluorescein is instilled into the user's tear film 1702 before the user places his or her head into the holographic eye testing device. The user is asked not to blink while the tear film 1702 is observed under a broad beam of cobalt blue illumination emitted by the holographic eye testing device.
- the holographic eye testing device records the TBUT as the number of seconds that elapse between the last blink and the appearance of the first dry spot in the tear film, as seen in this progression of the images captured by the eye analysis module 1702 over time.
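- The TBUT itself reduces to a timestamp difference over the captured image sequence; a sketch follows, with the blink and dry-spot detectors as assumed placeholders for the module’s specialized software.

```python
from typing import Callable, Iterable, Optional, Tuple

def tear_breakup_time_s(frames: Iterable[Tuple[float, object]],
                        is_blink: Callable[[object], bool],
                        has_dry_spot: Callable[[object], bool]) -> Optional[float]:
    """frames: (timestamp_s, image) pairs from the eye analysis module.
    Returns seconds between the last blink and the first dry spot, or None
    if no dry spot appears in the sequence."""
    last_blink_t: Optional[float] = None
    for t, img in frames:
        if is_blink(img):
            last_blink_t = t                  # restart the clock at each blink
        elif has_dry_spot(img) and last_blink_t is not None:
            return t - last_blink_t           # the recorded TBUT
    return None
```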
- FIG. 18 illustrates a syntonics color therapy test via the holographic eye testing device, in accordance with an exemplary embodiment.
- Syntonics is a form of light therapy used to treat vision problems. For example, a patient whose eyes turn in would most likely view a different color than a patient whose eyes turn out.
- the holographic eye testing device displays a variety of colors 1802, 1804, 1806 on the spectrum to the user based on the user’s vision problem. While three colors are illustrated in the figure, more or fewer colors may be used without departing from this embodiment. The colors may be displayed as shapes or blotches.
- the user verbally identifies each color 1802, 1804, 1806 displayed by the holographic eye testing device. The identified color is recorded by the holographic eye testing device and saved in a database or recorded by a technician.
- multiple vision tests are performed to determine the most successful syntonic treatment. These tests include pupil reaction tests and functional vision field tests. Based on the findings of these tests, specific colors are shown to the patient based on his or her visual condition. For example, red and orange are often used to treat lazy eyes, and green or yellow is used to treat eyes that are turned inward. The blue spectrum can help improve a patient’s focus while reading or doing near work.
- FIG. 19 illustrates a colorimetry test via the holographic eye testing device, in accordance with an exemplary embodiment.
- the colorimetry test is used to sequentially explore color space to find the optimal precision tint for the relief of perceptual distortions, commonly known as visual stress.
- the holographic eye testing device displays a variety of tinted colors overlaid on text to the user.
- the holographic eye testing device may display one tinted color at a time or two or more rows of different tinted colors. Assessment involves the presentation of each overlay in turn for comparison with others shown previously or placed beside it.
- a physical tint 1902 of a specified tinted color such as a tinted transparent plastic material, is placed between the user’s eye and the combiner lenses (e.g., combiner lenses 104A or 104B).
- the tint is virtually produced over the text as the user views the text within the holographic eye testing device.
- FIGs. 20A and 20B illustrate a keratometry test via the holographic eye testing device, in accordance with an exemplary embodiment.
- Keratometry is a procedure used to measure the curvature of the cornea (corneal curvature).
- a digital camera 2002 is integrated into the holographic eye testing device.
- An eye analysis module is further integrated into the holographic eye testing device and/or the computing device 108 to perform the analysis described below.
- the holographic eye testing device displays a virtual bowl containing an illuminated pattern, such as a series of concentric rings. Light is focused on the anterior surface of the user's cornea using light emitted from the holographic eye testing device or physical lights 2004 added to the holographic eye testing device.
- the light is reflected back to the digital camera 2002 at the holographic eye testing device.
- the topology of the cornea is revealed by the shape taken by the reflected pattern.
- the eye analysis module provides the necessary analysis, typically determining the position and height of several thousand points across the cornea.
- the eye analysis module may generate a topographical map 2006 as shown in FIG. 20B, such as sagittal map, which color-codes the steepness of curvature according to its dioptric value.
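- Converting curvature to dioptric value for such a map conventionally uses the keratometric index; a sketch follows (the 1.3375 index is the standard keratometric convention, not a value stated here).

```python
def corneal_power_d(radius_mm: float, keratometric_index: float = 1.3375) -> float:
    """Convert a corneal radius of curvature (mm) to dioptric power.
    P = (n - 1) / r with r in meters; with n = 1.3375 this is the familiar
    337.5 / r_mm rule used to color-code steepness on a sagittal map."""
    return (keratometric_index - 1.0) * 1000.0 / radius_mm

# Example: a 7.5 mm radius corresponds to 45.0 D; a steeper cornea (smaller r)
# maps to a higher dioptric value.
print(corneal_power_d(7.5))  # 45.0
```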
- FIG. 21 illustrates a depth perception eye test rendered in VR and/or AR via the holographic eye testing device, in accordance with an exemplary embodiment.
- the holographic eye testing device renders a virtual object 2102 in front of a virtual circle 2104.
- the user focuses his or her gaze primarily on the circle 2104 and typically sees two images of the object 2102 on either side of the circle 2104.
- the user verbally identifies or uses a controller to identify whether he or she views two objects. If the user views the two objects, it’s a sign of strong depth perception.
- the user may further identify whether the user views the object 2102 as larger on one side than on the other, views the object 2102 better on one side, or views only one image of the object 2102, not two. Any of these occurring during the test may signal poor depth perception.
- Exemplary flowcharts are provided herein for illustrative purposes and are non- limiting examples of methods.
- One of ordinary skill in the art will recognize that exemplary methods can include more or fewer steps than those illustrated in the exemplary flowcharts and that the steps in the exemplary flowcharts can be performed in a different order than the order shown in the illustrative flowcharts.
Landscapes
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- Engineering & Computer Science (AREA)
- Medical Informatics (AREA)
- Public Health (AREA)
- General Health & Medical Sciences (AREA)
- Biomedical Technology (AREA)
- General Physics & Mathematics (AREA)
- Surgery (AREA)
- Optics & Photonics (AREA)
- Heart & Thoracic Surgery (AREA)
- Ophthalmology & Optometry (AREA)
- Biophysics (AREA)
- Veterinary Medicine (AREA)
- Animal Behavior & Ethology (AREA)
- Molecular Biology (AREA)
- Primary Health Care (AREA)
- Epidemiology (AREA)
- Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
- Data Mining & Analysis (AREA)
- Databases & Information Systems (AREA)
- Pathology (AREA)
- Radiology & Medical Imaging (AREA)
- Eye Examination Apparatus (AREA)
- Holo Graphy (AREA)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
US202063083682P | 2020-09-25 | 2020-09-25 | |
PCT/US2021/051518 WO2022066744A1 (fr) | 2020-09-25 | 2021-09-22 | Système de réfraction en espace réel holographique |
Publications (1)
Publication Number | Publication Date |
---|---|
EP4216796A1 true EP4216796A1 (fr) | 2023-08-02 |
Family
ID=80844773
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP21873337.6A Pending EP4216796A1 (fr) | 2020-09-25 | 2021-09-22 | Système de réfraction en espace réel holographique |
Country Status (6)
Country | Link |
---|---|
EP (1) | EP4216796A1 (fr) |
JP (1) | JP2023543822A (fr) |
AU (1) | AU2021347290A1 (fr) |
CA (1) | CA3193959A1 (fr) |
MX (1) | MX2023003577A (fr) |
WO (1) | WO2022066744A1 (fr) |
Family Cites Families (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US7309128B2 (en) * | 2002-09-20 | 2007-12-18 | Centrofuse Technologies, Llc | Automated stereocampimeter and related method for improved measurement of the visual field |
WO2012006330A1 (fr) * | 2010-07-06 | 2012-01-12 | Konan Medical Usa, Inc. | Évaluation de réponses pupillaires à des stimuli lumineux |
US9706910B1 (en) * | 2014-05-29 | 2017-07-18 | Vivid Vision, Inc. | Interactive system for vision assessment and correction |
US20180190011A1 (en) * | 2017-01-04 | 2018-07-05 | Osterhout Group, Inc. | Content rendering systems for head-worn computers |
US10441161B2 (en) * | 2018-02-26 | 2019-10-15 | Veyezer LLC | Holographic real space refractive sequence |
2021
- 2021-09-22 AU AU2021347290A patent/AU2021347290A1/en active Pending
- 2021-09-22 MX MX2023003577A patent/MX2023003577A/es unknown
- 2021-09-22 WO PCT/US2021/051518 patent/WO2022066744A1/fr unknown
- 2021-09-22 JP JP2023519446A patent/JP2023543822A/ja active Pending
- 2021-09-22 EP EP21873337.6A patent/EP4216796A1/fr active Pending
- 2021-09-22 CA CA3193959A patent/CA3193959A1/fr active Pending
Also Published As
Publication number | Publication date |
---|---|
CA3193959A1 (fr) | 2022-03-31 |
JP2023543822A (ja) | 2023-10-18 |
MX2023003577A (es) | 2023-06-27 |
AU2021347290A1 (en) | 2023-05-25 |
WO2022066744A1 (fr) | 2022-03-31 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US20220007929A1 (en) | Holographic Real Space Refractive System | |
KR100729889B1 (ko) | 검안장치 | |
JP2021502130A (ja) | デジタル治療用矯正眼鏡 | |
KR102148345B1 (ko) | 대물렌즈 안굴절도와 개인의 적어도 하나의 기하학적-형태 파라미터를 측정하기 위한 장치와 방법 | |
CN117770757A (zh) | 光场处理器系统 | |
CN109688898B (zh) | 辅助建立用于矫正斜视或隐斜视的矫正的设备和相关方法 | |
RU2634682C1 (ru) | Портативное устройство для исследования зрительных функций | |
US11789259B2 (en) | Vision inspection and correction method, together with the system apparatus thereof | |
US11253149B2 (en) | Holographic real space refractive sequence | |
US20200081269A1 (en) | Apparatus, Method and System for Measuring the Influence of Ophthalmic Lens Design | |
US11614623B2 (en) | Holographic real space refractive system | |
EP4216796A1 (fr) | Système de réfraction en espace réel holographique | |
EP4183320A1 (fr) | Procédé pour comparer deux lentilles ophtalmiques ayant différentes conceptions optiques | |
US20240288716A1 (en) | Virtual reality head-mounted display device with built-in strabismus treatment and operation method thereof | |
KR102724424B1 (ko) | 디지털 치료적 보정 안경 | |
WO2023049086A2 (fr) | Système de réfraction en espace réel holographique | |
CA3188913A1 (fr) | Systeme de refraction en espace reel holographique | |
TW202434186A (zh) | 具斜視矯正功能之虛擬實境頭戴式顯示器及其運作方法 | |
Vicente | Optical Instruments and Machines | |
Coppin | Mathematical modeling of a light field fundus camera and its applications to retinal imaging | |
WO2024094839A1 (fr) | Procédés et dispositifs mis en œuvre par ordinateur pour déterminer des erreurs de réfraction | |
IL305329A (en) | Method, system and computer program product for determining optometric parameters | |
CN117295447A (zh) | 与智能手机配合使用的眼部检查设备 | |
CN113080844A (zh) | 优选视网膜区的视觉检测与视觉训练设备 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE |
|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE |
|
17P | Request for examination filed |
Effective date: 20230421 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAV | Request for validation of the european patent (deleted) | ||
DAX | Request for extension of the european patent (deleted) | ||
RIC1 | Information provided on ipc code assigned before grant |
Ipc: A61B 3/11 20060101ALN20240903BHEP Ipc: A61B 3/10 20060101ALI20240903BHEP Ipc: G02B 27/00 20060101ALI20240903BHEP Ipc: A61B 3/107 20060101ALI20240903BHEP Ipc: H04N 13/344 20180101ALI20240903BHEP Ipc: G16H 30/40 20180101ALI20240903BHEP Ipc: G16H 50/20 20180101ALI20240903BHEP Ipc: G02B 30/56 20200101ALI20240903BHEP Ipc: G02B 27/01 20060101ALI20240903BHEP Ipc: A61B 3/036 20060101ALI20240903BHEP Ipc: A61B 3/032 20060101AFI20240903BHEP |