GB2621148A - Visual acuity testing system and method - Google Patents


Publication number
GB2621148A
Authority
GB
United Kingdom
Prior art keywords
optotype
display screen
user
virtual
gui
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
GB2211279.1A
Other versions
GB202211279D0 (en)
Inventor
Mils Ian
Cleary Frances
Barnes Stephen
Dowling Phelim
Hickey Paul
Nolan John
Ankamah Emanul
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
South East Technological Univ
Original Assignee
South East Technological Univ
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by South East Technological Univ filed Critical South East Technological Univ
Priority to GB2211279.1A priority Critical patent/GB2621148A/en
Publication of GB202211279D0 publication Critical patent/GB202211279D0/en
Publication of GB2621148A publication Critical patent/GB2621148A/en
Pending legal-status Critical Current

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/02 - Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B 3/022 - Subjective types, i.e. testing apparatus requiring the active assistance of the patient, for testing contrast sensitivity
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/0016 - Operational features thereof
    • A61B 3/0041 - Operational features thereof characterised by display arrangements
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/02 - Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B 3/028 - Subjective types, i.e. testing apparatus requiring the active assistance of the patient, for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B 3/032 - Devices for presenting test symbols or characters, e.g. test chart projectors

Abstract

Vision testing by rendering a graphical user interface (GUI) on a display screen of a Virtual Reality (VR) headset worn by a user to enable testing vision of a user in a virtual 3D space and perform a contrast sensitivity test. By using a VR headset and a corresponding vision testing application, a process is developed to calculate the position/kerning, sizing and placement in 3D space of the optotypes, without the need for a large-scale environment such as a 4 meter testing room, and the luminosity of the display screen can be calculated without the need for an external/additional hardware luminosity sensor. Specific sizing ratios of each optotype character are calculated by taking account of micromovements of the user which offset the optotype characters. An alternative contrast test may be performed by applying reverse polarity in respect of each eye.

Description

VISUAL ACUITY TESTING SYSTEM AND METHOD
Field
The present disclosure is directed towards a visual acuity testing system, and more particularly to a visual acuity testing system that uses Virtual Reality (VR).
Background
Visual acuity is the ability to see objects clearly. It is the technical term for the result of an eye test that determines whether one needs glasses or contact lenses to see clearly. Visual acuity varies from person to person depending on age, eye shape and general eye health.
A contrast sensitivity test measures one's ability to distinguish between finer and finer increments of light versus dark (contrast). This differs from common visual acuity testing in a routine eye exam, which measures one's ability to recognize smaller and smaller letters on a standard eye chart. Contrast is a physical variable defined as the specific difference between the luminous intensities of two adjacent points. Contrast vision is the ability of the eye to distinguish and interpret areas of different luminosity as separate points. Contrast sensitivity is the smallest difference in object luminosity that can be differentiated by the eye. Luminosity describes the amount of light reflected from a surface and perceived by the eye. Contrast sensitivity is a very important measure of visual function, especially in situations of low light, fog or glare, when the contrast between objects and their background is often reduced.
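These definitions of contrast can be stated precisely. The following sketch is illustrative only (it is not part of the claimed method) and shows the two standard formulations: Weber contrast, used for a single target against a uniform background, and Michelson contrast, used for periodic patterns.

```python
def weber_contrast(target_luminance: float, background_luminance: float) -> float:
    """Weber contrast: (L_target - L_background) / L_background."""
    return (target_luminance - background_luminance) / background_luminance


def michelson_contrast(l_max: float, l_min: float) -> float:
    """Michelson contrast: (L_max - L_min) / (L_max + L_min), in [0, 1]."""
    return (l_max - l_min) / (l_max + l_min)
```

For example, a faint grey letter of luminance 95 on a white background of luminance 100 has a Weber contrast of -0.05, i.e. 5% contrast.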
Current contrast sensitivity tests rely on a luminosity meter to measure light in the room in order to calculate contrast sensitivity. Also, current visual acuity tests require a large testing area, i.e. a minimum of 4 meters or 20 feet, in order to conduct a standardised test.
Hence, in view of the above, there is a need for a visual acuity testing system that can perform the standard vision testing, as well as contrast sensitivity tests without requiring a luminosity meter, and also without requiring a large space.
Summary
According to the invention there is provided, as set out in the appended claims, a vision testing method comprising rendering a graphical user interface (GUI) on a display screen of a Virtual Reality (VR) headset worn by a user, wherein the GUI enables testing vision of a user in a virtual 3D space; and performing contrast sensitivity test of the user through the GUI.
In one embodiment the method comprises the step of rendering a plurality of optotypes in the virtual 3D space of the GUI, wherein the rendering includes estimating a position, a sizing and a placement of each optotype character of each optotype in the virtual 3D space.
In one embodiment the estimating includes calculating the size of each optotype character to be displayed into the virtual 3D space, from corresponding real world optotype sizes, based on corresponding chart specifications and the resolution of the display screen.
In one embodiment, the estimating includes calculating specific sizing ratios of each optotype character by taking into account micromovements of the user, as slight movements of the user's head offset the optotype characters.
In one embodiment, the sizing ratios are calculated every frame cycle, to account for micromovement of the headset through onboard IMU sensors and an optical camera for room tracking. In one embodiment, the method further comprises calculating kerning spacing between each optotype character, based on the computed size of a visual acuity (VA) line and applying the sizing ratio to the spacing between optotype characters.
In one embodiment, the method further includes dynamically calculating sizing and placement of each optotype character at runtime based on a size of respective display screen.
In one embodiment, the method further includes dynamically calculating a contrast sensitivity value of each optotype character based on a luminance of the display screen.
In one embodiment, the method further includes calculating the luminance of the display screen based on hue, saturation, brightness and screen contrast of the display screen.
In one embodiment, the method further includes performing an alternative contrast test, by applying reverse polarity in respect of each eye.
According to another aspect of the invention, there is provided a VR headset that includes a display screen; a memory to store one or more instructions; and a processor in communication with the memory, and configured to execute the one or more instructions to: render a graphical user interface (GUI) on the display screen, wherein the GUI enables testing vision of a user in a virtual 3D space; and perform contrast sensitivity test of the user through the GUI.
According to yet another aspect of the invention, there is provided a non-transitory computer readable medium having stored thereon computer-executable instructions which, when executed by a processor, cause the processor to render a graphical user interface (GUI) on a display screen of a Virtual Reality (VR) headset worn by a user, wherein the GUI enables testing vision of a user in a virtual 3D space; and perform a contrast sensitivity test of the user through the GUI. By using a VR headset and a corresponding vision testing application, a process is developed to calculate the position/kerning, sizing and placement in 3D space of the optotypes, without the need for a large-scale environment such as a 4 meter testing room. Also, through the use of the VR headset and the corresponding vision testing application, the luminosity of the display screen can be calculated without the need for an external/additional hardware luminosity sensor.
Various embodiments of the present invention seek to disrupt the vision testing market by introducing a small-footprint, highly accurate visual acuity and contrast sensitivity testing system. This allows the process of vision testing to be drastically reduced in the scale and space required. The system has been specifically designed, through a refined testing process, to reduce testing times for both patient and clinician and to increase patient throughput. The process outlined for testing patients, coupled with the use of the VR software, would be of major interest to the major stakeholders of vision testing, as it reduces overall testing times and increases patient throughput.
Brief Description of the Drawings
The invention will be more clearly understood from the following description of an embodiment thereof, given by way of example only, with reference to the accompanying drawings, in which:
FIG.1A illustrates a vision testing system implemented through a Virtual Reality (VR) headset worn by a user, in accordance with an embodiment of the present invention;
FIG.1B is a block diagram illustrating components of the VR headset, in accordance with an embodiment of the present invention; and
FIG.2 illustrates an exemplary optotype, known in the art.
Detailed Description of the Drawings
FIG.1A illustrates a Virtual Reality (VR) headset 102 worn by a user 104, in accordance with an embodiment of the present invention. FIG.1B is a block diagram illustrating vision testing through the VR headset 102, in accordance with an embodiment of the present invention.
The user 104 is a patient who may seek a standardized eye test, or a contrast sensitivity test for their eyes. In the context of the present invention, the user is required to wear the VR headset 102 for performing these tests. The VR headset 102 is a head mounted display that the user may wear in a manner similar to that of glasses.
The VR headset 102 includes a display screen 106, and a vision testing system 108 in communication with the display screen 106. The vision testing system 108 enables vision testing of the user 104 in a virtual 3D space, through the display screen 106. In an example, the display screen 106 is a Liquid Crystal Display (LCD) that provides VR experience to the user 104. The vision system 108 is implemented through a vision testing application 110 running on a hardware platform (not shown) of the VR headset 102. The vision testing application 110 is configured to operably render a Graphical User Interface (GUI) 112 on the display screen 106, to enable the vision testing of the user 104 through VR.
The hardware platform may include a processor (e.g., a single processor or multiple processors) or other hardware processing circuit. The vision testing application 110 may be embodied as machine-readable instructions stored on a computer-readable medium, which may be non-transitory, such as hardware storage devices (e.g., RAM (random access memory), ROM (read-only memory), EPROM (erasable, programmable ROM), EEPROM (electrically erasable, programmable ROM), hard drives, and flash memory). Although not shown, the vision testing application 110 may alternatively be an application hosted on a remote server.
Referring to FIG.1B, the vision testing application 110 includes a rendering module 202 that is configured to render a comprehensive list of known optotypes in the virtual 3D space of the GUI 112. An optotype is an eye chart that is used to measure visual acuity, or clarity of vision. An example of a well-known optotype is the Snellen chart, as illustrated in FIG.2. In an embodiment of the present invention, the rendering module 202 estimates the position/kerning, sizing and placement of each optotype character for rendering in the virtual 3D space on the display screen 106. This is performed for each visual frequency in a standardized vision chart (20/12.5 up to 20/400). The visual frequency referenced here is the spatial frequency.
The rendering module 202 includes a sizing function 206 for calculating the size of each optotype character to be displayed into the virtual 3D space, directly from corresponding real world optotype sizes, based on corresponding chart specifications and the resolution of the display screen 106 itself. The size of each optotype character may be calculated in pixels per unit.
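As an illustration of this conversion, the sketch below maps a real-world character size in millimetres to pixels. Note that `screen_width_mm` is a hypothetical parameter standing in for the panel's physical width; an actual implementation would obtain the resolution and panel dimensions from the headset specification or engine API rather than hard-code them.

```python
def optotype_size_pixels(size_mm: float, screen_width_px: int,
                         screen_width_mm: float) -> float:
    """Convert a real-world optotype size (mm) into display pixels,
    given the panel's horizontal resolution and physical width."""
    pixels_per_mm = screen_width_px / screen_width_mm
    return size_mm * pixels_per_mm
```

For a hypothetical 1832-pixel-wide panel that is 91.6 mm across (20 px/mm), a 5.3 mm character would span about 106 px.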
The rendering module 202 further includes a scaling function 208 for calculating very specific sizing ratios of each optotype character based upon the focal distance of the lens and the unit size in the graphics engine (1m per unit). The scaling function 208 takes into account subtle user movements (in this case micromovements), as slight movement of the user's head (less than 1° of angular rotation, not counting lateral movement around 6DoF) can offset the optotype characters.
The scaling function 208 runs every frame cycle, where there are approximately 60 frame cycles per second, to account for movement of the headset 102 through its onboard IMU sensors (for slight head movement) and optical camera stabilization (for room tracking).
In an example, when a single optotype (H) is rendered at a distance of 4 meters from the observer 104, the calculated size for a letter at 20/20 vision may be 5.3mm. The resolution of an existing VR headset such as the Oculus (now Meta) Quest 2 is 3664x1920 (or 1832x1920 per eye), which equates to a total pixel count of 7,034,880 (or 3,517,440 per eye). This provides a usable space of 3,517,440 pixels per eye, split across a 4 meter distance, to replicate a 5.3mm object. However, this value does not remain static, as the calculated ratios are split across pixels, and thus values may be adjusted depending on whether an object crosses into new pixel territory due to micromovement of the head.
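The geometry behind such size calculations can be sketched as follows, assuming the conventional rule that a 20/20 optotype subtends 5 arcminutes of visual angle; exact millimetre values (such as the 5.3 mm figure above) depend on the particular chart specification used.

```python
import math


def optotype_height_mm(distance_mm: float, arcminutes: float = 5.0) -> float:
    """Physical height that subtends `arcminutes` of visual angle at
    `distance_mm` from the eye (20/20 letters subtend 5 arcmin by convention)."""
    theta = math.radians(arcminutes / 60.0)
    return 2.0 * distance_mm * math.tan(theta / 2.0)
```

Doubling the simulated distance doubles the required letter height, so the per-frame scaling ratio follows directly from the virtual viewing distance.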
The rendering module 202 further includes a kerning function 208 for calculating the kerning spacing between each optotype character, taking the computed size of the visual acuity (VA) line and applying its ratio to the spacing between characters. In an example, the VA line is the letter line currently visible to the observer (20/25 in Snellen notation). The ratio mentioned here is the ratio function from the sizing calculations above.
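One plausible form of this layout step is sketched below, assuming ETDRS-style spacing of one letter width between characters; `spacing_ratio` is an illustrative parameter, not a value taken from this disclosure.

```python
def layout_line(n_chars: int, letter_px: float,
                spacing_ratio: float = 1.0) -> list[float]:
    """Left-edge x offsets (pixels, centred on 0) for one line of optotypes,
    with inter-character gaps scaled from the VA line's computed letter size."""
    gap = letter_px * spacing_ratio
    total = n_chars * letter_px + (n_chars - 1) * gap
    return [i * (letter_px + gap) - total / 2.0 for i in range(n_chars)]
```

Because the gap is derived from the letter size, re-running the sizing ratio each frame automatically re-spaces the line as well.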
The rendering module 202 further includes a placement function 210 for placing each optotype character in the virtual 3D space on the display screen 106. The rendering module 202 further includes a Unity API (not shown) to read in the screen size of the display screen 106, in order to dynamically run the sizing calculations of optotype characters based on the screen size. In an embodiment of the present invention, the rendering module 202 is configured to run on each frame cycle (60 cycles a second) on each display screen. Thus, the rendering module 202 is a generalized function that programmatically and dynamically adapts to various types of VR screens by rendering an optotype based on the size of the VR screen. Different VR headsets have different pixel counts, and therefore the sizing and placement of the optotype is dynamically calculated to ensure correct rendering on different types of headsets.
The key inventive step of the present invention is dynamically performing sizing and placement calculations of an optotype at runtime instead of manually calibrating the display screen.
Thus, the display screen 106 transfers the user experience of vision testing from a physical space to a virtual 3D space, alleviating the need for a large space and specialised calibrated equipment.
The vision testing application 110 further includes a contrast calculation module 204 that is configured to calculate a precise contrast for the display screen 106 based on the luminance of the display screen 106. The vision testing application 110 dynamically calculates the contrast sensitivity values for each optotype character to be displayed into the VR experience, based on the luminosity of the display screen 106.
Each optotype character is developed based on a standard Snellen SVG (Scalable Vector Graphics) letter image. Using a vector image allows the rendering module 202 to size/scale an optotype character without blurring, and also eliminates the bi/tri-linear filtering steps usually present in graphics engines. Each optotype in this application is converted into a sprite game object with its own material and shader code. This shader code is written in the HLSL shader language (containing CG programs). These shaders have direct write access to the graphics pipeline underneath the Unity game engine (RGBA values). From this, the contrast calculation module 204 is configured to pull the hue, saturation, brightness and screen contrast of the display screen 106 to calculate a luminosity value to be applied to each character for measuring contrast sensitivity.
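The exact luminance derivation used by the shader is not spelled out here; a common approach, assumed purely for illustration, linearises the gamma-encoded sRGB channel values and weights them with the Rec. 709 coefficients:

```python
def srgb_to_linear(c: float) -> float:
    """Undo sRGB gamma encoding for one channel value in [0, 1]."""
    return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4


def relative_luminance(r: float, g: float, b: float) -> float:
    """Relative luminance of a gamma-encoded sRGB colour (Rec. 709 weights)."""
    rl, gl, bl = (srgb_to_linear(c) for c in (r, g, b))
    return 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
```

White yields 1.0 and black 0.0; the value computed for each optotype character can then be compared against the background to set its contrast level.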
The luminosity value is applied to the optotypes to set the contrast of each optotype character anywhere in the range from 1.2% to 100%. The ability to derive luminosity, and thus set appropriate contrast levels, is of major significance to the VR world. Since contrast is a major component of one's vision, this aspect is integral to the VR industry as it evolves.
It will be appreciated that since contrast sensitivity varies for each individual, the quality of experience, or the interpretation of an experience, may be vastly different. In modern VR experiences with accessibility features, one can sometimes adjust text size and even gamma/brightness. However, this is a manual process, and the individual does not know their contrast sensitivity score. The shader, once used during a test, may provide an individual's contrast score. If their contrast vision is poor, then visual effects such as night, fog, the size and sharpness of text, and video colour may all be affected, thus limiting or discriminating against an individual.
Thus, without the need for an external luminosity meter, an accurate luminosity value of the display screen 106 is computed based on the RGBA channels of the display screen 106, with their accompanying hue and saturation colour profiles.
This is a key function for testing contrast sensitivity. It will be appreciated that standard approaches require the use of an external luminosity meter, as opposed to the approach of the present invention, which detects it from the Unity engine API as the screen brightness within the closed environment of the headset. The system is unique in that it uses a brightness value instead of a real-world luminosity input. This approach to the calculation of luminosity allows VR applications to be adapted to the specific user and their current vision levels without the need to be tested in a traditional manner, i.e. standard 2D testing with a luminosity meter.
In an embodiment of the present invention, the vision testing application 110 is configured to perform an alternative contrast test, in that the shader calculates reverse polarity for each eye for both the dashboard and VR client additional test types. Reverse polarity in this context means that the shader can be applied to the background as well, reversing black text on a white background to white text on a black background.
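In shader terms, the polarity reversal described above amounts to inverting the colour channels while leaving alpha untouched. The actual implementation is HLSL shader code; this Python equivalent is a minimal illustrative sketch:

```python
def reverse_polarity(rgba: tuple[float, float, float, float]) -> tuple:
    """Invert colour channels (0-1 range) while preserving alpha, turning
    black optotypes on a white background into white-on-black."""
    r, g, b, a = rgba
    return (1.0 - r, 1.0 - g, 1.0 - b, a)
```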
This is because we can operate systems in parallel in a smaller environment through a centralized hub (tablet interface).
In the specification the terms "comprise, comprises, comprised and comprising" or any variation thereof and the terms "include, includes, included and including" or any variation thereof are considered to be totally interchangeable and they should all be afforded the widest possible interpretation and vice versa.
The invention is not limited to the embodiments hereinbefore described but may be varied in both construction and detail.

Claims (14)

  1. A vision testing method comprising: rendering a graphical user interface (GUI) on a display screen of a Virtual Reality (VR) headset worn by a user, wherein the GUI enables testing vision of a user in a virtual 3D space; and performing a contrast sensitivity test of the user through the GUI.
  2. The method as claimed in claim 1 further comprising rendering a plurality of optotypes in the virtual 3D space of the GUI, wherein the rendering includes estimating a position, a sizing and a placement of each optotype character of each optotype in the virtual 3D space.
  3. The method as claimed in claim 2, wherein the estimating includes calculating the size of each optotype character to be displayed into the virtual 3D space, from corresponding real world optotype sizes, based on corresponding chart specifications and the resolution of the display screen.
  4. The method as claimed in claim 3, wherein the estimating includes calculating specific sizing ratios of each optotype character by taking into account micromovements of the user, as slight movements of the user's head offset the optotype characters.
  5. The method as claimed in claim 4, wherein the sizing ratios are calculated every frame cycle, to account for micromovement of the headset through onboard IMU sensors and an optical camera for room tracking.
  6. The method as claimed in claim 4 further comprising calculating kerning spacing between each optotype character, based on the computed size of a visual acuity (VA) line and applying the sizing ratio to the spacing between optotype characters.
  7. The method as claimed in claim 4 further comprising dynamically calculating sizing and placement of each optotype character at runtime based on a size of the respective display screen.
  8. The method as claimed in claim 1 further comprising dynamically calculating a contrast sensitivity value of each optotype character based on a luminance of the display screen.
  9. The method as claimed in claim 7 further comprising calculating the luminance of the display screen based on hue, saturation, brightness and screen contrast of the display screen.
  10. The method as claimed in claim 1 further comprising performing an alternative contrast test, by applying reverse polarity in respect of each eye.
  11. A VR headset comprising: a display screen; a memory to store one or more instructions; and a processor in communication with the memory, and configured to execute the one or more instructions to: render a graphical user interface (GUI) on the display screen, wherein the GUI enables testing vision of a user in a virtual 3D space; and perform a contrast sensitivity test of the user through the GUI.
  12. The VR headset as claimed in claim 11, wherein the processor is further configured to render a plurality of optotypes in the virtual 3D space of the GUI, wherein the rendering includes estimating a position, a sizing and a placement of each optotype character of each optotype in the virtual 3D space.
  13. The VR headset as claimed in claim 12, wherein the estimating includes calculating the size of each optotype character to be displayed into the virtual 3D space, from corresponding real world optotype sizes, based on corresponding chart specifications and the resolution of the display screen.
  14. A non-transitory computer readable medium having stored thereon computer-executable instructions which, when executed by a processor, cause the processor to: render a graphical user interface (GUI) on a display screen of a Virtual Reality (VR) headset worn by a user, wherein the GUI enables testing vision of a user in a virtual 3D space; and perform a contrast sensitivity test of the user through the GUI.
GB2211279.1A 2022-08-03 2022-08-03 Visual acuity testing system and method Pending GB2621148A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
GB2211279.1A GB2621148A (en) 2022-08-03 2022-08-03 Visual acuity testing system and method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
GB2211279.1A GB2621148A (en) 2022-08-03 2022-08-03 Visual acuity testing system and method

Publications (2)

Publication Number Publication Date
GB202211279D0 GB202211279D0 (en) 2022-09-14
GB2621148A true GB2621148A (en) 2024-02-07

Family

ID=84540627

Family Applications (1)

Application Number Title Priority Date Filing Date
GB2211279.1A Pending GB2621148A (en) 2022-08-03 2022-08-03 Visual acuity testing system and method

Country Status (1)

Country Link
GB (1) GB2621148A (en)

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20180136486A1 (en) * 2016-07-25 2018-05-17 Magic Leap, Inc. Light field processor system
WO2021155136A1 (en) * 2020-01-31 2021-08-05 Olleyes, Inc. A system and method for providing visual tests
US20220160223A1 (en) * 2020-11-25 2022-05-26 Irisvision, Inc. Methods and Systems for Evaluating Vision Acuity and/or Conducting Visual Field Tests in a Head-Mounted Vision Device
WO2022115860A1 (en) * 2020-11-25 2022-06-02 Irisvision, Inc. Methods and systems for evaluating vision acuity and/or conducting visual field tests in a head-mounted vision device


Also Published As

Publication number Publication date
GB202211279D0 (en) 2022-09-14
