WO2022093521A1 - Systems and methods for visual field testing in head-mounted displays - Google Patents

Systems and methods for visual field testing in head-mounted displays

Info

Publication number
WO2022093521A1
Authority
WO
WIPO (PCT)
Prior art keywords
head
user
location
visual field
mounted display
Prior art date
Application number
PCT/US2021/054228
Other languages
French (fr)
Inventor
Mohamed Abou Shousha
Rashed Kashem
Original Assignee
University Of Miami
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from US17/082,983 external-priority patent/US10993612B1/en
Priority claimed from US17/392,723 external-priority patent/US20220125298A1/en
Priority claimed from US17/392,664 external-priority patent/US20220125297A1/en
Application filed by University Of Miami filed Critical University Of Miami
Priority to JP2023526382A priority Critical patent/JP2023550699A/en
Priority to EP21887174.7A priority patent/EP4236755A1/en
Publication of WO2022093521A1 publication Critical patent/WO2022093521A1/en

Classifications

    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 - Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/02 - Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B 3/028 - Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B 3/032 - Devices for presenting test symbols or characters, e.g. test chart projectors
    • A61B 3/10 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 - Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • A61B 5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1113 - Local tracking of patients, e.g. in a hospital or private home
    • A61B 5/1114 - Tracking parts of the body
    • A61B 5/1121 - Determining geometric values, e.g. centre of rotation or angular range of movement
    • A61B 5/68 - Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6801 - Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient specially adapted to be attached to or worn on the body surface
    • A61B 5/6802 - Sensor mounted on worn items
    • A61B 5/6803 - Head-worn items, e.g. helmets, masks, headphones or goggles
    • A61B 5/72 - Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7271 - Specific aspects of physiological measurement analysis
    • A61B 5/7282 - Event detection, e.g. detecting unique waveforms indicative of a medical condition
    • A61B 5/74 - Details of notification to user or communication with user or patient; user input means
    • A61B 5/742 - Details of notification to user or communication with user or patient; user input means using visual displays
    • A61B 2505/00 - Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B 2505/09 - Rehabilitation or training
    • A61B 2560/00 - Constructional details of operational features of apparatus; Accessories for medical measuring apparatus
    • A61B 2560/02 - Operational features
    • A61B 2560/0223 - Operational features of calibration, e.g. protocols for calibrating sensors
    • A61B 2562/00 - Details of sensors; Constructional details of sensor housings or probes; Accessories for sensors
    • A61B 2562/02 - Details of sensors specially adapted for in-vivo measurements
    • A61B 2562/0219 - Inertial sensors, e.g. accelerometers, gyroscopes, tilt switches
    • A61B 2562/0233 - Special features of optical sensors or probes classified in A61B 5/00
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06F - ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 - Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/14 - Digital output to display device; Cooperation and interconnection of the display device with other functional units
    • G06F 3/147 - Digital output to display device; Cooperation and interconnection of the display device with other functional units using display panels
    • G09 - EDUCATION; CRYPTOGRAPHY; DISPLAY; ADVERTISING; SEALS
    • G09G - ARRANGEMENTS OR CIRCUITS FOR CONTROL OF INDICATING DEVICES USING STATIC MEANS TO PRESENT VARIABLE INFORMATION
    • G09G 2320/00 - Control of display operating conditions
    • G09G 2320/02 - Improving the quality of display appearance
    • G09G 2320/0261 - Improving the quality of display appearance in the context of movement of objects on the screen or movement of the observer relative to the screen
    • G09G 2320/06 - Adjustment of display parameters
    • G09G 2320/0693 - Calibration of display systems
    • G09G 2380/00 - Specific applications
    • G09G 2380/08 - Biomedical applications

Definitions

  • Visual defects can be diagnosed with conventional testing machines, such as a Humphrey visual field analyzer.
  • a patient is placed at the center of a curved portion of the analyzer and tests are performed by displaying images on the curved portion to determine where the blind spots are located in the patient’s visual field.
  • Humphrey visual field analyzers, as well as other testing machinery, are both expensive for wide distribution and require specialized personnel to operate.
  • heads-up display devices and/or head-mounted display devices may be adapted for visual field testing.
  • using these devices for visual field testing lowers the costs of performing visual field testing and improves its accessibility to a wider patient base.
  • however, the adaptation of visual field testing to these displays is not without its technical hurdles.
  • Cyclotorsion is the rotation of one eye around its visual axis. This rotation of the eye is what allows the visual field of a user to remain “right-side-up” even when the user tilts his or her head to one side or the other.
  • because heads-up displays are fixed to the head of a user, cyclotorsion does not occur in the head-mounted display environment. That is, if a user tilts his or her head to one side or the other, the visual field of the user tilts accordingly.
  • the effects of cyclotorsion present a threshold technical problem to overcome when adapting visual field testing to head-mounted display devices.
  • one solution to overcoming the technical problem caused by the differing effects of cyclotorsion in the head-mounted display environment is to prevent a user from tilting his or her head.
  • conventional optometry tools for preventing a user from tilting his or her head, such as chin rests or other structures built into optometry equipment, are ill-suited for a head-mounted display environment.
  • a requirement for a specialized structure or modifications to head-mounted display devices negatively impacts the accessibility of the devices as well as their ease of use.
  • specialized structures such as chin rests do not prevent any tilting effects caused by the head-mounted display devices being improperly worn and/or worn in a manner that introduces a slight tilt.
  • the systems and methods disclosed herein may use specialized software and/or hardware elements implemented in the head-mounted display devices to detect a tilting head of a user.
  • the head-mounted display device may include specialized sensors and/or software used to interpret sensor data for the head-mounted display device.
  • the systems and methods may further generate alerts to a user based on detected head tilting and/or recommendations for corrections of any head tilting. These alerts and recommendations may further be presented on the head-mounted display to minimize the impact of head tilts during visual field testing.
  • a method can include retrieving a visual field testing pattern for a head-mounted display, wherein the visual field testing pattern comprises icons that are displayed at respective locations in a visual field of the head-mounted display.
  • the method can also include generating for display the visual field testing pattern on the head-mounted display; retrieving data from a tilt sensor, located at the head-mounted display, for detecting degrees of head tilt of a user wearing the head-mounted display; determining, based on the data retrieved from the tilt sensor, a degree of head tilt of the user; comparing the degree of head tilt of the user to a first threshold degree; and in response to the degree of head tilt of the user meeting or exceeding the first threshold degree, generating for display, on the head-mounted display, a recommendation to the user.
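  • A minimal sketch of the recited steps is shown below; it is illustrative only, and the `tilt_sensor`, `display`, and threshold names are hypothetical stand-ins for whatever sensor and rendering interfaces a particular head-mounted display exposes.

```python
# Hedged sketch of the tilt-check flow described above; not the patent's implementation.
FIRST_THRESHOLD_DEG = 5.0    # illustrative first threshold degree

def run_tilt_check(tilt_sensor, display, pattern):
    """Display the visual field testing pattern, then warn the user if head tilt is excessive."""
    display.render_pattern(pattern)              # generate the testing pattern for display
    tilt_deg = tilt_sensor.read_degrees()        # data retrieved from the tilt sensor
    if abs(tilt_deg) >= FIRST_THRESHOLD_DEG:     # compare to the first threshold degree
        display.show_recommendation(
            f"Head tilt of {tilt_deg:.1f} degrees detected; "
            "please re-seat the head-mounted display."
        )
    return tilt_deg
```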
  • calibrating a head-mounted display requires compensating for unknown sources of error that may affect assessment of the calibration.
  • eye tracking data received from a device that needs sensor calibration may indicate that the user’s eyes are focused at a wrong location, and that data may then be interpreted as a visual defect rather than a calibration issue.
  • as the visual test progresses, the user may shift, causing the sensors connected to the head-mounted display to lose alignment; thus, those sensors may need calibration to accurately perform eye tracking and detection of the user’s gaze.
  • otherwise, the results of the visual test may be inaccurate because eye tracking has become inaccurate, which may result in the system incorrectly determining that the user has one or more visual defects.
  • some systems and methods disclosed herein facilitate calibration of a head-mounted display while a visual test is performed.
  • One mechanism that facilitates calibration involves detecting, as a visual test is being performed, that the user’s eyes moved in the direction of a displayed stimulus but have not stopped at the point where the stimulus is displayed, and instead stopped at a different point (e.g., a threshold distance away from the stimulus). Based on that detection, the system may determine that the user has seen the stimulus and that calibration of the sensors is needed. The system may then record that the user has seen the stimulus and perform sensor calibration before displaying the next stimulus.
  • the system may cause, during a visual field test, a first stimulus to be presented on a user device at a first visual field location.
  • the visual test may be a test to determine whether the user has any visual defects.
  • the user device may include a head-mounted display that displays the first stimulus (e.g., a visual indicator at some location on the display).
  • the system may start displaying stimuli to the user to assess whether the user has any visual defects.
  • the system may obtain, during the visual field test, first feedback data related to the first stimulus, the first feedback data indicating that the user has seen the first stimulus.
  • the user device may include one or more eye tracking sensors that are enabled to transmit eye tracking data to the user device for processing.
  • the eye tracking sensors may perform gaze detection and send that information to be processed by the user device.
  • the feedback data may include, for example, coordinates on the head-mounted display at which the user’s gaze was detected.
  • the system may then detect, based on the first feedback data, an eye of the user failing to fixate on the first visual field location corresponding to the first stimulus.
  • the first feedback data may indicate that the user’s eyes have moved towards the first stimulus but stopped short of gazing at the first stimulus or moved past the stimulus. This data may indicate that the user saw the stimulus (e.g., based on the user moving his eyes towards the stimulus) and that the eye tracking sensors need calibration (e.g., based on the gaze not being aligned with the visual field location of the stimulus itself).
  • the system may perform a calibration of the user device. For example, the system may adjust gaze location calculation as determined from the raw eye tracking data.
  • the system may store a first indication that the user has seen the first stimulus. This process may continue with every stimulus displayed on the head-mounted display. As another (second) stimulus is displayed at a second field test location, the system may determine that the user’s gaze location aligns with the second field test location, thus no calibration is required at that iteration of the test. When the visual test is finished, the system may generate a visual defect assessment based on the results of the test (e.g., based on the number and locations of stimuli that the user was able to see).
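  • The sketch below illustrates one way such a test loop could be organized. It is a simplified assumption-laden example: `display`, `eye_tracker`, the tolerance values, and the use of plain distance thresholds (rather than a full direction-of-movement check) are all illustrative choices, not the patent's method.

```python
import math

def distance(p, q):
    return math.hypot(p[0] - q[0], p[1] - q[1])

def run_visual_field_test(stimuli, display, eye_tracker,
                          fixation_tol=0.01, seen_tol=0.20):
    """Display each stimulus, record whether it was seen, and recalibrate as needed."""
    results = {}
    for loc in stimuli:                                   # e.g., (x, y) visual field locations
        display.show_stimulus(loc)
        gaze = eye_tracker.gaze_location()                # feedback data for this stimulus
        if distance(gaze, loc) <= fixation_tol:
            results[loc] = "seen"                         # gaze aligned; no calibration needed
        elif distance(gaze, loc) <= seen_tol:
            results[loc] = "seen"                         # moved toward stimulus but stopped short
            eye_tracker.calibrate(expected=loc, observed=gaze)  # recalibrate before next stimulus
        else:
            results[loc] = "not seen"
    return results                                        # basis for the visual defect assessment
```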
  • calibrating a head-mounted display requires compensating for unknown sources of error that may affect assessment of the calibration. For example, eye tracking data received during calibration would be affected if a head-mounted display were not being worn properly. This error (e.g., when the head-mounted display is used for an eye examination) could then be interpreted as a visual defect rather than being recognized as having been caused by improper wearing during calibration.
  • the instant application discloses systems and methods that facilitate calibration of a head-mounted display.
  • the system may generate calibration patterns for the user to view while the system tracks the eye movement of the user to determine what they are seeing.
  • the analysis of the eye tracking data can generate a calibration for the head-mounted display as well as a “calibration score” representing the accuracy of the calibration.
  • One method for calibrating a head-mounted display includes receiving edge eye tracking data during edge calibration periods; calculating a projective transform matrix based on the edge eye tracking data; receiving center eye tracking data during a center calibration period; applying the projective transform matrix to the center eye tracking data to determine a gaze location; and generating a calibration score based on a difference between a center location and the gaze location.
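  • A numerical sketch of that flow is shown below. It assumes four edge calibration targets at the display corners and uses a standard direct-linear-transform (DLT) homography estimate; the gaze coordinates, screen resolution, and score normalization are illustrative assumptions rather than values from the disclosure.

```python
import numpy as np

def projective_transform(src_pts, dst_pts):
    """Estimate the 3x3 projective transform (homography) mapping src_pts to dst_pts."""
    rows = []
    for (x, y), (u, v) in zip(src_pts, dst_pts):
        rows.append([x, y, 1, 0, 0, 0, -u * x, -u * y, -u])
        rows.append([0, 0, 0, x, y, 1, -v * x, -v * y, -v])
    _, _, vt = np.linalg.svd(np.asarray(rows, dtype=float))
    h = vt[-1].reshape(3, 3)
    return h / h[2, 2]

def apply_transform(h, pt):
    x, y, w = h @ np.array([pt[0], pt[1], 1.0])
    return np.array([x / w, y / w])

# Edge eye tracking data (mean gaze per edge target) and the true corner locations.
edge_gaze = [(-0.42, 0.38), (0.40, 0.41), (0.43, -0.37), (-0.39, -0.40)]   # illustrative
edge_targets = [(0, 0), (1920, 0), (1920, 1080), (0, 1080)]
H = projective_transform(edge_gaze, edge_targets)

# Center calibration period: apply the matrix to the center eye tracking data.
center_gaze = (0.01, 0.02)                          # illustrative center-period gaze
center_target = np.array([960.0, 540.0])
gaze_location = apply_transform(H, center_gaze)

# Calibration score: larger distance from the center target means a lower score.
error = np.linalg.norm(gaze_location - center_target)
calibration_score = max(0.0, 1.0 - error / 540.0)   # one possible normalization
print(f"gaze {gaze_location}, score {calibration_score:.3f}")
```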
  • Figure 1A illustrates an example head-mounted display forming a wearable device for a subject in accordance with certain aspects of the present disclosure
  • Figure 1B illustrates a front view of the head-mounted display in accordance with certain aspects of the present disclosure
  • Figure 1C is an image of an example constructed head-mounted display in accordance with certain aspects of the present disclosure
  • Figures 1D-1E illustrate another example embodiment of a head-mounted display, in accordance with certain aspects of the present disclosure
  • Figure 2 is a diagram illustrating correction of a visual field testing pattern by detecting and correcting for head tilt in accordance with certain aspects of the present disclosure
  • Figure 3 is a diagram illustrating an exemplary method of accurately replicating a visual field testing pattern from a curved surface on a flat surface in accordance with certain aspects of the present disclosure
  • Figure 4 is an illustrative system diagram for visual field testing using a head-mounted display in accordance with certain aspects of the present disclosure
  • Figure 5 is a process flow diagram for correction of a visual field testing pattern by detecting and correcting for head tilt in accordance with certain aspects of the present disclosure
  • Figure 6 is a process flow diagram for accurately replicating a visual field testing pattern from a curved surface on a flat surface in accordance with certain aspects of the present disclosure
  • Figure 7 illustrates a system for calibrating one or more sensors or other device components, in accordance with certain aspects of the present disclosure
  • Figure 8 illustrates a data structure for determining whether calibration is needed, in accordance with certain aspects of the present disclosure, and
  • Figure 9 is a process flow diagram for calibrating a head-mounted display or other user device, in accordance with certain aspects of the present disclosure.
  • Figure 10 is a diagram illustrating an exemplary relationship between a virtual plane and a display plane as used to calibrate a head-mounted display, in accordance with certain aspects of the present disclosure
  • Figure 11 is a diagram illustrating an exemplary central point and boundary as used to generate a calibration score, in accordance with certain aspects of the present disclosure
  • Figure 12 is a process flow diagram for calibrating a head-mounted display, in accordance with certain aspects of the present disclosure.
  • Figure 13 is illustrative pseudocode for calibrating a head-mounted display in accordance with certain aspects of the present disclosure.
  • the instant application describes systems and methods that facilitate performing visual field testing, particularly utilizing worn goggles that provide testing patterns.
  • One problem confronting optical care practitioners is the effect of a patient tilting their head during eye examinations. If the head is tilted, this causes cyclotorsion, which is the rotation of one eye around its visual axis. Uncorrected, this can introduce error in an eye examination and misdiagnosis of optical issues.
  • a conventional diagnostic device used for testing is a Humphrey visual field analyzer (the “Humphrey analyzer”).
  • Use of the Humphrey analyzer includes a patient placing their head at the center of a semispherical region with testing patterns projected at varying locations of the semispherical region.
  • a heads up display device or a head-mounted display device.
  • a head-mounted display is a display device, worn on the head or as part of a helmet, that may have a small display optic in front of one (monocular HMD) or each eye (binocular HMD).
  • One technical problem is the occurrence of cyclotorsion in patients being tested using such goggles because while the goggles naturally provide compensation for head tilt, this only works if the goggles are worn properly (i.e., not tilted on the user’s head).
  • the instant application describes systems and methods for detection and correction of goggle tilt relative to the user’s head.
  • Another technical problem is the display of accurate testing patterns using the goggles, which have a flat viewing surface, as compared to a Humphrey analyzer, which has a curved viewing surface.
  • methods are disclosed for generation of testing patterns in goggles that are equivalent to those generated in a Humphrey analyzer.
  • FIG. 1A illustrates an example head-mounted display 100 (e.g., goggles) forming a wearable device for a subject.
  • the head-mounted display 100 may be a part of a visioning system as described herein or in U.S. Patent Application No. 17/083,043, entitled “Vision Testing via Prediction-Based Setting of an Initial Stimuli Characteristic for a User Interface Location” and filed October 28, 2020, the contents of which are hereby incorporated by reference in its entirety.
  • the head-mounted display 100 includes a left eyepiece 102 and a right eyepiece 104. Each eyepiece 102 and 104 may contain and/or associate with a digital monitor configured to display (or project) recreated images to a respective eye of the subject.
  • digital monitors may include a display screen, projectors, and/or hardware to generate the image display on the display screen. It will be appreciated that digital monitors comprising projectors may be positioned at other locations to project images onto an eye of the subject or onto an eyepiece comprising a screen, glass, or other surface onto which images may be projected.
  • the left eye piece 102 and right eyepiece 104 may be positioned with respect to the housing 106 to fit an orbital area on the subject such that each eyepiece 102, 104 is able to collect data and display/project image data, which in a further example includes displaying/projecting image data to a different eye.
  • each eyepiece 102, 104 may further include one or more inward directed sensors 108, 110, which may include infrared cameras, photodetectors, or other infrared sensors configured to track pupil movement and to determine and track visual axes of the subject.
  • the inward directed sensors 108, 110, e.g., comprising infrared cameras, may be located in lower portions relative to the eyepieces 102, 104 so as not to block the visual field of the subject, neither their real visual field nor a visual field displayed or projected to the subject.
  • the inward directed sensors 108, 110 may be directionally aligned to point toward a presumed pupil region for better pupil and/or line of sight tracking.
  • head-mounted display 100 can include tilt sensor(s) 128 that can provide data on the degree of head tilt to a connected computing system.
  • the tilt sensors can be gyroscopes, water-based, etc.
  • FIG. 1B illustrates a front view of the head-mounted display 100, showing the front view of the eyepieces 102, 104, where respective outward directed image sensors 112, 114 comprising field of vision cameras are positioned. In other embodiments, fewer or additional outward directed image sensors 112, 114 may be provided. The outward directed image sensors 112, 114 may be configured to capture continuous images.
  • FIG. 1C is an image of an example constructed head-mounted display 100 comprising eyepieces 102, 104 including two digital monitors, with focusing lens 116, 118.
  • only one inward directed optical sensor 110 is included for pupil and line of sight tracking, however, in other examples, multiple inward directed optical sensors 110 may be provided.
  • an alternative embodiment of head-mounted display 170 can include, in any combination, a high-resolution camera (or cameras) 102, a power unit 193, a processing unit 194, a glass screen 195, a see-through display 196 (e.g., a transparent display), an eye tracking system 197, tilt sensor(s) 198 (similar to tilt sensors 122), and other components.
  • external sensors may be used to provide further data for assessing visual field of the subject.
  • data used to correct the captured image may be obtained from external testing devices, such as visual field testing devices, aberrometers, electro-oculograms, or visual evoked potential devices. Data obtained from those devices may be combined with pupil or line of sight tracking for visual axis determinations to create one or more modification profiles used to modify the images being projected or displayed to a user (e.g., correction profiles, enhancement profiles, etc., used to correct or enhance such images).
  • when referring to the “head-mounted display,” even where reference is made to the first embodiment (100), it is understood that the disclosed methods and operations apply to either head-mounted display 100 or 170, unless specifically stated otherwise. It should be noted that, although some embodiments are described herein with respect to calibration of head-mounted displays, such techniques may be applied for calibration of one or more other user devices in other embodiments.
  • the head-mounted display 100 may be communicatively coupled with one or more imaging processors through wired or wireless communications, such as through a wireless transceiver embedded within the head-mounted display 100.
  • An external imaging processor may include a computer such as a laptop computer, tablet, mobile phone, network server, or other computer processing devices, centralized or distributed, and may be characterized by one or more processors and one or more memories.
  • the captured images are processed in this external image processing device; however, in other examples, the captured images may be processed by an imaging processor embedded within the digital spectacles.
  • the processed images (e.g., enhanced to improve functional visual field or other vision aspects and/or enhanced to correct for the visual field pathologies of the subject) are then transmitted to the head-mounted display 100 and displayed by the monitors for viewing by the subject.
  • the head-mounted display can be used to perform visual assessments to identify ocular pathologies, such as high and/or low order aberrations; pathologies of the optic nerve such as glaucoma, optic neuritis, and optic neuropathies; pathologies of the retina such as macular degeneration and retinitis pigmentosa; pathologies of the visual pathway such as microvascular strokes and tumors; and other conditions such as presbyopia, strabismus, monocular vision, anisometropia and aniseikonia, light sensitivity, anisocoria, refractive errors, and astigmatism.
  • real-time image processing of captured images may be executed by an imaging processor, e.g., using a custom-built MATLAB (MathWorks, Natick, MA) code, that runs on a miniature computer embedded in the head-mounted display.
  • the code may be run on an external image processing device or other computer wirelessly networked to communicate with the head-mounted display.
  • FIG. 2 is a diagram illustrating correction of a visual field testing pattern by detecting and correcting for head tilt.
  • head tilt refers to the angle between an axis of the head-mounted display and an axis of the user’s head. For example, such an angle may be zero degrees when the head-mounted display 100 is worn correctly on the user.
  • a system for improving accuracy of visual field testing in head-mounted displays can include, in addition to the head-mounted display, a tilt sensor for detecting degrees of head tilt of a user wearing the head-mounted display.
  • the tilt sensor can be located at the head-mounted display 100, though tilt sensors at other locations (e.g., external ones such as cameras that view the user and the head-mounted display 100) are contemplated.
  • the tilt sensor can be a water-based tilt sensor, similar to a level.
  • the tilt sensor can incorporate a gyro sensor or other types of rotation sensing hardware.
  • storage circuitry can be configured to store and/or retrieve a visual field testing pattern having stimuli (e.g., lights, patterns, icons, animations, etc.) that can be displayed at respective locations in the visual field of the head-mounted display.
  • control circuitry configured to generate for display the visual field testing pattern on the head-mounted display. Examples of a visual field testing pattern are shown in FIG. 2, with a fixation point 210 (typically near the center of the field of view) and a stimulus 220 that represents a displayed stimulus for determining the location of a blind spot.
  • the top panel in FIG. 2 shows an example location of a blind spot (coincident with stimulus 220), e.g., as determined by the user being unable to see a stimulus displayed at that location.
  • the middle panel illustrates the effect of head tilt.
  • the head tilt causes the stimulus 220 to be displayed at a different location in the user’s vision, outside of the blind spot 230.
  • the blind spot may not be identified by the user, possibly causing a misdiagnosis.
  • the system can determine, based on data retrieved from the tilt sensor, a degree of head tilt of the user.
  • the degree of head tilt can be determined, for example in the case of a water-based tilt sensor, from the orientation of the water surface, which indicates the degree of tilt.
  • One embodiment can include imaging a water surface with miniaturized cameras to capture the water surface relative to indicia that shows an un-tilted orientation. The angle between the water surface and the indicia would then be the degree of head tilt.
  • Another embodiment can include obtaining data from a plurality of water sensors (e.g., galvanic sensors) that are covered or exposed by water depending on the degree of tilt.
  • the particular sensors detecting water can then be used, such as via a lookup table, to determine the degree of head tilt.
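  • A toy version of such a lookup is sketched below. The sensor names, angle values, and table layout are hypothetical placeholders chosen only to show the idea of mapping which sensors are wet to a tilt angle.

```python
# Illustrative lookup of head tilt from which galvanic water sensors detect water.
TILT_LOOKUP = {
    frozenset(): 0.0,                                 # level: no edge sensors submerged
    frozenset({"left_low"}): -10.0,                   # water pooled toward the left
    frozenset({"left_low", "left_mid"}): -20.0,
    frozenset({"right_low"}): 10.0,                   # water pooled toward the right
    frozenset({"right_low", "right_mid"}): 20.0,
}

def head_tilt_degrees(wet_sensors):
    """Map the set of sensors currently detecting water to a tilt angle (None if unknown)."""
    return TILT_LOOKUP.get(frozenset(wet_sensors))

print(head_tilt_degrees({"right_low"}))               # 10.0
```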
  • the degree of head tilt can be determined from received data from a gyroscope.
  • the degree of head tilt of the user can be compared to a first threshold degree, such as 1, 2, 5, or 10 degrees, or any threshold as desired.
  • the comparison itself can include one or more processors receiving the calculated degree of head tilt and performing a numerical comparison to the first threshold degree.
  • the system can generate for display, on the head-mounted display, a recommendation to the user for reducing the head tilt.
  • Such a recommendation can include a visual indication (e.g., red or green lights, a textual indication, etc.) that the head-mounted display 100 needs to be adjusted to remove the head tilt.
  • the recommendation can include a display of the degree of head tilt in, for example, a graphical format (e.g., depicting an angle) or textual format (e.g., the numerical value of the angle).
  • the system can automatically perform some corrections, e.g., if the tilt is relatively small.
  • the control circuitry can be further configured to compare the degree of head tilt of the user to a second threshold degree (e.g., 0.1, 0.5, 1, 2 degrees, etc.) that is generally smaller than the first threshold degree.
  • Such a second threshold degree can be reflective of asymmetry in a user’s face that prevents perfect alignment, defects in the head-mounted display 100 construction, small incidental tilts occurring during measurements, etc.
  • the comparison of the degree of head tilt to the second threshold degree can be performed in a manner similar to that described for the first threshold degree.
  • the system can automatically adjust a respective location of the stimulus in the visual field of the head-mounted display by a first amount. For example, if a 0.1 degree tilt is detected, the system can automatically adjust the display location of the icon to compensate by changing the coordinates for display of the stimulus to reflect the detected tilt. In this way, the first amount can be based on a distance of the stimulus from a centerpoint 240 of the visual field of the head-mounted display and a direction of the head tilt of the user.
  • centerpoint 240 may correspond to a geometric center of the face of the head-mounted display 100 and/or a center of fixation of the user.
  • different head-mounted displays may have different centerpoints. Accordingly, the system may determine the centerpoint of a head-mounted display and select respective locations of displayed icons based on the offset distance. For example, the system may determine a centerpoint of the head-mounted display based on receiving data from one or more sensors. Additionally or alternatively, the system may receive settings based on an initial calibration (e.g., an automatic calibration or a manual calibration) when the system is activated. Additionally or alternatively, the system may input a model or serial number (or other identifier) for the head-mounted display into a look-up table listing centerpoints for the model or serial number.
  • centerpoint 240 can correspond to the center of the fixation point 210 and the direction 250 of the head tilt can be some angle (e.g., 10 degrees clockwise, 15 degrees counterclockwise, etc.).
  • the terms are directional components (e.g., x/y, horizontal/vertical) of the vector r as a function of the head tilt angle θ.
  • the respective location of the icon can be defined by a first directional component (e.g., a horizontal component) and a second directional component (e.g., a vertical component).
  • correction can include adjusting the first directional component by a cosine of the degree of head tilt of the user and the second directional component by a sine of the degree of head tilt.
  • the system can determine the difference between the location of the icon before and after head tilt. This difference (for each directional component) can then be the amount (e.g., in pixels, cm, etc.) by which the respective location of the icon can be adjusted.
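  • One way to read the cosine/sine adjustment above is as a standard 2D rotation of each stimulus about the centerpoint by the detected tilt angle, as sketched below. The function name and the example numbers are illustrative assumptions, not values from the disclosure.

```python
import math

def adjust_for_tilt(x, y, cx, cy, tilt_deg):
    """Return adjusted display coordinates for a stimulus at (x, y).

    (cx, cy) is the centerpoint of the visual field and tilt_deg the detected head
    tilt; the horizontal and vertical components of the vector r from the centerpoint
    are recombined using the cosine and sine of the tilt angle.
    """
    t = math.radians(tilt_deg)
    dx, dy = x - cx, y - cy                        # directional components of r
    new_x = cx + dx * math.cos(t) - dy * math.sin(t)
    new_y = cy + dx * math.sin(t) + dy * math.cos(t)
    return new_x, new_y

# Example: a stimulus 100 px right of center under a 2-degree tilt.
print(adjust_for_tilt(1060, 540, 960, 540, -2.0))
```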
  • FIG. 3 illustrates a simplified diagram depicting an exemplary method of accurately replicating a visual field testing pattern from a curved surface on a flat surface. Determining an angle of a visual defect can be important in diagnosing and treating it.
  • the Humphrey analyzer, with its semispherical testing region 310 as depicted in FIG. 3 (top), can provide a visual field testing pattern in the form of visual elements 320 at angles of constant separation (e.g., 10, 20, 30, 40, etc. degrees).
  • the head-mounted display 100 can have a flat surface 330 (shown simplified and greatly enlarged, for illustrative purposes). If stimuli 340 are displayed at equidistant locations as shown, they will not conform to the constant angular separation described above for the Humphrey analyzer, and thus will not characterize the user’s vision accurately. Accordingly, the system may compensate for this difference.
  • the offset distance (dimension b in the bottom of FIG. 3) between the flat surface where stimuli are displayed and the eye of the user can vary, based on the particular construction of the head-mounted display 100, a user’s facial structure, etc. This offset distance can in turn affect where the stimuli 340 need to be displayed.
  • the disclosed methods allow respective locations of the stimuli 340 to be located in a row on the visual field and correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface. This is depicted in FIG. 3 as can be seen by the stimuli 340, at their respective locations, being intersected by radial lines from visual elements 320.
  • the respective locations can be determined based on an offset distance of the head-mounted display and an angle to respective points on the visual testing machine.
  • the angle can be that referred to above (e.g., 10, 20, 30 degrees, etc.).
  • the respective location corresponding to it is shown by dimension a, which is the distance from the center (e.g., 0 degrees) to the respective location on flat surface 330. These quantities are related by a = b * tan(θ), where a is one of the respective locations, b is the offset distance, and θ is the angle.
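  • A short worked example of that relation is given below; the 50 mm offset distance and the particular test angles are illustrative assumptions.

```python
import math

def flat_surface_offsets(offset_b_mm, angles_deg=(10, 20, 30, 40)):
    """Return distance a from the display center for each testing angle, using a = b * tan(theta)."""
    return {deg: offset_b_mm * math.tan(math.radians(deg)) for deg in angles_deg}

# For b = 50 mm this gives roughly {10: 8.8, 20: 18.2, 30: 28.9, 40: 42.0} mm,
# preserving the constant angular separation of the curved-surface pattern.
print(flat_surface_offsets(50.0))
```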
  • the curvature of the head-mounted display can be determined, and the respective locations selected, based on the curvature.
  • the curvature can be known or accessed based on data for a known model of head-mounted display.
  • Such curvature values can be stored for retrieval or accessed via a network connection.
  • the exact relation of how the presence of curvature affects the shifting of the respective location is a function of the geometry of the system.
  • the disclosed methods contemplate a coordinate transformation from the intended angle θ to, for example, an analogous angle c.
  • FIG. 4 is an illustrative system diagram for visual field testing using a head-mounted display, in accordance with one or more embodiments.
  • system 400 may represent the components used to power the head-mounted displays of FIGS. 1A-1C and perform the processes described in FIGS. 5-6.
  • system 400 may include heads up display device 422 and user terminal 424.
  • heads up display device 422 may be worn by a user, while progress of the user may be monitored via user terminal 424.
  • heads up display device 422 and user terminal 424 may be any computing device, including, but not limited to, a laptop computer, a tablet computer, a hand-held computer, other computer equipment (e.g., a server), including “smart,” wireless, wearable, and/or mobile devices.
  • FIG. 4 may also include additional components such as cloud components 410.
  • Cloud components 410 may alternatively be any computing device as described above and may include any type of mobile terminal, fixed terminal, or other device.
  • cloud components 410 may be implemented as a cloud computing system and may feature one or more component devices.
  • system 400 is not limited to three devices. Users may, for instance, utilize one or more devices to interact with one another, one or more servers, or other components of system 400.
  • each of these devices may receive content and data via input/output (hereinafter “I/O”) paths.
  • Each of these devices may also include processors and/or control circuitry to send and receive commands, requests, and other suitable data using the I/O paths.
  • the control circuitry may comprise any suitable processing, storage, and/or input/output circuitry.
  • Each of these devices may also include a user input interface and/or user output interface (e.g., a display) for use in receiving and displaying data.
  • both head-mounted display device 422 and user terminal 424 include a display upon which to display data (e.g., a visual field test pattern).
  • the devices may have neither user input interface nor displays and may instead receive and display content using another device (e.g., a dedicated display device such as a computer screen and/or a dedicated input device such as a remote control, mouse, voice input, etc.).
  • the devices in system 400 may run an application (or another suitable program). The application may cause the processors and/or control circuitry to perform operations related to visual field testing.
  • Each of these devices may also include electronic storages.
  • the electronic storages may include non-transitory storage media that electronically stores information.
  • the electronic storage media of the electronic storages may include one or both of (i) system storage that is provided integrally (e.g., substantially non-removable) with servers or client devices, or (ii) removable storage that is removably connectable to the servers or client devices via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.).
  • the electronic storages may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media.
  • the electronic storages may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources).
  • the electronic storages may store software algorithms, information determined by the processors, information obtained from servers, information obtained from client devices, or other information that enables the functionality as described herein.
  • FIG. 4 also includes communication paths 428, 430, and 432.
  • Communication paths 428, 430, and 432 may include the Internet, a mobile phone network, a mobile voice or data network (e.g., a 5G or LTE network), a cable network, a public switched telephone network, or other types of communications networks or combinations of communications networks.
  • Communication paths 428, 430, and 432 may separately or together include one or more communications paths, such as a satellite path, a fiber-optic path, a cable path, a path that supports Internet communications (e.g., IPTV), free-space connections (e.g., for broadcast or other wireless signals), or any other suitable wired or wireless communications path or combination of such paths.
  • the computing devices may include additional communication paths linking a plurality of hardware, software, and/or firmware components operating together. For example, the computing devices may be implemented by a cloud of computing platforms operating together as the computing devices.
  • Cloud components 410 may be a database configured to store user data for a user.
  • the database may include user data that the system has collected about the user through prior transactions. Alternatively, or additionally, the system may act as a clearing house for multiple sources of information about the user.
  • Cloud components 410 may also include control circuitry configured to perform the various operations needed to generate recommendations.
  • the cloud components 410 may include cloud-based storage circuitry configured to store a first machine learning model that is trained to detect head tilt, adjust visual testing patterns, and/or generate recommendations.
  • Cloud components 410 may also include cloud-based control circuitry configured to determine an intent of the user based on a machine learning model.
  • Cloud components 410 may also include cloud-based input/output circuitry configured to generate the dynamic conversational response during a conversational interaction.
  • Cloud components 410 includes machine learning model 402.
  • Machine learning model 402 may take inputs 404 and provide outputs 406.
  • the inputs may include multiple datasets such as a training dataset and a test dataset.
  • Each of the plurality of datasets (e.g., inputs 404) may include data subsets related to user data and visual testing patterns.
  • outputs 406 may be fed back to machine learning model 402 as input to train machine learning model 402 (e.g., alone or in conjunction with user indications of the accuracy of outputs 406, labels associated with the inputs, or with other reference feedback information).
  • the system may receive a first labeled feature input, wherein the first labeled feature input is labeled with a testing pattern adjustment for the first labeled feature input.
  • the system may then train the first machine learning model to classify the first labeled feature input with the known testing pattern adjustment.
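  • The training loop described above could be sketched as follows. This is a hedged example only: scikit-learn is just one possible library, and the feature columns (head tilt, gaze offsets) and adjustment labels are hypothetical, not drawn from the disclosure.

```python
from sklearn.ensemble import RandomForestClassifier

# Each labeled feature input: [head_tilt_deg, gaze_offset_x, gaze_offset_y] (hypothetical features)
X_train = [
    [0.0, 0.00, 0.00],
    [2.1, 0.03, -0.01],
    [5.4, 0.08, -0.02],
    [-3.2, -0.05, 0.01],
]
# Label: the known testing pattern adjustment for each labeled feature input.
y_train = ["none", "rotate_small", "rotate_large", "rotate_small"]

model = RandomForestClassifier(n_estimators=50, random_state=0)
model.fit(X_train, y_train)                      # train to classify testing pattern adjustments

print(model.predict([[4.8, 0.07, -0.02]]))       # predicted adjustment for new sensor data
```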
  • FIG. 5 is a process flow diagram for correction of a visual field testing pattern by detecting and correcting for head tilt.
  • process 500 may represent the steps taken by one or more devices, as shown in FIGS. 1A-1C, when providing visual field testing using a head-mounted display.
  • process 500 retrieves a visual field testing pattern for a head-mounted display.
  • the system may retrieve a visual field testing pattern for a head-mounted display, wherein the visual field testing pattern comprises stimuli that are displayed at respective locations in a visual field of the head-mounted display.
  • the respective location of the icon can be defined by a first directional component and a second directional component. The first directional component can be adjusted by a cosine of the degree of head tilt of the user and the second directional component can be adjusted by a sine of the degree of head tilt of the user.
  • the respective locations of the stimuli can be located in a row on the visual field and the respective locations can correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface.
  • process 500 (e.g., using one or more components in system 400 (FIG. 4)) generates for display the visual field testing pattern.
  • the system may generate for display the visual field testing pattern on the head-mounted display.
  • process 500 retrieves data from a tilt sensor.
  • the system may retrieve data from a tilt sensor for detecting degrees of head tilt of a user wearing the head-mounted display.
  • the tilt sensor can be, for example, located at the head-mounted display.
  • process 500 determines a degree of head tilt of a user. For example, the system may determine, based on the data retrieved from the tilt sensor, a degree of head tilt of the user.
  • process 500 compares the degree of head tilt.
  • the system may compare, using the control circuitry, the degree of head tilt of the user to a first threshold degree.
  • process 500 can compare the degree of head tilt of the user to a second threshold degree and, in response to the degree of head tilt of the user meeting or exceeding the second threshold degree, automatically adjust a respective location of a stimulus of the plurality of stimuli in the visual field of the head-mounted display by a first amount.
  • the first amount can be based on a distance of the stimulus from a centerpoint of the visual field of the head-mounted display and a direction of the head tilt of the user.
  • process 500 (e.g., using one or more components in system 400 (FIG. 4)) generates a recommendation to the user.
  • the system may generate for display a recommendation to the user.
  • the recommendation can be displayed on the head-mounted display.
  • the generation can also be in response to the degree of head tilt of the user meeting or exceeding the first threshold degree.
  • it is contemplated that the steps or descriptions of FIG. 5 may be used with any other embodiment of this disclosure.
  • the steps and descriptions described in relation to FIG. 5 may be done in alternative orders or in parallel to further the purposes of this disclosure.
  • each of these steps may be performed in any order, in parallel, or simultaneously to reduce lag or increase the speed of the system or method.
  • any of the devices or equipment discussed in relation to FIGS. 1-3 could be used to perform one or more of the steps in FIG. 5.
  • FIG. 6 is a process flow diagram for accurately replicating a visual field testing pattern from a curved surface on a flat surface.
  • process 600 may represent the steps taken by one or more devices, as shown in FIGS. 1A-1C, when providing visual field testing using a head-mounted display.
  • process 600 retrieves a visual field testing pattern for a head-mounted display.
  • the system may retrieve a visual field testing pattern for a head-mounted display, wherein the visual field testing pattern comprises stimuli that are displayed at respective locations in a visual field of the head-mounted display.
  • process 600 determines a curvature of the head-mounted display.
  • the system may determine a curvature of the head-mounted display based on receiving data from one or more sensors. Additionally or alternatively, the system may receive settings based on an initial calibration (e.g., an automatic calibration or a manual calibration) when the system is activated. Additionally or alternatively, the system may input a model or serial number (or other identifier) for the head-mounted display into a look-up table listing curvatures for the model or serial number.
  • the system may determine an offset distance of the head-mounted display based on receiving data from one or more sensors. Additionally or alternatively, the system may receive settings based on an initial calibration (e.g., an automatic calibration or a manual calibration) when the system is activated indicating the offset distance. Additionally or alternatively, the system may input a model or serial number (or other identifier) for the head-mounted display into a look-up table listing offset distance for the model or serial number.
  • process 600 selects the respective locations based on the curvature. For example, the system may automatically adjust the respective locations based on the curvature and/or offset distance determined by the system. In some embodiments, the system may receive the curvature and/or offset distance (e.g., via input entered into a user terminal (e.g., user terminal 424 (FIG. 4)) and adjust the respective locations accordingly.
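  • The look-up approach could be as simple as the sketch below; the model identifiers, curvature radii, and offset distances are hypothetical placeholders, not measurements of any real device.

```python
# Illustrative per-model display geometry table.
DISPLAY_GEOMETRY = {
    # model/serial prefix: (curvature radius in mm, eye-to-display offset distance b in mm)
    "HMD-A100": (0.0, 45.0),      # flat panel
    "HMD-B200": (250.0, 52.0),    # slightly curved panel
}

def geometry_for(model_id, default=(0.0, 50.0)):
    """Return (curvature, offset distance) for a head-mounted display model identifier."""
    return DISPLAY_GEOMETRY.get(model_id, default)

curvature, offset_b = geometry_for("HMD-A100")    # used to select the respective stimulus locations
```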
  • process 600 (e.g., using one or more components in system 400 (FIG. 4)) generates for display the visual field testing pattern on the head-mounted display.
  • the respective locations of the stimuli can be located in a row on the visual field.
  • the respective locations can correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface.
  • each of these steps may be performed in any order, in parallel, or simultaneously to reduce lag or increase the speed of the system or method.
  • any of the devices or equipment discussed in relation to FIGS. 1-3 could be used to perform one or more of the steps in FIG. 6.
  • FIG. 7 illustrates a system diagram for calibrating one or more sensors or other device components.
  • FIG. 7 shows a processing system 702 and sensors 704.
  • Processing system 702 may include hardware (e.g., one or more processors, memory, etc.), software, or a combination of hardware and software.
  • Processing system 702 may include a testing subsystem 710, a communication subsystem 712, and calibration subsystem 714. Each of these subsystems may include both hardware and software components.
  • testing subsystem 710 may include processors and/or memory.
  • Communication subsystem 712 may include networking components (e.g., network card) and software to run the networking components, thus including software and hardware.
  • Calibration subsystem 714 may also include software and hardware components.
  • FIG. 7 also shows sensor(s) 704.
  • Sensor(s) 704 may include one or more eye tracking sensors (e.g., inward directed sensors or other sensors). Sensors 704 may be able to perform eye tracking and gaze calculations. In some embodiments, gaze calculations may be performed using processing system 702.
  • FIG. 7 shows display 706 which may be any suitable display (e.g., a head-mounted display shown in FIGs. 1A-1C).
  • processing system 702 and sensors 704 may be located in the same enclosure as display 706.
  • the head-mounted display may include processing system 702 and sensors 704.
  • testing subsystem 710 and calibration subsystem 714 may be combined into a single subsystem.
  • Testing subsystem 710 may perform visual testing by causing display of stimuli, for example, on a head-mounted display or other interface of a user device.
  • testing subsystem 710 may cause, during a visual field test, a first stimulus to be presented on a user device at a first location.
  • the user device may include a head-mounted display or may be connected to a head-mounted display (e.g., using a wire or wirelessly).
  • Testing subsystem 710 may cause the first stimulus to be presented on the user device by transmitting a command to display 706 for displaying a stimulus (e.g., a visual indicator) at a specific location (e.g., the first location) on the display 706.
  • sensor(s) 704 may perform eye tracking and gaze detection operations on the user’s eye(s).
  • Sensor(s) 704 may include components that detect both the directional movement and a gaze location of the user’s eye(s).
  • Sensor(s) 704 may transmit that data to processing system 702.
  • sensor(s) 704 may transmit raw tracking data to processing system 702 which may include components to detect both the directional movement and the gaze location of the user’s eyes.
  • Communication subsystem 712 may receive the tracking data (tracking data may be referred to as feedback data) and pass that data to calibration subsystem 714 and/or testing subsystem 710.
  • calibration subsystem 714 may obtain, during the visual field test, first feedback data related to the first stimulus.
  • Calibration subsystem 714 may detect, based on the first feedback data, an eye of the user failing to fixate on the first location corresponding to the first stimulus.
  • the feedback data may include a gaze location of the user as a particular stimulus is displayed, for example, on a head-mounted display.
  • Calibration subsystem 714 may compare the gaze location as detected by the eye tracking sensors (e.g., sensor(s) 704) and the location of the stimulus to determine whether the two locations match. In some embodiments, calibration subsystem 714 may determine that the two locations match even if the two locations are not identical (e.g., if the locations are off by a threshold number, ratio, or percentage).
  • the threshold may be determined based on the type and accuracy of the eye tracking sensor(s). As an example, if the locations match, calibration subsystem 714 may detect that the eye(s) of the user fixated on the first location (i.e., the stimulus). However, if the locations do not match, calibration subsystem 714 may detect that the eye(s) of the user failed to fixate on the first location (i.e., the stimulus).
  • For example, calibration subsystem 714 may determine that the eye of the user fixated on a different location that is within a threshold distance away from the first location.
  • In some embodiments, the matching threshold may be one percent. If the detected gaze location is within that threshold of the first location, calibration subsystem 714 may determine that the user is gazing at the stimulus and no calibration is needed.
  • If the gaze location is farther from the first location than the matching threshold but still within the larger threshold distance, calibration subsystem 714 may determine that the user is also gazing at the stimulus and that calibration is needed.
  • Calibration subsystem 714 may determine that the user has seen the stimulus and is gazing at the stimulus based on, for example, eye movement direction.
  • The system may have a threshold for detecting the user’s gaze. For example, the threshold may be set at twenty percent or another suitable percentage; if the gaze location is within twenty percent of the first location but does not match it, calibration subsystem 714 may determine that the user is gazing at the stimulus and calibration is needed. The percentage may be based on the type of eye tracking sensors used.
  • testing subsystem 710 and/or calibration subsystem 714 may determine that the user has not seen the first stimulus based on determining that the eye of the user fixated on a different location that is outside the threshold distance away from the first location. For example, if the user fixated on a point that is more than twenty percent away from the stimulus, testing subsystem 710 and/or calibration subsystem 714 may determine that the user has not seen the stimulus.
  • Calibration subsystem 714 or testing subsystem 710 may determine that the user has seen the first stimulus even though the user failed to fixate on the first location. In some embodiments, testing subsystem 710 may determine that the user has seen the first stimulus when the difference between the first location (i.e., the location of the stimulus) and the gaze location determined by the eye tracking sensor(s) is within a threshold ratio or percentage. In some embodiments, calibration subsystem 714 may determine that the user has seen the first stimulus even when the difference between the first location (i.e., the location of the stimulus) and the gaze location determined by the eye tracking sensor(s) is above the threshold ratio or percentage.
  • calibration subsystem 714 may determine that the first stimulus was seen by the user by determining, based on the first feedback data, that the user’s eye moved a threshold amount toward the first location.
  • calibration subsystem 714 may determine whether the user’s eye(s) moved toward the stimulus (e.g., the first location) by a threshold amount.
  • a threshold amount may be a percentage, a ratio, or another suitable threshold.
  • Even when calibration subsystem 714 determines that the first location (i.e., the location of the stimulus) does not match the gaze location, calibration subsystem 714 may still determine that the user has seen the stimulus and, in response, update the visual test data and also calibrate the eye tracking sensors. For example, when testing subsystem 710 receives the eye tracking data from the sensors (e.g., sensor(s) 704), testing subsystem 710 may determine that the first location and the gaze location do not match and thus may determine that the user did not see the stimulus. However, calibration subsystem 714 may correct that determination by performing the operations above to indicate that the user has actually seen the stimulus and that the initial determination was inaccurate due to a need for calibration.
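  • As a minimal Python sketch of the decision logic in the preceding bullets, comparing the gaze location with the stimulus location and checking whether the eye moved toward the stimulus: the function names, pixel coordinates, and specific threshold values (a one-percent match threshold and a twenty-percent "seen, but calibration needed" threshold, expressed as fractions of the display width) are illustrative assumptions rather than values fixed by the disclosure:

      import math

      def classify_response(stimulus_xy, gaze_xy, display_width,
                            match_frac=0.01, seen_frac=0.20):
          # Returns "seen_no_calibration" when the gaze matches the stimulus
          # location, "seen_needs_calibration" when the gaze stopped near (but
          # not on) the stimulus, and "not_seen" otherwise.
          offset = math.hypot(gaze_xy[0] - stimulus_xy[0],
                              gaze_xy[1] - stimulus_xy[1])
          if offset <= match_frac * display_width:
              return "seen_no_calibration"
          if offset <= seen_frac * display_width:
              return "seen_needs_calibration"
          return "not_seen"

      def moved_toward(start_xy, end_xy, stimulus_xy, min_fraction=0.5):
          # True if the eye closed at least min_fraction of its initial
          # distance to the stimulus while the stimulus was shown.
          d0 = math.hypot(stimulus_xy[0] - start_xy[0], stimulus_xy[1] - start_xy[1])
          d1 = math.hypot(stimulus_xy[0] - end_xy[0], stimulus_xy[1] - end_xy[1])
          return d0 > 0 and (d0 - d1) / d0 >= min_fraction

      # Example: stimulus at (800, 450); gaze starts at (200, 300) and settles at (830, 470).
      print(classify_response((800, 450), (830, 470), display_width=1600))
      print(moved_toward((200, 300), (830, 470), (800, 450)))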
  • testing subsystem 710 and calibration subsystem 714 may be part of the same subsystem.
  • calibration may be part of testing subsystem 710 and part of the visual testing process.
  • the operations described above may be performed by the same subsystem. That is, testing subsystem 710 may cause, during a visual field test, a first stimulus to be presented on a user device at a first visual field location, obtain, during the visual field test, first feedback data related to the first stimulus, the first feedback data indicating that the user has seen the first stimulus, and detect, based on the first feedback data, an eye of the user failing to fixate on the first visual field location corresponding to the first stimulus.
  • Processing system 702 may store a data structure for determining whether calibration is needed.
  • FIG. 8 illustrates a data structure for determining whether calibration is needed.
  • FIG. 8 shows data structure 800, which includes a stimulus identification field 802, a display location field 804, a gaze location field 806, and an eye movement direction field 808.
  • Stimulus identification field 802 stores an identifier for a particular stimulus that has been displayed or is being displayed to the user.
  • Display location field 804 includes a corresponding location of the stimulus (e.g., a visual field location).
  • Gaze location field 806 stores a location of the user’s gaze in connection with the corresponding stimulus as detected by the eye-tracking sensors.
  • Eye movement direction field 808 may indicate whether the user’s eyes moved toward the stimulus.
  • processing system 702 may use the data structure of FIG. 8. For example, when testing subsystem 710 instructs display 706 to display a stimulus, testing subsystem 710 may store a stimulus identifier for the stimulus in stimulus identifier field 802. In addition, testing subsystem 710 may store a visual field location for the stimulus in display location field 804. Display location field 804 may hold coordinates of the display where the stimulus is visible. When eye tracking data is received at processing system 702, calibration subsystem 714 or testing subsystem 710 may determine the gaze location of the user and store that information in gaze location field 806. The gaze location may include coordinates of the display. In addition, testing subsystem 710 or calibration subsystem 714 may store eye movement direction in eye movement direction field 808.
  • Testing subsystem 710 or calibration subsystem 714 may compare the data in display location field 804 and gaze location field 806. If the data in those fields matches, processing system 702 may determine that the user has seen the stimulus. If the data does not match, testing subsystem 710 or calibration subsystem 714 may use the data in the eye movement direction field 808 to determine whether the user’s eye(s) moved toward the stimulus. For example, calibration subsystem 714 or testing subsystem 710 may compare the eye tracking data with the location of the stimulus over time after the stimulus was displayed to make the determination.
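  • A brief Python sketch of how the fields of data structure 800 might be represented and compared in software; the record type, field names, and pixel tolerance below are illustrative assumptions that mirror FIG. 8, not a required implementation:

      from dataclasses import dataclass
      from typing import Optional, Tuple

      @dataclass
      class StimulusRecord:
          stimulus_id: str                                  # stimulus identification field 802
          display_location: Tuple[int, int]                 # display location field 804 (x, y)
          gaze_location: Optional[Tuple[int, int]] = None   # gaze location field 806
          eye_movement_direction: Optional[bool] = None     # field 808: moved toward stimulus?

      def locations_match(record, tolerance_px=25):
          # Compare display location field 804 with gaze location field 806.
          if record.gaze_location is None:
              return False
          dx = record.gaze_location[0] - record.display_location[0]
          dy = record.gaze_location[1] - record.display_location[1]
          return dx * dx + dy * dy <= tolerance_px * tolerance_px

      # Example: populate the record as the stimulus is shown and feedback arrives.
      rec = StimulusRecord("S1", (800, 450))
      rec.gaze_location = (806, 447)
      rec.eye_movement_direction = True
      print(locations_match(rec))   # True -> stimulus seen, no calibration needed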
  • calibration subsystem 714 may perform, during the visual field test, a calibration of the user device. For example, calibration subsystem 714 may calibrate the eye tracking sensor(s) so that the gaze location is aligned with the first visual field location (i.e., the location of the stimulus). Furthermore, calibration subsystem 714 or testing subsystem 710 may store, during the visual field test, a first indication that the user has seen the first stimulus. In some embodiments, to calibrate the user device, calibration subsystem 714 may adjust one or more parameters of a function for detecting a location that the user is viewing. For example, eye tracking sensors may be collecting raw tracking data.
  • the raw tracking data may include measurements of the motion of the eye(s) relative to the user’s head and/or measurement of the point of the user’s gaze.
  • Output of the sensor(s) may depend on the type of sensor used.
  • Calibration subsystem 714 may then use one or more functions to determine the gaze location and/or eye movement direction. Thus, in the process of calibration one or more of those functions (e.g., the parameters of those functions) may be adjusted.
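  • One simple, assumed way to adjust the parameters of such a gaze-location function is sketched below in Python: a mapper keeps an additive offset that is updated whenever the user is judged to have seen a stimulus without fixating exactly on it. The class name, the offset-only correction model, and the learning rate are assumptions for illustration:

      class GazeMapper:
          # Maps raw eye tracking output to a display location using
          # adjustable parameters (here, a simple additive offset).

          def __init__(self):
              self.offset_x = 0.0
              self.offset_y = 0.0

          def gaze_location(self, raw_x, raw_y):
              return (raw_x + self.offset_x, raw_y + self.offset_y)

          def calibrate(self, raw_xy, stimulus_xy, learning_rate=1.0):
              # Nudge the parameters so the reported gaze lands on the stimulus
              # the user is judged to have seen.
              gx, gy = self.gaze_location(*raw_xy)
              self.offset_x += learning_rate * (stimulus_xy[0] - gx)
              self.offset_y += learning_rate * (stimulus_xy[1] - gy)

      mapper = GazeMapper()
      mapper.calibrate(raw_xy=(830, 470), stimulus_xy=(800, 450))
      print(mapper.gaze_location(830, 470))   # now reports (800.0, 450.0)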
  • Processing system 702 may repeat the test with another stimulus, which can be displayed subsequent to the first stimulus.
  • testing subsystem 710 may cause, during the visual field test, a second stimulus to be presented on the user device at a second visual field location.
  • testing subsystem 710 may transmit a command to the head-mounted display to display the second stimulus at the second visual field location.
  • Testing subsystem 710 and/or calibration subsystem 714 may obtain, during the visual field test, second feedback data related to the second stimulus.
  • the second feedback data may be obtained using the same mechanism as the first feedback data.
  • Testing subsystem 710 and/or calibration subsystem 714 may detect, based on the second feedback data, that the eye of the user moved toward and fixated on the second visual field location. Based on this determination, processing system 702 may determine that the user has seen the second stimulus and store a second indication that the user has seen the second stimulus. Thus, no calibration of the user device (e.g., eye tracking sensors) is performed in response to detecting that the eye of the user fixated on the second visual field location. Based on the first indication and the second indication, processing system 702 may generate a visual defect assessment. For example, processing system 702 may determine whether the user has any visual defects. It should be noted that one advantage of calibrating the sensors during each stimulus is that it enables the process to accurately adjust for user shifting or the user device otherwise providing inaccurate data.
  • FIG. 9 is a process flow diagram for calibrating a head-mounted display or other user device.
  • processing system 702 causes a stimulus to be presented to a user device at a visual field location.
  • Processing system 702 may use a processor to generate a command to a display (e.g., head-mounted display) to present a stimulus on the display at a particular location. The location may be included in the command.
  • Processing system 702 may pass the command to the display (e.g., head-mounted display).
  • processing system 702 obtains feedback data related to the stimulus.
  • Processing system 702 may obtain the feedback data from one or more eye tracking sensors. The feedback data may have been recorded for a time right after the stimulus has been presented and before the next stimulus is presented.
  • processing system 702 detects an eye of the user failing to fixate on the visual field location corresponding to the stimulus. For example, processing system 702 may analyze the feedback data using one or more processors to make the detection. At 908, processing system 702 performs a calibration of the user device. For example, in response to determining that the user failed to fixate on the visual field location corresponding to the stimulus, processing system 702 may adjust one or more functions for determining a gaze location and/or movement direction. At 910, processing system 702 stores an indication that the user has seen the stimulus. The indication may be stored in association with the corresponding stimulus to signal to the system that the user has seen the stimulus.
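  • A compact Python sketch of the per-stimulus flow of FIG. 9 (present the stimulus, obtain feedback, detect a missed fixation, calibrate, store an indication). The display and sensor interfaces, helper names, and the injected classify/mapper objects are placeholders assumed for illustration:

      def run_visual_field_test(stimuli, display, sensors, mapper, classify):
          # stimuli  - iterable of (stimulus_id, (x, y)) display locations
          # display  - object with show(stimulus_id, location)
          # sensors  - object with read_feedback() returning raw gaze coordinates
          # mapper   - object like the GazeMapper sketched earlier
          # classify - callable(stimulus_xy, gaze_xy) returning "seen_no_calibration",
          #            "seen_needs_calibration", or "not_seen"
          results = {}
          for stimulus_id, location in stimuli:
              display.show(stimulus_id, location)           # present the stimulus
              raw_xy = sensors.read_feedback()              # obtain feedback data
              gaze_xy = mapper.gaze_location(*raw_xy)
              outcome = classify(location, gaze_xy)         # detect fixation or failure
              if outcome == "seen_needs_calibration":
                  mapper.calibrate(raw_xy, location)        # calibrate during the test
              results[stimulus_id] = outcome != "not_seen"  # store seen/not-seen indication
          return results   # basis for a visual defect assessment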
  • FIG. 9 may be used with any other embodiment of this disclosure.
  • the steps and descriptions described in relation to FIG. 9 may be done in alternative orders or in parallel to further the purposes of this disclosure.
  • each of these steps may be performed in any order, in parallel, or simultaneously to reduce lag or increase the speed of the system or method.
  • any of the devices or equipment discussed in relation to FIGS. 1A-1E could be used to perform one or more of the steps in FIG. 9.
  • FIG. 10 illustrates a simplified diagram depicting an exemplary relationship between a virtual plane and a display plane as may be used to calibrate a head-mounted display.
  • the disclosed systems may generate a calibration pattern comprising a number of stimuli (e.g., one or more graphical elements, referred to herein as “icons”) for display at the head-mounted display.
  • the calibration may take many forms and may comprise one or more stimuli being displayed in series and/or in parallel.
  • the system may display a pattern in which the stimuli are displayed at particular positions. The positions may be defined by the system in terms of a height, width and/or viewing angle.
  • the system may generate the stimuli at the extremes of the visual field in order to achieve the best calibration. For example, the system may display the stimuli in one or more corners of the visual field in order to receive the best measurement for calibrating a user’s gaze location on a single fixation point (e.g., a centerpoint in the visual field).
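  • As an illustrative Python sketch of a calibration pattern whose stimuli sit at the extremes of the field of view, the helper below returns four corner positions inset by a margin; the margin value and the ordering of the corners are assumptions:

      def corner_calibration_pattern(display_width, display_height, margin=40):
          # Corner stimulus positions (upper-left, upper-right, lower-right,
          # lower-left) for an edge calibration pattern, inset so the stimuli
          # remain fully visible.
          return [
              (margin, margin),
              (display_width - margin, margin),
              (display_width - margin, display_height - margin),
              (margin, display_height - margin),
          ]

      print(corner_calibration_pattern(1600, 900))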
  • FIG. 10 depicts a simplified representation showing a viewing plane 1030 having a number of stimuli (e.g. edge stimuli 1032, 1034, 1036, 1038) that may be seen by a user.
  • eye tracking data obtained from a user viewing any one of these edge stimuli may not correspond to where the eye tracking data would be expected based on where the edge stimulus is displayed by the head-mounted display.
  • Example eye tracking data 1010 is shown corresponding to edge stimulus 1034.
  • the eye tracking data is depicted as a dashed line representing the path of the eye over its acquisition time. In this example, even though the system generates edge stimulus 1034, the eye tracking data generally surrounds the perceived edge point 1024.
  • the system may retrieve calibration data of a given interval.
  • the use of the given interval allows the system to normalize data during this time to remove outliers that may occur as a natural result of the calibration process.
  • the edge point can be determined by receiving eye tracking data over periods of time referred to herein as edge calibration periods.
  • the edge calibration periods may be, for example, one second, five seconds, etc.
  • eye tracking data may be averaged over such periods of time to generate an average location. This process may be repeated for a number of edge stimuli, with four shown in the example of FIG. 10.
  • edge stimuli 1032, 1034, 1036, and 1038 generate corresponding edge points 1022, 1024, 1026, and 1028.
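  • A Python sketch of reducing the samples collected during one edge calibration period to a single edge point; the trimmed averaging shown here is one assumed way to normalize the data and suppress outliers, not the only approach contemplated:

      import statistics

      def edge_point(samples, trim_fraction=0.2):
          # Average the (x, y) eye tracking samples from one edge calibration
          # period into a single edge point, trimming the most extreme samples
          # on each axis to suppress outliers.
          def trimmed_mean(values):
              values = sorted(values)
              k = int(len(values) * trim_fraction)
              kept = values[k:len(values) - k] or values
              return statistics.fmean(kept)
          xs = [s[0] for s in samples]
          ys = [s[1] for s in samples]
          return (trimmed_mean(xs), trimmed_mean(ys))

      # Example: samples gathered while one edge stimulus was shown.
      print(edge_point([(101, 52), (99, 48), (100, 50), (140, 90), (98, 49)]))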
  • the system may generate edge stimuli on edges of the field of view of the head-mounted display, for example, along one or more of the left, right, upper, or lower edges.
  • edge stimuli may be generated at the corners of the field, for example, upper left, upper right, lower left, or lower right. It is also contemplated that stimuli may be generated anywhere within the field of view such that the presently described calibrations may be performed.
  • the edge stimuli may define a display plane 1030 (i.e., a plane established by the system on which the edge stimuli are displayed)
  • the edge points may also define a virtual plane 1020.
  • the system may calculate a projective transform matrix based on the edge eye tracking data that converts any location in virtual plane 1020 to display plane 1030.
  • stimuli or other calibration patterns may be generated by the head-mounted display and the obtained eye tracking data may be mapped back onto the display plane for comparison with the calibration pattern.
  • the system may calculate a projective transform matrix that is especially useful for a general transformation (e.g., one that does not force parallelism to be preserved, as parallelism may not be preserved when formerly parallel stimuli are viewed by a person).
  • the below example illustrates how the system may generate and/or utilize a projective transform matrix for a coordinate transformation between the two planes 1020 and 1030:
  • the 2x2 “a” submatrix is a rotation matrix
  • the 2x1 “b” submatrix is a translation vector
  • the 1x2 “c” submatrix is a projection vector.
  • the x,y elements correspond to the x,y coordinates of the edge stimulus in the display plane (e.g., edge stimulus 1034) and the x’,y’ elements correspond to the x,y coordinates of the point in the virtual plane (e.g., point 1024).
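  • The full matrix equation appears as a figure in the original document; the Python sketch below assumes the standard 3x3 homography form that is consistent with the submatrices described above (a 2x2 "a" block, a 2x1 "b" translation vector, and a 1x2 "c" projection vector) and shows how a point is mapped between the planes using homogeneous coordinates. The numeric values are placeholders:

      import numpy as np

      # H = [[a11, a12, b1],
      #      [a21, a22, b2],
      #      [c1,  c2,  1 ]]   assumed standard form: "a" = 2x2 linear/rotation block,
      #                         "b" = translation vector, "c" = projection vector
      H = np.array([[0.98, -0.02, 12.0],
                    [0.03,  0.97, -8.0],
                    [1e-4,  2e-5,  1.0]])

      def apply_projective_transform(H, point_xy):
          # Map (x, y) -> (x', y') using homogeneous coordinates.
          x, y = point_xy
          xh, yh, w = H @ np.array([x, y, 1.0])
          return (xh / w, yh / w)

      print(apply_projective_transform(H, (100.0, 50.0)))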
  • the system may execute pseudocode such as pseudocode 1300 shown in FIG. 13.
  • the projective transformation can be represented as a transformation of an arbitrary quadrangle (i.e., a system of four points) into another one.
  • the system may use a transform based on a different number of points.
  • the system may use an affine transformation, which is a transformation of a triangle.
  • the system may select the type of transform based on the number of stimuli generated.
  • the system may select the number of stimuli generated based on one or more criteria. For example, the system may determine a number of stimuli needed to achieve a determined amount of accuracy and/or meet a particular threshold level of precision.
  • the system may likewise select the number of stimuli based on a type of test, amount of calibration needed, and/or a frequency of calibration.
  • the system may determine that a four point (e.g., projective transform) calibration is used at the initiation of the use of a head-mounted device.
  • the system may then determine (e.g., using a hierarchy of criteria) whether an additional calibration needs to be performed, and if so, how many stimuli are required to be displayed.
  • FIG. 11 illustrates a simplified diagram depicting an exemplary central point and boundary as used to generate a calibration score.
  • the system may generate for display a center stimulus 1100 on the head-mounted display at a center location.
  • The system may also receive center eye tracking data during a center calibration period, similar to the edge calibration period(s) described above.
  • the calculated point in the virtual plane may be transformed to what is referred to herein as a “gaze location” in the display plane by the system utilizing the projective transform matrix.
  • the system may generate a calibration score based on a difference 1140 (e.g., a delta in pixels, mm, or other similar distance metrics) between the center stimulus 1100 and the gaze location 1110.
  • the inset shows this example in greater detail and includes exemplary eye tracking data 1130 and the difference 1140 between center stimulus 1100 and gaze location 1110.
  • the difference 1140 may be similarly calculated in the virtual plane 1020 via a determination of equivalent points for the center stimulus and gaze location. In this way, it is contemplated that any combination of points may be utilized in either plane and related to each other via the projective transform matrix to calculate differences, locations relative to a boundary (as described below), etc.
  • the system may assess the accuracy of the calibration based on whether the gaze location and/or eye tracking data is within a prescribed boundary. For example, as shown in FIG. 11, the system may generate boundary 1120. In some implementations, such a boundary may be a circle having a given radius from the center stimulus 1100, but other boundary shapes such as square, hexagonal, etc. may be used. While in some implementations the boundary may be visually displayed by the head-mounted display, this is not necessary and instead the boundary may merely reside as coordinates or other boundary defining algorithm in computer memory. Accordingly, the system may determine the calibration score based on the size (e.g. radius) of the boundary.
  • the calibration score may be indicative of the confidence in the calibration.
  • the system may repeat at least a portion of the calibration (e.g., the acquiring of edge eye tracking data, center eye tracking data, and/or calculation of the projective transform matrix), with the size of the boundary made larger (e.g., a larger radius boundary). For example, in one embodiment if the gaze location is calculated to be within the first (or initial) boundary generated, the calibration may be assigned a score of 100 (perhaps corresponding to the best possible calibration).
  • the radius of the boundary may be increased 20% and if that calibration succeeds then it may be assigned a score of 90. Any such relationship between boundary size and calibration score may be used by the system, as implemented by a person of skill.
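  • A Python sketch of one assumed mapping from boundary size to calibration score along the lines described above (score 100 inside the initial boundary, 90 after one enlargement, and so on). For brevity the sketch only re-checks the same measurement; per the disclosure, at least a portion of the calibration would be repeated with the larger boundary. The growth factor, score decrement, and attempt limit are assumptions:

      def calibrate_with_expanding_boundary(gaze_distance_px, initial_radius_px=30,
                                            growth=1.2, score_step=10, max_attempts=5):
          # Return (score, radius_used) for the smallest boundary that contains
          # the gaze location, lowering the score each time the boundary grows.
          radius = initial_radius_px
          score = 100
          for _ in range(max_attempts):
              if gaze_distance_px <= radius:
                  return score, radius
              radius *= growth          # enlarge the boundary and retry
              score -= score_step
          return 0, radius              # calibration not accepted

      # Example: gaze landed 42 px from the center stimulus.
      print(calibrate_with_expanding_boundary(42))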
  • the system may also determine, based on the gaze location, whether a user is looking at the center location. Such a determination may be made by the system, for example, if the gaze location is outside of a bounding box, an area defined by the edge stimuli, etc. Another related implementation that can further refine the calibration is not allowing large, sustained deviations in the eye tracking data, even if the average location is within one of the above-described boundaries. For example, the system may determine that the user is not looking at the center location based on whether at least a portion of the center eye tracking data deviates from the gaze location more than a spatial deviation threshold and for longer than a temporal deviation threshold.
  • the spatial deviation threshold may be any distance outside the boundary, but may also be a larger boundary (e.g., 1.1x or 1.5x the radius of the present boundary). While a brief excursion may be allowed, the temporal deviation threshold may be set by the system to be, for example, 1 ms, 10 ms, 100 ms, etc. In this way, the system would determine that the calibration failed if the user’s gaze drifted, for example, far to the left and stayed there, indicating a possible loss of focus or attention on the calibration process.
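  • A Python sketch of the spatial/temporal deviation check: the calibration is treated as failed if the eye tracking samples stay farther from the gaze location than a spatial threshold continuously for longer than a temporal threshold. Millisecond timestamps and the threshold values are illustrative assumptions:

      import math

      def sustained_deviation(samples, gaze_xy, spatial_threshold_px, temporal_threshold_ms):
          # samples: list of (timestamp_ms, x, y).  Return True if the gaze
          # strayed beyond the spatial threshold continuously for longer than
          # the temporal threshold (i.e., the user stopped attending to the
          # center stimulus).
          deviation_start = None
          for t, x, y in samples:
              if math.hypot(x - gaze_xy[0], y - gaze_xy[1]) > spatial_threshold_px:
                  if deviation_start is None:
                      deviation_start = t
                  elif t - deviation_start > temporal_threshold_ms:
                      return True
              else:
                  deviation_start = None
          return False

      samples = [(0, 800, 450), (20, 805, 452), (40, 900, 520), (60, 905, 522), (80, 910, 525)]
      print(sustained_deviation(samples, (800, 450), spatial_threshold_px=50, temporal_threshold_ms=30))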
  • FIG. 12 is a process flow diagram for calibrating a head-mounted display.
  • process 1200 may represent steps taken by one or more devices, as shown in FIGS. 1A-1E, when calibrating the head-mounted display.
  • process 1200 receives edge eye tracking data.
  • the system may receive edge eye tracking data during edge calibration periods. Additionally or alternatively, the system may generate for display a number of edge stimuli on the head-mounted display. Additionally or alternatively, the system may generate edge stimuli on edges of a field of view of the head-mounted display.
  • process 1200 calculates a projective transform matrix.
  • the system may calculate a projective transform matrix based on the edge eye tracking data.
  • the system may use pseudocode 1300 (FIG. 13) and/or the process described in FIG. 10.
  • process 1200 receives center eye tracking data.
  • the system may receive center eye tracking data during a center calibration period. Additionally or alternatively, the system may generate for display a center stimulus on the head-mounted display at the center location. Additionally or alternatively, the system may generate a boundary around the center stimulus and may display the boundary at the head-mounted display.
  • process 1200 applies a projective transform matrix to the center eye tracking data.
  • the system may apply the projective transform matrix to the center eye tracking data to determine a gaze location.
  • process 1200 (e.g., using one or more components in system 400 (FIG. 4)) generates a calibration score.
  • the system may generate a calibration score based on a difference between a central location and the gaze location.
  • the calibration score may be indicative of the accuracy of an eye test performed with the head-mounted display. Additionally or alternatively, the system may determine, based on the difference, whether the gaze location is inside the boundary.
  • the calibration score may be based on a size of the boundary. Additionally or alternatively, the system, in response to the difference indicating that the gaze location is outside the boundary, may repeat at least a portion of the calibration, wherein the size of the boundary is larger. Additionally or alternatively, the system may determine whether a user is looking at the center location based on the gaze location. In response to the determination that the user is not looking at the center location, the system may repeat at least a portion of the calibration. Additionally or alternatively, the determination by the system that the user is not looking at the center location may require that at least a portion of the center eye tracking data deviates from the gaze location more than a spatial deviation threshold and for longer than a temporal deviation threshold.
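  • Putting the steps of FIG. 12 together, a high-level Python sketch is shown below. All helpers are injected and their names are placeholders (for example, reduce_samples could be the averaging helper sketched earlier, and solve_transform/apply_transform the projective transform helpers):

      def calibrate_head_mounted_display(edge_samples_per_period, edge_stimuli_xy,
                                         center_samples, center_stimulus_xy,
                                         reduce_samples, solve_transform,
                                         apply_transform, score_fn):
          # reduce_samples(samples) -> (x, y) averaged point
          # solve_transform(src_pts, dst_pts) -> projective transform matrix
          # apply_transform(H, point) -> mapped point
          # score_fn(gaze_xy, center_xy) -> calibration score
          # Receive edge eye tracking data during the edge calibration periods.
          edge_points = [reduce_samples(s) for s in edge_samples_per_period]
          # Calculate the projective transform matrix from the edge data.
          H = solve_transform(edge_points, edge_stimuli_xy)
          # Receive center eye tracking data during the center calibration period.
          center_point = reduce_samples(center_samples)
          # Apply the transform to obtain the gaze location in the display plane.
          gaze_location = apply_transform(H, center_point)
          # Generate a calibration score from the center-location/gaze difference.
          return gaze_location, score_fn(gaze_location, center_stimulus_xy)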
  • It is contemplated that the steps or descriptions of FIG. 12 may be used with any other embodiment of this disclosure.
  • the steps and descriptions described in relation to FIG. 12 may be done in alternative orders or in parallel to further the purposes of this disclosure. For example, each of these steps may be performed in any order, in parallel, or simultaneously to reduce lag or increase the speed of the system or method.
  • any of the devices or equipment discussed in relation to FIGS. 1A-1E could be used to perform one or more of the steps in FIG. 12.
  • Figure 13 is illustrative pseudocode for calibrating a head-mounted display in accordance with certain aspects of the present disclosure.
  • pseudocode 1300 represents illustrative pseudocode for calculating a projective transform matrix as described herein.
  • the below example illustrates how the system may generate and/or utilize a projective transform matrix for a coordinate transformation between the two planes (e.g., planes 1020 and 1030 of FIG. 10).
  • pseudocode 1300 may generate values for the projective transform matrix described in FIG. 10.
  • the points identified in pseudocode 1300 may correspond to the x,y coordinates of the edge stimulus in the display plane (e.g., edge stimulus 1034) and x,y coordinates of the point in the virtual plane (e.g., point 1024).
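  • The actual pseudocode is reproduced in FIG. 13 of the original document and is not restated here; as a hedged stand-in, the Python sketch below solves the eight unknown entries of a 3x3 projective transform (lower-right entry fixed to 1) from four point correspondences, the standard quadrangle-to-quadrangle construction. OpenCV's cv2.getPerspectiveTransform performs an equivalent computation:

      import numpy as np

      def projective_transform_matrix(src_pts, dst_pts):
          # src_pts, dst_pts: four (x, y) pairs each (e.g., virtual-plane edge
          # points and the corresponding display-plane edge stimulus locations).
          # Returns the 3x3 matrix H mapping src to dst in homogeneous coordinates.
          A, b = [], []
          for (x, y), (xp, yp) in zip(src_pts, dst_pts):
              A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp]); b.append(xp)
              A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp]); b.append(yp)
          h = np.linalg.solve(np.array(A, dtype=float), np.array(b, dtype=float))
          return np.append(h, 1.0).reshape(3, 3)

      # Example: map a unit square onto a slightly skewed quadrangle.
      src = [(0, 0), (1, 0), (1, 1), (0, 1)]
      dst = [(10, 12), (110, 8), (118, 112), (6, 105)]
      print(np.round(projective_transform_matrix(src, dst), 3))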
  • a method comprising: retrieving a visual field testing pattern for a head-mounted display; and generating for display the visual field testing pattern on the head-mounted display.
  • the first amount is based on a distance of the stimulus from a centerpoint of the visual field of the head-mounted display and a direction of the head tilt of the user.
  • the respective location of the stimulus is defined by a first directional component and a second directional component, and wherein the first directional component is adjusted by a cosine of the degree of head tilt of the user and the second directional component is adjusted by a sine of the degree of head tilt of the user.
  • the respective locations of the stimulus are located in a row on the visual field, and wherein the respective locations correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface.
  • a method comprising: receiving edge eye tracking data during a plurality of edge calibration periods; calculating a projective transform matrix based on the edge eye tracking data; receiving center eye tracking data during a center calibration period; applying the projective transform matrix to the center eye tracking data to determine a gaze location; and generating a calibration score based on a difference between a center location and the gaze location.
  • 21: The method of embodiment 14 or any of the preceding embodiments that depend therefrom, further comprising: in response to the difference indicating that the gaze location is outside the boundary, repeating at least a portion of the calibration, wherein the size of the boundary is larger. 22: The method of embodiment 14 or any of the preceding embodiments that depend therefrom, further comprising: determining whether a user is looking at the center location based on the gaze location; and in response to the determination that the user is not looking at the center location, repeating at least a portion of the calibration.
  • a system comprising: a head-mounted display; inward directed sensors, located at the head-mounted display, configured to track pupil movement; storage circuitry configured to store a plurality of stimuli that are displayed at respective locations in a visual field of the head-mounted display; and control circuitry configured to perform operations comprising those of any of embodiments 1-23.
  • a tangible, non-transitory, machine-readable medium storing instructions that, when executed by a data processing apparatus, cause the data processing apparatus to perform operations comprising those of any of embodiments 1-23.
  • a system comprising: one or more processors; and memory storing instructions that, when executed by the processors, cause the processors to effectuate operations comprising those of any of embodiments 1-23.
  • a system comprising means for performing any of embodiments 1-23.
  • a method comprising: causing a first stimulus to be presented on a user device at a first location; obtaining first feedback data related to the first stimulus presented at the first location; performing a calibration of the user device; and storing a first indication that the user has seen the first stimulus.
  • the first feedback data includes eye tracking data
  • detecting the eye and storing the first indication are based on the eye tracking data
  • performing the calibration of the user device based on the first feedback data indicating that the eye of the user failed to fixate on the first location comprises determining that the eye of the user fixated on a different location that is within a threshold distance away from the first location.
  • the calibration of the user device comprises adjusting one or more parameters of a function for detecting a location that the user is viewing.
  • the user device comprises a head-mounted display.
  • a system comprising: inward directed sensors, located at a head-mounted display, configured to track movement of one or more eyes; storage circuitry configured to store data for generating stimuli that are presented at respective locations of the head-mounted display; and control circuitry configured to perform operations comprising those of any of embodiments 1-11.
  • a tangible, non-transitory, machine-readable medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising those of any of embodiments 1-11.
  • a system comprising: one or more processors; and memory storing instructions that, when executed by the processors, cause the processors to effectuate operations comprising those of any of embodiments 1-11.
  • a system comprising means for performing operations of any of embodiments 1-11.
  • One or more aspects or features of the subject matter described herein can be realized in digital electronic circuitry, integrated circuitry, specially designed application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), computer hardware, firmware, software, and/or combinations thereof.
  • These various aspects or features can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which can be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device.
  • the programmable system or computing system may include clients and servers.
  • a client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
  • the term “machine-readable signal” refers to any signal used to provide machine instructions and/or data to a programmable processor.
  • the machine-readable medium can store such machine instructions non-transitorily, such as for example as would a nontransient solid-state memory or a magnetic hard drive or any equivalent storage medium.
  • the machine-readable medium can alternatively or additionally store such machine instructions in a transient manner, such as for example as would a processor cache or other random access memory associated with one or more physical processor cores.
  • a computer having a display device, such as for example a cathode ray tube (CRT) or a liquid crystal display (LCD) or a light emitting diode (LED) monitor for displaying information to the user and a keyboard and a pointing device, such as for example a mouse or a trackball, by which the user may provide input to the computer.
  • feedback provided to the user can be any form of sensory feedback, such as for example visual feedback, auditory feedback, or tactile feedback; and input from the user may be received in any form, including, but not limited to, acoustic, speech, or tactile input.
  • Other possible input devices include, but are not limited to, touch screens or other touch-sensitive devices such as single or multi-point resistive or capacitive trackpads, voice recognition hardware and software, optical scanners, optical pointers, digital image capture devices and associated interpretation software, and the like.
  • phrases such as “at least one of” or “one or more of” may occur followed by a conjunctive list of elements or features.
  • the term “and/or” may also occur in a list of two or more elements or features. Unless otherwise implicitly or explicitly contradicted by the context in which it is used, such a phrase is intended to mean any of the listed elements or features individually or any of the recited elements or features in combination with any of the other recited elements or features.
  • the phrases “at least one of A and B;” “one or more of A and B;” and “A and/or B” are each intended to mean “A alone, B alone, or A and B together.”
  • a similar interpretation is also intended for lists including three or more items.
  • the phrases “at least one of A, B, and C;” “one or more of A, B, and C;” and “A, B, and/or C” are each intended to mean “A alone, B alone, C alone, A and B together, A and C together, B and C together, or A and B and C together.”
  • Use of the term “based on,” above and in the claims is intended to mean, “based at least in part on,” such that an unrecited feature or element is also permissible.

Abstract

Some systems and methods disclosed herein facilitate calibration of a head-mounted display while a visual test is performed. One mechanism of facilitating calibration involves detecting, as a visual test is being performed, that the user's eyes moved in the direction of a displayed stimulus but have not stopped at the point of where the stimulus is displayed, and instead stopped at a different point (e.g., a threshold distance away from the stimulus). Based on that detection, the system may determine that the user has seen the stimulus and that calibration of the sensors is needed. The system may then record that the user has seen the stimulus and perform sensor calibration before displaying the next stimulus.

Description

SYSTEMS AND METHODS FOR
VISUAL FIELD TESTING IN HEAD-MOUNTED DISPLAYS
[0001] This application claims the benefit of priority of U.S. Patent Application No. 17/392,664, filed August 3, 2021, entitled “Device Calibration via a Projective Transform Matrix,” which is a continuation-in-part of U.S. Patent Application No. 17/246,054, filed April 30, 2021, entitled “Systems And Methods For Visual Field Testing In Head-Mounted Displays,” which is a continuation of U.S. Patent Application No. 17/082,983, filed October 28, 2020, entitled “Systems And Methods For Visual Field Testing In Head-Mounted Displays,” each of which is hereby incorporated by reference herein in its entirety.
[0002] This application also claims the benefit of priority of U.S. Patent Application No. 17/392,723, filed August 3, 2021, entitled “Active Calibration of Head-Mounted Displays,” which is a continuation-in-part of U.S. Patent Application No. 17/246,054, filed April 30, 2021, entitled “Systems And Methods For Visual Field Testing In Head-Mounted Displays,” which is a continuation of U.S. Patent Application No. 17/082,983, filed October 28, 2020, entitled “Systems And Methods For Visual Field Testing In Head-Mounted Displays,” each of which is hereby incorporated by reference herein in its entirety. This application also claims priority to at least each of the foregoing applications filed within twelve (12) months of the filing of this application.
BACKGROUND
[0003] Diagnosis of visual defects, such as blind spots, can be determined with conventional testing machines, such as a Humphry visual field analyzer. A patient is placed at the center of a curved portion of the analyzer and tests are performed by displaying images on the curved portion to determine where the blind spots are located in the patient’s visual field. However, Humphry visual field analyzers, as well as other testing machinery, are both expensive for wide distribution and require specialized personnel for operating the machinery.
SUMMARY
[0004] Accordingly, systems and methods are disclosed herein for the use of heads up display devices and/or head-mounted display devices for visual field testing. For example, using these devices for visual field testing lowers the costs related to performing visual field testing and improves accessibility to visual field testing for a wider patient base. However, the adaption of visual field testing to these displays is not without its technical hurdles.
[0005] As a threshold technical problem, the introduction of visual field testing into head-mounted display devices must account for the effects of cyclotorsion, or more accurately the lack thereof. Cyclotorsion is the rotation of one eye around its visual axis. This rotation of the eye is what allows the visual field of a user to remain “right-side-up” even when the user tilts his or her head to one side or the other. However, as heads-up displays are fixed to the head of a user, cyclotorsion does not occur in the head-mounted display environment. That is, if a user tilts his or her head to one side or the other, the visual field of the user tilts accordingly. Thus, the effects of cyclotorsion present a threshold technical problem to overcome when introducing visual field testing into head-mounted display devices.
[0006] As described herein, one solution to overcoming the technical problem caused by the differing effects of cyclotorsion in the head-mounted display environment is to prevent a user from tilting his or her head. However, conventional optometry tools for preventing a user from tilting his or her head, such as chin rests or other structures built into optometry equipment, are ill-suited for a head-mounted display environment. First, a requirement for a specialized structure or modifications to head-mounted display devices negatively impacts the accessibility of the devices as well as their ease of use. Second, specialized structures such as chin rests do not prevent any tilting effects caused by the head-mounted display devices being improperly worn and/or worn in a manner that introduces a slight tilt.
[0007] Accordingly, the systems and methods disclosed herein may use specialized software and/or hardware elements implemented in the head-mounted display devices to detect a tilting head of a user. For example, the head-mounted display device may include specialized sensors and/or software used to interpret sensor data for the head-mounted display device. The systems and methods may further generate alerts to a user based on detected head tilting and/or recommendations for corrections of any head tilting. These alerts and recommendations may further be presented on the head-mounted display to minimize the impact of head tilts during visual field testing.
[0008] As a supplementary technical problem, even when the differing effects of cyclotorsion in the head-mounted display environment have been addressed, the adaption of visual field testing to head-mounted displays presents a secondary problem. Namely, visual field testing such as that performed by Humphry visual field analyzers is done by generating a series of white light stimuli of varying intensities (brightness), throughout a uniformly illuminated bowl. This illuminated bowl, or more precisely the illumination on a curved surface, provides for standardized measurements of vision from a center of fixation in terms of degrees. However, head-mounted display devices do not provide for surfaces with a uniform curvature. Instead, images in head-mounted display devices are generated on flat surfaces and/or surfaces with non-uniform curvature. Accordingly, light stimuli appearing on a head-mounted display must account for these issues.
[0009] Methods, systems, and computer program products for improving accuracy of visual field testing in head-mounted displays are disclosed. In one aspect, a method can include retrieving a visual field testing pattern for a head-mounted display, wherein the visual field testing pattern comprises icons that are displayed at respective locations in a visual field of the head-mounted display. The method can also include generating for display the visual field testing pattern on the head-mounted display; retrieving data from a tilt sensor, located at the head-mounted display, for detecting degrees of head tilt of a user wearing the head-mounted display; determining, based on the data retrieved from the tilt sensor, a degree of head tilt of the user; comparing the degree of head tilt of the user to a first threshold degree; and in response to the degree of head tilt of the user meeting or exceeding the first threshold degree, generating for display, on the head-mounted display, a recommendation to the user.
[0010] Another technical problem in conventional head-mounted displays is that calibrating a head-mounted display needs to compensate for unknown sources of error that may affect assessment of the calibration. For example, eye tracking data received from a device that needs sensor calibration may indicate that the user’s eyes are focused at a wrong location, and that data may then be interpreted as a visual defect rather than a calibration issue. For example, as the visual test is progressing, the user may shift, causing the sensors connected to the head-mounted display to lose alignment; thus, those sensors may need calibration to accurately perform eye tracking and detection of the user’s gaze. As the sensors get more and more out of alignment as the user shifts, the results of the visual test may be inaccurate because eye tracking may become inaccurate. This may result in the system incorrectly determining that the user has one or more visual defects.
[0011] To address the above technical problems, some systems and methods disclosed herein facilitate calibration of a head-mounted display while a visual test is performed. One mechanism that facilitates calibration involves detecting, as a visual test is being performed, that the user’s eyes moved in the direction of a displayed stimulus but have not stopped at the point of where the stimulus is displayed, and instead stopped at a different point (e.g., a threshold distance away from the stimulus). Based on that detection, the system may determine that the user has seen the stimulus and that calibration of the sensors is needed. The system may then record that the user has seen the stimulus and perform sensor calibration before displaying the next stimulus.
[0012] The following operations may be performed to implement this mechanism. The system may cause, during a visual field test, a first stimulus to be presented on a user device at a first visual field location. For example, as described above, the visual test may be a test to determine whether the user has any visual defects. The user device may include a head-mounted display that displays the first stimulus (e.g., a visual indicator at some location on the display). Thus, as the visual field test begins, the system may start displaying stimuli to the user to assess whether the user has any visual defects.
[0013] The system may obtain, during the visual field test, first feedback data related to the first stimulus, the first feedback data indicating that the user has seen the first stimulus. For example, the user device (e.g., a device including a head-mounted display) may include one or more eye tracking sensors that are enabled to transmit eye tracking data to the user device for processing. Thus, as the system displays visual stimuli, the eye tracking sensors may perform gaze detection and send that information to be processed by the user device. The feedback data may include, for example, coordinates on the head-mounted display at which the user’s gaze was detected.
[0014] The system may then detect, based on the first feedback data, an eye of the user failing to fixate on the first visual field location corresponding to the first stimulus. For example, the first feedback data may indicate that the user’s eyes have moved towards the first stimulus but stopped short of gazing at the first stimulus or moved past the stimulus. This data may indicate that the user saw the stimulus (e.g., based on the user moving his eyes towards the stimulus) and that the eye tracking sensors need calibration (e.g., based on the gaze not being aligned with the visual field location of the stimulus itself). Based on detecting the eye of the user failing to fixate on the first visual field location the system may perform a calibration of the user device. For example, the system may adjust gaze location calculation as determined from the raw eye tracking data. Furthermore, the system may store a first indication that the user has seen the first stimulus. This process may continue with every stimulus displayed on the head-mounted display. [0015] As another/second stimulus is displayed at a second field test location, the system may determine that the user’s gaze location aligns with the second field test location, thus no calibration is required at that iteration of the test. When the visual test is finished, the system may generate a visual defect assessment based on the results of the test (e.g., based on the number and locations of stimuli that the user was able to see).
[0016] Another technical problem in conventional head-mounted displays is that calibrating a head-mounted display needs to compensate for unknown sources of error that may affect assessment of the calibration. For example, eye tracking data received during calibration would be affected if a head-mounted display was not being worn properly. This error (e.g., when the head-mounted display is used for an eye examination) could then be interpreted as a visual defect rather than knowing it was caused by improper wearing during calibration.
[0017] To address the above technical problems, the instant application discloses systems and methods that facilitate calibration of a head-mounted display. For example, the system may generate calibration patterns for the user to view while the system tracks the eye movement of the user to determine what they are seeing. The analysis of the eye tracking data can generate a calibration for the head-mounted display as well as a “calibration score” representing the accuracy of the calibration. In this way, an optical practitioner will have both the best possible calibration of a head-mounted display that may be used for eye examinations as well as a calibration score that may be indicative of the accuracy of an eye test performed with the calibrated head-mounted display.
[0018] Accordingly, methods, systems, and computer program products are disclosed for calibrating head-mounted displays. One method for calibrating a head-mounted display includes receiving edge eye tracking data during edge calibration periods; calculating a projective transform matrix based on the edge eye tracking data; receiving center eye tracking data during a center calibration period; applying the projective transform matrix to the center eye tracking data to determine a gaze location; and generating a calibration score based on a difference between a center location and the gaze location.
[0019] The details of one or more variations of the subject matter described herein are set forth in the accompanying drawings and the description below. Other features and advantages of the subject matter described herein will be apparent from the description and drawings, and from the claims. While certain features of the currently disclosed subject matter are described for illustrative purposes in relation to particular implementations, it should be readily understood that such features are not intended to be limiting. The claims that follow this disclosure are intended to define the scope of the protected subject matter.
BRIEF DESCRIPTION OF THE DRAWINGS
[0020] The accompanying drawings, which are incorporated in and constitute a part of this specification, show certain aspects of the subject matter disclosed herein and, together with the description, help explain some of the principles associated with the disclosed implementations. In the drawings,
[0021] Figure 1A illustrates an example head-mounted display forming a wearable device for a subject in accordance with certain aspects of the present disclosure,
[0022] Figure 1B illustrates a front view of the head-mounted display in accordance with certain aspects of the present disclosure,
[0023] Figure 1C is an image of an example constructed head-mounted display in accordance with certain aspects of the present disclosure,
[0024] Figures 1D-1E illustrate another example embodiment of a head-mounted display, in accordance with certain aspects of the present disclosure,
[0025] Figure 2 is a diagram illustrating correction of a visual field testing pattern by detecting and correcting for head tilt in accordance with certain aspects of the present disclosure,
[0026] Figure 3 is a diagram illustrating an exemplary method of accurately replicating a visual field testing pattern from a curved surface on a flat surface in accordance with certain aspects of the present disclosure,
[0027] Figure 4 is an illustrative system diagram for visual field testing using a head-mounted display in accordance with certain aspects of the present disclosure,
[0028] Figure 5 is a process flow diagram for correction of a visual field testing pattern by detecting and correcting for head tilt in accordance with certain aspects of the present disclosure, [0029] Figure 6 is a process flow diagram for accurately replicating a visual field testing pattern from a curved surface on a flat surface in accordance with certain aspects of the present disclosure, [0030] Figure 7 illustrates a system for calibrating one or more sensors or other device components, in accordance with certain aspects of the present disclosure, [0031] Figure 8 illustrates a data structure for determining whether calibration is needed, in accordance with certain aspects of the present disclosure, and
[0032] Figure 9 is a process flow diagram for calibrating a head-mounted display or other user device, in accordance with certain aspects of the present disclosure.
[0033] Figure 10 is a diagram illustrating an exemplary relationship between a virtual plane and a display plane as used to calibrate a head-mounted display, in accordance with certain aspects of the present disclosure,
[0034] Figure 11 is a diagram illustrating an exemplary central point and boundary as used to generate a calibration score, in accordance with certain aspects of the present disclosure,
[0035] Figure 12 is a process flow diagram for calibrating a head-mounted display, in accordance with certain aspects of the present disclosure, and
[0036] Figure 13 is illustrative pseudocode for calibrating a head-mounted display in accordance with certain aspects of the present disclosure.
DETAILED DESCRIPTION
[0037] The instant application describes systems and methods that facilitate performing visual field testing, particularly utilizing worn goggles that provide testing patterns. One problem confronting optical care practitioners is the effect of a patient tilting their head during eye examinations. If the head is tilted, this causes cyclotorsion, which is the rotation of one eye around its visual axis. Uncorrected, this can introduce error in an eye examination and misdiagnosis of optical issues. In the art, a conventional diagnostic device used for testing is a Humphry visual field analyzer “Humphry analyzer.” Use of the Humphry analyzer includes a patient placing their head at the center of a semispherical region with testing patterns projected at varying locations of the semispherical region. With the development of Augmented Reality (AR) and Virtual Reality (VR) goggles, similar testing can be performed by projection of testing patterns upon the viewing surfaces of such goggles. As referred to herein, embodiments may use a heads up display device or a head-mounted display device. For example, a head-mounted display is a display device, worn on the head or as part of a helmet, that may have a small display optic in front of one (monocular HMD) or each eye (binocular HMD).
[0038] One technical problem is the occurrence of cyclotorsion in patients being tested using such goggles because while the goggles naturally provide compensation for head tilt, this only works if the goggles are worn properly (i.e., not tilted on the user’s head). To address this problem, the instant application describes systems and methods for detection and correction of goggle tilt relative to the user’s head. Another technical problem is the display of accurate testing patterns using the goggles, which have a flat viewing surface as compared to a Humphry analyzer, which has a curved viewing surface. To address this additional technical problem, methods are disclosed for generation of testing patterns in goggles that are equivalent to those generated in a Humphry analyzer.
[0039] FIG. 1A illustrates an example head-mounted display 100 (e.g., goggles) forming a wearable device for a subject. In some embodiments, the head-mounted display 100 may be a part of a visioning system as described herein or in U.S. Patent Application No. 17/083,043, entitled “Vision Testing via Prediction-Based Setting of an Initial Stimuli Characteristic for a User Interface Location” and filed October 28, 2020, the contents of which are hereby incorporated by reference in its entirety. The head-mounted display 100 includes a left eyepiece 102 and a right eyepiece 104. Each eyepiece 102 and 104 may contain and/or associate with a digital monitor configured to display (or project) recreated images to a respective eye of the subject. In various embodiments, digital monitors may include a display screen, projectors, and/or hardware to generate the image display on the display screen. It will be appreciated that digital monitors comprising projectors may be positioned at other locations to project images onto an eye of the subject or onto an eyepiece comprising a screen, glass, or other surface onto which images may be projected. In one embodiment, the left eye piece 102 and right eyepiece 104 may be positioned with respect to the housing 106 to fit an orbital area on the subject such that each eyepiece 102, 104 is able to collect data and display/project image data, which in a further example includes displaying/projecting image data to a different eye.
[0040] In some embodiments, each eyepiece 102, 104 may further include one or more inward directed sensors 108, 110, which may include infrared cameras, photodetectors, or other infrared sensors, configured to track pupil movement and to determine and track visual axes of the subject. The inward directed sensors 108, 110, e.g., comprising infrared cameras, may be located in lower portions relative to the eyepieces 102, 104, so as to not block the visual field of the subject, neither their real visual field nor a visual field displayed or projected to the subject. The inward directed sensors 108, 110 may be directionally aligned to point toward a presumed pupil region for better pupil and/or line of sight tracking. In some examples, the inward directed sensors 108, 110 may be embedded within the eyepieces 102, 104 to provide a continuous interior surface. In some embodiments, head-mounted display 100 can include tilt sensor(s) 128 that can provide data on the degree of head tilt to a connected computing system. As described further herein, the tilt sensors can be gyroscopes, water-based sensors, etc.
[0041] FIG. 1B illustrates a front view of the head-mounted display 100, showing the front view of the eyepieces 102, 104, where respective outward directed image sensors 112, 114 comprising field of vision cameras are positioned. In other embodiments, fewer or additional outward directed image sensors 112, 114 may be provided. The outward directed image sensors 112, 114 may be configured to capture continuous images.
[0042] FIG. 1C is an image of an example constructed head-mounted display 100 comprising eyepieces 102, 104 including two digital monitors, with focusing lens 116, 118. In this example, only one inward directed optical sensor 110 is included for pupil and line of sight tracking, however, in other examples, multiple inward directed optical sensors 110 may be provided.
[0043] With respect to FIGS. 1D-1E, an alternative embodiment of head-mounted display 170 can include, in any combination, a high-resolution camera (or cameras) 102, a power unit 193, a processing unit 194, a glass screen 195, a see-through display 196 (e.g., a transparent display), an eye tracking system 197, tilt sensor(s) 198 (similar to tilt sensors 128), and other components.
[0044] In some examples, external sensors may be used to provide further data for assessing visual field of the subject. For example, data used to correct the captured image may be obtained from external testing devices, such as visual field testing devices, aberrometers, electro-oculograms, or visual evoked potential devices. Data obtained from those devices may be combined with pupil or line of sight tracking for visual axis determinations to create one or more modification profiles used to modify the images being projected or displayed to a user (e.g., correction profiles, enhancement profiles, etc., used to correct or enhance such images).
[0045] As used herein, when referring to the “head-mounted display,” even where reference is made to the first embodiment (100), it is understood that the disclosed methods and operations apply to either head-mounted display 100 or 170, unless specifically stated otherwise. It should be noted that, although some embodiments are described herein with respect to calibration of head-mounted displays, such techniques may be applied for calibration of one or more other user devices in other embodiments. [0046] The head-mounted display 100 may be communicatively coupled with one or more imaging processors through wired or wireless communications, such as through a wireless transceiver embedded within the head-mounted display 100. An external imaging processor may include a computer such as a laptop computer, tablet, mobile phone, network server, or other computer processing devices, centralized or distributed, and may be characterized by one or more processors and one or more memories. In the discussed example, the captured images are processed in this external image processing device; however, in other examples, the captured images may be processed by an imaging processor embedded within the digital spectacles. The processed images (e.g., enhanced to improve functional visual field or other vision aspects and/or enhanced to correct for the visual field pathologies of the subject) are then transmitted to the head-mounted display 100 and displayed by the monitors for viewing by the subject.
[0047] The head-mounted display can be used to perform visual assessments to identify ocular pathologies, such as high and/or low order aberrations; pathologies of the optic nerve such as glaucoma, optic neuritis, and optic neuropathies; pathologies of the retina such as macular degeneration and retinitis pigmentosa; pathologies of the visual pathway such as microvascular strokes and tumors; and other conditions such as presbyopia, strabismus, high and low optical aberrations, monocular vision, anisometropia and aniseikonia, light sensitivity, anisocoria, refractive errors, and astigmatism.
[0048] In some examples, external sensors may be used to provide further data for assessing the visual field of the subject. For example, data used to correct the captured image may be obtained from external testing devices such as visual field testing devices, aberrometers, electro-oculograms, or visual evoked potential devices. Data obtained from those devices may be combined with pupil or line of sight tracking for visual axis determinations to create the corrective profile used to correct the images being projected or displayed to the viewer.
[0049] The head-mounted display 100 may be communicatively coupled with one or more imaging processors through wired or wireless communications, such as through a wireless transceiver embedded within the head-mounted display 100. An external imaging processor may include a computer such as a laptop computer, tablet, mobile phone, network server, or other computer processing devices, centralized or distributed, and may be characterized by one or more processors and one or more memories. [0050] In an example operation of a vision system including the head-mounted display, real-time image processing of captured images may be executed by an imaging processor, e.g., using a custom-built MATLAB (MathWorks, Natick, MA) code, that runs on a miniature computer embedded in the head-mounted display. In other examples, the code may be run on an external image processing device or other computer wirelessly networked to communicate with the head-mounted display.
[0051] FIG. 2 is a diagram illustrating correction of a visual field testing pattern by detecting and correcting for head tilt. As used herein, the term “head tilt” refers to the angle between an axis of the head-mounted display and an axis of the user’s head. For example, such an angle may be zero degrees when the head-mounted display 100 is worn correctly on the user. In an embodiment, a system for improving accuracy of visual field testing in head-mounted displays can include, in addition to the head-mounted display, a tilt sensor for detecting degrees of head tilt of a user wearing the head-mounted display. In some cases, the tilt sensor can be located at the head-mounted display 100, though tilt sensors at other locations (e.g., external ones such as cameras that view the user and the head-mounted display 100) are contemplated. In some embodiments, the tilt sensor can be a water-based tilt sensor, similar to a level. In other embodiments, the tilt sensor can incorporate a gyro sensor or other types of rotation sensing hardware.
[0052] In the head-mounted display 100, or on an external computer, storage circuitry can be configured to store and/or retrieve a visual field testing pattern having stimuli (e.g., lights, patterns, icons, animations, etc.) that can be displayed at respective locations in the visual field of the head-mounted display. There can also be control circuitry configured to generate for display the visual field testing pattern on the head-mounted display. Examples of a visual field testing pattern are shown in FIG. 2, with a fixation point 210 (typically near the center of the field of view) and a stimulus 220 that represents a displayed stimulus for determining the location of a blind spot. The top panel in FIG. 2 shows an example location of a blind spot (coincident with stimulus 220), e.g., as determined by the user being unable to see a stimulus displayed at that location. The middle panel illustrates the effect of head tilt. Here, the head tilt causes the stimulus 220 to be displayed at a different location in the user’s vision, outside of the blind spot 230. As a result, the blind spot may not be identified by the user, possibly causing a misdiagnosis.
[0053] To address such issues, the system can determine, based on data retrieved from the tilt sensor, a degree of head tilt of the user. The degree of head tilt can be determined, for example in the case of a water-based tilt sensor, from the orientation of the water surface, which indicates the degree of tilt. One embodiment can include imaging a water surface with miniaturized cameras to capture the water surface relative to indicia that shows an un-tilted orientation. The angle between the water surface and the indicia would then be the degree of head tilt. Another embodiment can include obtaining data from a plurality of water sensors (e.g., galvanic sensors) that are covered or exposed by water depending on the degree of tilt. The particular sensors detecting water can then be used, such as via a lookup table, to determine the degree of head tilt. In some other embodiments, the degree of head tilt can be determined from data received from a gyroscope. The degree of head tilt of the user can be compared to a first threshold degree, such as 1, 2, 5, or 10 degrees, or any threshold as desired. The comparison itself can include one or more processors receiving the calculated degree of head tilt and performing a numerical comparison to the first threshold degree. In response to the degree of head tilt of the user meeting or exceeding the first threshold degree, the system can generate for display, on the head-mounted display, a recommendation to the user for reducing the head tilt. Such a recommendation can include a visual indication (e.g., red or green lights, a textual indication, etc.) that the head-mounted display 100 needs to be adjusted to remove the head tilt. The recommendation can include a display of the degree of head tilt in, for example, a graphical format (e.g., depicting an angle) or textual format (e.g., the numerical value of the angle). After adjustment of the head-mounted display 100, testing can take place as shown in the bottom panel of FIG. 2, showing that the stimulus remains at the proper location for detecting the blind spot in the user’s field of vision.
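By way of a non-limiting illustration, the threshold comparison and recommendation logic described in this paragraph might be sketched as follows; the function name, threshold value, and message text are assumptions for illustration and not part of the disclosure.

```python
# Minimal sketch, assuming a tilt value in degrees is already available from the tilt sensor.
FIRST_THRESHOLD_DEG = 5.0  # the disclosure contemplates 1, 2, 5, or 10 degrees, or any desired threshold


def head_tilt_recommendation(tilt_deg: float, threshold_deg: float = FIRST_THRESHOLD_DEG):
    """Return a user-facing recommendation when the measured head tilt meets or exceeds the threshold."""
    if abs(tilt_deg) >= threshold_deg:
        # The recommendation may be graphical (e.g., a depicted angle) and/or textual; a string is used here.
        return f"Head tilt of {abs(tilt_deg):.1f} degrees detected; please adjust the headset until it is level."
    return None  # tilt is below the first threshold, so no recommendation is needed
```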
[0054] In other embodiments, the system can automatically perform some corrections, e.g., if the tilt is relatively small. Here, the control circuitry can be further configured to compare the degree of head tilt of the user to a second threshold degree (e.g., 0.1, 0.5, 1, 2 degrees, etc.) that is generally smaller than the first threshold degree. Such a second threshold degree can be reflective of asymmetry in a user’s face that prevents perfect alignment, defects in the head-mounted display 100 construction, small incidental tilts occurring during measurements, etc. The comparison of the degree of head tilt to the second threshold degree can be performed in a manner similar to that described for the first threshold degree. In response to the degree of head tilt of the user meeting or exceeding the second threshold degree, the system can automatically adjust a respective location of the stimulus in the visual field of the head-mounted display by a first amount. For example, if a 0.1 degree tilt is detected, the system can automatically adjust the display location of the icon to compensate by changing the coordinates for display of the stimulus to reflect the detected tilt. In this way, the first amount can be based on a distance of the stimulus from a centerpoint 240 of the visual field of the head-mounted display and a direction of the head tilt of the user. In some embodiments, centerpoint 240 may correspond to a geometric center of the face of the head-mounted display 100 and/or a center of fixation of the user. For example, in some embodiments, different head-mounted displays may have different centerpoints. Accordingly, the system may determine the centerpoint of a head-mounted display and select respective locations of displayed icons based on the offset distance. For example, the system may determine a centerpoint of the head-mounted display based on receiving data from one or more sensors. Additionally or alternatively, the system may receive settings based on an initial calibration (e.g., an automatic calibration or a manual calibration) when the system is activated. Additionally or alternatively, the system may input a model or serial number (or other identifier) for the head-mounted display into a look-up table listing centerpoints for the model or serial number.
[0055] As shown in FIG. 2, centerpoint 240 can correspond to the center of the fixation point 210 and the direction 250 of the head tilt can be some angle (e.g., 10 degrees clockwise, 15 degrees counterclockwise, etc.). Such a formulation permits a representation of the location of the icon relative to the center point (e.g., r = R cos θ x̂ + R sin θ ŷ), where r is the vector from the center point to the icon having scalar distance R, which is unchanged regardless of head tilt. The terms R cos θ and R sin θ are directional components (e.g., x/y, horizontal/vertical) of the vector r as a function of the head tilt angle θ. Thus, in an embodiment, the respective location of the icon can be defined by a first directional component (e.g., a horizontal component) and a second directional component (e.g., a vertical component). As shown in the bottom portion of FIG. 2, correction can include adjusting the first directional component by a cosine of the degree of head tilt of the user and the second directional component by a sine of the degree of head tilt. For example, the system can determine the difference between the location of the icon before and after head tilt. This difference (for each directional component) can then be the amount (e.g., in pixels, cm, etc.) by which the respective location of the icon can be adjusted.
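The directional-component adjustment described above can be illustrated with a short sketch. The code below is a simplified, assumed implementation: it treats the compensation as a rotation of the stimulus location about the centerpoint by the measured head tilt, and the sign convention depends on the tilt direction.

```python
import math


def compensated_stimulus_location(R: float, theta_deg: float, tilt_deg: float):
    """Return (x, y) display coordinates for a stimulus at scalar distance R and intended
    angle theta from the centerpoint, compensated for a measured head tilt (degrees)."""
    theta = math.radians(theta_deg)
    tilt = math.radians(tilt_deg)
    # Nominal location: r = R cos(theta) x + R sin(theta) y.
    # Rotating by the negative of the head tilt keeps the stimulus at the intended
    # position relative to the (tilted) eye; R is unchanged by the tilt.
    return R * math.cos(theta - tilt), R * math.sin(theta - tilt)


# The per-component adjustment is the difference between the nominal and compensated locations.
x0, y0 = compensated_stimulus_location(100.0, 30.0, 0.0)   # no tilt
x1, y1 = compensated_stimulus_location(100.0, 30.0, 10.0)  # 10-degree tilt
dx, dy = x1 - x0, y1 - y0
```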
[0056] FIG. 3 illustrates a simplified diagram depicting an exemplary method of accurately replicating a visual field testing pattern from a curved surface on a flat surface. Determining an angle of a visual defect can be important in diagnosing and treating it. The Humphry analyzer, with its semispherical testing region 310, as depicted in FIG. 3 (top) can provide a visual field testing pattern in the form of visual elements 320 at angles of constant separation (e.g., 10, 20, 30, 40, etc. degrees). However, the head-mounted display 100 can have a flat surface 330 (shown simplified and greatly enlarged, for illustrative purposes). If stimuli 340 are displayed at equidistant locations as shown, they will not conform to the constant angular separation as described above with the Humphry analyzer, and thus not characterize the user’s vision accurately. Accordingly, the system may compensate for this difference.
[0057] Another consideration is that the offset distance (dimension b in the bottom of FIG. 3) between the flat surface where stimuli are displayed and the eye of the user can vary, based on the particular construction of the head-mounted display 100, a user’s facial structure, etc. This offset distance can in turn affect where the stimuli 340 need to be displayed.
[0058] As shown in FIG. 3 (bottom), the disclosed methods allow respective locations of the stimuli 340 to be located in a row on the visual field and correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface. This is depicted in FIG. 3 as can be seen by the stimuli 340, at their respective locations, being intersected by radial lines from visual elements 320. The respective locations can be determined based on an offset distance of the head-mounted display and an angle to respective points on the visual testing machine. The angle can be that referred to above (e.g., 10, 20, 30 degrees, etc.). Given the angle and the offset distance, the respective location corresponding to it is shown by dimension a, which is the distance from the center (e.g., 0 degrees) to the respective location on flat surface 330. As one example, the respective locations can be determined based on the expression in Equation 1,

a = b · tan(θ)     (Equation 1)

where a is one of the respective locations, b is the offset distance, and θ is the angle.
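A brief sketch of Equation 1 as reconstructed above (a = b·tan(θ)) follows; the numeric offset distance is an arbitrary example rather than a value from the disclosure.

```python
import math


def flat_surface_position(offset_b: float, angle_deg: float) -> float:
    """Distance a from the center of the flat surface at which a stimulus should be drawn
    so that it subtends the intended viewing angle theta (Equation 1: a = b * tan(theta))."""
    return offset_b * math.tan(math.radians(angle_deg))


# Example: with an assumed 40 mm offset, stimuli intended at 10, 20, and 30 degrees land at
# roughly 7.1, 14.6, and 23.1 mm from the center, i.e., not equidistant as in FIG. 3 (top).
positions = [flat_surface_position(40.0, angle) for angle in (10, 20, 30)]
```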
[0059] While several simplifying assumptions have been taken for the purpose of explanation, it is understood that a person of skill would be able to incorporate variations in accordance with the present disclosure, for example, accounting for the fact that each eye is off center (as opposed to the single viewing point assumed in FIG. 3), that the flat surface may indeed not be perfectly flat, but may contain some slight curvature (e.g., as depicted in FIGS. 1A and 1B), etc. Thus, as used herein, a “flat” surface is assumed to be the special case of a curved surface having an infinite radius of curvature. As described in some portions herein, the head-mounted display can have a finite radius of curvature and thus be “curved” in the traditional sense. [0060] In some implementations, the curvature of the head-mounted display can be determined, and the respective locations selected, based on the curvature. The curvature can be known or accessed based on data for a known model of head-mounted display. Such curvature values can be stored for retrieval or accessed via a network connection. The exact relation of how the presence of curvature affects the shifting of the respective location is a function of the geometry of the system. Thus, the disclosed methods contemplate a coordinate transformation from the intended angle θ to, for example, an analogous angle φ that represents the angle along the curved surface of the head-mounted display which would appear to the user to be at the intended angle.
[0061] Also, while the present disclosure has described visual field testing patterns generally located on a horizontal “row,” it is contemplated that the disclosure applies to patterns that may be at an angle, vertical, or anywhere in a 2D plane. Similarly, such features can be extended to 3D visualizations, such as by altering the placement (and optionally size) of the stimuli to give a depth effect, similar to a heads-up-display.
[0062] FIG. 4 is an illustrative system diagram for visual field testing using a head-mounted display, in accordance with one or more embodiments. For example, system 400 may represent the components used to power the head-mounted displays of FIGS. 1A-1C and perform the processes described in FIGS. 5-6. As shown in FIG. 4, system 400 may include heads up display device 422 and user terminal 424. For example, heads up display device 422 may be worn by a user, while progress of the user may be monitored via user terminal 424. It should be noted that heads up display device 422 and user terminal 424 may be any computing device, including, but not limited to, a laptop computer, a tablet computer, a hand-held computer, other computer equipment (e.g., a server), including “smart,” wireless, wearable, and/or mobile devices. FIG. 4 may also include additional components such as cloud components 410. Cloud components 410 may alternatively be any computing device as described above and may include any type of mobile terminal, fixed terminal, or other device. For example, cloud components 410 may be implemented as a cloud computing system and may feature one or more component devices. It should also be noted that system 400 is not limited to three devices. Users may, for instance, utilize one or more devices to interact with one another, one or more servers, or other components of system 400. It should be noted that, while one or more operations are described herein as being performed by particular components of system 400, those operations may, in some embodiments, be performed by other components of system 400. As an example, while one or more operations are described herein as being performed by components of heads up display device 422, those operations may, in some embodiments, be performed by components of cloud components 410. In some embodiments, the various computers and systems described herein may include one or more computing devices that are programmed to perform the described functions. Additionally, or alternatively, multiple users may interact with system 400 and/or one or more components of system 400. For example, in one embodiment, a first user and a second user may interact with system 400 using two different components.
[0063] With respect to the components of head-mounted display device 422, user terminal 424, and cloud components 410, each of these devices may receive content and data via input/output (hereinafter “I/O”) paths. Each of these devices may also include processors and/or control circuitry to send and receive commands, requests, and other suitable data using the I/O paths. The control circuitry may comprise any suitable processing, storage, and/or input/output circuitry. Each of these devices may also include a user input interface and/or user output interface (e.g., a display) for use in receiving and displaying data. For example, as shown in FIG. 4, both head-mounted display device 422 and user terminal 424 include a display upon which to display data (e.g., a visual field test pattern).
[0064] It should be noted that in some embodiments, the devices may have neither user input interface nor displays and may instead receive and display content using another device (e.g., a dedicated display device such as a computer screen and/or a dedicated input device such as a remote control, mouse, voice input, etc.). Additionally, the devices in system 400 may run an application (or another suitable program). The application may cause the processors and/or control circuitry to perform operations related to visual field testing.
[0065] Each of these devices may also include electronic storages. The electronic storages may include non-transitory storage media that electronically stores information. The electronic storage media of the electronic storages may include one or both of (i) system storage that is provided integrally (e.g., substantially non-removable) with servers or client devices, or (ii) removable storage that is removably connectable to the servers or client devices via, for example, a port (e.g., a USB port, a firewire port, etc.) or a drive (e.g., a disk drive, etc.). The electronic storages may include one or more of optically readable storage media (e.g., optical disks, etc.), magnetically readable storage media (e.g., magnetic tape, magnetic hard drive, floppy drive, etc.), electrical charge-based storage media (e.g., EEPROM, RAM, etc.), solid-state storage media (e.g., flash drive, etc.), and/or other electronically readable storage media. The electronic storages may include one or more virtual storage resources (e.g., cloud storage, a virtual private network, and/or other virtual storage resources). The electronic storages may store software algorithms, information determined by the processors, information obtained from servers, information obtained from client devices, or other information that enables the functionality as described herein.
[0066] FIG. 4 also includes communication paths 428, 430, and 432. Communication paths 428, 430, and 432 may include the Internet, a mobile phone network, a mobile voice or data network (e.g., a 5G or LTE network), a cable network, a public switched telephone network, or other types of communications networks or combinations of communications networks. Communication paths 428, 430, and 432 may separately or together include one or more communications paths, such as a satellite path, a fiber-optic path, a cable path, a path that supports Internet communications (e.g., IPTV), free-space connections (e.g., for broadcast or other wireless signals), or any other suitable wired or wireless communications path or combination of such paths. The computing devices may include additional communication paths linking a plurality of hardware, software, and/or firmware components operating together. For example, the computing devices may be implemented by a cloud of computing platforms operating together as the computing devices.
[0067] Cloud components 410 may be a database configured to store user data for a user. For example, the database may include user data that the system has collected about the user through prior transactions. Alternatively, or additionally, the system may act as a clearing house for multiple sources of information about the user. Cloud components 410 may also include control circuitry configured to perform the various operations needed to generate recommendations. For example, the cloud components 410 may include cloud-based storage circuitry configured to store a first machine learning model that is trained to detect head tilt, adjust visual testing patterns, and/or generate recommendations. Cloud components 410 may also include cloud-based control circuitry configured to determine an intent of the user based on a machine learning model. Cloud components 410 may also include cloud-based input/output circuitry configured to generate the dynamic conversational response during a conversational interaction.
[0068] Cloud components 410 includes machine learning model 402. Machine learning model 402 may take inputs 404 and provide outputs 406. The inputs may include multiple datasets such as a training dataset and a test dataset. Each of the plurality of datasets (e.g., inputs 404) may include data subsets related to user data and visual testing patterns. In some embodiments, outputs 406 may be fed back to machine learning model 402 as input to train machine learning model 402 (e.g., alone or in conjunction with user indications of the accuracy of outputs 406, labels associated with the inputs, or with other reference feedback information). For example, the system may receive a first labeled feature input, wherein the first labeled feature input is labeled with a testing pattern adjustment for the first labeled feature input. The system may then train the first machine learning model to classify the first labeled feature input with the known testing pattern adjustment.
[0069] FIG. 5 is a process flow diagram for correction of a visual field testing pattern by detecting and correcting for head tilt. For example, process 500 may represent the steps taken by one or more devices, as shown in FIGS. 1A-1C, when providing visual field testing using a head-mounted display.
[0070] At step 502, process 500 (e.g., using one or more components in system 400 (FIG. 4)) retrieves a visual field testing pattern for a head-mounted display. For example, the system may retrieve a visual field testing pattern for a head-mounted display, wherein the visual field testing pattern comprises stimuli that are displayed at respective locations in a visual field of the head-mounted display. In another example, the respective location of the icon can be defined by a first directional component and a second directional component. The first directional component can be adjusted by a cosine of the degree of head tilt of the user and the second directional component can be adjusted by a sine of the degree of head tilt of the user.
[0071] In yet another example, the respective locations of the stimuli can be located in a row on the visual field and the respective locations can correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface. Also, in other examples, the respective locations can be determined based on an offset distance of the head-mounted display and an angle to respective points on the visual testing machine. Accordingly, in some examples, the respective locations are determined based on the expression

a = b · tan(θ)

where a is one of the respective locations, b is the offset distance, and θ is the angle.
[0072] At step 504, process 500 (e.g., using one or more components in system 400 (FIG. 4)) generates for display the visual field testing pattern. For example, the system may generate for display the visual field testing pattern on the head-mounted display. [0073] At step 506, process 500 (e.g., using one or more components in system 400 (FIG. 4)) retrieves data from a tilt sensor. For example, the system may retrieve data from a tilt sensor for detecting degrees of head tilt of a user wearing the head-mounted display. The tilt sensor can be, for example, located at the head-mounted display.
[0074] At step 508, process 500 (e.g., using one or more components in system 400 (FIG. 4)) determines a degree of head tilt of a user. For example, the system may determine, based on the data retrieved from the tilt sensor, a degree of head tilt of the user.
[0075] At step 510, process 500 (e.g., using one or more components in system 400 (FIG. 4)) compares the degree of head tilt. For example, the system may compare, using the control circuitry, the degree of head tilt of the user to a first threshold degree. In another example, process 500 can compare the degree of head tilt of the user to a second threshold degree and, in response to the degree of head tilt of the user meeting or exceeding the second threshold degree, automatically adjust a respective location of a stimulus in the visual field of the head-mounted display by a first amount. For example, the first amount can be based on a distance of the icon from a centerpoint of the visual field of the head-mounted display and a direction of the head tilt of the user.
[0076] At step 512, process 500 (e.g., using one or more components in system 400 (FIG. 4)) generates a recommendation to the user. For example, the system may generate for display a recommendation to the user. For example, the recommendation can be displayed on the head-mounted display. The generation can also be in response to the degree of head tilt of the user meeting or exceeding the first threshold degree.
[0077] It is contemplated that the steps or descriptions of FIG. 5 may be used with any other embodiment of this disclosure. In addition, the steps and descriptions described in relation to FIG. 5 may be done in alternative orders or in parallel to further the purposes of this disclosure. For example, each of these steps may be performed in any order, in parallel, or simultaneously to reduce lag or increase the speed of the system or method. Furthermore, it should be noted that any of the devices or equipment discussed in relation to FIGS. 1-3 could be used to perform one or more of the steps in FIG. 5.
[0078] FIG. 6 is a process flow diagram for accurately replicating a visual field testing pattern from a curved surface on a flat surface. For example, process 600 may represent the steps taken by one or more devices, as shown in FIGS. 1A-1C, when providing visual field testing using a head-mounted display.
[0079] At step 602, process 600 (e.g., using one or more components in system 400 (FIG. 4)) retrieves a visual field testing pattern for a head-mounted display. For example, the system may retrieve a visual field testing pattern for a head-mounted display, wherein the visual field testing pattern comprises stimuli that are displayed at respective locations in a visual field of the head-mounted display.
[0080] At step 604, process 600 (e.g., using one or more components in system 400 (FIG. 4)) determines a curvature of the head-mounted display. For example, the system may determine a curvature of the head-mounted display based on receiving data from one or more sensors. Additionally or alternatively, the system may receive settings based on an initial calibration (e.g., an automatic calibration or a manual calibration) when the system is activated. Additionally or alternatively, the system may input a model or serial number (or other identifier) for the head-mounted display into a look-up table listing curvatures for the model or serial number.
[0081] Additionally or alternatively, in some embodiments, the system may determine an offset distance of the head-mounted display based on receiving data from one or more sensors. Additionally or alternatively, the system may receive settings based on an initial calibration (e.g., an automatic calibration or a manual calibration) when the system is activated indicating the offset distance. Additionally or alternatively, the system may input a model or serial number (or other identifier) for the head-mounted display into a look-up table listing offset distance for the model or serial number.
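The look-up described in this and the preceding paragraph might be sketched as follows; the model identifiers and geometry values are placeholders for illustration, not data for any actual device.

```python
# Hypothetical look-up table keyed by head-mounted display model identifier.
DISPLAY_GEOMETRY = {
    "MODEL-A": {"curvature_radius_mm": None, "offset_mm": 40.0},   # flat viewing surface
    "MODEL-B": {"curvature_radius_mm": 250.0, "offset_mm": 35.0},  # slightly curved surface
}


def geometry_for(model_id: str) -> dict:
    """Return stored curvature and offset-distance settings for a known display model."""
    return DISPLAY_GEOMETRY[model_id]
```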
[0082] At step 606, process 600 (e.g., using one or more components in system 400 (FIG. 4)) selects the respective locations based on the curvature. For example, the system may automatically adjust the respective locations based on the curvature and/or offset distance determined by the system. In some embodiments, the system may receive the curvature and/or offset distance (e.g., via input entered into a user terminal (e.g., user terminal 424 (FIG. 4)) and adjust the respective locations accordingly.
[0083] At step 608, process 600 (e.g., using one or more components in system 400 (FIG. 4)) generates for display the visual field testing pattern on the head-mounted display. For example, in generating the visual field testing pattern, the respective locations of the stimuli can be located in a row on the visual field. In another example, the respective locations can correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface.
[0084] It is contemplated that the steps or descriptions of FIG. 6 may be used with any other embodiment of this disclosure. In addition, the steps and descriptions described in relation to FIG. 6 may be done in alternative orders or in parallel to further the purposes of this disclosure. For example, each of these steps may be performed in any order, in parallel, or simultaneously to reduce lag or increase the speed of the system or method. Furthermore, it should be noted that any of the devices or equipment discussed in relation to FIGS. 1-3 could be used to perform one or more of the steps in FIG. 6.
[0085] FIG. 7 illustrates a system diagram for calibrating one or more sensors or other device components. FIG. 7 shows a processing system 702 and sensors 704. Processing system 702 may include hardware (e.g., one or more processors, memory, etc.), software, or a combination of hardware and software. Processing system 702 may include a testing subsystem 710, a communication subsystem 712, and a calibration subsystem 714. Each of these subsystems may include both hardware and software components. For example, testing subsystem 710 may include processors and/or memory. Communication subsystem 712 may include networking components (e.g., network card) and software to run the networking components, thus including software and hardware. Calibration subsystem 714 may also include software and hardware components. FIG. 7 also shows sensor(s) 704. Sensor(s) 704 may include one or more eye tracking sensors (e.g., inward directed sensors or other sensors). Sensors 704 may be able to perform eye tracking and gaze calculations. In some embodiments, gaze calculations may be performed using processing system 702. Furthermore, FIG. 7 shows display 706, which may be any suitable display (e.g., a head-mounted display shown in FIGS. 1A-1C).
[0086] In some embodiments, processing system 702 and sensors 704 may be located in the same enclosure as display 706. Thus, the head-mounted display may include processing system 702 and sensors 704. Furthermore, in some embodiments testing subsystem 710 and calibration subsystem 714 may be combined into a single subsystem.
[0087] Testing subsystem 710 may perform visual testing by causing display of stimuli, for example, on a head-mounted display or other interface of a user device. As an example, testing subsystem 710 may cause, during a visual field test, a first stimulus to be presented on a user device at a first location. The user device may include a head-mounted display or may be connected to a head-mounted display (e.g., using a wire or wirelessly). Testing subsystem 710 may cause the first stimulus to be presented on the user device by transmitting a command to display 706 for displaying a stimulus (e.g., a visual indicator) at a specific location (e.g., the first location) on the display 706.
[0088] In some embodiments, as testing subsystem 710 causes stimuli to be displayed, sensor(s) 704 may perform eye tracking and gaze detection operations on the user’s eye(s). Sensor(s) 704 may include components that detect both the directional movement and a gaze location of the user’s eye(s). Sensor(s) 704 may transmit that data to processing system 702. In some embodiments, sensor(s) 704 may transmit raw tracking data to processing system 702, which may include components to detect both the directional movement and the gaze location of the user’s eyes. Communication subsystem 712 may receive the tracking data (tracking data may be referred to as feedback data) and pass that data to calibration subsystem 714 and/or testing subsystem 710. Thus, calibration subsystem 714 may obtain, during the visual field test, first feedback data related to the first stimulus.
[0089] Calibration subsystem 714 may detect, based on the first feedback data, an eye of the user failing to fixate on the first location corresponding to the first stimulus. For example, the feedback data may include a gaze location of the user as a particular stimulus is displayed, for example, on a head-mounted display. Calibration subsystem 714 may compare the gaze location as detected by the eye tracking sensors (e.g., sensor(s) 704) and the location of the stimulus to determine whether the two locations match. In some embodiments, calibration subsystem 714 may determine that the two locations match even if the two locations are not identical (e.g., if the locations are off by a threshold number, ratio, or percentage). In some embodiments, the threshold may be determined based on the type and accuracy of the eye tracking sensor(s). As an example, if the locations match, calibration subsystem 714 may detect that the eye(s) of the user fixated on the first location (i.e., the stimulus). However, if the locations do not match, calibration subsystem 714 may detect that the eye(s) of the user failed to fixate on the first location (i.e., the stimulus).
[0090] In some embodiments, when performing the calibration of the user device based on the first feedback data indicating that the eye of the user failed to fixate on the first location, calibration subsystem 714 may determine that the eye of the user fixated on a different location that is within a threshold distance away from the first location. For example, the threshold may be one percent. Thus, if the user fixated on a point within one percent threshold, calibration subsystem 714 may determine that the user is gazing at the stimulus and no calibration is needed. However, if the user fixated on a point that is outside the one percent threshold, calibration subsystem 714 may determine that the user is also gazing at the stimulus and that calibration is needed. Calibration subsystem 714 may determine that the user has seen the stimulus and is gazing at the stimulus based on, for example, eye movement direction.
[0091] In another example, the system may have a threshold detection of the user’s gaze. The threshold number may be set at twenty percent or another suitable percentage. In one use case, if the user is gazing at a location that is within the twenty percent threshold, calibration subsystem 714 may determine that the user is gazing at the stimulus and calibration is needed. The percentage may be based on the type of eye tracking sensors used. However, testing subsystem 710 and/or calibration subsystem 714 may determine that the user has not seen the first stimulus based on determining that the eye of the user fixated on a different location that is outside the threshold distance away from the first location. For example, if the user fixated on a point that is more than twenty percent away from the stimulus, testing subsystem 710 and/or calibration subsystem 714 may determine that the user has not seen the stimulus.
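The threshold logic of the two preceding paragraphs can be summarized in a short sketch; the 1% and 20% values mirror the examples above, and the function name and return labels are illustrative assumptions.

```python
import math


def classify_gaze(stimulus_xy, gaze_xy, fixation_tol=0.01, seen_tol=0.20):
    """Classify a gaze sample against a displayed stimulus, with distances expressed as
    fractions of the field width (actual thresholds depend on the eye-tracking hardware)."""
    error = math.hypot(gaze_xy[0] - stimulus_xy[0], gaze_xy[1] - stimulus_xy[1])
    if error <= fixation_tol:
        return "seen, fixated"              # locations match; no calibration needed
    if error <= seen_tol:
        return "seen, calibration needed"   # user gazed at the stimulus, but tracking is offset
    return "not seen"                       # gaze stayed outside the larger threshold
```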
[0092] Calibration subsystem 714 or testing subsystem 710 may determine that the user has seen the first stimulus even though the user failed to fixate on the first location. In some embodiments, testing subsystem 710 may determine that the user has seen the first stimulus when the difference between the first location (i.e., the location of the stimulus) and the gaze location determined by the eye tracking sensor(s) is within a threshold ratio or percentage. In some embodiments, calibration subsystem 714 may determine that the user has seen the first stimulus even when the difference between the first location (i.e., the location of the stimulus) and the gaze location determined by the eye tracking sensor(s) is above the threshold ratio or percentage. For example, calibration subsystem 714 may determine that the first stimulus was seen by the user by determining, based on the first feedback data, that the user’s eye moved a threshold amount toward the first location. For example, eye tracking data (e.g., feedback data) may include eye direction movement. Using the eye direction movement data, calibration subsystem 714 may determine whether the user’s eye(s) moved toward the stimulus (e.g., the first location) by a threshold amount. A threshold amount may be a percentage, a ratio, or another suitable threshold.
[0093] In some embodiments, although calibration subsystem 714 may determine that the first location (i.e., the location of the stimulus) does not match with the gaze location, calibration subsystem 714 may still determine that the user has seen the stimulus and in response update the visual test data and also calibrate the eye tracking sensors. For example, when testing subsystem 710 receives the eye tracking data from the sensors (e.g., sensor(s) 704), testing subsystem 710 may determine that the first location and the gaze location do not match and thus may determine that the user did not see the stimulus. However, calibration subsystem 714 may correct the determination by performing the operations above to indicate that the user has actually seen the stimulus and the determination is inaccurate due to a need for a calibration.
[0094] In some embodiments, testing subsystem 710 and calibration subsystem 714 may be part of the same subsystem. For example, calibration may be part of testing subsystem 710 and part of the visual testing process. Thus, the operations described above may be performed by the same subsystem. That is, testing subsystem 710 may cause, during a visual field test, a first stimulus to be presented on a user device at a first visual field location, obtain, during the visual field test, first feedback data related to the first stimulus, the first feedback data indicating that the user has seen the first stimulus, and detect, based on the first feedback data, an eye of the user failing to fixate on the first visual field location corresponding to the first stimulus.
[0095] Processing system 702 may store a data structure for determining whether calibration is needed. FIG. 8 illustrates a data structure for determining whether calibration is needed. FIG. 8 shows data structure 800 that includes stimulus identification field 802, display location field 804, gaze location field 806, and eye movement direction field 808. Stimulus identification field 802 stores an identifier for a particular stimulus that has been displayed or is being displayed to the user. Display location field 804 includes a corresponding location of the stimulus (e.g., a visual field location). Gaze location field 806 stores a location of the user’s gaze in connection with the corresponding stimulus as detected by the eye-tracking sensors. Eye movement direction field 808 may indicate whether the user’s eyes moved toward the stimulus.
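A minimal sketch of the data structure of FIG. 8, expressed here as a Python dataclass; the attribute names follow the figure's fields but are otherwise illustrative assumptions.

```python
from dataclasses import dataclass
from typing import Optional, Tuple


@dataclass
class StimulusRecord:
    """One entry of data structure 800 (field numbers refer to FIG. 8)."""
    stimulus_id: str                                      # stimulus identification field 802
    display_location: Tuple[float, float]                 # display location field 804 (display coordinates)
    gaze_location: Optional[Tuple[float, float]] = None   # gaze location field 806 (from eye tracking)
    moved_toward_stimulus: Optional[bool] = None          # eye movement direction field 808
```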
[0096] In some embodiments, processing system 702 may use the data structure of FIG. 8. For example, when testing subsystem 710 instructs display 706 to display a stimulus, testing subsystem 710 may store a stimulus identifier for the stimulus in stimulus identifier field 802. In addition, testing subsystem 710 may store a visual field location for the stimulus in display location field 804. Display location field 804 may hold coordinates of the display where the stimulus is visible. When eye tracking data is received at processing system 702, calibration subsystem 714 or testing subsystem 710 may determine the gaze location of the user and store that information in gaze location field 806. The gaze location may include coordinates of the display. In addition, testing subsystem 710 or calibration subsystem 714 may store eye movement direction in eye movement direction field 808. Testing subsystem 710 or calibration subsystem 714 may compare the data in display location field 804 and gaze location field 806. If the data in those fields matches, processing system 702 may determine that the user has seen the stimulus. If the data does not match, testing subsystem 710 or calibration subsystem 714 may use the data in the eye movement direction field 808 to determine whether the user’s eye(s) moved toward the stimulus. For example, calibration subsystem 714 or testing subsystem 710 may compare the eye tracking data with the location of the stimulus over time after the stimulus was displayed to make the determination.
[0097] Based on the detection of the eye of the user failing to fixate on the first visual field location, calibration subsystem 714 may perform, during the visual field test, a calibration of the user device. For example, calibration subsystem 714 may calibrate the eye tracking sensor(s) so that the gaze location is aligned with the first visual field location (i.e., the location of the stimulus). Furthermore, calibration subsystem 714 or testing subsystem 710 may store, during the visual field test, a first indication that the user has seen the first stimulus. In some embodiments, to calibrate the user device, calibration subsystem 714 may adjust one or more parameters of a function for detecting a location that the user is viewing. For example, eye tracking sensors may be collecting raw tracking data. The raw tracking data may include measurements of the motion of the eye(s) relative to the user’s head and/or measurement of the point of the user’s gaze. Output of the sensor(s) may depend on the type of sensor used. Calibration subsystem 714 may then use one or more functions to determine the gaze location and/or eye movement direction. Thus, in the process of calibration, one or more of those functions (e.g., the parameters of those functions) may be adjusted.
[0098] Data processing system 702 (e.g., via testing subsystem 710) may repeat the test with another stimulus, which can be displayed subsequent to the first stimulus. Thus, testing subsystem 710 may cause, during the visual field test, a second stimulus to be presented on the user device at a second visual field location. For example, testing subsystem 710 may transmit a command to the head-mounted display to display the second stimulus at the second visual field location. Testing subsystem 710 and/or calibration subsystem 714 may obtain, during the visual field test, second feedback data related to the second stimulus. The second feedback data may be obtained using the same mechanism as the first feedback data. Testing subsystem 710 and/or calibration subsystem 714 may detect, based on the second feedback data, that the eye of the user moved toward and fixated on the second visual field location. Based on this determination, processing system 702 may determine that the user has seen the second stimulus and store a second indication that the user has seen the second stimulus. Thus, no calibration of the user device (e.g., eye tracking sensors) is performed in response to detecting that the eye of the user fixated on the second visual field location. Based on the first indication and the second indication, processing system 702 may generate a visual defect assessment. For example, processing system 702 may determine whether the user has any visual defects. It should be noted that one advantage of calibrating the sensors during each stimulus is that it enables the process to accurately adjust for user shifting or for the user device otherwise providing inaccurate data.
[0099] FIG. 9 is a process flow diagram for calibrating a head-mounted display or other user device. At 902, processing system 702 causes a stimulus to be presented to a user device at a visual field location. Processing system 702 may use a processor to generate a command to a display (e.g., head-mounted display) to present a stimulus on the display at a particular location. The location may be included in the command. Processing system 702 may pass the command to the display (e.g., head-mounted display). At 904, processing system 702 obtains feedback data related to the stimulus. Processing system 702 may obtain the feedback data from one or more eye tracking sensors. The feedback data may have been recorded for a time right after the stimulus has been presented and before the next stimulus is presented.
[00100] At 906, processing system 702 detects an eye of the user failing to fixate on the visual field location corresponding to the stimulus. For example, processing system 702 may analyze the feedback data using one or more processors to make the detection. At 908, processing system 702 performs a calibration of the user device. For example, in response to determining that the user failed to fixate on the visual field location corresponding to the stimulus, processing system 702 may adjust one or more functions for determining a gaze location and/or movement direction. At 910, processing system 702 stores an indication that the user has seen the stimulus. The indication may be stored in association with the corresponding stimulus to signal to the system that the user has seen the stimulus.
[00101] It is contemplated that the steps or descriptions of FIG. 9 may be used with any other embodiment of this disclosure. In addition, the steps and descriptions described in relation to FIG. 9 may be done in alternative orders or in parallel to further the purposes of this disclosure. For example, each of these steps may be performed in any order, in parallel, or simultaneously to reduce lag or increase the speed of the system or method. Furthermore, it should be noted that any of the devices or equipment discussed in relation to FIGS. 1A-1E could be used to perform one or more of the steps in FIG. 9.
[00102] FIG. 10 illustrates a simplified diagram depicting an exemplary relationship between a virtual plane and a display plane as may be used to calibrate a head-mounted display. As described further herein, the disclosed systems may generate a calibration pattern comprising a number of stimuli (e.g., one or more graphical elements, referred to herein as “icons”) for display at the head-mounted display.
[00103] The calibration may take many forms and may comprise one or more stimuli being displayed in series and/or in parallel. The system may display a pattern in which the stimuli are displayed at particular positions. The positions may be defined by the system in terms of a height, width, and/or viewing angle. The system may generate the stimuli at the extremes of the visual field in order to achieve the best calibration. For example, the system may display the stimuli in one or more corners of the visual field in order to receive the best measurement for calibrating a user’s gaze location on a single fixation point (e.g., a centerpoint in the visual field).
[00104] The example of FIG. 10 depicts a simplified representation showing a viewing plane 1030 having a number of stimuli (e.g. edge stimuli 1032, 1034, 1036, 1038) that may be seen by a user. However, for example due to improper wearing or other errors, eye tracking data obtained from a user viewing any one of these edge stimuli may not correspond to where the eye tracking data would be expected based on where the edge stimulus is displayed by the head-mounted display. Example eye tracking data 1010 is shown corresponding to edge stimulus 1034. The eye tracking data is depicted as a dashed line representing the path of the eye over its acquisition time. In this example, even though the system generates edge stimulus 1034, the eye tracking data generally surrounds the perceived edge point 1024.
[00105] In some embodiments, the system may retrieve calibration data over a given interval. The use of the given interval allows the system to normalize data during this time to remove outliers that may occur as a natural result of the calibration process. For example, the edge point can be determined by receiving eye tracking data over periods of time referred to herein as edge calibration periods. The edge calibration periods may be, for example, one second, five seconds, etc. In some implementations, eye tracking data may be averaged over such periods of time to generate an average location. This process may be repeated for a number of edge stimuli, with four shown in the example of FIG. 10. Thus, as depicted, edge stimuli 1032, 1034, 1036, and 1038 generate corresponding points 1022, 1024, 1026, and 1028. While four points are shown, in other embodiments, other numbers of points may be used, for example, three, five, eight, etc. In some embodiments, the system may generate edge stimuli on edges of the field of view of the head-mounted display, for example, along one or more of the left, right, upper, or lower edges. In certain embodiments, edge stimuli may be generated at the corners of the field, for example, upper left, upper right, lower left, or lower right. It is also contemplated that stimuli may be generated anywhere within the field of view such that the presently described calibrations may be performed.
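As a simple illustration of averaging eye tracking data over an edge calibration period to obtain a single edge point, one might compute a mean gaze location as sketched below (the sample format and function name are assumptions).

```python
def average_gaze(samples):
    """Average raw (x, y) eye-tracking samples collected over one edge calibration period
    (e.g., one to five seconds) to obtain a single edge point such as point 1024."""
    xs = [s[0] for s in samples]
    ys = [s[1] for s in samples]
    return sum(xs) / len(xs), sum(ys) / len(ys)
```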
[00106] As part of various technical solutions that address the disclosed shortcomings of conventional calibration methods, certain disclosed embodiments describe how the system may relate and assess differences between what is displayed and what the user sees. Similar to how the edge stimuli may define a display plane 1030 (i.e., a plane established by the system where the edge stimuli are displayed on), the edge points may also define a virtual plane 1020. The system may calculate a projective transform matrix based on the edge eye tracking data that converts any location in virtual plane 1020 to display plane 1030. Thus, as described further below, stimuli or other calibration patterns may be generated by the head-mounted display and the obtained eye tracking data may be mapped back onto the display plane for comparison with the calibration pattern.
[00107] The system may calculate a projective transform matrix that is especially useful for a general transformation (e.g., one that does not force parallelism to be observed as such may not be the case when formerly parallel stimuli are viewed by a person). The below example illustrates how the system may generate and/or utilize a projective transform matrix for a coordinate transformation between the two planes 1020 and 1030:
[00108]
    [x']   [a11  a12  b1] [x]
    [y'] = [a21  a22  b2] [y]
    [w ]   [c1   c2   1 ] [1]
[00109] In the above matrix equation, the 2x2 “a” submatrix is a rotation matrix, the 2x1 “b” submatrix is a translation vector, and the 1x2 “c” submatrix is a projection vector. The x,y elements correspond to the x,y coordinates of the edge stimulus in the display plane (e.g., edge stimulus 1034) and the x’,y’ elements correspond to the x,y coordinates of the point in the virtual plane (e.g., point 1024). To apply the projective transform matrix, the system may execute pseudocode such as shown in FIG. 10.
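Since the pseudocode of FIG. 10 is not reproduced in this text, the sketch below shows one conventional way such a projective transform might be estimated from four point correspondences and then applied (written here as mapping the virtual plane 1020 into the display plane 1030, consistent with paragraph [00106]; the inverse matrix maps the other way). The function names are assumptions, not the disclosure's pseudocode.

```python
import numpy as np


def fit_projective_transform(virtual_pts, display_pts):
    """Estimate the 3x3 projective transform H mapping virtual-plane points (x, y) to
    display-plane points (x', y') from four point correspondences (e.g., averaged edge
    points 1022-1028 paired with edge stimuli 1032-1038)."""
    A, rhs = [], []
    for (x, y), (xp, yp) in zip(virtual_pts, display_pts):
        A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp])
        A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp])
        rhs.extend([xp, yp])
    h = np.linalg.solve(np.asarray(A, dtype=float), np.asarray(rhs, dtype=float))
    return np.append(h, 1.0).reshape(3, 3)  # lower-right element fixed to 1


def apply_projective_transform(H, point):
    """Map a single (x, y) point through H, dividing by the homogeneous coordinate w."""
    xp, yp, w = H @ np.array([point[0], point[1], 1.0])
    return xp / w, yp / w
```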
[00110] For example, the projective transformation can be represented as transformation of an arbitrary quadrangle (i.e. system of four points) into another one. Alternatively or additionally, the system may use a transform based on a different number of points. For example, the system may use an affine transformation, which is a transformation of a triangle. The system may select the type of transform based on the number of stimuli generated. The system may select the number of stimuli generated based on one or more criteria. For example, the system may determine a number of stimuli needed to achieve a determined amount of accuracy and/or meet a particular threshold level of precision. The system may likewise select the number of stimuli based on a type of test, amount of calibration needed, and/or a frequency of calibration.
[00111] For example, the system may determine that a four-point (e.g., projective transform) calibration is used at the initiation of use of a head-mounted device. The system may then determine (e.g., using a hierarchy of criteria) whether an additional calibration needs to be performed and, if so, how many stimuli are required to be displayed.
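A minimal sketch of one way such a selection could be expressed follows; the mapping of stimulus counts to transform families reflects the discussion above, while the function name and error handling are assumptions:

```python
def select_transform_type(num_stimuli):
    """Choose a transform family from the number of calibration stimuli
    generated; more stimuli permit a more general mapping."""
    if num_stimuli >= 4:
        return "projective"   # quadrangle-to-quadrangle mapping
    if num_stimuli == 3:
        return "affine"       # triangle-to-triangle mapping
    raise ValueError("at least three calibration stimuli are required")
```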
[00112] FIG. 11 illustrates a simplified diagram depicting an exemplary central point and boundary as used to generate a calibration score. In some implementations, it may be of more interest to base a calibration off of a point more central in the field of view, as that is where most visual testing patterns are displayed. The system may generate for display a center stimulus 1100 on the head-mounted display at a center location. The system may also receive center eye tracking data during a center calibration period, similar to the edge calibration period(s) described above. The calculated point in the virtual plane may be transformed to what is referred to herein as a “gaze location” in the display plane by the system utilizing the projective transform matrix. The system may generate a calibration score based on a difference 1140 (e.g., a delta in pixels, mm, or other similar distance metrics) between the center stimulus 1100 and the gaze location 1110. The inset shows this example in greater detail and includes exemplary eye tracking data 1130 and the difference 1140 between center stimulus 1100 and gaze location 1110. Also, in various embodiments the difference 1140 may be similarly calculated in the virtual plane 1020 via a determination of equivalent points for the center stimulus and gaze location. In this way, it is contemplated that any combination of points may be utilized in either plane and related to each other via the projective transform matrix to calculate differences, locations relative to a boundary (as described below), etc. [00113] In some implementations, the system may assess the accuracy of the calibration based on whether the gaze location and/or eye tracking data is within a prescribed boundary. For example, as shown in FIG. 11, the system may generate boundary 1120. In some implementations, such a boundary may be a circle having a given radius from the center stimulus 1100, but other boundary shapes such as square, hexagonal, etc. may be used. While in some implementations the boundary may be visually displayed by the head-mounted display, this is not necessary and instead the boundary may merely reside as coordinates or another boundary-defining algorithm in computer memory. Accordingly, the system may determine the calibration score based on the size (e.g., radius) of the boundary.
[00114] As previously mentioned, the calibration score may be indicative of the confidence in the calibration. In this way, should a calibration be determined by the system to fail (e.g., the gaze location being outside the radius of the boundary), the system may repeat at least a portion of the calibration (e.g., the acquiring of edge eye tracking data, center eye tracking data, and/or calculation of the projective transform matrix), but with a larger boundary (e.g., a larger-radius boundary). For example, in one embodiment, if the gaze location is calculated to be within the first (or initial) boundary generated, the calibration may be assigned a score of 100 (perhaps corresponding to the best possible calibration). If that calibration attempt fails, the radius of the boundary may be increased by, for example, 20%, and if the repeated calibration then succeeds it may be assigned a score of 90. Any such relationship between boundary size and calibration score may be used by the system, as implemented by a person of skill.
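The following minimal sketch illustrates one way the scoring and retry behavior described in paragraphs [00112]-[00114] could be realized; the specific score values, the 20% radius growth, the attempt limit, and the optional recalibration callback are illustrative assumptions only:

```python
import numpy as np

def score_calibration(center_stimulus, gaze_location, radius,
                      score=100, growth=1.2, max_attempts=3, recalibrate=None):
    """Score a calibration from the distance between the center stimulus and
    the transformed gaze location, enlarging the boundary (and lowering the
    score) each time the gaze location falls outside the boundary."""
    for _ in range(max_attempts):
        difference = np.linalg.norm(np.asarray(gaze_location) -
                                    np.asarray(center_stimulus))
        if difference <= radius:        # gaze location fell inside the boundary
            return score
        radius *= growth                # e.g., a 20% larger boundary radius
        score -= 10                     # e.g., 100 -> 90 -> 80 ...
        if recalibrate is not None:     # optionally repeat part of the calibration
            gaze_location = recalibrate()
    return None                         # calibration not accepted
```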
[00115] The system may also determine, based on the gaze location, whether a user is looking at the center location. Such a determination may be made by the system, for example, if the gaze location is outside of a bounding box, an area defined by the edge stimuli, etc. Another related implementation that can further refine the calibration is to disallow large, sustained deviations in the eye tracking data, even if the average location is within one of the above-described boundaries. For example, the system may determine that the user is not looking at the center location based on whether at least a portion of the center eye tracking data deviates from the gaze location by more than a spatial deviation threshold and for longer than a temporal deviation threshold. As one specific example, the spatial deviation threshold may be any distance outside the boundary, but may also be a larger boundary (e.g., 1.1x, 1.5x the radius of the present boundary). While a brief excursion may be allowed, the temporal deviation threshold may be set by the system to be, for example, 1 ms, 10 ms, 100 ms, etc. In this way, the system would determine that the calibration failed if the user’s gaze drifted, for example, far to the left and stayed there, indicating a possible loss of focus or attention on the calibration process.
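One possible reading of the spatial/temporal deviation check is sketched below; the sample format, sampling period, and function name are assumptions introduced only for illustration:

```python
import numpy as np

def sustained_deviation(samples, gaze_location, spatial_threshold,
                        temporal_threshold, sample_period):
    """Return True if the center eye tracking samples stray farther than
    spatial_threshold from the gaze location for longer than
    temporal_threshold, i.e., a sustained loss of fixation."""
    time_outside = 0.0
    for sample in samples:
        offset = np.linalg.norm(np.asarray(sample) - np.asarray(gaze_location))
        if offset > spatial_threshold:
            time_outside += sample_period        # accumulate time spent outside
            if time_outside > temporal_threshold:
                return True                      # sustained drift: calibration fails
        else:
            time_outside = 0.0                   # brief excursions are allowed
    return False
```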
[00116] FIG. 12 is a process flow diagram for calibrating a head-mounted display. For example, process 1200 may represent steps taken by one or more devices, as shown in FIGS. 1A-1E, when calibrating the head-mounted display.
[00117] At step 1202, process 1200 (e.g., using one or more components in system 400 (FIG. 4)) receives edge eye tracking data. For example, the system may receive edge eye tracking data during edge calibration periods. Additionally or alternatively, the system may generate for display a number of edge stimuli on the head-mounted display. Additionally or alternatively, the system may generate edge stimuli on edges of a field of view of the head-mounted display.
[00118] At step 1204, process 1200 (e.g., using one or more components in system 400 (FIG. 4)) calculates a projective transform matrix. For example, the system may calculate a projective transform matrix based on the edge eye tracking data. For example, the system may use pseudocode 1300 (FIG. 13) and/or the process described in FIG. 10.
[00119] At step 1206, process 1200 (e.g., using one or more components in system 400 (FIG. 4)) receives center eye tracking data. For example, the system may receive center eye tracking data during a center calibration period. Additionally or alternatively, the system may generate for display a center stimulus on the head-mounted display at the center location. Additionally or alternatively, the system may generate a boundary around the center stimulus and may display the boundary at the head-mounted display.
[00120] At step 1208, process 1200 (e.g., using one or more components in system 400 (FIG. 4)) applies a projective transform matrix to the center eye tracking data. For example, the system may apply the projective transform matrix to the center eye tracking data to determine a gaze location. [00121] At step 1210, process 1200 (e.g., using one or more components in system 400 (FIG. 4)) generates a calibration score. For example, the system may generate a calibration score based on a difference between a central location and the gaze location. The calibration score may be indicative of the accuracy of an eye test performed with the head-mounted display. Additionally or alternatively, the system may determine, based on the difference, whether the gaze location is inside the boundary. Additionally or alternatively, the calibration score may be based on a size of the boundary. Additionally or alternatively, the system, in response to the difference indicating that the gaze location is outside the boundary, may repeat at least a portion of the calibration, wherein the size of the boundary is larger. Additionally or alternatively, the system may determine whether a user is looking at the center location based on the gaze location. In response to the determination that the user is not looking at the center location, the system may repeat at least a portion of the calibration. Additionally or alternatively, the determination by the system that the user is not looking at the center location may require that at least a portion of the center eye tracking data deviates from the gaze location more than a spatial deviation threshold and for longer than a temporal deviation threshold.
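Tying steps 1202 through 1210 together, the following end-to-end sketch reuses the helper functions sketched above and a matrix estimator like the one sketched after paragraph [00123] below. The display and tracker interfaces (show_stimulus, collect, edge_stimulus_locations, center_location, boundary_radius) are hypothetical placeholders, not elements of the disclosure:

```python
def calibrate_head_mounted_display(display, tracker):
    """Illustrative end-to-end flow mirroring steps 1202-1210 of FIG. 12."""
    # Step 1202: present edge stimuli and collect edge eye tracking data.
    stimuli = display.edge_stimulus_locations()
    edge_points = []
    for stimulus in stimuli:
        display.show_stimulus(stimulus)
        edge_points.append(average_gaze_point(tracker.collect(seconds=1.0)))

    # Step 1204: calculate the projective transform from the point pairs.
    H = projective_transform_from_points(edge_points, stimuli)

    # Steps 1206-1208: present the center stimulus, collect center eye tracking
    # data, and map it into the display plane to obtain the gaze location.
    center = display.center_location()
    display.show_stimulus(center)
    center_point = average_gaze_point(tracker.collect(seconds=1.0))
    gaze_location = apply_projective_transform(H, center_point)

    # Step 1210: score the calibration against a boundary around the center stimulus.
    return score_calibration(center, gaze_location, radius=display.boundary_radius())
```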
[00122] It is contemplated that the steps or descriptions of FIG. 12 may be used with any other embodiment of this disclosure. In addition, the steps and descriptions described in relation to FIG. 12 may be done in alternative orders or in parallel to further the purposes of this disclosure. For example, each of these steps may be performed in any order, in parallel, or simultaneously to reduce lag or increase the speed of the system or method. Furthermore, it should be noted that any of the devices or equipment discussed in relation to FIGS. 1A-1E could be used to perform one or more of the steps in FIG. 12.
[00123] FIG. 13 is illustrative pseudocode for calibrating a head-mounted display in accordance with certain aspects of the present disclosure. For example, pseudocode 1300 represents illustrative pseudocode for calculating a projective transform matrix as described herein. The below example illustrates how the system may generate and/or utilize a projective transform matrix for a coordinate transformation between the two planes (e.g., planes 1020 and 1030 of FIG. 10). For example, pseudocode 1300 may generate values for the projective transform matrix described in relation to FIG. 10. In such cases, the points identified in pseudocode 1300 may correspond to the x,y coordinates of the edge stimulus in the display plane (e.g., edge stimulus 1034) and the x,y coordinates of the point in the virtual plane (e.g., point 1024).
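Pseudocode 1300 itself is not reproduced in this text. Purely for illustration, one standard way to compute such a 3x3 projective transform from four stimulus/point pairs is a direct linear solve for the eight unknown matrix entries, sketched below; the function name and use of NumPy are assumptions:

```python
import numpy as np

def projective_transform_from_points(src_points, dst_points):
    """Estimate the 3x3 projective transform H mapping each (x, y) in
    src_points to the corresponding (x', y') in dst_points.  With four
    point pairs this reduces to solving an 8x8 linear system; the ninth
    matrix entry is fixed to 1."""
    A, rhs = [], []
    for (x, y), (xp, yp) in zip(src_points, dst_points):
        A.append([x, y, 1, 0, 0, 0, -x * xp, -y * xp]); rhs.append(xp)
        A.append([0, 0, 0, x, y, 1, -x * yp, -y * yp]); rhs.append(yp)
    h = np.linalg.solve(np.asarray(A, float), np.asarray(rhs, float))
    return np.append(h, 1.0).reshape(3, 3)

# Example use: map averaged edge points in the virtual plane onto the
# corresponding edge stimulus locations in the display plane.
# H = projective_transform_from_points(virtual_edge_points, display_edge_stimuli)
```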
[00124] The above-described embodiments of the present disclosure are presented for purposes of illustration and not of limitation, and the present disclosure is limited only by the claims which follow. Although some embodiments are described herein with respect to calibration of head-mounted displays, such techniques may be applied for calibration of one or more other user devices in other embodiments. Furthermore, it should be noted that the features and limitations described in any one embodiment may be applied to any other embodiment herein, and flowcharts or examples relating to one embodiment may be combined with any other embodiment in a suitable manner, done in different orders, or done in parallel. In addition, the systems and methods described herein may be performed in real time. It should also be noted that the systems and/or methods described above may be applied to, or used in accordance with, other systems and/or methods.
[00125] In the following, further features, characteristics, and exemplary technical solutions of the present disclosure will be described in terms of embodiments that may be optionally claimed in any combination:
1. A method comprising: retrieving a visual field testing pattern for a head-mounted display; and generating for display the visual field testing pattern on the head-mounted display.
2. The method of any of the preceding embodiments, wherein the visual field testing pattern comprises stimuli that are displayed at respective locations in a visual field of the head-mounted display.
3. The method of any of the preceding embodiments, further comprising retrieving data from a tilt sensor, located at the head-mounted display, for detecting degrees of head tilt of a user wearing the head-mounted display; determining, based on the data retrieved from the tilt sensor, a degree of head tilt of the user; and comparing, the degree of head tilt of the user to a first threshold degree.
4. The method of any of the preceding embodiments, further comprising generating for display, on the head-mounted display, a recommendation to the user in response to the degree of head tilt of the user meeting or exceeding the first threshold degree.
5. The method of any of the preceding embodiments, further comprising: comparing the degree of head tilt of the user to a second threshold degree; and in response to the degree of head tilt of the user meeting or exceeding the second threshold degree, automatically adjusting a respective location of a stimulus of the stimuli in the visual field of the head-mounted display by a first amount.
6. The method of any of the preceding embodiments, wherein the first amount is based on a distance of the stimulus from a centerpoint of the visual field of the head-mounted display and a direction of the head tilt of the user.
7. The method of any of the preceding embodiments, wherein the respective location of the stimulus is defined by a first directional component and a second directional component, and wherein the first directional component is adjusted by a cosine of the degree of head tilt of the user and the second directional component is adjusted by a sine of the degree of head tilt of the user (one possible reading of this adjustment is sketched after this list of embodiments). 8. The method of any of the preceding embodiments, wherein the respective locations of the stimuli are located in a row on the visual field, and wherein the respective locations correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface.
9. The method of any of the preceding embodiments, wherein the respective locations are determined based on an offset distance of the head-mounted display and an angle to respective points on the visual testing machine.
10. The method of any of the preceding embodiments, wherein the respective locations are determined based on the expression a = b·tan(θ), where a is one of the respective locations, b is the offset distance, and θ is the angle (see the illustrative sketch following this list of embodiments).
11. The method of any of the preceding embodiments, further comprising determining a curvature of the head-mounted display and selecting the respective locations based on the curvature.
12. The method of any of the preceding embodiments, further comprising determining an offset distance of the head-mounted display and selecting the respective locations based on the offset distance.
13. The method of any of the preceding embodiments, further comprising determining a centerpoint of the head-mounted display and selecting the respective locations based on the centerpoint.
14. A method comprising: receiving edge eye tracking data during a plurality of edge calibration periods; calculating a projective transform matrix based on the edge eye tracking data; receiving center eye tracking data during a center calibration period; applying the projective transform matrix to the center eye tracking data to determine a gaze location; and generating a calibration score based on a difference between a center location and the gaze location.
15: The method of embodiment 14, further comprising: generating for display stimuli on the head-mounted display; and generating for display a center stimulus on the head-mounted display at the center location.
16: The method of embodiment 14 or any of the preceding embodiments that depend therefrom, wherein the stimuli are generated on edges of a field of view of the head-mounted display.
17: The method of embodiment 14 or any of the preceding embodiments that depend therefrom, wherein the stimuli are generated at corners of a field of view of the head-mounted display. 18: The method of embodiment 14 or any of the preceding embodiments that depend therefrom, wherein the calibration score is indicative of the accuracy of an eye test performed with the head-mounted display.
19: The method of embodiment 14 or any of the preceding embodiments that depend therefrom, further comprising: generating a boundary around the center stimulus; determining, based on the difference, whether the gaze location is inside the boundary; and determining the calibration score based on a size of the boundary.
20: The method of embodiment 14 or any of the preceding embodiments that depend therefrom, further comprising displaying the boundary at the head-mounted display.
21: The method of embodiment 14 or any of the preceding embodiments that depend therefrom, further comprising: in response to the difference indicating that the gaze location is outside the boundary, repeating at least a portion of the calibration, wherein the size of the boundary is larger. 22: The method of embodiment 14 or any of the preceding embodiments that depend therefrom, further comprising: determining whether a user is looking at the center location based on the gaze location; and in response to the determination that the user is not looking at the center location, repeating at least a portion of the calibration.
23: The method of embodiment 14 or any of the preceding embodiments that depend therefrom, wherein the determination that the user is not looking at the center location requires that at least a portion of the center eye tracking data deviates from the gaze location more than a spatial deviation threshold and for longer than a temporal deviation threshold.
24: A system comprising: a head-mounted display; inward directed sensors, located at the head-mounted display, configured to track pupil movement; storage circuitry configured to store a plurality of stimuli that are displayed at respective locations in a visual field of the head-mounted display; and control circuitry configured to perform operations comprising those of any of embodiments 1-23.
25: A tangible, non-transitory, machine-readable medium storing instructions that, when executed by a data processing apparatus, cause the data processing apparatus to perform operations comprising those of any of embodiments 1-23.
26. A system comprising: one or more processors; and memory storing instructions that, when executed by the processors, cause the processors to effectuate operations comprising those of any of embodiments 1-23. 27. A system comprising means for performing any of embodiments 1-23.
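Purely as an illustration of embodiments 7 and 10 above, the following sketch shows one possible reading of the head-tilt adjustment and of the location expression; both the rotation about the visual-field centerpoint and the a = b·tan(θ) form are assumptions inferred from the wording, not reproductions of the figures:

```python
import numpy as np

def stimulus_location(offset_distance, viewing_angle_deg):
    """Project a point at a given viewing angle along a curved surface onto a
    flat display plane located offset_distance away: a = b * tan(theta)
    (assumed reading of the expression in embodiment 10)."""
    return offset_distance * np.tan(np.radians(viewing_angle_deg))

def adjust_for_head_tilt(x, y, tilt_deg):
    """Rotate a stimulus location about the visual-field centerpoint by the
    measured head tilt, one reading of the cosine/sine adjustment of
    embodiment 7."""
    c, s = np.cos(np.radians(tilt_deg)), np.sin(np.radians(tilt_deg))
    return x * c - y * s, x * s + y * c
```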
[00126] In the following, further features, characteristics, and exemplary technical solutions of the present disclosure will be described in terms of embodiments that may be optionally claimed in any combination:
1. A method comprising: causing a first stimulus to be presented on a user device at a first location; obtaining first feedback data related to the first stimulus presented at the first location; performing a calibration of the user device; and storing a first indication that the user has seen the first stimulus.
2. The method of any of the above embodiments, wherein the calibration of the user device is performed based on the first feedback data indicating that an eye of a user failed to fixate on the first location.
3. The method of any of the above embodiments, further comprising: causing, during the visual field test and subsequent to the calibration, a second stimulus to be presented on the user device at a second location; obtaining, during the visual field test, second feedback data related to the second stimulus; detecting, based on the second feedback data, that the eye of the user moved toward and fixated on the second location; and storing, during the visual field test, a second indication that the user has seen the second stimulus, wherein no calibration of the user device is performed in response to detecting that the eye of the user fixated on the second location.
4. The method of any of the above embodiments, further comprising: generating a visual defect assessment based on the first feedback data and the second feedback data.
5. The method of any of the above embodiments, wherein the first feedback data includes eye tracking data, and wherein detecting the eye and storing the first indication are based on the eye tracking data.
6. The method of any of the above embodiments, further comprising: determining that the first stimulus was seen by determining, based on the first feedback data, that a gaze location of the user is within a threshold distance of the first location.
7. The method of any of the above embodiments, further comprising: determining that the first stimulus was seen by determining, based on the first feedback data, that a gaze location of the user moved a threshold amount towards the first location.
8. The method of any of the above embodiments, wherein performing the calibration of the user device based on the first feedback data indicating that the eye of the user failed to fixate on the first location comprises determining that the eye of the user fixated on a different location that is within a threshold distance away from the first location.
9. The method of any of the above embodiments, further comprising determining that the user has not seen the first stimulus based on determining that the eye of the user fixated on the different location that is outside the threshold distance away from the first location.
10. The method of any of the above embodiments, wherein the calibration of the user device comprises adjusting one or more parameters of a function for detecting a location that the user is viewing.
11. The method of any of the above embodiments, wherein the user device comprises a head-mounted display.
12. A system comprising: inward directed sensors, located at a head-mounted display, configured to track movement of one or more eyes; storage circuitry configured to store data for generating stimuli that are presented at respective locations of the head-mounted display; and control circuitry configured to perform operations comprising those of any of embodiments 1-11.
13. A tangible, non-transitory, machine-readable medium storing instructions that, when executed by one or more processors, cause the one or more processors to perform operations comprising those of any of embodiments 1-11.
14. A system comprising: one or more processors; and memory storing instructions that, when executed by the processors, cause the processors to effectuate operations comprising those of any of embodiments 1-11.
15. A system comprising means for performing operations of any of embodiments 1-11.
[00127] The present disclosure contemplates that the calculations disclosed in the embodiments herein may be performed in a number of ways, applying the same concepts taught herein, and that such calculations are equivalent to the embodiments disclosed.
[00128] One or more aspects or features of the subject matter described herein can be realized in digital electronic circuitry, integrated circuitry, specially designed application specific integrated circuits (ASICs), field programmable gate arrays (FPGAs), computer hardware, firmware, software, and/or combinations thereof. These various aspects or features can include implementation in one or more computer programs that are executable and/or interpretable on a programmable system including at least one programmable processor, which can be special or general purpose, coupled to receive data and instructions from, and to transmit data and instructions to, a storage system, at least one input device, and at least one output device. The programmable system or computing system may include clients and servers. A client and server are generally remote from each other and typically interact through a communication network. The relationship of client and server arises by virtue of computer programs running on the respective computers and having a client-server relationship to each other.
[00129] These computer programs, which can also be referred to as programs, software, software applications, applications, components, or code, include machine instructions for a programmable processor, and can be implemented in a high-level procedural language, an object-oriented programming language, a functional programming language, a logical programming language, and/or in assembly/machine language. As used herein, the term “machine-readable medium” (or “computer readable medium”) refers to any computer program product, apparatus and/or device, such as for example magnetic discs, optical disks, memory, and Programmable Logic Devices (PLDs), used to provide machine instructions and/or data to a programmable processor, including a machine-readable medium that receives machine instructions as a machine-readable signal. The term “machine-readable signal” (or “computer readable signal”) refers to any signal used to provide machine instructions and/or data to a programmable processor. The machine-readable medium can store such machine instructions non-transitorily, such as for example as would a non-transient solid-state memory or a magnetic hard drive or any equivalent storage medium. The machine-readable medium can alternatively or additionally store such machine instructions in a transient manner, such as for example as would a processor cache or other random access memory associated with one or more physical processor cores.
[00130] To provide for interaction with a user, one or more aspects or features of the subject matter described herein can be implemented on a computer having a display device, such as for example a cathode ray tube (CRT) or a liquid crystal display (LCD) or a light emitting diode (LED) monitor for displaying information to the user and a keyboard and a pointing device, such as for example a mouse or a trackball, by which the user may provide input to the computer. Other kinds of devices can be used to provide for interaction with a user as well. For example, feedback provided to the user can be any form of sensory feedback, such as for example visual feedback, auditory feedback, or tactile feedback; and input from the user may be received in any form, including, but not limited to, acoustic, speech, or tactile input. Other possible input devices include, but are not limited to, touch screens or other touch-sensitive devices such as single or multi-point resistive or capacitive trackpads, voice recognition hardware and software, optical scanners, optical pointers, digital image capture devices and associated interpretation software, and the like.
[00131] In the descriptions above and in the claims, phrases such as “at least one of” or “one or more of” may occur followed by a conjunctive list of elements or features. The term “and/or” may also occur in a list of two or more elements or features. Unless otherwise implicitly or explicitly contradicted by the context in which it is used, such a phrase is intended to mean any of the listed elements or features individually or any of the recited elements or features in combination with any of the other recited elements or features. For example, the phrases “at least one of A and B;” “one or more of A and B;” and “A and/or B” are each intended to mean “A alone, B alone, or A and B together.” A similar interpretation is also intended for lists including three or more items. For example, the phrases “at least one of A, B, and C;” “one or more of A, B, and C;” and “A, B, and/or C” are each intended to mean “A alone, B alone, C alone, A and B together, A and C together, B and C together, or A and B and C together.” Use of the term “based on,” above and in the claims is intended to mean “based at least in part on,” such that an unrecited feature or element is also permissible.
[00132] The subject matter described herein can be embodied in systems, apparatus, methods, computer programs and/or articles depending on the desired configuration. Any methods or the logic flows depicted in the accompanying figures and/or described herein do not necessarily require the particular order shown, or sequential order, to achieve desirable results. The implementations set forth in the foregoing description do not represent all implementations consistent with the subject matter described herein. Instead, they are merely some examples consistent with aspects related to the described subject matter. Although a few variations have been described in detail above, other modifications or additions are possible. In particular, further features and/or variations can be provided in addition to those set forth herein. The implementations described above can be directed to various combinations and subcombinations of the disclosed features and/or combinations and subcombinations of further features noted above. Furthermore, above described advantages are not intended to limit the application of any issued claims to processes and structures accomplishing any or all of the advantages.
[00133] Additionally, section headings shall not limit or characterize the invention(s) set out in any claims that may issue from this disclosure. Further, the description of a technology in the “Background” is not to be construed as an admission that technology is prior art to any invention(s) in this disclosure. Neither is the “Summary” to be considered as a characterization of the invention(s) set forth in issued claims. Furthermore, any reference to this disclosure in general or use of the word “invention” in the singular is not intended to imply any limitation on the scope of the claims set forth below. Multiple inventions may be set forth according to the limitations of the multiple claims issuing from this disclosure, and such claims accordingly define the invention(s), and their equivalents, that are protected thereby.

Claims

CLAIMS WHAT IS CLAIMED IS:
1. A system for calibrating sensors connected to head-mounted displays, the system comprising: inward directed sensors, located at a head-mounted display, configured to track movement of one or more eyes; storage circuitry configured to store data for generating stimuli that are presented at respective locations of the head-mounted display; and control circuitry configured to: obtain a set of test locations for testing a visual field of a user, wherein each test location within the set of test locations corresponds to a visual field location of the visual field of the user; for each visual field location of the set of test locations: cause, during a visual field test, a stimulus to be presented on the head-mounted display at the visual field location; obtain, during the visual field test, eye-tracking data related to the stimulus, the eye-tracking data indicating that the user has seen the stimulus; detecting, based on the eye-tracking data, that an eye of the user moved toward but did not fixate on the visual field location; in response to detecting that the eye of the user moved toward but did not fixate on the visual field location:
(i) perform, during the visual field test, a calibration of the inward directed sensors, wherein the calibration comprises adjusting configuration of the inward directed sensors such that a location that the user fixated on matches the visual field location; and
(ii) store, during the visual field test, an indication that the user has seen the stimulus; and generate a visual defect assessment based on the indication.
2. The system of claim 1, wherein the calibration of the inward directed sensors is not performed in response to detecting that the eye of the user fixated on the visual field location.
3. A method comprising: causing, during a visual field test, a first stimulus to be presented on a user device at a first visual field location; obtaining, during the visual field test, first feedback data related to the first stimulus, the first feedback data indicating that a user has seen the first stimulus; detecting, based on the first feedback data, an eye of the user failing to fixate on the first visual field location corresponding to the first stimulus; performing, during the visual field test, a calibration of the user device based on detecting the eye of the user failing to fixate on the first visual field location; and storing, during the visual field test, a first indication that the user has seen the first stimulus.
4. The method of claim 3, further comprising: causing, during the visual field test, a second stimulus to be presented on the user device at a second visual field location; obtaining, during the visual field test, second feedback data related to the second stimulus; detecting, based on the second feedback data, that the eye of the user moved toward and fixated on the second visual field location; and storing, during the visual field test, based on the second feedback data, a second indication that the user has seen the second stimulus, wherein no calibration of the user device is performed in response to detecting that the eye of the user fixated on the second visual field location.
5. The method of claim 4, further comprising: generating a visual defect assessment based on the first indication and the second indication.
6. The method of claim 3, wherein the first feedback data comprises eye tracking data, and wherein detecting the eye and storing the first indication are based on the eye tracking data.
7. The method of claim 3, further comprising: determining that the first stimulus was seen by determining, based on the first feedback data, that a gaze location of the user is within a threshold distance of the first visual field location.
8. The method of claim 3, further comprising: determining that the first stimulus was seen by determining, based on the first feedback data, that a gaze location of the user moved a threshold amount towards the first visual field location.
9. The method of claim 3, wherein the calibration of the user device comprises adjusting one or more parameters of a function for detecting a location that the user is viewing.
10. The method of claim 3, wherein the user device comprises a head-mounted display.
11. One or more non-transitory, computer-readable media storing instructions that, when executed by one or more processors, cause operations comprising: causing, during a visual field test, a first stimulus to be presented on a user device at a first location; obtaining, during the visual field test, first feedback data related to the first stimulus presented at the first location; performing, during the visual field test, a calibration of the user device based on the first feedback data indicating that an eye of a user failed to fixate on the first location; and storing, during the visual field test, based on the first feedback data, a first indication that the user has seen the first stimulus.
12. The media of claim 11, the operations further comprising: causing, during the visual field test and subsequent to the calibration, a second stimulus to be presented on the user device at a second location; obtaining, during the visual field test, second feedback data related to the second stimulus; detecting, based on the second feedback data, that the eye of the user moved toward and fixated on the second location; and storing, during the visual field test, a second indication that the user has seen the second stimulus, wherein no calibration of the user device is performed in response to detecting that the eye of the user fixated on the second location.
13. The media of claim 12, the operations further comprising: generating a visual defect assessment based on the first feedback data and the second feedback data.
14. The media of claim 11, wherein the first feedback data comprises eye tracking data, and wherein detecting the eye and storing the first indication are based on the eye tracking data.
15. The media of claim 11, the operations further comprising: determining that the first stimulus was seen by determining, based on the first feedback data, that a gaze location of the user is within a threshold distance of the first location.
16. The media of claim 11, the operations further comprising: determining that the first stimulus was seen by determining, based on the first feedback data, that a gaze location of the user moved a threshold amount towards the first location.
17. The media of claim 11, wherein performing the calibration of the user device based on the first feedback data indicating that the eye of the user failed to fixate on the first location comprises determining that the eye of the user fixated on a different location that is within a threshold distance away from the first location.
18. The media of claim 17, the operations further comprising: determining that the user has not seen the first stimulus based on determining that the eye of the user fixated on the different location that is outside the threshold distance away from the first location.
19. The media of claim 17, wherein the calibration of the user device comprises adjusting one or more parameters of a function for detecting a location that the user is viewing.
20. The media of claim 11, wherein the user device comprises a head-mounted display.
21. A system for calibrating head-mounted displays utilizing a projective transform matrix, the system comprising: a head-mounted display; inward directed sensors, located at the head-mounted display, configured to track pupil movement; storage circuitry configured to store a plurality of icons that are displayed at respective locations in a visual field of the head-mounted display; and control circuitry configured to: generate for display a plurality of edge icons on the head-mounted display; receive edge eye tracking data during a plurality of edge calibration periods corresponding to the display of the plurality of edge icons; calculate the projective transform matrix based on the edge eye tracking data; generate for display a center icon on the head-mounted display at a center location; receive center eye tracking data during a center calibration period; apply the projective transform matrix to the center eye tracking data to determine a gaze location; and generate a calibration score based on a difference between the center location and the gaze location.
22. A method comprising: receiving edge eye tracking data during a plurality of edge calibration periods; calculating a projective transform matrix based on the edge eye tracking data; receiving center eye tracking data during a center calibration period; applying the projective transform matrix to the center eye tracking data to determine a gaze location; and generating a calibration score based on a difference between a center location and the gaze location.
23. The method of claim 22, further comprising: generating, for display, edge stimuli on the head-mounted display; and generating, for display, a center stimulus on the head-mounted display at the center location.
24. The method of claim 23, wherein the edge stimuli are generated on edges of a field of view of the head-mounted display.
25. The method of claim 24, wherein the edge stimuli are generated at corners of a field of view of the head-mounted display.
26. The method of claim 23, wherein the calibration score is indicative of the accuracy of an eye test performed with the head-mounted display.
27. The method of claim 23, further comprising: generating a boundary around the center stimulus; determining, based on the difference, whether the gaze location is inside the boundary; and determining the calibration score based on a size of the boundary.
28. The method of claim 27, further comprising displaying the boundary at the head mounted display.
29. The method of claim 27, further comprising: in response to the difference indicating that the gaze location is outside the boundary, repeating at least a portion of the calibration, wherein the size of the boundary is larger.
30. The method of claim 23, further comprising: determining whether a user is looking at the center location based on the gaze location; and in response to the determination that the user is not looking at the center location, repeating at least a portion of the calibration.
31. The method of claim 30, wherein the determination that the user is not looking at the center location requires that at least a portion of the center eye tracking data deviates from the gaze location more than a spatial deviation threshold and for longer than a temporal deviation threshold.
32. One or more non-transitory, computer-readable media storing instructions that, when executed by one or more processors, cause operations comprising: receiving edge eye tracking data during a plurality of edge calibration periods; calculating a projective transform matrix based on the edge eye tracking data; receiving center eye tracking data during a center calibration period; applying the projective transform matrix to the center eye tracking data to determine a gaze location; and generating a calibration score based on a difference between a center location and the gaze location.
33. The machine-readable medium of claim 32, further comprising: generating, for display, edge stimuli on the head-mounted display; and generating, for display, a center stimulus on the head-mounted display at the center location.
34. The media of claim 33, wherein the edge stimuli are generated on edges of a field of view of the head-mounted display.
35. The media of claim 34, wherein the edge stimuli are generated at corners of a field of view of the head-mounted display.
36. The media of claim 33, further comprising: generating a boundary around the center stimulus; determining, based on the difference, whether the gaze location is inside the boundary; and determining the calibration score based on a size of the boundary.
37. The media of claim 36, further comprising displaying the boundary at the head mounted display.
38. The media of claim 36, further comprising: in response to the difference indicating that the gaze location is outside the boundary, repeating at least a portion of the calibration, wherein the size of the boundary is larger.
39. The media of claim 33, further comprising: determining whether a user is looking at the center location based on the gaze location; and in response to the determination that the user is not looking at the center location, repeating at least a portion of the calibration.
40. The media of claim 39, wherein the determination that the user is not looking at the center location requires that at least a portion of the center eye tracking data deviates from the gaze location more than a spatial deviation threshold and for longer than a temporal deviation threshold.
41. A system for improving accuracy of visual field testing in head-mounted displays, the system comprising: a head-mounted display; a tilt sensor, located at the head-mounted display, for detecting degrees of head tilt of a user wearing the head-mounted display; storage circuitry configured to store a visual field testing pattern, wherein the visual field testing pattern comprises a plurality of icons that are displayed at respective locations in a visual field of the head-mounted display, and the respective locations of the plurality of icons are located in a row on the visual field, and wherein the respective locations correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface; and control circuitry configured to: generate for display the visual field testing pattern on the head-mounted display; determine, based on data retrieved from the tilt sensor, a degree of head tilt of the user; compare, the degree of head tilt of the user to a first threshold degree; and in response to the degree of head tilt of the user meeting or exceeding the first threshold degree, generate for display, on the head-mounted display, a recommendation to the user.
42. The system of claim 41, wherein the control circuitry is further configured to: compare the degree of head tilt of the user to a second threshold degree; and in response to the degree of head tilt of the user meeting or exceeding the second threshold degree, automatically adjust a respective location of an icon of the plurality of icons in the visual field of the head-mounted display by a first amount.
43. The system of claim 42, wherein the first amount is based on a distance of the icon from a centerpoint of the visual field of the head-mounted display and a direction of the head tilt of the user.
44. The system of claim 41, wherein the respective locations of the plurality of icons are located in a row on the visual field, and wherein the respective locations correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface.
45. The system of claim 41, wherein the control circuitry is further configured to: determine a curvature of the head-mounted display; and select the respective locations based on the curvature.
46. The system of claim 41, wherein the control circuitry is further configured to: determine an offset distance of the head-mounted display; and select the respective locations based on the offset distance.
47. A method comprising: retrieving, using control circuitry, a visual field testing pattern for a head-mounted display, wherein the visual field testing pattern comprises a plurality of icons that are displayed at respective locations in a visual field of the head-mounted display; generating for display the visual field testing pattern on the head-mounted display; retrieving data from a tilt sensor, located at the head-mounted display, for detecting degrees of head tilt of a user wearing the head-mounted display; determining, based on the data retrieved from the tilt sensor, a degree of head tilt of the user; comparing, using the control circuitry, the degree of head tilt of the user to a first threshold degree; and in response to the degree of head tilt of the user meeting or exceeding the first threshold degree, generating for display, on the head-mounted display, a recommendation to the user.
48. The method of claim 47, further comprising: comparing the degree of head tilt of the user to a second threshold degree; and in response to the degree of head tilt of the user meeting or exceeding the second threshold degree, automatically adjusting a respective location of an icon of the plurality of icons in the visual field of the head-mounted display by a first amount.
49. The method of claim 48, wherein the first amount is based on a distance of the icon from a centerpoint of the visual field of the head-mounted display and a direction of the head tilt of the user.
50. The method of claim 47, wherein the respective location of the icon is defined by a first directional component and a second directional component, and wherein the first directional component is adjusted by a cosine of the degree of head tilt of the user and the second directional component is adjusted by a sine of the degree of head tilt of the user.
51. The method of claim 47, wherein the respective locations of the plurality of icons are located in a row on the visual field, and wherein the respective locations correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface.
52. The method of claim 41, wherein the respective locations are determined based on an offset distance of the head-mounted display and an angle to respective points on a visual testing machine.
53. The method of claim 42, wherein the respective locations are determined based on an expression a = b·tan(θ), where a is one of the respective locations, b is the offset distance, and θ is the angle.
54. One or more non-transitory, computer-readable media storing instructions that, when executed by one or more processors, cause operations comprising: retrieving a visual field testing pattern for a head-mounted display, wherein the visual field testing pattern comprises a plurality of icons that are displayed at respective locations in a visual field of the head-mounted display; generating for display the visual field testing pattern on the head-mounted display; retrieving data from a tilt sensor, located at the head-mounted display, for detecting degrees of head tilt of a user wearing the head-mounted display; determining, based on the data retrieved from the tilt sensor, a degree of head tilt of the user; comparing the degree of head tilt of the user to a first threshold degree; and in response to the degree of head tilt of the user meeting or exceeding the first threshold degree, generating for display, on the head-mounted display, a recommendation to the user.
55. The media of claim 54, further comprising: comparing the degree of head tilt of the user to a second threshold degree; and
in response to the degree of head tilt of the user meeting or exceeding the second threshold degree, automatically adjusting a respective location of an icon of the plurality of icons in the visual field of the head-mounted display by a first amount.
56. The media of claim 55, wherein the first amount is based on a distance of the icon from a centerpoint of the visual field of the head-mounted display and a direction of the head tilt of the user.
57. The media of claim 54, wherein the respective location of the icon is defined by a first directional component and a second directional component, and wherein the first directional component is adjusted by a cosine of the degree of head tilt of the user and the second directional component is adjusted by a sine of the degree of head tilt of the user.
58. The media of claim 54, wherein the respective locations of the plurality of icons are located in a row on the visual field, and wherein the respective locations correspond to respective projections of points corresponding to different viewing angles along a curved surface onto a flat surface.
59. The media of claim 58, wherein the respective locations are determined based on an offset distance of the head-mounted display and an angle to respective points on the visual testing machine.
60. The media of claim 59, wherein the respective locations are determined based on an expression a = b·tan(θ), where a is one of the respective locations, b is the offset distance, and θ is the angle.
PCT/US2021/054228 2020-10-28 2021-10-08 Systems and methods for visual field testing in head-mounted displays WO2022093521A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2023526382A JP2023550699A (en) 2020-10-28 2021-10-08 System and method for visual field testing in head-mounted displays
EP21887174.7A EP4236755A1 (en) 2020-10-28 2021-10-08 Systems and methods for visual field testing in head-mounted displays

Applications Claiming Priority (8)

Application Number Priority Date Filing Date Title
US17/082,983 2020-10-28
US17/082,983 US10993612B1 (en) 2020-10-28 2020-10-28 Systems and methods for visual field testing in head-mounted displays
US17/246,054 US20220125294A1 (en) 2020-10-28 2021-04-30 Systems and methods for visual field testing in head-mounted displays
US17/246,054 2021-04-30
US17/392,723 2021-08-03
US17/392,723 US20220125298A1 (en) 2020-10-28 2021-08-03 Active calibration of head-mounted displays
US17/392,664 US20220125297A1 (en) 2020-10-28 2021-08-03 Device calibration via a projective transform matrix
US17/392,664 2021-08-03

Publications (1)

Publication Number Publication Date
WO2022093521A1 true WO2022093521A1 (en) 2022-05-05

Family

ID=81384341

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2021/054228 WO2022093521A1 (en) 2020-10-28 2021-10-08 Systems and methods for visual field testing in head-mounted displays

Country Status (3)

Country Link
EP (1) EP4236755A1 (en)
JP (1) JP2023550699A (en)
WO (1) WO2022093521A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150062323A1 (en) * 2013-09-03 2015-03-05 Tobii Technology AB Portable eye tracking device
US20170307880A1 (en) * 2014-10-21 2017-10-26 Philips Lighting Holding B.V. System, method and computer program for hands-free configuration of a luminous distribution
WO2019067779A1 (en) * 2017-09-27 2019-04-04 University Of Miami Digital therapeutic corrective spectacles
US20190150727A1 (en) * 2017-11-14 2019-05-23 Vivid Vision, Inc. Systems and methods for vision assessment
US10531795B1 (en) * 2017-09-27 2020-01-14 University Of Miami Vision defect determination via a dynamic eye-characteristic-based fixation point


Also Published As

Publication number Publication date
EP4236755A1 (en) 2023-09-06
JP2023550699A (en) 2023-12-05

Similar Documents

Publication Publication Date Title
CN110573061B (en) Ophthalmologic examination method and apparatus
CN105163649B (en) Computerization dioptric and astigmatism determine
US20190246889A1 (en) Method of determining an eye parameter of a user of a display device
US9323075B2 (en) System for the measurement of the interpupillary distance using a device equipped with a screen and a camera
US9545202B2 (en) Device and method for measuring objective ocular refraction and at least one geometric-morphological parameter of an individual
JP2021502130A (en) Orthodontic glasses for digital treatment
US9664929B2 (en) Method for determining at least one head posture characteristic of a person wearing spectacles
JP6049750B2 (en) Luminance-dependent adjustment of spectacle lenses
US9671617B2 (en) Method for estimating a distance separating a pair of glasses and an eye of the wearer of the pair of glasses
US11448903B2 (en) Method for correcting centering parameters and/or an axial position and corresponding computer program and methods
CN107003752A (en) Information processor, information processing method and program
JP2019194702A (en) Method of determining at least one optical design parameter of progression ophthalmic lens
CN112689470B (en) Method for performing a test of the power of a scattered light using a computing device and corresponding computing device
JP2019215688A (en) Visual line measuring device, visual line measurement method and visual line measurement program for performing automatic calibration
ES2932157T3 (en) Determination of a refractive error of an eye
CN111699432A (en) Method for determining the endurance of an eye using an immersive system and electronic device therefor
JP2007003923A (en) Spectacle wearing parameter measuring apparatus, spectacle lens, and spectacles
US20220125297A1 (en) Device calibration via a projective transform matrix
US20220125294A1 (en) Systems and methods for visual field testing in head-mounted displays
CN113138664A (en) Eyeball tracking system and method based on light field perception
US20220125298A1 (en) Active calibration of head-mounted displays
EP4236755A1 (en) Systems and methods for visual field testing in head-mounted displays
US20230255473A1 (en) Integrated apparatus for visual function testing and method thereof
CN110414302A (en) Contactless interpupillary distance measurement method and system
EP3801193B1 (en) System and method for automatic torsion correction in diagnostic ophthalmic measurements

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21887174

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 2023526382

Country of ref document: JP

NENP Non-entry into the national phase

Ref country code: DE

ENP Entry into the national phase

Ref document number: 2021887174

Country of ref document: EP

Effective date: 20230530