AU2017402745A1 - Visual performance assessment - Google Patents

Visual performance assessment

Info

Publication number
AU2017402745A1
Authority
AU
Australia
Prior art keywords
game
subject
visual
user
performance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
AU2017402745A
Other versions
AU2017402745B2 (en)
Inventor
Dinesh Visva GUNASEKERAN
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Visre Pte Ltd
Original Assignee
Visre Pte Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Visre Pte Ltd filed Critical Visre Pte Ltd
Publication of AU2017402745A1 publication Critical patent/AU2017402745A1/en
Assigned to VISRE PTE. LTD. reassignment VISRE PTE. LTD. Request for Assignment Assignors: Gunasekeran, Dinesh
Application granted granted Critical
Publication of AU2017402745B2 publication Critical patent/AU2017402745B2/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical


Classifications

    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/212 Input arrangements for video game devices characterised by their sensors, purposes or types using sensors worn by the player, e.g. for measuring heart beat or leg activity
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/02 Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B3/028 Subjective types, i.e. testing apparatus requiring the active assistance of the patient for testing visual acuity; for determination of refraction, e.g. phoropters
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/211 Input arrangements for video game devices characterised by their sensors, purposes or types using inertial sensors, e.g. accelerometers or gyroscopes
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/20 Input arrangements for video game devices
    • A63F13/21 Input arrangements for video game devices characterised by their sensors, purposes or types
    • A63F13/213 Input arrangements for video game devices characterised by their sensors, purposes or types comprising photodetecting means, e.g. cameras, photodiodes or infrared cells
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/45 Controlling the progress of the video game
    • A63F13/46 Computing the game score
    • A HUMAN NECESSITIES
    • A63 SPORTS; GAMES; AMUSEMENTS
    • A63F CARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F13/00 Video games, i.e. games using an electronically generated display having two or more dimensions
    • A63F13/60 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor
    • A63F13/67 Generating or modifying game content before or while executing the game program, e.g. authoring tools specially adapted for game development or game-integrated level editor adaptively or by learning from player actions, e.g. skill level adjustment or by storing successful combat sequences for re-use

Landscapes

  • Engineering & Computer Science (AREA)
  • Multimedia (AREA)
  • Human Computer Interaction (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Biophysics (AREA)
  • General Health & Medical Sciences (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Surgery (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • Public Health (AREA)
  • Veterinary Medicine (AREA)
  • Physics & Mathematics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Cardiology (AREA)
  • Theoretical Computer Science (AREA)
  • Eye Examination Apparatus (AREA)
  • Information Retrieval, Db Structures And Fs Structures Therefor (AREA)

Abstract

Social remote eye screening and monitoring gamification through integration of modular assessments of visual function into the objectives of interactive games, to encourage compliance with testing instructions and thereby facilitate improved fidelity, frequent remote reassessment/trending, and/or automated interpretation. These can be developed on any platform, including virtual and/or 3-dimensional gaming platforms, which include immersive and deep-dive gaming technology platforms such as virtual reality, augmented reality or mixed reality. This novel process can also be deployed on any future platforms that involve headsets or image projections that physically surround users with virtual stimuli.

Description

VISUAL PERFORMANCE ASSESSMENT
Field of the invention
The invention relates to the general field of determining, assessing or monitoring the visual performance of a subject or of subjects. Embodiments relate to a method, a system, a game, a computer program product (or computer program) and a device employed in this field. Non-limiting embodiments relate to utilising customised gamification and design of immersive games on platforms such as virtual reality, augmented reality, and/or mixed reality.
Background of the invention
Assessments of visual function may include a large number of tests and assessment processes. Amongst the many checks that are currently used are visual acuity, peripheral visual field function, macular function, central visual field function, contrast discrimination, and colour discrimination.
Existing eye screening and/or vision assessments involve the presentation of various stimuli (such as Ishihara charts for colour discrimination assessment) by a technical expert such as an ophthalmologist or optical technician. Current techniques require the involvement of such a technical expert for interpretation and ensuring patient compliance to test instructions, as compliance may be a requirement for accurate interpretation of tests.
Despite this, test-retest variability has been described in numerous clinic-based assessments particularly in patients with eye diseases. This has prompted a recommendation for frequent reassessments over time. These may not be practical given the logistical considerations of having patients frequently present to medical facilities for repeated reassessments.
Innovators have tried to develop wearable versions of the same clinic-based assessments in head-mounted equipment to improve portability of machinery. However, drawbacks in terms of the need for technical expertise to facilitate compliance to test protocol persist.
WO 2018/164636
PCT/SG2017/050407
Taking the assessment of peripheral visual field function as an example, existing visual field assessments can be voided or rendered uninterpretable because of the difficulty of enforcing instructions, particularly for the youngest and oldest users. Such instructions might include maintaining the central visual axis on a centrally-located stimulus presented during assessments of the peripheral visual field. Assessment of peripheral vision function is currently done either manually or through automation using perimetry machines.
Manual assessment of peripheral vision is achieved by the technical expert (for example an ophthalmologist, or an eye specialist doctor) instructing the patient to focus their vision on the technical expert’s nose, while the technical expert presents a wiggling finger in each of the four quadrants of the patient’s peripheral visual fields one eye at a time, moving inwards towards the centre of the patient’s vision. The patient then has to tell the technical expert once they are able to see the wiggling finger, and the expert makes a note of his rough gauge of the patient’s visual field function. Each eye is tested in turn. This provides a gross assessment of the scope of a patient’s peripheral vision that may be done in a clinic setting without much spatial or equipment requirement, but is manpower intensive, low fidelity and difficult to reproduce or standardize.
Automated assessment of peripheral vision is achieved with Visual Field Perimetry Testing Equipment that attempts to focus the user’s visual attention to the central visual axis through the use of a distracting centrally-located visual stimulus. Assessment is subsequently achieved by intermittent presentation of a second stimulus (usually blinking light of varying brightness) in the peripheral visual field. Users acknowledge having caught sight of the second stimulus either verbally or manually such as by pressing a button. The test becomes invalid or uninterpretable for peripheral vision if the user shifts their visual attention and axis towards the second light instead of maintaining it centrally. This is because the second blinking light will instead be in the user’s new central visual axis and the exercise will no longer provide assessment of their peripheral vision i.e. peripheral visual field function.
These existing processes present difficulties in compelling patients to comply with the instructions at every step, often leading to voided, uninterpretable or inconsistent results. Patients often also lack motivation to perform at their best, which can lead to inaccurate and inconsistent results. Finally, patients may notice patterns in these tests that lead to falsely good performance results. For instance, visual stimuli in perimetry are often presented at a regular and predictable frequency or rhythm. Patients sometimes admit to pressing the “acknowledgement” button at a regular rhythm once they notice this, regardless of whether they actually saw the visual stimulus presented to their peripheral vision.
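The rhythm-gaming problem described above suggests a simple countermeasure: present stimuli on a jittered, unpredictable schedule and mix in blank "catch" trials so that rhythmic button-pressing shows up as false positives. A minimal sketch in Python; the function name, parameter values and catch-trial rate are illustrative assumptions, not taken from the patent:

```python
import random

def jittered_schedule(n_trials, base_interval_s=2.0, jitter_s=1.5,
                      catch_rate=0.1, seed=None):
    """Return a list of (interval_seconds, is_catch_trial) pairs.

    Intervals are randomised around a base value so the patient cannot
    learn a presentation rhythm; catch trials present no stimulus, so a
    patient pressing the button blindly produces measurable false alarms.
    """
    rng = random.Random(seed)
    schedule = []
    for _ in range(n_trials):
        interval = base_interval_s + rng.uniform(-jitter_s, jitter_s)
        # Clamp to a sensible minimum so two stimuli never overlap.
        schedule.append((max(0.5, interval), rng.random() < catch_rate))
    return schedule
```

Responses during catch trials can then be counted against the test's reliability indices, much as false-positive rates are reported in standard automated perimetry.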
It would be desirable to avoid or mitigate at least some of the above mentioned difficulties and problems.
The use of virtual or 3-dimensional gaming platforms in healthcare has gradually increased, primarily in the assessment and treatment of psychiatric conditions. Platforms in this context include headsets that cover an extensive area of the visual field and that present different virtual images to each eye to facilitate dichoptic assessments. Prior art discusses the potential role of gaming platforms for treating eye diseases. It also discusses the potential role of gaming platforms for assessing the eye movements and/or neuropsychological functions such as attention or reaction time of patients with eye diseases, based on assessment of their eye movements when presented with virtual re-enactments of everyday activities.
Summary of the invention
There is disclosed a technique of assessing the visual performance of a subject in which gamification of at least one of the tests is used. Gamification in this document indicates incorporation of at least one of the characterizing features of a game into a process. Such features may include one or more of scoring, competitiveness, providing rewards and providing rules. The function of this is to enhance test accuracy/fidelity, to better engage the user to improve participation, and to increase the user's cooperation with the rules of visual function assessment, facilitating valid results and assessments.
There is disclosed the embedding of visual functional assessment(s) and their validity requirements in games to enhance test fidelity and patient compliance to testing instructions. This can standardise the quality of testing procedures, improve the reproducibility and validity of tests, lower manpower requirements for conducting tests,
and may additionally reduce time/resource wastage resulting from the need to repeat invalid test results.
In embodiments, the games are immersive games that embed assessments in dynamic situations or tasks to provide novel measures of visual function. The novel measures are based on users' in-game performance (virtual user information) as well as their physical responses (physical user information) to in-game events/tasks, whereby these games can be deployed on platforms such as VR/AR/mixed reality.
This contrasts with the static, repetitive assessments available today (such as sitting in a chair and reading letters off a visual acuity chart with one eye covered), which may be prone to being gamed: a patient doing perimetry may notice the rhythm, or a patient doing visual acuity testing may peek out under the eye cover to use the good eye to aid the bad eye. The latter is avoided with VR headsets that show each eye a controlled individual image, which the patient cannot game. Today's tests may not even reflect the patient's actual visual functional performance in real-world situations, for a variety of reasons (psychological, attentiveness, etc.).
Embodiments of the invention may replace or augment the measures available and in-use today.
In accordance with certain embodiments, gamification methods are used to engage patients and encourage them to perform as well as they can, to comply with recommended eye screening frequency, and to provide a control for remote assessments of patient visual function. This may be achieved through immediate/overall goals and the deployment of social gaming elements, which improve patient compliance with recommended eye screening frequency (e.g. screening once a month), improve the convenience of more frequent eye screening or visual function assessment (e.g. weekly) by addressing logistical constraints, and provide natural controls for remote visual functional assessments by identifying aberrations in performance that arise from non-organic causes (such as lack of familiarity after a prolonged break, technical problems, lag, etc.).
There is also disclosed a method of assessing the visual performance of a subject, the method comprising providing a game to a subject whereby the subject interacts with the game, and using the results of the interaction to determine at least one aspect of visual performance.
There is further disclosed a system for assessing the visual performance of a subject, the system being configured to present at least the video output of a game to the subject, whereby the subject interacts with the game, the system being configured to use the results of the interaction to determine at least one aspect of visual performance of the subject.
In accordance with some embodiments, a system for assessing the visual performance of a subject is configured to present at least a video output of a game to the subject, whereby the subject interacts with the game, and to provide data indicative of the user’s reaction to game play whereby at least one aspect of visual performance of the subject may be determined.
There is still further disclosed a device configured to provide signals representing at least the video of a game for viewing by a subject, the device further configured to receive information concerning interaction of a subject with the game, whereby the subject’s visual performance may be assessed.
There is disclosed yet further a game configured, when a video from the game is presented to a subject who interacts with the video, to allow assessment of the visual performance of the subject. The assessment may be performed by a device connected to receive information concerning the subject’s interaction with the game. The assessment may be performed by a device receiving information concerning one or more of the movements and the positions adopted by a subject playing the game.
The game may be a purpose-written game. The game play may be focused on or written specifically for the assessment.
In another setup, existing tests are gamified by the inclusion of game-like features, for example one or more of scoring, competitiveness, providing rewards and providing rules.
In yet a further setup a pre-existing game is adapted to have additional features enabling optical or visual assessments to be made on a player.
The method may include gathering one or more of physical and virtual user information and interpretation of the user information.
Physical user information may include multiple domains such as gaze and pupillary tracking.
Physical user information may include one or more of positional and rotational tracking of the user’s entire body and individual body parts such as the head.
The method may comprise gathering physical user information using position or orientation sensors.
The method may comprise gathering user information using brain-computer interfaces.
The game may include an aim and shoot mechanic at a centrally located visual stimulus. This may maintain a user’s vision in the central visual axis.
The game may include a dodging or defending game mechanic based on secondary stimuli from various spatial regions, for example in the peripheral visual field.
The game may include one or more of performance tracking, performance registration, score tracking, score interpretation and correlation to the user’s functional capacity for specific visual function domains.
The game may include automated gradual increments in game difficulty or difficulty tailored to individual user performance. This may be achieved through edge analytics with
machine learning. Immediate experiences and goals that are incorporated include winning or completing a game module and achieving a high score. These provide immediate engagement and motivation for users to perform as best they can during assessment.
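The performance-tailored difficulty increments described here can be approximated, even without any machine learning, by a classic psychophysical staircase. A hedged Python sketch (the function and parameter names are mine, not from the patent): difficulty rises after each success and falls faster after each miss, which keeps the player close to their performance threshold:

```python
def next_difficulty(level, succeeded, step_up=1, step_down=2, lo=1, hi=100):
    """Return the difficulty level for the next trial.

    Rising slowly on success and falling faster on failure converges on a
    level the player passes most of the time, i.e. near their threshold.
    The level is clamped to the playable range [lo, hi].
    """
    level = level + step_up if succeeded else level - step_down
    return max(lo, min(hi, level))
```

A learned model, as suggested by the text's "edge analytics with machine learning", could replace this fixed rule while keeping the same interface.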
The game may include overall experiences and goals. These may include progressing through multiple stages and collecting enhancements to upgrade a personal gaming profile or avatar. Such techniques involve the user in the game itself and thus facilitate long-term compliance and engagement.
In embodiments, the game may be made available as, say, a computer program or a computer program product as may be made available for download from, say, an online store/marketplace.
Brief description of the drawing
Fig 1 shows a block schematic view of a visual/optical assessment system embodying the invention.
Description of embodiments
The figure illustrates one embodiment of a system for assessing the visual performance of a subject, the system being configured to present at least the video output of a game to the subject, whereby the subject interacts with the game, the system being configured to use the results of the interaction to determine at least one aspect of visual performance of the subject.
Referring to Figure 1, a visual/optical assessment system 100 has a computer device 101, a virtual-reality type headset device 103 and a user-actuated game control device 105. An example of a headset is the HTC Vive. The headset device 103 is designed to be worn by a user and has a display system 109 with a left display 109a for the user’s left eye and a right display 109b for the user’s right eye, and first and second sensors 111, 113 connected to supply signals 112, 114 to the computer device 101. The display system is designed to place video information in the visual field of a user wearing the headset, as discussed below.
The computer device 101 is configured to provide signals representing at least the video of the game for viewing by a subject, e.g. a user wearing a headset. The computer device is further configured to receive information concerning interaction of a subject with the game, whereby the subject’s visual performance may be assessed.
The game is configured, when a video from the game is presented to a subject who interacts with the video, to allow assessment of the visual performance of the subject. The assessment may be performed by a device, for example the computer device 101 connected to receive information concerning the interaction with the game. The assessment may be performed by a device, e.g. the computer device 101 or a separate connected device, receiving information concerning the movements and/or the positions adopted by a subject playing the game. The assessment may be performed by a technical expert.
First sensor(s) 111 are configured to supply data 112 allowing the user’s pupils or gaze to be tracked. Second sensor(s) 113 are configured to supply data 114 allowing head movements of the user to be tracked.
The headset device 103 also has an external camera 119 to supply signals to the computer device 101 to enable a mixed or enhanced reality experience to be displayed. In one embodiment of this, virtual images are overlaid onto a virtual re-enactment of the user’s actual surroundings on an opaque display screen 109. In another embodiment of this, virtual images are overlaid onto the user’s actual surroundings as seen by the user through a transparent or translucent display screen 109.
In this embodiment, the display is configurable and controllable by signals 122 from the computer device 101 to the headset 103, to provide a three-dimensional image by using both displays 109a and 109b to present one version of a virtual image perceived by each eye independently. The computer device 101 may also cause the headset 103 to provide a different image separately to each eye of a user. It can supply no image via display 109b to the right eye, and an image via left display 109a to the left eye or vice-versa, or provide different images to each eye. It may also provide identical images to each eye. In another embodiment, displays 109a and 109b are combined in a single display system 109 which
provides a single combined image perceived by both eyes. In another embodiment, the display system 109 is a transparent screen on which virtual images are displayed, overlaid on the actual surroundings that the user sees through the transparent display system 109.
In another embodiment, the display system 109 is not contained within a headset but is presented in a hand-held mobile device or a standing/wall-mounted screen, with sensors 111 and 113 embedded or provided separately.
Also in this embodiment the computer device 101 outputs the audio of a game via the headset. Some embodiments have audio output via a separate external device, audio output from the computer device 101, or no audio output at all (for example, some mixed reality displays).
The computer device 101 in an embodiment stores in memory the instructions for displaying the game using the headset 103, and receives back from the headset, from the various sensors and the game control device 105, signals 112, 114, 121 that cause game play to progress in different ways according to the nature and content of those signals.
The computer device 101 is arranged, in this embodiment, to process the pupil and movement data signals 112, 114 received from the headset 103 and the signals 121 from user actuation of the game controller 105, extracting from them data that is either directly indicative of the user’s visual performance or can be analysed and interpreted to provide a measure of the user’s visual performance.
Data in the computer device 101 is either automatically interpreted by an embedded analytic program within the architecture of the assessment system 100, relayed via the internet and/or cloud architecture to a remote analytic program, relayed via the internet and/or cloud architecture to a remote expert who interprets the data, or some combination of the above.
In the present context “game” means any set of images intending to involve a user in play. The game in one embodiment is a game written with the purpose of carrying out optical/visual assessments. In another embodiment, an existing game is modified with
additional features enabling optical or visual assessments to be made on a player. In yet another embodiment a set of conventional tests/assessments is gamified by adding at least one of the characterizing features of a game, including for example one or more of scoring, competitiveness, providing rewards and providing rules.
In the present embodiment, the game involves manual interaction via the user-actuated control device 105. Also in this embodiment, interaction between a user and the game can be detected through one or more of: movements of the user’s body or body parts; body motion extracted by image analysis software from images of the user’s real movements recorded by camera 119; or eye/pupil movement captured by the gaze/pupil tracking sensor 111.
In a further embodiment, brain-computer interfacing is used. To that end, the system 100, in some embodiments, also has electroencephalograph monitoring devices connected to supply inputs to the computer device 101 to enable involvement by the brain to be determined.
Three-dimensional gaming platforms, in some set-ups, project virtual images that can be accurately varied in terms of distance, size, and appearance from the user’s eye. These enable reproducible assessments of visual functions such as visual acuity testing when integrated within games. This helps to overcome forms of test-retest variability that can be attributed to operator dependency in existing manual assessment processes, such as accurate placement of visual acuity charts at required fixed distances from individuals being assessed.
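The reproducibility claim rests on simple geometry: a stimulus subtends a fixed visual angle when its rendered size is matched to its virtual distance. A short Python sketch, assuming a flat display calibrated in pixels per metre; the function name is mine, and the 5 arcmin default is the standard overall angular size of a 6/6 (20/20) optotype:

```python
import math

def optotype_size_px(distance_m, px_per_m, visual_angle_arcmin=5.0):
    """Pixel size a stimulus must have to subtend the given visual angle
    at the given (virtual) viewing distance.

    size = 2 * d * tan(theta / 2), converted from metres to pixels.
    """
    angle_rad = math.radians(visual_angle_arcmin / 60.0)
    size_m = 2.0 * distance_m * math.tan(angle_rad / 2.0)
    return size_m * px_per_m
```

At a simulated 6 m a 6/6 optotype comes out at roughly 8.7 mm of physical size, so the platform can re-render it exactly for any virtual distance rather than relying on an operator placing a chart at a measured distance.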
There are several embodiments of the method. In one embodiment, each visual assessment is carried out by a respective single game, so one game per test. In another embodiment plural assessments are carried out by a single game, for example a game having different “levels”, each “level” including a different test, or a multifaceted game that integrates assessment of plural visual functions. In a third embodiment, all visual assessments are carried out by means of a single game.
In use, in one embodiment, the computer device 101 displays on the user’s headset 103 a game in which the user’s main objective is scoring points by using two controllers 105a
and 105b. Controller 105a is used to toss a virtual object through a virtual basketball hoop presented in the central visual axis, while 105b is simultaneously used to defend against virtual objects thrown at the user, such as balls in a game of dodgeball, projected from the various spatial regions of the visual field by virtual “members of an opposing team”.
Based on the user’s responsiveness to random stimuli in all spatial regions of the peripheral visual field, performance can be assessed accordingly, e.g. peripheral visual field function is assessed by the user appropriately pressing a “defend” button on 105b whenever a stimulus is presented. For a user who has a normal score in all peripheral fields but gradually loses the ability to respond to stimuli from a specific area such as the inferior nasal field, this might represent gradual development or progression of an ocular condition affecting the integrity of the user’s visual function in that area of the peripheral visual field, such as glaucoma. Throughout, the patient’s central visual axis and attention are maintained in the central visual field, as required for peripheral visual field assessments, by the simultaneous task of tossing virtual objects through a virtual basketball hoop presented in the central visual axis, with scoring tabulated accordingly. Sensor 111 and/or 113 acts as an additional stop-guard, with outputs 112 and/or 114 interpreted by the computer device 101, which modifies the game to stimulate compliance with instructions. In one embodiment, this may involve pausing the game or deducting from the user’s score whenever pupil movement or head movement corresponding to a shift in the patient’s central visual axis is detected.
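The per-quadrant scoring and the sensor stop-guard can be combined in a single scoring pass: void any trial in which the tracked gaze strayed from the central axis, then tally detection rates per peripheral region, so a developing deficit (e.g. in the inferior nasal field) appears as a falling rate over repeated sessions. A Python sketch, where the trial record layout and the 2-degree threshold are illustrative assumptions:

```python
from collections import defaultdict

def score_by_region(trials, max_gaze_dev_deg=2.0):
    """Detection rate per peripheral region, over compliant trials only.

    Each trial is (region, detected, gaze_deviation_deg). Trials in which
    the gaze left the central axis are voided rather than mis-scored,
    mirroring the stop-guard role of the headset sensors.
    """
    hits = defaultdict(int)
    valid = defaultdict(int)
    for region, detected, gaze_dev in trials:
        if gaze_dev > max_gaze_dev_deg:
            continue  # non-compliant trial: discard
        valid[region] += 1
        if detected:
            hits[region] += 1
    return {region: hits[region] / valid[region] for region in valid}
```

Trending these per-region rates across sessions is what turns a single game score into the longitudinal monitoring signal the text describes.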
In another embodiment there is a game in which a user traverses a minefield by hopping through a plain grass patch, in order to assess the visual function of contrast sensitivity. If the virtual user is standing in the middle of a 3x3 grid of imaginary squares, the user would have eight movement options at any one time (front-left, front, front-right, left, right, back-left, back, back-right) to progress through the field. The safe path is represented by square blocks of land area that have a differing contrast from the other blocks. If the user chooses the right movement option corresponding to the square with the contrast difference, he survives and progresses one step. If he chooses the wrong option, a mine explodes and he dies, for example. By varying the contrast difference between the safe path option and the other death options to increase difficulty as the
user progresses through the minefield, the user's visual function or contrast sensitivity can be tracked or assessed based on how far they manage to progress through the minefield each time they play this game.
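The minefield mechanic reduces to one contrast-discrimination trial per step: eight candidate squares share a base luminance, and the safe square differs by a contrast delta that shrinks as the player advances, so the furthest level reached estimates the contrast threshold. A minimal Python sketch with luminances in arbitrary 0-1 units; all constants and names are illustrative assumptions:

```python
import random

def minefield_step(level, base_luminance=0.5, start_delta=0.4,
                   decay=0.9, rng=None):
    """Return (square_luminances, safe_index, delta) for one 3x3 step.

    The eight movement options share a base luminance; the safe square
    differs by a contrast delta that decays geometrically with level,
    making each successive step harder to discriminate.
    """
    rng = rng or random.Random()
    delta = start_delta * (decay ** level)
    safe_index = rng.randrange(8)
    squares = [base_luminance] * 8
    squares[safe_index] = base_luminance + delta
    return squares, safe_index, delta
```

The smallest delta at which the player still reliably picks the safe square is the quantity being estimated; logging it per session gives the trend data the text aims for.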
In other embodiments, gaming platforms provide augmented or mixed reality to present an enhanced capacity to assess visual performance through gamification.
In use in another embodiment, the computer device 101 causes the headset 103 to project to the user mixed reality or augmented reality images on a single opaque display 109 or on opaque independent displays 109a and 109b, with game elements layered over the real world as picked up by the camera 119. In another embodiment, the computer device 101 causes the headset 103 to project to the user mixed reality or augmented reality images on a transparent display 109, with game elements layered over the real world as viewed by the user through the transparent display 109. Augmented and mixed realities can provide added convenience and ease of home-based assessments. In augmented reality or mixed reality embodiments, the data of the game may be local to the headset 103 (e.g. from a games console) or may be communicated from a remote provider, for example via the Internet and/or cloud architecture. Furthermore, augmented and mixed reality games enable new visual assessments, such as assessing visual attentiveness to virtual images projected in the user’s field of vision through augmented reality, while users continue to carry out their regular daily activities and remain productive during the time spent on assessments.
Playing a game or interacting with a virtual interface during everyday tasks for visual function assessment is achieved by augmented reality. This may involve a headset such as Google Glass, allowing a user to wear it as he goes about his daily activities. During rest periods or waiting time, he can select an option to play a game that provides assessment of at least one visual function.
For instance, “aliens” may appear at the periphery of his vision, requiring hand signals from him to prompt his avatar (the version of him in the game) to shoot the aliens. His hand signals may be detected by the head-mounted camera 119.
Another application more embedded in daily tasks is a driving assessment, moved from the usual real-world setting on roads to an empty green field in which the user drives while augmented reality simulations are presented, such as traffic junctions, an oncoming vehicle, a pedestrian dashing across the road, or a yellow traffic signal. The user's fitness to drive is then assessed automatically according to their responsiveness to the visual stimuli, based on physical feedback (e.g. the user swerving the steering wheel as detected by motion sensors, or the user slowing down as detected by motion sensors on the brake or an embedded speedometer). This allows assessment without a need to endanger lives when hazardous drivers are allowed on the road for their driving tests. It may serve as a first line to screen out incompetent drivers before they even take a real-world driving test, or as a way to perform repeated assessments of a user's fitness to drive as they grow older or develop eye diseases that affect their visual function.
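The automatic fitness-to-drive scoring described above might be sketched as follows. This is a hypothetical illustration only: the function name, the event format, and the 1.5-second reaction cutoff are assumptions made for the sketch, not values from the description.

```python
def fitness_to_drive(hazard_events, max_reaction_s=1.5):
    """hazard_events: list of (hazard_name, reaction_time_seconds) pairs,
    where the reaction time is None if the driver never responded to the
    simulated hazard (e.g. never braked or swerved).  Returns a verdict
    plus the list of hazards the driver missed or reacted to too slowly."""
    missed = [name for name, t in hazard_events
              if t is None or t > max_reaction_s]
    return ("fail", missed) if missed else ("pass", [])
```

In a deployed system the reaction times would come from the motion sensors mentioned in the text (steering wheel, brake, speedometer) rather than being supplied directly.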
Virtual user information includes multiple domains such as user interaction with the virtual space, user response to virtual stimuli, and user performance in modular tasks in the virtual space. This information is gathered into the computer device 101 from the sensors 111, 113 and the controller 105. Virtual user information is gathered based on performance in gamified assessments, or based on performance in straightforward virtual tasks and exercises that integrate existing eye screening and/or visual assessment exercises or stations into overall tasks within games. Gathering, processing, and interpretation of certain virtual user information by the computer device 101 may require interpretation of results, play progress and outcome in concert with physical user information.
User information is interpreted as raw independent data or in correlation with in-game user performance and interaction data. Data can be both cross-sectional and collected over time from multiple gaming experiences and modules.
Cross-sectional refers to the study of users at a specific point in time, i.e. analysing the data of the user drawn at a specific point in time (such as with machine-learning edge analytics), as opposed to analysing the data by trending the user's performance in the various games over time.
Data is collected in terms of a single-user assessment and/or performance in a single-user setting, single-user assessment and/or performance in a multi-user setting, or multi-user assessment or performance in a multi-user setting.
For an example of single-user assessment and performance in a multi-user setting, data is compared between users in multi-player collaborative games to exclude technical interference causing any change in performance.
As an example of single-user assessment in a multi-user setting, consider a grandchild and grandparent playing an interactive 2-player game with a shared goal that assesses their individual visual function based on their individual performance in that game. For example, each is playing the same simulated game, with the aim of surviving as many waves of aliens as possible. Based on their ability to perceive visual stimuli (aliens) and respond to them (shoot) in each area of their own visual fields, this game provides an assessment of each individual's visual function based on that individual's performance (shooting aliens appearing anywhere in his visual field) within the game.
As with any activity, games have a learning element that can lead to improved performance over time when the game is played frequently. Conversely, when a game is played frequently, stopped, and then restarted months later, there may be a drop in the user's performance. In the context of visual assessments, this could be wrongly interpreted as the development of eye disease. To avoid that, tracking the second individual (the grandchild) provides a control that can explain such performance variation not caused by organic disease development or progression, as both individuals would show a dip in performance after a protracted period of not playing.
The second player thus serves as a dedicated control subject for any assessment of the user's visual performance.
By moving the tests or assessments into a game environment, technical problems of the testing regime can be overcome, for example by monitoring user information and pausing the game or deducting points whenever “instructions” are not followed. The use of the game
with the user engaged in the game will of course lead to improved compliance with the requirements of the underlying test or tests, without the need for spoken or written instructions.
Compliance with the requirements of the assessment can then be achieved without the need for a technical expert to be present, for example someone constantly reminding users of the instructions required to maintain the technical interpretability of results. Furthermore, users can be better motivated to perform at their best in each and every assessment through elements of the game play, as further explained below.
Performing the assessments in the game environment can help improve the convenience and fidelity of eye screening and vision assessment and overcome technical difficulties in certain existing methods of visual assessments.
To illustrate this, consider the technical difficulties in performing visual field assessments. The user's visual axis and attention are focused where required through the primary objective or mission of a game, with the necessary assessments incorporated as concurrent side objectives or secondary goals to be achieved within the same game. This process can be further enhanced by the integration of gaze, pupillary, and/or head movement tracking as an added measure to re-confirm and track user compliance with the requirements of the specific visual assessment.
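The gaze-tracking compliance check described above might be sketched as follows. This is an illustrative assumption, not part of the described embodiment: gaze samples are expressed as angular offsets from the central visual axis, and a run whose fraction of on-axis samples falls below some cutoff can be marked uninterpretable.

```python
def fixation_compliance(gaze_samples, tolerance_deg=2.0):
    """gaze_samples: list of (x_deg, y_deg) gaze offsets from the
    central visual axis.  Returns the fraction of samples that fall
    within `tolerance_deg` of the axis; a peripheral-field test run
    with a low fraction can be flagged as non-compliant and repeated."""
    inside = sum(1 for x, y in gaze_samples
                 if (x * x + y * y) ** 0.5 <= tolerance_deg)
    return inside / len(gaze_samples)
```

A caller might, for instance, discard any assessment run where this fraction drops below 0.9 (a hypothetical cutoff).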
These represent improvements over existing eye screening and/or vision assessments due to improvements in testing fidelity, reproducibility, portability, engagement/motivation, and compliance, as a result of embedding the tests in a game environment.
This can be done for instance via a virtual/augmented/mixed reality game with a combination of mechanics deployed.
For example, an aim-and-shoot mechanic directed at a centrally located visual stimulus can be employed to maintain the user's vision on the central visual axis. Performance in this objective can be used to track the user's visual axis, and thereby the interpretability/validity of the peripheral vision assessment. This will also help discern variations in user performance
that may not be due to changes in their visual function but instead might be attributed to increasing or decreasing familiarity with the game from having played it more or less often, respectively.
Subsequently, a dodging or defending game mechanic can be deployed based on secondary stimuli from various spatial regions of the peripheral visual field. Performance in this objective can be used to assess the peripheral visual fields of a user, by tracking the score stratified by the user's responsiveness to and/or perception of individual segments of the visual field spatial regions. “Stratified” means that the user's in-game performance measures (scores for various tasks or progress towards various in-game goals) are independently analysed to provide data on the user's visual function, broken down into performance measures for each of the individual facets of visual function.
This will facilitate identification, processing, and/or long-term tracking of visual field spatial regions that are not well perceived by the user. The same can be done for any other domain of visual function.
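The stratified per-region analysis might be sketched as follows. This is a minimal, hypothetical illustration: the region names, the quadrant mapping, and the event format are assumptions for the sketch, and a real perimetry-style analysis would use finer spatial segments.

```python
from collections import defaultdict

def region(x_deg, y_deg):
    """Map a stimulus position (in degrees of offset from the central
    visual axis) to a coarse named quadrant of the visual field."""
    vertical = "upper" if y_deg >= 0 else "lower"
    horizontal = "right" if x_deg >= 0 else "left"
    return f"{vertical}-{horizontal}"

def stratify(events):
    """events: iterable of (x_deg, y_deg, perceived) tuples, one per
    peripheral stimulus presented in the game.  Returns the detection
    rate per visual-field region -- the 'stratified' breakdown that
    reveals regions the user does not perceive well."""
    hits, totals = defaultdict(int), defaultdict(int)
    for x, y, perceived in events:
        q = region(x, y)
        totals[q] += 1
        hits[q] += bool(perceived)
    return {q: hits[q] / totals[q] for q in totals}
```

A persistently low rate in one region across sessions would then be the long-term tracking signal the text describes.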
In some embodiments the gaming experience is further enhanced and personalised through the incorporation of machine learning and artificial intelligence capabilities.
These capabilities facilitate personalised assessment of visual performance by varying the difficulty of a gaming module to find the exact maximum difficulty that the user is able to cope with using their visual function. Machine learning and artificial intelligence can also exclude the confounding effect that impaired visual performance in one area may have on other modules, by automatically varying the difficulty of modules involving the known impaired area of visual performance and assessing the impact on the user's performance in other modules. This also facilitates assessment of multiple areas of visual performance simultaneously through complex games in the same way.
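Finding "the exact maximum difficulty that the user is able to cope with" is what adaptive psychophysical procedures do; a simple 1-up/1-down staircase is one standard way to sketch it (the function names, step size, and reversal count here are illustrative assumptions, not part of the described embodiment):

```python
def staircase(respond, start=10.0, step=1.0, floor=0.0, ceiling=20.0,
              reversals_needed=6):
    """Simple 1-up/1-down adaptive staircase: raise the difficulty after
    each success and lower it after each failure, then estimate the
    user's limit as the mean difficulty over the final reversals.
    `respond(difficulty)` returns True if the user coped at that level
    (in the game, e.g. dodged or shot the stimulus in time)."""
    level, going_up, reversals = start, None, []
    while len(reversals) < reversals_needed:
        success = respond(level)
        if going_up is not None and success != going_up:
            reversals.append(level)   # direction changed: record a reversal
        going_up = success
        level = min(ceiling, level + step) if success else max(floor, level - step)
    return sum(reversals) / len(reversals)
```

With a deterministic responder whose true limit sits midway between two levels, the staircase oscillates around that limit and the reversal mean recovers it exactly.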
In other embodiments, a VR headset is not used; instead a screen is placed at a fixed or measurable distance from the user's eyes. However, a headset may enable better correlation of stimuli within aspects of the games to functional areas of the user's visual field, as well as improved coverage of the peripheral visual field, to provide more
immersive and engaging gamified assessments of visual function in order to boost compliance.
The assessments of each performance metric are individually embedded into the or each developed game so that they can be assessed by analysing the data collected in the backend about the user's performance in the or each game.
Performance can be monitored by the user's in-game performance (scores for various tasks/in-game goals) being analysed to provide data on the user's visual function, broken down into performance measures for each of the individual facets of visual function.
Users are studied at a specific point in time, i.e. by analysing the data of the user drawn at a specific point in time (such as with machine-learning edge analytics), as opposed to analysing the data by trending the user's performance in the various games over time, or through repeated trended assessments over time.
These can thereby serve as measures of functional impairment from various ocular diseases and can be investigated as potential markers of disease severity, especially after clinical validation. For instance, a gradual sustained decrease in a user's score for a test of the peripheral visual fields may serve as a marker to prompt referral to an ophthalmologist to exclude chronic ocular conditions that affect the peripheral visual field, such as glaucoma.
Such changes in visual function may often go undetected by users, depending on the visual demands of their daily tasks. For instance, an elderly user whose main daily visual tasks are watching television and reading the newspaper may not place sufficient demand on his peripheral visual field to notice deterioration at the onset of disease. Hence these games can serve to increase the visual demands on users, so as to detect deficiencies in visual function and bring them to users' attention earlier.
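The "gradual sustained decrease" trigger could be implemented as a trend test over session scores. The following is a hypothetical sketch (the function names, the slope cutoff of -1 score unit per session, and the minimum of 8 sessions are illustrative assumptions): fit a least-squares slope to the peripheral-field score against session index and flag a referral when the slope is sufficiently negative.

```python
def trend_slope(scores):
    """Least-squares slope of score against session index: the
    'trending over time' analysis, as opposed to cross-sectional."""
    n = len(scores)
    xs = range(n)
    mx = sum(xs) / n
    my = sum(scores) / n
    num = sum((x - mx) * (y - my) for x, y in zip(xs, scores))
    den = sum((x - mx) ** 2 for x in xs)
    return num / den

def should_refer(peripheral_scores, slope_cutoff=-1.0, min_sessions=8):
    """Flag a gradual sustained decrease in the peripheral-field score
    (e.g. to prompt an ophthalmology referral to exclude glaucoma),
    once enough sessions exist to make the trend meaningful."""
    if len(peripheral_scores) < min_sessions:
        return False
    return trend_slope(peripheral_scores) < slope_cutoff
```

Using a regression slope rather than comparing only the first and last sessions makes the flag robust to single-session noise, which matches the "gradual sustained" wording in the text.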
Elements of gamification are further incorporated into these tests to facilitate personalized assessment and improved long-term compliance with daily monitoring of vision.
These elements include features such as automated gradual increments in game or task difficulty tailored to individual user performance. Immediate experiences and goals that are incorporated include winning/completing a game module and achieving a high score, respectively. These provide immediate engagement and motivation for users to perform as best they can during assessment. Overall experiences and goals are also incorporated such as progressing through multiple stages and collecting enhancements to upgrade a personal gaming profile and/or avatar, respectively. These facilitate long-term compliance and engagement.
In an embodiment the computer device 101 includes a backend hardware and/or software system to interpret the raw data from the headset into understandable and actionable information. This is information that a lay person can make sense of when monitoring his condition remotely. In that case, regular involvement of a healthcare worker to interpret and explain the results is not required. Alternatively, the information can be transmitted to an interpreter (such as an ophthalmologist observing a dashboard with the performance of a list of users under his care) who can remotely monitor the user, receive prompts for assessment when a user's visual function decreases, and/or contact the affected user to assess further or instruct them to seek medical attention early.
Advantages
Social elements are built in for nationwide and hyper-local communities to facilitate friendly competition and further boost compliance through peer pressure. Examples are the display of registries of the highest scores for each gaming module and the best overall average scores. Performance in the games can be monitored as an inexpensive, portable/convenient, and reproducible measure of functional impairment from ocular diseases, and extrapolated as a novel marker of disease severity.
Through the gamification process there is built-in performance and/or score tracking, and the conduct and interpretation of tests are automated, with performance in these modular assessments representing a novel marker for the functional assessment of vision. Tests are layered into interactive games to foster intergenerational interaction between users and family members of all ages, reduce depression and/or loneliness amongst the elderly, and
improve compliance with visual function monitoring through gamification and intergenerational interaction.
Additional benefits include improved convenience to users due to potential home-based deployment of these gaming platforms, as well as the decreased spatial requirement when compared to physical deployment of equipment over distances (such as Snellen charts at a distance of 6 meters in the assessment of visual acuity). Benefits also include time savings, such as utilizing the clinic waiting time of patients for these tests while they await appointments, as well as a reduction in the strain on healthcare resources by decanting visual function assessments or tests to the community setting (in homes or gaming arcades).
Various embodiments of the invention have now been described. The invention is not, however, restricted to the described features but instead extends to the full scope of the appended claims.

Claims (17)

1. A technique for assessing the visual performance of a subject comprising gamification of at least one visual performance test.
2. A method of assessing the visual performance of a subject, the method comprising presenting a video game to a subject whereby the subject interacts with the game, and using the results of the interaction to determine at least one aspect of visual performance.
3. The method of claim 2 comprising interpretation of physical user information.
4. The method of claim 2 comprising interpretation of virtual user information.
5. The method of claim 3 wherein the physical user information includes at least one of gaze tracking and pupillary movement tracking.
6. The method of claim 3 wherein physical user information includes one or more of the set comprising positional tracking, rotational tracking of the user’s entire body and tracking individual body parts such as the head.
7. The method of claim 3 comprising gathering physical user information using position sensors.
8. The method of claim 3 comprising gathering physical user information from orientation sensors.
9. The method of claim 2, comprising gathering user information using brain-computer interfaces.
10. The method of claim 2 wherein the video game includes one or more of performance tracking and score tracking.
11. The method of claim 2 wherein the video game has automated gradual increments in task difficulty tailored to individual user performance.
12. A system for assessing the visual performance of a subject, the system being configured to present at least a video output of a game to the subject, whereby the subject interacts with the game, the system being configured to use the results of the interaction to determine at least one aspect of visual performance of the subject.
13. A system for assessing the visual performance of a subject, the system being configured to present at least a video output of a game to the subject, whereby the subject interacts with the game, the system being configured to provide data indicative of the user’s reaction to game play whereby at least one aspect of visual performance of the subject may be determined.
14. A device configured to provide signals representing at least the video of a game for viewing by a subject, the device further configured to receive information concerning interaction of a subject with the game, whereby the subject’s visual performance may be assessed.
15. A game configured, when a video from the game is presented to a subject who interacts with the video, to allow assessment of the visual performance of the subject.
16. A computer program product comprising instructions which, when executed by a computing device, cause the computing device to carry out the method of any of claims 1 to 11.
17. A computer program comprising instructions which, when executed by a computing device, cause the computing device to carry out the method of any of claims 1 to 11.
AU2017402745A 2017-03-04 2017-08-16 Visual performance assessment Active AU2017402745B2 (en)

Applications Claiming Priority (3)

Application Number Priority Date Filing Date Title
SG10201701755X 2017-03-04
PCT/SG2017/050407 WO2018164636A1 (en) 2017-03-04 2017-08-16 Visual performance assessment

Publications (2)

Publication Number Publication Date
AU2017402745A1 true AU2017402745A1 (en) 2019-09-12
AU2017402745B2 AU2017402745B2 (en) 2024-01-04

Family

ID=63448544

Family Applications (1)

Application Number Title Priority Date Filing Date
AU2017402745A Active AU2017402745B2 (en) 2017-03-04 2017-08-16 Visual performance assessment

Country Status (4)

Country Link
CN (1) CN110381811A (en)
AU (1) AU2017402745B2 (en)
SG (1) SG11201907590RA (en)
WO (1) WO2018164636A1 (en)


