WO2024013546A1 - Method for identifying a higher visual perceptual difficulty - Google Patents

Method for identifying a higher visual perceptual difficulty

Info

Publication number
WO2024013546A1
Authority
WO
WIPO (PCT)
Prior art keywords
user
visual
images
eye
orientation
Prior art date
Application number
PCT/IB2022/056438
Other languages
English (en)
Inventor
Nicola Jean MCDOWELL
Original Assignee
Mcdowell Nicola Jean
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Mcdowell Nicola Jean filed Critical Mcdowell Nicola Jean
Priority to PCT/IB2022/056438 priority Critical patent/WO2024013546A1/fr
Publication of WO2024013546A1 publication Critical patent/WO2024013546A1/fr

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/02 Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B 3/028 Subjective types for testing visual acuity; for determination of refraction, e.g. phoropters
    • A61B 3/032 Devices for presenting test symbols or characters, e.g. test chart projectors
    • A61B 3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113 Objective types for determining or recording eye movement
    • A61B 5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B 5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B 5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B 5/1113 Local tracking of patients, e.g. in a hospital or private home
    • A61B 5/1114 Tracking parts of the body
    • A61B 5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/162 Testing reaction times
    • A61B 5/163 Devices for psychotechnics by tracking eye movement, gaze, or pupil change
    • A61B 5/68 Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B 5/6887 Arrangements mounted on external non-worn devices, e.g. non-medical devices
    • A61B 5/6898 Portable consumer electronic devices, e.g. music players, telephones, tablet computers
    • A61B 5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235 Details of waveform analysis
    • A61B 5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B 5/7267 Classification involving training the classification device
    • A61B 2503/00 Evaluating a particular growth phase or type of persons or animals
    • A61B 2503/06 Children, e.g. for attention deficit diagnosis
    • A61B 2505/00 Evaluating, monitoring or diagnosing in the context of a particular type of medical care
    • A61B 2505/09 Rehabilitation or training
    • A61B 3/0016 Operational features thereof
    • A61B 3/0041 Operational features characterised by display arrangements
    • A61B 3/0058 Display arrangements for multiple images
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 3/00 Input arrangements for transferring data to be processed into a form capable of being handled by the computer; Output arrangements for transferring data from processing unit to output unit, e.g. interface arrangements
    • G06F 3/01 Input arrangements or combined input and output arrangements for interaction between user and computer
    • G06F 3/011 Arrangements for interaction with the human body, e.g. for user immersion in virtual reality
    • G06F 3/013 Eye tracking input arrangements
    • G16 INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16H HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H 20/00 ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
    • G16H 20/30 ICT relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
    • G16H 50/00 ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H 50/20 ICT for computer-aided diagnosis, e.g. based on medical expert systems
    • G16H 50/30 ICT for calculating health indices; for individual health risk assessment

Definitions

  • the present disclosure is generally related to systems and methods for determining a higher visual perceptual difficulty in an individual. More specifically, some embodiments of the present disclosure provide assessment tools to detect visual perceptual difficulties in children with good visual acuity.
  • BBVI: brain-based visual impairment
  • CVI: cerebral visual impairment
  • BBVI/CVI is an umbrella term for a myriad of different visual issues. These issues can be divided into two distinct areas of visual functioning - the basic visual functions and the higher, or perceptual functions.
  • the basic visual functions include visual acuity, contrast sensitivity, oculomotor control and visual fields and are all assessed as part of routine eye checks by optometrists or ophthalmologists.
  • the higher visual functions are not so easy to assess, resulting in most children with issues in this area going unrecognised.
  • VPD: visual perceptual difficulties
  • VP abilities can appear differently in real world situations compared with clinical settings. So, children can show good or normal VP abilities in a quiet, uncluttered clinic, but may struggle in a cluttered and noisy classroom.
  • CVI was previously diagnosed based on the severity of visual acuity loss, which limited the understanding of the condition. It is now clear that manifestations of this condition involve more than the occipital cortex, and CVI is associated with a spectrum of agnosias indicating the presence of higher visual function deficits (HVFDs); oculomotor abnormalities and secondary changes in the optic nerve have also been demonstrated.
  • CVI by consensus, is now termed Cerebral Visual Impairment, "a verifiable visual dysfunction which cannot be attributed to disorders of the anterior visual pathways or any potentially co-occurring ocular impairment" with HVFDs synonymous with visual perceptual difficulties.
  • Using visual acuity alone is likely to miss a large portion of children with HVFDs, and a diagnosis of CVI should be based on the combined presence of multiple factors, with reduced visual acuity being a contributory but not the defining criterion.
  • The detection of HVFDs is difficult, as young children with CVI cannot self-report, and older children are usually unaware of their HVFDs because they have not lost an ability: they never developed the function (they know not what they know not).
  • the presence of good visual acuity, which often precludes further investigation, the lack of readily available tools, and limited knowledge and understanding of the manifestations of CVI amongst clinicians and teachers add to the challenges of identifying HVFDs.
  • diagnosing HVFDs is essential since they can cause significant visual disability in everyday activities and especially in education, while visual acuity remains largely intact.
  • a computer-implemented method for assessing neurological impairment of a user comprising: providing via a display module on a device, a visual stimulus to the user; the visual stimulus comprising a group of images corresponding to a visual assessment task; wherein the group of images comprises one or more pairs of identical images and the rest being unidentical; detecting via a facial detection module, the face of the user performing the visual assessment task on the device; tracking via a tracking module, movement of each eye and/or pupil response of the user; determining via an orientation module, orientation of each eye and/or the pupil response of the user performing the visual assessment task on the device, and orientation of the device with respect to the user's face; visualizing in real-time via a visualization module, the user's eye movements relative to the visual stimulus corresponding to the visual assessment task, and movement of the device; and quantifying in real-time via a processor, oculomotor control of the user to assess the neurological impairment from the data obtained through the orientation and the visualization modules.
  • an assessment tool to detect visual perceptual difficulties of a user comprising: providing, a visual stimulus on a display module of a device, the visual stimulus comprising a group of images having one or more pairs of an identical image and the rest being unidentical images; performing on the device, a visual assessment task of visually pairing the identical images in the group of images by the user; detecting, the face of the user via a camera module, and tracking, movement of each eye and/or pupil response of the user; determining: (i) the position and orientation of each eye and/or the pupil response of the user, and (ii) the orientation of the device with respect to the user's face; visualizing, in 2-dimensional graphical form, a representation of the oculomotor function of the user from the eye movements relative to the visual stimulus and movement of the device; and detecting, in real-time, visual perceptual difficulties of the user from the position and orientation of each eye and/or pupil movement, the orientation of the device, and the 2-d graphical visualization.
  • an apparatus for quantifying neurological impairment of a user comprising: a display module comprising at least one screen configured to display a visual stimulus to the user; the visual stimulus comprising a group of images corresponding to a visual assessment task; wherein the group of images in the visual assessment task comprises one or more pairs of an identical image and the rest being unidentical; a facial detection module configured to detect the face of the user performing the visual assessment task; a tracking module configured to track movement of each eye and/or pupil response of the user; an orientation module configured to determine the position and orientation of each eye and/or pupil response of the user, and the orientation of the apparatus with respect to the user's face; a storage module configured to record and store images and data obtained from the tracking module, the display module and the orientation module; a visualization module configured to process the data and images obtained from the tracking and the orientation modules, and provide a 2-d visualization of the user's eye movements relative to the visual stimulus corresponding to the visual assessment task and movement of the device; and a processor unit configured to process the data and images obtained from the tracking and the orientation modules.
  • a visual perceptual difficulty assessment tool comprising: a display unit; a camera unit; a processor; and a computer readable medium having computer-executable instructions that, when executed by the processor, cause the processor to perform a visual assessment task comprising: causing the display unit to output a visual stimulus to a user; obtaining the user's response to the visual stimulus; causing the camera unit to detect the face, and track movement of each eye and/or pupil response of the user; obtaining stimulus data from the display unit and the tracking data from the camera unit; and quantifying a level of visual perceptual difficulty of the user based on the data obtained from the display unit and the camera unit.
  • identifying a higher visual perceptual difficulty in an individual is based on any two or more of (i) to (iii).
  • the tracking of the eye is performed by a camera.
  • the camera is a true depth camera.
  • the method tracks both eyes of the patient.
  • the method comprises two or more levels, wherein a subsequent level comprises more indicia than a previous level.
  • the method comprises 2, 3, 4, 5, or 6 levels of testing.
  • the method is carried out on an electronic device.
  • the electronic device has a display screen.
  • the display screen displays the eye movement of the patient.
  • the dwell time is greater than one standard deviation from the average dwell time for the normative range for the individual's age group.
  • the dwell time is greater than two standard deviations from the average dwell time for the normative range for the individual's age group.
  • the visual assessment task comprises visually pairing the identical images in the group of images by the user.
  • quantification of the oculomotor control further comprises determining the user's accuracy in visually pairing the identical images.
  • quantification of oculomotor control further comprises comparing the user's accuracy in visually pairing the identical images to a predetermined threshold accuracy.
  • the method further comprises repeating the visual assessment task if the determined accuracy of the user is lower than the predetermined threshold accuracy.
  • the method further comprises determining the time taken by the user in visually pairing all the identical images from the visual assessment task.
  • quantification of oculomotor control further comprises comparing the time taken by the user in pairing the identical images to a predetermined threshold time.
  • the method further comprises repeating the visual assessment task if the time taken by the user exceeds the predetermined threshold time.
  • quantification of the oculomotor control of the user further comprises: determining and analyzing any one or more of nystagmus, fixation, saccades and pursuit.
  • quantification of the oculomotor control of the user further comprises: measuring and/or detecting any plurality of the following: amplitude of a saccadic movement, maximum velocity of a saccadic movement, latency of a saccadic movement, accuracy of a saccadic movement, and/or direction of a saccadic movement.
  • the visual assessment task comprises a plurality of images with varying levels of visual field, contrast and color.
  • quantification of the oculomotor con-trol further comprises determining a dwell time based on the user's response to the visual stimulus.
  • the dwell time is determined based on a time taken by the user to visually match each identical image amongst the plurality of images.
  • quantification of the oculomotor con-trol further comprises determining a dwell time based on the user's response to the visual stimulus.
  • the method further comprises repeating the visual assessment task if the determined dwell time is more than a predetermined threshold dwell time.
  • pupil response comprises assessing the size of both pupils to determine how well the pupil responses are synchronized.
  • the method further comprises determining the latency of the pupil response and/or eye movement.
  • determining the orientation of the device comprises determining the pitch, yaw and roll of the device.
  • the visual stimulus comprises a group of images comprising one or more pairs of identical images with the rest being unidentical.
  • the user's response to the visual stimulus comprises visually pairing the identical images in the group of images.
  • the visual assessment task further comprises a user-specific visual rehabilitation program.
  • the user-specific visual rehabilitation program is developed based upon one or more of the following: the user's symptoms, an objective evaluation of the user's condition, the user's subjective complaints, the user's conditions, or the user's history.
  • Figure 1 is a flow diagram.
  • Figure 2 is a flow diagram.
  • Figure 3 is a flow diagram.
  • Figure 4 is a depiction of level 1 of the visual assessment task showing a single indicia pair.
  • Figure 5 is a depiction of eye tracking from an individual without a higher visual perceptual difficulty.
  • Figure 6 is a depiction of eye tracking from an individual with a higher visual perceptual difficulty.
  • CVI is a heterogenous disorder of brain-based visual impairment potentially resulting from brain injury or disruption of development of retrochiasmatic visual pathways and vision processing regions of brain, commonly occurring during gestation at or around birth.
  • CVI is typically diagnosed in a child with a suitable clinical history by observation of abnormal visually guided behaviors (i.e., behaviors that rely on normal visual function) that suggest abnormal brain development or brain damage. These behaviors can stem from HVFDs of visual processing with consequent perceptual deficits, in the presence of normal or near-normal visual acuity.
  • a visual perceptual difficulty assessment tool 100 comprising: a display unit 110; a camera unit 120; a processor 130; and a computer readable medium having computer-executable instructions that, when executed by the processor 130, cause the processor to perform a visual assessment task 140 comprising:
  • the visual stimulus comprises a group of images comprising one or more pairs of identical images and the rest being unidentical.
  • the visual assessment task further comprises visually pairing the identical images in the group of images by the user.
  • the visual assessment task comprises a plurality of instances/levels with an increasing number of identical and non-identical images in each instance.
  • instance one has four images comprising one identical pair, building up to twelve images with five identical pairs in instance five. Common to all the instances is a pair that doesn't always match.
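The level progression described above can be sketched as a small configuration helper. This is a minimal illustration, not the patent's implementation: only levels one (four images, one pair) and five (twelve images, five pairs) are stated in the source, so the counts for the intermediate levels are interpolated assumptions, and all names are hypothetical.

```python
import random

def level_config(level):
    """Configuration for one level of the pairing task.

    Level 1 has four images with one identical pair, building up to
    twelve images with five identical pairs at level 5; the counts for
    levels 2-4 are interpolated here and are an assumption.
    """
    n_images = 4 + 2 * (level - 1)       # 4, 6, 8, 10, 12
    n_pairs = level                      # 1 .. 5
    return {"level": level, "images": n_images,
            "pairs": n_pairs, "distractors": n_images - 2 * n_pairs}

def build_board(level, seed=None):
    """Shuffled list of image ids for a level; paired ids appear twice."""
    cfg = level_config(level)
    ids = [f"pair{i}" for i in range(cfg["pairs"]) for _ in range(2)]
    ids += [f"odd{i}" for i in range(cfg["distractors"])]
    rng = random.Random(seed)
    rng.shuffle(ids)
    return ids
```

A "pair that doesn't always match", as mentioned above, would be modelled by making one distractor visually similar to a paired image; that detail is omitted here.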
  • Described in Figure 2 is the step-wise implementation of the assessment tool algorithm according to one implementation.
  • the camera unit of the assessment tool utilizes the front-facing or user-facing TrueDepth camera system of a mobile device (e.g., a Tablet PC, a desktop PC or a mobile phone) to track the eye movements of the user while they complete each level of the assessment.
  • the user is presented with a screen asking the user to ensure the front-facing camera has a clear view of the user's face.
  • at step 203, another screen is displayed which informs the user if the system has successfully detected their face.
  • at the next step 204, while the eye tracking is running, the user is notified by an icon at the top of the screen if the system has been unsuccessful in detecting the user's face and eyes. Once the system has successfully re-detected the user's face the notification is dismissed. The task does not progress to the next step 205 until the successful detection of the user's face.
  • the user's face and position and the orientation of each eye are determined, in addition to the position and orientation of the physical device being used for the assessment task.
  • in some implementations, an augmented reality framework (e.g., ARKit or ARCore) and a 3d graphics framework (e.g., SceneKit or ViroCore) are used to make these determinations and to render the virtual geometry.
  • the processor is configured to draw a vector in virtual space from the anchor point of each eye in the direction in which the eye is orientated.
  • a plane is also drawn in the orientation of the device used for the assessment task to account for real time updates in yaw, roll, pitch as the user holds the device.
  • the processor detects if and where the two vectors intersect in this plane and finds the midpoint between these two intersecting points. This point of intersection is then recalculated continuously in real time.
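The vector-and-plane computation described above amounts to a standard ray-plane intersection per eye followed by a midpoint. The sketch below assumes all quantities live in one shared 3D coordinate space; function names are illustrative, not from the source.

```python
import numpy as np

def ray_plane_intersection(origin, direction, plane_point, plane_normal):
    """Point where a gaze ray meets the device plane, or None if parallel."""
    o, d = np.asarray(origin, float), np.asarray(direction, float)
    p0, n = np.asarray(plane_point, float), np.asarray(plane_normal, float)
    denom = d @ n
    if abs(denom) < 1e-9:          # ray parallel to the plane
        return None
    t = ((p0 - o) @ n) / denom
    return o + t * d

def gaze_point(left_eye, left_dir, right_eye, right_dir,
               plane_point, plane_normal):
    """Midpoint of the two per-eye intersections on the device plane."""
    pl = ray_plane_intersection(left_eye, left_dir, plane_point, plane_normal)
    pr = ray_plane_intersection(right_eye, right_dir, plane_point, plane_normal)
    if pl is None or pr is None:
        return None
    return (pl + pr) / 2.0
```

In the described system this would be re-evaluated every frame, with the plane pose updated from the device's yaw, roll and pitch.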
  • these data points are visualised in the form of a 2D graphical diagram for each level of the assessment task showing the relative size of the device to the user's eye movements.
  • These graphical diagrams are available to be viewed on a web application associated with the assessment task.
  • the raw eye tracking data including the x and y eye tracking positions, time stamps, level, and game configuration can be downloaded from a web application dashboard in the JSON file format. Other file formats may also be used in other implementations.
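A JSON export in that shape might be produced and consumed as follows. The exact schema of the downloaded file is not given in the source, so the field names below (`samples`, `gameConfiguration`, and so on) are assumptions for illustration.

```python
import json

def export_session(samples, level, config):
    """Bundle raw eye-tracking samples as a JSON string for download.

    `samples` is a sequence of (x, y, timestamp) tuples; the schema
    here is illustrative, not the patent's actual file format.
    """
    payload = {
        "level": level,
        "gameConfiguration": config,
        "samples": [{"x": x, "y": y, "timestamp": ts}
                    for (x, y, ts) in samples],
    }
    return json.dumps(payload)

def load_session(blob):
    """Recover the (x, y, timestamp) tuples from an exported blob."""
    data = json.loads(blob)
    return [(s["x"], s["y"], s["timestamp"]) for s in data["samples"]]
```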
  • the visual assessment task further comprises a user-specific visual rehabilitation program, wherein the user-specific visual rehabilitation program is developed based upon one or more of the following: the user's symptoms, an objective evaluation of the user's condition, the user's subjective complaints, the user's conditions, or the user's history.
  • When a user (typically a child, although in some instances the user could even be an adult) completes the visual assessment task using the shapes in monochrome or multicoloured formats (in some instances, other themes, such as dinosaurs, space and nature, could also be used), their time to completion of the task is compared with the determined threshold for their age group. If it is found that their time is slower than the threshold, they will be identified as needing further assessment for possible visual perceptual difficulties / issues with the higher visual functions.
  • the assessment task comprises the user performing visual pairing of the identical images amongst the group of images at each level of the task.
  • the processor is configured to calculate a percentage of accuracy in matching the identical images (e.g., cards, patterns or numbers, etc.).
  • if all the pairs are matched correctly, the accuracy score is calculated to be 100%. This score gets lower with each incorrect pairing of the images and/or if the pairs are left unmatched.
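One plausible way to compute such a score is sketched below. The source does not specify the penalty scheme, so the assumption here is that each incorrect pairing and each pair left unmatched counts against the total attempts made or required.

```python
def pairing_accuracy(total_pairs, correct, incorrect):
    """Percentage accuracy for one level of the pairing task.

    100% when every pair is matched with no errors; lower with each
    incorrect pairing and each unmatched pair. The exact formula is an
    assumption, not taken from the source.
    """
    unmatched = total_pairs - correct
    attempts = correct + incorrect + unmatched
    if attempts == 0:
        return 100.0
    return 100.0 * correct / attempts
```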
  • the objective threshold is based on the assessments of 270 children aged between 5 and 13. For example, if a user was inaccurate on three or more of the levels in the task, further assessments were conducted on those users to determine if they had visual perceptual difficulties / issues with the higher visual functions.
  • the processor is further configured to determine the dwell time of the user based on the user's response to the visual stimulus.
  • the dwell time is the time taken to match the first pair in each level. Based on the reporting from the study, users (children in this instance) with visual perceptual issues / issues with the higher visual functions had longer dwell times on level 5 than they did on level 1, whereas the children with typical vision had similar dwell times in levels 1 and 5.
  • Thresholds for the dwell times were established by calculating either the average dwell time across all five levels, the combined dwell time across all five levels, or the slowest dwell time in the 5 levels. Analysis is then conducted in the 3 age groupings of 5-7, 8-12 and 13-17 (other age groups above 17 may also be considered), and the threshold is determined based on 2 standard deviations from the norm. In some instances, comparing the dwell time at level one with the dwell time at level 5 and establishing thresholds based on the comparison is also performed. The users are then identified as needing further assessment if their dwell time analyses are over the threshold for their age group.
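The three per-user dwell-time summaries and the two-standard-deviation flagging rule described above could be computed as in this sketch. The normative data would come from the age-banded study results; function names are placeholders.

```python
from statistics import mean, stdev

def dwell_summaries(dwell_times):
    """Average, combined and slowest dwell time across the five levels."""
    return {"average": mean(dwell_times),
            "combined": sum(dwell_times),
            "slowest": max(dwell_times)}

def threshold(norm_values):
    """Flag threshold: two standard deviations above the normative mean."""
    return mean(norm_values) + 2 * stdev(norm_values)

def needs_followup(user_summary, norm_summaries):
    """True if any summary exceeds its age-band threshold.

    `norm_summaries` is a list of dwell_summaries() dicts for the
    user's age group (5-7, 8-12 or 13-17).
    """
    return any(user_summary[k] > threshold([n[k] for n in norm_summaries])
               for k in user_summary)
```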
  • the users are identified for further assessment if their performance meets the threshold in any one of the 3 variables above, including the overall time required to complete the task, pairing accuracy and the dwell times.
  • Figure 3 illustrates the implementation of the assessment task according to an embodiment of the disclosure. As is seen in Figure 3, at step 301, the overall time taken by the user to complete all the levels of the assessment task is compared to the predetermined threshold established for different age groups. At step 302, if it is determined that the overall time exceeds the threshold time taken to complete all the 5 levels, the user is advised to perform a further assessment task.
  • at step 303, the pairing accuracy (ability to match the pairs correctly at each level) of the user performing the assessment task is computed. If it is determined that the pairing accuracy is less than 100% in 3 or more levels, the user is prompted to perform a further assessment task as seen in step 304.
  • the dwell time is then computed and is compared against a threshold dwell time estimated from a group of users of different ages (5-7, 8-12, 13-17). Once the comparison is made based on the user's age within the established age groups, if it is determined that the time taken to visually match the first pair in each level is longer than the threshold, at step 306, the user is prompted to perform a further assessment task.
  • the user's eye tracking data is recorded and compared against a threshold of tracking data, typically comprising the number of eye movements and the area of the screen covered by the user during the assessment task. If it is found that the eye tracking data does not correlate with the threshold values, the user is then prompted at step 308 to perform a further assessment task.
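Taken together, steps 301 through 308 form a simple screening cascade. A minimal sketch follows, with all threshold values supplied by the caller (they come from the age-group norms) and a hypothetical `dwell_flag` input standing in for the per-level dwell-time analysis:

```python
def screen_user(overall_time, time_threshold,
                level_accuracies, dwell_flag, tracking_ok):
    """Step-wise screening cascade from Figure 3, as a minimal sketch.

    Returns the list of reasons for which the user should be referred
    for further assessment; an empty list means no flag was raised.
    """
    reasons = []
    # Steps 301/302: overall completion time vs. age-group threshold.
    if overall_time > time_threshold:
        reasons.append("overall time")
    # Steps 303/304: pairing accuracy below 100% on three or more levels.
    if sum(1 for a in level_accuracies if a < 100.0) >= 3:
        reasons.append("pairing accuracy")
    # Step 306: dwell time over threshold (computed upstream per level).
    if dwell_flag:
        reasons.append("dwell time")
    # Step 308: eye-tracking data outside the normative range.
    if not tracking_ok:
        reasons.append("eye tracking")
    return reasons
```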
  • the disclosure provides for a computer-implemented method for assessing neurological impairment of a user.
  • the term neurological impairment may encompass disorders such as CVI and HVFDs.
  • the method comprises: providing via a display module 101 on a device 102, a visual stimulus 103 to the user; the visual stimulus comprising a group of images corresponding to a visual assessment task 104; wherein the group of images comprises one or more pairs of identical images and the rest being unidentical;
  • the method may further comprise utilizing a facial detection module to detect the face of the user performing the visual assessment task on the device.
  • the facial detection module may comprise hardware (such as a camera or any video recording device configured to capture the face of a user) and software (such as the OpenCV facial recognition tool or the like).
  • the method may further comprise tracking movement of each eye and/ or pupil response of the user using a tracking module.
  • the tracking module may comprise software for estimating the eye movements and the pupil response of the user from the data captured from the hardware (e.g., a camera).
  • the orientation of each eye and/or the pupil response of the user performing the visual assessment task on the device, and the orientation of the device with respect to the user's face, is then computed.
  • the orientation of the device is computed based on the device's position in Cartesian space. In one instance, the orientation of the device is computed based on the data received from the onboard sensors (such as accelerometers or gyroscopes), including the pitch, yaw and roll of the device.
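Given the pitch, yaw and roll reported by the sensors, the orientation of the device plane used for the gaze intersection can be derived as a rotation of the screen normal. The axis convention (x = pitch, y = yaw, z = roll) and rotation order below are assumptions; the source does not specify them.

```python
import numpy as np

def rotation_matrix(pitch, yaw, roll):
    """Rotation built from pitch, yaw and roll (radians), applied as
    roll @ yaw @ pitch. The axis convention is an assumption."""
    cp, sp = np.cos(pitch), np.sin(pitch)
    cy, sy = np.cos(yaw), np.sin(yaw)
    cr, sr = np.cos(roll), np.sin(roll)
    rx = np.array([[1, 0, 0], [0, cp, -sp], [0, sp, cp]])   # pitch
    ry = np.array([[cy, 0, sy], [0, 1, 0], [-sy, 0, cy]])   # yaw
    rz = np.array([[cr, -sr, 0], [sr, cr, 0], [0, 0, 1]])   # roll
    return rz @ ry @ rx

def device_plane_normal(pitch, yaw, roll):
    """Normal of the device screen plane after rotation; the unrotated
    screen is taken to face the user along +z."""
    return rotation_matrix(pitch, yaw, roll) @ np.array([0.0, 0.0, 1.0])
```

This normal, together with a point on the screen, defines the plane into which the per-eye gaze vectors are intersected.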
  • the method may further comprise quantifying in real-time via the processor, oculomotor control of the user to assess the neurological impairment from the data obtained through the orientation and the visualization modules.
  • Oculomotor control in this instance may be visual acuity or visual response of the user.
  • the visual assessment task comprises visually pairing the identical images in the group of images by the user.
  • the quantification of the oculomotor control further comprises determining the user's accuracy in visually pairing the identical images.
  • Quantification of oculomotor control further comprises comparing the user's accuracy in visually pairing the identical images to a predetermined threshold accuracy. The visual assessment task is repeated if the determined accuracy of the user is lower than the predetermined threshold accuracy. Quantification of the oculomotor control of the user further comprises: measuring and/or detecting any plurality of the following: amplitude of a saccadic movement, maximum velocity of a saccadic movement, latency of a saccadic movement, accuracy of a saccadic movement, and/or direction of a saccadic movement.
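The saccade measurements listed above (amplitude, maximum velocity, latency, direction) can be estimated from the raw gaze samples. The sketch below uses a simple velocity-threshold detector, a common approach but not one the source prescribes; the threshold value and units are illustrative.

```python
import math

def saccade_metrics(samples, stimulus_onset):
    """Basic saccade descriptors from gaze samples of (t, x, y).

    A velocity-threshold detector with an illustrative threshold of
    30 units/s; returns None if no saccade is detected.
    """
    THRESH = 30.0
    velocities = []
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        v = math.hypot(x1 - x0, y1 - y0) / dt if dt > 0 else 0.0
        velocities.append((t0, v, (x0, y0), (x1, y1)))
    moving = [s for s in velocities if s[1] >= THRESH]
    if not moving:
        return None
    start, end = moving[0], moving[-1]
    (x0, y0), (x1, y1) = start[2], end[3]
    return {
        "amplitude": math.hypot(x1 - x0, y1 - y0),
        "peak_velocity": max(v for _, v, _, _ in moving),
        "latency": start[0] - stimulus_onset,   # saccade onset vs. stimulus
        "direction": math.degrees(math.atan2(y1 - y0, x1 - x0)),
    }
```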
  • the visual assessment task comprises a plurality of images with varying levels of visual field, contrast and color.
  • Tracking the pupil response comprises comparing the size of both pupils to determine how well the pupil responses are synchronized.
  • the method further comprises determining the latency of the pupil response and/or eye movement.
  • the visual assessment task further comprises a user-specific visual rehabilitation program.
  • the user-specific visual rehabilitation program is developed based upon one or more of the following: the user's symptoms, an objective evaluation of the user's condition, the user's subjective complaints, or the user's history.
  • the processing methods to which embodiments of the disclosure are applied may be produced in the form of a program executed on computers and may be stored in computer-readable recording media.
  • Multimedia data with the data structure according to the disclosure may also be stored in computer-readable recording media.
  • the computer-readable recording media include all kinds of storage devices and distributed storage devices that may store computer-readable data.
  • the computer-readable recording media may include, e.g., Blu-ray discs (BDs), universal serial bus (USB) drives, ROMs, PROMs, EPROMs, EEPROMs, RAMs, CD-ROMs, magnetic tapes, floppy disks, and optical data storage.
  • the computer-readable recording media may include media implemented in the form of carrier waves (e.g., transmissions over the Internet). Bitstreams generated by the encoding method may be stored in computer-readable recording media or be transmitted via a wired/wireless communication network.
  • the embodiments of the disclosure may be implemented as computer programs by program codes which may be executed on computers according to an embodiment of the disclosure.
  • the computer codes may be stored on a computer-readable carrier.
  • the device and method of controlling the device to which embodiments of the disclosure are applied may include digital devices.
  • the digital devices encompass all kinds or types of digital devices capable of performing at least one of transmission, reception, processing, and output of, e.g., data, content, or services.
  • Processing data, content, or services by a digital device includes encoding and/or decoding the data, content, or services.
  • Such a digital device may be paired or connected with another digital device or an external server via a wired/wireless network, transmitting or receiving data or, as necessary, converting data.
  • the digital devices may include, e.g., network computers, personal computers, or other standing devices or mobile or handheld devices, such as personal digital assistants (PDAs), smartphones, tablet PCs, or laptop computers.
  • the term wired/wireless network collectively refers to communication networks supporting various communication standards or protocols for data communication and/or mutual connection between digital devices or between a digital device and an external server.
  • Such wired/wireless networks may include communication networks currently supported or to be supported in the future, and communication protocols for such communication networks, and may be formed by, e.g., communication standards for wired connection, including USB (Universal Serial Bus), CVBS (Composite Video Banking Sync), component, S-video (analog), DVI (Digital Visual Interface), HDMI (High Definition Multimedia Interface), RGB, or D-SUB, and communication standards for wireless connection, including Bluetooth, RFID (Radio Frequency Identification), IrDA (Infrared Data Association), UWB (Ultra-Wideband), ZigBee, DLNA (Digital Living Network Alliance), WLAN (Wireless LAN) (Wi-Fi), Wibro (Wireless broadband), Wimax (World Interoperability for Microwave Access), HSDPA (High Speed Downlink Packet Access), LTE
  • an embodiment of the disclosure may be implemented as a module, procedure, or function performing the above-described functions or operations.
  • the software code may be stored in a memory and driven by a processor.
  • the memory may be positioned inside or outside the processor to exchange data with the processor by various known means.
  • FIG. 4 of the current disclosure illustrates several possible embodiments for a user to implement the disclosed visual impairment assessment systems and methods.
  • the disclosed embodiments are merely exemplary in nature, and the systems and methods may be employed in any suitable environment.
  • a number of different devices are available for a user to launch the visual assessment task, such as desktop computer 404, laptop computer 406, cell phone 412, smart phone, other portable device, television 410, tablet 414, virtual reality goggles 416 (e.g., Google Cardboard™, Oculus Rift™, Samsung VR Gear™, Sony's PlayStation VR®), projector screen 402, and other computing devices 408.
  • the device discussed here is a portable device having a processor and a network connection (through a computer and/or on its own). Many different types of devices may have these characteristics.
  • the discussion here involves the visual assessment program being implemented in a home setting, but it may include different types of devices and other venues, such as rehabilitation centers, hospitals or clinics, instead of a home setting.
  • a user may have an appointment with a doctor/therapist/rehabilitationist/analyst to analyze the patient's visual function or impairment of visual function caused by a traumatic brain injury or impairment caused during gestation.
  • the disclosed persons, such as an occupational expert or therapist, are exemplary in nature; other individuals, such as a doctor, may be qualified to evaluate the user, design the system and/or method, and/or implement the system or method.
  • the terms user and patient are used throughout the specification.
  • the user does not need to be formally diagnosed to qualify as a user of the system and/or method, as would be apparent to a skilled addressee in the art.
  • the user may be evaluated to develop a patient-specific visual rehabilitation program.
  • an occupational expert therapist may review the patient's case history at 502.
  • the patient's case history may include how the patient incurred the brain injury, e.g., accident, injury, fall, or birth defect.
  • the user may or may not further provide his or her own subjective complaints about his or her condition at 504.
  • the therapist may observe and/or conduct tests on the user to obtain objective findings of the patient's condition at 506 from the assessment task.
  • the tests may vary widely in complexity, from a simple visual inspection of the patient to detailed MRI scans of the patient's brain. Such tests may be carried out, for example, by the doctor, doctor's assistant, radiologist or user at the instruction of the therapist.
  • One test widely used in the evaluation of patients with traumatic brain injury is the Diopsys Nova-LX Visual Evoked Potential instrument.
  • the skilled addressee may understand that the tests need not be conducted compulsorily or at the time of the appointment, but may have occurred previously and been provided to the therapist at a later time.
  • the occupational expert may use one, two, or all of the listed elements: the patient's case history 502, patient's subjective complaints 504, or objective findings of the patient's condition 506 based on the analysis from the visual assessment task.
  • objective findings of the patient's conditions in addition to the visual assessment task and symptoms may be obtained through a number of methods, including but not limited to visual evoked potential (VEP) and electroencephalogram (EEG).
  • VEP visual evoked potential
  • EEG electroencephalogram
  • the therapist may utilize some or all of this information to design a patient-specific rehabilitation program at 508.
  • the therapist may present one or more images or stimuli to the patient and measure his or her reaction to the images prior to designing the final patient-specific visual rehabilitation sequences. Once the therapist designs the patient-specific visual rehabilitation sequences, the therapist creates the program at 510.
  • the identification of a higher visual perceptual difficulty in an individual can be carried out by assessing one or more endpoints based on the pairing of indicia. These are:
  • normative data can be compiled for adults and children against which the comparative assessment (from undertaking the visual assessment task) can be made to identify an individual with a higher visual perceptual difficulty.
  • the normative data can provide normative ranges and thresholds from which to make an assessment of an individual's performance in the testing. For example, a child's performance (to match indicia) can be assessed against normative data compiled for children to identify a higher visual perceptual difficulty. An adult's performance (to match indicia) can be assessed against normative data compiled for adults to identify a higher visual perceptual difficulty.
  • the visual assessment tasks to identify a higher visual perceptual difficulty in an individual are based on the individual matching a pair or pairs of indicia.
  • the visual assessment tasks may be divided into a series of levels, with each task level having a greater number of pairs of indicia to match.
  • Each level of assessment will include a number of unmatched indicia. For example, at the lowest level of assessment (i.e., level 1) the assessment may comprise one pair of indicia and two unmatched indicia.
  • a subsequent level may comprise two pairs of matched indicia and two or more unmatched indicia.
  • the levels may be as set out below.
  • the indicia can be a visual representation on an electronic screen.
  • the indicia may be presented as discrete indicia on the screen or as separate cards each displaying an indicium, for example.
  • the indicia can be of different shapes and colours, provided that matching indicia have the same colour and shape.
  • the indicia are displayed in monochrome.
  • the normative data provides thresholds for assessment.
  • the use of the method increases the dataset of assessment (i.e. more participants) which alters the thresholds for assessment as the number of participants increases. This can then lead to more accurate assessments as the thresholds for assessment are refined based on a growing dataset of normative and test subjects/individuals.
  • the method comprises a central data storage that comprises test data comprising results from individuals with and without visual perceptual difficulties. As individuals complete the visual assessment task, the results are added to the central data storage. This increases the size of the datasets that establish the normal thresholds for assessment. That is, the normal distribution curves of the data gain resolution as the dataset grows with each individual completing the task and adding their data to the central data storage.
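One way the central data storage could refine thresholds as results accumulate is an online mean/variance update (Welford's algorithm), sketched below. This is an illustrative implementation choice, not taken from the disclosure:

```python
class RunningNorms:
    """Incrementally maintained normative mean and standard deviation
    (Welford's online algorithm), so assessment thresholds can be
    refined each time a completed task is added to the data storage."""

    def __init__(self):
        self.n = 0
        self.mean = 0.0
        self.m2 = 0.0  # running sum of squared deviations from the mean

    def add(self, value):
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)

    def threshold(self, n_sd=2):
        """Mean plus n_sd sample standard deviations."""
        sd = (self.m2 / (self.n - 1)) ** 0.5 if self.n > 1 else 0.0
        return self.mean + n_sd * sd

norms = RunningNorms()
for seconds in [10, 12, 8, 11, 9]:  # illustrative completion times
    norms.add(seconds)
```

Because the mean and variance are updated in one pass, the thresholds are always current with respect to the full dataset without re-reading stored results.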
  • the time to complete describes the time that it takes the individual to complete the visual assessment task. For example, we have found that children with visual perceptual difficulties can take twice as long to match pairs across the five levels of the visual assessment task, where level one has four cards (one pair), building up to level five with 12 cards (five pairs), as shown in Table 1 above.
  • the thresholds may be adjusted based on the size of the normative dataset based on the resolution of the normal distribution curve.
  • the thresholds for assessment will be based on the particular set of indicia used for the assessment. For example, if the visual assessment task is based on a different set of indicia (e.g., multi-colour), then the assessment thresholds will be based on normative data for that particular set of indicia.

Pairing
  • Pairing relates to the time taken to match the first pair of indicia at each level of assessment. For each level, the method calculates a percentage of accuracy in matching the indicia. If the individual matches all the indicia correctly, the score is 100%. This percentage decreases with each incorrect pair and/or if pairs are left unmatched.
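A possible scoring rule consistent with the description above (100% when every pair is matched correctly, decreasing for incorrect attempts and unmatched pairs) is sketched below; the exact penalty scheme is an assumption, since the method only specifies that the score decreases with errors and unmatched pairs.

```python
def pairing_accuracy(total_pairs, correct_pairs, incorrect_attempts):
    """Percentage accuracy for one level: the share of correct matches
    among all attempts, where each unmatched pair also counts against
    the score. The weighting is illustrative only."""
    unmatched = total_pairs - correct_pairs
    attempts = correct_pairs + incorrect_attempts + unmatched
    return 100.0 * correct_pairs / attempts if attempts else 0.0
```

For example, an individual who matches all five pairs of level 5 with no errors scores 100%, while one wrong attempt or one pair left unmatched lowers the percentage.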
  • the thresholds for assessment become more precise.
  • the dwell time is the time taken to match the first pair in each level.
  • the individual may have a longer dwell time as the number of pairs of indicia increases. For example, from our analysis, children with visual perceptual issues have longer dwell times at level 5 than at level 1, whereas children with typical vision had similar dwell times at levels 1 and 5.
  • Thresholds for the dwell times may be established by calculating either
  • the analysis is conducted in two or more groupings of age.
  • the threshold is based on 1, 2, or 3 standard deviations from the norm. In one embodiment the threshold is based on 2 standard deviations from the norm.
  • the analysis of dwell time is conducted by comparing the dwell time at level one and the dwell time at a higher level and establishing thresholds based on that. For example, comparing between level 1 and level 5 (or the highest assessment level). Individuals may be identified as needing further assessment if their dwell time analysis is over the threshold for their age group.
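The dwell-time analysis described above (comparing, e.g., level 1 against level 5 and flagging individuals more than a chosen number of standard deviations outside the norm for their age group) might be sketched as follows. The normative values are invented for illustration:

```python
import statistics

def exceeds_norm(value, normative_values, n_sd=2):
    """True when value lies more than n_sd sample standard deviations
    above the mean of the normative data (2 SD in one embodiment)."""
    mean = statistics.mean(normative_values)
    sd = statistics.stdev(normative_values)
    return value > mean + n_sd * sd

# Ratio of level-5 to level-1 dwell time for one individual, compared
# against made-up normative ratios for the same age group.
normative_ratios = [1.0, 1.1, 0.9, 1.2, 0.8, 1.0]
individual_ratio = 4.2 / 1.4  # level-5 dwell / level-1 dwell (seconds)
flagged = exceeds_norm(individual_ratio, normative_ratios)
```

An individual flagged in this way would be identified as needing further assessment, per the method described above.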
  • FIG. 5 shows tracked eye movement in an individual without a higher visual perceptual difficulty.
  • FIG. 6 shows tracked eye movement in an individual with a higher visual perceptual difficulty.
  • the method uses machine learning to assess the eye movements and classify them as normal or outside the normal threshold. This can be based on building up a database of eye movements of normal individuals versus individuals with a higher visual perceptual difficulty as they complete the visual assessment task.
  • the assessor of the visual assessment task can add comments about what the individual's eyes were doing as they completed the visual assessment task. These comments may assist the computer to use machine learning to establish a database of eye movements.
  • eye tracking may be used to create an objective threshold for eye movement.
  • the method measures the overall distance travelled of the eyes over one or more of the levels of the visual assessment task.
  • a distance travelled that is higher than a threshold distance provided by normative data is indicative of a higher visual perceptual difficulty.
  • the threshold is based on 1, 2, or 3 standard deviations from the norm. In one embodiment the threshold is based on 2 standard deviations from the norm.
  • the method measures the number of eye movements over the visual assessment task.
  • a number of eye movements that is higher than a threshold number of eye movements provided by normative data is indicative of a higher visual perceptual difficulty.
  • the threshold is based on 1, 2, or 3 standard deviations from the norm. In one embodiment the threshold is based on 2 standard deviations from the norm.
  • the method measures the area of the screen covered by movement of the eyeball when tracked.
  • an area of screen covered by the movement of the eyeball that is larger than a threshold area provided by normative data is indicative of a higher visual perceptual difficulty.
  • the threshold is based on 1, 2, or 3 standard deviations from the norm. In one embodiment the threshold is based on 2 standard deviations from the norm.
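The three eye-tracking endpoints above (total distance travelled, number of eye movements, and screen area covered) can all be computed from the same tracked gaze path. An illustrative sketch, where the `fixation_radius` cutoff and the bounding-box area measure are assumptions rather than details from the disclosure:

```python
import math

def gaze_path_summary(points, fixation_radius=1.0):
    """Summarise a tracked gaze path (screen coordinates) as:
    total distance travelled, number of discrete eye movements
    (steps larger than fixation_radius), and the area of the
    axis-aligned bounding box covered by the gaze."""
    distance = 0.0
    movements = 0
    for (x0, y0), (x1, y1) in zip(points, points[1:]):
        step = math.hypot(x1 - x0, y1 - y0)
        distance += step
        if step > fixation_radius:
            movements += 1
    xs = [x for x, _ in points]
    ys = [y for _, y in points]
    area = (max(xs) - min(xs)) * (max(ys) - min(ys))
    return distance, movements, area
```

Each of the three summary values could then be compared against its normative threshold (for example, mean plus two standard deviations) as described above.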

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Public Health (AREA)
  • Medical Informatics (AREA)
  • Biomedical Technology (AREA)
  • General Health & Medical Sciences (AREA)
  • Physics & Mathematics (AREA)
  • Pathology (AREA)
  • Biophysics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Primary Health Care (AREA)
  • Epidemiology (AREA)
  • Databases & Information Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Psychiatry (AREA)
  • Ophthalmology & Optometry (AREA)
  • Artificial Intelligence (AREA)
  • Psychology (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Theoretical Computer Science (AREA)
  • General Engineering & Computer Science (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Physiology (AREA)
  • Educational Technology (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Multimedia (AREA)
  • Dentistry (AREA)
  • Signal Processing (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Mathematical Physics (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Fuzzy Systems (AREA)

Abstract

According to a first aspect, the invention relates to a method of identifying a higher visual perceptual difficulty in an individual, comprising: presenting four or more indicia, at least two of the indicia matching, to the individual; allowing the individual to match the matching indicia; tracking the movement of at least one eye of the individual while the indicia are being matched; and any one or more of a pairing accuracy of less than 100% per level, a dwell time slower than the normative range for the individual's age group to match the first pair of indicia of each level, or eye tracking above a normative threshold, indicating a higher visual perceptual difficulty.
PCT/IB2022/056438 2022-07-12 2022-07-12 Method of identifying a higher visual perceptual difficulty WO2024013546A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2022/056438 WO2024013546A1 (fr) 2022-07-12 2022-07-12 Method of identifying a higher visual perceptual difficulty


Publications (1)

Publication Number Publication Date
WO2024013546A1 true WO2024013546A1 (fr) 2024-01-18

Family

ID=89536075

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2022/056438 WO2024013546A1 (fr) 2022-07-12 2022-07-12 Method of identifying a higher visual perceptual difficulty

Country Status (1)

Country Link
WO (1) WO2024013546A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20100092929A1 (en) * 2008-10-14 2010-04-15 Ohio University Cognitive and Linguistic Assessment Using Eye Tracking
US20100183205A1 (en) * 2007-06-12 2010-07-22 Ernst Pfleger Method for perception measurement
US20110027766A1 (en) * 2009-08-03 2011-02-03 Nike, Inc. Unified Vision Testing And/Or Training
US20160242642A1 (en) * 2013-10-03 2016-08-25 Neuroscience Research Australia (Neura) Systems and methods for diagnosis and therapy of vision stability dysfunction
WO2018026858A1 (fr) * 2016-08-02 2018-02-08 New York University Methods and kits for assessing neurological function and localizing neurological lesions
US20190150732A1 (en) * 2012-03-26 2019-05-23 New York University Methods and kits for assessing central nervous system integrity


Similar Documents

Publication Publication Date Title
US10856733B2 (en) Methods and systems for testing aspects of vision
WO2019099572A1 (fr) Systems and methods for visual field analysis
Chen et al. Eye‐tracking‐aided digital system for strabismus diagnosis
WO2019098173A1 (fr) Cognitive dysfunction diagnosis apparatus and cognitive dysfunction diagnosis program
US20200073476A1 (en) Systems and methods for determining defects in visual field of a user
WO2013036629A2 (fr) System and methods for documenting and recording red pupillary reflex and corneal light reflex eye screening in babies and young children
Chan et al. Diagnostic performance of the ISNT rule for glaucoma based on the Heidelberg retinal tomograph
CN111863256B (zh) Mental disorder detection device based on impairment of visual cognitive function
Akbarali et al. Imaging‐Based uveitis surveillance in juvenile idiopathic arthritis: feasibility, acceptability, and diagnostic performance
Chung Size or spacing: Which limits letter recognition in people with age-related macular degeneration?
Denniss et al. Estimation of contrast sensitivity from fixational eye movements
McKee et al. The classification of amblyopia on the basis of visual and oculomotor performance
Arnold et al. Ellipsoid spectacle comparison of PlusoptiX, Retinomax and 2WIN autorefractors
Mesquita et al. A mhealth application for automated detection and diagnosis of strabismus
WO2024013546A1 (fr) Method of identifying a higher visual perceptual difficulty
US20240062378A1 (en) Quality control method and quality control system for data annotation on fundus image
Garcia et al. Automated pupillometer using edge detection in opencv for pupil size and reactivity assessment
Bobb-Semple et al. Validity of smartphone fundus photography in diagnosing diabetic retinopathy at Mbarara Regional Referral Hospital, South Western, Uganda
US11559199B2 (en) Visual contrast sensitivity cards for pediatric subjects
RU2357651C1 Method for computer-aided diagnosis of open-angle glaucoma
US20230284899A1 (en) Head-Mounted System for Measuring Ptosis
Takahashi et al. Novel scotoma detection method using time required for fixation to the random targets
Kumar et al. Diagnosis of Glaucoma Through the Analysis of Saccadic Eye Movements Employing Machine Learning Methods
RU2690917C1 Method for objective measurement of visual acuity (variants)
McGinnis et al. Free Communications, Poster Presentations: Concussion Visual Assessments

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22950989

Country of ref document: EP

Kind code of ref document: A1