EP3871145A1 - Systems and methods for evaluating contrast sensitivity and other visual metrics - Google Patents

Systems and methods for evaluating contrast sensitivity and other visual metrics

Info

Publication number
EP3871145A1
Authority
EP
European Patent Office
Prior art keywords
visual
stimulus
visual stimulus
display
evidence
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
EP19877442.4A
Other languages
German (de)
English (en)
Other versions
EP3871145A4 (fr)
Inventor
Glen PRUSKY
Scott William Joseph MOONEY
Nicholas Jeremy HILL
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Burke Neurological Institute
Original Assignee
Burke Neurological Institute
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Burke Neurological Institute filed Critical Burke Neurological Institute
Publication of EP3871145A1 publication Critical patent/EP3871145A1/fr
Publication of EP3871145A4 publication Critical patent/EP3871145A4/fr
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/02 Subjective types, i.e. testing apparatus requiring the active assistance of the patient
    • A61B3/022 Subjective types, i.e. testing apparatus requiring the active assistance of the patient, for testing contrast sensitivity
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/0091 Fixation targets for viewing direction
    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00 Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113 Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement

Definitions

  • the stimulus textures may be generated in advance of the Presentation/Response Step (140) or created “on the fly” during the Presentation/Response Step (140).
  • the stimuli may be windowed with a continuous function that reduces contrast as a function of radial distance from the center of the stimulus (e.g. a Hann window).
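The radial windowing described above can be sketched as follows. This is an illustrative implementation of a Hann-profile contrast envelope, not the patent's own code; the stimulus size is an arbitrary example.

```python
import numpy as np

def hann_windowed_contrast(size_px: int) -> np.ndarray:
    """Radial Hann window: a contrast envelope that falls smoothly from 1
    at the center of the stimulus to 0 at its circular edge."""
    half = size_px / 2.0
    ys, xs = np.mgrid[0:size_px, 0:size_px]
    # radial distance of each pixel center from the stimulus center
    r = np.hypot(xs - half + 0.5, ys - half + 0.5)
    # Hann profile cos^2(pi*r / (2*R)) inside the disc, zero outside
    return np.where(r <= half, np.cos(np.pi * r / (2 * half)) ** 2, 0.0)
```

Multiplying a noise texture by this envelope removes the hard circular edge that would otherwise introduce spurious high spatial frequencies.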
  • the stimulus size may vary depending on the spatial frequency of the stimulus texture, the number of sweeps that will be presented simultaneously, the size of the physical display, and the measurement needs of the particular test being performed.
  • a calibration translation vector may be updated independently for each corner’s stimulus using the same method as the one-point calibration mode.
  • the resulting four calibration vectors may be used to create an interpolated perspective mapping that may be used to correct all future gaze position data.
  • the Gaze Pre-Calibration Step (120) may be omitted if calibration is unfeasible for the participant or if the existing calibration of the eye tracker is sufficient.
  • Each sweep has an “evidence-of-visibility” score, which indicates the current strength of evidence that the participant can see the currently active stimulus in that sweep. While a sweep is active during the Presentation/Response Step (140) (i.e. its current stimulus is displayed), this evidence-of-visibility score is monitored and updated. The evidence-of-visibility scores are used to determine when the current stimulus should be modified or exchanged for the next stimulus in that sweep, and to calibrate the eye tracker.
  • the Presentation/Response Step (140) also uses a “global evidence” score, which is continuously monitored and updated. This score indicates whether the current stimuli of any active sweeps are visible to the participant, and informs the stopping criterion for the Presentation/Response Step (140). If only one sweep is active, the global evidence score may simply be the evidence-of-visibility score of that sweep. If more than one sweep is active, the global evidence score may be computed from the combination of evidence-of-visibility scores of one or more active sweeps, or it may be computed independently of any sweeps (e.g. by determining if the participant is looking away from the display).
  • Blinks may be detected using an algorithm that varies with the exact eye tracker hardware used during the application of the method, as different eye trackers may produce different “signature” responses to blinks. For example, blinks may be classified by detecting periods, no longer than a predetermined time (e.g. two seconds), in which no gaze data is reported by the eye tracker. Gaze samples immediately before and/or immediately after the detected blink (e.g. up to 0.5 seconds on each side) may be ignored to avoid the risk of using malformed gaze data during the blink.
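A minimal sketch of this kind of gap-based blink classifier, assuming the eye tracker reports a per-sample validity flag. The two-second gap limit and 0.5-second padding come from the examples above; everything else (function name, 10 Hz test data) is hypothetical.

```python
import numpy as np

def detect_blinks(timestamps, valid, max_gap=2.0, pad=0.5):
    """Return a boolean mask of gaze samples to keep.

    timestamps: sample times in seconds; valid: per-sample flag
    (False = no gaze reported). Runs of missing data no longer than
    `max_gap` seconds are treated as blinks, and samples within `pad`
    seconds on either side of each blink are also discarded."""
    t = np.asarray(timestamps, float)
    valid = np.asarray(valid, bool)
    keep = valid.copy()
    # collect runs of missing data first, then pad around blink-length runs
    runs, i, n = [], 0, len(valid)
    while i < n:
        if not valid[i]:
            j = i
            while j < n and not valid[j]:
                j += 1
            runs.append((i, j - 1))
            i = j
        else:
            i += 1
    for i0, i1 in runs:
        if t[i1] - t[i0] <= max_gap:           # blink-length gap
            lo, hi = t[i0] - pad, t[i1] + pad  # discard padded neighbors
            keep[(t >= lo) & (t <= hi)] = False
    return keep
```

Gaps longer than `max_gap` are left unpadded here, on the reasoning that they signal the participant leaving the tracking range rather than blinking.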
  • the system may determine whether a sweep’s evidence-of-visibility score exceeds or falls below a certain threshold, or falls within or outside a certain range.
  • the exact evidence threshold or range may vary both between and within method applications as a function of the parameters of the active stimulus in that sweep or to facilitate different measurement needs.
  • the trial may be terminated. All evidence-of-visibility scores and stimulus changes that occurred within each sweep may be recorded to inform the CSF Analysis Step (150) and/or to skip some number of stimuli in future presentations of the same sweep. A new trial may then begin with new sweeps and/or repeated presentations of previous sweeps, possibly following an intermission phase where other stimuli are presented to provide a break and/or reward for the participant. If the method infers that sufficient data has been collected (e.g. a predetermined number of trials has been completed for each sweep), the Presentation/Response Step (140) may end.
  • a sweep’s evidence-of-visibility score may also be used to continuously calibrate the eye tracker in real time as the current stimulus of that sweep and/or the participant’s gaze move to different areas of the display.
  • the eye tracker hardware used in a given embodiment of the invention may not be able to accurately calibrate a participant’s gaze to the screen used in that embodiment, in which case the one-off calibration performed in the Gaze Pre-Calibration Step (120) may not be suitable for the entirety of the display.
  • one or more images or animations may, for example, appear as a visual reward for a predetermined amount of time (e.g. two seconds) at the end of a trial or when a certain duration or quality of gaze response is detected.
  • a video may be played on the computer screen between trials to provide the participant with temporary relief from the task.
  • Another component of the present invention includes an algorithm to analyze a data set of CSFs to determine the subset of one or more measured sweeps (the“basis sweeps”) whose thresholds are, statistically, the most highly predictive of the CSF computed from the full set of sweeps in the same population.
  • the predictive value of a subset of sweeps may be computed using a multivariate multiple regression with the full set of sweeps as the dependent variables.
  • This algorithm may be used to reduce the number of sweeps required to robustly estimate the CSF in future Curveball sessions within that population.
  • the single sweep whose threshold is most predictive of the entire CSF of a participant from that population is labeled as the“Concuity” sweep of that population.
  • the empirical identification of the Concuity sweep is a novel feature of the present invention.
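The basis-sweep selection could be sketched as below: for each candidate sweep, regress the full threshold matrix on that sweep alone and keep the best predictor. This is an illustrative single-predictor variant of the regression analysis described above, run on synthetic data; the actual analysis may consider subsets larger than one sweep.

```python
import numpy as np

def most_predictive_sweep(thresholds: np.ndarray) -> int:
    """Given a (participants x sweeps) matrix of thresholds, return the
    index of the single sweep whose values best predict the full set of
    sweeps (the candidate "Concuity" sweep for that population)."""
    n, k = thresholds.shape
    best, best_r2 = -1, -np.inf
    for j in range(k):
        # regress every sweep on candidate sweep j (plus an intercept)
        X = np.column_stack([np.ones(n), thresholds[:, j]])
        coef, *_ = np.linalg.lstsq(X, thresholds, rcond=None)
        resid = thresholds - X @ coef
        ss_res = (resid ** 2).sum()
        ss_tot = ((thresholds - thresholds.mean(axis=0)) ** 2).sum()
        r2 = 1.0 - ss_res / ss_tot
        if r2 > best_r2:
            best, best_r2 = j, r2
    return best
```

With data in which one sweep carries the common factor noise-free, that sweep is selected.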
  • the evidence-of-visibility scores may be used to calculate visual functions relevant to the dimensions of any stimulus parameterization, by finding the threshold between stimulus visibility and stimulus invisibility along the dimensions that vary across the sweep sequence. Examples of such dimensions include, without limitation: stimulus orientation; stimulus color contrast; the speed and/or direction of stimulus motion;
  • Another aspect of the present invention is the physical hardware on which the above procedure is installed and run as software.
  • Various configurations of displays, computers, and eye tracker components may be used to practice the present invention.
  • the eye tracker used to measure the participant’s gaze response may be a head-mounted eye tracker, such as the Tobii Pro Glasses, a display-mounted eye tracker, such as the Tobii 4C, or an eye tracker integrated into the display device itself, such as the Oculus Rift or Oculus Quest virtual-reality headset.
  • Figure 2D depicts an exemplary methodology for manipulating stimulus contrast based on stimulus tracking.
  • Figure 13 depicts different-day repeatability of Curveball mean thresholds.
  • the layout is identical to Figure 4, but the lines now represent mean sensitivity in the first (blue) and second (green) testing sessions.
  • the faded subplots represent two participants whose pursuit scores were below the exclusion threshold in the second session’s standard Curveball run.
  • Figure 14 depicts Curveball thresholds vs. static 4AFC thresholds.
  • the layout and axes are identical to Figure 12.
  • the dotted and solid blue lines represent the raw and transformed Curveball CSFs, respectively, and the black lines represent the CSFs estimated from the static gratings in the 4AFC staircase task.
  • the faded subplot in the third row denotes one excluded participant who achieved an impossible threshold at 2 CPD in the staircase task. This excluded data was not used to optimize the free transformation parameters for the other participants.
  • Error bars in the 4AFC data here and in Figure 8 represent ±1 SEM, which was computed using the last four between-reversal runs of each contrast staircase.
  • Figure 15 depicts Curveball thresholds vs.
  • Figure 16 depicts Curveball thresholds measured with and without visual correction.
  • the layout and axes are identical to Figure 12.
  • the solid blue and dashed red lines represent Curveball CSFs obtained with and without corrective eyewear, respectively.
  • the shear in the uncorrected CSF curve relative to the corrected curve is given in the bottom-left corner of each subplot.
  • the standard Curveball thresholds for participants who did not complete the uncorrected condition have been left grayed out in this figure to permit easy comparison across figures. No other participants were excluded.
  • a system in accordance with the present invention may include a computer device having a computer processor (CPU) and a non-transitory computer readable storage medium, a display, and an eye-tracking device.
  • the computer device also preferably has a graphics-processing unit (GPU).
  • An example computer device is the 27″ widescreen LCD Lenovo Horizon 2 “all-in-one” computer.
  • the memory of the computer device may store software to operate the computer and run the algorithms and other software used during each evaluation.
  • the computer may also be used to process the data generated during each evaluation.
  • the gamma function and the minimum and maximum luminance of the display screen are determined.
  • Screen luminance of the display may, for example, be calibrated with the sRGB profile (gamma of approximately 2.2). Screen luminance may, for example, be measured with an ILT1700 radiometer and may range linearly, for example, from 0.1 (black) to 211.1 (white) cd/m² with the room lights off (the “dark” condition) and, for example, 10.0 to 221.1 cd/m² with the lights on (all other conditions).
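Under a simple power-law display model (an approximation of the sRGB transfer function, using the "lights on" luminance range quoted above), pixel values can be mapped to and from physical luminance like this; the function names are illustrative, not from the patent.

```python
def pixel_to_luminance(v: float, l_min=10.0, l_max=221.1, gamma=2.2) -> float:
    """Map a normalized pixel value v in [0, 1] to luminance in cd/m^2
    under a power-law (gamma) display model."""
    return l_min + (l_max - l_min) * (v ** gamma)

def luminance_to_pixel(lum: float, l_min=10.0, l_max=221.1, gamma=2.2) -> float:
    """Inverse mapping: the normalized pixel value needed to produce a
    target luminance on the calibrated display."""
    return ((lum - l_min) / (l_max - l_min)) ** (1.0 / gamma)
```

This inverse mapping is what lets the software present stimuli at a specified physical contrast rather than a raw pixel contrast.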
  • the display may be mounted on a wheeled stand with an articulated arm and equipped with a USB display-mounted eye-tracking device, such as the Tobii 4C eye tracker.
  • the eye-tracking device may be capable of detecting the gaze position of one or of both eyes simultaneously.
  • the Tobii 4C has an operating distance of 50 to 95 cm and samples mean gaze position at 90 Hz by combining data from both eyes.
  • This calibration phase may calibrate for any small offset in gaze position, and may be used to ensure that the participant is looking at the display before launching the main task. After a predetermined period of time, such as 0.5 seconds of calibration, the disc may fade out and the trial phase may begin.
  • one or more stimulus images may appear at a random location on the screen.
  • the stimulus image may then move around the display.
  • the stimulus image may continuously veer clockwise or counter-clockwise in a sequence of smooth random turns.
  • the stimulus image paths may be procedurally generated by an algorithm.
  • the stimuli may move within an invisible grid, may avoid collisions with other stimuli by not moving to grid cells that are currently occupied, and may avoid repeating the same type of movement twice in a row and/or making the same type of concurrent movement as other active stimuli.
  • the initial positions of the stimuli may be predetermined or random, with or without additional restrictions (e.g. preventing multiple stimuli from appearing at the same location).
  • Temporal aliasing at high spatial frequencies may be prevented by applying an additional anisotropic filter to the amplitude spectrum of the noise.
  • This filter may remove all components with horizontal spatial frequency greater than 2.85 CPD, which is 95% of the Nyquist limit (3 CPD) of a stimulus moving at 10° per second on a display with a refresh rate of 60 Hz.
  • Different anisotropic filters may be applied at different stimulus speeds as the Nyquist limit changes.
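The cutoff arithmetic above can be reproduced directly: a stimulus moving at 10° per second on a 60 Hz display advances 1/6° per frame, giving a temporal Nyquist limit of 3 CPD for horizontal spatial frequency, and 95% of that limit is 2.85 CPD. A small helper makes the dependence on speed and refresh rate explicit (the function name and 95% safety default are illustrative).

```python
def aliasing_cutoff(speed_deg_per_s: float, refresh_hz: float,
                    safety: float = 0.95) -> float:
    """Highest horizontal spatial frequency (cycles per degree) that a
    drifting stimulus can contain without temporal aliasing, with a
    safety margin below the Nyquist limit."""
    step = speed_deg_per_s / refresh_hz  # degrees traveled per frame
    nyquist = 1.0 / (2.0 * step)         # CPD: one cycle per two steps
    return safety * nyquist
```

Recomputing this cutoff per speed is what allows a different anisotropic filter to be applied as the Nyquist limit changes.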
  • the orientation of the noise patch may be continuously steered into its direction of motion to keep the anti-aliased direction of this filter “facing forward” at all times.
  • the noise target may sharply rebound whenever it collides with the edge of the screen and may be simultaneously rotated by 180° to continue “facing forward.” Rapid variation in stimulus position and rotation may also help ensure that it is presented at all orientations in all regions of the screen within a single trial.
  • the stimulus image size (e.g., 12°) may be chosen to make it large enough to display the lowest spatial frequency in the procedure (e.g., 0.25 CPD) whilst being small enough that its rotation does not interfere with the pursuit detection algorithm if a participant happens to fixate away from its center (where target rotations produce transient higher gaze velocities). Its size may be fixed across all spatial frequencies to avoid changing the difficulty of tracking.
  • a screenshot with the target at high contrast is depicted in Figure 2B.
  • one or more noise targets may be displayed.
  • the noise target may be generated at the start of each trial.
  • one or more noise targets may be generated and stored in memory in advance of the evaluation, and the software may retrieve the one or more noise targets from memory at the start of each trial.
  • a circular noise stimulus may move smoothly on a haphazard path (B) across a computer screen (C). Isolated circular patches of similarly band-filtered noise may be displayed as the stimulus.
  • These “curveball” stimuli (each with a single fixed spatial frequency) may be presented one at a time. The stimuli may drift randomly across the display in all directions. The stimuli may drift at constant speed or variable speed. The stimuli may rebound whenever they hit the edge. Eye movements that match the speed and direction of each stimulus with high adherence may provide evidence of stimulus visibility. Since such smooth eye movements are difficult to produce without the stimulus being present on the retina, eye movements that do not follow the stimulus in the course of testing (D) may provide evidence that the stimulus is not visible.
  • the starting RMS contrast of the noise may be, for example, 0.317; this contrast may be above the maximum contrast (approximately 0.22) that can be displayed on a particular monitor without clipping, but it may be chosen for maximum initial visibility. Every frame of ongoing pursuit may cause its RMS contrast to be multiplied by a predetermined amount, such as 0.97. If a participant stops pursuing the target for a predetermined number of consecutive frames (e.g., one frame, five frames, or 10 frames), the contrast reduction may be halted. The algorithm may then wait for a predetermined number of consecutive frames of pursuit (e.g., one frame, five frames, or 10 frames) before resuming the trial. Contrast may increase and decrease during a trial.
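As a rough worked example of this decay schedule (assuming uninterrupted pursuit on a 60 Hz display; the 0.01 end contrast is an arbitrary illustration, not a threshold from the patent):

```python
def frames_to_contrast(start=0.317, factor=0.97, target=0.01):
    """Count pursued frames until RMS contrast decays from `start` to
    at most `target`, with each tracked frame multiplying contrast by
    `factor` (0.97 in the example above)."""
    frames, c = 0, start
    while c > target:
        c *= factor
        frames += 1
    return frames, c
```

At 60 Hz this comes to 114 frames, i.e. just under two seconds of uninterrupted pursuit; real trials last longer because pursuit pauses halt the decay.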
  • contrast may never increase during a trial. Participants may instinctively follow the target’s motion on each trial until it fades beyond their threshold, which typically takes up to ten seconds, depending on a participant’s sensitivity to a particular spatial frequency and the consistency of their smooth pursuits.
  • If Curveball is a valid measure of contrast sensitivity, the CSFs formed from its thresholds at different spatial frequencies should correspond closely to the CSFs assessed using conventional report-driven psychophysics. This relationship was tested by comparing CSFs estimated using Curveball with CSFs obtained from the traditional 4AFC staircase task completed in the same session. Separate analyses were conducted for the static and moving gratings in the 4AFC task. One participant was excluded from the comparison with the static 4AFC thresholds due to a sensitivity outlier at 2 CPD, which was likely produced by a run of false positives from correct sub-threshold guesses.
  • the raw (dotted blue) and transformed (solid blue) Curveball thresholds are plotted together with the static 4AFC thresholds (black) in Figure 14.
  • Curveball contrast sensitivity estimates are distorted in a predictable way as the user moves closer to the screen, and the algorithm’s ability to detect smooth tracking appears to degrade only gradually as distance from the eye tracker varies between the optimal and maximum distance allowed by the hardware. This suggests that the participant’s distance can be continuously monitored using the eye tracker and used to compute the true spatial frequencies being measured in each trial when estimating the CSF.
  • the display-mounted eye tracker used here required only half a second of one-point calibration at the start of the task for our smooth pursuit detection algorithm to perform well.
  • Curveball could potentially detect these false negatives and respond by adapting the number of repeats needed for that spatial frequency in real time. For example, participants who exhibit a sufficiently low difference between the first two repeats of a given threshold, in addition to a sufficiently high pursuit score, could skip the third and fourth repeats at that spatial frequency.
  • the pursuit detector may further be configured to produce real-time, frame-by-frame inferences about stimulus visibility based on the similarity between gaze and stimulus trajectories, to determine a trajectory-match score for each stimulus on every frame by, for example, (1) identifying and discarding samples of gaze position that are not consistent with the known limitations of the human eye and/or human visual system; (2) computing a stimulus trajectory function from each variable-contrast stimulus position signal on each frame as that stimulus moves from the first location to the second location; (3) constructing an expected gaze trajectory function for each stimulus trajectory function based on the most recent value of the gaze position signal on each frame; (4) computing an actual gaze trajectory function on each frame from the gaze position signal over the same time window as the stimulus trajectory function; and (5) calculating a trajectory-match score for each variable-contrast stimulus based on the quantitative spatiotemporal agreement between that stimulus’s expected gaze trajectory function and the participant’s actual gaze trajectory function on each frame.
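One simple way to realize steps (3) through (5) is to shift the stimulus trajectory so it starts at the gaze's starting point (the expected gaze trajectory) and score the mean deviation of the actual gaze from it. The exponential scoring rule and its 2° scale constant below are assumptions for illustration, not the patent's scoring function.

```python
import numpy as np

def trajectory_match(stim_xy: np.ndarray, gaze_xy: np.ndarray) -> float:
    """Score in [0, 1] for how well a gaze trajectory follows a stimulus
    trajectory over the same time window (positions in degrees).

    The expected gaze trajectory is the stimulus path shifted to start at
    the gaze's starting point, so a constant fixation offset is tolerated;
    the score decays with the mean distance between expected and actual
    gaze positions."""
    stim_xy = np.asarray(stim_xy, float)
    gaze_xy = np.asarray(gaze_xy, float)
    expected = stim_xy - stim_xy[0] + gaze_xy[0]             # step (3)
    err = np.linalg.norm(expected - gaze_xy, axis=1).mean()  # steps (4)-(5)
    scale_deg = 2.0  # assumed tolerance constant, not from the patent
    return float(np.exp(-err / scale_deg))
```

Evaluating this once per display frame over a sliding window would yield the sixty scores per stimulus per second mentioned below.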
  • Sixty trajectory-match scores may be produced per stimulus per second.
  • variable-contrast stimulus may increase in contrast or decrease in contrast.
  • the stimulus contrast change may be perceptually continuous.
  • the variable-contrast stimulus may change in a step-wise manner by multiplying the current contrast by a variable between 0.5 and 1.5 on each frame.
  • For example, if six Curveball stimuli are depicted as moving smoothly in a circle in the center of the display, the four stimuli closest to the stimulus that the observer begins to track (i.e. all but the stimulus directly opposite the tracked stimulus) may temporarily disappear. The stimuli may reappear when the observer stops tracking. The ongoing presence of the opposite stimulus ensures that the observer is always provided with a new stimulus to track upon losing the first; the observer may return to a stimulus later if they have not yet pursued it to threshold.
  • tracked stimuli may change in both contrast and spatial frequency simultaneously after each discrete burst of tracking.
  • the progression of each stimulus may follow a sequence of combinations of spatial frequency and contrast (a “sweep”) through the 2D CSF space, rather than varying only contrast (i.e. a vertical vector) or only spatial frequency (i.e. a horizontal vector).
  • the variation of both contrast and spatial frequency may ensure that the stimulus continually refreshes its appearance, which counteracts the tiresome nature of extended tracking.
  • spatial vision can be specified as a 2D stimulus visibility space, with spatial frequency varying on the X-axis (sinusoidally-undulating line), and contrast varying on the Y-axis (vertical rectangle with changing contrast relative to the white background).
  • Acuity may be defined as the highest spatial frequency that is visible at maximal stimulus contrast (e.g. open arrow).
  • a contrast sensitivity function (CSF) shown as a dotted line, may be estimated from the measurement of minimal contrast visibility over a range of spatial frequencies (vertical arrows and open circles). The CSF intersects with the X-axis (closed arrow) at acuity.
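The arc-shaped CSF and its intersection with the x-axis can be illustrated with a log-parabola model, a common parameterization in the psychophysics literature rather than one specified by the patent; the peak gain, peak frequency, and bandwidth values below are arbitrary examples.

```python
import numpy as np

def log_parabola_csf(sf, peak_gain=100.0, peak_sf=3.0, bandwidth=1.0):
    """Contrast sensitivity as a parabola in log spatial frequency: a
    smooth model of the arc-shaped CSF (sf in cycles per degree)."""
    exponent = -(np.log10(sf) - np.log10(peak_sf)) ** 2 / (0.5 * bandwidth ** 2)
    return peak_gain * 10.0 ** exponent

def acuity(csf, hi=60.0, lo=0.5, tol=1e-6):
    """Highest spatial frequency at which sensitivity reaches 1, i.e.
    where the CSF meets the x-axis at maximal contrast (bisection)."""
    while hi - lo > tol:
        mid = 0.5 * (hi + lo)
        lo, hi = (mid, hi) if csf(mid) >= 1.0 else (lo, mid)
    return 0.5 * (hi + lo)
```

With these example parameters the modeled CSF peaks at 3 CPD and crosses sensitivity 1 at 30 CPD, matching the definition of acuity as the closed-arrow intersection point.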
  • the Curveball approach to measuring a CSF proceeds through multiple stimulus variations much faster than approaches that use discrete trials, and spends less time (trials) near the limit of ability, which reduces the fatigue of testing.
  • the optimal position of the sweep’s origin (gray-filled circle on the X-axis), as well as the angle of the vector that defines the optimal proportion of spatial frequency variation to contrast variation, may be estimated and refined empirically through the statistical analysis of a large database of CSFs.
  • the “sweep” approach to measuring the Concuity point on the CSF can be generalized to the measurement of the rest of the CSF.
  • the simultaneous manipulation of contrast and spatial frequency (dotted lines with arrows) is an efficient way to generate the points that define a CSF (open circles) because (a) the stimulus at the origin is common to all vectors, and is optimally visible to the widest range of viewers, and (b) the simultaneous manipulation of contrast (a continuous variable) and spatial frequency (a discrete variable) makes the stimulus more interesting and more easily tracked compared to stimuli that vary in contrast alone.
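A sweep vector of this kind might be generated as below, in log-log CSF space where the vector's angle sets the proportion of spatial-frequency change to contrast change (0° is a pure spatial-frequency sweep, 90° a pure contrast sweep). The origin values, angle, and 1.5-decade span are illustrative assumptions.

```python
import numpy as np

def sweep_vector(origin_sf, origin_contrast, angle_deg, steps):
    """Generate a 'sweep': an ordered sequence of (spatial frequency,
    contrast) stimulus conditions along a straight vector in log-log CSF
    space, starting from a shared, highly visible origin."""
    theta = np.radians(angle_deg)
    t = np.linspace(0.0, 1.0, steps)
    span = 1.5  # decades of travel along the vector (an assumed value)
    log_sf = np.log10(origin_sf) + t * np.cos(theta) * span
    log_sens = np.log10(1.0 / origin_contrast) + t * np.sin(theta) * span
    # sensitivity rises along the sweep, so contrast (its inverse) falls
    return np.column_stack([10.0 ** log_sf, 1.0 / 10.0 ** log_sens])
```

A fan of such vectors at different angles, all sharing the same origin, traces out candidate crossing points with the CSF.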
  • sweep sequences other than vectors may be useful. For example, sweep sequences that“skate” along an existing estimate of a participant’s CSF could be used to make detailed refinements to that CSF.
  • the “sweep vector” approach to measuring CSFs also enables a way to predict the visibility of stimuli on adjacent vectors, which can be used to improve the efficiency of the task.
  • the visibility of stimuli on sweep vectors adjacent to (i.e. a small angular difference away from) the sweep vector currently being measured can be predicted with high confidence, because (a) human CSFs generally approximate a continuous arc regardless of disease and age, (b) the sweep vectors used to measure a CSF emanate from the same origin, and (c) each vector intersects the CSF at an approximately perpendicular angle.
  • the initial stimulus conditions do not have to be those at the origin (central dot), but instead can begin at the limit of inferred indirect progress already made within that sweep (final dots on the short vectors). This is because there is a high probability that the participant can easily see the stimulus, even though the specific stimulus conditions were never tested.
  • This process of generalizing the visibility of stimuli from one sweep vector to another reduces the time required to measure a CSF because highly visible stimuli do not need to be repeatedly assessed.
  • the exact degree to which progress along one vector can be inferred from progress on another may be empirically determined by sampling the natural range of CSFs within specific populations.
  • a system in accordance with the present invention may include a display, an eye-tracking device configured to detect the gaze position of one or both eyes of the person, a non-transitory memory having a machine-readable medium comprising machine executable code, and one or more processors coupled to the memory, said one or more processors configured to execute the machine executable code.
  • Calculating the evidence-of-visibility score may produce real-time, frame-by-frame inferences about stimulus visibility based on the relationship between gaze and stimulus trajectories, to determine an evidence-of-visibility score for each stimulus on every frame.
  • the method of calculating the evidence-of-visibility score may include (1) identifying and discarding samples of gaze position that are not consistent with the known limitations of the human eye and/or human visual system; (2) identifying and discarding samples of gaze position that are malformed by blinks, failure to attend to the display, and/or invalid person position relative to the display; (3) identifying fixation events by analyzing the 2D dispersion metric of gaze position and comparing gaze position to the positions of all presented stimuli; (4) identifying saccade events by detecting high-velocity, high-acceleration, near-linear eye movements and comparing the endpoint of the saccade to the positions of all presented stimuli; (5) identifying smooth pursuit events by detecting mid-velocity, low-acceleration eye movements; and/or (6) identifying
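Steps (3) through (5) could be sketched with a crude dispersion- and velocity-based classifier over a short gaze window; all thresholds here are assumed illustrative values (degrees and seconds), not those of the patented method.

```python
import numpy as np

def classify_gaze_window(gaze_xy, dt, fix_dispersion=1.0,
                         saccade_v=30.0, saccade_a=1000.0):
    """Label a window of gaze samples as 'fixation' (small 2D dispersion),
    'saccade' (high velocity and acceleration), or 'pursuit' (everything
    else, i.e. sustained mid-velocity, low-acceleration motion)."""
    g = np.asarray(gaze_xy, float)
    # 2D dispersion: sum of the x and y ranges covered by the window
    dispersion = (g.max(axis=0) - g.min(axis=0)).sum()
    v = np.linalg.norm(np.diff(g, axis=0), axis=1) / dt  # deg/s per sample
    a = np.abs(np.diff(v)) / dt                          # deg/s^2
    if dispersion < fix_dispersion:
        return "fixation"
    if v.max() > saccade_v and a.max() > saccade_a:
        return "saccade"
    return "pursuit"
```

In the full method, windows classified as pursuit would then be compared against stimulus trajectories to accumulate evidence of visibility.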

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Molecular Biology (AREA)
  • Biophysics (AREA)
  • Ophthalmology & Optometry (AREA)
  • Biomedical Technology (AREA)
  • Veterinary Medicine (AREA)
  • Medical Informatics (AREA)
  • Physics & Mathematics (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • General Health & Medical Sciences (AREA)
  • Public Health (AREA)
  • Human Computer Interaction (AREA)
  • Eye Examination Apparatus (AREA)

Abstract

The present invention relates to a method for measuring gaze-tracking behavior. The method comprises displaying one or more variable-spatial-frequency, variable-contrast stimuli, each moving from a first location to a second location. The method also comprises generating ordered sequences of stimuli ("sweeps") that are incremented upon detected tracking behavior. The method also comprises generating, by an eye-tracking monitoring device, a gaze position signal as the visual stimulus moves from the first position to the second position, the gaze position signal detecting a position of one or both eyes. The method also comprises calculating a trajectory-match score for each stimulus on each frame by comparing the position of that stimulus to the gaze position signal during a time window. The method also comprises identifying the subject's visual function in response to the trajectory-match scores.
EP19877442.4A 2018-10-23 2019-10-23 Systèmes et procédés d'évaluation de sensibilité de contraste et d'autres mesures visuelles Pending EP3871145A4 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US201862749360P 2018-10-23 2018-10-23
PCT/US2019/057638 WO2020086703A1 (fr) 2018-10-23 2019-10-23 Systèmes et procédés d'évaluation de sensibilité de contraste et d'autres mesures visuelles

Publications (2)

Publication Number Publication Date
EP3871145A1 true EP3871145A1 (fr) 2021-09-01
EP3871145A4 EP3871145A4 (fr) 2022-07-20

Family

ID=70281106

Family Applications (1)

Application Number Title Priority Date Filing Date
EP19877442.4A Pending EP3871145A4 (fr) 2018-10-23 2019-10-23 Systèmes et procédés d'évaluation de sensibilité de contraste et d'autres mesures visuelles

Country Status (4)

Country Link
US (2) US11583178B2 (fr)
EP (1) EP3871145A4 (fr)
IL (1) IL282419B1 (fr)
WO (1) WO2020086703A1 (fr)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2021131443A (ja) * 2020-02-19 2021-09-09 Canon Inc. Electronic device
SE2050893A1 (en) * 2020-07-14 2022-01-15 Heads Stockholm Ab Eye movement evaluation
JP2023540534A (ja) * 2020-09-04 2023-09-25 Northeastern University Visual function evaluation method
EP4056101A1 (fr) * 2021-03-12 2022-09-14 Carl Zeiss Vision International GmbH Procédé et dispositif pour déterminer une performance visuelle
US11998335B2 (en) 2021-04-19 2024-06-04 Microsoft Technology Licensing, Llc Systems and methods of capturing eye-gaze data
US11619993B2 (en) 2021-04-19 2023-04-04 Microsoft Technology Licensing, Llc Systems and methods for gaze-tracking
CN114022642B (zh) * 2021-10-08 2022-07-19 北京津发科技股份有限公司 Spatiotemporal behavior trajectory collection and generation method, apparatus, device, system, and storage medium
WO2023154535A1 (fr) * 2022-02-11 2023-08-17 Northeastern University Test de dépistage de vision adaptatif auto-administré à l'aide d'une indication angulaire

Family Cites Families (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
HUP0301899A2 (hu) * 2003-06-20 2005-05-30 János Fehér Method and apparatus for examining the visual functions of the eye
US20090153796A1 (en) * 2005-09-02 2009-06-18 Arthur Rabner Multi-functional optometric-ophthalmic system for testing diagnosing, or treating, vision or eyes of a subject, and methodologies thereof
US9788714B2 (en) * 2014-07-08 2017-10-17 Iarmourholdings, Inc. Systems and methods using virtual reality or augmented reality environments for the measurement and/or improvement of human vestibulo-ocular performance
US9463132B2 (en) * 2013-03-15 2016-10-11 John Castle Simmons Vision-based diagnosis and treatment
CN107530034A (zh) * 2015-03-16 2018-01-02 Magic Leap, Inc. Augmented reality pulse oximetry
WO2018006013A1 (fr) * 2016-06-30 2018-01-04 Cornell University Optokinesys

Also Published As

Publication number Publication date
IL282419A (en) 2021-06-30
US20200121179A1 (en) 2020-04-23
EP3871145A4 (fr) 2022-07-20
US20230414092A1 (en) 2023-12-28
IL282419B1 (en) 2024-04-01
US11583178B2 (en) 2023-02-21
WO2020086703A1 (fr) 2020-04-30

Similar Documents

Publication Publication Date Title
US11583178B2 (en) Systems and methods for evaluating contrast sensitivity and other visual metrics
Santini et al. PuRe: Robust pupil detection for real-time pervasive eye tracking
Arabadzhiyska et al. Saccade landing position prediction for gaze-contingent rendering
US20200069179A1 (en) Head and eye tracking
US9775512B1 (en) Binocular eye tracking from video frame sequences
Blignaut et al. Eye-tracking data quality as affected by ethnicity and experimental design
Ko et al. Microsaccades precisely relocate gaze in a high visual acuity task
US10281980B2 (en) System and method for eye-reactive display
WO2017216118A1 (fr) Eye-tracking method and system for performing a calibration procedure to calibrate an eye-tracking device
Mooney et al. Curveball: A tool for rapid measurement of contrast sensitivity based on smooth eye movements
Mantiuk et al. Gaze‐driven object tracking for real time rendering
US11529049B2 (en) Method and device for determining a contrast sensitivity threshold
Duinkharjav et al. Image features influence reaction time: A learned probabilistic perceptual model for saccade latency
CN111067474B (zh) Apparatus and method for objective visual acuity measurement using a dynamic velocity threshold filter
Mooney et al. Gradiate: a radial sweep approach to measuring detailed contrast sensitivity functions from eye movements
EP3900638A1 (fr) Device for diagnosing a brain disease
David et al. What are the visuo-motor tendencies of omnidirectional scene free-viewing in virtual reality?
US20240164636A1 (en) System for assessing target visibility and trackability from eye movements
CN108495584B (zh) Apparatus and method for determining eye movement via a haptic interface
US10503252B2 (en) System and method for eye-reactive display
Akhavein et al. Gaze behavior during 3-D face identification is depth cue invariant
EP4262520B1 (fr) Method and device for determining visual performance
Ivanov et al. Improving gaze accuracy and predicting fixation onset with video based eye trackers
WO2023044520A1 (fr) Procédés et systèmes de détection de perte de champ visuel central
Sundstedt Eye Tracking

Legal Events

Date Code Title Description
STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: THE INTERNATIONAL PUBLICATION HAS BEEN MADE

PUAI Public reference made under article 153(3) epc to a published international application that has entered the european phase

Free format text: ORIGINAL CODE: 0009012

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: REQUEST FOR EXAMINATION WAS MADE

17P Request for examination filed

Effective date: 20210521

AK Designated contracting states

Kind code of ref document: A1

Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR

DAV Request for validation of the european patent (deleted)
DAX Request for extension of the european patent (deleted)
REG Reference to a national code

Ref country code: DE

Ref legal event code: R079

Free format text: PREVIOUS MAIN CLASS: G06K0009000000

Ipc: A61B0003020000

A4 Supplementary search report drawn up and despatched

Effective date: 20220615

RIC1 Information provided on ipc code assigned before grant

Ipc: A61B 3/02 20060101AFI20220611BHEP

STAA Information on the status of an ep patent application or granted ep patent

Free format text: STATUS: EXAMINATION IS IN PROGRESS

17Q First examination report despatched

Effective date: 20240304