WO2001074236A1 - Test diagnostic pour les troubles deficitaires de l'attention - Google Patents

Test diagnostic pour les troubles deficitaires de l'attention

Info

Publication number
WO2001074236A1
WO2001074236A1 PCT/CA2001/000406 CA0100406W WO0174236A1
Authority
WO
WIPO (PCT)
Prior art keywords
person
ball
moving object
motor task
ability
Prior art date
Application number
PCT/CA2001/000406
Other languages
English (en)
Inventor
Joan N. Vickers
Sergio Tosi Rodrigues
Original Assignee
University Technologies International Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from CA002303621A external-priority patent/CA2303621A1/fr
Application filed by University Technologies International Inc. filed Critical University Technologies International Inc.
Priority to AU44006/01A priority Critical patent/AU4400601A/en
Publication of WO2001074236A1 publication Critical patent/WO2001074236A1/fr

Links

Classifications

    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/103Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1124Determining motor skills
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B3/00Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B3/10Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B3/113Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions for determining or recording eye movement
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/163Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • AHUMAN NECESSITIES
    • A61MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61BDIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00Measuring for diagnostic purposes; Identification of persons
    • A61B5/16Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
    • A61B5/168Evaluating attention deficit, hyperactivity
    • GPHYSICS
    • G16INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
    • G16HHEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
    • G16H50/00ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
    • G16H50/20ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics for computer-aided diagnosis, e.g. based on medical expert systems
    • AHUMAN NECESSITIES
    • A63SPORTS; GAMES; AMUSEMENTS
    • A63FCARD, BOARD, OR ROULETTE GAMES; INDOOR GAMES USING SMALL MOVING PLAYING BODIES; VIDEO GAMES; GAMES NOT OTHERWISE PROVIDED FOR
    • A63F7/00Indoor games using small moving playing bodies, e.g. balls, discs or blocks
    • A63F7/06Games simulating outdoor ball games, e.g. hockey or football
    • A63F7/0664Electric

Definitions

  • the present invention relates in general to methods and apparatus for the diagnosis of Attention Deficit Hyperactivity Disorder, or ADHD, and more particularly relates to the use of variables such as gaze pursuit tracking or quiet eye, gaze frequency, arm movement velocity and percent accuracy as criteria for the diagnosis of ADHD.
  • ADHD Attention Deficit Hyperactivity Disorder
  • ADHD is usually characterized by a persistent pattern of inattention and/or hyperactivity- impulsivity.
  • ADHD - PI primary attention deficit
  • ADHD - H primary hyperactivity
  • ADHD - C exhibit deficits of both attention and hyperactivity.
  • ADHD affects approximately 4 to 6 percent of all children, with estimates varying from 1.7 to 17 percent worldwide.
  • ADHD may continue into adulthood and is more widespread in males than females during childhood and adolescence by a ratio of 4:1, and in adulthood by a ratio of 2:1.
  • the persistence of ADHD into adulthood has been estimated to be as high as 80%.
  • ADHD is clinically diagnosed using primarily subjective indicia such as observations of a child's behavior, questionnaires and interviews. Recently, however, studies have been done to explore the possible relationship between motor co-ordination and ADHD. It was found that the severity of children's inattentive symptomatology was a significant predictor of motor co-ordination difficulties. Other researchers have found that the relevance of the task to real world situations is also a key factor in effective research on ADHD. Often the tasks used to study ADHD lack much relevance to tasks which are routinely performed in real life settings.
  • the present invention is directed to a diagnostic method and apparatus for diagnosing attention deficit hyperactivity disorder (ADHD) in a person.
  • ADHD attention deficit hyperactivity disorder
  • the present invention discloses an apparatus for diagnosing ADHD in a person which measures the ability of the person to visually track a moving object while performing a motor task relative to the moving object.
  • An example of such a motor task is returning a served ball with a paddle.
  • The performance of the motor task involves two phases: a pre-motor phase and a motor phase.
  • the pre-motor phase is sometimes referred to as the cognitive phase and it is during this phase that the person mentally assesses the motor task environment, in particular, commences visually tracking the moving object.
  • the motor phase commences with the onset of the movement necessary to perform the motor task relative to the moving object.
  • the invention further discloses an apparatus capable of measuring motor control of a person by measuring the ability of a person to physically perform a motor task such as returning a served ball by hitting the ball with a paddle. Further, the invention discloses an apparatus capable of measuring both a person's ability to track a moving object and a person's ability to perform a task specific to the moving object.
  • The present invention also discloses a method for diagnosing ADHD in a person by comparing the person's ability to visually track a moving object and/or the person's ability to perform a motor task specific to a moving object to a known standard, for example, to the performances of persons who are not diagnosed with ADHD.
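By way of illustration only, and not as part of the patent disclosure, one simple way to compare a measured value against such a known standard is a z-score against normative data; the Python sketch below uses a hypothetical normative mean and standard deviation for quiet eye onset.

```python
def z_against_norm(measure, norm_mean, norm_sd):
    """Express a subject's measure as a z-score against a normative (non-ADHD) sample."""
    return (measure - norm_mean) / norm_sd

# Hypothetical normative values for quiet eye onset (% of ball flight).
z = z_against_norm(measure=23.2, norm_mean=14.0, norm_sd=4.0)
atypical = z > 2.0    # e.g. flag onsets far later than the normative mean
```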
  • the present invention discloses an apparatus for diagnosing ADHD in a person comprising a means for visually displaying a moving object to the person and a means for measuring the person's ability to visually track the moving object during the performance of the motor task.
  • Suitable means for displaying a moving object include, but are not limited to, graphically displaying a realistic virtual image generated in a computer to the person in such a fashion that the person perceives a 3D sensation of a moving object, presenting to the person a scene taken by a TV camera in such a fashion that the person perceives a 3D sensation of a moving object, or having a real object moving in real time and in real space.
  • In operating the apparatus for the diagnosis of ADHD, the person is asked to perform a motor task in relation to the moving object. Therefore, in order to perform the task, the person will have to first visually track the object. Suitable tasks would be, but are not limited to, pressing a button, for example, when the object nears the person, moving a limb in response to the object, or hitting the object with a paddle or a glove that records the velocity of the hand.
  • the means for measuring the person's ability to visually track the moving object comprises:
  • the apparatus also includes means for measuring the movement of the person.
  • Such measuring means may comprise, and by way of non-limiting examples, at least one video camera means, which is set up for viewing the person while he or she is engaged in the performance of the particular task, digital imaging means, stop watch means, motor sensing and speed measurement means including and by way of non-limiting example, radar emitting means.
  • a method for diagnosing ADHD in a person comprising:
  • The person's onset of visual tracking of the object, the length of time the person tracks the object, and the frequency with which the person deviates from or stops tracking the object for an interval of time are measured and compared to the tracking ability of a person who has not been diagnosed with ADHD.
  • A person's ability to perform a motor task relative to a moving object is measured as a means for further diagnosing ADHD in a person. For example, when the task which the person must perform is to hit a ball with a paddle, measuring the person's percentage of accurate hits and the velocity of the person's arm at the instant of paddle-ball contact, and comparing same to a non-ADHD person, provides a method for diagnosing ADHD.
  • the person may be instructed to perform the task either prior to displaying the moving object to the person or after displaying the moving object to the person.
  • a method for diagnosing ADHD in a person comprising:
  • Figure 1a is a schematic of the table tennis table test set up used in the embodiment of the invention, showing the subject and location of video cameras with output showing eye motion, field of view of the eye and the position of the subject in the scene.
  • Figure 1b is a schematic of the subject in relation to the set up of the magnetic head tracker.
  • Figure 1c is a schematic of the subject wearing the eye tracker system.
  • Figure 2 is a representation of the visual angle between line of gaze and ball edge, gaze and x-axis of the transmitter coordinate system, and the angle between the arm and x-axis of the transmitter coordinate system.
  • Figure 3 is a schematic of an exemplary video monitor displaying the outputs from the video cameras shown in Figures 1a and 1b.
  • Figure 4 is a plot of gaze pursuit tracking (visual angle to the ball in degrees versus relative time in percent) of a Normal subject.
  • Figure 5 is a plot of gaze pursuit tracking (visual angle to the ball in degrees versus relative time in percent) of an ADHD subject.
  • Figure 6a is a plot of gaze pursuit tracking (visual angle to the ball in degrees versus relative time in percent) of an ADHD subject during hits (solid line) and misses (hatched line) during early cue conditions.
  • Figure 6b is a plot of gaze pursuit tracking (visual angle to the ball in degrees versus relative time in percent) of a Normal subject during hits (solid line) and misses (hatched line) during early cue conditions.
  • Figure 7 shows the arm velocity at contact (AVC) in cm/sec of ADHD On subjects, ADHD Off subjects and Normal subjects during the early and late cue conditions.
  • AVC arm velocity at contact
  • Figure 8 shows the arm velocity at contact (AVC) in cm/sec of ADHD On subjects, ADHD Off subjects and Normal subjects during early and late cue conditions, on hits and misses.
  • Figures 9a, b and c show the duration of tracking on the ball results (on the left) and arm velocity at contact results under early cue conditions (on the right), of three Normal subjects.
  • Figures 10a, b and c show the duration of tracking on the ball results (on the left) and arm velocity at contact results under early cue conditions (on the right), of three ADHD subjects.
  • the system comprises a table tennis test, however similar technology, software and procedures can be used with any task performance test, for example, table hockey.
  • Figure 1a shows the modified table tennis table 15, fitted with right and left cue lights 8, 9 and right and left target areas 28, 29.
  • a table tennis ball 10 is delivered to the subject 3 using a serving device 7 that directs the ball 10 across the table tennis table 15 towards the subject at a moderate speed (flight time of approximately 800 ms).
  • the serving device 7 delivers the ball to a 40 x 60 cm oval 48 chalked on the table 15, denoting the location of the second bounce.
  • Subjects are to hit the ball as accurately and/or as hard as possible.
  • the subject 3 is to return the ball by means of a paddle 24 (the task) to either the right or left target areas 28, 29 depending upon which of the right and left cue lights 8, 9 is illuminated.
  • the right and left target areas 28, 29, respectively, are rectangular and large enough to be easily hit by a returned ball.
  • the right or left cue lights 8, 9 can be lit either 2 seconds prior to the ball 10 being served (early test condition) or after the ball has been served (late test condition).
  • Laser devices 50, 52 are set up 90 cm from the edge of the table 54 (the table edge where the serving device 7 is situated) so that when the served ball 10 passes the laser beam 56, one of the two cue lights 8, 9 will be lit (late test condition). Under late test conditions, the subject only has approximately 600 ms to react to the cue.
  • there are two integrated data collection systems (primary and secondary) which are used to generate the appropriate data.
  • the test subject 3 is equipped with an eye tracker system, a magnetic head tracker, and a motion analysis system (MAS).
  • MAS motion analysis system
  • the Applied Science LaboratoriesTM Model 501 (ASL 501) eye tracker was used but it is understood that any eye/gaze tracking system could be used, for example the ISCANTM system.
  • the preferred magnetic head tracker is the Flock of BirdsTM Model developed by Ascension Technology CorporationTM.
  • the ASL 501 eye tracker system is a monocular corneal reflection system that measures line-of-gaze with respect to a helmet 17 worn by the subject 3.
  • the helmet 17 has a 30-meter cord (not shown) which is attached to the waist of the subject 3 and interfaced to a first computer 1. This allows the subject 3 to have near normal mobility.
  • The helmet 17 is further equipped with an eye video camera 19 which is mounted to the helmet 17 by a support member (not shown).
  • the eye video camera 19 is directed at the eye 27 of the subject 3 so that the eye 27 of the subject 3 is in the field of view of the eye video camera 19.
  • the eye 27 is illuminated by a near infrared light source (not shown) beamed coaxially with the optical axis of the eye video camera 19.
  • the light from the near infrared light source and the resulting back-lighted bright pupil image are reflected from a visor 20 coated to be transparent in the visible spectrum but reflective to the near infrared.
  • Use of the visor 20 allows unobstructed simultaneous recording of an image of the eye 27 while the eye is occupied in viewing a scene.
  • Optics (not shown) for the eye video camera 19 or image processing may be used so that the eye video shows about a 2.5 cm x 2.5 cm square centered on the eye.
  • a second light source 18 is also mounted on the helmet 17 using any of various attachment devices and is supported on the helmet 17 to provide a collimated beam of light that reflects from the visor 20 onto the surface of the cornea and returns corneal reflection into the eye video camera 19.
  • a field-of-view or scene video camera 21 is mounted on a support 22 attached to the helmet 17 which records the external scene viewed by the observer by reflecting external scene light from the back side of the visor 20.
  • the scene video camera 21 is oriented towards the visor 20 so that the field of view of the scene video camera 21 is nearly aligned with the view from the eye 27 being monitored, thus avoiding parallax problems.
  • the scene camera 21 thus appears to see the world from the same position as the subject's eye.
  • the visor 20 is oriented to pick up optimal scene in front of the subject 3. Since the subject 3 is free to move the head, the scene changes with shifts in the subject's head position.
  • Interfaced with the ASL 501 eye tracker system is the magnetic head tracker system 2, which is used to determine the location of the gaze relative to the ball.
  • The magnetic head tracker, a six degree of freedom measuring device in this instance, tracks the three-dimensional position and orientation of the eye tracker relative to a transmitter 13, which consists of three individual antennae arranged concentrically to generate a multiplicity of DC magnetic fields.
  • the transmitter 13 is generally located above and one meter behind the subject 3.
  • a receiver 14 is positioned on top of the helmet 17 of the eye tracker system, said receiver 14 consisting of three axes of antenna that are sensitive to DC magnetic fields.
  • The signal output from the receiver 14 travels to the signal processing electronics 37, which controls, conditions and converts the analog receiver signals into a digital format that can then be read by a computer (in this case, first computer 1).
  • the signal processing electronics 37 is equipped with an algorithm which computes the position and orientation of the receiver 14 with respect to the transmitter 13. This information is then outputted to the host computer (first computer 1).
  • The magnetic head tracker has a static positional accuracy of 0.1" (2.54 mm), a positional resolution of 0.03" (0.76 mm), a static angular accuracy of 0.5 degrees, and an angular resolution of 0.1 degrees.
  • Data generated from the eye tracker system (i.e. the eye video camera 19 output and the scene video camera 21 output) and data generated from the signal processing electronics 37 of the magnetic head tracker are both inputted to the first computer 1 so as to generate the line-of-gaze relative to the environment, rather than the line-of-gaze relative to the helmet 17 (as is the case with the eye tracker system alone).
  • This is referred to as eye-head integration and the software developed for the eye-head integration is an expansion of the eye tracker software as provided by ASL. Eye-head integration data thus generated is updated at 60 Hz, the same frequency as the eye tracker.
  • the eye tracker and the magnetic head tracker calibration procedures are also fully integrated in the eye-head integration software. Calibration is divided into specification of planes and specification of eye position in space.
  • the planes to be specified to the eye-head integration software are the table tennis table surface plane and the calibration plane.
  • the calibration plane is arrived at as follows.
  • the receiver 14 is placed on a wand having a laser pointer at one end and a gimbal attached to the other end.
  • the gimbal allows motion of the wand in three dimensions relative to the center of the transmitter 13, the origin of the magnetic head tracker coordinate system.
  • the software calibrates the receiver signals to the three dimensional space (i.e. where the table tennis test takes place).
  • the specification of eye position in space is calibrated as follows.
  • the receiver 14 is placed on the helmet 17 and the subject 3 is then fitted with the helmet 17.
  • Both the eye video camera 19 and the scene video camera 21 are set up on the helmet 17 and the visor 20 is oriented to the table tennis table 15.
  • the subject is then positioned in front of a calibration plane consisting of nine points (each spaced from the other by 3 cm) that are attached to a height-adjustable wooden support. While holding the head steady and moving only the eyes, the subject then looks at each of the nine points and the subject's eye position is recorded relative to each point.
  • The eye-head integration software outputs the two-dimensional coordinates of the intersection point between the line-of-gaze and the plane of interest (i.e. the surface of the table tennis table) and the three-dimensional position and orientation angles (azimuth, elevation, and roll) of the receiver 14 (i.e. head) with respect to the signal processing electronics 37.
  • the primary data collection system also includes a motion analysis system which preferably comprises four to seven high-speed video cameras.
  • The motion analysis system comprises six high-speed video cameras 31, 32, 33, 34, 35, 36 set to 180 Hz and positioned in the umbrella configuration as shown in Figure 1a.
  • the motion analysis system detects both the flight of the ball 10 and the arm 16 motion.
  • the ball 10 is painted with reflective paint which renders the ball 10 retro-reflective to the six cameras 31, 32, 33, 34, 35, 36.
  • the subject's arm 16 is equipped with at least three reflective markers; one attached to the head of the radius (elbow), the second to the ulnar head (wrist) and the third to the shoulder. Having both the ball 10 and the arm 16 of the subject 3 retro-reflective to the six video cameras 31, 32, 33, 34, 35, 36 allows for digital processing to occur.
  • The second computer 6 recognizes the retro-reflective markers on each camera image and then reconstructs the three-dimensional position of the markers (i.e. ball 10 and arm 16) in space. Hence, the second computer provides three-dimensional position data of arm and ball flight. Synchronization of all components of the motion analysis system is achieved by electrical pulses sent simultaneously to a central control box (not shown).
  • The motion analysis system data processing has distinct phases; its accuracy is described in terms of three-dimensional final measures of a known distance.
  • A wand of one meter (1,000 mm) having reflective markers at each end is recorded, during a one minute interval at 10 Hz, moving in a volume of 1.06 m x 3.46 m x 1.36 m.
  • The eye-head integration software provides data for the horizontal and vertical gaze (table coordinate system) and three-dimensional position and orientation of the head (transmitter coordinate system).
  • the motion analysis system provided the three dimensional positions of the elbow, wrist and ball (calibration cube coordinate system).
  • a software program written in MatlabTM is provided for data smoothing, interpolation, coordinate system transformation and calculation of angles of interest (hereinafter referred to as the "gaze3d.m program").
  • Elbow, wrist and head positions and head orientation data are smoothed with a fourth-order Butterworth filter at 5 Hz.
  • Ball position data is smoothed using the same filter at 30 Hz.
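For illustration, the smoothing step described above could be implemented along the following lines in Python with scipy; this is a sketch, not the patent's gaze3d.m code, and the 180 Hz sampling rate, the zero-phase filtering and the synthetic wrist data are assumptions.

```python
import numpy as np
from scipy.signal import butter, filtfilt

def lowpass(data, cutoff_hz, fs_hz, order=4):
    """Zero-phase low-pass Butterworth filter along the first axis."""
    # scipy expects the cutoff normalised to the Nyquist frequency.
    b, a = butter(order, cutoff_hz / (fs_hz / 2.0), btype="low")
    return filtfilt(b, a, data, axis=0)

fs_mas = 180.0                                                    # assumed sampling rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs_mas)
wrist_xyz = np.column_stack([np.sin(2 * np.pi * 2 * t)] * 3)      # synthetic wrist trajectory
wrist_xyz = wrist_xyz + 0.05 * np.random.randn(*wrist_xyz.shape)  # add measurement noise
wrist_smooth = lowpass(wrist_xyz, cutoff_hz=5.0, fs_hz=fs_mas)    # kinematic data at 5 Hz
# Ball position data would use the same call with cutoff_hz=30.0.
```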
  • An interpolation procedure is used to transform the eye-head integration data from 60 Hz to 180 Hz. This procedure is used to generate the final data points in different intervals to match the motion analysis system data acquisition frequency.
  • Linear interpolation calculates the gaze data points to coincide with an imaginary line between each pair of original data points. This is appropriate for gaze data due to the speed and abrupt changes in eye position.
  • Spline interpolation estimates new data points for head movement according to a smoother curve that fits the original data. This is appropriate for head data due to the relatively slow nature of head movements and their smooth trajectories.
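A minimal Python sketch of the two interpolation schemes just described, resampling 60 Hz eye-head data onto a 180 Hz timeline; the synthetic signals are placeholders and scipy's CubicSpline stands in for whatever spline routine the original Matlab code used.

```python
import numpy as np
from scipy.interpolate import CubicSpline

fs_eye, fs_mas = 60.0, 180.0
t60 = np.arange(0, 1.0, 1.0 / fs_eye)              # eye-head integration timeline (60 Hz)
t180 = np.arange(0, t60[-1], 1.0 / fs_mas)         # motion analysis timeline (180 Hz)

gaze_angle = np.cumsum(np.random.randn(t60.size))  # synthetic gaze signal (abrupt shifts)
head_azimuth = np.sin(2 * np.pi * 0.5 * t60)       # synthetic head signal (slow and smooth)

gaze_180 = np.interp(t180, t60, gaze_angle)        # linear: preserves abrupt eye shifts
head_180 = CubicSpline(t60, head_azimuth)(t180)    # spline: suits smooth head trajectories
```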
  • Figure 2a shows the visual angle between the line of gaze 50 and the ball edge 51.
  • Figure 2b shows the visual angle between the line of gaze 50 and the x-axis 52 of the transmitter coordinate system.
  • Figure 2c shows the angle between the lower arm 53 (segment from elbow to wrist) and the x-axis 52 of the transmitter coordinate system.
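The angles of Figure 2 are angles between three-dimensional vectors; a minimal Python sketch of that computation follows, using the ball centre rather than the ball edge and entirely hypothetical coordinates.

```python
import numpy as np

def angle_deg(u, v):
    """Angle in degrees between two 3-D vectors."""
    cosang = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Entirely hypothetical single-frame coordinates (cm, transmitter coordinate system).
eye = np.array([0.0, 0.0, 160.0])             # eye position
gaze_point = np.array([120.0, 10.0, 80.0])    # intersection of line-of-gaze with the table plane
ball = np.array([118.0, 12.0, 82.0])          # ball position (centre used here, not the edge)
elbow = np.array([30.0, -20.0, 110.0])
wrist = np.array([55.0, -15.0, 100.0])
x_axis = np.array([1.0, 0.0, 0.0])            # x-axis of the transmitter coordinate system

visual_angle = angle_deg(gaze_point - eye, ball - eye)   # gaze vs. ball (cf. Figure 2a)
arm_angle = angle_deg(wrist - elbow, x_axis)             # arm vs. x-axis (cf. Figure 2c)
```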
  • the data generated by the gaze3d.m program was used to determine the following dependent variables: Movement Time (MT) onset, offset, and duration
  • Movement time is defined as the duration of the forward phase of the arm movement until contact with the ball. This final phase of movement execution has been defined as the impulse phase.
  • MT is characterized by a decrease in the angle between the arm (elbow-wrist segment) and the x-axis of the transmitter coordinate system as shown in Figure 2c and/or when the first data point of the forward movement of the arm is greater than zero.
  • The MT onset criterion is the point of greatest angle between the arm and the x-axis of the transmitter coordinate system, and the MT offset is the last data point before the ball exhibits an abrupt change in the direction of its trajectory (i.e. ball-paddle contact).
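Under stated assumptions (180 Hz sampled data and a 60 degree turn threshold for detecting ball-paddle contact, neither of which is specified in the patent), the MT onset and offset criteria above could be applied roughly as follows.

```python
import numpy as np

def movement_time(arm_angle_deg, ball_xyz, fs_hz=180.0, turn_thresh_deg=60.0):
    """Rough MT detection: onset at the frame of greatest arm/x-axis angle,
    offset at the first abrupt turn in the ball's direction of travel (contact)."""
    arm_angle_deg = np.asarray(arm_angle_deg, dtype=float)
    ball_xyz = np.asarray(ball_xyz, dtype=float)
    onset = int(np.argmax(arm_angle_deg))                # greatest arm angle = start of forward swing
    step = np.diff(ball_xyz, axis=0)                     # frame-to-frame ball displacement vectors
    dots = np.einsum("ij,ij->i", step[:-1], step[1:])
    norms = np.linalg.norm(step[:-1], axis=1) * np.linalg.norm(step[1:], axis=1)
    turn = np.degrees(np.arccos(np.clip(dots / np.maximum(norms, 1e-9), -1.0, 1.0)))
    abrupt = np.where(turn > turn_thresh_deg)[0]         # large change in trajectory direction
    offset = int(abrupt[0]) + 1 if abrupt.size else len(arm_angle_deg) - 1
    return onset, offset, (offset - onset) / fs_hz       # onset frame, offset frame, MT in seconds
```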
  • QE Quiet eye
  • Quiet eye is a measure of a person's ability to visually track an object while performing a motor task.
  • quiet eye is a measure of the location, onset, offset and duration of a final fixation or tracking gaze that occurs relative to the onset of movement.
  • Quiet eye onset occurs prior to the onset of movement time (MT) and quiet eye offset occurs naturally as the gaze deviates from a defined location. Quiet eye can therefore extend into MT.
  • The criterion for quiet eye is that the visual angle between line-of-gaze and a defined location (in this case the ball's edge as shown in Figure 2a) should be maintained within three degrees for at least 100 ms. A value of three degrees was used for ball tracking based on prior literature (see, for example, Bahill, A.
  • TMT Tracking during movement time
  • Tracking during movement time is defined as the duration of tracking on the ball during the arm movement phase (MT).
  • The criterion for TMT is the same as for QE except that it is initiated during MT.
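A minimal Python sketch of the quiet eye criterion as stated above, i.e. gaze held within three degrees of the ball for at least 100 ms with onset before MT onset; the sampling rate and array names are assumptions.

```python
import numpy as np

def quiet_eye(gaze_ball_angle_deg, mt_onset, fs_hz=180.0, thresh_deg=3.0, min_dur_ms=100.0):
    """Find the final tracking gaze on the ball: the last run of samples within
    thresh_deg of the ball that starts before MT onset and lasts >= min_dur_ms."""
    on_ball = np.abs(np.asarray(gaze_ball_angle_deg, dtype=float)) <= thresh_deg
    min_len = int(round(min_dur_ms / 1000.0 * fs_hz))
    runs, start = [], None
    for i, flag in enumerate(on_ball):
        if flag and start is None:
            start = i
        elif not flag and start is not None:
            runs.append((start, i))
            start = None
    if start is not None:
        runs.append((start, len(on_ball)))
    valid = [(s, e) for s, e in runs if (e - s) >= min_len and s < mt_onset]
    if not valid:
        return None                                     # no qualifying quiet eye period
    s, e = valid[-1]                                    # the final qualifying tracking gaze
    return s, e, (e - s) / fs_hz * 1000.0               # onset frame, offset frame, duration (ms)
```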
  • EHS Eye-head stabilization
  • Eye-head stabilization is defined as the period of stable alignment of the eye and head before ball-paddle contact.
  • The criterion for EHS is that the visual angle between the line-of-gaze and the x-axis of the transmitter coordinate system (see Figure 2b) should remain stable in the final part of ball flight. Stability is based on a fixation criterion adapted from Helsen, W.F., Elliott, D., Starkes, J.L., and Ricker, K.L. (1998), Temporal and Spatial Coupling of Point of Gaze and Hand Movements in Aiming, Journal of Motor Behavior, 30(3), 249-259.
  • The onset of EHS requires that the aforementioned angle be maintained with a standard deviation of 1 deg or less for at least 50 ms. If this criterion is satisfied then EHS is considered to have started.
  • the offset of EHS occurs when four sequential gaze angle measurements are all farther away than 1.5 deg from the mean value of the angles within that period of stabilization.
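The EHS onset and offset criteria translate directly into code; the sketch below is an assumption-laden illustration (180 Hz data, with the onset window mean standing in for the mean of the stabilization period), not the patent's implementation.

```python
import numpy as np

def eye_head_stabilization(gaze_x_angle_deg, fs_hz=180.0):
    """EHS sketch: onset when the gaze/x-axis angle holds an SD of <= 1 deg over
    at least 50 ms; offset when four consecutive samples all fall more than
    1.5 deg from the mean of the stabilized period."""
    angle = np.asarray(gaze_x_angle_deg, dtype=float)
    win = max(2, int(round(0.050 * fs_hz)))            # 50 ms window in frames
    onset = None
    for i in range(len(angle) - win + 1):
        if np.std(angle[i:i + win]) <= 1.0:
            onset = i
            break
    if onset is None:
        return None                                    # no stabilization found
    stable_mean = np.mean(angle[onset:onset + win])    # mean of the stabilization period
    offset = len(angle)                                # default: stable until the end
    for j in range(onset + win, len(angle) - 3):
        if np.all(np.abs(angle[j:j + 4] - stable_mean) > 1.5):
            offset = j
            break
    return onset, offset
```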
  • Arm velocity at contact is defined as the instantaneous velocity of the wrist or other marker on the arm, hand or paddle at the moment of "contact". "Contact" refers to the moment the subject strikes the ball when executing the return.
  • Arm velocity at contact (AVC, deg/sec) is the three-dimensional angular velocity between the arm (elbow-wrist segment) and the x-axis of the transmitter coordinate system or other frame of reference (see Figure 2c) at the moment of ball-paddle contact.
  • AVC cm/sec
  • Velocity was calculated as the change in position from the data point immediately before contact to the data point immediately after contact, divided by the time elapsed (Winter, D.A. (1990), Biomechanics and Motor Control of Human Movement, 2nd ed. (New York: John Wiley & Sons)).
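The velocity calculation described above, a first central difference around the contact frame after Winter (1990), in a short Python sketch; positions in cm and a 180 Hz sampling rate are assumed, so the result is per-axis velocity in cm/s.

```python
import numpy as np

def velocity_at_contact(position, contact_idx, fs_hz=180.0):
    """Velocity at ball-paddle contact: change in position from the sample before
    contact to the sample after contact, divided by the elapsed time."""
    position = np.asarray(position, dtype=float)
    dt = 2.0 / fs_hz                                   # time between the two bracketing samples
    return (position[contact_idx + 1] - position[contact_idx - 1]) / dt

# If wrist positions are in cm, the speed at contact in cm/s would be:
# speed = np.linalg.norm(velocity_at_contact(wrist_xyz, contact_idx))
```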
  • Accuracy (in percent) is measured by recording the number of hits (i.e. the number of times the subject returns the ball to the designated target area 28, 29) and misses (i.e. the number of times the subject misses the designated target area 28, 29).
  • Accuracy scores are the percentage of accurate returns of the ball to the designated target area.
  • A software program computes the frequency of gaze-ball angle deviations (hereinafter referred to as the "gazeFreq.m program"), a measure in the spatial dimension only and independent of time.
  • the gazeFreq.m program gives a measure of the smoothness of ball tracking or gaze frequency (GF).
  • The threshold value, or sensitivity, can be set to any real positive number. In the present invention, one, two and three degrees have been used. For example, if one degree is used, any change of greater than one degree is considered a signal, and any change less than one degree is regarded as noise and ignored.
  • gaze frequency is defined as follows:
  • Gaze frequency is defined as the frequency of gaze deviation from the ball's edge during the entire flight of the ball, from serve to contact.
  • GF was defined as the frequency of gaze deviation from the ball's edge by more than 3 degrees of visual angle.
  • the degrees visual angle can be set at 1, 2, 3 or more.
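A minimal sketch of the gaze frequency count implied by the gazeFreq.m description: each crossing from within-threshold to beyond-threshold visual angle is counted as one deviation, and sub-threshold changes are treated as noise. The function name and default threshold are illustrative only.

```python
import numpy as np

def gaze_frequency(gaze_ball_angle_deg, thresh_deg=3.0):
    """Count gaze deviations from the ball: each transition from within-threshold
    to beyond-threshold visual angle counts as one deviation; smaller changes
    are treated as noise and ignored."""
    off_ball = np.asarray(gaze_ball_angle_deg, dtype=float) > thresh_deg
    onsets = np.count_nonzero(~off_ball[:-1] & off_ball[1:])   # False -> True transitions
    return int(onsets + off_ball[0])                            # count an initial deviation too
```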
  • The secondary data collection system is described in detail in Canadian Patent Application 2,209,886, which is incorporated herein by reference.
  • This secondary system is referred to as the Vision-in-Action system and includes an ASL 501 eye tracker for video outputs (as described above), an external camera 5 (SONYTM TRN 82), a time code generator and a video mixer, thereby coupling the subject's gaze, motor and ocular behavior in time.
  • the secondary data collection system is used concurrently with the first data collection system in order to ensure the accuracy of the data collection in the present invention.
  • The two video outputs from the ASL 501 eye tracker and the video output from the external camera 5 are integrated into one final video image which provides the gaze in the scene from the subject's point of view, the recorded eye, and the movements of the subject 3, the server 7 and the ball 10.
  • The Vision-in-Action system is used to calibrate the participant, monitor calibration during data collection and provide information for later clinical use.
  • Output from the eye view monitor computer (i.e. first computer 1), namely the scene with gaze cursor and the eye video, and output from the external camera 5 are mixed in a mixer having an output formed of sequential frames in which, in each frame of the mixer output, there is included synchronized video information from each of the cameras.
  • The signals are mixed so that, as shown in Fig. 3, each video output occupies a distinct window on a video monitor 100 (e.g. a SonyTM monitor).
  • Two effects generators, e.g. VideonicsTM Model MX-1, may be cascaded together as described below.
  • The first effects generator receives scene video from the eye view monitor computer at its first video input, and video of the subject from an external camera 5 (see Figure 1a) at its second video input. Genlock signal out from the blackburst output on the first effects generator is fed back to the external camera 5. A mixed signal comprising both scene video and subject video is output from the first effects generator video output.
  • The second effects generator receives eye video from the first computer 1 at its first video input and, at its second video input, the mixed scene video (including gaze information, such that a cursor appears at the location of the subject's gaze in the scene, the gaze cursor) and subject video from the first effects generator, and mixes them to produce mixed eye, scene with gaze cursor and subject video at its video output.
  • the mixer may also be formed of a digital squeezer, itself known in the art, that squeezes each of the images from the eye video camera, scene video camera and external video camera so that a complete, but reduced, image from each camera appears in the displayed image on the display monitor.
  • Mixer output is input to a video time code generator (e.g. a DatumTM model 9100A) which adds time code information to the mixer output for display.
  • the combined mixer output and time code information is input to a first video recorder (e.g. a ToshibaTM or SonyTM recorder), where it may be mixed with audio from an audio mixer, and recorded on video tape.
  • the video tape or a direct feed may be then input to a second video recorder, from which the mixer output and time code information may be displayed on the monitor 100.
  • the final mixed data from all cameras can also be recorded on compact disc, video disc, computer, or other recording devices.
  • Typical video output displayed on the monitor 100 is shown in Fig. 3 for the analysis of the table tennis test.
  • The upper right portion, about 40% of the monitor screen, shows a scene 106 as seen by scene video camera 21, which closely approximates the scene as viewed by the subject 3.
  • the cursor 104 shows the gaze location of the subject 3 as computed by the first computer 1.
  • the lower portion 102 of the screen shows the subject 3 as viewed by the external camera 5.
  • the upper left quarter 108 shows the image of the eye of the subject 3.
  • The eye view image 118 shows the center of the pupil 110 as indicated by the cross-hairs 112 and 114 and the location of the corneal reflection 116 as indicated by the cross-hairs 118 and 120. Time code information may be superimposed on the displayed image in any convenient location.
  • Each displayed frame of video is assigned a unique time code.
  • Each frame is also preferably coded with a unique identifier in milliseconds to show the frame interval, according to the standard used (30 frames per second, 60 frames per second or 100 frames per second for example).
  • a subject 3 performs a motor activity, such as participating in the table tennis test.
  • the eye video camera 19 is directed at the subject's eye after reflection from the visor 20, and eye video is output to the first computer 1.
  • the scene video camera 21 is directed at the scene by reflection from the visor 20, and scene video is also output to the first computer 1.
  • Eye video and scene video is mixed in the mixer (not shown).
  • the external video camera 5 is directed at the subject 3 and output from the external video camera 5 is mixed with the output from the eye video camera 19 and scene video camera 21.
  • a frame of video from each camera 19, 21 and 5 is mixed by the mixer such that each mixed frame of video information includes information from each of the video cameras.
  • Preferably, each mixed frame is encoded with a unique time code so that each array of values representing physical and/or eye activity has a unique identifier.
  • Preferably, the scene video mixed into the mixer output includes gaze information, such that a cursor appears at the location of the subject's gaze in the scene.
  • the mixer output may be played back and analyzed frame by frame.
  • Data values are assigned to observable physical characteristics of the motor activity of the subject, people or objects in the scene, gaze locations, gaze behaviors, eye movement, and activity performance.
  • Activity performance, or outcome measures, are recorded, such as accuracy of the trial, optimal versus non-optimal use of objects or tools, and effective or ineffective interaction with others and objects.
  • Temporal motor phases are defined based on the mechanics of the motor skills and needs of the subject. Onset, offset and duration, in milliseconds or other time durations, of the skill and phases of the skill are determined, as is the movement time of the skill. Number of steps, onset of the first and last step, step duration, and direction of stepping or other body movements may be assigned data values. Likewise, arm and hand movement and body velocity may also be assigned values.
  • Gaze locations may be identified and coded, relative to the motor phases of the skill.
  • a gaze behavior is defined as the gaze held stable on a location in the environment, or a shift in gaze from one location to another. Percent of trials in which a gaze behavior is observed is reported, as is the frequency, onset, offset, and duration of important gaze behaviors during each motor phase. Since a shift in gaze is normally initiated by eye movements, four or more gaze behaviors are identified based on definitions originating from the eye movement literature, for example, fixations, saccades, tracking and blink. Each is coded within accepted ranges using minimum duration parameters. For example, minimum fixation duration is 99.9 ms (3 or more frames) and defined as the stabilization of the gaze cursor on a location in the environment.
  • a tracking gaze behavior is coded when the eyes pursue a moving object or person for 3 or more frames or 99.9 ms.
  • a saccade is coded when a shift in gaze is observed from one location to another with movement time (MT) of 66.6 ms or more (2 or more frames). During a saccade, vision is normally suppressed.
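A hypothetical Python sketch of coding gaze behaviors from per-frame labels using the minimum-duration rules above (three or more frames for fixations and tracking, two or more for saccades, at 30 frames per second); the label scheme itself is an assumption, not part of the specification.

```python
FRAME_MS = 1000.0 / 30.0      # assuming 30 frames per second video

def code_gaze_behaviors(labels):
    """Collapse per-frame gaze labels (e.g. 'fixation', 'tracking', 'saccade')
    into coded behaviors: 3+ frames (>= 99.9 ms) for fixations and tracking,
    2+ frames (>= 66.6 ms) for saccades; shorter runs are left uncoded."""
    behaviors, start = [], 0
    for i in range(1, len(labels) + 1):
        if i == len(labels) or labels[i] != labels[start]:
            frames = i - start
            min_frames = 2 if labels[start] == "saccade" else 3
            if frames >= min_frames:
                behaviors.append((labels[start], start, frames, frames * FRAME_MS))
            start = i
    return behaviors              # (label, onset frame, frame count, duration in ms)
```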
  • MT movement time
  • Ocular functions are defined by pupil diameter, eye movements (ratio of pupil center to CR), blink rate and duration.
  • Analysis output includes graphical presentation of the temporal integration of the subject's gaze behaviors, motor behaviors, and ocular function relative to objects, persons and events in the skill. Imaging techniques (i.e. NIH Image) are also used to temporally plot the gaze relative to objects (e.g. a ball).
  • Figure 4 shows the pursuit tracking on the ball of the normal subject during 5 hits (solid line) and 5 misses (dashed line).
  • a visual angle of zero means that the subject's eye gaze was on the ball. It can be seen from Figure 4 that the normal subject detected the ball early and was able to track it over about half of the total flight time of the ball.
  • Figure 5 shows the pursuit tracking on the ball of the ADHD-PI subject during 3 hits (solid line) and 5 misses (dashed line).
  • ADHD subjects were recruited using an advertisement in a local newspaper. Forty-three individuals responded and eight were selected after screening interviews using Barkley and Murphy's ADHD Clinical Questionnaire (Barkley, R.A. and Murphy, K.R. (1998), Attention-deficit hyperactivity disorder, A Clinical Workbook, 2nd Edition (New York: Guilford Press)). In addition, each subject's physician and/or psychologist was consulted to confirm an ADHD-PI or ADHD-C diagnosis. The eight selected ADHD subjects were then age matched with seven Normal control subjects. All chosen subjects were screened for learning disabilities, anxiety disorders, reading difficulties, use of other medications and skill level in table tennis.
  • Each of the eight ADHD subjects was first tested while on medication (ADHD On). The ADHD subjects were then tested two weeks later, when they had been off their medication for at least 48 hours (ADHD Off).
  • the target cue came on 2 seconds before the ball was served.
  • the target cue came on after the serve passed through the laser beam. The laser beam was situated so that the served ball triggered the target cue light about 300 ms after the ball was served.
  • Performance accuracy was determined using 20 trials of each condition.
  • MT, QE and TMT data were derived from the first five hits in each condition, and five misses, randomly selected. All absolute measures (in milliseconds) of MT, QE, and TMT were converted to relative time (percent), following the procedures outlined above. Missing cells were filled using mean values for Group x Condition x Accuracy x Trial. Missing trials occurred due to equipment malfunction, or the participant being unable to complete the five hits to the target in the 20 trials allowed per condition.
  • The groups differed significantly, F(2, 12) = 12.13, p < .001, in the number of trials they were able to complete.
  • Table 3 shows the percent accuracy of the ADHD Off, the ADHD On and the Normal subjects during early and late conditions.
  • Scheffe post-hoc tests also revealed that the Normal subjects differed from the ADHD On subjects and the ADHD Off subjects.
  • The interaction of Group x Condition was not significant, however, indicating that the subjects in each group maintained the same level of accuracy during early and late conditions.
  • Table 4 displays the group effects for gaze pursuit tracking (QE onset, offset and duration, GF) and arm movement (MT, AVC). Also shown are the results of the repeated measures ANOVA and post-hoc tests. It can be seen from Table 4 that GF was significantly higher for both the ADHD On and the ADHD Off subjects when compared to the Normal subjects (3.83 for ADHD Off, 2.89 for ADHD On, 2.44 for Normal). The Scheffe post hoc tests showed ADHD Off did not differ from ADHD On, but both differed from Normal. QE onset was significantly later for both ADHD groups; Normal onset occurred at 14% of ball flight as compared to 20.69% for ADHD Off and 23.17% for ADHD On. Post-hoc tests were non-significant.
  • Figures 6a and 6b provide plots of the gaze pursuit of an ADHD subject and a Normal subject, respectively, expressed as visual angle on the edge of the ball.
  • the horizontal lines denote the three degree threshold for defining pursuit tracking.
  • Figure 6a shows the pursuit tracking of a 14 year old ADHD Off subject under early cue conditions. The ADHD Off subject was unable to maintain continuous tracking on the ball. A higher GF is shown and QE tracking is of short duration. Also evident is a different temporal pattern for hits and misses.
  • Figure 6b shows the gaze pursuit of a Normal subject, also age 14, under early cue conditions. The results in Figure 6b show a lower GF and a longer duration of QE tracking. Also evident is a similar temporal pattern of gaze control on hits and misses, or greater invariance in the timing of the saccadic and tracking gaze over the course of ball flight.
  • Figures 9a, 9b and 9c show the duration of tracking on the ball results (on the left) and arm velocity at contact results under early cue conditions (on the right), of three Normal subjects, respectively.
  • Figures 10a, 10b and 10c show the duration of tracking on the ball results (on the left) and arm velocity at contact results under early cue conditions (on the right), of three ADHD subjects, respectively.
  • AVC indicating motor invariance.
  • Although ADHD subjects showed normal MT onset and MT offset, they differed significantly in AVC.
  • AVC is also a good parameter to use for diagnosing individuals with ADHD.
  • When all of the collected data for the ADHD Off group is compared with the data for the ADHD On group, it is clear that medication did not significantly improve either gaze control or arm movements of those diagnosed with ADHD, under either easy or hard conditions.
  • Medication did, however, significantly improve the ability of the ADHD subjects to complete the required number of trials.

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Public Health (AREA)
  • Biomedical Technology (AREA)
  • Medical Informatics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Veterinary Medicine (AREA)
  • Biophysics (AREA)
  • Pathology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Developmental Disabilities (AREA)
  • Social Psychology (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Hospice & Palliative Care (AREA)
  • Educational Technology (AREA)
  • Child & Adolescent Psychology (AREA)
  • Human Computer Interaction (AREA)
  • Ophthalmology & Optometry (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Epidemiology (AREA)
  • Primary Health Care (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present invention relates to a method and associated apparatus for diagnosing attention deficit hyperactivity disorder (ADHD). The method comprises measuring a person's ability to visually track a moving object while performing a motor task relative to that moving object. In particular, the onset and duration of visual tracking, and the frequency of deviation from tracking, are measured while the person performs the motor task. The method also comprises measuring the person's ability to accurately perform the motor task. These measurements are then compared with non-ADHD standards in order to diagnose ADHD. The invention also relates to an apparatus comprising means for visually displaying a moving object to a person, means for measuring the person's visual tracking of the object, and means for measuring the person's ability to perform a motor task relative to the moving object.
PCT/CA2001/000406 2000-03-31 2001-03-30 Test diagnostic pour les troubles deficitaires de l'attention WO2001074236A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
AU44006/01A AU4400601A (en) 2000-03-31 2001-03-30 A diagnostic test for attention deficit hyperactivity disorder

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US19375900P 2000-03-31 2000-03-31
CA002303621A CA2303621A1 (fr) 2000-03-31 2000-03-31 Detecteur de l'hyperactivite avec deficit de l'attention
CA2,303,621 2000-03-31
US60/193,759 2000-03-31

Publications (1)

Publication Number Publication Date
WO2001074236A1 true WO2001074236A1 (fr) 2001-10-11

Family

ID=25681689

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CA2001/000406 WO2001074236A1 (fr) 2000-03-31 2001-03-30 Test diagnostic pour les troubles deficitaires de l'attention

Country Status (2)

Country Link
AU (1) AU4400601A (fr)
WO (1) WO2001074236A1 (fr)

Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006029021A2 (fr) * 2004-09-02 2006-03-16 The Mclean Hospital Corporation Methode de diagnostic d'un trouble d'hyperactivite avec deficit de l'attention
WO2007056287A2 (fr) * 2005-11-04 2007-05-18 Eye Tracking, Inc. Production de stimuli d'essai dans des supports visuels
EP1799106A1 (fr) * 2004-09-13 2007-06-27 Biocognisafe Procede et dispositif permettant d'indiquer le niveau de vigilance d'un sujet
WO2007116849A1 (fr) 2006-03-27 2007-10-18 Fujifilm Corporation Appareil, procédé et programme de production d'image
ITTO20080862A1 (it) * 2008-11-21 2010-05-22 Fond Stella Maris Dispositivo e procedimento per la misurazione del punto di mira visiva di un soggetto
EP2451340A1 (fr) * 2009-07-09 2012-05-16 Nike International Ltd Suivi de mouvement oculaire et corporel pour test et/ou entraînement
WO2013166025A1 (fr) * 2012-05-01 2013-11-07 RightEye, LLC Systèmes et procédés pour évaluer la poursuite de l'œil humain
WO2013178871A1 (fr) * 2012-05-31 2013-12-05 Nokia Corporation Dispositif de poursuite du regard destiné à établir un diagnostic médical
US9498123B2 (en) 2006-03-27 2016-11-22 Fujifilm Corporation Image recording apparatus, image recording method and image recording program stored on a computer readable medium
KR102172037B1 (ko) * 2020-04-13 2020-11-02 가천대학교 산학협력단 소아의 adhd정도 진단을 위한 활동량 분석 시스템 및 방법
US10861605B2 (en) 2016-08-22 2020-12-08 Aic Innovations Group, Inc. Method and apparatus for determining health status

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19624135A1 (de) * 1996-06-17 1997-12-18 Fraunhofer Ges Forschung Verfahren und Vorrichtung zur Kontrolle der Augenbewegung einer ein dreidimensionales bewegtes Bild betrachtenden Person
CA2209886A1 (fr) * 1996-07-15 1998-01-15 University Technologies International, Inc. Analyseur d'execution d'activite motrice de l'oeil
WO1999005988A2 (fr) * 1997-07-30 1999-02-11 Applied Science Laboratories Systeme de detection du mouvement de l'oeil utilisant une source d'eclairage circulaire, hors de l'axe
US5913310A (en) * 1994-05-23 1999-06-22 Health Hero Network, Inc. Method for diagnosis and treatment of psychological and emotional disorders using a microprocessor-based video game

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5913310A (en) * 1994-05-23 1999-06-22 Health Hero Network, Inc. Method for diagnosis and treatment of psychological and emotional disorders using a microprocessor-based video game
DE19624135A1 (de) * 1996-06-17 1997-12-18 Fraunhofer Ges Forschung Verfahren und Vorrichtung zur Kontrolle der Augenbewegung einer ein dreidimensionales bewegtes Bild betrachtenden Person
CA2209886A1 (fr) * 1996-07-15 1998-01-15 University Technologies International, Inc. Analyseur d'execution d'activite motrice de l'oeil
WO1999005988A2 (fr) * 1997-07-30 1999-02-11 Applied Science Laboratories Systeme de detection du mouvement de l'oeil utilisant une source d'eclairage circulaire, hors de l'axe

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2006029021A3 (fr) * 2004-09-02 2006-05-26 Mclean Hospital Corp Methode de diagnostic d'un trouble d'hyperactivite avec deficit de l'attention
WO2006029021A2 (fr) * 2004-09-02 2006-03-16 The Mclean Hospital Corporation Methode de diagnostic d'un trouble d'hyperactivite avec deficit de l'attention
US7435227B2 (en) 2004-09-13 2008-10-14 Biocognisafe (Bcs) Technologies Method and apparatus for generating an indication of a level of vigilance of an individual
EP1799106A1 (fr) * 2004-09-13 2007-06-27 Biocognisafe Procede et dispositif permettant d'indiquer le niveau de vigilance d'un sujet
EP1799106A4 (fr) * 2004-09-13 2008-02-13 Biocognisafe Procede et dispositif permettant d'indiquer le niveau de vigilance d'un sujet
WO2007056287A2 (fr) * 2005-11-04 2007-05-18 Eye Tracking, Inc. Production de stimuli d'essai dans des supports visuels
US8602791B2 (en) 2005-11-04 2013-12-10 Eye Tracking, Inc. Generation of test stimuli in visual media
WO2007056287A3 (fr) * 2005-11-04 2007-12-06 Eye Tracking Inc Production de stimuli d'essai dans des supports visuels
US8243132B2 (en) 2006-03-27 2012-08-14 Fujifilm Corporation Image output apparatus, image output method and image output computer readable medium
WO2007116849A1 (fr) 2006-03-27 2007-10-18 Fujifilm Corporation Appareil, procédé et programme de production d'image
EP2004039A4 (fr) * 2006-03-27 2011-12-28 Fujifilm Corp Appareil, procédé et programme de production d'image
JP2007289658A (ja) * 2006-03-27 2007-11-08 Fujifilm Corp 画像出力装置、画像出力方法、および画像出力プログラム
US9498123B2 (en) 2006-03-27 2016-11-22 Fujifilm Corporation Image recording apparatus, image recording method and image recording program stored on a computer readable medium
EP2004039A1 (fr) * 2006-03-27 2008-12-24 FUJIFILM Corporation Appareil, procédé et programme de production d'image
ITTO20080862A1 (it) * 2008-11-21 2010-05-22 Fond Stella Maris Dispositivo e procedimento per la misurazione del punto di mira visiva di un soggetto
EP2451340A1 (fr) * 2009-07-09 2012-05-16 Nike International Ltd Suivi de mouvement oculaire et corporel pour test et/ou entraînement
CN102481093A (zh) * 2009-07-09 2012-05-30 耐克国际有限公司 用于测试和/或训练的眼睛和身体运动跟踪
EP2451340A4 (fr) * 2009-07-09 2013-09-04 Nike International Ltd Suivi de mouvement oculaire et corporel pour test et/ou entraînement
US8864310B2 (en) 2012-05-01 2014-10-21 RightEye, LLC Systems and methods for evaluating human eye tracking
US8955974B2 (en) 2012-05-01 2015-02-17 RightEye, LLC Systems and methods for evaluating human eye tracking
WO2013166025A1 (fr) * 2012-05-01 2013-11-07 RightEye, LLC Systèmes et procédés pour évaluer la poursuite de l'œil humain
US9649030B2 (en) 2012-05-01 2017-05-16 RightEye, LLC Systems and methods for evaluating human eye tracking
US10512397B2 (en) 2012-05-01 2019-12-24 RightEye, LLC Systems and methods for evaluating human eye tracking
EP3733051A1 (fr) * 2012-05-01 2020-11-04 Righteye, LLC Systèmes et procédés d'évaluation de suivi de l' il humain
US11160450B2 (en) 2012-05-01 2021-11-02 RightEye, LLC Systems and methods for evaluating human eye tracking
US11690510B2 (en) 2012-05-01 2023-07-04 Righteye Llc Systems and methods for evaluating human eye tracking
WO2013178871A1 (fr) * 2012-05-31 2013-12-05 Nokia Corporation Dispositif de poursuite du regard destiné à établir un diagnostic médical
US9888842B2 (en) 2012-05-31 2018-02-13 Nokia Technologies Oy Medical diagnostic gaze tracker
US10861605B2 (en) 2016-08-22 2020-12-08 Aic Innovations Group, Inc. Method and apparatus for determining health status
US11961620B2 (en) 2016-08-22 2024-04-16 Aic Innovations Group, Inc. Method and apparatus for determining health status
KR102172037B1 (ko) * 2020-04-13 2020-11-02 가천대학교 산학협력단 소아의 adhd정도 진단을 위한 활동량 분석 시스템 및 방법

Also Published As

Publication number Publication date
AU4400601A (en) 2001-10-15

Similar Documents

Publication Publication Date Title
AU2014329339B2 (en) Eye movement monitoring of brain function
Allison et al. Combined head and eye tracking system for dynamic testing of the vestibular system
CN101232841B (zh) 眼动传感器
Shih et al. A novel approach to 3-D gaze tracking using stereo cameras
Chandra et al. Eye tracking based human computer interaction: Applications and their uses
US20060087618A1 (en) Ocular display apparatus for assessment and measurement of and for treatment of ocular disorders, and methods therefor
CN104603673B (zh) 头戴式系统以及使用头戴式系统计算和渲染数字图像流的方法
Pagano et al. Comparing measures of monocular distance perception: Verbal and reaching errors are not correlated.
US8553936B2 (en) Gaze tracking measurement and training system and method
US20110205167A1 (en) Brain concussion screening method & apparatus
EP3561795A1 (fr) Système d'entraînement de réalité augmentée
WO2001074236A1 (fr) Test diagnostic pour les troubles deficitaires de l'attention
CN109846456A (zh) 一种基于头戴显示设备的视野检查装置
JPH074345B2 (ja) 注視点マスキングによる医療診断装置
Somia et al. A computer analysis of reflex eyelid motion in normal subjects and in facial neuropathy
Zaleski-King et al. Use of commercial virtual reality technology to assess verticality perception in static and dynamic visual backgrounds
WO2021190762A1 (fr) Méthodes de réalité virtuelle et de neurostimulation conjointes pour la rééducation visuomotrice
Vickers et al. Gaze pursuit and arm control of adolescent males diagnosed with attention deficit hyperactivity disorder (ADHD) and normal controls: Evidence of a dissociation in processing visual information of short and long duration
Miyoshi et al. Input device using eye tracker in human-computer interaction
Jerome et al. The perception and estimation of egocentric distance in real and augmented reality environments
CN113729609B (zh) 同视机
CA2209886A1 (fr) Analyseur d'execution d'activite motrice de l'oeil
Turovsky et al. Characteristics of Command Generation for Oculographic Interfaces in Conditions of Vestibular Inputs
CA2303621A1 (fr) Detecteur de l'hyperactivite avec deficit de l'attention
RU2792536C1 (ru) Цифровые очки для восстановления и эмуляции бинокулярного зрения

Legal Events

Date Code Title Description
AK Designated states

Kind code of ref document: A1

Designated state(s): AE AG AL AM AT AU AZ BA BB BG BR BY BZ CA CH CN CR CU CZ DE DK DM DZ EE ES FI GB GD GE GH GM HR HU ID IL IN IS JP KE KG KP KR KZ LC LK LR LS LT LU LV MA MD MG MK MN MW MX MZ NO NZ PL PT RO RU SD SE SG SI SK SL TJ TM TR TT TZ UA UG US UZ VN YU ZA ZW

AL Designated countries for regional patents

Kind code of ref document: A1

Designated state(s): GH GM KE LS MW MZ SD SL SZ TZ UG ZW AM AZ BY KG KZ MD RU TJ TM AT BE CH CY DE DK ES FI FR GB GR IE IT LU MC NL PT SE TR BF BJ CF CG CI CM GA GN GW ML MR NE SN TD TG

121 Ep: the epo has been informed by wipo that ep was designated in this application
REG Reference to national code

Ref country code: DE

Ref legal event code: 8642

122 Ep: pct application non-entry in european phase
NENP Non-entry into the national phase

Ref country code: JP