WO2023081123A1 - Systems and methods for determining dementia and other conditions - Google Patents


Info

Publication number
WO2023081123A1
WO2023081123A1 (PCT/US2022/048519)
Authority
WO
WIPO (PCT)
Prior art keywords
subject
tracking
eye
neurological condition
eye tracking
Prior art date
Application number
PCT/US2022/048519
Other languages
French (fr)
Other versions
WO2023081123A9 (en)
Inventor
Shaun PATEL
Jonathan C. LANSEY
Irene Beatrix MEIER
Original Assignee
React Neuro Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by React Neuro Inc.
Publication of WO2023081123A1
Publication of WO2023081123A9

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 3/00: Apparatus for testing the eyes; Instruments for examining the eyes
    • A61B 3/10: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions
    • A61B 3/113: Objective types, i.e. instruments for examining the eyes independent of the patients' perceptions or reactions, for determining or recording eye movement
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/163: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/40: Detecting, measuring or recording for evaluating the nervous system
    • A61B 5/4076: Diagnosing or monitoring particular conditions of the nervous system

Definitions

  • the present disclosure generally relates to systems and methods for determining dementia and other conditions.
  • Neurodegenerative diseases include a variety of neurological conditions with progressive worsening and neuronal death. Oculomotor abnormalities are at the core of some of these diseases, as neurodegenerative processes often affect the brain circuits controlling eye movements. Their assessment may serve as a useful biomarker to help diagnose patients and monitor disease progression. For example, smooth pursuit eye movement has been shown to be impaired in Alzheimer’s disease (AD) patients. This may reflect degeneration of cortical oculomotor centers in the brain. In the smooth pursuit task, the presence of a stimulus is first represented on the retina, then transferred to the early visual cortex. From there, the hierarchy of the visual cortex is engaged, reflecting neurocognitive function and reactivity. However, difficulties in analyzing eye movements in subjects have limited the usefulness of such techniques. Thus, improvements are needed.
  • AD: Alzheimer’s disease
  • the present disclosure generally relates to systems and methods for determining dementia and other conditions.
  • the subject matter of the present disclosure involves, in some cases, interrelated products, alternative solutions to a particular problem, and/or a plurality of different uses of one or more systems and/or articles.
  • the device comprises a display, an eye tracking sensor, and a processor.
  • the processor comprises instructions that, when executed, cause the device to display, on the display, a tracking object moving in a spiral trajectory around a center point; determine, via the eye tracking sensor, eye tracking of the object by a subject; determine an error in a radial direction relative to the center point between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; and diagnose the neurological condition of the subject based on the error.
  • the device comprises a display, an eye tracking sensor; and a processor.
  • the processor comprises instructions that, when executed, cause the device to display, on the display, a tracking object moving in a nonlinear trajectory; determine, via the eye tracking sensor, eye tracking of the object by a subject; and diagnose the neurological condition of the subject based on errors between an actual position of the tracking object and a position where the subject’s eye is focused.
  • the device comprises a display, an eye tracking sensor, and a processor.
  • the processor comprises instructions that, when executed, cause the device to display, on the display, a tracking object moving in a nonlinear trajectory; determine, via the eye tracking sensor, eye tracking of the object by a subject; determine an error in a first direction and an error in a second direction between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; and diagnose the neurological condition of the subject based on the error in the first direction but not the error in the second direction.
  • Another aspect is generally directed to a method of diagnosing a neurological condition in a subject.
  • the method comprises displaying, to the subject, a tracking object moving in a spiral trajectory around a center point; determining, via an eye tracking sensor, eye tracking of the object by the subject; determining an error in a radial direction relative to the center point between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; and diagnosing a neurological condition of the subject based on the error in the radial direction.
  • the method comprises displaying, to the subject, a tracking object moving in a spiral trajectory around a center point; determining, via an eye tracking sensor, eye tracking of the object by the subject; determining an error in a radial direction relative to the center point between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; diagnosing a neurological condition of the subject based on the error in the radial direction; and treating the subject for the neurological condition based on the diagnosis.
  • the method comprises displaying, to the subject, a tracking object moving in a nonlinear trajectory; determining, via an eye tracking sensor, eye tracking of the object by the subject; determining an error in a first direction and an error in a second direction between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; and diagnosing a neurological condition of the subject based on the error in the first direction but not the error in the second direction.
  • the method comprises displaying, to the subject, a tracking object moving in a nonlinear trajectory; determining, via an eye tracking sensor, eye tracking of the object by the subject; determining an error in a first direction and an error in a second direction between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; diagnosing a neurological condition of the subject based on the error in the first direction but not the error in the second direction; and treating the subject for the neurological condition based on the diagnosis.
  • the present disclosure encompasses methods of making one or more of the embodiments described herein, for example, a device for assessing a neurological condition in a subject. In still another aspect, the present disclosure encompasses methods of using one or more of the embodiments described herein, for example, a device for assessing a neurological condition in a subject.
  • Fig. 1 illustrates a smooth pursuit trajectory, in accordance with one embodiment;
  • Figs. 2A-2D illustrate eye tracking of a smooth pursuit trajectory for four subjects, in accordance with another embodiment;
  • Figs. 3A-3D illustrate the radial errors of the trajectories shown in Figs. 2A-2D;
  • Figs. 4A-4D illustrate the angular errors of the trajectories shown in Figs. 2A-2D;
  • Fig. 5 illustrates a device for tracking eyes, in another embodiment;
  • Fig. 6 illustrates a VR device, in yet another embodiment; and
  • Fig. 7 illustrates test results involving 11 subjects, in still another embodiment.
  • the present disclosure generally relates to systems and methods for determining dementia, Parkinson’s Disease, Alzheimer’s Disease, stroke, or other conditions.
  • a subject is presented with a tracking object that moves in a spiral and/or circular pathway, and the subject’s ability to visually track the object with their eyes is determined.
  • Subjects with certain neurological conditions may have difficulties in tracking the object, particularly in a radial direction, although angular directions may be less severely impacted. Accordingly, by determining radial errors in eye tracking, the neurological condition of the subject may be assessed, and treated as necessary.
  • Other aspects as discussed herein are generally directed to devices and methods for assessing such subjects, kits for determining the neurological condition of a subject, and the like.
  • a subject, which may be a subject suspected of having some form of neurological impairment or condition, can be tested by determining the subject’s ability to visually track an object.
  • the subject is exposed to an object and asked to track the object with their eyes. This is generally known as a smooth pursuit task.
  • the object to be tracked may be a real object or a simulated object shown on a display, e.g., as part of a virtual reality device. While the subject tracks the object with their eyes, one or more eye tracking sensors may be used to track where the eyes of the subject are actually focused, i.e., to determine the apparent position of the object.
  • These tracking data based on the apparent position of the object, i.e., where the subject is looking, can be compared with the actual position of the object, and the error between the two can be determined.
  • These errors can then be analyzed to determine the subject’s neurological impairment or condition.
  • These errors may include spatial errors (e.g., where the gaze is located away from the stimulus in a random direction) and/or temporal errors (e.g., where the gaze may be located where the stimulus was moments before, or ahead in the direction that the stimulus is going to be).
  • a tracking object is moved around on a trajectory, while the eye tracking sensors used to monitor the eyes will typically record a series of saccades as the brain attempts to follow the object. This can be seen, for example, in the eye tracking measurements shown in Fig. 2D, where a healthy subject was asked to track an object moving in a spiral trajectory. There are a number of straight-line segments, overshoots, etc., which correspond to the saccades of the eyes as they track the object.
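  • The saccades just described can be exposed in recorded gaze data by, for example, thresholding the point-to-point gaze speed; samples moving faster than the threshold are flagged as saccadic. The following is a minimal sketch of that idea; the sampling interval, threshold value, and function name are illustrative assumptions, not values taken from this disclosure:

```python
import math

def detect_saccades(gaze, dt, speed_threshold):
    """Flag inter-sample intervals where gaze speed exceeds speed_threshold.

    gaze: list of (x, y) gaze positions, one sample every dt seconds.
    Returns one boolean per consecutive pair of samples.
    """
    flags = []
    for (x0, y0), (x1, y1) in zip(gaze, gaze[1:]):
        speed = math.hypot(x1 - x0, y1 - y0) / dt  # distance per second
        flags.append(speed > speed_threshold)
    return flags

# Slow drift followed by a sudden jump (a saccade-like sample):
trace = [(0.0, 0.0), (0.01, 0.0), (0.02, 0.0), (0.5, 0.4)]
flags = detect_saccades(trace, dt=1 / 60, speed_threshold=5.0)
```

In practice, eye tracking toolchains use more elaborate velocity- and dispersion-based classifiers, but the straight-line segments and overshoots visible in Fig. 2D correspond to exactly the high-speed intervals that such a filter isolates.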
  • various sensors can be employed to track a subject’s eye movement.
  • these may include video-based tracking sensors, e.g., using infrared, near-infrared, or passive light sensing, etc.
  • Such sensors can be built into glasses or other wearable devices (e.g., VR headsets, etc.) in certain embodiments.
  • a camera may be used as a sensor to track a subject’s eye movement.
  • a video camera may be used to determine where a subject’s eye is focusing.
  • the object may be moved on a trajectory that includes different directions.
  • the object may be moved in trajectories that are oblique, i.e., not horizontal or vertical, trajectories that are not straight lines, or trajectories that are nonlinear.
  • trajectories may be more difficult for a subject to smoothly track, as the eye movements necessary to follow such trajectories require coordination between different parts of the brain. In part, this is because there are different extraocular muscles that control the direction of the eye, and these must be moved in a coordinated fashion by the brain in order for the eyes to be able to track such trajectories.
  • Non-limiting examples are shown in Figs. 2A-2C for subjects with various neurological conditions.
  • the subject’s eyes in these figures were not able to follow the object in its trajectory.
  • some angular errors may be determined as a distance along the arc of a circle from the point of gaze to the point of the stimulus.
  • the radial error may be determined as a distance from the point of gaze directly to a circle with a radius equal to the radial distance of the stimulus. For instance, by focusing on radial errors, the “noise” created by angular errors may be reduced or eliminated, making it easier to identify those subjects with certain neurological conditions.
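  • One way to compute these two error components is to express both the gaze point and the stimulus point in polar coordinates about the center point: the radial error is then the difference in radii, and the angular error is the arc length, at the stimulus radius, subtended by the angle between the two points. A sketch under those assumptions (the function and variable names are illustrative):

```python
import math

def radial_angular_errors(gaze, stimulus, center=(0.0, 0.0)):
    """Decompose the gaze-to-stimulus error into radial and angular parts.

    Radial error: distance from the gaze point to the circle (about
    `center`) on which the stimulus lies, i.e. r_gaze - r_stimulus.
    Angular error: arc length along that circle between the two angles.
    """
    gx, gy = gaze[0] - center[0], gaze[1] - center[1]
    sx, sy = stimulus[0] - center[0], stimulus[1] - center[1]
    r_gaze, r_stim = math.hypot(gx, gy), math.hypot(sx, sy)
    radial_error = r_gaze - r_stim
    # Signed angle from stimulus to gaze, wrapped into (-pi, pi].
    dtheta = math.atan2(gy, gx) - math.atan2(sy, sx)
    dtheta = math.atan2(math.sin(dtheta), math.cos(dtheta))
    angular_error = r_stim * dtheta  # arc length at the stimulus radius
    return radial_error, angular_error

# Gaze at the correct radius but a quarter turn behind the stimulus:
r_err, a_err = radial_angular_errors(gaze=(1.0, 0.0), stimulus=(0.0, 1.0))
```

Here the radial error is zero while the angular error is the full quarter-circle arc, illustrating how the two dimensions separate cleanly.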
  • the system is configured to identify cognitive impairment more efficiently and more accurately than conventional approaches.
  • a tracking system is configured to identify cognitive impairment based on analyzing radial errors alone.
  • the system can reduce the error resulting from angular and radial data samples, improving accuracy in system determinations.
  • Referring to Fig. 1, an example trajectory of a tracking object for a subject’s eyes to follow is shown.
  • an object moves around a center point in an outwardly expanding spiral trajectory. Once the object reaches a certain radius, the object then follows a circular trajectory around the center point.
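  • The trajectory just described, an outward spiral that transitions into a circle once a target radius is reached, can be generated by sweeping the angle at a constant rate and capping the radius. A minimal sketch; the growth rate, angular velocity, and radius below are illustrative assumptions rather than parameters from this disclosure:

```python
import math

def spiral_then_circle(t, growth=0.05, omega=2 * math.pi / 5, r_max=1.0):
    """Position of the tracking object at time t (seconds).

    The radius grows linearly (an Archimedean spiral) until it reaches
    r_max, after which the object circles at r_max.  `omega` is the
    constant angular velocity in radians per second.
    """
    theta = omega * t
    r = min(growth * theta, r_max)
    return r * math.cos(theta), r * math.sin(theta)

# Early on the object sits at the center; much later it orbits at r_max.
x0, y0 = spiral_then_circle(0.0)
x_late, y_late = spiral_then_circle(100.0)
```

Sampling this function once per display frame yields the sequence of object positions to draw, and the same samples serve as the "actual position" track against which gaze is compared.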
  • the subject is asked to follow the object with their eyes, and the trajectory that their eyes follow is tracked with one or more eye tracking sensors.
  • Non-limiting examples of such eye tracking movements following an object trajectory are shown in Fig. 2, for subjects with dementia (Fig. 2A), Parkinson’s Disease (Fig. 2B), Alzheimer’s Disease (Fig. 2C), and a healthy subject (Fig. 2D).
  • Referring to Figs. 3 and 4, the radial and angular errors of the subject’s tracking, relative to the actual position of the tracked object, are respectively shown as a function of time for the subjects from Fig. 2 (keeping the same sequence, such that Figs. 3A and 4A are for the subject with dementia, Figs. 3B and 4B are for the subject with Parkinson’s Disease, etc.).
  • the straight lines in each of these figures are the actual position of the tracked object, while the jagged lines show the position where the eyes were focused, i.e., the apparent position of the object based on the eyes’ focus, each as a function of time.
  • the temporal errors may also be determined.
  • Fig. 4 shows that all four subjects were reasonably able to follow the object’s angular position.
  • the discrepancies between the actual position of the tracked object and the apparent position of the object based on the focal position of the eyes are not large.
  • Figs. 3A-3C show that the subjects with various neurological conditions found it difficult to track the object’s radial position, in contrast with Fig. 3D for a healthy subject.
  • the discrepancies in positions in these figures are relatively large, and in some cases, the apparent position of the object does not appear to be correlated with its actual position.
  • eye tracking data of a subject may be processed to reduce or eliminate such errors.
  • data analysis of such eye tracking may be significantly improved by focusing on radial errors, rather than angular errors. This can significantly reduce the amount of noise in the data and improve its quality, thereby improving diagnosis.
  • certain embodiments are directed to systems and methods for determining a neurological condition in a subject by focusing on radial errors in eye tracking, and optionally treating those subjects with neurological conditions.
  • a variety of different neurological conditions may be diagnosed in some aspects as discussed herein.
  • the subject may be one that has or is suspected of potentially having a neurological condition, and systems and methods such as those described herein may be used to diagnose the subject as having a neurological condition, and in some cases, to facilitate treatment of the subject.
  • the specific neurological condition may also be determined.
  • neurological conditions that can be diagnosed include dementia, Parkinson’s Disease, Alzheimer’s Disease, stroke, etc.
  • a tracking object is presented to a subject.
  • the subject is typically human.
  • the tracking object may be a real object, or it may be a projection of an object on a display, etc.
  • the tracking object may have any appearance, e.g., one that allows the subject to track it by simply focusing on it with their eyes.
  • the tracking object may be of any suitable size, shape, color, form, etc.
  • the tracking object may have a relatively small size, e.g., such that its apparent position, as observed by the subject, is known to a relatively high degree of precision.
  • the tracking object may appear to be a circle, a ball, or a sphere that has a diameter of between 10 mm and 20 mm, and may be presented to appear at a distance of between 0.8 m and 1.2 m from the subject.
  • the tracking object may be larger or smaller, and/or may appear closer or farther away from the subject.
  • the tracking object may grow or shrink in size (e.g., such that the object appears to go closer to or farther away from the subject).
  • the tracking object may appear to be at least 10 cm, at least 20 cm, at least 50 cm, at least 100 cm, at least 200 cm, at least 500 cm, at least 700 cm, at least 1 m, at least 1.3 m, at least 1.5 m, at least 2 m, at least 3 m, at least 5 m, at least 10 m, etc., from the subject.
  • the tracking object may appear to be no more than 10 m, no more than 5 m, no more than 3 m, no more than 2 m, no more than 1.5 m, no more than 1.3 m, no more than 1 m, no more than 700 cm, no more than 500 cm, no more than 200 cm, no more than 100 cm, no more than 50 cm, no more than 20 cm, no more than 10 cm, etc. Combinations of any of these are also possible, e.g., the object may appear to be between 10 m and 15 m, between 50 cm and 100 cm, between 700 cm and 1 m, etc., from the subject. The object may appear to stay a constant distance from the subject, and/or the object may have a trajectory where it appears to move closer and/or farther away from the subject.
  • the tracking object may appear to be of any size.
  • the tracking object may appear to have a maximum dimension of at least 2 mm, at least 3 mm, at least 5 mm, at least 7 mm, at least 10 mm, at least 20 mm, at least 30 mm, at least 50 mm, at least 70 mm, at least 100 mm, at least 200 mm, at least 300 mm, at least 500 mm, at least 700 mm, at least 1 m, etc.
  • the tracking object may also appear to have, in certain embodiments, a maximum dimension of no more than 1 m, no more than 700 mm, no more than 500 mm, no more than 300 mm, no more than 200 mm, no more than 100 mm, no more than 70 mm, no more than 50 mm, no more than 30 mm, no more than 20 mm, no more than 10 mm, no more than 7 mm, no more than 5 mm, no more than 3 mm, no more than 2 mm, etc. Combinations of any of these dimensions are also possible.
  • the object may appear to have a maximum dimension of between 10 mm and 20 mm, between 200 mm and 300 mm, between 3 mm and 10 mm, etc.
  • multiple test runs can be executed, with observation information captured using tracking objects of different sizes for the respective tests.
  • the tracking object may have any shape or color.
  • the tracking object may appear to be a circle, a ball, or a sphere.
  • the tracking object may be a square, a triangle, a cube, a block, a cylinder, a tetrahedron, a polyhedron, or appear to be a flat shape or a real object, etc.
  • the object may have a single color, or exhibit a range of colors.
  • the object may change its appearance and/or size as it moves along the trajectory, at specific or random periods of time, or the like.
  • the tracking object may be moved in a nonlinear trajectory, for example, a circle, a spiral, or others such as those discussed herein.
  • the position of the tracking object relative to the center point may be described by the distance between the tracking object and the center point, i.e., the radius, and by the angle of the tracking object relative to the center point.
  • Errors in positioning along the radial dimension are radial errors
  • errors in positioning along the angular dimension are angular errors.
  • temporal errors may also be determined, e.g., in addition to or instead of spatial errors.
  • errors are not uniformly distributed, but can be dissected into different dimensions (for example, radial and angular errors), and subjects with certain neurological conditions may exhibit more errors of one type than another.
  • This unusual property can be used, according to some embodiments as described herein, to remove “noise” from eye tracking measurements and thereby improve the ability to diagnose or treat certain neurological conditions.
  • using eye tracking measurements while removing certain types of errors in movement may be useful for diagnosing or treating dementia, Parkinson’s Disease, Alzheimer’s Disease, stroke, concussion, ADHD, or other conditions such as those described herein.
  • Therapeutic treatment may be symptomatic treatment in order to relieve the symptoms of the specific indication or causal treatment in order to reverse or partially reverse the conditions of the indication or to stop or slow down progression of the condition. Treatments such as those discussed herein may be used for instance as therapeutic treatment over a period of time as well as for chronic therapy.
  • the tracking object may be moved along a trajectory for the subject to follow.
  • the tracking object may be moved in a linear or a nonlinear trajectory, for example, a sinusoid, a circle, a spiral, a square or other polygonal shape, a random trajectory, etc.
  • a nonlinear trajectory can be more difficult for a subject to smoothly track, and may be more useful for determining subjects with certain neurological conditions.
  • the tracking object may proceed along the trajectory at any suitable speed, angular velocity, etc. For instance, the speed or the angular velocity may stay relatively constant, or may change (e.g., smoothly or in jumps, at regular or irregular intervals, etc.).
  • the tracking object may be moved to follow a trajectory that, at its maximum dimension, appears to subtend an angle of at least 30°, at least 40°, at least 45°, at least 50°, at least 60°, at least 70°, at least 80°, at least 90°, at least 100°, at least 110°, at least 120°, at least 130°, at least 140°, at least 150°, at least 160°, at least 170°, at least 180°, etc., from the subject’s point of view.
  • the trajectory in its maximum dimension, may be no more than 180°, no more than 170°, no more than 160°, no more than 150°, no more than 140°, no more than 130°, no more than 120°, no more than 110°, no more than 100°, no more than 90°, no more than 80°, no more than 70°, no more than 60°, no more than 50°, no more than 45°, no more than 40°, no more than 30°, etc. Combinations of these angles are also possible.
  • the trajectory of the object may appear to subtend an angle, from the subject’s point of view, of between 45° and 60°, between 90° and 180°, between 100° and 150°, etc.
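  • The relationship between a subtended visual angle and the trajectory’s apparent extent follows from simple trigonometry: an extent s viewed from distance d subtends 2·atan(s / (2d)). A small sketch of that conversion (the example numbers are illustrative, not parameters from this disclosure):

```python
import math

def extent_for_subtended_angle(angle_deg, distance):
    """Physical extent that subtends angle_deg at the given viewing distance."""
    return 2 * distance * math.tan(math.radians(angle_deg) / 2)

def subtended_angle_deg(extent, distance):
    """Visual angle (degrees) subtended by an extent at a viewing distance."""
    return math.degrees(2 * math.atan(extent / (2 * distance)))

# A trajectory subtending 90 degrees viewed from 1 m spans 2 m:
extent = extent_for_subtended_angle(90, distance=1.0)
```

The same conversion applies to the object sizes discussed above, e.g., relating a 10-20 mm object at roughly 1 m to its visual angle.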
  • the tracking object may be moved around a center point, for example, in a substantially circular, elliptical, or spiral trajectory.
  • the tracking object may follow any of a variety of trajectories.
  • the trajectories are defined using a center point, although this is not required.
  • combinations of these may be used in some embodiments; for instance, the tracking object may be moved in a spiral trajectory followed by a circular trajectory, in a circular trajectory followed by a spiral trajectory, in a spiral trajectory followed by an elliptical trajectory, an elliptical trajectory with non-elliptical deviations, a circular trajectory with non-circular deviations, etc.
  • the trajectory may appear to be a 2-dimensional trajectory (i.e., appearing to be within a plane) or a 3-dimensional trajectory.
  • the tracking object may appear to be approaching towards and/or receding from the subject, e.g., instead of or in addition to moving in a planar trajectory.
  • the object may follow a spiral trajectory for at least 5 seconds, at least 10 seconds, at least 15 seconds, at least 20 seconds, at least 25 seconds, at least 30 seconds, at least 40 seconds, at least 45 seconds, at least 50 seconds, at least 1 minute, at least 1.5 minutes, at least 2 minutes, at least 3 minutes, at least 4 minutes, at least 5 minutes, etc.
  • the tracking object may start near a center point, then move in a clockwise or counterclockwise direction around the center point, while the radial distance of the tracking object away from the center point may increase, for example, uniformly, linearly, or nonlinearly, etc. In another embodiment, this may be in reverse; i.e., the tracking object may spiral in towards the center point.
  • the spiral may take a variety of forms, e.g., the tracking object may proceed along a trajectory that spirals around a center point.
  • the spiral is an Archimedean spiral.
  • the spiral is formed such that the angular velocity of the tracking object is constant or substantially constant, or that the pitch of the spiral is constant (e.g., if the spiral trajectory is described by r = a + bx, where r is the radius and x is the angle, then a and/or b may independently be constant).
  • the angular velocity of the tracking object, and/or the pitch of the spiral is not constant or substantially constant.
  • a and/or b may be a function of time, position, angle, or the like.
  • the spiral may not necessarily be uniform, or describable by an equation such as r = a + bx.
  • At least a portion of the trajectory that the tracking object follows is circular or substantially circular.
  • the object may follow a circular trajectory for at least 5 seconds, at least 10 seconds, at least 15 seconds, at least 20 seconds, at least 25 seconds, at least 30 seconds, at least 40 seconds, at least 45 seconds, at least 50 seconds, at least 1 minute, at least 1.5 minutes, at least 2 minutes, at least 3 minutes, at least 4 minutes, at least 5 minutes, etc.
  • the tracking orbit may have any suitable dimension.
  • errors in eye tracking can be determined in various aspects by comparing the actual position of the tracking object (e.g., where it appears on a display) with the apparent position that the subject’s eyes are focused (e.g., as determined using one or more eye tracking sensors).
  • if the eyes were perfect, then the actual position of the object would be matched by the focal position of the subject’s eyes, i.e., the object’s apparent position.
  • the eyes are not able to perfectly track an object, and there will be a series of errors, overshoots, saccades, and the like. These differences or errors in position can then be analyzed to diagnose a neurological condition, and optionally used to determine a treatment for the neurological condition.
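  • As a hedged illustration of how such error analysis might feed a determination, one could summarize a run by the root-mean-square (RMS) radial error and compare it against a calibrated cut-off. The disclosure leaves the actual decision rule open; the function names, threshold, and example error values below are hypothetical:

```python
import math

def rms(values):
    """Root-mean-square of a sequence of error values."""
    return math.sqrt(sum(v * v for v in values) / len(values))

def flag_impairment(radial_errors, threshold):
    """Flag a run whose RMS radial error exceeds a calibrated threshold.

    A minimal thresholding sketch; `threshold` is hypothetical and would
    need to be calibrated against reference (e.g., healthy) subjects.
    """
    return rms(radial_errors) > threshold

healthy_run = [0.01, -0.02, 0.015, -0.01]  # small radial errors
impaired_run = [0.3, -0.5, 0.45, -0.4]     # large, poorly tracked radii
```

Here the small-error run falls below the threshold while the large-error run is flagged; in practice the cut-off would come from reference data rather than being fixed in advance.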
  • the errors may be dissected into, at least, a first direction and a second direction, for example, two orthogonal directions (e.g., horizontal and vertical), or in some cases, a radial direction and an angular direction.
  • errors in one direction may be ignored or at least minimized.
  • subjects with certain neurological conditions may find it more difficult to track objects in a radial direction.
  • the errors in eye tracking can be dissected into at least two dimensions, thus yielding errors in tracking along a first direction (e.g., a radial direction) and errors in tracking along a second direction (e.g., an angular direction).
  • the errors may be dissected into three dimensions, for example, in cases where the trajectory of the tracking object appears to move in three dimensions.
  • the errors may be dissected into errors in three orthogonal directions (e.g., horizontal, vertical, and to/from the subject), or in some cases, a radial direction, an angular direction, and to/from the subject).
  • eye tracking errors that are determined may also be self-authenticating.
  • errors between the actual position of the tracking object and its apparent position as determined by the focus of the eyes may be dissected into errors along two different dimensions (e.g., radial vs. angular, horizontal vs. vertical, etc.). The errors observed in each dimension can be compared. If the errors do not fall into an expected error distribution as discussed herein, e.g., the errors are more randomly distributed among the different types of errors, then it can be concluded that the test was not trustworthy, e.g., due to the subject not intentionally tracking the tracking object. In addition, it is believed that such self-authenticating measurements have not previously been reported.
  • a tracking system can implement self-authenticating measurements.
  • architecting the tracking system to generate self-authenticating measurements improves over conventional approaches, for example, approaches where systems collect or use multiple runs to account for errors (e.g., user-induced errors).
  • the system can be configured to identify errors during test execution and optionally, restart and/or stop such a test run. Accordingly, in some embodiments, the system can identify and act on errors before a human operator or another conventional approach would have even known such an issue exists.
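  • One hedged reading of this self-authentication idea in code: compare the spread of radial errors against the spread of angular errors. A genuine run is expected to show structure, either uniformly small errors (an attentive, healthy subject) or radial-dominated errors (an impaired subject), whereas random gazing produces errors that are both large and evenly split between the two dimensions. The function name and thresholds below are illustrative assumptions:

```python
def variance(values):
    """Population variance of a sequence of error values."""
    mean = sum(values) / len(values)
    return sum((v - mean) ** 2 for v in values) / len(values)

def run_is_trustworthy(radial_errors, angular_errors,
                       large=0.2, ratio_band=(0.5, 2.0)):
    """Heuristic self-authentication check (illustrative thresholds).

    A run is flagged as untrustworthy when its errors are both large and
    evenly split between the radial and angular dimensions, i.e. they
    look randomly distributed rather than following the expected
    radial-dominated (impaired) or uniformly small (attentive) pattern.
    """
    vr, va = variance(radial_errors), variance(angular_errors)
    both_large = vr > large ** 2 and va > large ** 2
    ratio = vr / max(va, 1e-12)
    even_split = ratio_band[0] < ratio < ratio_band[1]
    return not (both_large and even_split)
```

Running such a check during test execution is one way the system could restart or stop a run before a human operator would notice the problem.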
  • a subject may be asked to perform a cognitive task while tracking an object.
  • cognitive tasks include, but are not limited to, counting numbers in some fashion (e.g., counting forwards, counting backwards to 100, counting backwards in steps of 7), reciting multiplication tables, identifying songs being played in the background, reciting state or country names, or the like.
  • the tracking object presented to a subject may be presented using a variety of equipment.
  • the tracking object may be the image of an object on a display, while the eyes of the subject are detected using one or more eye tracking sensors. These may be controlled by a processor.
  • system 10 is used to test a subject’s eyes 60.
  • System 10 includes a display 20, upon which an object 25 is shown.
  • the object may appear to be stationary or moving, e.g., in trajectories such as those discussed herein.
  • Eye tracking sensors 30 can be used to track the position of eyes 60 of the subject.
  • the sensors include a light source 32 and a camera 35 that records light 37 passing between light source 32 and camera 35.
  • the angle of deflection of the light may be used to determine where the eyes are focused.
  • This can be used to determine the apparent position 40 of the object, for example, based on the expected focal position of the eyes 45.
  • the difference between the actual position and the apparent position 40 of object 25 is the error. As discussed herein, this can be analyzed (for example, using processor 50) to determine or diagnose a neurological condition in a subject.
  • the components shown in Fig. 5 can be contained within a headset such as a VR headset, for example, one that can be worn by a subject as is shown in Fig. 6.
  • VR headsets can be readily obtained commercially; non-limiting examples include Oculus Quest 2, Sony PlayStation VR, HP Reverb G2, or the like.
  • the headset may contain therein a display that can be viewed by the subject, and optionally one or more eye tracking sensors, e.g., as is shown in Fig. 5.
  • a processor may also be present within the headset, and/or located externally of the headset (for example, in wired or wireless communication with the headset).
  • the headset may be able to send and/or receive signals with a processor that is external to the headset.
  • the headset may be self-contained, e.g., able to conduct a test on a subject without requiring any additional controls.
  • the device may be any suitable device.
  • the device may be a room-sized device, a VR (virtual reality) system, a headset (e.g., a VR headset), or the like.
  • the device may include a variety of displays, eye tracking sensors, processors, and/or other components, etc., in various embodiments. Displays, VR systems, headsets, and other similar components may also be obtained commercially.
  • the device can include a frame that permits a subject to rest their head within the framework to observe the display.
  • one or more sensors can be placed throughout the framework or device to provide measurements of the subject’s movement and/or position, etc.
  • the device may include a display for displaying an image of an object to be tracked.
  • the display may be a single screen, or may comprise two or more screens (for example, one for each eye).
  • the display may be provided with instructions (e.g., from a processor) to display an image or a sequence of image frames, e.g., such that an object shown on the display appears to move (for example, as discussed herein).
  • a magnifying lens may be positioned between the display and the subject’s eyes.
  • the display may be any suitable display for producing an image, for example, LED displays, CRT displays, LCD displays, or the like.
  • the device may also include one or more eye tracking sensors for tracking the eyes. Any of a variety of eye tracking sensors may be used, many of which can be obtained commercially.
  • the eye tracking sensor may comprise an illumination source, such as an infrared or near-infrared light-emitting diode (LED) configured to illuminate an eye with light, and one or more sensors configured to detect the light reflected by the eye.
  • the sensors may include cameras, search coils, electrooculograms (for potential measurement), etc.
  • Other examples include video-based tracking sensors, infrared, near-infrared, and passive light sensing, etc.
  • the sensors may be built into glasses or other devices (e.g., VR headsets, etc.), including any of those described herein.
  • the sensors may, in some embodiments, capture or sample at relatively high frequencies, e.g., at least 30 Hz, at least 40 Hz, at least 50 Hz, at least 60 Hz, at least 100 Hz, at least 240 Hz, at least 350 Hz, at least 500 Hz, at least 1000 Hz, at least 1250 Hz, at least 2000 Hz, etc.
  • the sensors may operate without the use of an illumination source.
  • other optical methods may be used in other embodiments.
  • a processor is used to produce images on the display and track eye movements via one or more eye tracking sensors.
  • the processor may include any of various types of circuits and/or computing devices. These may include one or more processors and/or non-transitory computer-readable storage media (e.g., memory, non-volatile storage devices, etc.).
  • the processor may control writing data to and/or reading data from memory, and/or non-volatile storage devices.
  • the processor can be a programmed microprocessor or microcomputer, which can be custom made or purchased from computer chip manufacturers.
  • the processor may execute one or more processor-executable instructions stored in one or more non-transitory computer-readable storage media, which may serve as non-transitory computer-readable storage media storing processor-executable instructions for execution by the processor.
  • the device may be a smartphone.
  • the smartphone may contain a camera (which can be used as an eye tracking sensor), a display (which can be used to display an object to be tracked), and a processor.
  • the smartphone may be held such that a built-in camera within the smartphone faces the user while the user views the display of the smartphone, e.g., such that the camera can be used to track the user’s eyes.
  • the smartphone may be connected to an external camera and/or an external display, i.e., instead of and/or in addition to using a built-in camera and/or display on the smartphone.
  • the processor may be programmed to produce images on the display and/or track eye movements via the camera (e.g., used as an eye tracking sensor), for example, as discussed herein.
  • the device may be a computer, e.g., a desktop computer or a laptop computer.
  • the computer may be connected to an external camera, and/or contain a built-in camera, which can be used as an eye tracking sensor, e.g., as discussed herein, such that the camera can be used to track the user’s eyes.
  • the computer may also be connected to an external display, and/or contain a built-in display, for example, which can be used to display an object to be tracked.
  • the camera may be positioned such that the camera is able to image the user’s eyes while the user views the display.
  • the computer may also contain a processor, e.g., which may be programmed to produce images on the display and/or track eye movements via a camera (for example, used as an eye tracking sensor), such as discussed herein.
  • the device may be positioned on a stationary object, e.g., a table or a counter. However, in other embodiments, the device may not necessarily be stationary. For example, the device may be a smartphone that is held by the user.
  • positioning or motor corrections may be applied to the program, e.g., by sampling gyroscope or accelerometer data from the device in order to correct for the position of the device when tracking the user’s eyes and/or tracking motion of the display, etc.
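As a non-authoritative sketch of such a motion correction (the function names and the simple double-integration scheme are our own illustration, not specified by the disclosure), accelerometer samples from the device might be integrated into a displacement estimate that is then subtracted from the measured gaze coordinates:

```python
def integrate_accel(samples, dt):
    """Double-integrate accelerometer samples (m/s^2) taken at interval
    dt (s) into an estimated device displacement (m). Illustrative only;
    a real correction would also filter sensor drift."""
    velocity = 0.0
    displacement = 0.0
    for accel in samples:
        velocity += accel * dt
        displacement += velocity * dt
    return displacement

def correct_gaze(gaze_xy, device_dx, device_dy):
    """Shift a measured gaze point by the device's estimated in-plane
    displacement so the gaze is expressed in display coordinates."""
    return (gaze_xy[0] - device_dx, gaze_xy[1] - device_dy)
```

In practice the displacement estimates for each axis would be recomputed per frame and applied to the corresponding gaze sample.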
  • The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of processor-executable instructions that can be employed to program a computer or other processor (physical or virtual) to implement various aspects of embodiments as discussed herein. Additionally, according to one embodiment, one or more computer programs that when executed perform methods of the disclosure provided herein need not reside on a single computer or processor, but may be distributed in a modular fashion among different computers or processors to implement various aspects of the disclosure provided herein.
  • Processor-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc. that perform tasks or implement abstract data types.
  • functionality of the program modules may be combined (e.g., centralized) or distributed.
  • the processor may be used to compute errors, e.g., between the actual position of the tracked object and the apparent position of the object based on the focal position of the eyes, and in some embodiments, dissect such errors into different dimensions, for example, radial and angular errors, horizontal and vertical errors, etc.
  • the processor may also output a diagnosis or a determination of the neurological condition of the subject, e.g., based on such error determinations, and optionally, one or more treatments for the neurological condition based on the diagnosis.
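The radial/angular dissection described in these embodiments could be sketched as follows (a hypothetical illustration; `decompose_error` and its sign conventions are ours, not the disclosure’s):

```python
import math

def decompose_error(target_xy, gaze_xy, center_xy=(0.0, 0.0)):
    """Split the tracking error between the actual object position and
    the gaze position into radial and angular components relative to a
    center point. Radial error is in the same units as the inputs;
    angular error is in radians, wrapped to (-pi, pi]."""
    tx, ty = target_xy[0] - center_xy[0], target_xy[1] - center_xy[1]
    gx, gy = gaze_xy[0] - center_xy[0], gaze_xy[1] - center_xy[1]
    radial_error = math.hypot(gx, gy) - math.hypot(tx, ty)
    angular_error = math.atan2(gy, gx) - math.atan2(ty, tx)
    angular_error = (angular_error + math.pi) % (2 * math.pi) - math.pi
    return radial_error, angular_error
```

A downstream classifier could then operate on the radial component alone, consistent with the radial-error focus described herein.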
  • a power supply, which may be rechargeable
  • an antenna, a radio frequency (RF) receiver and/or transmitter, analog-to-digital converters, a real-time clock, etc.
  • the antenna can be used to send and/or receive signals from an external device, such as a laptop, mobile device, computer, etc. This may be used, for example, to control operation of the device, transmit data to or from the device, modify operational parameters of the device, or the like.
  • RF: radio frequency
  • This non-limiting example illustrates a virtual reality-based system for neurocognitive testing utilizing eye-tracking technology to measure parameters such as left and right eye positions, eye directions, gaze vectors, and pupil characteristics in each frame of recorded activity. These measures can then be analyzed to reconstruct eye movement patterns and derive more complex behaviors such as saccades, blinks, and attentional preferences to inform diagnostic and interventional efforts for various medical conditions.
  • the system in this example utilizes a 2D and 3D smooth-pursuit task, in which a stimulus is shown on the screen and the user must track the stimulus with their eyes. The stimulus moves outwardly along, for example, an Archimedes Spiral with monotonically increasing radius and fixed angular velocity.
  • the stimulus can also circle along the same radius a few times to simulate the regular smooth-pursuit task.
  • Preliminary data has shown statistical differences between healthy individuals and individuals with dementia.
  • the data have revealed differences in oblique eye movement patterns in patients with Alzheimer’s disease, with patients having a markedly different radial error throughout the task, while differences in performance on angular error measures are not statistically significant.
  • differences can also be seen in additional cognitive disorders such as Parkinson’s disease, stroke, traumatic brain injury, and other neurological conditions.
  • Neurodegenerative diseases include a variety of neurological conditions with progressive worsening and neuronal death. Oculomotor abnormalities are at the core of these disorders, as neurodegenerative processes involve the brain circuits of eye movements. Their assessment may serve as a useful biomarker to help diagnose patients and monitor disease progression. Smooth pursuit eye movement may be impaired in Alzheimer’s Disease (AD) patients, with impairments found across motion conditions at frequencies representing varying ranges of target velocity. AD patients’ gains were reduced at high target velocities and lower target accelerations; they made more frequent saccades, had more saccadic intrusions, and led the target with their eyes in all 3 conditions.
  • AD: Alzheimer’s Disease
  • the smooth pursuit 2D and 3D task utilizes a stimulus in the form of a 15 millimeter white sphere displayed on the virtual reality screen 1 meter from the eyes of the user.
  • this stimulus moves outwardly along an Archimedes Spiral with monotonically increasing radius and fixed angular velocity of about 3 radians per second for 15 seconds.
  • the radius stops increasing and the stimulus circles along a constant trajectory for 15 more seconds to encompass the normal smooth pursuit task.
  • the subject is instructed to track the stimulus dot with their eyes without moving their head throughout the entire task.
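The stimulus trajectory in this example might be generated as in the following sketch; the sampling rate and the maximum radius are assumed values for illustration, since the example specifies only the angular velocity and the two 15-second phases:

```python
import math

def spiral_trajectory(omega=3.0, spiral_s=15.0, circle_s=15.0,
                      r_max=0.5, fs=60.0):
    """Generate (x, y) points for an Archimedes Spiral whose radius
    grows linearly to r_max over spiral_s seconds at fixed angular
    velocity omega (rad/s), then stays constant for circle_s seconds.
    r_max and fs are illustrative assumptions."""
    points = []
    n = int((spiral_s + circle_s) * fs)
    for i in range(n):
        t = i / fs
        r = r_max * min(t / spiral_s, 1.0)  # linear growth, then constant
        theta = omega * t
        points.append((r * math.cos(theta), r * math.sin(theta)))
    return points
```

Each generated point would be rendered as the sphere’s position for one display frame, producing the outward spiral followed by the circular phase.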
  • a subject puts on a headset that includes sensors that allow eye tracking to occur.
  • the subject views a screen where an object that is to be tracked is moved in a generally spiral trajectory, e.g., along an Archimedes Spiral.
  • the object can be displayed, for example, such that it appears to be a sphere with a diameter of 0.15 m positioned 1 m away from the viewer.
  • the radius about a central point at which the sphere moves is 4.6 m.
  • the speed at which the object is moved may be such that it appears to orbit once about the central point in 15 seconds. See, e.g., Fig. 1.
  • this is by way of example only, and other speeds, objects, trajectories, etc., may be used in other embodiments.
  • Fig. 2 shows the projected eye tracking of the subject on the tracking object.
  • the x and y axes are measured in apparent x-y position of the subject’s focus during the experiment, measured in meters away from the central point, with positive numbers meaning right or upwards.
  • Subject A had dementia (Fig. 2A)
  • Subject B had Parkinson’s Disease (Fig. 2B)
  • Subject C had Alzheimer’s (Fig. 2C)
  • Subject D was a healthy subject (Fig. 2D). It was found that radial error was very poorly controlled in all but the healthy subject, as is shown in Fig. 3, which shows the radial error component of the respective trajectories in Figs. 2A-2D.
  • the straight line shows the actual trajectory of the tracking object (i.e., increasing linearly for the first 15 seconds, before staying constant thereafter), while the experimental line shows the eye tracking position of the subject following the object.
  • the x axis is time in seconds while the y axis shows the radial error in tracking. It will be noted that the actual position of the tracking object (i.e., the theoretical position of where the eyes should be tracking) and the actual tracking position of the eyes are poorly correlated in all but the healthy subject.
  • Fig. 4 shows the angular error component of the respective trajectories in Figs. 2A-2D.
  • the straight line shows the actual trajectory of the tracking object (i.e., increasing linearly for the first 15 seconds, before staying constant thereafter), while the experimental line shows the eye tracking position of the subject following the object.
  • the x axis is time in seconds while the y axis is the angle of the object in radians. It will be noted that the actual position of the tracking object (i.e., the theoretical position of where the eyes should be tracking) and the actual tracking position of the eyes were reasonably correlated in all 4 cases.
  • Fig. 7 illustrates a histogram where the Y-axis represents number of subjects in specified bins.
  • the X-axis displays an individual’s variation of angular error in a smooth pursuit trajectory (using a testing procedure similar to that described in Example 1).
  • the solid bars primarily on the left represent the 10 players who were normal, while the gray bars primarily on the right represent players who a physical therapist independently marked as experiencing concussive symptoms.
  • players who were normal exhibited statistically significant lower amounts of angular error, as compared to players who experienced concussions.
  • this example demonstrates that testing using a smooth pursuit trajectory can be used to identify normal and concussed subjects.
  • a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
  • “or” should be understood to have the same meaning as “and/or” as defined above.
  • the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
  • “at least one of A and B” can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.

Abstract

The present disclosure generally relates to systems and methods for determining dementia, Parkinson's Disease, Alzheimer's Disease, stroke, or other conditions. According to some aspects, a subject is presented with a tracking object that moves in a spiral and/or circular pathway, and the subject's ability to visually track the object with their eyes is determined. Subjects with certain neurological conditions may have difficulties in tracking the object, particularly in a radial direction, although angular directions may be less severely impacted. Accordingly, by determining radial errors in eye tracking, the neurological condition of the subject may be assessed, and treated as necessary. Other aspects as discussed herein are generally directed to devices and methods for assessing such subjects, kits for determining the neurological condition of a subject, and the like.

Description

SYSTEMS AND METHODS FOR DETERMINING DEMENTIA AND OTHER CONDITIONS
RELATED APPLICATIONS
This application claims the benefit of U.S. Provisional Patent Application Serial No. 63/274,793, filed November 2, 2021, entitled “Systems and Methods for Determining Dementia and Other Conditions,” by Patel, et al., and of U.S. Provisional Patent Application Serial No. 63/339,002, filed May 6, 2022, entitled “Systems and Methods for Determining Dementia and Other Conditions,” by Patel, et al. Each of the above is incorporated herein by reference in its entirety.
FIELD
The present disclosure generally relates to systems and methods for determining dementia and other conditions.
BACKGROUND
Neurodegenerative diseases include a variety of neurological conditions with progressive worsening and neuronal death. Oculomotor abnormalities are at the core of some of these diseases, as neurodegenerative processes often affect the brain circuits of eye movements. Their assessment may serve as a useful biomarker to help diagnose patients and monitor disease progression. For example, smooth pursuit eye movement has been shown to be impaired in Alzheimer’s disease (AD) patients. This may reflect degeneration of cortical oculomotor centers in the brain. In the smooth pursuit task, the presence of a stimulus is first represented on the retina, then transferred to the early visual cortex. From there, the hierarchy of the visual cortex is engaged, reflecting neurocognitive function and reactivity. However, difficulties in analyzing eye movements in subjects have limited the usefulness of such techniques. Thus, improvements are needed.
SUMMARY
The present disclosure generally relates to systems and methods for determining dementia and other conditions. The subject matter of the present disclosure involves, in some cases, interrelated products, alternative solutions to a particular problem, and/or a plurality of different uses of one or more systems and/or articles.
One aspect is generally directed to a device for diagnosing a neurological condition in a subject. In one set of embodiments, the device comprises a display, an eye tracking sensor, and a processor. In some cases, the processor comprises instructions that, when executed, cause the device to display, on the display, a tracking object moving in a spiral trajectory around a center point; determine, via the eye tracking sensor, eye tracking of the object by a subject; determine an error in a radial direction relative to the center point between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; and diagnose the neurological condition of the subject based on the error.
In another set of embodiments, the device comprises a display, an eye tracking sensor; and a processor. In some embodiments, the processor comprises instructions that, when executed, cause the device to display, on the display, a tracking object moving in a nonlinear trajectory; determine, via the eye tracking sensor, eye tracking of the object by a subject; and diagnose the neurological condition of the subject based on errors between an actual position of the tracking object and a position where the subject’s eye is focused.
In another set of embodiments, the device comprises a display, an eye tracking sensor, and a processor. In certain embodiments, the processor comprises instructions that, when executed, cause the device to display, on the display, a tracking object moving in a nonlinear trajectory; determine, via the eye tracking sensor, eye tracking of the object by a subject; determine an error in a first direction and an error in a second direction between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; and diagnose the neurological condition of the subject based on the error in the first direction but not the error in the second direction.
Another aspect is generally directed to a method of diagnosing a neurological condition in a subject. In one set of embodiments, the method comprises displaying, to the subject, a tracking object moving in a spiral trajectory around a center point; determining, via an eye tracking sensor, eye tracking of the object by the subject; determining an error in a radial direction relative to the center point between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; and diagnosing a neurological condition of the subject based on the error in the radial direction.
According to another set of embodiments, the method comprises displaying, to the subject, a tracking object moving in a spiral trajectory around a center point; determining, via an eye tracking sensor, eye tracking of the object by the subject; determining an error in a radial direction relative to the center point between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; diagnosing a neurological condition of the subject based on the error in the radial direction; and treating the subject for the neurological condition based on the diagnosis.
In yet another set of embodiments, the method comprises displaying, to the subject, a tracking object moving in a nonlinear trajectory; determining, via an eye tracking sensor, eye tracking of the object by the subject; determining an error in a first direction and an error in a second direction between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; and diagnosing a neurological condition of the subject based on the error in the first direction but not the error in the second direction.
In still another set of embodiments, the method comprises displaying, to the subject, a tracking object moving in a nonlinear trajectory; determining, via an eye tracking sensor, eye tracking of the object by the subject; determining an error in a first direction and an error in a second direction between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; diagnosing a neurological condition of the subject based on the error in the first direction but not the error in the second direction; and treating the subject for the neurological condition based on the diagnosis.
In another aspect, the present disclosure encompasses methods of making one or more of the embodiments described herein, for example, a device for assessing a neurological condition in a subject. In still another aspect, the present disclosure encompasses methods of using one or more of the embodiments described herein, for example, a device for assessing a neurological condition in a subject.
Other advantages and novel features of the present disclosure will become apparent from the following detailed description of various non-limiting embodiments of the disclosure when considered in conjunction with the accompanying figures.
BRIEF DESCRIPTION OF THE DRAWINGS
Non-limiting embodiments of the present disclosure will be described by way of example with reference to the accompanying figures, which are schematic and are not intended to be drawn to scale. In the figures, each identical or nearly identical component illustrated is typically represented by a single numeral. For purposes of clarity, not every component is labeled in every figure, nor is every component of each embodiment of the disclosure shown where illustration is not necessary to allow those of ordinary skill in the art to understand the disclosure. In the figures:
Fig. 1 illustrates a smooth pursuit trajectory, in accordance with one embodiment;
Figs. 2A-2D illustrate eye tracking of a smooth pursuit trajectory for four subjects, in accordance with another embodiment;
Figs. 3A-3D illustrate the radial errors of the trajectories shown in Figs. 2A-2D;
Figs. 4A-4D illustrate the angular errors of the trajectories shown in Figs. 2A-2D;
Fig. 5 illustrates a device for tracking eyes, in another embodiment;
Fig. 6 illustrates a VR device, in yet another embodiment; and
Fig. 7 illustrates test results involving 11 subjects, in still another embodiment.
DETAILED DESCRIPTION
The present disclosure generally relates to systems and methods for determining dementia, Parkinson’s Disease, Alzheimer’s Disease, stroke, or other conditions. According to some aspects, a subject is presented with a tracking object that moves in a spiral and/or circular pathway, and the subject’s ability to visually track the object with their eyes is determined. Subjects with certain neurological conditions may have difficulties in tracking the object, particularly in a radial direction, although angular directions may be less severely impacted. Accordingly, by determining radial errors in eye tracking, the neurological condition of the subject may be assessed, and treated as necessary. Other aspects as discussed herein are generally directed to devices and methods for assessing such subjects, kits for determining the neurological condition of a subject, and the like.
In one aspect, a subject, which may be a subject suspected of having some form of neurological impairment or condition, can be tested by determining the subject’s ability to visually track an object. In one set of embodiments, the subject is exposed to an object and asked to track the object with their eyes. This is generally known as a smooth pursuit task. For example, the object to be tracked may be a real object or a simulated object shown on a display, e.g., as part of a virtual reality device. While the subject tracks the object with their eyes, one or more eye tracking sensors may be used to track where the eyes of the subject are actually focused, i.e., to determine the apparent position of the object. These tracking data based on the apparent position of the object, i.e., where the subject is looking, can be compared with the actual position of the object, and the error between the two can be determined. These errors can then be analyzed to determine the subject’s neurological impairment or condition. These errors may include spatial errors (e.g., where the gaze is located away from the stimulus in a random direction) and/or temporal errors (e.g., where the gaze may be located where the stimulus was moments before, or ahead in the direction that the stimulus is going to be).
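A temporal error of the kind described (gaze trailing or leading the stimulus) could, as one illustrative approach not taken from the disclosure, be estimated by finding the time shift that minimizes the mean squared difference between the stimulus and gaze traces:

```python
def best_lag(stimulus, gaze, max_lag):
    """Return the lag (in samples) at which the gaze trace best matches
    the stimulus trace; positive lags mean the gaze trails the stimulus.
    A simple temporal-error estimate, illustrative only."""
    def mse(lag):
        pairs = [(s, gaze[i + lag]) for i, s in enumerate(stimulus)
                 if 0 <= i + lag < len(gaze)]
        return sum((s - g) ** 2 for s, g in pairs) / len(pairs)
    return min(range(-max_lag, max_lag + 1), key=mse)
```

The residual error after removing the best-fitting lag would then reflect the purely spatial component of the tracking error.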
In general, when the eyes track an object, they typically follow it in a series of quick jumps, known as saccades. In tasks that require smooth pursuit, a tracking object is moved around on a trajectory, while the eye tracking sensors used to monitor the eyes will typically record a series of saccades as the brain attempts to follow the object. This can be seen, for example, in the eye tracking measurements shown in Fig. 2D, where a healthy subject was asked to track an object moving in a spiral trajectory. There are a number of straight-line segments, overshoots, etc., which correspond to the saccades of the eyes as they track the object.
As discussed herein, in some embodiments, various sensors can be employed to track a subject’s eye movement. For example, video-based tracking sensors, infrared, near-infrared, and passive light sensing, etc., can be used to determine where a subject’s eye is focusing. Such sensors can be built into glasses or other wearable devices (e.g., VR headsets, etc.) in certain embodiments. In addition, in some embodiments, a camera may be used as a sensor to track a subject’s eye movement. For example, a video camera may be used to determine where a subject’s eye is focusing.
In some cases, such as suggested by Fig. 2D, the object may be moved on a trajectory that includes different directions. For instance, the object may be moved in trajectories that are oblique, i.e., not horizontal or vertical, trajectories that are not straight lines, or trajectories that are nonlinear. Such trajectories may be more difficult for a subject to smoothly track, as the eye movements necessary to follow such trajectories require coordination between different parts of the brain. In part, this is because there are different extraocular muscles that control the direction of the eye, and these must be moved in a coordinated fashion by the brain in order for the eyes to be able to track such trajectories. Due to this complexity in brain/eye coordination, subjects having certain neurological conditions may have difficulties in following the object, which can be recorded and measured. This can be seen, for example, in Figs. 2A-2C for subjects with various neurological conditions. In these examples, although the tracked object followed a similar trajectory as in Fig. 2D, the subject’s eyes in these figures were not able to follow the object in its trajectory.
Surprisingly, however, it has been found that subjects with certain neurological conditions not only may have difficulties in tracking objects, but that the errors are not necessarily uniform. This may be particularly true of objects that move in trajectories that are not straight lines, i.e., in nonlinear trajectories. For instance, many subjects with certain neurological conditions may find it more difficult to track objects in a radial direction, although they can still track objects more easily in an angular direction. Accordingly, by dissecting the tracking errors of objects by a subject into errors in the radial and angular directions, and focusing more on the radial errors, the ability to diagnose a subject as having a neurological condition may be significantly improved. For example, some angular errors may be determined as a distance along the arc of a circle from the point of gaze, to the point of the stimulus. In some cases, the radial error may be determined as a distance from the point of gaze directly to a circle with a radius equal to the radial distance that the stimulus is at. For instance, by focusing on radial errors, the “noise” created by angular errors may be reduced or eliminated, making it easier to identify those subjects with certain neurological conditions.
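The radial and angular error definitions given above can be expressed in a short sketch (illustrative only; the function names are ours): the radial error as the signed distance from the gaze point to the circle at the stimulus’s radial distance, and the angular error as an arc length along that circle:

```python
import math

def radial_error(gaze_xy, stimulus_r, center_xy=(0.0, 0.0)):
    """Signed distance from the gaze point to the circle whose radius
    equals the stimulus's radial distance from the center point."""
    gx, gy = gaze_xy[0] - center_xy[0], gaze_xy[1] - center_xy[1]
    return math.hypot(gx, gy) - stimulus_r

def angular_error(gaze_xy, stimulus_xy, center_xy=(0.0, 0.0)):
    """Arc-length distance, along the circle at the stimulus's radius,
    between the angular positions of the gaze and the stimulus."""
    gx, gy = gaze_xy[0] - center_xy[0], gaze_xy[1] - center_xy[1]
    sx, sy = stimulus_xy[0] - center_xy[0], stimulus_xy[1] - center_xy[1]
    dtheta = math.atan2(gy, gx) - math.atan2(sy, sx)
    dtheta = (dtheta + math.pi) % (2 * math.pi) - math.pi  # wrap angle
    return abs(dtheta) * math.hypot(sx, sy)  # arc length at stimulus radius
```

Focusing a classifier on `radial_error` while discounting `angular_error` would correspond to the noise-reduction strategy described in this paragraph.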
Thus, in some embodiments, the system is configured to identify cognitive impairment more efficiently and more accurately than conventional approaches. In one example, a tracking system is configured to identify cognitive impairment based on analyzing radial errors alone. In another example, the system can reduce the noise contributed by the angular component of combined angular and radial data samples, improving the accuracy of system determinations.
This can be seen with reference to Figs. 1-4. In Fig. 1, an example trajectory of a tracking object for a subject’s eyes to follow is shown. In this example, an object moves around a center point in an outwardly expanding spiral trajectory. Once the object reaches a certain radius, the object then follows a circular trajectory around the center point. The subject is asked to follow the object with their eyes, and the trajectory that their eyes follow is tracked with one or more eye tracking sensors. Non-limiting examples of such eye tracking movements following an object trajectory are shown in Fig. 2, for subjects with dementia (Fig. 2A), Parkinson’s Disease (Fig. 2B), Alzheimer’s Disease (Fig. 2C), and a healthy subject (Fig. 2D).
In Figs. 3 and 4, the radial and angular errors of the subject’s tracking, relative to the actual position of the tracked object, are respectively shown as a function of time for the subjects from Fig. 2 (keeping the same sequence, such that Figs. 3A and 4A are for the subject with dementia, Figs. 3B and 4B are for the subject with Parkinson’s Disease, etc.). The straight lines in each of these figures are the actual position of the tracked object while the jagged lines show the position that the eyes were focused, i.e., the apparent position of the object based on the eyes’ focus, each as a function of time. In addition, in some cases, the temporal errors may also be determined.
Fig. 4 shows that all 4 subjects were reasonably able to follow the object’s angular position. The discrepancies between the actual position of the tracked object and the apparent position of the object based on the focal position of the eyes are not large. However, Figs. 3A-3C show that the subjects with various neurological conditions found it difficult to track the object’s radial position, in contrast with Fig. 3D for a healthy subject. The discrepancies in positions in these figures are relatively large, and in some cases, the apparent position of the object does not appear to be correlated with its actual position.
Accordingly, since errors in tracking the angular position of the tracking object may not provide much useful information regarding a subject’s neurological condition, in certain embodiments, eye tracking data of a subject may be processed to reduce or eliminate such errors. Thus, surprisingly, data analysis of such eye tracking may be significantly improved by focusing on radial errors, rather than angular errors. This can significantly reduce the amount of noise in the data and improve its quality, thereby improving diagnosis. Accordingly, as discussed in more detail herein, certain embodiments are directed to systems and methods for determining a neurological condition in a subject by focusing on radial errors in eye tracking, and optionally treating those subjects with neurological conditions. Thus, a variety of different neurological conditions may be diagnosed in some aspects as discussed herein. The subject may be one that has or is suspected of potentially having a neurological condition, and systems and methods such as those described herein may be used to diagnose the subject as having a neurological condition, and in some cases, to facilitate treatment of the subject. In addition, in certain embodiments, the specific neurological condition may also be determined. Non-limiting examples of neurological conditions that can be diagnosed include dementia, Parkinson’s Disease, Alzheimer’s Disease, stroke, etc.
In one aspect, a tracking object is presented to a subject. The subject is typically human. The tracking object may be a real object, or it may be a projection of an object on a display, etc. The tracking object may have any appearance, e.g., one that allows the subject to track it by simply focusing on it with their eyes. For instance, the tracking object may be of any suitable size, shape, color, form, etc. In addition, in some embodiments, the tracking object may have a relatively small size, e.g., such that its apparent position, as observed by the subject, is known to a relatively high degree of precision.
As an example, the tracking object may appear to be a circle, a ball, or a sphere that has a diameter of between 10 mm and 20 mm, and may be presented to appear at a distance of between 0.8 m and 1.2 m from the subject. However, it should be understood that this is by way of example only, and that the tracking object may be larger or smaller, and/or may appear closer or farther away from the subject. In addition, in some cases, the tracking object may grow or shrink in size (e.g., such that the object appears to go closer to or farther away from the subject).
For instance, the tracking object may appear to be at least 10 cm, at least 20 cm, at least 50 cm, at least 100 cm, at least 200 cm, at least 500 cm, at least 700 cm, at least 1 m, at least 1.3 m, at least 1.5 m, at least 2 m, at least 3 m, at least 5 m, at least 10 m, etc. from the subject, and/or the tracking object may appear to be no more than 10 m, no more than 5 m, no more than 3 m, no more than 2 m, no more than 1.5 m, no more than 1.3 m, no more than 1 m, no more than 700 cm, no more than 500 cm, no more than 200 cm, no more than 100 cm, no more than 50 cm, no more than 20 cm, no more than 10 cm, etc. Combinations of any of these are also possible, e.g., the object may appear to be between 10 m and 15 m, between 50 cm and 100 cm, between 700 cm and 1 m, etc., from the subject. The object may appear to stay a constant distance from the subject, and/or the object may have a trajectory where it appears to move closer and/or farther away from the subject.
In addition, the tracking object may appear to be of any size. For example, the tracking object may appear to have a maximum dimension of at least 2 mm, at least 3 mm, at least 5 mm, at least 7 mm, at least 10 mm, at least 20 mm, at least 30 mm, at least 50 mm, at least 70 mm, at least 100 mm, at least 200 mm, at least 300 mm, at least 500 mm, at least 700 mm, at least 1 m, etc. The tracking object may also appear to have, in certain embodiments, a maximum dimension of no more than 1 m, no more than 700 mm, no more than 500 mm, no more than 300 mm, no more than 200 mm, no more than 100 mm, no more than 70 mm, no more than 50 mm, no more than 30 mm, no more than 20 mm, no more than 10 mm, no more than 7 mm, no more than 5 mm, no more than 3 mm, no more than 2 mm, etc. Combinations of any of these dimensions are also possible. As a non-limiting example, the object may appear to have a maximum dimension of between 10 mm and 20 mm, between 200 mm and 300 mm, between 3 mm and 10 mm, etc. In some embodiments, multiple test runs can be executed, and observation information captured using tracking objects of variable sizes for respective tests.
The tracking object may have any shape or color. In some embodiments, the tracking object may appear to be a circle, a ball, or a sphere. However, in other embodiments, the tracking object may be a square, a triangle, a cube, a block, a cylinder, a tetrahedron, a polyhedron, or appear to be a flat shape or a real object, etc. The object may have a single color, or exhibit a range of colors. In addition, in some embodiments, the object may change its appearance and/or size as it moves along the trajectory, at specific or random periods of time, or the like.
As mentioned, in some embodiments, the tracking object may be moved in a nonlinear trajectory, for example, a circle, a spiral, or others such as those discussed herein. In such trajectories, there are two relationships that the eyes and the brain have to follow: the distance between the tracking object and the center point (i.e., the radius), and the angle of the tracking object relative to the center point. Errors in positioning along the radial dimension are radial errors, while errors in positioning along the angular dimension are angular errors. In addition, in some cases, temporal errors may also be determined, e.g., in addition to or instead of spatial errors.
For instance, if the angular velocity is held constant, then the distance that the eye must travel to follow the object is proportional to the radius; as the radius changes, the angles and the lengths of the saccades necessary to follow the object also change. Without wishing to be bound by any theory, since different sets of eye muscles control the angles at which the eye moves, this task requires a surprisingly large amount of mental processing by the brain in order to properly control the eye muscles in well-timed saccades in order to be able to follow the tracking object. Because of this complexity, errors or various neurological conditions can be determined using tests involving moving tracking objects in nonlinear trajectories, such as those described herein. Surprisingly, as previously discussed, such errors are not uniformly distributed, but can be dissected into different dimensions (for example, radial and angular errors), and subjects with certain neurological conditions may exhibit more errors of one type than another. This unusual property can be used, according to some embodiments as described herein, to remove “noise” from eye tracking measurements and thereby improve the ability to diagnose or treat certain neurological conditions. Thus, for example, using eye tracking measurements while removing certain types of errors in movement may be useful for diagnosing or treating dementia, Parkinson’s Disease, Alzheimer’s Disease, stroke, concussion, ADHD, or other conditions such as those described herein.
Terms such as “treat,” “treatment,” “treating,” etc., as used herein, comprise therapeutic treatment of subjects having already developed said condition, in particular in manifest form. Therapeutic treatment may be symptomatic treatment in order to relieve the symptoms of the specific indication or causal treatment in order to reverse or partially reverse the conditions of the indication or to stop or slow down progression of the condition. Treatments such as those discussed herein may be used for instance as therapeutic treatment over a period of time as well as for chronic therapy.
Thus, in one set of embodiments, the tracking object may be moved along a trajectory for the subject to follow. For instance, the tracking object may be moved in a linear or a nonlinear trajectory, for example, a sinusoid, a circle, a spiral, a square or other polygonal shape, a random trajectory, etc. As noted above, a nonlinear trajectory can be more difficult for a subject to smoothly track, and may be more useful for determining subjects with certain neurological conditions. The tracking object may proceed along the trajectory at any suitable speed, angular velocity, etc. For instance, the speed or the angular velocity may stay relatively constant, or may change (e.g., smoothly or in jumps, at regular or irregular intervals, etc.).
In some cases, the tracking object may be moved to follow a trajectory that, at its maximum dimension, appears to subtend an angle of at least 30°, at least 40°, at least 45°, at least 50°, at least 60°, at least 70°, at least 80°, at least 90°, at least 100°, at least 110°, at least 120°, at least 130°, at least 140°, at least 150°, at least 160°, at least 170°, at least 180°, etc., from the subject’s point of view. In addition, the trajectory, in its maximum dimension, may be no more than 180°, no more than 170°, no more than 160°, no more than 150°, no more than 140°, no more than 130°, no more than 120°, no more than 110°, no more than 100°, no more than 90°, no more than 80°, no more than 70°, no more than 60°, no more than 50°, no more than 45°, no more than 40°, no more than 30°, etc. Combinations of these angles are also possible. For instance, the trajectory of the object may appear to subtend an angle, from the subject’s point of view, of between 45° and 60°, between 90° and 180°, between 100° and 150°, etc.
In some cases, the tracking object may be moved around a center point, for example, in a substantially circular, elliptical, or spiral trajectory. As mentioned, the tracking object may follow any of a variety of trajectories. In certain embodiments, the trajectories are defined using a center point, although this is not required. In addition, combinations of these may be used in some embodiments; for instance, the tracking object may be moved in a spiral trajectory followed by a circular trajectory, in a circular trajectory followed by a spiral trajectory, in a spiral trajectory followed by an elliptical trajectory, an elliptical trajectory with non-elliptical deviations, a circular trajectory with non-circular deviations, etc. In addition, the trajectory may appear to be a 2-dimensional trajectory (i.e., appearing to be within a plane) or a 3-dimensional trajectory. For example, in some embodiments, the tracking object may appear to be approaching towards and/or receding from the subject, e.g., instead of or in addition to moving in a planar trajectory.
For example, in some embodiments, at least a portion of the trajectory that the tracking object follows is spiral or substantially spiral. For instance, the object may follow a spiral trajectory for at least 5 seconds, at least 10 seconds, at least 15 seconds, at least 20 seconds, at least 25 seconds, at least 30 seconds, at least 40 seconds, at least 45 seconds, at least 50 seconds, at least 1 minute, at least 1.5 minutes, at least 2 minutes, at least 3 minutes, at least 4 minutes, at least 5 minutes, etc. In some cases, the tracking object may start near a center point, then move in a clockwise or counterclockwise direction around the center point, while the radial distance of the tracking object away from the center point may increase, for example, uniformly, linearly or nonlinearly, etc. In another embodiment, this may be in reverse; i.e., the tracking object may spiral in towards the center point.
The spiral may take a variety of forms, e.g., the tracking object may proceed along a trajectory that spirals around a center point. For instance, in one embodiment, the spiral is an Archimedean spiral. In some cases, the spiral is formed such that the angular velocity of the tracking object is constant or substantially constant, or that the pitch of the spiral is constant (e.g., if the spiral trajectory is described by r = a + bθ, where θ is the angle, then a and/or b may independently be constant). However, in other cases, the angular velocity of the tracking object, and/or the pitch of the spiral, is not constant or substantially constant. For instance, a and/or b may be a function of time, position, angle, or the like. In addition, in some cases, the spiral may not necessarily be uniform, or describable by an equation such as r = a + bθ.
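As a non-limiting sketch, an Archimedean spiral trajectory traversed at constant angular velocity may be sampled as follows; the parameter values, sampling rate, and function name are illustrative assumptions only:

```python
import math

def archimedean_spiral_positions(duration_s, sample_hz=60.0,
                                 a=0.0, b=0.01, omega=math.pi / 2):
    """Sample an Archimedean spiral r = a + b*theta traversed at a
    constant angular velocity omega (rad/s).  Returns a list of
    (t, x, y) tuples in display coordinate units, centered on (0, 0).
    """
    n = int(duration_s * sample_hz)
    points = []
    for i in range(n):
        t = i / sample_hz
        theta = omega * t          # constant angular velocity
        r = a + b * theta          # radius grows linearly with angle
        points.append((t, r * math.cos(theta), r * math.sin(theta)))
    return points
```

Once the spiral reaches a target radius, subsequent frames could simply hold r constant, yielding the spiral-then-circle trajectory of Fig. 1.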
In addition, in some cases, at least a portion of the trajectory that the tracking object follows is circular or substantially circular. For instance, the object may follow a circular trajectory for at least 5 seconds, at least 10 seconds, at least 15 seconds, at least 20 seconds, at least 25 seconds, at least 30 seconds, at least 40 seconds, at least 45 seconds, at least 50 seconds, at least 1 minute, at least 1.5 minutes, at least 2 minutes, at least 3 minutes, at least 4 minutes, at least 5 minutes, etc. The circular orbit may have any suitable dimension. As discussed, errors in eye tracking can be determined in various aspects by comparing the actual position of the tracking object (e.g., where it appears on a display) with the apparent position on which the subject’s eyes are focused (e.g., as determined using one or more eye tracking sensors). If the eyes were perfect, then the actual position of the object would be matched by the focal position of the subject’s eyes, i.e., the object’s apparent position. However, in reality, the eyes are not able to perfectly track an object, and there will be a series of errors, overshoots, saccades, and the like. These differences or errors in position can then be analyzed to diagnose a neurological condition, and optionally used to determine a treatment for the neurological condition.
In some embodiments, the errors may be dissected into, at least, a first direction and a second direction, for example, two orthogonal directions (e.g., horizontal and vertical), or in some cases, a radial direction and an angular direction. However, as discussed, in some cases, errors in one direction may be ignored or at least minimized. For example, subjects with certain neurological conditions may find it more difficult to track objects in a radial direction. The errors in eye tracking can be dissected into at least two dimensions, thus yielding errors in tracking along a first direction (e.g., a radial direction) and errors in tracking along a second direction (e.g., an angular direction). In addition, it should be understood that in certain embodiments, the errors may be dissected into three dimensions, for example, in cases where the trajectory of the tracking object appears to move in three dimensions. Thus, for example, the errors may be dissected into errors in three orthogonal directions (e.g., horizontal, vertical, and to/from the subject), or in some cases, a radial direction, an angular direction, and to/from the subject).
In addition, due to the complexity of eye movements that are required to track an object moving in such trajectories, in some embodiments, it can be very difficult for a subject to intentionally disrupt testing. Without wishing to be bound by any theory, it is believed that a subject who disrupts such testing will be unable to intentionally create errors of one type relative to errors of another type. For instance, a subject would not typically be able to intentionally create errors in angular position by controlling where their eyes are focused without also creating errors in radial position, or vice versa. Accordingly, in one set of embodiments, eye tracking errors that are determined may also be self-authenticating. For example, errors between the actual position of the tracking object and its apparent position as determined by the focus of the eyes may be dissected into errors along two different dimensions (e.g., radial vs. angular, horizontal vs. vertical, etc.). The errors observed in each dimension can be compared. If the errors do not fall into an expected error distribution as discussed herein, e.g., the errors are more randomly distributed among the different types of errors, then it can be concluded that the test was not trustworthy, e.g., due to the subject not intentionally tracking the tracking object. In addition, it is believed that such self-authenticating measurements have not previously been reported.
In various embodiments, a tracking system can implement self-authenticating measurements. In various examples, architecting the tracking system to generate self-authenticating measurements improves on conventional approaches, for example, approaches where systems collect or use multiple runs to account for errors (e.g., user-induced errors). In addition, in some embodiments, the system can be configured to identify errors during test execution and, optionally, restart and/or stop such a test run. Accordingly, in some embodiments, the system can identify and act on errors before a human operator or another conventional approach would even know such an issue exists.
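A minimal sketch of such a self-authentication check is given below. The heuristic and threshold are illustrative placeholders, not validated cutoffs, and rest on the observation (cf. Fig. 4) that subjects who genuinely attempt the task — whether healthy or impaired — keep the angular error small even when the radial error is large:

```python
import statistics

def run_is_trustworthy(angular_errors, angular_threshold=0.5):
    """Illustrative self-authentication heuristic.

    Both healthy and impaired subjects who genuinely attempt the task
    can follow the angular position reasonably well, so a large mean
    angular error -- i.e. error spread across both dimensions rather
    than concentrated radially -- suggests the run was disrupted and
    should be discarded or repeated.  The threshold value here is a
    placeholder expressed in the same units as the error samples.
    """
    return statistics.fmean(angular_errors) <= angular_threshold
```

A production system would likely compare full error distributions (e.g., radial vs. angular) rather than a single mean, and could trigger an automatic restart of the test run as described above.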
It should be noted that there may be a variety of reasons a subject disrupts testing, benign or malign. For example, the subject may have lost attention, or was distracted by an external event such as a cell phone. However, as discussed herein, tests with such disruptions can be identified and discarded as being untrustworthy and not useful for further analysis.
In addition, in one set of embodiments, a subject may be asked to perform a cognitive task while tracking an object. Non-limiting examples of cognitive tasks include, but are not limited to, counting numbers in some fashion (e.g., counting forwards, counting backwards to 100, counting backwards in steps of 7), reciting multiplication tables, identifying songs being played in the background, reciting state or country names, or the like. Without wishing to be bound by any theory, it is believed that a subject who attempts to carry out multiple actions at once has mental “interference” due to competition for mental resources. This may be useful, for example, to avoid disrupting subject testing (e.g., losing attention), or to test motor function, cognitive function, motor-cognition interference, cognition-cognition interference, or the like.
As discussed, in some aspects, the tracking object presented to a subject may be presented using a variety of equipment. As a non-limiting example, the tracking object may be the image of an object on a display, while the eyes of the subject are detected using one or more eye tracking sensors. These may be controlled by a processor.
A non-limiting example schematic figure of such a system is shown in Fig. 5. In this figure, system 10 is used to test a subject’s eyes 60. System 10 includes a display 20, upon which an object 25 is shown. The object may appear to be stationary or moving, e.g., in trajectories such as those discussed herein. Eye tracking sensors 30 can be used to track the position of eyes 60 of the subject. In this non-limiting example, the sensors include a light source 32 and a camera 35 that records light 37 passing between light source 32 and camera 35. The angle of deflection of the light may be used to determine where the eyes are focused. This can be used to determine the apparent position 40 of the object, for example, based on the expected focal position of the eyes 45. The difference between the actual position and the apparent position 40 of object 25 is the error. As discussed herein, this can be analyzed (for example, using processor 50) to determine or diagnose a neurological condition in a subject.
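The mapping from a gaze direction to an apparent on-screen position (element 40 in Fig. 5) can be sketched as a simple ray/plane intersection. The flat-display geometry and coordinate conventions below are illustrative assumptions; a real eye tracker would first recover the gaze direction from sensor data such as the reflected light recorded by camera 35:

```python
def apparent_position(eye_origin, gaze_dir, display_z):
    """Project a gaze ray from the eye onto a flat display lying in the
    plane z = display_z, returning the (x, y) point on the display that
    the eye appears to be focused on.

    eye_origin: (x, y, z) position of the eye.
    gaze_dir:   (dx, dy, dz) gaze direction; dz must be nonzero and
                point toward the display.
    """
    ex, ey, ez = eye_origin
    dx, dy, dz = gaze_dir
    if dz == 0:
        raise ValueError("gaze is parallel to the display plane")
    s = (display_z - ez) / dz   # ray parameter at the display plane
    return (ex + s * dx, ey + s * dy)
```

The difference between this apparent position and the actual on-screen position of object 25 is the tracking error analyzed elsewhere herein.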
In addition, in some cases, the components shown in Fig. 5 can be contained within a headset such as a VR headset, for example, one that can be worn by a subject as is shown in Fig. 6. In general, VR headsets can be readily obtained commercially; non-limiting examples include Oculus Quest 2, Sony PlayStation VR, HP Reverb G2, or the like. The headset may contain therein a display that can be viewed by the subject, and optionally one or more eye tracking sensors, e.g., as is shown in Fig. 5. A processor may also be present within the headset, and/or located externally of the headset (for example, in wired or wireless communication with the headset). For example, the headset may be able to send and/or receive signals with a processor that is external to the headset. As another example, the headset may be self-contained, e.g., able to conduct a test on a subject without requiring any additional controls.
The device may be any suitable device. For example, the device may be a room-sized device, a VR (virtual reality) system, a headset (e.g., a VR headset), or the like. The device may include a variety of displays, eye tracking sensors, processors, and/or other components, etc., in various embodiments. Displays, VR systems, headsets, and other similar components may also be obtained commercially. In other examples, the device can include a frame that permits a subject to rest their head within the framework to observe the display. In yet another embodiment, one or more sensors can be placed throughout the framework or device to provide measurements of the subject’s movement and/or position, etc.
For example, in one set of embodiments, the device may include a display for displaying an image of an object to be tracked. The display may be a single screen, or may comprise two or more screens (for example, one for each eye). The display may be provided with instructions (e.g., from a processor) to display an image or a sequence of image frames, e.g., such that an object shown on the display appears to move (for example, as discussed herein). In some cases, a magnifying lens may be positioned between the display and the subject’s eyes. The display may be any suitable display for producing an image, for example, LED displays, CRT displays, LCD displays, or the like.
The device may also include one or more eye tracking sensors for tracking the eyes. Any of a variety of eye tracking sensors may be used, many of which can be obtained commercially. For instance, in one set of embodiments, the eye tracking sensor may comprise an illumination source, such as an infrared or near-infrared light-emitting diode (LED) configured to illuminate an eye with light, and one or more sensors configured to detect the light reflected by the eye. For example, the sensors may include cameras, search coils, electrooculograms (for potential measurement), etc. Other examples include video-based tracking sensors, infrared, near-infrared, and passive light sensing, etc. In some embodiments, the sensors may be built into glasses or other devices (e.g., VR headsets, etc.), including any of those described herein.
The sensors may, in some embodiments, capture or sample at relatively high frequencies, e.g., at least 30 Hz, at least 40 Hz, at least 50 Hz, at least 60 Hz, at least 100 Hz, at least 240 Hz, at least 350 Hz, at least 500 Hz, at least 1000 Hz, at least 1250 Hz, at least 2000 Hz, etc. In some cases, the sensors may operate without the use of an illumination source. In addition, other optical methods may be used in other embodiments.
In some embodiments, a processor is used to produce images on the display and track eye movements via one or more eye tracking sensors. For example, the processor may include any of various types of circuits and/or computing devices. These may include one or more processors and/or non-transitory computer-readable storage media (e.g., memory, non-volatile storage devices, etc.). The processor may control writing data to and/or reading data from memory, and/or non-volatile storage devices. The processor can be a programmed microprocessor or microcomputer, which can be custom made or purchased from computer chip manufacturers. To perform any of the functionality described herein, the processor may execute one or more processor-executable instructions stored in one or more non-transitory computer-readable storage media.
It should be understood that the present disclosure is not limited to only VR systems. In another set of embodiments, for example, the device may be a smartphone. In some cases, for instance, the smartphone may contain a camera (which can be used as an eye tracking sensor), a display (which can be used to display an object to be tracked), and a processor. For example, the smartphone may be held such that a built-in camera within the smartphone faces the user while the user views the display of the smartphone, e.g., such that the camera can be used to track the user’s eyes. In addition, in certain embodiments, the smartphone may be connected to an external camera and/or an external display, i.e., instead of and/or in addition to using a built-in camera and/or display on the smartphone. The processor may be programmed to produce images on the display and/or track eye movements via the camera (e.g., used as an eye tracking sensor), for example, as discussed herein.
In yet another set of embodiments, the device may be a computer, e.g., a desktop computer or a laptop computer. In certain cases, the computer may be connected to an external camera, and/or contain a built-in camera, which can be used as an eye tracking sensor, e.g., as discussed herein, such that the camera can be used to track the user’s eyes. The computer may also be connected to an external display, and/or contain a built-in display, for example, which can be used to display an object to be tracked. In some embodiments, the camera may be positioned such that the camera is able to image the user’s eyes while the user views the display. The computer may also contain a processor, e.g., which may be programmed to produce images on the display and/or track eye movements via a camera (for example, used as an eye tracking sensor), such as discussed herein. In some embodiments, the device may be positioned on a stationary object, e.g., a table or a counter. However, in other embodiments, the device may not necessarily be stationary. For example, the device may be a smartphone that is held by the user. In some embodiments, positioning or motion corrections may be applied by the program, e.g., by incorporating sampled gyroscope or accelerometer data from the device in order to correct for the position of the device while tracking the user’s eyes and/or tracking motion of the display, etc.
The terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of processor-executable instructions that can be employed to program a computer or other processor (physical or virtual) to implement various aspects of embodiments as discussed herein. Additionally, according to one embodiment, one or more computer programs that when executed perform methods of the disclosure provided herein need not reside on a single computer or processor, but may be distributed in a modular fashion among different computers or processors to implement various aspects of the disclosure provided herein.
Processor-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform tasks or implement abstract data types. Typically, the functionality of the program modules may be combined (e.g., centralized) or distributed.
In addition, the processor may be used to compute errors, e.g., between the actual position of the tracked object and the apparent position of the object based on the focal position of the eyes, and in some embodiments, dissect such errors into different dimensions, for example, radial and angular errors, horizontal and vertical errors, etc. In addition, in some embodiments, the processor may also output a diagnosis or a determination of the neurological condition of the subject, e.g., based on such error determinations, and optionally, one or more treatments for the neurological condition based on the diagnosis.
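As a non-limiting sketch of this determination step (the threshold is a hypothetical placeholder, not a clinically validated value), a processor might summarize a run's radial tracking errors as follows:

```python
import statistics

def assess_radial_tracking(radial_errors, threshold=0.3):
    """Toy determination step: summarize a run's radial tracking errors
    and flag runs whose mean radial error is elevated for follow-up.

    The threshold is a placeholder; an actual system would apply
    validated, condition-specific models rather than a single cutoff.
    """
    mean_err = statistics.fmean(radial_errors)
    return {"mean_radial_error": mean_err,
            "flag_for_review": mean_err > threshold}
```

Such an output could then inform treatment selection for the neurological condition, as discussed herein.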
Other components that may be present within the device include, but are not limited to, a power supply (which may be rechargeable), an antenna, a radio frequency (RF) receiver and/or transmitter, analog to digital convertors, a real time clock, etc. For example, the antenna can be used to send and/or receive signals from an external device, such as a laptop, mobile device, computer, etc. This may be used, for example, to control operation of the device, transmit data to or from the device, modify operational parameters of the device, or the like.
The following are each incorporated herein by reference in their entireties: U.S. Provisional Patent Application Serial No. 63/274,793, filed November 2, 2021, entitled “Systems and Methods for Determining Dementia and Other Conditions,” by Patel, et al., and U.S. Provisional Patent Application Serial No. 63/339,002, filed May 6, 2022, entitled “Systems and Methods for Determining Dementia and Other Conditions,” by Patel, et al.
The following examples are intended to illustrate certain embodiments of the present disclosure, but do not exemplify the full scope of the disclosure.
EXAMPLE 1
This non-limiting example illustrates a virtual reality-based system for neurocognitive testing that utilizes eye-tracking technology to measure parameters such as left and right eye positions, eye directions, gaze vectors, and pupil characteristics in each frame of recorded activity. These measures can then be analyzed to reconstruct eye movement patterns and to derive more complex behaviors, such as saccades, blinks, and attentional preferences, to inform diagnostic and interventional efforts for various medical conditions. In particular, the system in this example utilizes a 2D and 3D smooth-pursuit task, in which a stimulus is shown on the screen and the user must track the stimulus with their eyes. The stimulus moves outwardly along, for example, an Archimedes Spiral with monotonically increasing radius and fixed angular velocity. Once at the outermost edge, the stimulus can also circle along the same radius a few times to simulate the regular smooth-pursuit task. Preliminary data have shown statistical differences between healthy individuals and individuals with dementia. In particular, the data have revealed differences in oblique eye movement patterns in patients with Alzheimer’s disease, with patients having a markedly different radial error throughout the task, while differences in performance on angular error measures are not statistically significant. Using this same test, differences can also be seen in additional cognitive disorders such as Parkinson’s disease, stroke, traumatic brain injury, and other neurological conditions.
Neurodegenerative diseases include a variety of neurological conditions with progressive worsening and neuronal death. Oculomotor abnormalities are at the core of these disorders, as neurodegenerative processes involve the brain circuits that control eye movements. Their assessment may therefore serve as a useful biomarker to help diagnose patients and monitor disease progression. Smooth pursuit eye movement may be impaired in Alzheimer’s Disease (AD) patients, with deficits found across motion conditions at frequencies representing varying ranges of target velocity. AD patient gains were reduced at high target velocities and lower target accelerations; they made more frequent saccades, had more saccadic intrusions, and led the target with their eyes in all 3 conditions. Furthermore, lag-adjusted cross-correlation coefficients were lower and most sensitive to between-group variability, suggesting that smooth pursuit eye movements reflect degeneration of cortical oculomotor centers in the brain. In the smooth pursuit task, the presence of the stimulus is first represented on the retina, then transferred to the early visual cortex. From there, the hierarchy of the visual cortex is engaged, reflecting neurocognitive function and reactivity.
Specifically, the smooth pursuit 2D and 3D task utilizes a stimulus in the form of a 15 millimeter white sphere displayed on the virtual reality screen 1 meter from the eyes of the user. As described in this example, this stimulus moves outwardly along an Archimedes Spiral with monotonically increasing radius and fixed angular velocity of about 3 radians per second for 15 seconds. After 15 seconds, the radius stops increasing and the stimulus circles along a constant trajectory for 15 more seconds to simulate the normal smooth pursuit task. The subject is instructed to track the stimulus dot with their eyes, without moving their head, throughout the entire task.
In one embodiment, a subject puts on a headset that includes sensors that allow eye tracking to occur. The subject views a screen where an object that is to be tracked is moved in a generally spiral trajectory, e.g., along an Archimedes Spiral. As a non-limiting example, the object’s x and y positions may be calculated on a spiral using the following formulas:

x = cos(wt) · min(r, pt)

y = sin(wt) · min(r, pt),

where w is a speed constant for how quickly the stimulus moves or spins, p is a speed factor for how quickly the spiral expands radially, r is the maximum radius, and t is time. The object can be displayed, for example, such that it appears to be a sphere with a diameter of 0.15 m positioned 1 m away from the viewer. The radius about the central point along which the sphere moves is 4.6 m. The speed at which the object is moved may be such that it appears to orbit once about the central point in 15 seconds. See, e.g., Fig. 1. Of course, this is by way of example only, and other speeds, objects, trajectories, etc., may be used in other embodiments.
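As one illustration, the formulas above can be implemented directly. The following Python sketch uses the example parameters from this embodiment (one orbit per 15 seconds, maximum radius of 4.6 m); the function name and default parameter values are illustrative assumptions only, not part of the disclosed system.

```python
import math

def spiral_position(t, w=2 * math.pi / 15, p=None, r=4.6):
    """Position of the stimulus at time t along an Archimedes spiral:

        x = cos(w*t) * min(r, p*t),  y = sin(w*t) * min(r, p*t)

    w: angular speed in rad/s; the default orbits once per 15 s.
    p: radial expansion rate; the default reaches the maximum
       radius r after 15 s, after which the radius stays constant.
    r: maximum radius (here, the example value of 4.6 m).
    """
    if p is None:
        p = r / 15.0  # reach full radius at t = 15 s (assumption)
    radius = min(r, p * t)
    return math.cos(w * t) * radius, math.sin(w * t) * radius
```

At t = 0 the stimulus starts at the center point; from t = 15 s onward the radius stops increasing and the stimulus circles at the constant maximum radius, matching the two phases of the task described in this example.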
Several subjects were tested, and their results are shown in Figs. 2-4. Fig. 2 shows the projected eye tracking of the subject on the tracking object. The x and y axes show the apparent x-y position of the subject’s focus during the experiment, measured in meters away from the central point, with positive numbers meaning right or upwards. Subject A had dementia (Fig. 2A), Subject B had Parkinson’s Disease (Fig. 2B), Subject C had Alzheimer’s disease (Fig. 2C), and Subject D was a healthy subject (Fig. 2D). It was found that radial error was very poorly controlled in all but the healthy subject, as is shown in Fig. 3, which shows the radial error component of the respective trajectories in Figs. 2A-2D. The straight line shows the actual trajectory of the tracking object (i.e., increasing linearly for the first 15 seconds, before staying constant thereafter), while the experimental line shows the eye tracking position of the subject following the object. The x axis is time in seconds, while the y axis shows the radial error in tracking. It will be noted that the correspondence between the actual position of the tracking object (i.e., the theoretical position of where the eyes should be tracking) and the actual tracking position of the eyes is poor in all but the healthy subject.
In contrast, it was found that the angular error was quite well controlled in most cases, as is shown in Fig. 4. This figure shows the angular error component of the respective trajectories in Figs. 2A-2D. The straight line shows the actual trajectory of the tracking object (i.e., increasing linearly for the first 15 seconds, before staying constant thereafter), while the experimental line shows the eye tracking position of the subject following the object. The x axis is time in seconds, while the y axis is the angle of the object in radians. It will be noted that the correspondence between the actual position of the tracking object (i.e., the theoretical position of where the eyes should be tracking) and the actual tracking position of the eyes was reasonably good in all 4 cases.
EXAMPLE 2
In this example, 18 normal and concussed players from a professional soccer team were tested, using one embodiment as described herein, demonstrating the predictive capacity of this test in distinguishing concussed from non-concussed subjects.
Fig. 7 illustrates a histogram in which the Y-axis represents the number of subjects in specified bins. The X-axis displays an individual’s variation of angular error in a smooth pursuit trajectory (using a testing procedure similar to that described in Example 1). The solid bars, primarily on the left, represent the 10 players who were normal, while the gray bars, primarily on the right, represent players whom a physical therapist independently marked as experiencing concussive symptoms. As can be seen in Fig. 7, players who were normal exhibited statistically significantly lower angular error, as compared to players who experienced concussions.
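As an illustration only, a per-subject summary statistic like the one plotted in Fig. 7 could be computed as below. The use of the sample standard deviation as the “variation” statistic, and any threshold for flagging a subject, are assumptions of this sketch rather than values taken from the study.

```python
import statistics

def angular_error_variation(angular_errors):
    """Summarize a subject's per-frame angular errors over a
    smooth-pursuit trial as the sample standard deviation.
    (Assumed statistic; the example does not specify one.)"""
    return statistics.stdev(angular_errors)

def flag_possible_concussion(variation, threshold):
    """Flag a subject whose angular-error variation exceeds a
    chosen threshold. In practice the threshold would be calibrated
    against labeled data; any value used here is illustrative."""
    return variation > threshold
```

A subject tracking the stimulus steadily would yield a variation near zero, while erratic angular tracking would place the subject toward the right of the histogram.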
Accordingly, this example demonstrates that testing using a smooth pursuit trajectory can be used to distinguish normal and concussed subjects.
While several embodiments of the present disclosure have been described and illustrated herein, those of ordinary skill in the art will readily envision a variety of other means and/or structures for performing the functions and/or obtaining the results and/or one or more of the advantages described herein, and each of such variations and/or modifications is deemed to be within the scope of the present disclosure. More generally, those skilled in the art will readily appreciate that all parameters, dimensions, materials, and configurations described herein are meant to be exemplary and that the actual parameters, dimensions, materials, and/or configurations will depend upon the specific application or applications for which the teachings of the present disclosure is/are used. Those skilled in the art will recognize, or be able to ascertain using no more than routine experimentation, many equivalents to the specific embodiments of the disclosure described herein. It is, therefore, to be understood that the foregoing embodiments are presented by way of example only and that, within the scope of the appended claims and equivalents thereto, the disclosure may be practiced otherwise than as specifically described and claimed. The present disclosure is directed to each individual feature, system, article, material, kit, and/or method described herein. In addition, any combination of two or more such features, systems, articles, materials, kits, and/or methods, if such features, systems, articles, materials, kits, and/or methods are not mutually inconsistent, is included within the scope of the present disclosure.
In cases where the present specification and a document incorporated by reference include conflicting and/or inconsistent disclosure, the present specification shall control. If two or more documents incorporated by reference include conflicting and/or inconsistent disclosure with respect to each other, then the document having the later effective date shall control.
All definitions, as defined and used herein, should be understood to control over dictionary definitions, definitions in documents incorporated by reference, and/or ordinary meanings of the defined terms.
The indefinite articles “a” and “an,” as used herein in the specification and in the claims, unless clearly indicated to the contrary, should be understood to mean “at least one.”
The phrase “and/or,” as used herein in the specification and in the claims, should be understood to mean “either or both” of the elements so conjoined, i.e., elements that are conjunctively present in some cases and disjunctively present in other cases. Multiple elements listed with “and/or” should be construed in the same fashion, i.e., “one or more” of the elements so conjoined. Other elements may optionally be present other than the elements specifically identified by the “and/or” clause, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc. As used herein in the specification and in the claims, “or” should be understood to have the same meaning as “and/or” as defined above. For example, when separating items in a list, “or” or “and/or” shall be interpreted as being inclusive, i.e., the inclusion of at least one, but also including more than one, of a number or list of elements, and, optionally, additional unlisted items. Only terms clearly indicated to the contrary, such as “only one of” or “exactly one of,” or, when used in the claims, “consisting of,” will refer to the inclusion of exactly one element of a number or list of elements. In general, the term “or” as used herein shall only be interpreted as indicating exclusive alternatives (i.e., “one or the other but not both”) when preceded by terms of exclusivity, such as “either,” “one of,” “only one of,” or “exactly one of.”
As used herein in the specification and in the claims, the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements. This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified. Thus, as a non-limiting example, “at least one of A and B” (or, equivalently, “at least one of A or B,” or, equivalently “at least one of A and/or B”) can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
When the word “about” is used herein in reference to a number, it should be understood that still another embodiment of the disclosure includes that number not modified by the presence of the word “about.”
It should also be understood that, unless clearly indicated to the contrary, in any methods claimed herein that include more than one step or act, the order of the steps or acts of the method is not necessarily limited to the order in which the steps or acts of the method are recited.
In the claims, as well as in the specification above, all transitional phrases such as “comprising,” “including,” “carrying,” “having,” “containing,” “involving,” “holding,” “composed of,” and the like are to be understood to be open-ended, i.e., to mean including but not limited to. Only the transitional phrases “consisting of” and “consisting essentially of” shall be closed or semi-closed transitional phrases, respectively, as set forth in the United States Patent Office Manual of Patent Examining Procedures, Section 2111.03.

Claims

A device for diagnosing a neurological condition in a subject, the device comprising: a display; an eye tracking sensor; and a processor, wherein the processor comprises instructions that, when executed, cause the device to: display, on the display, a tracking object moving in a spiral trajectory around a center point; determine, via the eye tracking sensor, eye tracking of the object by a subject; determine an error in a radial direction relative to the center point between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; and diagnose the neurological condition of the subject based on the error.

The device of claim 1, wherein the device is a VR device.

The device of any one of claims 1 or 2, wherein the display, the eye tracking sensor, and the processor are contained within a headset.

The device of claim 1, wherein the device is a smartphone.

The device of claim 4, wherein the eye tracking sensor is a camera contained within the smartphone.

The device of claim 1, wherein the device comprises a computer.

The device of claim 6, wherein the computer is a laptop computer.

The device of any one of claims 6 or 7, wherein the eye tracking sensor comprises a camera within the computer.

The device of any one of claims 6-8, wherein the eye tracking sensor comprises a camera external to the computer.

The device of any one of claims 1-9, wherein the device further outputs a determination of the neurological condition of the subject based on the error in the radial direction.
A method of diagnosing a neurological condition in a subject, the method comprising: displaying, to the subject, a tracking object moving in a spiral trajectory around a center point; determining, via an eye tracking sensor, eye tracking of the object by the subject; determining an error in a radial direction relative to the center point between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; and diagnosing a neurological condition of the subject based on the error in the radial direction.

The method of claim 10, wherein the tracking object moves at a speed of between 1 and 5 radians per second around the center point for at least a portion of the spiral trajectory.

The method of any one of claims 10 or 11, wherein the tracking object moves at a constant speed for at least a portion of the spiral trajectory.

The method of any one of claims 10-12, wherein the tracking object moves at a constant angular velocity for at least a portion of the spiral trajectory.

The method of any one of claims 10-13, wherein a distance between the tracking object and the center point increases with respect to time for at least a portion of the spiral trajectory.

The method of any one of claims 10-14, wherein a distance between the tracking object and the center point increases linearly with respect to time for at least a portion of the spiral trajectory.

The method of any one of claims 10-15, comprising displaying the tracking object moving in the spiral trajectory around a center point for at least 5 seconds.

The method of any one of claims 10-16, wherein the tracking object has an apparent visual appearance of a sphere.
18. The method of claim 17, wherein the sphere appears to have a diameter of between 10 mm and 20 mm at a distance of between 0.8 m and 1.2 m from the subject.
19. The method of any one of claims 10-18, further comprising displaying, to the subject, the tracking object moving in a circular trajectory around the center point.
20. The method of claim 19, wherein the circular trajectory occurs after the spiral trajectory.
21. The method of any one of claims 19 or 20, comprising displaying the tracking object moving in the circular trajectory around a center point for at least 5 seconds.
22. The method of any one of claims 10-21, wherein the neurological condition is a neurodegenerative disease.
23. The method of any one of claims 10-21, wherein the neurological condition is dementia.
24. The method of any one of claims 10-21, wherein the neurological condition is Alzheimer’s disease.
25. The method of any one of claims 10-21, wherein the neurological condition is Parkinson’s disease.
26. The method of any one of claims 10-21, wherein the neurological condition is stroke.
27. The method of any one of claims 10-21, wherein the neurological condition is traumatic brain injury.
28. The method of any one of claims 10-27, further comprising requesting the subject perform a cognitive task while displaying, to the subject, the tracking object.
29. The method of claim 28, wherein the cognitive task comprises counting numbers.
30. A method of treating a neurological condition in a subject, the method comprising: displaying, to the subject, a tracking object moving in a spiral trajectory around a center point; determining, via an eye tracking sensor, eye tracking of the object by the subject; determining an error in a radial direction relative to the center point between an actual position of the tracking object and a position where the subject’s eye is focused, as determined by the subject’s eye tracking; diagnosing a neurological condition of the subject based on the error in the radial direction; and treating the subject for the neurological condition based on the diagnosis.
31. A device for diagnosing a neurological condition in a subject, the device comprising: a display; an eye tracking sensor; and a processor, wherein the processor comprises instructions that, when executed, cause the device to: display, on the display, a tracking object moving in a nonlinear trajectory; determine, via the eye tracking sensor, eye tracking of the object by a subject; and diagnose the neurological condition of the subject based on errors between an actual position of the tracking object and a position where the subject’s eye is focused.
32. The device of claim 31, wherein the errors include spatial errors.
33. The device of any one of claims 31 or 32, wherein the errors include temporal errors.
34. The device of claim 31, wherein the device is a VR device.
35. The device of claim 31, wherein the device is a smartphone.
36. The device of claim 31, wherein the device comprises a computer.
37. The device of claim 36, wherein the computer is a laptop computer.
PCT/US2022/048519 2021-11-02 2022-11-01 Systems and methods for determining dementia and other conditions WO2023081123A1 (en)

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
US202163274793P 2021-11-02 2021-11-02
US63/274,793 2021-11-02
US202263339002P 2022-05-06 2022-05-06
US63/339,002 2022-05-06

Publications (2)

Publication Number Publication Date
WO2023081123A1 true WO2023081123A1 (en) 2023-05-11
WO2023081123A9 WO2023081123A9 (en) 2024-04-18

Family

ID=86241783

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2022/048519 WO2023081123A1 (en) 2021-11-02 2022-11-01 Systems and methods for determining dementia and other conditions

Country Status (1)

Country Link
WO (1) WO2023081123A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20150282705A1 (en) * 2014-04-07 2015-10-08 Ofer Avital Method and System of Using Eye Tracking to Evaluate Subjects
WO2021064734A2 (en) * 2019-10-03 2021-04-08 Eyejets Ltd. Compact retinal scanning device for tracking movement of the eye's pupil and applications thereof
US20210265038A1 (en) * 2018-08-03 2021-08-26 The Regents Of The University Of California Virtual reality enabled neurotherapy for improving spatial-temporal neurocognitive procesing


Also Published As

Publication number Publication date
WO2023081123A9 (en) 2024-04-18


Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22890668

Country of ref document: EP

Kind code of ref document: A1