WO2024019917A1 - Systems and methods for assessment and training of perceptual-motor efficiency using motion - Google Patents


Info

Publication number
WO2024019917A1
Authority
WO
WIPO (PCT)
Prior art keywords: subject, response, determining, visual, displaying
Application number: PCT/US2023/027595
Other languages: English (en)
Inventors: Gary Blaine WILKERSON, Shaun PATEL
Original Assignee: React Neuro Inc.
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by React Neuro Inc.
Publication of WO2024019917A1


Classifications

    • A — HUMAN NECESSITIES
      • A61 — MEDICAL OR VETERINARY SCIENCE; HYGIENE
        • A61B — DIAGNOSIS; SURGERY; IDENTIFICATION
          • A61B 5/00 — Measuring for diagnostic purposes; Identification of persons
            • A61B 5/16 — Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
              • A61B 5/162 — Testing reaction times
              • A61B 5/163 — Evaluating the psychological state by tracking eye movement, gaze, or pupil change
            • A61B 5/40 — Detecting, measuring or recording for evaluating the nervous system
              • A61B 5/4076 — Diagnosing or monitoring particular conditions of the nervous system
                • A61B 5/4082 — Diagnosing or monitoring movement diseases, e.g. Parkinson, Huntington or Tourette
                • A61B 5/4088 — Diagnosing or monitoring cognitive diseases, e.g. Alzheimer, prion diseases or dementia
    • G — PHYSICS
      • G16 — INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
        • G16H — HEALTHCARE INFORMATICS, i.e. ICT SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
          • G16H 20/00 — ICT specially adapted for therapies or health-improving plans, e.g. for handling prescriptions, for steering therapy or for monitoring patient compliance
            • G16H 20/30 — relating to physical therapies or activities, e.g. physiotherapy, acupressure or exercising
            • G16H 20/70 — relating to mental therapies, e.g. psychological therapy or autogenous training
          • G16H 40/00 — ICT specially adapted for the management or administration of healthcare resources or facilities; ICT specially adapted for the management or operation of medical equipment or devices
            • G16H 40/60 — for the operation of medical equipment or devices
              • G16H 40/63 — for local operation
          • G16H 50/00 — ICT specially adapted for medical diagnosis, medical simulation or medical data mining; ICT specially adapted for detecting, monitoring or modelling epidemics or pandemics
            • G16H 50/20 — for computer-aided diagnosis, e.g. based on medical expert systems
            • G16H 50/30 — for calculating health indices; for individual health risk assessment

Definitions

  • the present disclosure generally relates to systems and methods for assessment and/or training of brain function, for example, including neural processes associated with perceptual-motor efficiency.
  • Brain output does not necessarily involve serial processing of sensory inputs, but execution of an optimal response requires integration of neural information derived from somewhat segregated perceptual, cognitive, and motor processes.
  • Very recent technological advances have dramatically increased our understanding of structural connectivity between defined populations of neurons (e.g., white matter tracts), as well as functional connectivity among spatially separated brain areas (e.g., temporally correlated activation patterns).
  • DTI: diffusion tensor imaging
  • fMRI: functional magnetic resonance imaging
  • MRS: magnetic resonance spectroscopy
  • EEG: electroencephalography
  • MEG: magnetoencephalography
  • TMS: transcranial magnetic stimulation
  • tDCS: transcranial direct current stimulation
  • EMG: electromyography
  • Brain networks are defined by both structural connections and functional activations that support specific neural processes, which may involve relatively isolated activity within a given network, or activity that is highly integrated with that of other networks.
  • Structural connectivity may be necessary for the transmission of electrochemical signals from one area to another, but functional connectivity patterns reflect a complex balance of excitatory and inhibitory effects on signal transmission within both local and long-range neural circuits. Synchronization of oscillating neural signals (e.g., brainwaves) between spatially separated brain areas may also be important to information processing efficiency.
  • Certain aspects such as discussed herein are generally directed to systems and methods for assessment and training of perceptual-motor efficiency using moving visual stimuli and reactive body movements.
  • Some aspects use congruent and incongruent trials.
  • discrimination tests and neural efficiency training involve presentation of different types of visual stimuli, which may include congruent and incongruent trials.
  • some aspect of the congruent stimulus corresponds to some feature of a response target, thereby providing a “cue” for execution of a correct response. If movement direction defines congruency, the correct body movement response is executed in the “same” direction as the motion of a congruent visual stimulus (e.g., a filled circle; white against black background).
  • the correct body movement response is executed in the “opposite” direction of the motion of a visual “cue” stimulus (e.g., an open circle; white border around black interior against black background), which is known to elicit activations of specific brain areas that resolve conflict.
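The congruent/incongruent response mapping described in these two bullets can be sketched as a small helper; a minimal illustration in Python (the function name and direction labels are hypothetical, not from the disclosure):

```python
def correct_response_direction(stimulus_direction: str, congruent: bool) -> str:
    """Return the correct body-movement direction for a moving stimulus.

    Congruent trials (e.g., a filled circle) call for movement in the same
    direction as the stimulus motion; incongruent trials (e.g., an open
    circle) call for movement in the opposite direction.
    """
    opposite = {"left": "right", "right": "left"}
    return stimulus_direction if congruent else opposite[stimulus_direction]
```

A trial scored against this mapping is correct when the subject's actual movement direction matches the returned value.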
  • the speed and accuracy of responses to congruent and incongruent trials may be related to direct measurements of structural and functional connectivity within the anterior insula (AI) and the dorsal anterior cingulate cortex (dACC), which are important nodes of the salience network (SN).
  • “discrimination” reaction times may be determined as a technique to assess impulse control and/or the conflict resolution process associated with perceptual and cognitive interpretation of directional symbols.
  • the visuospatial nature of some of the tests described herein may be used to selectively assess information processing.
  • the responses to the stimuli may be simple button presses or the touching of suitable response targets that do not require complex motor activation patterns.
  • “dual-task” tests may be used that impose simultaneous cognitive and motor challenges, which may cause performance decrements in one or both tasks compared to performance of the respective tasks separately.
  • whole-body movement responses to various types of stimuli may have greater relevance to functional activities.
  • certain embodiments are directed to certain tests that are applied to the subject, and the speed and/or accuracy of the subject’s response can be determined.
  • the test may involve a discrimination task that uses both congruent and incongruent motion stimuli.
  • presentation of a moving visual stimulus for measurement of reaction time may provide more sensitive detection of a subtle impairment than a static stimulus.
  • the reaction time in response to perceived movement onset of a visual stimulus may be associated with the timing of brain signals that are processed by the V5/MT+ area, whereas the timing of supplementary motor area (SMA) activation does not appear to exert any substantial influence on the speed of pre-specified muscle responses to the moving stimuli.
  • reaction times such as functional reaction times may be determined.
  • Functional reaction time may refer to a designation for measurements of the speed of different types of motive responses to visual stimuli that impose some level of cognitive challenge, which may require activation of the executive control network (ECN) for visual-spatial calibration and decision-making, and require translocation of the body mass.
  • the SN (AI and dACC) deactivates the default mode network (DMN) and activates the ECN, which includes the dorsolateral prefrontal cortex (dlPFC) and superior parietal lobe (SPL).
  • the ventral attention network (VAN), which includes the temporo-parietal junction (TPJ) and middle frontal gyrus (MFG) of the right hemisphere, may play an important role in detection of salient visual stimuli that engage the SN.
  • the VAN also may convey information to ECN through the dorsal attention network (DAN), which includes the frontal eye field (FEF) and intra-parietal sulcus (IPS).
  • Execution of movement responses to visual stimuli may engage the somatosensory-motor network (SMN) to a variable extent, which can depend on the complexity of the required motor activation sequences.
  • Important nodes of the SMN include the primary somatosensory cortex (S1), the primary motor cortex (M1), the premotor cortex (PMC), the supplementary motor area (SMA), and the cerebellum (CBM).
  • the thalamus (TH) and basal ganglia (BG) play important roles as integration hubs that link visual, cognitive, and motor processes.
  • the extent of activation of a given network node may depend on the precise nature of the task. For example, activation of the CBM may be much greater for bilateral and multi-directional whole-body movements than unilateral extremity reaching motions in a standing position, or simple button presses in a seated position.
  • some perceptual-motor responses may require inter-hemispheric transmission of neural signals for effective coordination of movements that can reveal asymmetries in performance capabilities.
  • certain embodiments as discussed herein may be directed to systems for functional/discrimination reaction time measurement that combine detection of visual stimulus movement direction (V5/MT+), impulse control and conflict resolution (AI, dACC), and execution of different types of motor responses.
  • Some embodiments such as those discussed herein are generally directed to virtual reality (VR) or augmented reality (AR) systems.
  • use of a virtual reality or augmented reality headset for presentation of visual stimuli may offer the advantages of a standardized viewing distance and tracking of pupil or eye movements, in certain embodiments.
  • An integrated representation of perceptual-cognitive-motor processing may be obtained through synchronous acquisition of data from an inertial measurement unit affixed to the body in some cases.
  • differing levels of simultaneous motor demand may be imposed by single-leg postural balancing, walking gait, lateral lunges, and/or lateral side-shuffling movements over a defined distance.
  • data derived from the inertial measurement unit may be used to assess postural jerk, asymmetrical body mass displacements during gait, acceleration, deceleration, or the like.
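As one rough sketch of how an IMU-derived jerk metric of the kind just described might be approximated from raw accelerometer samples (a finite-difference simplification; the disclosure does not specify this computation, and the function name is illustrative):

```python
def mean_jerk(accel: list[float], dt: float) -> float:
    """Approximate mean absolute jerk (rate of change of acceleration)
    from uniformly sampled accelerometer values via first differences.

    accel: acceleration samples along one axis (m/s^2)
    dt: sampling interval (s)
    """
    diffs = [abs(b - a) / dt for a, b in zip(accel, accel[1:])]
    return sum(diffs) / len(diffs)
```

Lower values would correspond to smoother postural control or gait; a production system would likely filter the signal before differentiating, since differencing amplifies sensor noise.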
  • visual stimulus presentation on a display combined with synchronous acquisition of accelerometer data, may allow for relatively unconstrained performance of various functional movement patterns. Metrics derived from such testing may, in certain cases, be compared to inertial measurement data obtained during unconstrained participation in physically demanding activity, which may establish the relevance of standardized test results to a subject’s performance capabilities.
  • One feature of certain embodiments is the modification of difficulty for any given component of a combined perceptual, cognitive, or motor challenge.
  • relationships to known neural correlates may be used as indirect representations of the global processing efficiency of the various network subsystems engaged by specific combinations of visual, cognitive, and motor challenges.
  • indirect identification of impaired neural processes may be achieved through non-traditional statistical analyses that may utilize measures of intra-individual variability and composite metrics, which may be linked to suboptimal synchronization of oscillatory neural signaling between networks or among the nodes of a given network.
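One simple intra-individual variability measure of the kind alluded to above is the coefficient of variation of a subject's reaction times; a minimal sketch (an illustrative choice, not necessarily the composite metric of the disclosure):

```python
from statistics import mean, stdev

def reaction_time_cv(reaction_times: list[float]) -> float:
    """Coefficient of variation (sample standard deviation / mean) of a
    subject's reaction times; higher values indicate greater
    intra-individual variability across trials."""
    return stdev(reaction_times) / mean(reaction_times)
```

Because it is normalized by the mean, this measure allows variability to be compared across subjects whose average speeds differ.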
  • the limited accessibility and cost of the diagnostic technology required to acquire direct measures of brain structure and function make its use impractical for identification of individuals who might derive substantial benefit from targeted interventions for improvement of perceptual-motor efficiency.
  • the present disclosure generally relates, in certain aspects, to systems and methods for determining brain function, including assessing and/or training perceptual-motor efficiency.
  • the subject matter of the present disclosure involves, in some cases, interrelated products, alternative solutions to a particular problem, and/or a plurality of different uses of one or more systems and/or articles.
  • the method comprises displaying, to a subject, a discrimination test and a plurality of response targets that appear to be at least 1 meter away from the subject; determining a response target of the plurality of response targets that the subject appears to touch in response to the discrimination test; and determining a reaction time and/or an accuracy of the subject in response to the discrimination test.
  • the method in another set of embodiments, comprises (a) displaying, to a subject, a visual stimulus comprising either a first moving object or a second moving object, wherein the first moving object and the second moving object are visually distinct; (b) displaying, to the subject, a plurality of response targets that appear to be positioned to require the subject to take at least one step to reach; (c) determining which response target of the plurality of response targets the subject appears to touch in response to the visual stimulus comprising either the first moving object or the second moving object; (d) repeating (a)-(c) at least once; and (e) determining the subject’s accuracy in touching the corresponding response targets in response to the plurality of displayed visual stimuli.
  • the method comprises displaying, to a subject, a visual stimulus comprising a plurality of moving objects; displaying, to the subject, a plurality of response targets; determining a response target of the plurality of response targets that the subject appears to touch in response to the visual stimulus; and determining a plurality of reaction times, including at least two of: (a) a reaction time between the time the visual stimulus is displayed, and a response of the subject’s eyes to the visual stimulus; (b) a reaction time between the response of the subject’s eyes to the visual stimulus, and the time the subject begins moving towards one of the response targets; and (c) a reaction time between the time the subject begins moving towards one of the response targets, and the time the subject reaches the response target.
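The three reaction-time intervals enumerated in (a)–(c) above can be sketched as arithmetic on per-trial timestamps. The field and function names below are hypothetical, introduced only for illustration:

```python
from dataclasses import dataclass

@dataclass
class TrialTimestamps:
    stimulus_onset: float   # time the visual stimulus is displayed (s)
    eye_response: float     # time the subject's eyes respond (s)
    movement_onset: float   # time the subject begins moving (s)
    target_contact: float   # time the subject reaches the target (s)

def phase_reaction_times(t: TrialTimestamps) -> dict[str, float]:
    """Decompose a trial into the three reaction-time intervals (a)-(c)."""
    return {
        "stimulus_to_eyes": t.eye_response - t.stimulus_onset,      # (a)
        "eyes_to_movement": t.movement_onset - t.eye_response,      # (b)
        "movement_to_target": t.target_contact - t.movement_onset,  # (c)
    }
```

Decomposing a single overall reaction time this way would let perceptual, decision, and motor contributions be examined separately.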
  • the present disclosure encompasses methods of making one or more of the embodiments described herein. In still another aspect, the present disclosure encompasses methods of using one or more of the embodiments described herein.
  • Fig. 1 illustrates various congruency tests, in accordance with certain embodiments
  • Fig. 2 illustrates a device for conducting tests such as congruency tests, in accordance with one embodiment
  • Fig. 3 illustrates a VR device, in another embodiment
  • Figs. 4A and 4B illustrate data visualizations of the responses of two individuals to certain test stimuli, in accordance with certain embodiments
  • Figs. 5A-5D illustrate certain congruency tests, in accordance with some embodiments.
  • Fig. 6 illustrates a VR image implementing a congruency test, in still another embodiment
  • Fig. 7 illustrates an interaction between VR test performance (e.g., inconsistency among responses following discrimination between congruent and incongruent visual stimuli) and self-rated impact of various health-related conditions (e.g., Sport Fitness and Wellness Index score) for identification of individuals with remote history of concussion in yet another embodiment.
  • the present disclosure generally relates to systems and methods for assessment and training of brain function, specifically processes associated with perceptual-motor efficiency.
  • a discrimination task, or another perceptual-motor task involving visual objects in motion, is presented to a subject, who responds by moving in some way, e.g., touching a response target depending on which objects are displayed and/or how the objects move.
  • the subject may use a virtual reality system and motion sensors attached to the subject’s arms or held in the hands.
  • a variety of parameters may be determined, e.g., reaction time, accuracy, etc., which may be useful for assessing the subject’s perceptual, cognitive, and motor functioning.
  • such tests may be given repeatedly, e.g., over several days, weeks, etc., to train or improve the subject’s response.
  • the tests may adaptively increase in difficulty, e.g., on the basis of the subject’s preceding performance.
  • Other aspects as discussed herein are generally directed to devices or kits for conducting such tests, methods of using such tests, and the like.
  • Certain aspects are generally directed to systems and methods for assessing and/or training perceptual-motor efficiency in a subject, such as a human subject. For example, tests such as congruency tests or other perceptual-motor tests may be administered to a subject.
  • a visual stimulus comprising a plurality of objects is displayed to the subject, for example, on a display in a virtual reality or an augmented reality system.
  • a focal point is also displayed to the subject, e.g., before displaying a plurality of objects.
  • the objects that are displayed may independently appear to be the same or different, and/or appear in the same or different locations.
  • the plurality of objects within the visual stimulus may include one or more objects that move and/or one or more objects that do not move.
  • the subject may be asked to move in response to the visual display, for example, to touch one or more response targets.
  • the subject may be asked to move in a first direction in response to a first object (e.g., moving in certain direction) and a second direction in response to a different type of object (e.g., moving in a certain direction).
  • the response targets may appear to be positioned at some distance away from the subject, e.g., to cause the subject to need to move significantly in order to reach the response targets, for example, by taking one or more steps towards the response target.
  • the response targets may appear at least a meter away from the subject, thereby requiring the subject to take at least one step in order to touch the response target, or otherwise shift their body mass towards the response target.
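A minimal sketch of the "at least one step" criterion described above, assuming a nominal arm reach (the 0.75 m default is an assumed illustrative value, not taken from the disclosure):

```python
def requires_step(target_distance_m: float, arm_reach_m: float = 0.75) -> bool:
    """Whether an apparent target distance forces the subject to step or
    shift body mass toward the target rather than simply reach out an arm."""
    return target_distance_m > arm_reach_m
```

A target placed to appear 1 m away would satisfy this criterion, consistent with the distances discussed above.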
  • one or more motion sensors may be used to determine motion of the subject, e.g., towards a response target, and/or to determine that the subject has touched one of the response targets.
  • a variety of motion sensors may be used, e.g., attached to the subject, or positioned externally of the subject (for example, a camera).
  • the subject may have one or two motion sensors attached to the subject’s arms, wrists, hands, trunk, lower extremities, legs, feet, etc., which can be used to determine motions of the subject’s body segments.
  • the performance of the subject may be determined, e.g., reaction time, accuracy, or other suitable metrics.
  • performance metrics such as accuracy (e.g., of the subject touching the correct response target) and/or reaction time (e.g., the time between when the visual stimulus is presented and the time the subject touches a response target) may be determined.
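The two basic performance metrics just described can be summarized over a block of trials as follows (a sketch; the per-trial record format is hypothetical):

```python
def summarize_trials(trials: list[tuple[bool, float]]) -> tuple[float, float]:
    """Return (accuracy, mean reaction time) over per-trial records of the
    form (touched_correct_target, reaction_time_s)."""
    accuracy = sum(1 for correct, _ in trials if correct) / len(trials)
    mean_rt = sum(rt for _, rt in trials) / len(trials)
    return accuracy, mean_rt
```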
  • other metrics may be determined as well.
  • the spatial accuracy of the subject’s touching of a response target may be determined.
  • the smoothness of the gait or motion of the subject towards a response target may be determined.
  • reaction times that may be determined include, but are not limited to, the time between the visual stimulus and a response of the subject’s eyes to the visual stimulus, the time between the response of the subject’s eyes to the visual stimulus and when the subject starts moving, the time between when the subject starts moving and when the subject touches the response target, or the like.
  • performance metrics may be affected by mental and/or physical fatigue due to prolonged testing (e.g., averaged values for the second half of a test may differ from those demonstrated during the first half of a test).
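The first-half versus second-half comparison described above can be sketched as a simple split-half difference (illustrative only):

```python
def split_half_difference(values: list[float]) -> float:
    """Second-half mean minus first-half mean of a per-trial metric; a
    nonzero difference may indicate fatigue over the course of a test."""
    mid = len(values) // 2
    first, second = values[:mid], values[mid:]
    return sum(second) / len(second) - sum(first) / len(first)
```

Applied to per-trial reaction times, a positive difference would suggest slowing in the second half of the test.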
  • multiple such tests may be administered to a subject.
  • a subject may be exposed to a series of tests to determine overall reaction times, accuracy, etc. for the subject.
  • one or more such tests may be given on different days or at different times, or spaced apart by a minimum time, etc.
  • such tests may be used to determine if there is any change to a subject (e.g., the perceptual-motor efficiency of the subject), for example, if the subject has a neurological condition.
  • neurological conditions include, but are not limited to, dementia, Parkinson’s Disease, Alzheimer’s Disease, stroke, concussion, ADHD, or the like.
  • the tests may be used to improve or train the subject. For instance, in certain cases, a subject may be administered such tests to improve the subject’s hand-eye coordination, reaction time, athletic performance, or the like over time.
  • the difficulty of the tests may be adaptively increased over time, for example, if the subject exhibits improved performance (e.g., of the neurological condition, of athletic fitness, of reaction time, etc.).
  • the difficulty may be increased linearly, although in other cases, the difficulty may be increased in response to a subject’s performance (for example, if the subject achieves a certain accuracy, reaction time, or the like), or the difficulty may be increased based on other criteria or performance metrics.
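One hypothetical staircase-style rule for performance-based difficulty adjustment of the kind described above (the thresholds, level range, and function name are assumptions for illustration, not from the disclosure):

```python
def next_difficulty(level: int, accuracy: float, mean_rt_s: float,
                    max_level: int = 10) -> int:
    """Raise difficulty when the subject is both accurate and fast, lower it
    when accuracy drops sharply, and otherwise hold the level steady."""
    if accuracy >= 0.9 and mean_rt_s <= 0.6:
        return min(level + 1, max_level)
    if accuracy < 0.5:
        return max(level - 1, 1)
    return level
```

Calling this after each block of trials would keep the test near the edge of the subject's current capability, which is the usual rationale for adaptive difficulty.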
  • the difficulty may be increased by changing the number of objects (moving and/or fixed), the appearance of the objects (e.g., size, shape, color, etc.), the amount of time the subject has to touch a response target, the lighting of the objects, the position of the objects, or the like.
  • the difficulty may be increased based on the positioning of objects, e.g., towards or at the peripheral limit of the useful field of view.
  • the tests may be “gamified,” e.g., with a score presented to the subject based on the subject’s performance.
  • scores may be compared to other subjects, e.g., as a competitive measure. For example, different subjects may compete to see who can achieve the highest difficulty levels the quickest, or achieve the highest score, etc.
  • a “task” may refer to a single choice between options, e.g., a single repetition, while a “test” refers to a series of successive choices, e.g., a specified number of repetitions.
  • a plurality of objects is shown to a subject, e.g., on the display of a VR device.
  • the objects may independently be stationary and/or may move during a test.
  • the subject is to touch a response target depending on how the objects move.
  • the response targets may also be shown on the display, i.e., the response targets may be virtual.
  • the response targets are positioned such that the subject has to move to reach a response target, for instance, by having to take one or more steps towards the response target, or otherwise shift their body mass towards the response target. Movement of the subject may be determined, for instance, using one or more motion sensors. These may be held or worn by the subject, and/or positioned externally of the subject, etc.
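Determining that the subject has "touched" a virtual response target can be sketched as a proximity test on a tracked hand or sensor position (the 0.1 m tolerance is an assumed value; a real system would calibrate this to the target's apparent size):

```python
def touched_target(hand_pos: tuple[float, float, float],
                   target_pos: tuple[float, float, float],
                   tolerance_m: float = 0.1) -> bool:
    """Whether a tracked hand position lies within a tolerance sphere
    centered on a virtual response target."""
    dist = sum((h - t) ** 2 for h, t in zip(hand_pos, target_pos)) ** 0.5
    return dist <= tolerance_m
```

The same distance could also serve as the spatial-accuracy metric mentioned elsewhere in this disclosure, with smaller closest-approach distances indicating more precise touches.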
  • the objects and the response targets may each independently take on a variety of shapes, colors, sizes, locations, etc., depending on the type of test to be administered, for instance as shown in Fig. 1.
  • some aspects are generally drawn to tests that use one or more visual stimuli that comprise a plurality of objects.
  • the plurality of objects may be real objects, and/or they may be a projection of an object on a display, e.g., in a virtual reality or an augmented reality system, or other display (e.g., a smartphone).
  • the objects may have any appearance.
  • some or all of the objects may have a relatively small size, e.g., such that their apparent positions, as observed by the subject, are known to a relatively high degree of precision.
  • the objects within the visual stimulus may include one or more objects that move and/or one or more objects that do not move.
  • the objects that are displayed may independently appear to be the same or different, and may each independently have any shape or color.
  • an object may appear to be a circle, a ball, or a sphere.
  • an object may be a square, a triangle, a diamond, a cube, a block, a cylinder, a tetrahedron, a polyhedron, or appear to be a flat shape or a real object, etc.
  • the object may have a single color, or exhibit a range of colors.
  • the object may change its appearance (e.g., luminance or oscillatory motion) and/or size, etc., for example, as a function of time or position, etc. Such differences among multiple objects may serve as “cues” for the locations of response targets, which will permit assessment of anticipatory neural processes.
  • an object may appear to be a circle, a ball, or a sphere that has a diameter of between 10 mm and 20 mm, and may be presented to appear at a distance of between 0.8 m and 1.2 m from the subject.
  • an object may be larger or smaller, and/or may appear closer or farther away from the subject.
  • the object may grow or shrink in size (e.g., such that the object appears to go closer to or farther away from the subject).
  • an object may appear to be at least 10 cm, at least 20 cm, at least 50 cm, at least 100 cm, at least 200 cm, at least 500 cm, at least 700 cm, at least 1 m, at least 1.3 m, at least 1.5 m, at least 2 m, at least 3 m, at least 5 m, at least 10 m, etc.
  • an object may appear to be no more than 10 m, no more than 5 m, no more than 3 m, no more than 2 m, no more than 1.5 m, no more than 1.3 m, no more than 1 m, no more than 700 cm, no more than 500 cm, no more than 200 cm, no more than 100 cm, no more than 50 cm, no more than 20 cm, no more than 10 cm, etc. Combinations of any of these are also possible, e.g., an object may appear to be between 10 m and 15 m, between 50 cm and 100 cm, between 70 cm and 1 m, etc., from the subject. An object may appear to stay a constant distance from the subject, and/or an object may have a trajectory where it appears to move closer and/or farther away from the subject.
  • an object may appear to be of any size.
  • an object may appear to have a maximum dimension of at least 2 mm, at least 3 mm, at least 5 mm, at least 7 mm, at least 10 mm, at least 20 mm, at least 30 mm, at least 50 mm, at least 70 mm, at least 100 mm, at least 200 mm, at least 300 mm, at least 500 mm, at least 700 mm, at least 1 m, etc.
  • An object may also appear to have, in certain embodiments, a maximum dimension of no more than 1 m, no more than 700 mm, no more than 500 mm, no more than 300 mm, no more than 200 mm, no more than 100 mm, no more than 70 mm, no more than 50 mm, no more than 30 mm, no more than 20 mm, no more than 10 mm, no more than 7 mm, no more than 5 mm, no more than 3 mm, no more than 2 mm, etc. Combinations of any of these dimensions are also possible. As a non-limiting example, an object may appear to have a maximum dimension of between 10 mm and 20 mm, between 200 mm and 300 mm, between 3 mm and 10 mm, etc.
  • the various objects may independently be stationary or may move. If an object moves, it may do so on any suitable trajectory.
  • the trajectory may be linear. In other embodiments, the trajectory may be nonlinear, e.g., circular, spiral, random, or the like.
  • the object may also move at any suitable speed.
  • the object may move at an apparent speed of at least 0.01 m/s, at least 0.02 m/s, at least 0.03 m/s, at least 0.05 m/s, at least 0.1 m/s, at least 0.2 m/s, at least 0.3 m/s, at least 0.5 m/s, at least 1 m/s, at least 2 m/s, at least 3 m/s, at least 5 m/s, at least 10 m/s, etc.
  • the object may also move at an apparent speed of no more than 10 m/s, no more than 5 m/s, no more than 3 m/s, no more than 2 m/s, no more than 1 m/s, no more than 0.5 m/s, no more than 0.3 m/s, no more than 0.2 m/s, no more than 0.1 m/s, no more than 0.05 m/s, no more than 0.03 m/s, no more than 0.02 m/s, no more than 0.01 m/s, etc.
  • an object may move at an apparent speed of between 1 m/s and 2 m/s, between 0.05 m/s and 3 m/s, between 2 m/s and 10 m/s, or the like.
  • the speed of an object may be constant, or may vary in some instances.
  • the object may follow the trajectory for any suitable length of time and/or distance, e.g., before disappearing from the field of view.
  • the object may follow a trajectory, such as a linear trajectory, for at least 5 seconds, at least 10 seconds, at least 15 seconds, at least 20 seconds, at least 25 seconds, at least 30 seconds, at least 40 seconds, at least 45 seconds, at least 50 seconds, at least 1 minute, at least 1.5 minutes, at least 2 minutes, at least 3 minutes, at least 4 minutes, at least 5 minutes, etc.
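As an illustrative sketch only (not part of the disclosure), a linear trajectory at a constant apparent speed, such as those described above, might be sampled as follows; the start position, angle, speed, sampling interval, and duration used here are all hypothetical parameters:

```python
import math

def linear_trajectory(start, angle_deg, speed, dt, duration):
    """Sample (x, y) positions of an object moving on a linear
    trajectory at a constant apparent speed (metres, seconds).
    `angle_deg` is the direction relative to horizontal."""
    angle = math.radians(angle_deg)
    vx, vy = speed * math.cos(angle), speed * math.sin(angle)
    steps = int(duration / dt)
    return [(start[0] + vx * i * dt, start[1] + vy * i * dt)
            for i in range(steps + 1)]

# A hypothetical object moving horizontally to the right at 1 m/s
# for 0.5 s, sampled at 100 Hz (51 samples including the start):
path = linear_trajectory((0.0, 0.0), 0.0, 1.0, 0.01, 0.5)
```

The same sampling loop could produce nonlinear trajectories (circular, spiral, random) by substituting a different position function.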
  • the objects may have trajectories based on one or more tests administered to the subject.
  • the tests may include one or more perceptual- motor tests.
  • Various tests such as those described herein may be used to assess or train perceptual-motor efficiency in a subject.
  • certain tests may include the display of one or more moving objects, and require the subject to respond in some way based on the appearance of a designated object and/or how the objects move.
  • the subject may be required to touch one of a number of response targets based on the trajectories of the objects.
  • it is believed that such tests may be useful, for example, to assess and/or to train the perceptual-motor efficiency of a subject.
  • the specific types of demands that are imposed may promote specific neuroplastic adaptations in brain structure and function.
  • the tests may measure various reaction times of the subject. Such reaction times generally measure the reaction of a subject to a visual display.
  • the test may include determinations of the time between the visual stimulus (e.g., involving one or more moving objects) and a response of the subject’s eyes to the visual stimulus, the time between the response of the subject’s eyes to the visual stimulus and when the subject starts moving towards a response target, the time between when the subject starts moving and when the subject touches the response target, etc.
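As a minimal sketch of how the intervals above might be computed from trial timestamps (the dictionary keys are illustrative labels of my own, not terminology from the disclosure):

```python
def reaction_time_components(t_stimulus, t_eye, t_move_start, t_touch):
    """Split one trial into the three intervals described above
    (all timestamps in seconds from a common clock)."""
    return {
        "stimulus_to_eye": t_eye - t_stimulus,      # visual stimulus -> eye response
        "eye_to_movement": t_move_start - t_eye,    # eye response -> movement onset
        "movement_to_touch": t_touch - t_move_start,  # movement onset -> target touch
        "total": t_touch - t_stimulus,
    }

# Hypothetical trial: stimulus at t = 0, eyes respond at 180 ms,
# movement begins at 350 ms, target touched at 820 ms.
rt = reaction_time_components(0.00, 0.18, 0.35, 0.82)
```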
  • a variety of sensors may be used to determine such reactions times.
  • the subject’s motions, actions, positioning, etc. may be determined using one or more motion sensors.
  • a motion sensor may be attached to the subject, e.g., to a subject’s hands or arms.
  • a motion sensor may be external to the subject. More than one motion sensor may be used in some instances.
  • the motion sensors may be used to determine whether the subject is touching or is reaching towards a response target, etc. For example, the motion sensors may be used to determine whether the subject has moved, e.g., in a certain direction, and/or whether the subject has touched the correct response target, e.g., in response to a discrimination trial.
  • the motion sensors may be used to determine a subject’s position, e.g., in a room, or relative to a response target, etc.
  • response targets may remain stationary in position and/or appearance, or the response targets may change appearance or even move, e.g., between and/or during a test.
  • the response targets may have a variety of appearances.
  • the response targets may be visually similar or dissimilar to the objects, and may have the same descriptions as those given herein for various types of objects, e.g., having any size, shape, color, appearance, etc.
  • the response targets may be positioned at a relatively far distance away from the subject, e.g., such that the subject will need to take one or more steps in order to touch a response target, or otherwise shift their body mass towards the response target.
  • the response targets may be at least 0.5 m, at least 0.75 m, at least 1 m, at least 1.5 m, at least 2 m, etc. away from a subject.
  • the response target’s location may correspond to a distance that is proportional to the subject’s physical stature (e.g., height or wingspan). There may be 1, 2, 3, 4, 5, or more response targets presented to a subject.
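A sketch of scaling the response-target distance to the subject's stature, as described above; the proportionality constant is an assumed, adjustable value, since no specific factor is given in the disclosure:

```python
def lateral_target_distance(wingspan_m, scale=0.75):
    """Distance from the subject's midline at which to place left and
    right response targets, proportional to the subject's stature
    (here, wingspan).  `scale` is an assumed constant."""
    return scale * wingspan_m

# Hypothetical subject with a 1.8 m wingspan:
d = lateral_target_distance(1.8)
left_target, right_target = -d, +d
```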
  • there may be two response targets, one on the left of the subject and one on the right of the subject. The subject thus would need to move leftward or rightward to touch the corresponding response target.
  • a variety of different response targets may be presented, e.g., depending on the type of test administered to the subject.
  • the subject may be presented with a focal point, e.g., that the subject is to focus on, prior to a test or other visual stimulus.
  • one or more eye tracking sensors may be used to determine whether the subject is visually focusing on the focal point before a test can begin. Eye tracking sensors are discussed in more detail herein.
  • a focal point presented to a subject may have any suitable appearance.
  • the focal point may be visually similar or dissimilar to the objects, and may have the same descriptions as those given herein for various types of objects, e.g., having any size, shape, color, appearance, etc.
  • the focal point may be useful to ensure that the subject’s eyes are positioned in a standardized location prior to the display of objects.
  • the focal point may be useful to provide a cue to the direction from which a moving visual stimulus may imminently appear from a position beyond the subject’s peripheral field of view.
  • certain aspects are generally directed to assessing and/or training perceptual-motor efficiency in a subject, for example, using perceptual-motor tests involving certain objects in motion where a subject has to respond by moving in some way, e.g., touching a response target depending on what objects are displayed and/or how the objects move.
  • in a discrimination test, a visual stimulus is presented to a subject that includes a plurality of objects that require an understanding of congruency or incongruency in order to properly respond.
  • the objects that are presented may be visually congruent in some way, e.g., the objects may be substantially identical or compatible, such as a series of arrows that all point the same direction.
  • congruency involves consistency of some characteristic(s) of a stimulus with a correct response. More complex discrimination tasks may involve perception of additional features that provide cues about correct responses.
  • the objects that are presented may also be visually incongruent in some way, e.g., one of the objects may be different than the other objects, for example, a series of arrows where one points in the opposite direction.
  • a subject may be asked to perform a motion (for example, moving towards a response target) that is congruent to an object’s motion (i.e., where the subject is to move in the same or congruent direction as the object’s motion), or to perform a motion that is incongruent to an object’s motion (i.e., where the subject is to move in the opposite or incongruent direction as the object’s motion).
  • the type of test may be indicated, for example, by using two or more visually distinct objects.
  • a first object (for example, a filled circle) may be used to indicate that the subject is to move congruently in the same direction as the first object’s motion, while a second object that is visually distinct in some way (e.g., an open circle, a differently-colored circle, a different shape such as a square, etc.) may be used to indicate that the subject is to incongruently move in a direction opposite to the direction of the second object.
  • the objects may be visually distinct for any of a number of reasons, including color, size, shape, position, distance, etc., as discussed herein.
  • the objects may have a marking or a character (e.g., a “+” or a star, etc.) to make them visually distinct.
  • the subject may be instructed to touch one of several possible response targets based on the direction (or the opposite of the direction) that an object moves, depending on the type of discrimination test that is applied.
  • an object may move substantially horizontally, and/or at an angle relative to horizontal (for example, at an angle of between +/- 45°, between +/- 35°, between +/- 30°, between +/- 25°, between +/- 20°, between +/- 15°, between +/- 10°, or between +/- 5° relative to the horizontal), for example, such that the object is readily identifiable as moving either left or right.
  • the subject may respond, for example, by moving congruently (either left or right in the same direction that the object is moving), or by moving incongruently (either left or right in the direction opposite to that in which the object is moving), for example, based on the appearance of the object.
  • the precise location of a response target may correspond to position on the angular path of a designated moving object.
  • in Level 1, there may be two response targets presented to a subject, one to the left and one to the right of the subject.
  • the subject may be exposed to a visual stimulus (optionally with a preceding focal point, e.g., to alert and/or orient the subject).
  • if the subject sees a filled circle on a display, thereby indicating a congruent stimulus, the subject is to touch the response target based on the filled circle’s motion (i.e., if the filled circle is moving to the left, the subject is to touch the response target on the left, and if the filled circle is moving to the right, the subject is to touch the response target on the right).
  • the direction of the visual stimulus is shown with a solid arrow, while the direction the subject should move to reach the correct response target is shown with a dashed arrow.
  • if the subject sees an open circle, indicating an incongruent test, the subject is to touch the response target in the direction opposite to the open circle’s motion (i.e., if the open circle is moving to the left, the subject is to touch the response target on the right, and if the open circle is moving to the right, the subject is to touch the response target on the left).
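The congruent/incongruent response rule described above can be sketched as follows (a hypothetical helper of my own, with "filled"/"open" labels standing in for the two stimulus types):

```python
def correct_target(stimulus_type, motion_direction):
    """Return which response target ('left' or 'right') the subject
    should touch: a 'filled' circle calls for a congruent response
    (same side as the motion), an 'open' circle for an incongruent
    response (opposite side)."""
    if stimulus_type == "filled":               # congruent trial
        return motion_direction
    opposite = {"left": "right", "right": "left"}
    return opposite[motion_direction]           # incongruent trial
```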
  • the discrimination tests can be increased in difficulty, for example, by adding additional objects around the objects that move, thereby distracting or confusing the subject and increasing the difficulty of the test.
  • some of the objects may be stationary, and/or some of the objects may move.
  • the objects may be positioned around the visual stimulus, e.g., that the subject is responding to. Non-limiting examples include those shown in Levels 2-5 of Fig. 1.
  • the difficulty can be increased, for example, stepwise, in response to a subject’s reaction times or accuracy, or due to other factors (e.g., beating a previous record by the subject or by others).
  • a variety of different types, numbers, positions, etc. of other objects may be displayed to the subject to increase the difficulty of the tests, such as the specific non-limiting examples that are shown in Levels 2-5. For example, in some cases, multiple moving objects may be displayed after the focal point disappears to increase the difficulty.
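A minimal sketch of the stepwise difficulty adjustment described above; the accuracy thresholds and the six-level cap are illustrative assumptions, not values from the disclosure:

```python
def next_level(level, accuracy, up=0.9, down=0.6, max_level=6):
    """Stepwise difficulty adjustment: advance one level when block
    accuracy reaches `up`, drop back one level when it falls below
    `down`, otherwise stay at the current level."""
    if accuracy >= up and level < max_level:
        return level + 1
    if accuracy < down and level > 1:
        return level - 1
    return level
```

A comparable rule could instead key on reaction times or on beating a previous record, as the text notes.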
  • multiple tests can be executed, and information about the subject in response to those tests determined.
  • a plurality of discrimination tests may be administered, including tests involving congruent stimuli and incongruent stimuli. These may be presented to the subject in any suitable order, e.g., randomly so as to prevent the subject from anticipating the correct motions to take. For instance, at least 10, at least 15, at least 20, at least 25, at least 30, at least 40, at least 50, at least 75, at least 100, etc. tests, such as congruency tests, or other perceptual-motor tests, may be applied to a subject, e.g., in a single test session.
  • the subject’s accuracy to tests such as these may be determined, for example, as the accuracy of touching the correct response target in response to a presented test (e.g., did the subject touch the congruently positioned response target in response to a congruent test and the incongruently-positioned response target in response to an incongruent test).
  • the accuracy of the subject’s touching of a response target in response to a test may be determined.
  • the accuracy may be reported as a percentage (e.g., the percent of tests where the subject touched the correct response target), although in some cases, other methods may be used (e.g., the scoring of arbitrary points, as in a game).
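Scoring accuracy as a percentage over a session of trials can be sketched as (an illustrative helper, not from the disclosure):

```python
def accuracy_percent(trials):
    """Percentage of trials on which the subject touched the correct
    response target; each trial is a (touched, correct) pair of
    target labels."""
    if not trials:
        return 0.0
    hits = sum(1 for touched, correct in trials if touched == correct)
    return 100.0 * hits / len(trials)

# Three correct touches out of four hypothetical trials:
score = accuracy_percent([("left", "left"), ("right", "left"),
                          ("right", "right"), ("left", "left")])
```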
  • a subject may be asked to perform a physical task while simultaneously engaged in a cognitive activity, e.g., a “dual-task” test.
  • cognitive tasks include, but are not limited to, counting numbers in some fashion (e.g., counting forwards, counting backwards to 100, counting backwards in steps of 7), reciting multiplication tables, identifying songs being played in the background, reciting state or country names, or the like.
  • such tasks may be performed by the subject (or attempted to be performed) while the subject is also taking one or more tests such as those described herein, e.g., discrimination tests.
  • a subject who attempts to carry out multiple actions at once may experience some degree of neural interference due to competition among perceptual, cognitive, and motor processes for access to a finite amount of neural resources. This may be useful, for example, to identify the underlying neural impairment that is responsible for a subtle performance deficiency.
  • the subject’s reaction time in response to the tests may be determined, e.g., in addition or instead of determining accuracy.
  • one reaction time that may be determined is the time between when a visual stimulus is presented (e.g., involving one or more moving objects) and a response of the subject’s eyes to the visual stimulus (for example, as determined by one or more eye tracking sensors such as described herein).
  • another reaction time that may be determined is the time between the response of the subject’s eyes to the visual stimulus and when the subject starts moving towards a response target (for example, as determined by one or more motion sensors such as described herein).
  • a reaction time that may be determined is the time between when the subject starts moving and when the subject touches the response target, etc.
  • reaction times may be indicative of certain perceptual, cognitive, and motor processes in a subject.
  • perceptual, cognitive, and motor processes are somewhat segregated within the brain. Accordingly, tests that rely on such processes may be used to assess a subject, e.g., the mental functioning of the subject, or the subject’s perceptual-motor efficiency.
  • a significantly longer reaction time compared to other reaction times in the same subject may indicate a deficiency in the subject’s ability to integrate perceptual, cognitive, and motor processes. This may be used, for example, as a baseline measurement of the subject’s mental status, and/or there may be some intervention or treatment of the subject (e.g., therapy, pharmaceuticals, counseling, etc.) as a result of such tests.
  • a plurality of tests may be administered to the subject in accordance with certain aspects. These may be applied all at once, and/or a variety of test sessions (each comprising one or more tests) may be administered to the subject over an extended period of time. This may be useful, for example, to train or improve the athletic performance of a subject, to assess whether a subject having a neurological condition is improving or not, etc. (for example, in response to therapy or intervention, etc.) For example, a plurality of test sessions may be administered on a daily basis (or at least 12 hours apart), on a weekly basis, or the like.
  • the tests may include visual stimuli that can be presented using a variety of equipment.
  • a plurality of objects may be presented as visual stimuli to a subject, e.g., using a display.
  • the eye positions of the subject are detected using one or more eye tracking sensors. These may be controlled, for example, using a processor.
  • system 10 is used to test a subject, e.g., a human subject.
  • the subject is shown a display 20, upon which an object 25 is shown.
  • the object may appear to be stationary or moving, e.g., in trajectories such as those discussed herein.
  • more than one object may be present in some embodiments.
  • the display may be, for example, a display on a VR headset, an external monitor, a display on a smartphone, or the like, e.g., as discussed herein.
  • eye tracking sensor 30 can be used to track the position of eyes 60 of the subject.
  • the sensor includes a light source 32 and a camera 35 that records light 37 passing between light source 32 and camera 35.
  • the angle of deflection of the light may be used to determine where the eyes are focused. This may be used, for example, to determine a reaction time between the time an object is displayed, e.g., as part of a test, and/or a response of the subject’s eyes to the test.
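As a sketch of how an eye-response latency might be derived from sampled gaze deflection angles (the deflection threshold is an assumed detection criterion; no specific criterion is given in the disclosure):

```python
def eye_response_latency(gaze_angles, dt, threshold_deg=2.0):
    """Latency (seconds) from stimulus onset to the first gaze sample
    whose deflection from the focal point exceeds `threshold_deg`.
    Samples are taken every `dt` seconds starting at stimulus onset.
    Returns None if the eyes never leave the focal point."""
    for i, angle in enumerate(gaze_angles):
        if abs(angle) > threshold_deg:
            return i * dt
    return None

# Hypothetical gaze sampled at 250 Hz; deflection begins on sample 45:
latency = eye_response_latency([0.1] * 45 + [5.0, 9.0], 0.004)
```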
  • eye tracking sensors are not necessarily required in all embodiments.
  • a processor 50 may be used to produce images on display 20 and/or track eye movements via eye tracking sensors 30.
  • the system can be contained within a headset, such as VR headset 2, for example, one that can be worn by a subject as is shown in Fig. 3.
  • VR headsets can be readily obtained commercially; non-limiting examples include Oculus Quest 2, Sony PlayStation VR, HP Reverb G2, or the like.
  • the headset may contain therein a display that can be viewed by the subject, and optionally one or more eye tracking sensors.
  • a processor may also be present within the headset, and/or located externally of the headset (for example, in wired or wireless communication with the headset).
  • the headset may be able to send and/or receive signals with a processor that is external to the headset.
  • the headset may be self-contained, e.g., able to conduct a test on a subject without requiring any additional controls.
  • the device may be any suitable device.
  • the device may be a room-sized device, a VR (virtual reality) system, a headset (e.g., a VR headset), or the like.
  • the device may include a variety of displays, eye tracking sensors, processors, and/or other components, etc., in various embodiments. Displays, VR systems, headsets, and other similar components may also be obtained commercially.
  • the device can include a frame that permits a subject to rest their head within the framework to observe the display.
  • one or more sensors can be placed throughout the framework or device to provide measurements of subject’s system movement and/or position, etc.
  • the device may include a display for displaying an image of an object to be tracked.
  • the display may be a single screen, or may comprise two or more screens (for example, one for each eye).
  • the display may be provided with instructions (e.g., from a processor) to display an image or a sequence of image frames, e.g., such that an object shown on the display appears to move (for example, as discussed herein).
  • a magnifying lens may be positioned between the display and the subject’s eyes.
  • the display may be any suitable display for producing an image, for example, LED displays, CRT displays, LCD displays, or the like.
  • the device may also include one or more eye tracking sensors for tracking the eyes. Any of a variety of eye tracking sensors may be used, many of which can be obtained commercially.
  • the eye tracking sensor may comprise an illumination source, such as an infrared or near-infrared light-emitting diode (LED) configured to illuminate an eye with light, and one or more sensors configured to detect the light reflected by the eye.
  • the sensors may include cameras, search coils, electrooculograms (for potential measurement), etc.
  • Other examples include video-based tracking sensors, infrared, near-infrared, and passive light sensing, etc.
  • the sensors may be built into glasses or other devices (e.g., VR headsets, etc.), including any of those described herein.
  • the sensors may, in some embodiments, capture or sample at relatively high frequencies, e.g., at least 30 Hz, at least 40 Hz, at least 50 Hz, at least 60 Hz, at least 100 Hz, at least 240 Hz, at least 350 Hz, at least 500 Hz, at least 1000 Hz, at least 1250 Hz, at least 2000 Hz, etc.
  • the sensors may operate without the use of an illumination source.
  • other optical methods may be used in other embodiments.
  • a processor is used to produce images on the display and track eye movements via one or more eye tracking sensors.
  • the processor may include any of various types of circuits and/or computing devices. These may include one or more processors and/or non-transitory computer-readable storage media (e.g., memory, nonvolatile storage devices, etc.).
  • the processor may control writing data to and/or reading data from memory, and/or non-volatile storage devices.
  • the processor can be a programmed microprocessor or microcomputer, which can be custom made or purchased from computer chip manufacturers.
  • the processor may execute one or more processor-executable instructions stored in one or more non-transitory computer-readable storage media, which may serve as non-transitory computer-readable storage media storing processor-executable instructions for execution by the processor.
  • the device may be a smartphone.
  • the smartphone may contain a camera (which can be used as an eye tracking sensor), a display (which can be used to display an object to be tracked), and a processor.
  • the smartphone may be held such that a built-in camera within the smartphone faces the user while the user views the display of the smartphone, e.g., such that the camera can be used to track the user’s eyes.
  • the smartphone may be connected to an external camera and/or an external display, i.e., instead of and/or in addition to using a built-in camera and/or display on the smartphone.
  • the processor may be programmed to produce images on the display and/or track eye movements via the camera (e.g., used as an eye tracking sensor), for example, as discussed herein.
  • the device may be a computer, e.g., a desktop computer or a laptop computer.
  • the computer may be connected to an external camera, and/or contain a built-in camera, which can be used as an eye tracking sensor, e.g., as discussed herein, such that the camera can be used track the user’s eyes.
  • the computer may also be connected to an external display, and/or contain a built-in display, for example, which can be used to display an object to be tracked.
  • the camera may be positioned such that the camera is able to image the user’s eyes while the user views the display.
  • the computer may also contain a processor, e.g., which may be programmed to produce images on the display and/or track eye movements via a camera (for example, used as an eye tracking sensor), such as discussed herein.
  • the device may be positioned on a stationary object, e.g., a table or a counter. However, in other embodiments, the device may not necessarily be stationary. For example, the device may be a smartphone that is held by the user. In some embodiments, positioning or motion corrections may be applied by the program, e.g., by sampling gyroscope or accelerometer data from the device to correct for the position of the device while tracking the user’s eyes and/or the motion of the display, etc.
  • the terms “program” or “software” are used herein in a generic sense to refer to any type of computer code or set of processor-executable instructions that can be employed to program a computer or other processor (physical or virtual) to implement various aspects as discussed herein. Additionally, according to one embodiment, one or more computer programs that when executed perform methods of the disclosure provided herein need not reside on a single computer or processor, but may be distributed in a modular fashion among different computers or processors to implement various embodiments of the disclosure provided herein.
  • Processor-executable instructions may be in many forms, such as program modules, executed by one or more computers or other devices.
  • program modules include routines, programs, objects, components, data structures, etc. that perform tasks or implement abstract data types.
  • functionality of the program modules may be combined (e.g., centralized) or distributed.
  • the device may include a power supply, which may be rechargeable.
  • an antenna can be used to send and/or receive signals from an external device, such as a laptop, mobile device, computer, etc. This may be used, for example, to control operation of the device, transmit data to or from the device, modify operational parameters of the device, or the like.
  • the signals may be, for example, radio frequency (RF) signals.
  • a 4-week training program designed with the goal to improve perceptual-motor performance consisted of 1-3 training sessions per week, which was initiated after the conclusion of a 4-week pre-season practice period.
  • Each training session involved 2 sets of 20 lower extremity lunging movements in left or right directions, which were executed in response to white visual stimuli presented against a black background on a laptop computer monitor.
  • the visual “cue” stimulus for a given trial was either a solid white circle or an open circle with a white outline that appeared at the center of the screen and moved horizontally in either a left or right direction beyond the screen margin within 500 milliseconds.
  • the interstimulus interval was 1 second for all trials.
  • a 5-circle configuration that included a center cue stimulus circle with flanking circles placed above, below, left, and right provided a fourth level of difficulty, which also presented a combination of horizontal and diagonal circle motions in left and right directions.
  • the participant executed a left or right lateral lunging movement of the lower extremity according to the stimulus-response instruction for a filled versus an open cue stimulus circle at the center of the 5-circle display.
  • the fifth difficulty level designated the cue stimulus circle with a cross at its center, which randomly appeared in any one of the 5-circle display positions from one trial to the next.
  • the single-circle (level 1), 3-circle column (level 2), 3-circle row (level 3), and 5-circle (levels 4 and 5) displays appeared at the center of the screen immediately prior to motion.
  • the sixth difficulty level randomly presented the 5- circle display centered within either the upper-left, lower-left, upper-right, or lower-right quadrant of the tablet screen, along with the requirement to locate the cue stimulus circle containing a cross at its center as adjacent unmarked filled or open circles moved in overlapping horizontal or diagonal paths across the screen in left or right directions.
  • The tests used in this example are shown in Fig. 1.
  • the solid arrows indicate the direction the objects move; the dashed arrows indicate the direction the subject is to move to reach a response target.
  • a moving filled object (e.g., a circle) indicates that the subject is to move in the same direction that the object moves, while a moving open object indicates that the subject is to move in the direction opposite to that the object moves.
  • the test can increase in difficulty in a number of different ways, such as by adding additional objects that do not move (which may, for example, also be solid or open), or having the subject identify a marked object amongst other similar objects that are not marked. In some cases, the unmarked objects may also move.
  • This example illustrates a series of difficulty levels for improving perceptual-motor performance as discussed in Example 1. These are schematically shown in Fig. 1.
  • The subject must rapidly identify the “cue” stimulus circle marked with a cross to visually track its motion relative to that of the other flanking/distracting circles.
  • Level 6: The central fixation cross disappears, and the 5-circle configuration appears in any of 4 screen quadrants (Upper-Left, Lower-Left, Upper-Right, or Lower-Right). The “cue” stimulus circle (filled or open) marked with a cross may be in any one of the 5 positions.
  • This example relates to the results of perceptual-motor training of college football players, which was improved in accordance with certain embodiments. Interrelationships among performance metrics linked to brain processing efficiency were assessed in this example for a cohort of college football players. A retrospective association between suboptimal conflict resolution and prior core or lower extremity injury during football participation (i.e., previous 12 months) was used to assess the potential for perceptual-motor performance improvement in players believed to possess elevated injury risk.
  • a smartphone app was used to administer the Eriksen flanker test to a cohort of 87 players (20.7 +/- 1.7 years; 185.2 +/- 10.1 cm; 102.5 +/- 19.5 kg) during the week prior to initiation of pre-season practice sessions.
  • Rapid manual tilting of the device registered responses to the center arrow directions of 5-arrow congruent and incongruent stimuli.
  • Speed-accuracy trade-off was quantified by Rate Correct per Second (RCS) of response time, and the Flanker Conflict Effect (FCE) represented the difference in average response times for 10 incongruent and 10 congruent displays.
  • Reaction Time Variability was defined as the intra-individual standard deviation of 20 response times.
  • a total of 75 college students (20.7 +/- 2.1 years) from 3 different organizational cohorts voluntarily participated in a virtual reality (VR) test and provided survey responses relating to current well-being.
  • the VR test involved 40 lunging/reaching movements in left or right directions in response to the directional motion of horizontally moving white circles that were visible for 250 ms against a black background on the headset screen (Pico Neo 3 Pro Eye, ByteDance, Ltd., Beijing, China).
  • each participant assumed a “T-pose” with VR controllers in each hand (i.e., standing upright with horizontally outstretched arms) to acquire a measurement that was used to calibrate the lunging/reaching distance required to make contact with a VR response target (i.e., a green sphere) located beyond the peripheral margin of the visual field when looking forward.
  • Stimulus-response instructions were to lunge/reach to hit the green target sphere located in the same direction as the horizontal motion of “solid” white circles (i.e., congruent stimulus-response motion) and to lunge/reach to hit the green target sphere located in the opposite direction to the horizontal motion of “open” circles with a white border (i.e., incongruent stimulus-response motion). Participants were instructed to assume a semi-crouched ready position, with feet positioned at shoulder width and hand controllers against the chest.
  • a central fixation cross appeared on the VR headset screen prior to the initiation of each trial to ensure that the eyes were centrally positioned before a moving circle stimulus appeared.
  • a solid or open circle either initially appeared in a central position and disappeared at the left or right peripheral margin of the viewing screen or it initially emerged from either the left or right peripheral margin and disappeared at the center.
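The stimulus-response rule described above (solid circle: reach toward the motion direction; open circle: reach opposite it) can be sketched as a small helper. The function name and the string encodings of styles and directions are illustrative assumptions, not the notation used by any particular implementation:

```python
def correct_target_side(circle_style, motion_direction):
    """Return the side ("left" or "right") of the green target sphere
    the participant should hit on a given trial.

    circle_style: "solid" (congruent rule) or "open" (incongruent rule);
    motion_direction: horizontal motion of the white circle stimulus.
    """
    if circle_style == "solid":
        # Congruent trial: reach in the same direction as the circle's motion
        return motion_direction
    # Incongruent trial: reach opposite the circle's motion
    return "left" if motion_direction == "right" else "right"
```

Comparing each recorded reach against this mapping yields the count of correct directional responses used in the rate metrics below.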
  • Performance metrics derived from VR hand controller (i.e., arm reach) movement responses included Rate Correct per Second (RCS: number of correct directional responses divided by the sum of response times), Motion Conflict Effect (MCE: median of 20 incongruent response times minus median of 20 congruent response times), and Reaction Time Variability (RTV: standard deviation of 40 response times).
  • VR Dispersion represented the intra-individual standard deviation of standardized (i.e., T-score) values for multiple metrics derived from eye, neck, arm, and lower extremity response times for the 4 different types of visual stimuli.
  • An advantage of T-score standardization is representation of standard deviations above and below a standardized mean value of 50 in units of 10. Because prior research has associated “dispersion” of standardized performance values for tasks representing different domains of cognitive function with various neurodegenerative disorders, the prospectively selected performance metric of primary interest was VR Dispersion.
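Under the T-score convention just described (group mean mapped to 50, one group standard deviation to 10 points), VR Dispersion might be computed along these lines. The function names are hypothetical, and the group statistics are assumed to be drawn from the full cohort for each metric:

```python
import statistics

def to_t_scores(raw_values):
    """Standardize a group's raw values for one metric to T-scores
    (standardized mean of 50, standard deviations expressed in units of 10)."""
    mu = statistics.mean(raw_values)
    sigma = statistics.stdev(raw_values)
    return [50 + 10 * (v - mu) / sigma for v in raw_values]

def vr_dispersion(subject_t_scores):
    """VR Dispersion: intra-individual standard deviation of one subject's
    T-scores across the eye, neck, arm, and lower-extremity metrics."""
    return statistics.stdev(subject_t_scores)
```

A subject whose T-scores cluster tightly across metrics has low dispersion, whereas a wide spread across body-part and stimulus-type metrics yields a high VR Dispersion value.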
  • the Sport Fitness and Wellness Index provides a global estimate of wellbeing that is derived from a combination of responses to the Sport Fitness Index and the Overall Wellness Index surveys, both of which have been validated as indicators of suboptimal status.
  • A 0-100 SFWI score was generated from responses to 20 items pertaining to persisting effects of prior musculoskeletal injuries, physical problems, sleep disruption, cognitive limitations, and mood disorders (i.e., a high score corresponds to overall status self-rated as optimal).
  • Receiver operating characteristic (ROC) analysis area under curve (AUC) was used to assess the strength of univariable association of potential predictive variables with self-reported history of concussion. Youden’s Index was used to identify the optimal cut point for maximum discrimination, which permitted cross-tabulation analysis for determination of sensitivity, specificity, and an odds ratio (OR) with its 95% confidence interval (CI). Logistic regression analysis was used to assess various combinations of continuous predictive variables for maximum discrimination. To avoid overfitting, the logistic regression modeling was limited to the strongest 2-factor combination of predictive variables. To potentially simplify interpretation of the logistic regression result, binary modeling of the strongest 2-factor combination of variables was also performed.
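The univariable screening step above (AUC plus a Youden’s Index cut point) can be sketched as follows. This is an illustrative implementation assuming higher scores indicate the positive class, not a reconstruction of the specific statistical software the example used:

```python
def roc_auc_youden(scores, labels):
    """ROC AUC via pairwise rank comparison, plus the cut point that
    maximizes Youden's J = sensitivity + specificity - 1.

    scores: continuous predictor values; labels: 1 = positive case
    (e.g., self-reported concussion history), 0 = negative case.
    """
    pos = [s for s, y in zip(scores, labels) if y == 1]
    neg = [s for s, y in zip(scores, labels) if y == 0]
    # AUC: probability a random positive case outscores a random negative
    # case, counting ties as half a win
    wins = sum((p > n) + 0.5 * (p == n) for p in pos for n in neg)
    auc = wins / (len(pos) * len(neg))
    best_j, best_cut = -1.0, None
    for cut in sorted(set(scores)):
        # Classify scores at or above the candidate cut point as positive
        sensitivity = sum(p >= cut for p in pos) / len(pos)
        specificity = sum(n < cut for n in neg) / len(neg)
        j = sensitivity + specificity - 1
        if j > best_j:
            best_j, best_cut = j, cut
    return auc, best_cut, best_j
```

The returned cut point supports the cross-tabulation step: dichotomizing each predictor at its Youden-optimal threshold yields the 2x2 table from which sensitivity, specificity, and the odds ratio with its 95% CI are computed.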
  • This example presents data visualizations that illustrate differences between 2 individuals (college football players), in accordance with certain embodiments of the invention.
  • This example illustrates the simultaneous engagement of visual-cognitive processes and multi-segmental muscle responses.
  • Figs. 4A-4B illustrate synchronization of eye, neck, arm, and lower extremity responses, and suboptimal synchronization associated with prolonged and less accurate responses to the test stimuli.
  • Each of the metrics listed in the tables adjacent to the movement-tracking illustrations is better for case 1 than for case 2. Both cases in this example have a history of sport-related concussion, but case 2 sustained his injury more recently.
  • “Dispersion” refers to variability among 16 standardized measures (an individual’s values in relation to a group mean value) derived from the test (4 body parts (eye, neck, arm, and lower extremity) and 2 stimulus types (incongruent and congruent) at 2 locations of initial stimulus presentation (central and peripheral)). “Latency” refers to the time between stimulus appearance and initiation of movement, whereas “Response” refers to the time between stimulus appearance and movement beyond a specified threshold. The “Dispersion” values provide a numerical representation of the illustrated degree of synchronization among the 4 body parts and the 4 possible combinations of trial type.
  • a reference to “A and/or B”, when used in conjunction with open-ended language such as “comprising” can refer, in one embodiment, to A only (optionally including elements other than B); in another embodiment, to B only (optionally including elements other than A); in yet another embodiment, to both A and B (optionally including other elements); etc.
  • the phrase “at least one,” in reference to a list of one or more elements, should be understood to mean at least one element selected from any one or more of the elements in the list of elements, but not necessarily including at least one of each and every element specifically listed within the list of elements and not excluding any combinations of elements in the list of elements.
  • This definition also allows that elements may optionally be present other than the elements specifically identified within the list of elements to which the phrase “at least one” refers, whether related or unrelated to those elements specifically identified.
  • “at least one of A and B” can refer, in one embodiment, to at least one, optionally including more than one, A, with no B present (and optionally including elements other than B); in another embodiment, to at least one, optionally including more than one, B, with no A present (and optionally including elements other than A); in yet another embodiment, to at least one, optionally including more than one, A, and at least one, optionally including more than one, B (and optionally including other elements); etc.
  • the word “about” is used herein in reference to a number, it should be understood that still another embodiment of the disclosure includes that number not modified by the presence of the word “about.”

Landscapes

  • Health & Medical Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Medical Informatics (AREA)
  • Public Health (AREA)
  • General Health & Medical Sciences (AREA)
  • Biomedical Technology (AREA)
  • Pathology (AREA)
  • Neurology (AREA)
  • Developmental Disabilities (AREA)
  • Primary Health Care (AREA)
  • Biophysics (AREA)
  • Epidemiology (AREA)
  • Molecular Biology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Physics & Mathematics (AREA)
  • Child & Adolescent Psychology (AREA)
  • Surgery (AREA)
  • Animal Behavior & Ethology (AREA)
  • Psychology (AREA)
  • Psychiatry (AREA)
  • Veterinary Medicine (AREA)
  • Hospice & Palliative Care (AREA)
  • Social Psychology (AREA)
  • Neurosurgery (AREA)
  • Physiology (AREA)
  • Data Mining & Analysis (AREA)
  • Databases & Information Systems (AREA)
  • Educational Technology (AREA)
  • Physical Education & Sports Medicine (AREA)
  • Business, Economics & Management (AREA)
  • General Business, Economics & Management (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The present disclosure generally relates to systems and methods for assessing and training brain function, specifically processes associated with perceptual-motor efficiency. In certain aspects, a discrimination task, or other perceptual-motor task involving certain moving visual objects, is presented to a subject, who responds by moving in a certain way, for example, by touching a response target based on the objects displayed and/or how the objects move. For example, the subject may use a virtual reality system and motion sensors attached to the subject's arms or held in the hands. Various parameters can be determined, e.g., reaction time, accuracy, etc., which may be useful for assessing the subject's perceptual, cognitive, and motor functioning. In addition, in certain embodiments, such tests can be given repeatedly, e.g., over several days, weeks, etc., to train or improve the subject's response.
PCT/US2023/027595 2022-07-19 2023-07-13 Systems and methods for assessment and training of perceptual-motor efficiency using movement WO2024019917A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
US202263390619P 2022-07-19 2022-07-19
US63/390,619 2022-07-19

Publications (1)

Publication Number Publication Date
WO2024019917A1 (fr) 2024-01-25

Family

ID=89618294

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/US2023/027595 WO2024019917A1 (fr) 2022-07-19 2023-07-13 Systems and methods for assessment and training of perceptual-motor efficiency using movement

Country Status (1)

Country Link
WO (1) WO2024019917A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20120075586A1 (en) * 2010-03-01 2012-03-29 David Gary Kirschen Methods and systems for intelligent visual function assessments
WO2017108441A1 (fr) * 2015-12-22 2017-06-29 Universiteit Maastricht Treatment of cognitive impairment using an sGC stimulator
WO2017141166A1 (fr) * 2016-02-19 2017-08-24 Hicheur Halim Device for assessing and training perceptual, cognitive, and motor performance, and method therefor
US20180090024A1 (en) * 2015-03-31 2018-03-29 Tali Health Pty Ltd System And Process For Cognitive Assessment And Training
US20180308383A1 (en) * 2017-04-20 2018-10-25 Millisecond Software Llc Device and method for measuring implicit attitudes
US20190026681A1 (en) * 2015-12-23 2019-01-24 Pymetrics, Inc. Systems and methods for data-driven identification of talent


Similar Documents

Publication Publication Date Title
JP7358305B2 (ja) Method for enhancing cognition and system for implementing the same
US20240000370A1 (en) Cognitive platform configured as a biomarker or other type of marker
Formenti et al. Perceptual vision training in non-sport-specific context: effect on performance skills and cognition in young females
Ladouce et al. Understanding minds in real-world environments: toward a mobile cognition approach
Yates et al. Virtual reality gaming in the rehabilitation of the upper extremities post-stroke
Wu et al. Playing a first-person shooter video game induces neuroplastic change
Tomeo et al. Fooling the kickers but not the goalkeepers: behavioral and neurophysiological correlates of fake action detection in soccer
Nakamoto et al. Experts in fast-ball sports reduce anticipation timing cost by developing inhibitory control
Moreau et al. Enhancing spatial ability through sport practice
Maeda et al. Functional properties of parietal hand manipulation–related neurons and mirror neurons responding to vision of own hand action
Ganin et al. A P300-based brain-computer interface with stimuli on moving objects: four-session single-trial and triple-trial tests with a game-like task design
Jaquess et al. Changes in mental workload and motor performance throughout multiple practice sessions under various levels of task difficulty
WO2018087408A1 (fr) System for the integral measurement of clinical parameters of visual function
EP3494565A1 (fr) Cognitive platform including computerized evocative elements
WO2018132483A1 (fr) Cognitive platform configured to determine the presence or likelihood of onset of a neuropsychological deficit or disorder
Brown et al. Repetitive transcranial magnetic stimulation to the primary motor cortex interferes with motor learning by observing
Gorbet et al. Move faster, think later: Women who play action video games have quicker visually-guided responses with later onset visuomotor-related brain activity
Güldenpenning et al. Motor expertise modulates the unconscious processing of human body postures
Brown et al. Effect of trial order and error magnitude on motor learning by observing
WO2019028376A1 (fr) Cognitive platform including computerized evocative elements in modes
McCormick et al. Eye gaze metrics reflect a shared motor representation for action observation and movement imagery
You et al. Unconscious response inhibition differences between table tennis athletes and non-athletes
US20120094263A1 (en) Video games for training sensory and perceptual skills
WO2024019917A1 (fr) Systems and methods for assessment and training of perceptual-motor efficiency using movement
Rugani et al. Numerical affordance influences action execution: a kinematic study of finger movement

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23843559

Country of ref document: EP

Kind code of ref document: A1