CN116392123A - Method and system for screening hyperactivity disorder (ADHD) based on game interaction and eye-movement tracking - Google Patents
- Publication number
- CN116392123A (application CN202310315525.3A)
- Authority
- CN
- China
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
- A61B5/168—Evaluating attention deficit, hyperactivity
Abstract
The embodiments of the present application disclose a method and system for screening hyperactivity disorder (ADHD) based on game interaction and eye-movement tracking. In the screening system, a task scene module establishes a virtual reality environment and an eye-movement interaction task scene; a virtual reality device module presents the virtual reality environment to a user and executes an eye-movement interaction task in the task scene; an eye-movement interaction module carries out eye-movement interaction with the user according to the task; an eye-movement data module collects eye-movement data while the user interacts; and an evaluation module receives the eye-movement data, evaluates it with a data evaluation model, and outputs an evaluation result. The method and system provide more objective parameter indices, not only improving the accuracy of eye-movement testing for ADHD patients, but also automatically marking and adjusting the content and order of task scenes according to the user's test history.
Description
Technical Field
The present application relates to the field of medical technology, and in particular to a method and system for screening hyperactivity disorder based on game interaction and eye-movement tracking.
Background
Attention deficit hyperactivity disorder, also known as hyperactivity, refers to a group of syndromes arising in childhood and characterized, relative to children of the same age, by marked difficulty concentrating, short attention span, and excessive activity or impulsivity and irritability. These symptoms are chronic and persistent; they can leave a child's development below the normal level for their age and impose physical, academic, and interpersonal burdens on patients and their families. However, current evaluation of hyperactivity still relies mainly on interviews, questionnaires, and clinical observation, which are strongly affected by subjective bias and situational dependence, cannot be objectively quantified, and make it difficult to systematically track evaluation status and disease progression.
Therefore, how to screen and evaluate hyperactivity more effectively has become a problem to be solved by those skilled in the art.
Disclosure of Invention
In view of the above, the present specification is presented in order to provide a system and method for screening hyperactivity based on game interaction and eye tracking that overcomes, or at least partially solves, the above problems.
Other features and advantages of the present disclosure will be apparent from the following detailed description, or may be learned in part by the practice of the disclosure.
In a first aspect, an embodiment of the present application provides a hyperactivity screening system based on game interaction and eye-movement tracking, including a virtual reality device module, a task scene module, an eye-movement interaction module, an eye-movement data module, and an evaluation module, wherein: the task scene module is used for establishing a virtual reality environment and an eye-movement interaction task scene; the virtual reality device module is used for presenting the virtual reality environment to a user and executing an eye-movement interaction task in the task scene; the eye-movement interaction module is used for carrying out eye-movement interaction with the user according to the task; the eye-movement data module is used for collecting eye-movement data while the user interacts and sending it to the evaluation module; and the evaluation module is used for receiving the eye-movement data sent by the eye-movement data module, evaluating it with a data evaluation model, and outputting an evaluation result.
In some embodiments, the virtual reality device module includes a headset, a controller, and an eye tracker, wherein: the headset is used for displaying the eye-movement interaction task interface in the virtual reality environment and playing sound effects corresponding to the task; the controller is used for completing configuration of the screening system and executing the eye-movement interaction task in the task scene; and the eye tracker is used for capturing the user's eyeball movements and recording and tracking the user's gaze trajectory.
In some embodiments, the eye-movement interaction tasks in the eye-movement interaction module comprise a plurality of subtasks, each comprising a guide phase and a test phase: the guide phase demonstrates the eye-movement interaction to the user, and the test phase is the phase in which the user performs the interaction.
In some embodiments, the eye-movement interaction module is a shooting-game interaction module whose eye-movement interaction subtasks include a normal-mode shooting task, a specific-target shooting task, and a reverse shooting task, wherein: the normal-mode shooting task is to find and shoot floating balloons in different everyday scenes; the specific-target shooting task is to find and shoot a specific target among balloons of different colors and shapes; and in the reverse shooting task a distractor stimulus appears beside the user's fixation point, and the user must move their gaze to the horizontal mirror position of the distractor to shoot.
In some embodiments, in the shooting-game interaction module, when the user performs an eye-movement interaction subtask, a marker is displayed at the center of the task interface, prompting the user to calibrate their gaze point and prepare to start the game. After a first preset duration the marker disappears and the user enters the guide phase of the subtask, in which the interface demonstrates the game's interaction rules. After the guide phase ends, the marker is displayed again at the center of the interface; after a second preset duration it disappears and the user enters the test phase of the subtask, completing one shooting action each time they move their eyes to fixate a target according to the game prompts.
In some embodiments, the system further includes a personalization module, which, after a user's previous eye-movement data evaluation ends, marks the task scene content and sequence used in that evaluation and reduces their frequency of occurrence when the user enters the next evaluation.
In some embodiments, the evaluation module includes an evaluation sub-module, which is configured to evaluate the eye movement data by using a support vector machine, and output an evaluation result.
In some embodiments, before evaluating the eye-movement data with the support vector machine and outputting the evaluation result, the method further comprises: acquiring eye-movement test sample data, comprising post-test eye-movement data from healthy people and from ADHD patients; dividing the sample data into a training set and a test set; and, based on those sets, training and testing the support vector machine with cross-validation and adjusting its model parameters to obtain the trained support vector machine.
In some embodiments, the eye-movement data includes, but is not limited to, two-dimensional gaze-point coordinates, gaze trajectory, fixation time, saccade time, average saccade time, and saccade direction error rate.
In a second aspect, embodiments of the present application provide a method for screening hyperactivity based on game interaction and eye-movement tracking, applied to the screening system, including: establishing a virtual reality environment and an eye-movement interaction task scene; presenting the virtual reality environment to a user and executing an eye-movement interaction task in the task scene; performing eye-movement interaction with the user according to the task; collecting the user's eye-movement data during the interaction and sending it to an evaluation module; and receiving the eye-movement data through the evaluation module, evaluating it with a data evaluation model, and outputting an evaluation result.
Embodiments of the present application can thus establish a virtual reality environment and an eye-movement interaction task scene; present the environment to the user and execute an eye-movement interaction task within it; carry out eye-movement interaction according to the task; collect and store the user's eye-movement data during the interaction and send it to the evaluation module; and finally receive the eye-movement data, evaluate it with the data evaluation model, and output an evaluation result.
The present invention not only adopts VR game interaction and eye-tracking technology, whose immersive experience is closer to a natural, real-world state and thus preserves the external validity of the test; its reverse shooting task also applies the antisaccade behavioral paradigm, so ADHD patients can be screened more effectively through their saccade direction error rate in that task. Meanwhile, a trained machine-learning support vector machine classifier predicts an ADHD score from the user's eye-movement data, improving the accuracy of eye-movement testing for ADHD patients, and the content and order of task scenes can be automatically marked and adjusted according to the user's test history. Compared with existing methods that rely mainly on questionnaires or behavioral observation, the application provides more objective parameter indices and assists medical workers in completing a preliminary diagnosis of childhood ADHD.
Drawings
In order to illustrate the technical solutions of the embodiments more clearly, the drawings needed for describing the embodiments are briefly introduced below. The drawings described below are obviously only some embodiments of the present application; those skilled in the art may obtain other drawings from them without inventive effort.
FIG. 1 is a schematic diagram of a system for screening for hyperactivity based on game interaction and eye tracking according to an embodiment of the present application;
FIG. 2 is a flow chart of a method for screening hyperactivity based on game interaction and eye movement tracking provided in an embodiment of the present application;
fig. 3 is a schematic structural diagram of a server according to an embodiment of the present application.
Detailed Description
The following description of the embodiments of the present application will be made clearly and fully with reference to the accompanying drawings, in which it is evident that the embodiments described are only some, but not all, of the embodiments of the present application. All other embodiments, which can be made by those skilled in the art based on the embodiments herein without making any inventive effort, are intended to be within the scope of the present application.
Embodiments of the present application provide a system and method for screening hyperactivity disorder based on game interaction and eye-movement tracking.
Virtual reality (VR) is a computer-simulated system for creating and experiencing a virtual world. It combines real-world data, electronic signals generated by computer technology, and various output devices to produce a realistic virtual world offering multiple sensory experiences such as three-dimensional vision, touch, and smell, giving the user a sense of immersion. The phenomena in the virtual world may be faithful reproductions of real objects, or things invisible to the naked eye represented through three-dimensional models. Because these phenomena are not observed directly but simulated by computer technology, the result is called virtual reality.
Eye tracking is the process of measuring the gaze point (where a person is looking) or the movement of the eyes relative to the head. Many disciplines use eye-tracking techniques, including cognitive science, psychology (in particular psycholinguistics and the visual-world paradigm), human-computer interaction (HCI), human factors and ergonomics, market research, and medical research (neurological diagnosis). Specific applications include tracking eye movement in language reading, music reading, human activity recognition, advertisement perception, sports, distraction detection, and cognitive load.
Attention deficit hyperactivity disorder (ADHD), also known as hyperactivity, refers to a group of syndromes arising in childhood and characterized, relative to children of the same age, by marked difficulty concentrating, short attention span, and excessive activity or impulsivity. These symptoms are chronic and persistent and can leave a child's development below the normal level for their age. Hyperactivity is a disorder most common in children, with a reported prevalence of 3%-5%.
In a first aspect, an embodiment of the present application provides a hyperactivity screening system based on game interaction and eye-movement tracking. As shown in FIG. 1, the screening system includes a virtual reality device module 100, a task scene module 110, an eye-movement interaction module 120, an eye-movement data module 130, and an evaluation module 140, wherein: the task scene module 110 is configured to establish a virtual reality environment and an eye-movement interaction task scene; the virtual reality device module 100 is configured to present the virtual reality environment to a user and perform an eye-movement interaction task in the task scene; the eye-movement interaction module 120 is configured to perform eye-movement interaction with the user according to the task; the eye-movement data module 130 is configured to collect eye-movement data while the user interacts and send it to the evaluation module 140; and the evaluation module 140 is configured to receive the eye-movement data, evaluate it with a data evaluation model, and output an evaluation result.
The eye-movement data includes, but is not limited to, two-dimensional gaze-point coordinates, gaze trajectory, fixation time, saccade time, average saccade time, and saccade direction error rate.
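Per test session, the indices listed above could be held in a simple record and flattened into a feature vector for the evaluation model. This is an illustrative sketch only; the field names are hypothetical and not taken from the patent.

```python
from dataclasses import dataclass, field
from typing import List, Tuple


@dataclass
class EyeMovementFeatures:
    """Illustrative container for the per-session eye-movement indices
    described above (all field names are assumptions)."""
    gaze_points: List[Tuple[float, float]] = field(default_factory=list)  # 2-D gaze coordinates
    fixation_time_ms: float = 0.0         # total fixation duration
    saccade_time_ms: float = 0.0          # total saccade duration
    mean_saccade_time_ms: float = 0.0     # average saccade time
    antisaccade_error_rate: float = 0.0   # wrong-direction saccades / antisaccade trials

    def as_vector(self) -> List[float]:
        """Flatten the scalar indices into a feature vector for a classifier."""
        return [self.fixation_time_ms, self.saccade_time_ms,
                self.mean_saccade_time_ms, self.antisaccade_error_rate]
```

The raw gaze trajectory is kept separately from the scalar summary indices, since only the latter would feed the classifier directly.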
In this embodiment, the task scene module 110 establishes the virtual reality environment and task scene. The module is developed with the Unity engine, and picture materials in the scene can be drawn in Photoshop or similar tools. The eye-movement data module 130 collects, archives, and processes eye-movement data; when archiving, it also backs the data up to the cloud, and processing here means sending the eye-movement data to the data evaluation model of the evaluation module 140. The evaluation module 140 receives the eye-movement data transmitted by the eye-movement data module 130 and outputs an evaluation result through the data evaluation model. The evaluation result may include whether the user has hyperactivity and the probability of hyperactivity.
In some embodiments, the virtual reality device module 100 includes a headset, a controller, and an eye tracker, wherein: the headset is used for displaying the eye-movement interaction task interface in the virtual reality environment and playing sound effects corresponding to the task; the controller is used for completing configuration of the screening system and executing the eye-movement interaction task in the task scene; and the eye tracker is used for capturing the user's eyeball movements and recording and tracking the user's gaze trajectory.
In this embodiment, the virtual reality device module 100 is an all-in-one VR device, which may include a highly integrated headset, several controllers, an eye tracker, and the like. The headset may be a VR helmet that displays the task interface and plays sound effects. The controllers may be integrated into the VR helmet and assist the user in completing system configuration and performing tasks. The eye tracker captures eye movements and records and tracks the user's gaze trajectory. In this embodiment, eye-movement data may be recorded with an eye tracker such as a 90 Hz Tobii tracker; in other embodiments other devices may record the eye-movement data, and the application is not limited in this respect.
In some embodiments, the eye-movement interaction tasks in the eye-movement interaction module 120 comprise a plurality of subtasks, each comprising a guide phase and a test phase: the guide phase demonstrates the eye-movement interaction to the user, and the test phase is the phase in which the user performs the interaction.
In some embodiments, the eye-movement interaction module 120 is a shooting-game interaction module whose eye-movement interaction subtasks include a normal-mode shooting task, a specific-target shooting task, and a reverse shooting task, wherein: the normal-mode shooting task is to find and shoot floating balloons in different everyday scenes; the specific-target shooting task is to find and shoot a specific target among balloons of different colors and shapes; and in the reverse shooting task a distractor stimulus appears beside the user's fixation point, and the user must move their gaze to the horizontal mirror position of the distractor to shoot.
In this embodiment, the shooting game serves as the theme, and the eye-movement interaction subtasks in the shooting-game interaction module may include a normal-mode shooting task, a specific-target shooting task, and a reverse shooting task, each divided into a guide phase and a test phase. The guide phase is a game demonstration; in the test phase the user formally starts playing and, following the game prompts, completes shooting actions by moving their eyes to fixate targets. The normal-mode shooting task may involve finding and shooting floating balloons in various life scenes; the specific-target shooting task may involve finding and shooting a specific target object among balloons of different colors and shapes; and in the reverse shooting task a distractor stimulus appears beside the fixation point, and the user must move their gaze to its horizontal mirror position to shoot.
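A gaze-based shooting interaction ultimately reduces to testing whether the current gaze point falls on a target, and the reverse task additionally needs the horizontal mirror of the distractor. A minimal sketch, in which the normalized coordinate convention and the hit radius are assumptions, not details from the patent:

```python
def is_hit(gaze_xy, target_xy, radius=0.05):
    """Return True when the 2-D gaze point lies within `radius` of the
    target's center (normalized screen coordinates; radius is assumed)."""
    dx = gaze_xy[0] - target_xy[0]
    dy = gaze_xy[1] - target_xy[1]
    return dx * dx + dy * dy <= radius * radius


def mirror_position(stimulus_xy):
    """Horizontal mirror of a distractor about the screen's vertical
    midline (x = 0.5), as required by the reverse shooting task."""
    return (1.0 - stimulus_xy[0], stimulus_xy[1])
```

In the reverse task a shot would then count as correct only when `is_hit(gaze, mirror_position(distractor))` holds rather than `is_hit(gaze, distractor)`.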
A saccade is a jump of the eyeball that relocates the gaze point to a location requiring attention so that the needed information can be extracted. Saccade generation is affected by many factors, such as a stimulus's color and motion state and the observer's attentional state, and saccades can be divided into two types by how they are generated: exogenous saccades, driven mainly by visual stimulus characteristics, and endogenous saccades, driven mainly by goals, expectations, cognitive strategies, and similar factors.
The saccade direction error rate is the proportion of wrong-direction saccades made in an antisaccade task. A stimulus suddenly appearing in the visual field triggers a reflexive saccade toward it, but in the antisaccade task this action must be inhibited; the saccade direction error rate therefore reflects the subject's voluntary control of saccades and is an important index of their response inhibition function.
In this embodiment, the reverse shooting task is based on the antisaccade experimental paradigm: during game interaction a distractor stimulus appears beside the fixation point, and shooting is performed by moving the gaze to the distractor's horizontal mirror position. The neural circuits controlling exogenous eye-movement behavior in children with ADHD function normally or nearly so, whereas their oculomotor inhibition (the endogenous components of fixation and saccades) is impaired, so the saccade direction error rate of ADHD patients differs significantly from that of normal controls.
In the antisaccade paradigm, a central fixation point is first presented at the center of the screen, after which a target stimulus appears somewhere to its left or right. Unlike the conventional prosaccade task, in the antisaccade task the user must look not at the peripheral target stimulus but at the mirror position opposite it.
In some embodiments, in the shooting-game interaction module, when the user performs an eye-movement interaction subtask, a marker is displayed at the center of the task interface, prompting the user to calibrate their gaze point and prepare to start the game. After a first preset duration the marker disappears and the user enters the guide phase of the subtask, in which the interface demonstrates the game's interaction rules. After the guide phase ends, the marker is displayed again at the center of the interface; after a second preset duration it disappears and the user enters the test phase of the subtask, completing one shooting action each time they move their eyes to fixate a target according to the game prompts.
In some embodiments, the system further includes a personalization module 150, which, after a user's previous eye-movement data evaluation ends, marks the task scene content and sequence used in that evaluation and reduces their frequency of occurrence when the user enters the next evaluation.
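One simple way to realize "reduce the frequency of occurrence" is weighted sampling that down-weights the scene content used in the previous evaluation. The penalty factor below is an assumption; the patent does not specify how much the frequency is reduced:

```python
import random


def pick_scene(scenes, last_used, penalty=0.5):
    """Weighted scene selection sketch for the personalization module.

    Scenes marked as used in the previous evaluation (`last_used`) have
    their sampling weight scaled down by `penalty` (assumed value), so
    they appear less often in the next evaluation without being excluded.
    """
    weights = [penalty if s in last_used else 1.0 for s in scenes]
    return random.choices(scenes, weights=weights, k=1)[0]
```

With `penalty=0.5`, a previously used scene is roughly half as likely to be drawn as a fresh one; setting `penalty=0.0` would exclude it entirely.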
In some embodiments, the evaluation module 140 includes an evaluation sub-module, which is configured to evaluate the eye movement data by using a support vector machine, and output an evaluation result.
In some embodiments, before the eye movement data are evaluated by the support vector machine and the evaluation result is output, the method further comprises: acquiring eye movement test sample data, the sample data comprising post-test eye movement data of healthy people and of patients with hyperkinetic symptoms; dividing the eye movement test sample data into an eye movement data sample training set and an eye movement data sample test set; and, based on the training set and the test set, training and testing the support vector machine by a cross validation method while adjusting its model parameters, to obtain the trained support vector machine.
In the embodiment of the application, the support vector machine (Support Vector Machine, SVM) is a generalized linear classifier that performs binary classification of data in a supervised learning manner. In the embodiment of the application, the eye movement data are evaluated by the support vector machine, which is trained before evaluation results are output. Specifically: sample collection — collect and collate post-test eye movement data of healthy people and hyperkinetic patients; model evaluation — divide the sample data into a training set and a validation set and evaluate the model using a cross validation method; model training — train the support vector machine while adjusting the penalty parameter C and the gamma parameter. The trained support vector machine is then used to predict hyperkinetic symptom scores from the user's eye movement data.
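The training procedure described above could be sketched with scikit-learn as follows; the library choice, the feature format, and the parameter grids are assumptions, since the embodiment only specifies an SVM whose C and gamma parameters are tuned with cross validation:

```python
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVC


def train_svm(features, labels):
    """features: per-user eye-movement metric vectors;
    labels: 0 = healthy control, 1 = hyperkinetic symptoms (assumed coding)."""
    # Hold out a test split, stratified so both classes appear in each split.
    x_train, x_test, y_train, y_test = train_test_split(
        features, labels, test_size=0.2, stratify=labels, random_state=0)
    # Cross-validated grid search over the penalty parameter C and the
    # RBF kernel width gamma, as described in the text.
    grid = GridSearchCV(
        SVC(kernel="rbf"),
        param_grid={"C": [0.1, 1.0, 10.0], "gamma": ["scale", 0.01, 0.1]},
        cv=5,
    )
    grid.fit(x_train, y_train)
    # Return the best model and its held-out accuracy.
    return grid.best_estimator_, grid.score(x_test, y_test)
```

Here `grid.best_estimator_` would then be used to score new users' eye movement data, and `grid.score` gives an estimate of screening accuracy on unseen samples.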
Further, the hyperkinetic symptom screening system based on game interaction and eye movement tracking in the embodiment of the application includes a client 10 and a server 20, where the virtual reality device module 100, the task scene module 110, the eye movement interaction module 120, the eye movement data module 130, and the evaluation module 140 are located at the client 10. When using the client 10, the user wears the VR helmet and turns on its power supply; the client 10 then automatically starts the cloud storage service and backs up the acquired eye movement data to the cloud. On entering the eye movement interaction module 120, a mark (such as a cross-star mark) appears at the center of the screen so that the user can calibrate the gaze point and prepare to start the game. After a first preset time (such as 2 seconds) the mark disappears and the guiding stage begins, with the game play demonstrated on the screen. After the guiding stage ends, the mark appears again at the center of the screen and disappears after a second preset time (such as 2 seconds), at which point the test formally starts: the user moves the eyeballs to gaze at targets according to the game prompts, each gaze completing one shooting action. In the embodiment of the application, the test time of each subtask can be set to 5-10 minutes. The evaluation module 140 outputs an evaluation result including whether the user has hyperkinetic symptoms and the probability of hyperkinetic symptoms (i.e., the confidence of the hyperkinetic screening).
In the embodiment of the application, the server 20 and the client 10 of the hyperkinetic symptom screening system based on game interaction and eye movement tracking are started synchronously. The server 20 is used for recording and backing up user IDs, subtask IDs, the duration of each subtask, and the start and end times of the whole test session; for receiving and backing up eye movement data and importing the constructed data evaluation model to the client 10; and for exporting the evaluation result from the client 10.
The task scene module 110, the eye movement interaction module 120, the eye movement data module 130, and the evaluation module 140 in the client 10, together with the server 20, of the hyperkinetic symptom screening system based on game interaction and eye movement tracking can be integrated in an electronic device, which can be a terminal, a server, or the like. The terminal can be a mobile phone, a tablet computer, a smart Bluetooth device, a notebook computer, a personal computer (Personal Computer, PC), or the like; the server can be a single server or a server cluster composed of a plurality of servers.
In some embodiments, the task scene module 110, the eye movement interaction module 120, the eye movement data module 130, and the evaluation module 140 in the client 10, and the server 20, may be integrated in a plurality of electronic devices, for example in a plurality of servers, such that the hyperkinetic symptom screening method based on game interaction and eye movement tracking of the present application is implemented jointly by those servers.
In some embodiments, the task scene module 110, the eye-movement interaction module 120, the eye-movement data module 130, the assessment module 140, and the server 20 in the client 10 may correspond to a first server, a second server, a third server, a fourth server, and a fifth server, respectively.
The first server, corresponding to the task scene module 110, is configured to establish the virtual reality environment, the eye movement interaction task scene, and the like. The second server, corresponding to the eye movement interaction module 120, is configured to perform eye movement interaction with the user according to the eye movement interaction task. The third server, corresponding to the eye movement data module 130, is configured to collect eye movement data while the user performs eye movement interaction, send the eye movement data to the evaluation module, and so on. The fourth server, corresponding to the evaluation module 140, is configured to receive the eye movement data sent by the eye movement data module, evaluate the eye movement data through the data evaluation model, output an evaluation result, and so on. The fifth server, corresponding to the server 20, is configured to record and back up the user ID, the subtask IDs, the duration of each subtask, and the start and end times of the whole test session; to receive and back up eye movement data and import the constructed data evaluation model to the client 10; and to export the evaluation result from the client 10.
According to the invention, VR game interaction and eye movement tracking technology are adopted, and the immersive experience is closer to a natural, real-world state, ensuring the external validity of the test. In addition, the reverse shooting task in the eye movement interaction module uses the reverse eye-jump (antisaccade) behavioral paradigm, so hyperkinetic patients can be screened more effectively through their eye-jump direction error rate in this task. Meanwhile, the trained machine-learning support vector machine classifier predicts hyperkinetic symptom scores from the user's eye movement data, improving the accuracy of the eye movement test for hyperkinetic patients; and the content and sequence of the task scenes can be automatically marked and adjusted according to the user's test performance. Compared with existing methods that rely mainly on questionnaires or behavioral observation, the present application can provide more objective parameter indexes and assist medical workers in completing a preliminary diagnosis of childhood hyperkinetic syndrome.
In a second aspect, an embodiment of the present application further provides a method for screening hyperkinetic symptoms based on game interaction and eye movement tracking, as shown in fig. 2, where the method for screening hyperkinetic symptoms based on game interaction and eye movement tracking may include the following specific steps:
S200, establishing a virtual reality environment and an eye movement interaction task scene.
S210, displaying the virtual reality environment to the user, and executing the eye movement interaction task in the eye movement interaction task scene.
In the embodiment of the application, the eye movement interaction task interface can be displayed in the virtual reality environment through the head-mounted end, with the sound effects corresponding to the eye movement interaction task being played; the configuration of the hyperkinetic screening system is completed through the controller, and the eye movement interaction task is executed in the eye movement interaction task scene; and the eye tracker captures the user's eyeball movement, recording and tracking the user's eye movement track.
S220, performing eye movement interaction with the user according to the eye movement interaction task.
In the embodiment of the application, the eye movement interaction task in the eye movement interaction module comprises a plurality of eye movement interaction subtasks, each comprising a guiding stage and a testing stage; the guiding stage is an eye movement interaction demonstration stage for the user, and the testing stage is the stage in which the user performs eye movement interaction.
In some embodiments, the eye movement interaction subtasks include a normal mode shooting task, a specific target shooting task, and a reverse shooting task, wherein: the normal mode shooting task is to find floating balloons in different life scenes and shoot them; the specific target shooting task is to find specific targets among balloons of different colors and shapes and shoot them; and in the reverse shooting task, an interference stimulus is placed beside the user's fixation point, and the user must move the gaze to the horizontal mirror-image position of the interference stimulus to shoot.
In some embodiments, when the user performs an eye movement interaction subtask, a mark is displayed at the central position of the eye movement interaction task interface; the mark prompts the user to perform gaze point calibration and prepare to start the game. When the mark at the central position disappears after a first preset time length, the user enters the guiding stage of the subtask, in which the interface demonstrates the game interaction rules. After the guiding stage ends, the mark is displayed again at the central position; after a second preset time length it disappears, and the user enters the test stage of the subtask, completing one shooting action by moving the eyeballs to gaze at the target according to the game prompt.
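A minimal sketch of the subtask timeline described above (the 2-second presets match the example values given elsewhere in the text; the guiding-stage length here is an arbitrary assumption):

```python
def phase_at(t, first_preset=2.0, guide_len=10.0, second_preset=2.0):
    """Map elapsed seconds t since subtask start to the current phase:
    calibration marker -> guiding demo -> marker again -> test stage."""
    boundaries = [
        ("calibration_marker", first_preset),
        ("guiding", guide_len),
        ("marker", second_preset),
    ]
    for name, duration in boundaries:
        if t < duration:
            return name
        t -= duration
    # Everything after the second marker disappears is the test stage.
    return "test"
```

This makes explicit that the test stage only begins once the calibration marker has been shown twice and the guiding demonstration has finished.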
S230, acquiring eye movement data of a user during eye movement interaction, and sending the eye movement data to an evaluation module.
In some embodiments, the eye movement data include, but are not limited to, the two-dimensional coordinates of the gaze point, the eye movement track, fixation times, saccade times, the average eye-jump time, and the eye-jump direction error rate.
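As a hypothetical illustration of how such metrics might be derived (the velocity threshold, units, and sample format are assumptions; the embodiment does not specify its algorithm), a simple velocity-threshold split of a gaze trace into fixation time and saccade time:

```python
import math


def fixation_saccade_time(samples, threshold=30.0):
    """samples: list of (t, x, y) gaze points in chronological order.

    Classifies each inter-sample interval as fixation (slow gaze) or
    saccade (fast gaze) by point-to-point velocity, and returns the
    accumulated (fixation_seconds, saccade_seconds)."""
    fix, sac = 0.0, 0.0
    for (t0, x0, y0), (t1, x1, y1) in zip(samples, samples[1:]):
        dt = t1 - t0
        if dt <= 0:
            continue  # skip dropped or duplicated samples
        speed = math.hypot(x1 - x0, y1 - y0) / dt
        if speed < threshold:
            fix += dt
        else:
            sac += dt
    return fix, sac
```

Metrics like these, computed per subtask, would form the feature vector handed to the evaluation module.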
S240, receiving the eye movement data through an evaluation module, evaluating the eye movement data through a data evaluation model, and outputting an evaluation result.
In some embodiments, the eye movement data may also be evaluated by a support vector machine, and the evaluation result may be output.
In some embodiments, before the eye movement data are evaluated by the support vector machine and the evaluation result is output, the method further comprises: acquiring eye movement test sample data, the sample data comprising post-test eye movement data of healthy people and of patients with hyperkinetic symptoms; dividing the eye movement test sample data into an eye movement data sample training set and an eye movement data sample test set; and, based on the training set and the test set, training and testing the support vector machine by a cross validation method while adjusting its model parameters, to obtain the trained support vector machine.
In some embodiments, after a user's previous eye movement data evaluation ends, the task scene content and sequence corresponding to that evaluation are marked, and their occurrence frequency is reduced when the user enters the next eye movement data evaluation.
The embodiment of the application can establish a virtual reality environment and an eye movement interaction task scene; then, the virtual reality environment is displayed to the user, and an eye movement interaction task is executed in an eye movement interaction task scene; then performing eye movement interaction with the user according to the eye movement interaction task; then, collecting eye movement data of a user in the eye movement interaction process, storing the eye movement data and sending the eye movement data to an evaluation module; and finally, receiving the eye movement data sent by the eye movement data module, evaluating the eye movement data through the data evaluation model, and outputting an evaluation result.
According to the invention, VR game interaction and eye movement tracking technology are adopted, and the immersive experience is closer to a natural, real-world state, ensuring the external validity of the test. In addition, the reverse shooting task in the eye movement interaction module uses the reverse eye-jump (antisaccade) behavioral paradigm, so hyperkinetic patients can be screened more effectively through their eye-jump direction error rate in this task. Meanwhile, the trained machine-learning support vector machine classifier predicts hyperkinetic symptom scores from the user's eye movement data, improving the accuracy of the eye movement test for hyperkinetic patients; and the content and sequence of the task scenes can be automatically marked and adjusted according to the user's test performance. Compared with existing methods that rely mainly on questionnaires or behavioral observation, the present application can provide more objective parameter indexes and assist medical workers in completing a preliminary diagnosis of childhood hyperkinetic syndrome.
In the embodiment of the present application, a detailed description is given taking as an example that each module of the embodiment is a server. For example, fig. 3 shows a schematic structural diagram of the server according to the embodiment of the present application. Specifically:
The server may include one or more processors 301 with one or more processing cores, a memory 302 of one or more computer-readable storage media, a power supply 303, an input module 304, a communication module 305, and other components. Those skilled in the art will appreciate that the server structure shown in fig. 3 does not limit the server, which may include more or fewer components than shown, combine certain components, or arrange the components differently. Wherein:
the processor 301 is the control center of the server; it connects the various parts of the entire server using various interfaces and lines, and performs the server's functions and processes data by running or executing the software programs and/or modules stored in the memory 302 and calling the data stored in the memory 302, thereby monitoring the server as a whole. In some embodiments, the processor 301 may include one or more processing cores; in some embodiments, the processor 301 may integrate an application processor, which mainly handles the operating system, user interfaces, applications, and the like, with a modem processor, which mainly handles wireless communication. It will be appreciated that the modem processor may also not be integrated into the processor 301.
The memory 302 may be used to store software programs and modules, and the processor 301 executes various functional applications and performs data processing by running the software programs and modules stored in the memory 302. The memory 302 may mainly include a program storage area and a data storage area, wherein the program storage area may store the operating system, application programs required for at least one function (such as a sound playing function or an image playing function), and the like, and the data storage area may store data created according to the use of the server, and so on. In addition, the memory 302 may include high-speed random access memory and may also include non-volatile memory, such as at least one magnetic disk storage device, a flash memory device, or another non-volatile solid-state storage device. Accordingly, the memory 302 may also include a memory controller to provide the processor 301 with access to the memory 302.
The server also includes a power supply 303 that powers the various components, and in some embodiments, the power supply 303 may be logically connected to the processor 301 through a power management system to perform functions such as managing charging, discharging, and power consumption. The power supply 303 may also include one or more of any components, such as a direct current or alternating current power supply, a recharging system, a power failure detection circuit, a power converter or inverter, a power status indicator, and the like.
The server may also include an input module 304, which input module 304 may be used to receive entered numeric or character information and to generate keyboard, mouse, joystick, optical or trackball signal inputs related to user settings and function control.
The server may also include a communication module 305, and in some embodiments the communication module 305 may include a wireless module, through which the server may wirelessly transmit over short distances, thereby providing wireless broadband internet access to the user. For example, the communication module 305 may be used to assist a user in e-mail, browsing web pages, accessing streaming media, and the like.
Although not shown, the server may further include a display unit and the like, which are not described here. In particular, in this embodiment, the processor 301 in the server loads the executable files corresponding to the processes of one or more application programs into the memory 302 according to the following instructions, and executes the application programs stored in the memory 302, thereby implementing the various functions of the hyperkinetic symptom screening apparatus based on game interaction and eye movement tracking.
In some embodiments, a computer program product is also provided, comprising a computer program or instructions which, when executed by a processor, implement the steps of any of the above hyperkinetic symptom screening methods based on game interaction and eye movement tracking.
The specific implementation of each operation above may be referred to the previous embodiments, and will not be described herein.
From the above, the embodiment of the application adopts VR game interaction and eye movement tracking technology, and the immersive experience is closer to a natural, real-world state, ensuring the external validity of the test. In addition, the reverse shooting task in the eye movement interaction module uses the reverse eye-jump (antisaccade) behavioral paradigm, so hyperkinetic patients can be screened more effectively through their eye-jump direction error rate in this task. Meanwhile, the trained machine-learning support vector machine classifier predicts hyperkinetic symptom scores from the user's eye movement data, improving the accuracy of the eye movement test for hyperkinetic patients; and the content and sequence of the task scenes can be automatically marked and adjusted according to the user's test performance. Compared with existing methods that rely mainly on questionnaires or behavioral observation, the present application can provide more objective parameter indexes and assist medical workers in completing a preliminary diagnosis of childhood hyperkinetic syndrome.
Those of ordinary skill in the art will appreciate that all or part of the steps of the various methods of the above embodiments may be performed by instructions, or by instructions controlling the associated hardware; the instructions may be stored in a computer-readable storage medium and loaded and executed by a processor.
To this end, embodiments of the present application provide a computer-readable storage medium having stored therein a plurality of instructions that can be loaded by a processor to perform the steps of any hyperkinetic symptom screening method based on game interaction and eye movement tracking provided by the embodiments of the present application. For example, the instructions may perform the following steps: first, establishing a virtual reality environment and an eye movement interaction task scene; then displaying the virtual reality environment to a user and executing an eye movement interaction task in the eye movement interaction task scene; then performing eye movement interaction with the user according to the eye movement interaction task; then acquiring the user's eye movement data during the eye movement interaction, storing them, and transmitting them to an evaluation module; and finally receiving the eye movement data sent by the eye movement data module, evaluating them through a data evaluation model, and outputting an evaluation result.
The storage medium may include: read-only memory (ROM, Read Only Memory), random access memory (RAM, Random Access Memory), a magnetic disk or an optical disk, and the like.
According to one aspect of the present application, there is provided a computer program product or computer program comprising computer instructions stored in a computer-readable storage medium. A processor of a computer device reads the computer instructions from the computer-readable storage medium and executes them, causing the computer device to perform the methods provided in the various alternative implementations of the hyperkinetic symptom screening method based on game interaction and eye movement tracking in the above embodiments.
Because the instructions stored in the storage medium can execute the steps of any hyperkinetic symptom screening method based on game interaction and eye movement tracking provided in the embodiments of the present application, they can achieve the beneficial effects of any such method; these are detailed in the previous embodiments and are not repeated here.
The above has described in detail the hyperkinetic symptom screening method, apparatus, server, and computer-readable storage medium based on game interaction and eye movement tracking provided in the embodiments of the present application. Specific examples have been applied herein to illustrate the principles and implementations of the present application, and the above description of the embodiments is only intended to help understand the method and its core ideas. Meanwhile, those skilled in the art may vary the specific implementations and the scope of application according to the ideas of the present application; in summary, the contents of this specification should not be construed as limiting the present application.
Claims (10)
1. A hyperkinetic symptom screening system based on game interaction and eye movement tracking, characterized by comprising a virtual reality device module, a task scene module, an eye movement interaction module, an eye movement data module, and an evaluation module, wherein:
The task scene module is used for establishing a virtual reality environment and an eye movement interaction task scene;
the virtual reality equipment module is used for displaying the virtual reality environment to a user and executing an eye movement interaction task in the eye movement interaction task scene;
the eye movement interaction module is used for carrying out eye movement interaction with the user according to the eye movement interaction task;
the eye movement data module is used for collecting eye movement data when the user performs eye movement interaction and sending the eye movement data to the evaluation module;
the evaluation module is used for receiving the eye movement data sent by the eye movement data module, evaluating the eye movement data through a data evaluation model and outputting an evaluation result.
2. The game-interactive and eye-tracking based hyperkinetic screening system of claim 1, wherein the virtual reality device module comprises a head-mounted end, a controller, and an eye tracker, wherein:
the head-mounted end is used for displaying the eye movement interaction task interface in the virtual reality environment and playing the sound effects corresponding to the eye movement interaction task;
the controller is used for completing the configuration of the hyperkinetic screening system and executing the eye movement interaction task in the eye movement interaction task scene;
The eyeball tracker is used for capturing eyeball movement of the user and recording and tracking the eye movement track of the user.
3. The hyperkinetic symptom screening system based on game interaction and eye movement tracking according to claim 1 or 2, wherein the eye movement interaction task in the eye movement interaction module comprises a plurality of eye movement interaction subtasks, each comprising a guiding stage and a testing stage, the guiding stage being an eye movement interaction demonstration stage for the user and the testing stage being the stage in which the user performs eye movement interaction.
4. The hyperkinetic symptom screening system based on game interaction and eye movement tracking according to claim 3, wherein the eye movement interaction module is a shooting game interaction module, and the plurality of eye movement interaction subtasks in the shooting game interaction module comprise a normal mode shooting task, a specific target shooting task, and a reverse shooting task, wherein:
the normal mode shooting task is to find floating balloons in different living scenes and shoot the floating balloons;
the specific target shooting task is to find specific targets in balloons with different colors and shapes and shoot the specific targets;
in the reverse shooting task, an interference stimulus is arranged beside the user's fixation point, and the user needs to move the gaze to the horizontal mirror-image position of the interference stimulus to shoot.
5. The hyperkinetic symptom screening system based on game interaction and eye movement tracking according to claim 4, wherein, in the shooting game interaction module, when the user performs the eye movement interaction subtask, a mark is displayed at the central position of the eye movement interaction task interface, the mark being used to prompt the user to perform gaze point calibration and prepare to start the game; when the mark at the central position of the eye movement interaction task interface disappears after a first preset time length, the user enters the guiding stage of the eye movement interaction subtask, and the eye movement interaction task interface demonstrates the game interaction rules; after the guiding stage ends, the mark is displayed again at the central position of the eye movement interaction task interface, and after a second preset time length the mark disappears, whereupon the user enters the test stage of the eye movement interaction subtask and completes a shooting action by moving an eyeball to gaze at a target according to the game prompt.
6. The hyperkinetic symptom screening system based on game interaction and eye movement tracking according to claim 1, further comprising a personalization processing module, wherein the personalization processing module is configured to mark the task scene content and sequence corresponding to a user's previous eye movement data after the previous eye movement data evaluation ends, and to reduce the occurrence frequency of that task scene content and sequence when the user enters the next eye movement data evaluation.
7. The hyperkinetic symptom screening system based on game interaction and eye movement tracking according to claim 1, wherein the evaluation module comprises an evaluation sub-module configured to evaluate the eye movement data by a support vector machine and output an evaluation result.
8. The hyperkinetic symptom screening system based on game interaction and eye movement tracking according to claim 7, wherein, before the eye movement data are evaluated by the support vector machine and the evaluation result is output, the method further comprises:
acquiring eye movement test sample data, wherein the eye movement test sample data comprise post-test eye movement data of healthy people and of patients with hyperkinetic symptoms;
Dividing the eye movement test sample data into an eye movement data sample training set and an eye movement data sample test set;
based on the eye movement data sample training set and the eye movement data sample testing set, training and testing the support vector machine by adopting a cross validation method, and simultaneously adjusting model parameters in the support vector machine to obtain the support vector machine after training.
9. The hyperkinetic symptom screening system based on game interaction and eye movement tracking according to claim 1, wherein the eye movement data include, but are not limited to, the two-dimensional coordinates of the gaze point, the eye movement track, fixation times, saccade times, the average eye-jump time, and the eye-jump direction error rate.
10. A hyperkinetic symptom screening method based on game interaction and eye movement tracking, applied to the hyperkinetic symptom screening system based on game interaction and eye movement tracking, characterized by comprising:
establishing a virtual reality environment and an eye movement interaction task scene;
displaying the virtual reality environment to a user, and executing an eye movement interaction task in the eye movement interaction task scene;
performing eye movement interaction with the user according to the eye movement interaction task;
collecting eye movement data when the user performs eye movement interaction, and sending the eye movement data to an evaluation module;
And receiving the eye movement data through the evaluation module, evaluating the eye movement data through a data evaluation model, and outputting an evaluation result.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310315525.3A CN116392123A (en) | 2023-03-28 | 2023-03-28 | Multi-movement symptom screening method and system based on game interaction and eye movement tracking |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116392123A true CN116392123A (en) | 2023-07-07 |
Family
ID=87009648
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310315525.3A Pending CN116392123A (en) | 2023-03-28 | 2023-03-28 | Multi-movement symptom screening method and system based on game interaction and eye movement tracking |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116392123A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117198537A (en) * | 2023-11-07 | 2023-12-08 | 北京无疆脑智科技有限公司 | Task completion data analysis method and device, electronic equipment and storage medium |
CN117198537B (en) * | 2023-11-07 | 2024-03-26 | 北京无疆脑智科技有限公司 | Task completion data analysis method and device, electronic equipment and storage medium |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP7049379B2 (en) | Increased cognition in the presence of attention diversion and/or disturbance | |
Klaib et al. | Eye tracking algorithms, techniques, tools, and applications with an emphasis on machine learning and Internet of Things technologies | |
US11696720B2 (en) | Processor implemented systems and methods for measuring cognitive abilities | |
Wu et al. | Eye-tracking metrics predict perceived workload in robotic surgical skills training | |
Chen et al. | Automatic classification of eye activity for cognitive load measurement with emotion interference | |
Islam et al. | Cybersickness prediction from integrated hmd’s sensors: A multimodal deep fusion approach using eye-tracking and head-tracking data | |
Dye et al. | Increasing speed of processing with action video games | |
Huynh et al. | Engagemon: Multi-modal engagement sensing for mobile games | |
JP2019534061A (en) | Cognitive platform connected to physiological components | |
Robinson et al. | "Let's Get Physiological, Physiological!" A Systematic Review of Affective Gaming | |
US20230225609A1 (en) | A system and method for providing visual tests | |
Knobel et al. | Development of a search task using immersive virtual reality: proof-of-concept study | |
Krueger et al. | Microsaccades distinguish looking from seeing | |
Lee et al. | Predicting mind-wandering with facial videos in online lectures | |
CN116392123A (en) | Multi-movement symptom screening method and system based on game interaction and eye movement tracking | |
Orlosky et al. | Using eye tracked virtual reality to classify understanding of vocabulary in recall tasks | |
Atkinson et al. | Design of an introductory medical gaming environment for diagnosis and management of Parkinson's disease | |
Pedroli et al. | A virtual reality test for the assessment of cognitive deficits: usability and perspectives | |
Joselli et al. | MindNinja: Concept, development and evaluation of a mind action game based on EEGs | |
Blankertz et al. | 23—BCI APPLICATIONS FOR THE GENERAL POPULATION | |
WO2024134621A1 (en) | Systems and methods for assessing social skills in virtual reality | |
Derick et al. | Study of the user's eye tracking to analyze the blinking behavior while playing a video game to identify cognitive load levels | |
Thilderkvist et al. | On current limitations of online eye-tracking to study the visual processing of source code | |
US20150160474A1 (en) | Corrective lens prescription adaptation system for personalized optometry | |
Parsons et al. | Neurocognitive workload assessment using the virtual reality cognitive performance assessment test |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||