CN115089181A - Fatigue measurement method and system based on eye movement data - Google Patents


Publication number
CN115089181A
CN115089181A
Authority
CN
China
Prior art keywords
eye movement
eye
calculating
saccade
blink
Prior art date
Legal status (assumed; not a legal conclusion)
Pending
Application number
CN202210720565.1A
Other languages
Chinese (zh)
Inventor
侯文军
刘佳鑫
朱喜明
陈筱琳
王柯然
赵欣
刘华
曹晨
Current Assignee
Beijing University of Posts and Telecommunications
CETC 27 Research Institute
Original Assignee
Beijing University of Posts and Telecommunications
CETC 27 Research Institute
Priority date
Filing date
Publication date
Application filed by Beijing University of Posts and Telecommunications, CETC 27 Research Institute filed Critical Beijing University of Posts and Telecommunications
Priority to CN202210720565.1A
Publication of CN115089181A

Classifications

    • A61B 5/16: Devices for psychotechnics; testing reaction times; devices for evaluating the psychological state
    • A61B 5/163: Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
    • A61B 3/112: Objective instruments for examining the eyes, for measuring the diameter of pupils
    • A61B 3/113: Objective instruments for examining the eyes, for determining or recording eye movement
    • A61B 3/145: Arrangements specially adapted for eye photography by video means
    • A61B 5/7235: Signal processing specially adapted for physiological signals; details of waveform analysis
    • A61B 5/7264: Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems

Abstract

A fatigue measurement method and system based on eye movement data. The method comprises the following steps: an eye tracker is used to collect eye movement data while tasks are performed in the awake state and in the fatigue state respectively, and eye movement characteristics are calculated based on the eye movement data; the eye movement characteristics comprise the left-eye pupil diameter change rate, the right-eye pupil diameter change rate, the fixation duration, the average saccade velocity and the eyelid closure degree change rate; a fatigue measurement machine learning model is established and trained on the eye movement characteristics to obtain a trained fatigue measurement machine learning model; eye movement data of the user to be measured are then acquired, eye movement characteristics are calculated from them and input into the trained model to obtain the fatigue measurement result for that user.

Description

Fatigue measurement method and system based on eye movement data
Technical Field
The invention relates to the technical field of fatigue measurement, in particular to a fatigue measurement method and system based on eye movement data.
Background
Current approaches to quantifying subjective fatigue mainly fall into three categories: questionnaires, task-completion behavior, and physiological signals. The questionnaire method mainly uses a subjective self-report questionnaire or the Karolinska Sleepiness Scale to let the user actively score themselves. Its reliability is not high, and it is strongly influenced by the user's subjective psychology. The task-completion method divides fatigue levels according to the user's operating performance, and is strongly influenced by the environment and task difficulty. Fatigue detection methods based on physiological signals such as electroencephalogram, electrocardiogram and electromyogram involve complex data processing and demanding equipment requirements.
Because users answering questionnaires may be influenced by personal psychological cues and habitual biases, the obtained data may deviate from the user's real state. Methods relying on task-completion behavior may be affected by task difficulty and user proficiency. Methods based on electroencephalogram and electromyogram require attaching multiple complex devices to the user's body, which disturbs the user's normal task execution and lacks general applicability.
Disclosure of Invention
In view of the foregoing analysis, embodiments of the present invention are directed to providing a fatigue measurement method and system based on eye movement data, so as to solve the problems of inaccurate results and complex measurement processes in existing fatigue measurement.
In one aspect, an embodiment of the present invention provides a fatigue measurement method based on eye movement data, including the following steps:
an eye tracker is used to collect eye movement data while tasks are performed in the awake state and in the fatigue state respectively, and eye movement characteristics are calculated based on the eye movement data; the eye movement characteristics comprise the left-eye pupil diameter change rate, the right-eye pupil diameter change rate, the fixation duration, the average saccade velocity and the eyelid closure degree change rate;
establishing a fatigue measurement machine learning model, training the fatigue measurement machine learning model based on the eye movement characteristics, and obtaining a trained fatigue measurement machine learning model;
eye movement data of the user to be measured are acquired, eye movement characteristics are calculated based on the eye movement data and input into the trained fatigue measurement machine learning model, and the fatigue measurement result of the user to be measured is obtained.
In a further improvement of the above method, the eye movement data comprises: pupil diameter, eye movement type, and gaze coordinates; the eye movement types include fixation, saccade, and blink;
calculating an eye movement feature based on the eye movement data comprises:
according to the pupil diameter data, the pupil diameter change rate is calculated based on the formula

(P - P0) / P0

where P represents the subject's mean pupil diameter per unit time while performing the task, and P0 represents the subject's mean pupil diameter per unit time in the awake, task-free state;
the fixation duration, blink duration and saccade duration per unit time are calculated according to the eye movement type data, and the eyelid closure degree is calculated based on the mapping regression relationship between the eyelid closure degree and the fixation, blink and saccade durations; the eyelid closure degree change rate is then calculated according to the formula

(f - f0) / f0

where f represents the eyelid closure degree while the task is performed, and f0 represents the subject's mean eyelid closure degree in the awake, task-free state;
the average saccade velocity is calculated from the gaze coordinates.
Based on a further improvement of the above method, calculating the eyelid closure degree based on the mapping regression relationship between the eyelid closure degree and the fixation duration, blink duration and saccade duration comprises calculating the eyelid closure degree according to the following formulas:

f = (γ × blink + CLOS) / interval
interval = blink + fixation + saccade + CLOS

where γ represents the blink coefficient, blink represents the blink duration per unit time, fixation represents the fixation duration per unit time, saccade is the saccade duration per unit time, and CLOS represents the duration per unit time for which the eye tracker records no data.
In a further improvement of the above method, calculating the average saccade velocity from the gaze coordinates comprises:
calculating the angle between the gaze coordinates at two adjacent sampling instants within a unit time according to the following formula:

α_i = arccos( (x_(i-1)·x_i + y_(i-1)·y_i + z_(i-1)·z_i) / ( √(x_(i-1)² + y_(i-1)² + z_(i-1)²) · √(x_i² + y_i² + z_i²) ) )

where (x_(i-1), y_(i-1), z_(i-1)) denotes the gaze coordinate at sampling instant i-1, and (x_i, y_i, z_i) denotes the gaze coordinate at sampling instant i;
the saccade velocity between adjacent sampling instants is calculated according to the following formula:

V_i = α_i × freq

where freq represents the sampling frequency;
the average saccade velocity is obtained by averaging the saccade velocities over all pairs of adjacent sampling instants within the unit time.
Based on a further improvement of the above method, the blink coefficient is obtained as follows:
the eye aspect ratio when the eyes are closed and when they are open is obtained by an image recognition method, and the eye aspect ratio threshold for judging eyelid closure is calculated according to T = I1 + (I2 - I1) × (1 - β), where I1 and I2 are the mean eye aspect ratios when the eye is closed and open respectively, and β represents the eye-closure proportion;
the eyelid closure degree is obtained from the duration per unit time for which the eye aspect ratio is below the threshold, and the blink coefficient is obtained in advance by analyzing the correlation between the eyelid closure degree and the fixation duration, blink duration and saccade duration.
Based on a further improvement of the above method, the fatigue measurement machine learning model is constructed with a support vector machine; a radial basis kernel function is used as the kernel function, and a grid search method is used to determine the optimal combination of its penalty coefficient and kernel parameter.
In another aspect, an embodiment of the present invention provides a fatigue measurement system based on eye movement data, including the following modules:
the characteristic extraction module is used for respectively acquiring eye movement data of a task to be executed in a waking state and a fatigue state by adopting an eye movement instrument and calculating eye movement characteristics based on the eye movement data; the eye movement characteristics comprise a left eye pupil diameter change rate, a right eye pupil diameter change rate, a fixation time length, an average saccade speed and an eyelid closure degree change rate;
the model training module is used for establishing a fatigue measurement machine learning model, training the fatigue measurement machine learning model based on the eye movement characteristics and obtaining a trained fatigue measurement machine learning model;
the fatigue measurement module is used for acquiring eye movement data of the user to be measured, calculating eye movement characteristics based on the eye movement data, and inputting the eye movement characteristics into the trained fatigue measurement machine learning model to obtain the fatigue measurement result of the user to be measured.
Further, the eye movement data includes: pupil diameter, eye movement type, and gaze coordinates; the eye movement types include fixations, saccades, and blinks;
the feature extraction module calculating eye movement characteristics based on the eye movement data comprises:
according to the pupil diameter data, the pupil diameter change rate is calculated based on the formula

(P - P0) / P0

where P represents the subject's mean pupil diameter per unit time while performing the task, and P0 represents the subject's mean pupil diameter per unit time in the awake, task-free state;
the fixation duration, blink duration and saccade duration per unit time are calculated according to the eye movement type data, and the eyelid closure degree is calculated based on the mapping regression relationship between the eyelid closure degree and the fixation, blink and saccade durations; the eyelid closure degree change rate is then calculated according to the formula

(f - f0) / f0

where f represents the eyelid closure degree while the task is performed, and f0 represents the subject's mean eyelid closure degree in the awake, task-free state;
the average saccade velocity is calculated from the gaze coordinates.
Further, calculating the eyelid closure degree based on the mapping regression relationship between the eyelid closure degree and the fixation duration, blink duration and saccade duration comprises calculating the eyelid closure degree according to the following formulas:

f = (γ × blink + CLOS) / interval
interval = blink + fixation + saccade + CLOS

where γ represents the blink coefficient, blink represents the blink duration per unit time, fixation represents the fixation duration per unit time, saccade is the saccade duration per unit time, and CLOS represents the duration per unit time for which the eye tracker records no data.
Further, calculating the average saccade velocity from the gaze coordinates comprises:
calculating the angle between the gaze coordinates at two adjacent sampling instants within a unit time according to the following formula:

α_i = arccos( (x_(i-1)·x_i + y_(i-1)·y_i + z_(i-1)·z_i) / ( √(x_(i-1)² + y_(i-1)² + z_(i-1)²) · √(x_i² + y_i² + z_i²) ) )

where (x_(i-1), y_(i-1), z_(i-1)) denotes the gaze coordinate at sampling instant i-1, and (x_i, y_i, z_i) denotes the gaze coordinate at sampling instant i;
the saccade velocity between adjacent sampling instants is calculated according to the following formula:

V_i = α_i × freq

where freq represents the sampling frequency;
the average saccade velocity is obtained by averaging the saccade velocities over all pairs of adjacent sampling instants within the unit time.
Compared with the prior art, the invention can realize at least one of the following beneficial effects:
1. the eye movement data are collected by an eye tracker and the user's fatigue is measured based on these data; the equipment is simple and convenient to deploy and does not disturb the user's normal task execution, and the measurement result is not affected by the user's subjective feelings, the task difficulty or environmental factors, so the measurement is more accurate.
2. The left eye pupil diameter change rate, the right eye pupil diameter change rate, the fixation time, the saccade speed and the eyelid closure degree change rate are used as characteristics to train the fatigue measurement machine learning model, so that personal physiological factors and eye habit differences can be eliminated, and the model is more accurate.
3. Based on the mapping regression relationship, the eyelid closure degree is directly calculated according to the eye movement data, so that the problem of low calculation efficiency caused by the fact that the eyelid closure degree value can be obtained only by carrying out complex data processing on the video image is solved, the processing process is simplified, and the calculation efficiency is improved.
In the invention, the technical schemes can be combined with each other to realize more preferable combination schemes. Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the structure particularly pointed out in the written description and drawings.
Drawings
The drawings are only for purposes of illustrating particular embodiments and are not to be construed as limiting the invention, wherein like reference numerals are used to designate like parts throughout.
FIG. 1 is a flow chart of a method for measuring fatigue based on eye movement data according to an embodiment of the present invention;
FIG. 2 is a block diagram of a fatigue measurement system based on eye movement data according to an embodiment of the present invention;
FIG. 3 is a schematic view of an eye feature point according to an embodiment of the present invention;
FIG. 4 is a statistical chart of EAR data according to an embodiment of the present invention.
Detailed Description
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate preferred embodiments of the invention and together with the description, serve to explain the principles of the invention and not to limit the scope of the invention.
A specific embodiment of the present invention provides a fatigue measurement method based on eye movement data, as shown in fig. 1, including the following steps:
s1, respectively collecting eye movement data of the tasks to be executed in a waking state and a fatigue state by using an eye movement instrument, and calculating eye movement characteristics based on the eye movement data; the eye movement characteristics comprise a left eye pupil diameter change rate, a right eye pupil diameter change rate, a fixation time length, an average saccade speed and eyelid closure;
s2, establishing a fatigue measurement machine learning model, training the fatigue measurement machine learning model based on the eye movement characteristics, and obtaining a trained fatigue measurement machine learning model;
s3, obtaining eye movement data of the user to be measured, calculating eye movement characteristics based on the eye movement data, and inputting the eye movement characteristics into a trained fatigue measurement machine learning model to obtain a fatigue measurement result of the user to be identified.
In implementation, the eye tracker is a wearable eye tracker, for example a Tobii wearable eye tracker. A wearable eye tracker is convenient to wear and does not disturb the subject's normal task execution, so the measurement result is more accurate and the measurement process is simple and easy to implement.
The eye tracker collects eye movement data in the different states, eye movement characteristics are extracted to train the fatigue measurement machine learning model, and the eye movement characteristics of the user to be measured are then input into the trained model to obtain the measurement result. Because fatigue is measured from objective eye movement data, inaccurate results caused by the user's subjective psychology, the environment and task difficulty are avoided.
Specifically, the eye movement data collected by the wearable eye tracker includes: pupil diameter, eye movement type, and gaze coordinates; the eye movement types recorded by the eye tracker include fixation, saccade, blink, and unclassified.
The fixation duration is the total fixation time recorded by the eye tracker per unit time. As the degree of fatigue increases, the fixation duration becomes significantly longer, so it is used as a measurement feature.
A homogeneity-of-variance test and a one-way ANOVA were performed on 832,306 pupil diameter samples from the 11 subjects; the results are shown in Table 1 and Table 2. Pupil diameter is affected by individual differences, with significance P = 0.000 < 0.05, i.e. the pupil diameter differences between individuals are significant. The data of the 11 subjects, i.e. 11 groups, differ significantly in pupil diameter. If pupil diameter were used directly as an eye movement characteristic, fatigue could not be identified effectively.
TABLE 1 Homogeneity-of-variance test of pupil diameter data
TABLE 2 One-way ANOVA of pupil diameter data
Analysis of the pupil diameter data shows that, relative to the pupil diameter in the awake, task-free state, the rate of change of pupil diameter while performing the task differs significantly between the fatigue and awake states.
Specifically, according to the pupil diameter data, the pupil diameter change rate is calculated based on the formula

(P - P0) / P0

where P represents the subject's mean pupil diameter per unit time while performing the task, and P0 represents the subject's mean pupil diameter per unit time in the awake, task-free state.
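As a minimal sketch of this step (the function name and sample diameters are illustrative):

```python
import statistics

def pupil_change_rate(task_diameters, baseline_diameters):
    """(P - P0) / P0: mean pupil diameter during the task relative to
    the mean pupil diameter in the awake, task-free baseline."""
    p = statistics.mean(task_diameters)        # P
    p0 = statistics.mean(baseline_diameters)   # P0
    return (p - p0) / p0

# Hypothetical diameters in millimetres: baseline mean 4.0, task mean 4.4
rate = pupil_change_rate([4.3, 4.4, 4.5], [3.9, 4.0, 4.1])
```

The same function would be applied separately to the left-eye and right-eye diameter streams to obtain the two change-rate characteristics.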
Partial data are shown in Table 3, which lists 86,944 samples of left- and right-eye pupil change rate data for one selected subject in the awake and fatigue states; the collected data are labeled awake (0) or fatigued (1). Descriptive statistics in SPSS show that the mean pupil diameter change rates in the fatigue state are all larger than those in the awake state. A homogeneity-of-variance test and a one-way ANOVA were then performed. As shown in Tables 4 and 5, the significance of the SPSS analysis result is 0.000 < 0.05, i.e. significant. That is, the pupil diameter change rate differs significantly between the awake and fatigue states, so this index can be used as a feature vector for training the fatigue measurement machine model.
TABLE 3 Pupil change rate data
TABLE 4 Homogeneity-of-variance test of pupil change rate data
TABLE 5 One-way ANOVA of pupil change rate data
The saccade velocity of the eyes differs significantly between the fatigue state and the awake state, so it is used as an eye movement characteristic. Specifically, calculating the average saccade velocity from the gaze coordinates includes:
calculating the angle between the gaze coordinates at two adjacent sampling instants within a unit time according to the following formula:

α_i = arccos( (x_(i-1)·x_i + y_(i-1)·y_i + z_(i-1)·z_i) / ( √(x_(i-1)² + y_(i-1)² + z_(i-1)²) · √(x_i² + y_i² + z_i²) ) )

where (x_(i-1), y_(i-1), z_(i-1)) denotes the gaze coordinate at sampling instant i-1, and (x_i, y_i, z_i) denotes the gaze coordinate at sampling instant i;
the saccade velocity between adjacent sampling instants is calculated according to the following formula:

V_i = α_i × freq

where freq represents the sampling frequency;
the average saccade velocity is obtained by averaging the saccade velocities over all pairs of adjacent sampling instants within the unit time.
For example, if the unit time is 1 s and the sampling frequency is 100 Hz, there are 100 sampling instants per unit time, producing 100 samples. The saccade velocity for each pair of adjacent sampling instants is therefore calculated with the above formula, and all of these velocities are averaged to obtain the average saccade velocity per unit time.
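The saccade velocity computation just described can be sketched as follows (function name illustrative; gaze coordinates are taken as 3-D vectors from the eye tracker):

```python
import math

def average_saccade_velocity(coords, freq):
    """Mean saccade velocity in degrees/second from consecutive 3-D gaze
    coordinates sampled at freq Hz: the angle alpha_i between adjacent
    gaze vectors, times the sampling frequency, averaged over the window."""
    speeds = []
    for a, b in zip(coords, coords[1:]):
        dot = sum(u * v for u, v in zip(a, b))
        norm_a = math.sqrt(sum(u * u for u in a))
        norm_b = math.sqrt(sum(v * v for v in b))
        cos_angle = max(-1.0, min(1.0, dot / (norm_a * norm_b)))  # clamp rounding error
        alpha = math.degrees(math.acos(cos_angle))  # angle alpha_i between samples i-1 and i
        speeds.append(alpha * freq)                 # V_i = alpha_i * freq
    return sum(speeds) / len(speeds)

# Two orthogonal gaze vectors at 100 Hz: 90 degrees in 10 ms = 9000 deg/s
v = average_saccade_velocity([(1, 0, 0), (0, 1, 0)], 100)
```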
Eyelid closure degree (PERCLOS) is also a commonly used indicator of fatigue. It determines the degree of fatigue from the ratio of eye-closure time to a given period, where closure is the state in which the pupil is covered by the eyelids. The PERCLOS method has three criteria, P70, P80 and EM, which respectively denote the proportion of time the pupil is longitudinally covered by the eyelid by at least 70%, the proportion of time it is covered by at least 80%, and the mean-square eyelid closure rate. Studies have shown that P80 correlates best with the degree of fatigue.
Currently, the eyelid closure degree is usually calculated by means of image recognition. Because calculating it requires a series of complex processing steps on the video images, the procedure is complicated and the calculation efficiency is low. The embodiment of the invention instead calculates the eyelid closure degree from eye movement data, based on the mapping relationship between the eye movement data and the eyelid closure degree, which simplifies processing and improves calculation efficiency.
Specifically, the fixation duration, blink duration and saccade duration per unit time are calculated according to the eye movement type data, and the eyelid closure degree is calculated based on the mapping regression relationship between the eyelid closure degree and these three durations.
Specifically, the eyelid closure degree is calculated using the following formulas:

f = (γ × blink + CLOS) / interval
interval = blink + fixation + saccade + CLOS

where γ represents the blink coefficient, blink represents the blink duration per unit time, fixation represents the fixation duration per unit time, saccade is the saccade duration per unit time, and CLOS represents the duration per unit time for which the eye tracker records no data.
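The eyelid closure degree computation above can be sketched as follows, under the assumption (the original equation image is not reproduced in this text) that the closure fraction combines the γ-weighted blink duration with the tracker-dropout duration CLOS over the interval:

```python
def eyelid_closure(blink, fixation, saccade, clos, gamma):
    """Assumed PERCLOS-style closure per unit time:
    f = (gamma * blink + clos) / interval, with
    interval = blink + fixation + saccade + clos (all in seconds)."""
    interval = blink + fixation + saccade + clos
    return (gamma * blink + clos) / interval

# Hypothetical one-second window: 0.2 s blink, 0.6 s fixation,
# 0.1 s saccade, 0.1 s with no tracker data, blink coefficient 0.5
f = eyelid_closure(0.2, 0.6, 0.1, 0.1, 0.5)
```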
In practice, the blink coefficient in the above formula is obtained in advance through the following steps:
S11, based on the eye video data, the eye aspect ratio (EAR) when the eyes are closed and when they are open is obtained by an image recognition method.
In implementation, eye video data and eye movement data of the 11 subjects over the same time period were collected through the eye tracker.
As shown in fig. 3, eye feature data are obtained based on facial feature recognition, and 6 feature points of the eye are then extracted from the data; the eye aspect ratio is calculated as

EAR = (|P2 - P6| + |P3 - P5|) / (2 × |P1 - P4|)

where P1-P6 denote the coordinates of the six feature points in fig. 3. A total of 123,172 EAR samples were collected.
S12, the eye aspect ratio threshold for judging eyelid closure is calculated according to T = I1 + (I2 - I1) × (1 - β), where I1 and I2 are the mean eye aspect ratios when the eye is closed and open respectively, and β represents the eye-closure proportion. In practice, since criterion P80 is adopted, β = 0.8.
In practice, EAR data obtained by experiment when the operator opens and closes his eyes are shown in fig. 4.
As can be seen, the peak of the EAR distribution with the eyes closed lies in [0.13, 0.18] and the peak with the eyes open lies in [0.28, 0.38]; the midpoints of the two intervals are selected as the values of I1 and I2:

I1 = (0.13 + 0.18) / 2 = 0.155
I2 = (0.28 + 0.38) / 2 = 0.33
T = 0.155 + (0.33 - 0.155) × (1 - 0.8) = 0.19
That is, the EAR threshold is T = 0.19; when the EAR value is less than 0.19, the eyelid is considered closed.
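These EAR and threshold computations can be sketched as follows; the EAR formula used is the standard six-landmark eye aspect ratio, which the feature points of fig. 3 are assumed to follow:

```python
import math

def eye_aspect_ratio(p1, p2, p3, p4, p5, p6):
    """Standard eye aspect ratio over six landmarks:
    EAR = (|P2-P6| + |P3-P5|) / (2 * |P1-P4|)."""
    return (math.dist(p2, p6) + math.dist(p3, p5)) / (2 * math.dist(p1, p4))

def ear_threshold(i1, i2, beta):
    """T = I1 + (I2 - I1) * (1 - beta); beta = 0.8 under the P80 criterion."""
    return i1 + (i2 - i1) * (1 - beta)

# Midpoints of the measured distribution peaks give I1 = 0.155, I2 = 0.33
t = ear_threshold(0.155, 0.33, 0.8)   # about 0.19: EAR below this counts as closed
```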
S13, the eyelid closure degree is obtained from the duration per unit time for which the eye aspect ratio is below the threshold, and the blink coefficient is obtained by analyzing the correlation between the eyelid closure degree and the fixation duration, blink duration and saccade duration.
The fixation duration, blink duration and saccade duration per unit time are calculated from the collected eye movement data of the user, and a correlation analysis between the eyelid closure degree and these durations yields the blink coefficient. The specific correlation analysis method is prior art and is not described again here.
Once the blink coefficient has been obtained, the eyelid closure degree can be calculated solely from the eye movement data measured by the eye tracker, without any video image processing; the method is simple and the measurement efficiency is improved.
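One concrete way to obtain the blink coefficient from such paired data is an ordinary least-squares fit, sketched below with hypothetical calibration samples (the mapping form f = (γ × blink + CLOS) / interval is an assumption here, not the patent's stated regression):

```python
def fit_blink_coefficient(samples):
    """Least-squares fit of gamma in the assumed mapping
    f = (gamma * blink + clos) / interval.
    Each sample is (f, blink, clos, interval): f from image-based
    PERCLOS, the rest from the eye tracker over the same unit time."""
    num = den = 0.0
    for f, blink, clos, interval in samples:
        target = f * interval - clos   # the part gamma * blink must explain
        num += blink * target
        den += blink * blink
    return num / den

# Hypothetical calibration data, exactly consistent with gamma = 0.5
samples = [(0.175, 0.15, 0.10, 1.0),
           (0.225, 0.25, 0.10, 1.0),
           (0.070, 0.10, 0.02, 1.0)]
gamma = fit_blink_coefficient(samples)
```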
PERCLOS is the proportion of eyelid-closure time per unit time; however, owing to personal physiological factors and differences in eye habits, the PERCLOS value of different individuals performing the same task may vary. For example, some people habitually blink in a non-fatigued state, while others habitually squint when observing. Therefore, a homogeneity-of-variance test and a one-way ANOVA were performed on the subjects' PERCLOS values, as shown in Tables 6 and 7.
TABLE 6 Homogeneity-of-variance test of PERCLOS values
TABLE 7 PERCLOS ANOVA test
After analysis of variance on the 758 PERCLOS values of the 11 subjects, the significance was P = 0.003 < 0.05, indicating that the PERCLOS values of different individuals differ significantly. Therefore, this application takes individual differences into account in the index processing and replaces the original absolute PERCLOS index with the individual's PERCLOS rate of change.
The formula for this rate of change is as follows:
Δf = (f − f₀) / f₀
wherein f is the eyelid closure degree while the task is being performed, and f₀, the mean eyelid closure degree of the subject in the awake, task-free state, is the reference value.
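Both the eyelid-closure and pupil-diameter indices use the same baseline normalisation, which a one-line helper makes explicit:

```python
def rate_of_change(value, baseline):
    """Relative deviation from the subject's awake, task-free baseline:
    (f - f0) / f0 for eyelid closure, (P - P0) / P0 for pupil diameter."""
    return (value - baseline) / baseline

# e.g. eyelid closure rising from a baseline of 0.08 to 0.12 on task
delta_f = rate_of_change(0.12, 0.08)  # ≈ 0.5
```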
Through the above analysis of the eye movement data, the left-eye pupil diameter change rate, right-eye pupil diameter change rate, fixation duration, average saccade velocity, and eyelid closure change rate are selected as the eye movement feature data for training the machine learning fatigue measurement model.
Specifically, the fatigue measurement machine learning model is constructed using a support vector machine; a radial basis function (RBF) kernel is adopted as the kernel function, and a grid search method is used to determine the optimal combination of its penalty coefficient and kernel parameter.
A support vector machine is a supervised machine learning algorithm that is often applied to binary classification problems. It maps the data to a high-dimensional space by means of a kernel function; illustratively, a radial basis function kernel is taken as the kernel function, and a grid search method determines the optimal combination of its penalty coefficient and kernel parameter.
Let i denote the index of a data sample, and let x_i denote the feature data of the i-th sample, namely the left-eye pupil diameter change rate, right-eye pupil diameter change rate, fixation duration, average saccade velocity, and eyelid closure change rate. Let y denote the fatigue state, where y_i = 0 represents the awake state and y_i = 1 represents fatigue. The divided pairs (x_i, y_i) form the training set, on which a support vector machine model is trained to obtain the fatigue measurement model; the specific procedure for training a support vector machine is prior art and is not described in detail here.
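The training step can be sketched with scikit-learn (a library choice assumed here; the patent does not name one). The random features below merely stand in for the five real eye movement features, with labels 0 = awake and 1 = fatigued; the grid search over C and gamma follows the RBF-kernel setup described above.

```python
import numpy as np
from sklearn.model_selection import GridSearchCV
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

rng = np.random.default_rng(0)
X = rng.normal(size=(80, 5))             # placeholder 5-feature vectors x_i
y = (X[:, 0] + X[:, 4] > 0).astype(int)  # placeholder labels y_i in {0, 1}

# RBF-kernel SVM; grid search selects the penalty coefficient C and
# the kernel parameter gamma, as described in the text.
search = GridSearchCV(
    make_pipeline(StandardScaler(), SVC(kernel="rbf")),
    param_grid={"svc__C": [0.1, 1, 10], "svc__gamma": [0.01, 0.1, 1]},
    cv=5,
)
search.fit(X, y)
model = search.best_estimator_           # trained fatigue measurement model
```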
After the eye movement data of the user to be measured are obtained, the eye movement features are extracted according to the method of step S1 and input into the trained fatigue measurement model to obtain the measurement result.
One specific embodiment of the present invention provides a fatigue measurement system based on eye movement data, as shown in fig. 2, comprising the following modules:
the characteristic extraction module is used for respectively acquiring eye movement data of a task to be executed in a waking state and a fatigue state by adopting an eye movement instrument and calculating eye movement characteristics based on the eye movement data; the eye movement characteristics comprise a left eye pupil diameter change rate, a right eye pupil diameter change rate, a fixation time length, an average saccade speed and an eyelid closure degree change rate;
the model training module is used for establishing a fatigue measurement machine learning model, training the fatigue measurement machine learning model based on the eye movement characteristics and obtaining a trained fatigue measurement machine learning model;
the fatigue measurement module is used for acquiring eye movement data of a user to be measured, calculating eye movement characteristics based on the eye movement data, and inputting the eye movement characteristics into the trained fatigue measurement machine learning model to obtain a fatigue measurement result of the user to be measured.
Preferably, the eye movement data includes: pupil diameter, eye movement type, and gaze coordinates; the eye movement types include fixations, saccades, and blinks;
the feature extraction module calculating eye movement characteristics based on the eye movement data comprises:
according to the pupil diameter data, based on the formula
ΔP = (P − P₀) / P₀
calculating the pupil diameter change rate, where P represents the mean pupil diameter per unit time while performing the task, and P₀ represents the mean pupil diameter per unit time of the subject in the awake, task-free state;
calculating the fixation duration, the blink duration and the saccade duration in unit time according to the eye movement type data, and calculating the eyelid closure degree based on the mapping regression relationship between the eyelid closure degree and the fixation duration, the blink duration and the saccade duration; according to the formula
Δf = (f − f₀) / f₀
calculating the eyelid closure change rate, wherein f represents the eyelid closure degree while performing the task, and f₀ represents the mean eyelid closure degree of the subject in the awake, task-free state;
the average saccade velocity is calculated from the gaze coordinates.
Preferably, calculating eyelid closure, based on the mapped regression relationship of eyelid closure with gaze duration, blink duration, and saccade duration, comprises calculating eyelid closure according to the following formula:
f = (γ × blink + CLOS) / interval
interval=blink+fixation+saccade+CLOS
wherein γ denotes the blink coefficient, blink denotes the blink duration per unit time, fixation denotes the fixation duration per unit time, saccade denotes the saccade duration per unit time, and CLOS denotes the duration per unit time during which the eye tracker records no data.
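As a sketch under the assumption that blinks count as closed for a fraction γ of their duration and the eye tracker's no-data gaps (CLOS) count as fully closed, the mapping can be written as:

```python
def eyelid_closure(blink, fixation, saccade, clos, gamma):
    """Assumed reading of the mapping: blinks contribute gamma * blink
    of closed time, the no-data gaps (clos) are taken as fully closed,
    and interval is the whole unit time."""
    interval = blink + fixation + saccade + clos
    return (gamma * blink + clos) / interval

# durations in seconds within a 10 s window; gamma from the correlation step
f = eyelid_closure(blink=1.0, fixation=7.0, saccade=1.5, clos=0.5, gamma=0.6)
```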
Preferably, calculating the saccade velocity from the gaze coordinates comprises:
calculating the included angle between the gaze coordinates at two adjacent sampling moments according to the following formula:
α_i = arccos( (x_{i−1}·x_i + y_{i−1}·y_i + z_{i−1}·z_i) / ( √(x_{i−1}² + y_{i−1}² + z_{i−1}²) × √(x_i² + y_i² + z_i²) ) )
where (x_{i−1}, y_{i−1}, z_{i−1}) denotes the gaze coordinate at time i−1, and (x_i, y_i, z_i) denotes the gaze coordinate at time i;
the saccade velocities for adjacent sample instants are calculated according to the following formula:
V_i = α_i × freq
wherein freq represents the sampling frequency;
the average saccade is obtained by averaging the saccades for all adjacent sample times per unit time.
The method embodiment and the system embodiment are based on the same principle, and related parts can be referenced mutually, and the same technical effect can be achieved. For a specific implementation process, reference is made to the foregoing embodiments, which are not described herein again.
Those skilled in the art will appreciate that all or part of the flow of the method in the above embodiments may be implemented by a computer program that instructs related hardware and is stored in a computer-readable storage medium. The computer-readable storage medium may be a magnetic disk, an optical disk, a read-only memory, or a random access memory.
The above description is only for the preferred embodiment of the present invention, but the scope of the present invention is not limited thereto, and any changes or substitutions that can be easily conceived by those skilled in the art within the technical scope of the present invention are included in the scope of the present invention.

Claims (10)

1. A fatigue measurement method based on eye movement data is characterized by comprising the following steps:
the method comprises the steps that an eye movement instrument is adopted to collect eye movement data of tasks to be executed in a waking state and a fatigue state respectively, and eye movement characteristics are calculated based on the eye movement data; the eye movement characteristics comprise a left eye pupil diameter change rate, a right eye pupil diameter change rate, a fixation time length, an average saccade speed and an eyelid closure degree change rate;
establishing a fatigue measurement machine learning model, training the fatigue measurement machine learning model based on the eye movement characteristics, and obtaining a trained fatigue measurement machine learning model;
obtaining eye movement data of a user to be measured, calculating eye movement characteristics based on the eye movement data, and inputting the eye movement characteristics into the trained fatigue measurement machine learning model to obtain a fatigue measurement result of the user to be measured.
2. The eye movement data-based fatigue measurement method according to claim 1, wherein the eye movement data comprises: pupil diameter, eye movement type, and gaze coordinates; the eye movement types include gaze, saccade, and blink;
calculating an eye movement feature based on the eye movement data comprises:
according to the pupil diameter data, based on the formula
ΔP = (P − P₀) / P₀
calculating the pupil diameter change rate, where P represents the mean pupil diameter per unit time of the subject performing the task, and P₀ represents the mean pupil diameter per unit time of the subject in the awake, task-free state;
calculating the fixation duration, blink duration, and saccade duration per unit time according to the eye movement type data, and calculating the eyelid closure degree based on the mapping regression relationship between the eyelid closure degree and the fixation duration, blink duration, and saccade duration; according to the formula
Δf = (f − f₀) / f₀
calculating the eyelid closure change rate, wherein f represents the eyelid closure degree while performing the task, and f₀ represents the mean eyelid closure degree of the subject in the awake, task-free state;
the average saccade velocity is calculated from the gaze coordinates.
3. The eye movement data based fatigue measurement method of claim 2, wherein calculating eyelid closure based on a mapped regression relationship of eyelid closure with gaze duration, blink duration, and saccade duration comprises: eyelid closure was calculated according to the following formula:
f = (γ × blink + CLOS) / interval
interval=blink+fixation+saccade+CLOS
wherein γ denotes the blink coefficient, blink denotes the blink duration per unit time, fixation denotes the fixation duration per unit time, saccade denotes the saccade duration per unit time, and CLOS denotes the duration per unit time during which the eye tracker records no data.
4. The method of eye movement data based fatigue measurement according to claim 2, wherein calculating an average saccade velocity from the gaze coordinates comprises:
calculating the included angle of the gazing coordinates of two adjacent sampling moments in unit time according to the following formula:
α_i = arccos( (x_{i−1}·x_i + y_{i−1}·y_i + z_{i−1}·z_i) / ( √(x_{i−1}² + y_{i−1}² + z_{i−1}²) × √(x_i² + y_i² + z_i²) ) )
where (x_{i−1}, y_{i−1}, z_{i−1}) denotes the gaze coordinate at sampling moment i−1, and (x_i, y_i, z_i) denotes the gaze coordinate at sampling moment i;
the saccade velocities at adjacent sample instants are calculated according to the following formula:
V_i = α_i × freq
wherein freq represents the sampling frequency;
the average saccade is obtained by averaging the saccades for all adjacent sample times per unit time.
5. The eye movement data-based fatigue measurement method according to claim 3, wherein the blink coefficient is obtained by:
the aspect ratio of the eyes when the eyes are closed and opened is respectively obtained by adopting an image recognition method, and the aspect ratio is determined according to the ratio of T to I 1 +(I 2 -I 1 ) X (1- β) calculating a threshold value of eye aspect ratio for judging eyelid closure; wherein I 1 、I 2 The average value of the aspect ratio of the eyes when the eyes are closed and opened respectively, and beta represents the index of the eye closing area;
and obtaining the eyelid closure degree from the duration for which the eye aspect ratio stays below the threshold per unit time, and performing correlation analysis between the eyelid closure degree and the fixation duration, blink duration, and saccade duration to obtain the blink coefficient.
6. The eye movement data-based fatigue measurement method according to claim 1, wherein the fatigue measurement machine learning model is constructed using a support vector machine with a radial basis function kernel; and a grid search method is used to determine the optimal combination of the penalty coefficient and the kernel parameter of the radial basis function kernel.
7. A fatigue measurement system based on eye movement data is characterized by comprising the following modules:
the characteristic extraction module is used for respectively acquiring eye movement data of a task to be executed in a waking state and a fatigue state by adopting an eye movement instrument and calculating eye movement characteristics based on the eye movement data; the eye movement characteristics comprise a left eye pupil diameter change rate, a right eye pupil diameter change rate, a fixation time length, an average saccade speed and an eyelid closure degree change rate;
the model training module is used for establishing a fatigue measurement machine learning model, training the fatigue measurement machine learning model based on the eye movement characteristics and obtaining a trained fatigue measurement machine learning model;
the fatigue measurement module is used for acquiring eye movement data of a user to be measured, calculating eye movement characteristics based on the eye movement data, and inputting the eye movement characteristics into the trained fatigue measurement machine learning model to obtain a fatigue measurement result of the user to be measured.
8. The eye movement data based fatigue measurement system of claim 7, wherein the eye movement data comprises: pupil diameter, eye movement type, and gaze coordinates; the eye movement types include gaze, saccade, and blink;
the data acquisition module calculating an eye movement characteristic based on the eye movement data comprises:
according to the pupil diameter data, based on the formula
ΔP = (P − P₀) / P₀
calculating the pupil diameter change rate, where P represents the mean pupil diameter per unit time while performing the task, and P₀ represents the mean pupil diameter per unit time of the subject in the awake, task-free state;
calculating the fixation duration, the blink duration and the saccade duration in unit time according to the eye movement type data, and calculating the eyelid closure degree based on the mapping regression relationship between the eyelid closure degree and the fixation duration, the blink duration and the saccade duration; according to the formula
Δf = (f − f₀) / f₀
calculating the eyelid closure change rate, wherein f represents the eyelid closure degree while performing the task, and f₀ represents the mean eyelid closure degree of the subject in the awake, task-free state;
the saccade velocity is calculated from the gaze coordinates.
9. The eye movement data based fatigue measurement system of claim 8, wherein calculating eyelid closure based on a mapped regression relationship of eyelid closure with gaze duration, blink duration, and saccade duration comprises calculating eyelid closure according to the following formula:
f = (γ × blink + CLOS) / interval
interval=blink+fixation+saccade+CLOS
wherein γ denotes the blink coefficient, blink denotes the blink duration per unit time, fixation denotes the fixation duration per unit time, saccade denotes the saccade duration per unit time, and CLOS denotes the duration per unit time during which the eye tracker records no data.
10. The eye movement data-based fatigue measurement system of claim 8, wherein calculating the average saccade velocity from the gaze coordinates comprises:
calculating the included angle between the gaze coordinates at two adjacent sampling moments according to the following formula:
α_i = arccos( (x_{i−1}·x_i + y_{i−1}·y_i + z_{i−1}·z_i) / ( √(x_{i−1}² + y_{i−1}² + z_{i−1}²) × √(x_i² + y_i² + z_i²) ) )
where (x_{i−1}, y_{i−1}, z_{i−1}) denotes the gaze coordinate at time i−1, and (x_i, y_i, z_i) denotes the gaze coordinate at time i;
the saccade velocities at adjacent sample instants are calculated according to the following formula:
V_i = α_i × freq
wherein freq represents the sampling frequency;
the average saccade is obtained by averaging the saccades for all adjacent sample times per unit time.
CN202210720565.1A 2022-06-23 2022-06-23 Fatigue measurement method and system based on eye movement data Pending CN115089181A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202210720565.1A CN115089181A (en) 2022-06-23 2022-06-23 Fatigue measurement method and system based on eye movement data


Publications (1)

Publication Number Publication Date
CN115089181A true CN115089181A (en) 2022-09-23

Family

ID=83293561

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202210720565.1A Pending CN115089181A (en) 2022-06-23 2022-06-23 Fatigue measurement method and system based on eye movement data

Country Status (1)

Country Link
CN (1) CN115089181A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN115272645A (en) * 2022-09-29 2022-11-01 北京鹰瞳科技发展股份有限公司 Multi-mode data acquisition equipment and method for training central fatigue detection model


Similar Documents

Publication Publication Date Title
US20230195222A1 Methods and Systems for Obtaining, Aggregating, and Analyzing Vision Data to Assess a Person's Vision Performance
EP0634031B1 (en) Apparatus and method for eye tracking interface
US5517021A (en) Apparatus and method for eye tracking interface
EP3617815B1 (en) Work support device, work support method, and work support program
EP1799105B1 (en) System and method for mental workload measurement based on rapid eye movement
US5447166A Neurocognitive adaptive computer interface method and system based on on-line measurement of the user's mental effort
CN110600103B (en) Wearable intelligent service system for improving eyesight
CN112259237B (en) Depression evaluation system based on multi-emotion stimulus and multi-stage classification model
CN108186030B (en) Stimulation information providing device and cognitive index analysis method for potential value test
CN108922629A (en) The screening and its application of brain function corelation behaviour normal form index
Koçer et al. Nintendo Wii assessment of Hoehn and Yahr score with Parkinson's disease tremor
Lu et al. A dual model approach to EOG-based human activity recognition
CN115089181A (en) Fatigue measurement method and system based on eye movement data
US20210236023A1 Technology adapted to enable improved collection of involuntary eyelid movement parameters, including collection of eyelid movement parameters to support analysis of neurological factors
CN113440151A (en) Concentration detection system, detection method and use method of system
CN115444422A (en) Eye movement data-based real environment psychological load assessment method and system
CN113907757B (en) Alertness testing method based on attention system theory
CN113974589B (en) Multi-modal behavior paradigm evaluation optimization system and cognitive ability evaluation method
Caryn et al. Driver drowsiness detection based on drivers’ physical behaviours: a systematic literature review
CN114869272A (en) Posture tremor detection model, posture tremor detection algorithm, and posture tremor detection apparatus
KR20220158957A (en) System for prediction personal propensity using eye tracking and real-time facial expression analysis and method thereof
WO2019227690A1 (en) Screening of behavioral paradigm indicators and application thereof
Tébar Saiz Analysis of physiological responses to emotional stimuli
Ohashi et al. Human-error-potential Estimation based on Wearable Biometric Sensors
Papanikolaou et al. Predicting ADHD in Children Using Eye-Tracking Data and Mathematical Modeling

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination