CN111985351B - Eye movement-based fatigue detection method - Google Patents


Info

Publication number: CN111985351B (granted); application CN202010750681.9A; other version CN111985351A
Authority: CN (China)
Prior art keywords: blink, sequence, eye movement, time length, fatigue
Legal status: Active (the legal status is an assumption and is not a legal conclusion; Google has not performed a legal analysis and makes no representation as to its accuracy)
Inventors: 陈飞燕, 吴云鹰, 徐天勇
Original and current assignee: Zhejiang University (ZJU)
Application filed by Zhejiang University (ZJU); priority to CN202010750681.9A

Classifications

    • G06V 20/597 — Physics; Computing; Image or video recognition or understanding; Scenes; context of the image inside a vehicle; recognising the driver's state or behaviour, e.g. attention or drowsiness
    • G06V 40/193 — Recognition of biometric, human-related or animal-related patterns; eye characteristics, e.g. of the iris; preprocessing; feature extraction
    • G06V 40/197 — Recognition of biometric, human-related or animal-related patterns; eye characteristics, e.g. of the iris; matching; classification


Abstract

The invention discloses an eye-movement-based fatigue detection method comprising: S1, establishing an initial classification model covering two states, wakefulness and fatigue; S2, collecting eye movement data, recording eye movement characteristics including a blink event sequence with a high-speed camera; S3, constructing new eye movement sequences by converting the collected blink event sequence into a blink frequency input sequence, a blink average time length sequence, and a blink time length sequence; S4, extracting eye movement features by applying a Fourier transform to each of the three sequences; and S5, outputting a result, in which the initial classification model judges the resulting indices and outputs a first result of either the waking or the fatigue state. The method extracts data features from the eyes, and these features reflect brain fatigue more directly than other surface features, so the expression of fatigue is more accurate and sensitive.

Description

Eye movement-based fatigue detection method
Technical Field
The invention relates to the technical field of fatigue detection, in particular to a fatigue detection method based on eye movement.
Background
With the pace of life becoming ever faster, people often enter a fatigue state unconsciously and continue working in it, which creates great safety risks. In particular, as living standards have risen in recent years, car ownership has grown in every country, and traffic accidents have increased with it. Research shows that fatigued driving is one of the important causes of increasingly serious traffic accidents, so accurately detecting driver fatigue in real time and providing early warning has very important practical significance.
It is generally considered that blink duration becomes longer under fatigue, so existing detection methods use the eye-closure period during blinking as the index of whether fatigue is present.
However, it has been found that when a person in a working state passes from wakefulness into fatigue, blink duration increases only slightly as working time increases. After the fatigue state is entered, blink duration actually decreases over time relative to the awake state, while blink frequency increases. Lengthened blinks therefore tend to appear only in the early stage of fatigue and are not a distinctive feature; blink duration alone cannot completely and accurately express fatigue, and whether the fatigue state has been entered cannot be detected accurately from it.
Disclosure of Invention
To address the defects of the prior art, the invention provides an eye-movement-based fatigue detection method that expresses fatigue more accurately and sensitively.
In order to solve the technical problems, the technical scheme of the invention is as follows:
a fatigue detection method based on eye movement comprises the steps of
S1, establishing an initial classification model, including two states of wakefulness and fatigue;
s2, collecting eye movement data, recording eye movement characteristics including a blink event sequence through a high-speed camera,
s3, constructing a new eye movement sequence, converting the collected blink event sequence, respectively establishing a blink frequency input sequence, a blink average time length sequence and a blink time length sequence,
wherein, the blink frequency input sequence constructs a sampling interval of 4-6s, counts the blink frequency to form the blink frequency input sequence,
wherein the blink average time length sequence defines a sampling interval of 4-6s, the total blink time length in the sampling interval is counted to form the blink average time length sequence,
the blink time length sequence defines a sampling interval of 4-6s, and the average blink time length in the sampling interval is counted to form a blink time length sequence;
s4, extracting eye movement characteristics, and performing Fourier transform on the blink frequency input sequence, the blink average time length sequence and the blink time length sequence respectively;
and S5, outputting a result, judging the index by the initial classification model, and outputting a first result of the waking state and the fatigue state.
Preferably, in step S4, after the blink frequency input sequence, the blink average time length sequence, and the blink time length sequence are respectively fourier-transformed, the energy sum of each frequency point in the fourier-transformed low frequency range is calculated.
Preferably, in step S4, after the blink frequency input sequence, the blink average time length sequence, and the blink time length sequence are respectively Fourier-transformed, the sum of the amplitudes of the frequency points in the Fourier-transformed low-frequency range is calculated.
Preferably, in step S4, the low frequency range after fourier transform of the blink frequency input sequence, the blink average time length sequence, and the blink time length sequence is 0 to 0.03 Hz.
Preferably, in step S4, the blink feature extraction after wavelet transform:
An equal-band multi-scale wavelet transform is performed on the blink frequency sequence and the blink time length sequence to establish a time series for each of multiple frequency bands, each series representing the information change within its band; the energy of each band is then computed to obtain an energy value per band, and the maximum value within a preset band range is used to represent the main characteristic of that range.
Preferably, in step S4, the preset frequency band is a low frequency range of 0 to 0.03 Hz.
Preferably, in step S2 the collected eye movement data further includes a gaze event sequence, and in step S3 the construction of new eye movement sequences further converts the collected gaze event sequence into a gaze spatial position sequence.
Preferably, in step S4, for eye movement feature extraction, the distribution range of the gaze region is counted on the basis of the recorded gaze spatial position sequence to obtain the minimum region range containing 70%-90% of the gaze points, recorded as the diameter of the minimum circumscribed circle containing 70%-90% of the gaze points; the gaze point coordinates in the gaze event sequence are extracted to form a new spatial position sequence, the distribution of this sequence in two-dimensional space is counted, and a gaze-region heat map is output.
Preferably, in step S5, for result output, the gaze-region heat map obtained from the minimum gaze-point region range is compared with the minimum index region of the gaze range in the original classification model, and a second result of either the waking or the fatigue state is output.
Preferably, in step S5, the first result and the second result are combined to output a result of the final wakeful state and the fatigue state.
The invention has the following characteristics and beneficial effects:
1. data features in the eyes are extracted, and features more reflective of brain fatigue than other surface features. Therefore, the expression of fatigue is more accurate and sensitive.
2. The generality of the rule is detected, the rule being independent of the task being performed by the subject. Any application may be used as long as the subject is performing a task, a specific task.
3. The brain activity characteristics are expressed more internally, so that the method does not need more machine learning, can be used for real-time detection, has low requirement on the performance of detection equipment, and can be popularized and applied.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions in the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and for those skilled in the art, other drawings can be obtained according to these drawings without creative efforts.
FIG. 1 is a schematic diagram of a mental rotation eye movement experiment according to an embodiment of the present invention;
FIG. 2 is a waveform illustrating eye movement characteristics analysis during an awake state according to an embodiment of the present invention;
FIG. 3 is a waveform of eye movement characteristic analysis in a fatigue state according to an embodiment of the present invention;
FIG. 4 is a waveform diagram of wavelet analysis of eye movement characteristics in an awake state according to an embodiment of the present invention;
FIG. 5 is a wavelet analysis waveform of eye movement characteristics in fatigue state according to an embodiment of the present invention;
FIG. 6 is a gaze heat-map analysis diagram in the awake state in an embodiment of the present invention;
FIG. 7 is a gaze heat-map analysis diagram in the fatigue state in an embodiment of the present invention;
FIG. 8 is a distribution diagram of the saccade region in an embodiment of the invention.
Detailed Description
The following further describes embodiments of the present invention with reference to the drawings. It should be noted that the description of the embodiments is provided to help understanding of the present invention, but the present invention is not limited thereto. In addition, the technical features in the embodiments of the present invention described below may be combined with each other as long as they do not conflict with each other.
Example 1
The invention provides a fatigue detection method based on eye movement, which comprises the following steps
S1, establishing an initial classification model which comprises two states of wakefulness and fatigue;
It is understood that the initial classification model is established by performing a mental rotation eye movement experiment and an event scenario experiment on the subject.
Specifically, the mental rotation eye movement experiment, as shown in FIG. 1, uses a 2 (figure-pair type) × 5 (rotation angle) × 2 (fatigue state) repeated-measures design. One factor is the figure-pair type, divided into two levels: plane pairs (the same figure rotated in the plane by a certain angle) and mirror pairs (two mirror-image figures). Another factor is the rotation angle of the figure, divided into 0°, 45°, 90°, 135°, and 180° clockwise. The fatigue-state factor is divided into the waking state and the fatigue state.
The event scenario experiment is specifically a driving-scene simulation experiment in which a 12-minute driving-record video is presented continuously, thereby yielding a more accurate original classification model.
S2, collecting eye movement data, recording eye movement characteristics including a blink event sequence through a high-speed camera,
s3, constructing a new eye movement sequence, as shown in figures 2 and 3, converting the collected blink event sequence, respectively establishing a blink frequency input sequence, a blink average time length sequence and a blink time length sequence,
The blink frequency input sequence is constructed with a sampling interval of 4 s by counting the number of blinks in each interval; FIG. 2-A1 shows the blink frequency input sequence in the awake state and FIG. 3-A1 the sequence in the fatigue state.
The blink average time length sequence defines a sampling interval of 4 s and counts the total blink time length in each interval; FIG. 2-B1 shows the blink average time length sequence in the awake state and FIG. 3-B1 the sequence in the fatigue state.
The blink time length sequence defines a sampling interval of 4 s and counts the average blink time length in each interval; FIG. 2-C1 shows the blink time length sequence in the awake state and FIG. 3-C1 the sequence in the fatigue state.
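The sequence construction above can be sketched as follows — a minimal illustration assuming blink events arrive as (start time, duration) pairs in seconds; the function name and data layout are hypothetical, not from the patent. The three outputs map to the patent's blink frequency input sequence (counts), blink average time length sequence (total duration per interval, as the description defines it), and blink time length sequence (mean duration per interval):

```python
import numpy as np

def build_blink_sequences(blink_events, total_time, interval=4.0):
    """Bin blink events into fixed sampling intervals (4 s here, per Example 1).

    blink_events: list of (start_s, duration_s) pairs -- a hypothetical layout.
    Returns three per-interval sequences: blink count, total blink duration,
    and mean blink duration.
    """
    n_bins = int(np.ceil(total_time / interval))
    counts = np.zeros(n_bins)
    totals = np.zeros(n_bins)
    for start, dur in blink_events:
        b = min(int(start // interval), n_bins - 1)  # interval holding the blink onset
        counts[b] += 1
        totals[b] += dur
    # mean blink duration per interval; intervals with no blinks stay 0
    means = np.divide(totals, counts, out=np.zeros_like(totals), where=counts > 0)
    return counts, totals, means
```

Example 2 below uses the same construction with `interval=6.0`.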
S4, extracting eye movement characteristics, and performing Fourier transform on the blink frequency input sequence, the blink average time length sequence and the blink time length sequence respectively; in step S4, after the blink frequency input sequence, the blink average time length sequence, and the blink time length sequence are respectively fourier-transformed, the energy sum of each frequency point in the fourier-transformed low-frequency range is calculated.
Specifically, the Fourier transform is

F(ω) = ∫ f(t) e^(−iωt) dt

The Fourier transform decomposes the time-domain signal, extracts the corresponding frequency components, and converts them into a frequency-domain representation; some features are more visible in the frequency domain.
The invention uses FFT (fast fourier transform algorithm) to calculate the discrete fourier transform of the constructed time series.
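The low-frequency energy index of step S4 can be sketched as below — a minimal example assuming the 4 s sampling interval and the 0–0.03 Hz band stated in the description. Subtracting the mean before the FFT (to exclude the dominant zero-frequency point) is my assumption, as the patent does not say how the DC component is handled:

```python
import numpy as np

def low_freq_energy(sequence, interval=4.0, f_max=0.03):
    """Sum |X[k]|^2 over FFT bins with frequency in (0, f_max] Hz.

    interval is the sampling period of the constructed blink sequence,
    so the effective sampling rate is 1/interval Hz.
    """
    x = np.asarray(sequence, dtype=float)
    spectrum = np.fft.rfft(x - x.mean())            # remove DC before transforming
    freqs = np.fft.rfftfreq(len(x), d=interval)     # bin frequencies in Hz
    mask = (freqs > 0) & (freqs <= f_max)
    return float(np.sum(np.abs(spectrum[mask]) ** 2))
```

Replacing the squared magnitude with `np.abs(spectrum[mask])` gives the amplitude-sum variant used in Example 3.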
Thus the awake-state model is obtained, as shown in FIG. 2, in which FIG. 2-A3 is obtained from FIG. 2-A1, FIG. 2-B3 from FIG. 2-B1, and FIG. 2-C3 from FIG. 2-C1.
The fatigue-state model is obtained as shown in FIG. 3, in which FIG. 3-A3 is obtained from FIG. 3-A1, FIG. 3-B3 from FIG. 3-B1, and FIG. 3-C3 from FIG. 3-C1.
As can be seen from FIG. 2 and FIG. 3, the invention first constructs the per-unit-time frequency, per-unit-time average duration, and per-unit-time blink duration sequences, and then, through frequency-domain analysis, finds the distinction between wakefulness and fatigue in the energy spectrum: the low-frequency energy in the fatigue state is higher than in the awake state. Data features are thus extracted from the eyes that reflect brain fatigue more directly than other surface features, so the expression of fatigue is more accurate and sensitive.
And S5, outputting a result, judging the index by the initial classification model, and outputting a first result of the waking state and the fatigue state.
It can be understood that, in the present invention, the blink frequency input sequence, the blink average time length sequence, and the blink time length sequence are each further analyzed by the Mann-Kendall method, which can detect the trend of variation of a sequence. The method is as follows. For a time series X = x1, x2, …, xn with n samples, a rank series is constructed:

S_k = Σ_{i=2..k} r_i, where r_i = #{ j < i : x_i > x_j }, k = 2, 3, …, n

so that the rank series S_k is the cumulative count of occasions on which the value at time i exceeds the value at an earlier time j. A statistic is then defined:

UF_k = (S_k − E(S_k)) / √Var(S_k), with UF_1 = 0

where E(S_k) and Var(S_k) are the mean and variance of the cumulative count S_k; when x1, x2, …, xn are mutually independent with the same continuous distribution, they are given by

E(S_k) = k(k − 1)/4, Var(S_k) = k(k − 1)(2k + 5)/72

UF_k follows a standard normal distribution and is a statistical series computed from the sequence x1, x2, …, xn. Given a significance level α, the normal distribution table is consulted; if |UF_k| > U_α, the sequence shows a significant trend change.
Repeating the above process on the reversed time series xn, xn−1, …, x1 gives UB_k = −UF_k (k = n, n − 1, …, 1) with UB_1 = 0. The calculation is simple, and the method can identify when an abrupt change begins and point out the change region, thereby yielding a Mann-Kendall analysis model.
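The sequential Mann-Kendall statistic can be sketched as a direct implementation of the UF formulas; the function name is my own, and ties are counted strictly (x_i > x_j), as in the standard formulation:

```python
import numpy as np

def mann_kendall_uf(x):
    """Sequential Mann-Kendall UF statistic for trend/change-point detection.

    UF_k = (S_k - E(S_k)) / sqrt(Var(S_k)), UF_1 = 0, where S_k is the
    cumulative count of pairs (j < i <= k) with x_i > x_j.
    """
    x = np.asarray(x, dtype=float)
    n = len(x)
    uf = np.zeros(n)
    s = 0
    for k in range(1, n):               # 0-indexed; series length so far is k + 1
        s += int(np.sum(x[k] > x[:k]))  # rank increment r_i
        m = k + 1                       # series length for E and Var
        e = m * (m - 1) / 4.0
        var = m * (m - 1) * (2 * m + 5) / 72.0
        uf[k] = (s - e) / np.sqrt(var)
    return uf
```

Values of |UF_k| above 1.96 exceed the 5% two-sided significance threshold of the standard normal distribution.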
Combining the Mann-Kendall analysis model with the Fourier transform model reflects the fatigue state more accurately and sensitively.
Further, in step S4, the low frequency range is 0 to 0.03Hz after the blink frequency input sequence, the blink average time length sequence, and the blink time length sequence are fourier transformed.
In a further configuration of the present invention, in step S4, the blink feature extraction after wavelet transform:
An equal-band multi-scale wavelet transform is performed on the blink frequency sequence and the blink time length sequence to establish a time series for each of multiple frequency bands, each series representing the information change within its band; the energy of each band is then computed to obtain an energy value per band, and the maximum value within a preset band range is used to represent the main characteristic of that range.
Wavelet analysis is likewise a transform, based on a wavelet basis, given by:

W(a, b) = (1/√a) ∫ f(t) ψ*((t − b)/a) dt

where a is the scale parameter, b the translation parameter, and ψ the mother wavelet.
the wavelet transformation not only reflects the frequency information of the signal, but also reflects the time information of the frequency change, as shown in fig. 4 and fig. 5, it can be known that in the present invention, the feature of frequency subdivision at the low frequency of the wavelet transformation is applied to obtain the significant contrast of the eye movement characteristics at the low frequency in the waking state and the fatigue state.
Where the high frequency components generally represent rapid changes, mainly a stimulus response to the surrounding environment, or a short-term intrinsic change. While the low frequency components represent states of intrinsic physiology. An increase in low frequency energy indicates an increase in blink activity itself (alternating frequency and duration, or co-acting), whereas an increase in blink activity itself indicates a decrease in brain work capacity, minimizing the intake of visual information. The low-frequency difference obtained in the frequency spectrum analysis of the blinking duration can directly reflect whether the brain is tired, and is the essential embodiment of the physiological characteristics of the brain fatigue in the periphery.
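The per-band energy computation can be sketched as follows; a plain Haar filter bank is used here as a stand-in, since the patent names neither the wavelet basis nor the number of scales:

```python
import numpy as np

def haar_band_energies(x, levels=4):
    """Multi-scale Haar wavelet decomposition; returns the energy per band.

    Bands are ordered from the highest-frequency detail down to the final
    approximation (lowest frequencies). The orthonormal Haar filters
    conserve total signal energy across the bands.
    """
    x = np.asarray(x, dtype=float)
    energies = []
    approx = x
    for _ in range(levels):
        if len(approx) < 2:
            break
        if len(approx) % 2:                          # pad odd lengths
            approx = np.append(approx, approx[-1])
        detail = (approx[0::2] - approx[1::2]) / np.sqrt(2)
        approx = (approx[0::2] + approx[1::2]) / np.sqrt(2)
        energies.append(float(np.sum(detail ** 2)))  # energy of this detail band
    energies.append(float(np.sum(approx ** 2)))      # lowest-frequency band
    return energies
```

The maximum over the bands falling inside the preset 0–0.03 Hz range would then represent the main characteristic of that range, per the description.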
Preferably, in step S4, the preset frequency band is a low frequency range of 0 to 0.03 Hz.
According to a further arrangement of the present invention, as shown in FIGS. 6 to 8, in step S2 the collected eye movement data further includes a gaze event sequence, and in step S3 the construction of new eye movement sequences further converts the collected gaze event sequence into a gaze spatial position sequence.
Preferably, in step S4, for eye movement feature extraction, the distribution range of the gaze region is counted on the basis of the recorded gaze spatial position sequence to obtain the minimum region range containing 70% of the gaze points, recorded as the diameter of the minimum circumscribed circle containing 70% of the gaze points; the gaze point coordinates in the gaze event sequence are extracted to form a new spatial position sequence, the distribution of this sequence in two-dimensional space is counted, and a gaze-region heat map is output.
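The gaze-region statistics can be sketched as below. Note that the circle fit is a centroid-based approximation of my own: computing the true minimum circumscribed circle over a fraction of the points is a harder combinatorial problem the patent does not spell out:

```python
import numpy as np

def gaze_region_diameter(points, coverage=0.7):
    """Approximate diameter of the smallest circle holding `coverage` of
    the gaze points.

    Approximation: center on the centroid and take the `coverage` quantile
    of point-to-centroid distances as the radius.
    """
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)
    dists = np.linalg.norm(pts - center, axis=1)
    return 2.0 * float(np.quantile(dists, coverage))

def gaze_heatmap(points, bins=32, extent=None):
    """2-D histogram of gaze coordinates, i.e. the gaze-region heat map."""
    pts = np.asarray(points, dtype=float)
    hist, _, _ = np.histogram2d(pts[:, 0], pts[:, 1], bins=bins, range=extent)
    return hist
```

Example 4 uses the same extraction with `coverage=0.9`.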
In step S5, for result output, the gaze-region heat map obtained from the minimum gaze-point region range is compared with the minimum index region of the gaze range in the original classification model, and a second result of either the waking or the fatigue state is output.
As shown in FIG. 6 to FIG. 8, the brain in the fatigue state always tries to reduce its workload and maintains only the necessary work in a more economical manner. The statistics of the gaze regions likewise show that a fatigued driver narrows the gaze range and reduces gazing at the periphery, including the area immediately around the main target.
On one hand, this reduces unnecessary fixations and avoids extra brain expenditure (including in the visual areas responsible for visual processing and the frontal lobe responsible for conflict resolution); at the same time, the reduced processing capacity of the frontal lobe decreases active saccades. On the other hand, under fatigue, peripheral fixation could also delay the reaction to the primary object itself. Such a reduction of the region of interest is found not only in the driving simulation scenario but also in the mental rotation eye movement experiment.
As the subject moved from the initial non-fatigued state to the later fatigued state, the percentage of fixation time in AOI7 and AOI8 increased significantly, the percentage in AOI4 also increased to some extent, and the percentage of fixation time in the remaining five regions of interest decreased to varying degrees.
Therefore, the difference between gaze regions can also directly reflect whether the brain is fatigued, and is likewise a peripheral manifestation of the essential physiology of brain fatigue.
Further, in step S5, the first result and the second result are combined to output the final result of the awake state or the fatigue state.
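The fusion of the first and second results can be sketched as follows; the AND rule and both thresholds are assumptions, since the patent states only that the two results are combined:

```python
def classify_fatigue(low_freq_energy_value, gaze_diameter,
                     energy_threshold, diameter_threshold):
    """Sketch of the step-S5 fusion (the rule itself is an assumption).

    First result:  low-frequency energy above the awake baseline
                   suggests fatigue (per the spectral analysis).
    Second result: a gaze-region diameter below the model's minimum
                   index region suggests fatigue (narrowed gaze range).
    """
    first = low_freq_energy_value > energy_threshold
    second = gaze_diameter < diameter_threshold
    return first and second   # conservative AND-fusion (assumption)
```

An OR-fusion would trade specificity for sensitivity; which trade-off the patent intends is not stated.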
In this technical scheme, a per-unit-time cumulative blink duration sequence is constructed from the blink event sequence, and the Fourier spectrum analysis and wavelet energy-spectrum distribution analysis of the new sequences provide a spectral and wavelet-energy method for fatigue analysis, making brain fatigue detection simpler and more universal.
The statistics of the gaze region show that the fatigue state reduces gazing at the peripheral environment, and the degree of this reduction also reflects the degree of fatigue.
It can be understood that the steps of the invention further include optimization of the original state model: a detection time window is preset, and the state model acquired and detected within that window is integrated with the original state model to obtain an individualized original state model.
Example 2
This embodiment differs from Embodiment 1 in step S3, the construction of the new eye movement sequences from the collected blink event sequence, in that:
the blink frequency input sequence is constructed with a sampling interval of 6 s;
the blink average time length sequence defines a sampling interval of 6 s;
the blink time length sequence defines a sampling interval of 6 s.
Example 3
The difference between this embodiment and embodiment 1 is that, in step S4, after the blink frequency input sequence, the blink average time length sequence, and the blink time length sequence are respectively fourier-transformed, the sum of amplitudes of frequency points in the fourier-transformed low-frequency range is calculated.
Example 4
This embodiment differs from Embodiment 1 in that, in step S4, for eye movement feature extraction, the distribution range of the gaze region is counted on the basis of the recorded gaze spatial position sequence to obtain the minimum region range containing 90% of the gaze points, recorded as the diameter of the minimum circumscribed circle containing 90% of the gaze points; the gaze point coordinates in the gaze event sequence are extracted to form a new spatial position sequence, the distribution of this sequence in two-dimensional space is counted, and a gaze-region heat map is output.
The embodiments of the present invention have been described in detail with reference to the accompanying drawings, but the present invention is not limited to the described embodiments. Various changes, modifications, substitutions and alterations that those skilled in the art can make to these embodiments without departing from the principles and spirit of the invention still fall within the scope of the invention.

Claims (6)

1. A fatigue detection method based on eye movement is characterized by comprising the following steps
S1, establishing an initial classification model which comprises two states of wakefulness and fatigue;
s2, collecting eye movement data, recording eye movement characteristics including a blink event sequence through a high-speed camera,
s3, constructing a new eye movement sequence, converting the collected blink event sequence, respectively establishing a blink frequency input sequence, a blink average time length sequence and a blink time length sequence,
wherein, the blink frequency input sequence constructs a sampling interval of 4-6s, counts the blink frequency to form the blink frequency input sequence,
wherein the blink average time length sequence defines a sampling interval of 4-6s, the total blink time length in the sampling interval is counted to form the blink average time length sequence,
the blink time length sequence defines a sampling interval of 4-6s, and the average blink time length in the sampling interval is counted to form a blink time length sequence;
s4, extracting eye movement characteristics, performing Fourier transform on the blink frequency input sequence, the blink average time length sequence and the blink time length sequence respectively, and calculating the energy sum of each frequency point in a Fourier transform low-frequency range and the amplitude sum of each frequency point in the Fourier transform low-frequency range;
and S5, outputting a result, judging the index by the initial classification model, and outputting a first result of the waking state and the fatigue state.
2. The method for detecting fatigue based on eye movement of claim 1, wherein in step S4, the low frequency range of the input sequence of blink frequency, the sequence of blink average time duration and the sequence of blink time duration after fourier transformation is 0-0.03 Hz.
3. The eye movement-based fatigue detection method according to claim 1, wherein the collected eye movement data further comprises a gaze event sequence in step S2, and the eye movement new sequence is constructed in step S3 and further used for transforming the collected gaze event sequence and establishing a gaze spatial position sequence.
4. The eye-movement-based fatigue detection method according to claim 3, wherein in step S4, for eye movement feature extraction, the distribution range of the gaze region is counted based on the recorded gaze spatial position sequence to obtain the minimum region range containing 70%-90% of the gaze points, recorded as the diameter of the minimum circumscribed circle containing 70%-90% of the gaze points; the gaze point coordinates in the gaze event sequence are extracted to form a new spatial position sequence, the distribution of this sequence in two-dimensional space is counted, and a gaze-region heat map is output.
5. The eye-movement-based fatigue detection method according to claim 4, wherein in step S5, for result output, the gaze-region heat map obtained from the minimum gaze-point region range is compared with the minimum index region of the gaze range of the original classification model, and a second result of the awake state or the fatigue state is output.
6. The method of detecting fatigue based on eye movements according to claim 5, wherein in step S5, the first result and the second result are combined to output the results of the final awake state and the fatigue state.
Application CN202010750681.9A, filed 2020-07-29 (priority date 2020-07-29): Eye movement-based fatigue detection method. Status: Active; granted as CN111985351B.

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010750681.9A CN111985351B (en) 2020-07-29 2020-07-29 Eye movement-based fatigue detection method


Publications (2)

Publication Number Publication Date
CN111985351A CN111985351A (en) 2020-11-24
CN111985351B true CN111985351B (en) 2022-08-05

Family

ID=73444342


Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112434611B (en) * 2020-11-25 2023-04-07 中山大学 Early fatigue detection method and system based on eye movement subtle features
CN117379062B (en) * 2023-12-12 2024-04-05 浙江好络维医疗技术有限公司 Single-lead dry electrode electrocardiogram P wave identification method, device, equipment and medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103617421A (en) * 2013-12-17 2014-03-05 上海电机学院 Fatigue detecting method and system based on comprehensive video feature analysis
CN105249961A (en) * 2015-11-02 2016-01-20 东南大学 Real-time driving fatigue detection system and detection method based on Bluetooth electroencephalogram headset
CN107280694A (en) * 2017-07-18 2017-10-24 燕山大学 A kind of fatigue detection method based on Multi-source Information Fusion
CN207545073U (en) * 2017-05-12 2018-06-29 杭州电子科技大学 A kind of fatigue driving intelligent identification device based on signal of blinking analysis
CN109498028A (en) * 2018-12-24 2019-03-22 深圳和而泰数据资源与云技术有限公司 Blink detection method, apparatus, blink monitor and eye wear equipment


Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Mining EEG with SVM for understanding cognitive underpinnings of math problem solving strategies; Bosch P. et al.; Behav. Neurol.; 2018-12-31; full text *
Study on the correlation between blink count and fatigue based on EEG experiments; Zeng Youwen et al.; Journal of Changchun University of Science and Technology (Natural Science Edition); 2017-02-15 (No. 01); full text *


Similar Documents

Publication Publication Date Title
Yang et al. Epileptic seizure prediction based on permutation entropy
Rios-Urrego et al. Analysis and evaluation of handwriting in patients with Parkinson’s disease using kinematic, geometrical, and non-linear features
CN109646022B (en) Child attention assessment system and method thereof
CN105286890B (en) Driver doze state monitoring method based on electroencephalogram signals
CN112890834B (en) Attention-recognition-oriented machine learning-based eye electrical signal classifier
Niehorster et al. Characterizing gaze position signals and synthesizing noise during fixations in eye-tracking data
CN111985351B (en) Eye movement-based fatigue detection method
Lu et al. Classification of single-channel EEG signals for epileptic seizures detection based on hybrid features
CN106913333B (en) Method for selecting sensitivity characteristic index of continuous attention level
CN109512442A (en) A kind of EEG fatigue state classification method based on LightGBM
Solon et al. Decoding P300 variability using convolutional neural networks
CN115312195A (en) Health assessment method for calculating individual psychological abnormality based on emotion data
Zhang et al. Automatic detection of interictal epileptiform discharges based on time-series sequence merging method
CN107411738A (en) A kind of mood based on resting electroencephalogramidentification similitude is across individual discrimination method
Kaur et al. Heart rate variability (HRV): an indicator of stress
CN114391846A (en) Emotion recognition method and system based on filtering type feature selection
Pander et al. An automatic saccadic eye movement detection in an optokinetic nystagmus signal
CN116211305A (en) Dynamic real-time emotion detection method and system
CN114366103B (en) Attention assessment method and device and electronic equipment
Antunes et al. A morphology-based feature set for automated Amyotrophic Lateral Sclerosis diagnosis on surface electromyography
CN117918840A (en) Mental state detection method and system for electroencephalogram signals based on entropy complexity plane
CN110852307B (en) Brain-computer interface detection method, system, medium and device based on electroencephalogram signals
Ma et al. A new measure to characterize multifractality of sleep electroencephalogram
CN104793743A (en) Virtual social contact system and control method thereof
Murugappan et al. Subtractive fuzzy classifier based driver drowsiness levels classification using EEG

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant