CN107233103B - High-speed rail dispatcher fatigue state evaluation method and system - Google Patents


Info

Publication number
CN107233103B
Authority
CN
China
Prior art keywords
fatigue
signal
characteristic
extracting
value
Prior art date
Legal status
Active
Application number
CN201710397928.1A
Other languages
Chinese (zh)
Other versions
CN107233103A (en)
Inventor
郭孜政
张骏
Current Assignee
Southwest Jiaotong University
Original Assignee
Southwest Jiaotong University
Application filed by Southwest Jiaotong University
Priority to CN201710397928.1A
Publication of CN107233103A
Application granted
Publication of CN107233103B

Classifications

    • A: HUMAN NECESSITIES
    • A61: MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B: DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B 5/00: Measuring for diagnostic purposes; Identification of persons
    • A61B 5/16: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B 5/18: Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • A61B 5/72: Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B 5/7235: Details of waveform analysis
    • A61B 5/7242: Details of waveform analysis using integration
    • A61B 5/7271: Specific aspects of physiological measurement analysis
    • A61B 5/7282: Event detection, e.g. detecting unique waveforms indicative of a medical condition


Abstract

The invention discloses a method and a system for evaluating the fatigue state of a high-speed rail dispatcher. The method comprises: acquiring a heart rate signal, an electrocardio (ECG) signal, an electroencephalogram (EEG) signal, a facial image signal and an eye movement characteristic signal of the high-speed rail dispatcher; extracting a characteristic value below a fatigue warning value from each of these signals; and determining the fatigue state of the high-speed rail dispatcher by using a multi-channel data fusion algorithm based on the characteristic values below the fatigue warning values. With this scheme, the embodiment of the invention takes the daily operation of a high-speed rail dispatcher as the test background and fuses multiple signals for judgment, which improves the precision and accuracy of the evaluation and makes the monitoring and evaluation more practical.

Description

High-speed rail dispatcher fatigue state evaluation method and system
Technical Field
The embodiment of the invention relates to the technical field of high-speed rails, in particular to a method and a system for evaluating fatigue states of high-speed rail dispatchers.
Background
Because high-speed rail dispatchers work in rotating shifts, the job demands strong resistance to fatigue. At present, systems for monitoring the fatigue of high-speed rail dispatchers are rare both in China and abroad.
Existing fatigue monitoring equipment is designed for working environments that differ greatly from that of a high-speed rail dispatcher, so it cannot simply be copied over to dispatcher fatigue monitoring and evaluation; moreover, most existing fatigue monitoring and evaluating devices rely on single-channel analysis, whose precision and accuracy are low.
Therefore, a fatigue monitoring and intervention device is urgently needed to mitigate the harm caused by fatigued operation of high-speed rail dispatchers.
Disclosure of Invention
The embodiment of the invention mainly aims to provide a high-speed rail dispatcher fatigue state evaluation method, which at least partially solves the technical problem of how to improve the precision and accuracy of evaluation. In addition, a fatigue state evaluation system for the high-speed rail dispatcher is further provided.
In order to achieve the above object, according to one aspect of the present invention, the following technical solutions are provided:
a fatigue state evaluation method for a high-speed rail dispatcher. The method may include at least:
acquiring a heart rate signal, an electrocardiosignal, an electroencephalogram signal, a facial image signal and an eye movement characteristic signal of the high-speed rail dispatcher;
extracting a characteristic value below a fatigue warning value based on the heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal;
and determining the fatigue state of the high-speed rail dispatcher by utilizing a multi-channel data fusion algorithm based on the characteristic value below the fatigue warning value.
Further, the extracting, based on the heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal, a characteristic value below a fatigue warning value may specifically include:
extracting a characteristic value below a heart rate index fatigue warning value based on the heart rate signal;
extracting a characteristic value below the fatigue warning value of the electrocardio index based on the electrocardio signal;
extracting a characteristic value below the fatigue warning value of the electroencephalogram index based on the electroencephalogram signal;
extracting a characteristic value below the fatigue warning value of the facial characteristic index based on the facial image signal;
and extracting a characteristic value below the eye movement index fatigue warning value based on the eye movement characteristic signal.
Further, the extracting, based on the heart rate signal, a characteristic value below the fatigue warning value of the heart rate index may specifically include:
obtaining a heart rate value based on the heart rate signal;
drawing a heart rate change curve based on the heart rate value;
and extracting characteristic values below the fatigue warning value of the heart rate index based on the heart rate change curve.
Further, the extracting, based on the electrocardiosignal, a characteristic value below the fatigue warning value of the electrocardio index may specifically include:
filtering the electrocardiosignals;
performing artifact removal processing on the filtered signal;
carrying out time domain to frequency domain transformation on the signal without the artifacts;
extracting frequency domain features based on the transformation result;
and extracting a characteristic value below the fatigue warning value of the electrocardio index based on the frequency domain characteristics.
Further, the extracting, based on the electroencephalogram signal, a characteristic value below the fatigue warning value of the electroencephalogram index may specifically include:
filtering the electroencephalogram signals;
performing artifact removal processing on the filtered signal;
carrying out time domain to frequency domain transformation on the signal without the artifacts;
extracting frequency domain features based on the transformed signal;
and extracting characteristic values below the fatigue warning value of the electroencephalogram index based on the frequency domain characteristics.
Further, the extracting, based on the facial image signal, a characteristic value below the fatigue warning value of the facial characteristic index may specifically include:
performing digital image processing on the facial image signal;
determining facial features based on the processed results;
determining facial fatigue expression features based on the facial features;
and extracting a characteristic value below the fatigue warning value of the facial characteristic index based on the facial fatigue expression characteristic.
Further, the extracting, based on the eye movement characteristic signal, a characteristic value below an eye movement index fatigue warning value may specifically include:
performing digital image processing on the eye movement characteristic signal;
extracting eye movement parameters based on the processed result;
and extracting characteristic values below the eye movement index fatigue warning value based on the eye movement parameters.
Further, the determining the fatigue state of the high-speed rail dispatcher by using a multi-channel data fusion algorithm based on the characteristic values below the fatigue warning values may specifically include:
calculating, by using an entropy method, the uncertainty mass of each channel from the characteristic value below the heart rate index fatigue warning value, the characteristic value below the electrocardio index fatigue warning value, the characteristic value below the electroencephalogram index fatigue warning value, the characteristic value below the facial characteristic index fatigue warning value and the characteristic value below the eye movement index fatigue warning value according to the following formula:

m_i(Θ) = -k[q_iMF log2(q_iMF) + (1 - q_iMF) log2(1 - q_iMF)]

wherein Θ = {MF, ¬MF}; MF denotes the fatigue state and ¬MF denotes the non-fatigue state; q_iMF denotes the probability that the i-th channel judges the high-speed rail dispatcher to be in the fatigue state, and 1 - q_iMF the probability that the i-th channel judges the dispatcher to be in the non-fatigue state, where i = 1, 2, ..., 5; the five characteristic values above are the respective inputs of the five channels; k denotes an adjustment factor, and k ∈ (0, 1);
calculating the basic allocation probability in the fatigue state and the basic allocation probability in the non-fatigue state according to the following formulas:

m_i(MF) = q_iMF[1 - m_i(Θ)]
m_i(¬MF) = (1 - q_iMF)[1 - m_i(Θ)]

wherein m_i(MF) denotes the basic allocation probability of the fatigue state and m_i(¬MF) denotes the basic allocation probability of the non-fatigue state for the i-th channel, so that m_i(MF) + m_i(¬MF) + m_i(Θ) = 1;
performing multi-channel fusion based on the D-S evidence theory according to the following formulas, and calculating the probability that the high-speed rail dispatcher is in the fatigue state, the probability that the high-speed rail dispatcher is in the non-fatigue state, and the fused uncertainty mass:

m(MF) = [Σ over A_1 ∩ A_2 ∩ ... ∩ A_5 = MF of m_1(A_1)·m_2(A_2)·...·m_5(A_5)] / (1 - K)
m(¬MF) = [Σ over A_1 ∩ A_2 ∩ ... ∩ A_5 = ¬MF of m_1(A_1)·m_2(A_2)·...·m_5(A_5)] / (1 - K)
m(Θ) = [Σ over A_1 ∩ A_2 ∩ ... ∩ A_5 = Θ of m_1(A_1)·m_2(A_2)·...·m_5(A_5)] / (1 - K)
K = Σ over A_1 ∩ A_2 ∩ ... ∩ A_5 = ∅ of m_1(A_1)·m_2(A_2)·...·m_5(A_5)

wherein m(MF) denotes the probability that the high-speed rail dispatcher is in the fatigue state; m(¬MF) denotes the probability that the high-speed rail dispatcher is in the non-fatigue state; m(Θ) denotes the fused uncertainty mass; each A_i ∈ {MF, ¬MF, Θ} is a focal element of the i-th channel, i = 1, 2, ..., 5; and K is the degree of conflict among the channels;
calculating the belief function and plausibility (likelihood) function of the fatigue state and of the non-fatigue state according to the following formulas:

Bel(MF) = m(MF)
Pl(MF) = m(MF) + m(Θ)
Bel(¬MF) = m(¬MF)
Pl(¬MF) = m(¬MF) + m(Θ)

wherein Bel(MF) and Pl(MF) denote the belief function and plausibility function of the fatigue state, and Bel(¬MF) and Pl(¬MF) denote those of the non-fatigue state;

and judging whether the high-speed rail dispatcher is in the fatigue state according to the fatigue belief and plausibility functions and the non-fatigue belief and plausibility functions.
Further, the method may further include:
comparing the determined fatigue state to a fatigue state threshold;
and if the threshold value is exceeded, performing early warning intervention.
According to another aspect of the invention, a fatigue state evaluation system for a high-speed rail dispatcher is also provided. The system may include at least:
the acquisition module is used for acquiring a heart rate signal, an electrocardiosignal, an electroencephalogram signal, a facial image signal and an eye movement characteristic signal of the high-speed rail dispatcher;
the extraction module is used for extracting a characteristic value below a fatigue warning value based on the heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal;
and the determining module is used for determining the fatigue state of the high-speed rail dispatcher by utilizing a multi-channel data fusion algorithm based on the characteristic value below the fatigue warning value.
Further, the extraction module may specifically include:
the first extraction unit is used for extracting a characteristic value below a fatigue warning value of a heart rate index based on the heart rate signal;
the second extraction unit is used for extracting a characteristic value below the fatigue warning value of the electrocardio index based on the electrocardio signal;
the third extraction unit is used for extracting a characteristic value below the fatigue warning value of the electroencephalogram index based on the electroencephalogram signal;
a fourth extraction unit, configured to extract a characteristic value below the fatigue warning value of the facial characteristic index based on the facial image signal;
and a fifth extraction unit, configured to extract a feature value below the eye movement index fatigue warning value based on the eye movement feature signal.
Further, the first extraction unit may specifically include:
the acquisition unit is used for obtaining a heart rate value based on the heart rate signal;
the drawing unit is used for drawing a heart rate change curve based on the heart rate value;
and the first extraction subunit is used for extracting a characteristic value below the fatigue warning value of the heart rate index based on the heart rate change curve.
Further, the second extraction unit may specifically include:
the first filtering unit is used for filtering the electrocardiosignals;
the first artifact removing unit is used for removing artifacts of the filtered signal;
the first transformation unit is used for carrying out time domain to frequency domain transformation on the signal without the artifacts;
a second extraction subunit, configured to extract frequency domain features based on the transform result;
and the third extraction subunit is used for extracting a characteristic value below the fatigue warning value of the electrocardio index based on the frequency domain characteristic.
Further, the third extraction unit may specifically include:
the second filtering unit is used for filtering the electroencephalogram signals;
the second artifact removing unit is used for removing artifacts of the filtered signal;
the second transformation unit is used for carrying out time domain to frequency domain transformation on the signal without the artifacts;
a fourth extraction subunit, configured to extract frequency domain features based on the transformed signal;
and the fifth extraction subunit is used for extracting a characteristic value below the electroencephalogram index fatigue warning value based on the frequency domain characteristics.
Further, the fourth extraction unit may specifically include:
a first processing unit for performing digital image processing on the face image signal;
a first determination unit configured to determine a facial feature based on a result after the processing;
a second determination unit configured to determine a facial fatigue expression feature based on the facial feature;
and the sixth extraction subunit is used for extracting a characteristic value below the fatigue warning value of the facial characteristic index based on the facial fatigue expression characteristic.
Further, the fifth extraction unit may specifically include:
the second processing unit is used for carrying out digital image processing on the eye movement characteristic signal;
a seventh extraction subunit, configured to extract an eye movement parameter based on the processed result;
and the eighth extraction subunit is used for extracting a characteristic value below the eye movement index fatigue warning value based on the eye movement parameter.
Further, the determining module may specifically include:
a first calculating unit, configured to calculate, by using an entropy method, the uncertainty mass of each channel from the characteristic value below the heart rate index fatigue warning value, the characteristic value below the electrocardio index fatigue warning value, the characteristic value below the electroencephalogram index fatigue warning value, the characteristic value below the facial characteristic index fatigue warning value and the characteristic value below the eye movement index fatigue warning value according to the following formula:

m_i(Θ) = -k[q_iMF log2(q_iMF) + (1 - q_iMF) log2(1 - q_iMF)]

wherein Θ = {MF, ¬MF}; MF denotes the fatigue state and ¬MF denotes the non-fatigue state; q_iMF denotes the probability that the i-th channel judges the high-speed rail dispatcher to be in the fatigue state, and 1 - q_iMF the probability that the i-th channel judges the dispatcher to be in the non-fatigue state, where i = 1, 2, ..., 5; the five characteristic values above are the respective inputs of the five channels; k denotes an adjustment factor, and k ∈ (0, 1);
a second calculating unit, configured to calculate the basic allocation probability in the fatigue state and the basic allocation probability in the non-fatigue state according to the following formulas:

m_i(MF) = q_iMF[1 - m_i(Θ)]
m_i(¬MF) = (1 - q_iMF)[1 - m_i(Θ)]

wherein m_i(MF) denotes the basic allocation probability of the fatigue state and m_i(¬MF) denotes the basic allocation probability of the non-fatigue state for the i-th channel, so that m_i(MF) + m_i(¬MF) + m_i(Θ) = 1;
a third calculating unit, configured to perform multi-channel fusion based on the D-S evidence theory according to the following formulas, and calculate the probability that the high-speed rail dispatcher is in the fatigue state, the probability that the high-speed rail dispatcher is in the non-fatigue state, and the fused uncertainty mass:

m(MF) = [Σ over A_1 ∩ A_2 ∩ ... ∩ A_5 = MF of m_1(A_1)·m_2(A_2)·...·m_5(A_5)] / (1 - K)
m(¬MF) = [Σ over A_1 ∩ A_2 ∩ ... ∩ A_5 = ¬MF of m_1(A_1)·m_2(A_2)·...·m_5(A_5)] / (1 - K)
m(Θ) = [Σ over A_1 ∩ A_2 ∩ ... ∩ A_5 = Θ of m_1(A_1)·m_2(A_2)·...·m_5(A_5)] / (1 - K)
K = Σ over A_1 ∩ A_2 ∩ ... ∩ A_5 = ∅ of m_1(A_1)·m_2(A_2)·...·m_5(A_5)

wherein m(MF) denotes the probability that the high-speed rail dispatcher is in the fatigue state; m(¬MF) denotes the probability that the high-speed rail dispatcher is in the non-fatigue state; m(Θ) denotes the fused uncertainty mass; each A_i ∈ {MF, ¬MF, Θ} is a focal element of the i-th channel, i = 1, 2, ..., 5; and K is the degree of conflict among the channels;
a fourth calculating unit, configured to calculate the belief function and plausibility (likelihood) function of the fatigue state and of the non-fatigue state according to the following formulas:

Bel(MF) = m(MF)
Pl(MF) = m(MF) + m(Θ)
Bel(¬MF) = m(¬MF)
Pl(¬MF) = m(¬MF) + m(Θ)

wherein Bel(MF) and Pl(MF) denote the belief function and plausibility function of the fatigue state, and Bel(¬MF) and Pl(¬MF) denote those of the non-fatigue state;

and a judging unit, configured to judge whether the high-speed rail dispatcher is in the fatigue state according to the fatigue belief and plausibility functions and the non-fatigue belief and plausibility functions.
Further, the system may further include:
a comparison unit for comparing the determined fatigue state with a fatigue state threshold;
and the early warning intervention unit is used for carrying out early warning intervention under the condition that the fatigue state exceeds the fatigue state threshold value.
Compared with the prior art, the technical scheme at least has the following beneficial effects:
the embodiment of the invention provides a method and a system for evaluating fatigue states of high-speed rail dispatchers. The method comprises the steps of obtaining a heart rate signal, an electrocardiosignal, an electroencephalogram signal, a facial image signal and an eye movement characteristic signal of a high-speed rail dispatcher; extracting a characteristic value below a fatigue warning value based on the heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal; and determining the fatigue state of the high-speed rail dispatcher by using a multi-channel data fusion algorithm based on the characteristic value below the fatigue warning value. By adopting the technical scheme, the embodiment of the invention takes the daily operation of a high-speed rail dispatcher as a test background, so that the monitoring and evaluation have more practical significance; in addition, the embodiment of the invention integrates a plurality of signals for judgment, so that the judgment precision and accuracy are higher.
Of course, it is not necessary for any product in which the invention is practiced to achieve all of the above-described advantages at the same time.
Additional features and advantages of the invention will be set forth in the description which follows, and in part will be obvious from the description, or may be learned by practice of the invention. The objectives and other advantages of the invention will be realized and attained by the means particularly pointed out in the written description and claims hereof as well as the appended drawings.
Drawings
The accompanying drawings, which are included to provide a further understanding of the invention, are incorporated in and constitute a part of this specification; they illustrate embodiments of the invention and together with the description serve to explain the invention, without limiting it. It is obvious that the drawings in the following description are only some embodiments, and that a person skilled in the art can derive other drawings from them without inventive effort. In the drawings:
FIG. 1 is a schematic flow diagram illustrating a high-speed rail dispatcher fatigue status assessment method according to an exemplary embodiment;
FIG. 2 is a schematic diagram illustrating a high-speed rail dispatcher fatigue status assessment system according to an exemplary embodiment;
fig. 3 is a schematic structural diagram illustrating a high-speed rail dispatcher fatigue status assessment system according to another exemplary embodiment.
These drawings and written description are not intended to limit the scope of the inventive concepts in any manner, but rather to illustrate the inventive concepts to those skilled in the art by reference to specific embodiments.
Detailed Description
The technical problems solved, the technical solutions adopted and the technical effects achieved by the embodiments of the present invention are clearly and completely described below with reference to the accompanying drawings and the specific embodiments. It is to be understood that the described embodiments are merely a few, and not all, of the embodiments of the present application. All other equivalent or obviously modified embodiments obtained by the person skilled in the art based on the embodiments in this application fall within the scope of protection of the invention without inventive step. The embodiments of the invention can be embodied in many different ways as defined and covered by the claims.
It should be noted that in the following description, numerous specific details are set forth in order to provide an understanding. It may be evident, however, that the subject invention may be practiced without these specific details.
It should be noted that, in the case where there is no explicit limitation or conflict, the embodiments and technical features in the present invention may be combined with each other to form a technical solution.
In practical application, in order to solve the technical problem of how to improve the precision and accuracy of the fatigue state evaluation of the high-speed rail dispatcher, the embodiment of the invention provides a fatigue state evaluation method of the high-speed rail dispatcher. The method may be implemented through step S100 to step S120.
S100: acquiring a heart rate signal, an electrocardiosignal, an electroencephalogram signal, a facial image signal and an eye movement characteristic signal of a high-speed rail dispatcher.
In this step, the heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal of the high-speed rail dispatcher can be obtained through signal acquisition equipment, which comprises a monitoring bracelet, a monitoring brain ring (an EEG headset), image acquisition equipment (such as a still camera or a video camera) and an eye tracker. The monitoring bracelet collects the heart rate signal and the electrocardiosignal; the monitoring brain ring collects the electroencephalogram signal; the image acquisition equipment collects the facial image signal; and the eye tracker collects the eye movement characteristic signal.
In practical applications, the collected signals may be transmitted to a database.
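For illustration only (the patent names the acquisition devices but no software interface), the five-channel acquisition and database buffering described above could be sketched as follows; the channel names, the `SignalDatabase` class and the `read_sample` callback are all hypothetical:

```python
from collections import defaultdict

# Hypothetical channel names for the five signal sources named in the text.
CHANNELS = ["heart_rate", "ecg", "eeg", "face_image", "eye_movement"]

class SignalDatabase:
    """Minimal in-memory stand-in for the database the signals are sent to."""
    def __init__(self):
        self.records = defaultdict(list)

    def store(self, channel, timestamp, sample):
        self.records[channel].append((timestamp, sample))

def acquire_once(read_sample, db, t):
    """Poll each acquisition device once and forward the sample."""
    for ch in CHANNELS:
        db.store(ch, t, read_sample(ch))

db = SignalDatabase()
acquire_once(lambda ch: 0.0, db, t=0.0)   # dummy reader stands in for hardware
print(sorted(db.records))
```

In a real deployment the lambda would be replaced by drivers for the bracelet, brain ring, camera and eye tracker, each sampled at its own rate.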
S110: and extracting a characteristic value below the fatigue warning value based on the heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal.
Specifically, this step may be realized by S111 to S115.
S111: and extracting a characteristic value below the fatigue warning value of the heart rate index based on the heart rate signal.
In some optional embodiments, step S111 may further include:
step a 1: based on the heart rate signal, a heart rate value is obtained.
Step a 2: based on the heart rate values, heart rate variation curves are drawn.
Step a 3: and extracting a characteristic value below the fatigue warning value of the heart rate index based on the heart rate change curve.
S112: and extracting a characteristic value below the fatigue warning value of the electrocardio index based on the electrocardio signal.
In some optional embodiments, step S112 may further include:
step b 1: the electrocardiosignal is filtered.
Step b 2: and performing artifact removal processing on the filtered signal.
Step b 3: and carrying out time domain to frequency domain transformation on the signal subjected to artifact removal.
Step b 4: based on the transform results, frequency domain features are extracted.
Step b 5: and extracting characteristic values below the fatigue warning value of the electrocardio index based on the frequency domain characteristics.
S113: and extracting a characteristic value below the fatigue warning value of the electroencephalogram index based on the electroencephalogram signal.
In some optional embodiments, step S113 may further include:
step c 1: and filtering the electroencephalogram signals.
Step c 2: and performing artifact removal processing on the filtered signal.
Step c 3: and carrying out time domain to frequency domain transformation on the signal subjected to artifact removal.
Step c 4: based on the transformed signal, frequency domain features are extracted.
Step c 5: and extracting characteristic values below the fatigue warning value of the electroencephalogram index based on the frequency domain characteristics.
S114: based on the face image signal, feature values below the fatigue alert value of the face feature index are extracted.
In some optional embodiments, step S114 may further include:
step d 1: digital image processing is performed on the face image signal.
Step d 2: based on the processed result, facial features are determined.
Step d 3: based on the facial features, facial fatigue expression features are determined.
For example, yawning is a typical facial fatigue expression feature.
Step d 4: and extracting a characteristic value below the fatigue warning value of the facial characteristic index based on the facial fatigue expression characteristics.
S115: extract a characteristic value below the eye movement index fatigue warning value based on the eye movement characteristic signal.
In some optional embodiments, step S115 may further include:
step e 1: perform digital image processing on the eye movement characteristic signal.
Step e 2: extract eye movement parameters based on the processed result.
The eye movement parameters include, but are not limited to, eyelid openness, blink frequency, and eye movement trajectory.
Step e 3: extract a characteristic value below the eye movement index fatigue warning value based on the eye movement parameters.
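A sketch of step e2 for two of the listed eye movement parameters: eyelid openness (summarized here as PERCLOS, the fraction of time the eyes are nearly closed) and blink frequency. The 0.2 openness threshold is an assumed value.

```python
def perclos(openness, closed_threshold=0.2):
    """Fraction of frames with eyelid openness below the threshold (PERCLOS).
    The threshold is an assumed normalized-openness value."""
    closed = [o < closed_threshold for o in openness]
    return sum(closed) / len(openness)

def blink_frequency(openness, fs, closed_threshold=0.2):
    """Blinks per minute, counting open-to-closed transitions; fs is the
    frame rate of the eye tracker in frames per second."""
    blinks, prev_closed = 0, False
    for o in openness:
        closed = o < closed_threshold
        if closed and not prev_closed:
            blinks += 1
        prev_closed = closed
    duration_s = len(openness) / fs
    return blinks * 60.0 / duration_s
```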
S120: determine the fatigue state of the high-speed rail dispatcher by using a multi-channel data fusion algorithm based on the characteristic values below the fatigue warning values.
A judgment made from any single signal cannot reach 100% precision. The embodiment of the invention therefore fuses multiple signals for judgment by means such as a multi-channel fusion algorithm based on the D-S evidence theory, thereby improving the precision and accuracy of the judgment.
Specifically, this step may include:
step f 1: using an entropy method, determine the uncertainty probability assignment m_i(Θ) of each channel according to the following formula:

m_i(Θ) = -k[q_iMF·log2(q_iMF) + (1 - q_iMF)·log2(1 - q_iMF)]

where Θ = {MF, ¬MF} is the frame of discernment; MF represents fatigue and ¬MF represents no fatigue; q_iMF represents the probability that the i-th channel judges the high-speed rail dispatcher to be in the fatigue state; 1 - q_iMF represents the probability that the i-th channel judges the high-speed rail dispatcher to be in the non-fatigue state, with i = 1, 2, …, 5; and k represents an adjustment factor, with k ∈ (0, 1). The characteristic value below the heart rate index fatigue warning value, the characteristic value below the electrocardio index fatigue warning value, the characteristic value below the electroencephalogram index fatigue warning value, the characteristic value below the facial characteristic index fatigue warning value and the characteristic value below the eye movement index fatigue warning value are input into channels 1 to 5, respectively.
Each channel has only two possible recognition results for whether the high-speed rail dispatcher is in the fatigue state: fatigue (MF) and no fatigue (¬MF). The heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal pass through channels 1 to 5, respectively.
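The entropy formula of step f1 can be computed directly; k is the adjustment factor from the formula, and its particular value within (0, 1) is a design choice:

```python
import math

def uncertainty_mass(q_imf, k=0.5):
    """m_i(Theta): entropy-weighted uncertainty of channel i.
    q_imf is the channel's probability of the fatigue judgment; k in (0, 1)."""
    if q_imf in (0.0, 1.0):  # the binary entropy vanishes at the endpoints
        return 0.0
    return -k * (q_imf * math.log2(q_imf) + (1.0 - q_imf) * math.log2(1.0 - q_imf))
```

A channel that is maximally unsure (q_iMF = 0.5) receives the largest uncertainty mass, while a confident channel receives almost none.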
Step f 2: determine the basic probability assignment in the fatigue state and the basic probability assignment in the non-fatigue state according to the following formulas:

m_i(MF) = q_iMF·[1 - m_i(Θ)]
m_i(¬MF) = (1 - q_iMF)·[1 - m_i(Θ)]

where m_i(MF) represents the basic probability assignment in the fatigue state and m_i(¬MF) represents the basic probability assignment in the non-fatigue state, so that m_i(MF) + m_i(¬MF) + m_i(Θ) = 1 for each channel.
Step f 3: perform multi-channel fusion based on the D-S evidence theory according to the following formulas, determining the probability that the high-speed rail dispatcher is in the fatigue state, the probability that the high-speed rail dispatcher is in the non-fatigue state, and the uncertainty remaining after fusion of the characteristic values below all fatigue warning values:

m(MF) = (1/K)·Σ_{A_1∩A_2∩…∩A_5 = MF} m_1(A_1)·m_2(A_2)·…·m_5(A_5)
m(¬MF) = (1/K)·Σ_{A_1∩A_2∩…∩A_5 = ¬MF} m_1(A_1)·m_2(A_2)·…·m_5(A_5)
m(Θ) = (1/K)·m_1(Θ)·m_2(Θ)·…·m_5(Θ)

where K = Σ_{A_1∩A_2∩…∩A_5 ≠ ∅} m_1(A_1)·m_2(A_2)·…·m_5(A_5) is the normalization factor; each A_i takes the value MF, ¬MF or Θ, with i = 1, 2, …, 5; m(MF) represents the probability that the high-speed rail dispatcher is in the fatigue state; m(¬MF) represents the probability that the high-speed rail dispatcher is in the non-fatigue state; and m(Θ) represents the uncertainty remaining after fusion of the characteristic values below the fatigue warning values.
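Steps f2 and f3 can be sketched as follows. The split of the non-uncertain mass in basic_assignment follows the assignment formulas above, and dempster_combine implements the standard Dempster combination rule over the two hypotheses MF (fatigue) and NF (non-fatigue) plus the frame Θ; the function names are illustrative.

```python
from itertools import product

def basic_assignment(q_imf, m_theta):
    """Step f2: split the remaining mass between fatigue (MF) and non-fatigue (NF)."""
    return {"MF": q_imf * (1 - m_theta), "NF": (1 - q_imf) * (1 - m_theta), "Theta": m_theta}

def intersect(a, b):
    """Set intersection over the focal elements MF, NF and Theta = {MF, NF}."""
    if a == "Theta":
        return b
    if b == "Theta":
        return a
    return a if a == b else None  # MF and NF are disjoint

def dempster_combine(masses):
    """Step f3: combine the channels' mass functions with Dempster's rule."""
    combined = {"MF": 0.0, "NF": 0.0, "Theta": 0.0}
    conflict = 0.0
    # enumerate every combination of focal elements, one per channel
    for focal in product(*(m.items() for m in masses)):
        p = 1.0
        for _, v in focal:
            p *= v
        inter = focal[0][0]
        for s, _ in focal[1:]:
            inter = intersect(inter, s)
            if inter is None:
                break
        if inter is None:
            conflict += p        # mass assigned to the empty set
        else:
            combined[inter] += p
    scale = 1.0 - conflict       # renormalize by the factor K
    return {k: v / scale for k, v in combined.items()}
```

With five channels the loop enumerates 3^5 = 243 combinations, which is negligible at run time.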
Step f 4: calculate the belief and plausibility functions of fatigue and of non-fatigue according to the following formulas:

Bel(MF) = m(MF)
Pl(MF) = m(MF) + m(Θ)
Bel(¬MF) = m(¬MF)
Pl(¬MF) = m(¬MF) + m(Θ)

where Bel(MF) represents the belief function of fatigue; Pl(MF) represents the plausibility function of fatigue; Bel(¬MF) represents the belief function of non-fatigue; and Pl(¬MF) represents the plausibility function of non-fatigue.
Step f 5: judge whether the high-speed rail dispatcher is in the fatigue state according to the belief and plausibility functions of fatigue and of non-fatigue.
For example: when Bel(MF) > Pl(¬MF), the high-speed rail dispatcher is judged to be in the fatigue state; when Bel(MF) ≤ Pl(¬MF) and Pl(MF) ≥ Bel(¬MF), the state cannot be determined; and when Pl(MF) < Bel(¬MF), the high-speed rail dispatcher is judged to be in the non-fatigue state.
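Steps f4 and f5 can be sketched as follows, assuming an interval-based decision rule (declare a state only when its belief exceeds the other state's plausibility); the patent's exact decision thresholds may differ.

```python
def belief_plausibility(m):
    """Step f4: belief and plausibility for fatigue (MF) and non-fatigue (NF),
    given fused masses m = {"MF": ..., "NF": ..., "Theta": ...}."""
    return {
        "Bel_MF": m["MF"], "Pl_MF": m["MF"] + m["Theta"],
        "Bel_NF": m["NF"], "Pl_NF": m["NF"] + m["Theta"],
    }

def decide(m):
    """Step f5 sketch with an interval-based rule (an assumption): declare a
    state only when its belief exceeds the other state's plausibility."""
    bp = belief_plausibility(m)
    if bp["Bel_MF"] > bp["Pl_NF"]:
        return "fatigue"
    if bp["Pl_MF"] < bp["Bel_NF"]:
        return "no fatigue"
    return "undetermined"
```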
In some optional embodiments, on the basis of the above embodiments, the method for evaluating fatigue status of a high-speed rail dispatcher may further include:
S130: compare the determined fatigue state with a fatigue state threshold.
S140: if the threshold is exceeded, perform early warning intervention.
By adopting the above technical scheme, the embodiment of the invention takes the daily operation of a high-speed rail dispatcher as the test background, so that the monitoring and evaluation have practical significance; in addition, the embodiment fuses multiple signals for judgment, so that the precision and accuracy of the judgment are higher.
Although the foregoing embodiments describe the steps in the above sequential order, those skilled in the art will understand that, in order to achieve the effect of the present embodiments, the steps may not be executed in such an order, and may be executed simultaneously (in parallel) or in an inverse order, and these simple variations are within the scope of the present invention.
Based on the same technical concept as the method embodiment, the embodiment of the invention also provides a fatigue state evaluation system for the high-speed rail dispatcher. As shown in fig. 2, the system 20 may include at least: an acquisition module 22, an extraction module 24, and a determination module 26. The obtaining module 22 is configured to obtain a heart rate signal, an electrocardiograph signal, an electroencephalogram signal, a facial image signal, and an eye movement feature signal of the high-speed rail dispatcher. The extraction module 24 is configured to extract a feature value below the fatigue alert value based on the heart rate signal, the electrocardiograph signal, the electroencephalogram signal, the facial image signal, and the eye movement feature signal. The determination module 26 is configured to determine the fatigue state of the high-speed rail dispatcher by using a multi-channel data fusion algorithm based on the characteristic value below the fatigue warning value.
In some optional embodiments, on the basis of the above embodiments, the extracting module may specifically include: the device comprises a first extraction unit, a second extraction unit, a third extraction unit, a fourth extraction unit and a fifth extraction unit. The first extraction unit is used for extracting a characteristic value below a fatigue warning value of the heart rate index based on the heart rate signal. The second extraction unit is used for extracting a characteristic value below the fatigue warning value of the electrocardio index based on the electrocardio signal. The third extraction unit is used for extracting a characteristic value below the fatigue warning value of the electroencephalogram index based on the electroencephalogram signal. The fourth extraction unit is used for extracting a characteristic value below the fatigue warning value of the facial characteristic index based on the facial image signal. The fifth extraction unit is used for extracting a characteristic value below the eye movement index fatigue warning value based on the eye movement characteristic signal.
In some optional embodiments, on the basis of the foregoing embodiments, the first extraction unit may specifically include: the device comprises an acquisition unit, a drawing unit and a first extraction subunit. The acquisition unit is used for obtaining a heart rate value based on the heart rate signal. The drawing unit is used for drawing a heart rate change curve based on the heart rate value. The first extraction subunit is used for extracting a characteristic value below the fatigue warning value of the heart rate index based on the heart rate variation curve.
In some optional embodiments, on the basis of the foregoing embodiments, the second extraction unit may specifically include: the device comprises a first filtering unit, a first artifact removing unit, a first transformation unit, a second extraction subunit and a third extraction subunit. The first filtering unit is used for filtering the electrocardiosignal. The first artifact removing unit is used for removing artifacts of the filtered signals. The first transformation unit is used for carrying out time domain to frequency domain transformation on the signal subjected to artifact removal. The second extraction subunit is used for extracting frequency domain features based on the transformation result. And the third extraction subunit is used for extracting a characteristic value below the fatigue warning value of the electrocardio index based on the frequency domain characteristics.
In some optional embodiments, on the basis of the foregoing embodiments, the third extraction unit specifically includes: the second filtering unit, the second artifact removing unit, the second transformation unit, the fourth extraction subunit and the fifth extraction subunit. The second filtering unit is used for filtering the electroencephalogram signals. And the second artifact removing unit is used for removing the artifacts of the filtered signal. The second transformation unit is used for carrying out time domain to frequency domain transformation on the signal subjected to artifact removal. The fourth extraction subunit is configured to extract frequency domain features based on the transformed signal. And the fifth extraction subunit is used for extracting a characteristic value below the fatigue warning value of the electroencephalogram index based on the frequency domain characteristics.
In some optional embodiments, on the basis of the foregoing embodiments, the fourth extraction unit specifically includes: the device comprises a first processing unit, a first determining unit, a second determining unit and a sixth extracting sub-unit. The first processing unit is used for carrying out digital image processing on the face image signal. The first determination unit is configured to determine a facial feature based on the processed result. The second determination unit is used for determining facial fatigue expression characteristics based on the facial features. And the sixth extraction subunit is used for extracting a characteristic value below the fatigue warning value of the facial characteristic index based on the facial fatigue expression characteristics.
In some optional embodiments, on the basis of the foregoing embodiments, the fifth extraction unit specifically includes: a second processing unit, a seventh extraction subunit and an eighth extraction subunit. The second processing unit is used for carrying out digital image processing on the eye movement characteristic signal. And the seventh extraction subunit is used for extracting the eye movement parameters based on the processed result. The eighth extraction subunit is configured to extract, based on the eye movement parameter, a feature value below the eye movement index fatigue warning value.
In some optional embodiments, the determining module may specifically include: a first calculating unit, a second calculating unit, a third calculating unit, a fourth calculating unit and a judging unit. The first calculating unit is used for determining, by an entropy method, the uncertainty probability assignment m_i(Θ) of each channel according to the following formula:

m_i(Θ) = -k[q_iMF·log2(q_iMF) + (1 - q_iMF)·log2(1 - q_iMF)]

where Θ = {MF, ¬MF} is the frame of discernment; MF represents fatigue and ¬MF represents no fatigue; q_iMF represents the probability that the i-th channel judges the high-speed rail dispatcher to be in the fatigue state; 1 - q_iMF represents the probability that the i-th channel judges the high-speed rail dispatcher to be in the non-fatigue state, with i = 1, 2, …, 5; and k represents an adjustment factor, with k ∈ (0, 1). The characteristic value below the heart rate index fatigue warning value, the characteristic value below the electrocardio index fatigue warning value, the characteristic value below the electroencephalogram index fatigue warning value, the characteristic value below the facial characteristic index fatigue warning value and the characteristic value below the eye movement index fatigue warning value are input into channels 1 to 5, respectively. The second calculating unit is used for calculating the basic probability assignment in the fatigue state and the basic probability assignment in the non-fatigue state according to the following formulas:
m_i(MF) = q_iMF·[1 - m_i(Θ)]
m_i(¬MF) = (1 - q_iMF)·[1 - m_i(Θ)]

where m_i(MF) represents the basic probability assignment in the fatigue state and m_i(¬MF) represents the basic probability assignment in the non-fatigue state. The third calculating unit is used for performing multi-channel fusion based on the D-S evidence theory according to the following formulas, calculating the probability that the high-speed rail dispatcher is in the fatigue state, the probability that the high-speed rail dispatcher is in the non-fatigue state, and the uncertainty remaining after fusion of the characteristic values below all fatigue warning values:
m(MF) = (1/K)·Σ_{A_1∩A_2∩…∩A_5 = MF} m_1(A_1)·m_2(A_2)·…·m_5(A_5)
m(¬MF) = (1/K)·Σ_{A_1∩A_2∩…∩A_5 = ¬MF} m_1(A_1)·m_2(A_2)·…·m_5(A_5)
m(Θ) = (1/K)·m_1(Θ)·m_2(Θ)·…·m_5(Θ)

where K = Σ_{A_1∩A_2∩…∩A_5 ≠ ∅} m_1(A_1)·m_2(A_2)·…·m_5(A_5) is the normalization factor; each A_i takes the value MF, ¬MF or Θ, with i = 1, 2, …, 5; m(MF) represents the probability that the high-speed rail dispatcher is in the fatigue state; m(¬MF) represents the probability that the high-speed rail dispatcher is in the non-fatigue state; and m(Θ) represents the uncertainty remaining after fusion of the characteristic values below the fatigue warning values. The fourth calculating unit is used for calculating the belief and plausibility functions of fatigue and of non-fatigue according to the following formulas:
Bel(MF) = m(MF)
Pl(MF) = m(MF) + m(Θ)
Bel(¬MF) = m(¬MF)
Pl(¬MF) = m(¬MF) + m(Θ)

where Bel(MF) represents the belief function of fatigue; Pl(MF) represents the plausibility function of fatigue; Bel(¬MF) represents the belief function of non-fatigue; and Pl(¬MF) represents the plausibility function of non-fatigue. The judging unit is used for judging whether the high-speed rail dispatcher is in the fatigue state according to the belief and plausibility functions of fatigue and of non-fatigue.
In some optional embodiments, on the basis of the embodiment shown in fig. 2, the system for evaluating fatigue status of a high-speed rail dispatcher may further include: the device comprises a comparison unit and an early warning intervention unit. Wherein the comparison unit is configured to compare the determined fatigue state with a fatigue state threshold. The early warning intervention unit is used for carrying out early warning intervention under the condition that the fatigue state exceeds the fatigue state threshold value.
Fig. 3 exemplarily shows a preferred implementation of the high-speed rail dispatcher fatigue state evaluation system provided by the embodiment of the invention. The system comprises a bracelet component, a brain ring component, a camera and an eye movement instrument, which are used for collecting the heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal of the high-speed rail dispatcher. The database is used for storing the heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal of the high-speed rail dispatcher. The processing means is used for executing the operations of the extraction module and the determination module. The intervention component is used for executing the operations of the comparison unit and the early warning intervention unit. The printer is used for printing the output results.
Those skilled in the art will appreciate that the high-speed rail dispatcher fatigue state evaluation system described above may also include other well-known structures, such as processors and memories, wherein the memories include, but are not limited to, random access memory, flash memory, read-only memory, volatile memory, non-volatile memory, serial memory, parallel memory and registers, and the processors include, but are not limited to, CPLD/FPGA, DSP, ARM and MIPS processors; these structures are not shown in fig. 2 in order not to unnecessarily obscure the embodiments of the present disclosure.
It should be understood that the number of individual modules in fig. 2 is merely illustrative. There may be any number of acquisition modules, extraction modules, and determination modules, as desired for an implementation.
It should be noted that: in the fatigue state evaluation system for the high-speed rail dispatcher provided in the above embodiment, only the division of the above functional modules is taken as an example for performing fatigue state evaluation, and in practical application, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the system is divided into different functional modules to complete all or part of the above described functions.
As used herein, the term "module" may refer to a software object or routine that executes on a computing system. The different modules described herein may be implemented as objects or processes that execute on the computing system (e.g., as separate threads). While the systems and methods described herein are preferably implemented in software, implementations in hardware or a combination of software and hardware are also possible and contemplated.
The above system embodiment may be used to implement the above method embodiment, and the technical principle, the technical problems solved, and the technical effects are similar, and it can be clearly understood by those skilled in the art that, for convenience and brevity of description, the specific working process of the above described system may refer to the corresponding process in the foregoing method embodiment, and is not described herein again.
It should be noted that the system embodiment and the method embodiment of the present invention have been described above separately, but the details described for one embodiment may also be applied to another embodiment. The names of the modules and steps involved in the embodiments of the present invention are only for distinguishing the modules or steps, and are not to be construed as unduly limiting the present invention. Those skilled in the art will understand that: modules or steps in embodiments of the present invention may also be broken down or combined. For example, the modules of the above embodiments may be combined into one module, or may be further split into multiple sub-modules.
The technical solutions provided by the embodiments of the present invention are described in detail above. Although specific examples have been employed herein to illustrate the principles and practice of the invention, the foregoing descriptions of embodiments are merely provided to assist in understanding the principles of embodiments of the invention; also, it will be apparent to those skilled in the art that variations may be made in the embodiments and applications of the invention without departing from the spirit and scope of the invention.
It should be noted that the flowcharts and block diagrams referred to herein are not limited to the forms shown herein, and may be divided and/or combined.
It should also be noted that: the numerals and text in the figures are only used to illustrate the invention more clearly and are not to be considered as an undue limitation of the scope of the invention.
The terms "comprises," "comprising," or any other similar term are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be noted that the above-mentioned embodiments illustrate rather than limit the invention, and that those skilled in the art will be able to design alternative embodiments without departing from the scope of the appended claims. In the claims, any reference signs placed between parentheses shall not be construed as limiting the claim. The word "comprising" does not exclude the presence of elements or steps not listed in a claim. The word "a" or "an" preceding an element does not exclude the presence of a plurality of such elements. The invention may be implemented by means of hardware comprising several distinct elements, and by means of a suitably programmed PC. In the unit claims enumerating several means, several of these means may be embodied by one and the same item of hardware. The usage of the words first, second and third, etcetera do not indicate any ordering. These words may be interpreted as names.
The various steps of the present invention may be implemented in a general purpose computing device, for example, they may be centralized on a single computing device, such as: personal computers, server computers, hand-held or portable devices, tablet-type devices or multi-processor apparatus, which may be distributed over a network of computing devices, may perform the steps shown or described in a different order than those shown or described herein, or may be implemented as separate integrated circuit modules, or may be implemented as a single integrated circuit module from multiple modules or steps. Thus, the present invention is not limited to any specific hardware or software or combination thereof.
The methods provided by the present invention may be implemented using programmable logic devices or as computer program software or program modules (including routines, programs, objects, components, data structures, etc.) including performing particular tasks or implementing particular abstract data types, such as a computer program product which is executed to cause a computer to perform the methods described herein. The computer program product includes a computer-readable storage medium having computer program logic or code portions embodied in the medium for performing the method. The computer-readable storage medium may be a built-in medium installed in the computer or a removable medium detachable from the computer main body (e.g., a storage device using a hot-plug technology). The built-in medium includes, but is not limited to, rewritable non-volatile memory such as: RAM, ROM, flash memory, and hard disk. The removable media include, but are not limited to: optical storage media (e.g., CD-ROMs and DVDs), magneto-optical storage media (e.g., MOs), magnetic storage media (e.g., magnetic tapes or removable disks), media with built-in rewritable non-volatile memory (e.g., memory cards), and media with built-in ROMs (e.g., ROM cartridges).
The present invention is not limited to the above-described embodiments, and any variations, modifications, or alterations that may occur to one skilled in the art without departing from the spirit of the invention fall within the scope of the invention.

Claims (16)

1. A method for evaluating fatigue state of a high-speed rail dispatcher, which is characterized by at least comprising the following steps:
acquiring a heart rate signal, an electrocardiosignal, an electroencephalogram signal, a facial image signal and an eye movement characteristic signal of the high-speed rail dispatcher;
extracting a characteristic value below a fatigue alert value based on the heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal;
and determining the fatigue state of the high-speed rail dispatcher by utilizing a multi-channel data fusion algorithm based on the characteristic value below the fatigue warning value.
2. The method according to claim 1, wherein the extracting feature values below a fatigue alert value based on the heart rate signal, the cardiac signal, the electroencephalogram signal, the facial image signal, and the eye movement feature signal specifically comprises:
extracting a characteristic value below a heart rate index fatigue warning value based on the heart rate signal;
extracting a characteristic value below the fatigue warning value of the electrocardio index based on the electrocardio signal;
extracting characteristic values below the fatigue alert value of the electroencephalogram index based on the electroencephalogram signals;
extracting feature values below a fatigue alert value of a facial feature index based on the facial image signal;
and extracting a characteristic value below the eye movement index fatigue warning value based on the eye movement characteristic signal.
3. The method according to claim 2, wherein extracting, based on the heart rate signal, a characteristic value below a heart rate indicator fatigue alert value specifically comprises:
obtaining a heart rate value based on the heart rate signal;
drawing a heart rate change curve based on the heart rate value;
and extracting characteristic values below the fatigue warning value of the heart rate index based on the heart rate change curve.
4. The method according to claim 2, wherein the extracting, based on the electrocardiographic signal, a feature value below an electrocardiographic index fatigue alert value specifically comprises:
filtering the electrocardiosignals;
performing artifact removal processing on the filtered signal;
carrying out time domain to frequency domain transformation on the signal without the artifacts;
extracting frequency domain features based on the transformation result;
and extracting a characteristic value below the fatigue warning value of the electrocardio index based on the frequency domain characteristics.
5. The method according to claim 2, wherein the extracting, based on the electroencephalogram signal, a feature value below an electroencephalogram index fatigue alert value specifically comprises:
filtering the electroencephalogram signals;
performing artifact removal processing on the filtered signal;
carrying out time domain to frequency domain transformation on the signal without the artifacts;
extracting frequency domain features based on the transformed signal;
and extracting characteristic values below the fatigue warning value of the electroencephalogram index based on the frequency domain characteristics.
6. The method according to claim 2, wherein the extracting feature values below a facial feature index fatigue alert value based on the facial image signal specifically comprises:
performing digital image processing on the facial image signal;
determining facial features based on the processed results;
determining facial fatigue expression features based on the facial features;
and extracting a characteristic value below the fatigue warning value of the facial characteristic index based on the facial fatigue expression characteristic.
7. The method according to claim 2, wherein the extracting, based on the eye movement characteristic signal, characteristic values below an eye movement indicator fatigue alert value specifically comprises:
performing digital image processing on the eye movement characteristic signal;
extracting eye movement parameters based on the processed result;
and extracting characteristic values below the eye movement index fatigue warning value based on the eye movement parameters.
8. The method of claim 1, further comprising:
comparing the determined fatigue state to a fatigue state threshold;
and if the threshold value is exceeded, performing early warning intervention.
9. A high-speed rail dispatcher fatigue state assessment system, characterized in that the system at least comprises:
the acquisition module is used for acquiring a heart rate signal, an electrocardiosignal, an electroencephalogram signal, a facial image signal and an eye movement characteristic signal of the high-speed rail dispatcher;
the extraction module is used for extracting a characteristic value below a fatigue warning value based on the heart rate signal, the electrocardiosignal, the electroencephalogram signal, the facial image signal and the eye movement characteristic signal;
and the determining module is used for determining the fatigue state of the high-speed rail dispatcher by utilizing a multi-channel data fusion algorithm based on the characteristic value below the fatigue warning value.
10. The system according to claim 9, wherein the extraction module specifically comprises:
the first extraction unit is used for extracting a characteristic value below a fatigue warning value of a heart rate index based on the heart rate signal;
the second extraction unit is used for extracting a characteristic value below the fatigue warning value of the electrocardio index based on the electrocardio signal;
the third extraction unit is used for extracting a characteristic value below the fatigue warning value of the electroencephalogram index based on the electroencephalogram signal;
a fourth extraction unit configured to extract a feature value of a facial feature index fatigue alert value or less based on the facial image signal;
and a fifth extraction unit, configured to extract a feature value below the eye movement index fatigue warning value based on the eye movement feature signal.
11. The system according to claim 10, wherein the first extraction unit specifically comprises:
the acquisition unit is used for obtaining a heart rate value based on the heart rate signal;
the drawing unit is used for drawing a heart rate change curve based on the heart rate value;
and the first extraction subunit is used for extracting a characteristic value below the fatigue warning value of the heart rate index based on the heart rate change curve.
12. The system according to claim 10, wherein the second extraction unit comprises:
a first filtering unit, configured to filter the ECG signal;
a first artifact removal unit, configured to remove artifacts from the filtered signal;
a first transformation unit, configured to transform the artifact-free signal from the time domain to the frequency domain;
a second extraction subunit, configured to extract frequency domain features based on the transformation result;
and a third extraction subunit, configured to extract a feature value below the ECG index fatigue warning value based on the frequency domain features.
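The claim names a time-to-frequency transform and frequency-domain features without specifying them; a minimal sketch, assuming the standard heart-rate-variability LF/HF power ratio computed from an evenly resampled RR-interval series via a crude FFT periodogram (the bands are the conventional HRV bands; the synthetic input is illustrative):

```python
import numpy as np

def lf_hf_ratio(rr_resampled, fs=4.0):
    """LF/HF power ratio of an evenly resampled RR-interval series, a
    standard HRV frequency-domain feature often associated with fatigue.
    fs is the resampling rate in Hz."""
    x = rr_resampled - np.mean(rr_resampled)          # remove DC component
    psd = np.abs(np.fft.rfft(x)) ** 2                 # crude periodogram
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    lf = psd[(freqs >= 0.04) & (freqs < 0.15)].sum()  # low-frequency band
    hf = psd[(freqs >= 0.15) & (freqs < 0.40)].sum()  # high-frequency band
    return lf / hf

# Synthetic RR series dominated by a 0.1 Hz (LF) oscillation
t = np.arange(0, 300, 0.25)
rr = 0.8 + 0.05 * np.sin(2 * np.pi * 0.1 * t) + 0.01 * np.sin(2 * np.pi * 0.25 * t)
ratio = lf_hf_ratio(rr, fs=4.0)
```

A production system would use a windowed estimator such as Welch's method rather than a raw periodogram, but the band definitions are the same.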
13. The system according to claim 10, wherein the third extraction unit specifically comprises:
a second filtering unit, configured to filter the EEG signal;
a second artifact removal unit, configured to remove artifacts from the filtered signal;
a second transformation unit, configured to transform the artifact-free signal from the time domain to the frequency domain;
a fourth extraction subunit, configured to extract frequency domain features based on the transformed signal;
and a fifth extraction subunit, configured to extract a feature value below the EEG index fatigue warning value based on the frequency domain features.
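The claim again leaves the EEG frequency-domain feature unspecified; a minimal sketch, assuming the widely used (theta + alpha) / beta band-power ratio, which tends to rise with drowsiness (index choice, sampling rate and synthetic signal are all illustrative):

```python
import numpy as np

def eeg_fatigue_ratio(eeg, fs=250.0):
    """(theta + alpha) / beta band-power ratio, a common EEG fatigue index.
    The patent does not name the exact index; this one is illustrative."""
    x = eeg - np.mean(eeg)
    psd = np.abs(np.fft.rfft(x)) ** 2
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)

    def band(lo, hi):
        return psd[(f >= lo) & (f < hi)].sum()

    theta, alpha, beta = band(4, 8), band(8, 13), band(13, 30)
    return (theta + alpha) / beta

t = np.arange(0, 10, 1 / 250.0)
# Synthetic signal dominated by alpha (10 Hz) with a weak beta (20 Hz) component
eeg = 2.0 * np.sin(2 * np.pi * 10 * t) + 0.5 * np.sin(2 * np.pi * 20 * t)
r = eeg_fatigue_ratio(eeg)
```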
14. The system according to claim 10, wherein the fourth extraction unit specifically comprises:
a first processing unit, configured to perform digital image processing on the facial image signal;
a first determination unit, configured to determine facial features based on the processing result;
a second determination unit, configured to determine facial fatigue expression features based on the facial features;
and a sixth extraction subunit, configured to extract a feature value below the facial feature index fatigue warning value based on the facial fatigue expression features.
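One common facial fatigue expression feature is yawning, often detected via the mouth aspect ratio (MAR) computed from facial landmarks. A minimal sketch under that assumption; the landmark keys and coordinates are hypothetical, and a real system would obtain landmarks from a face-landmark model:

```python
import math

def mouth_aspect_ratio(landmarks):
    """Mouth aspect ratio (MAR) from four mouth landmarks given as (x, y);
    a large MAR sustained over several frames is a common yawn/fatigue cue.
    Landmark keys here are illustrative, not a real detector's output."""
    vertical = math.dist(landmarks["top"], landmarks["bottom"])
    horizontal = math.dist(landmarks["left"], landmarks["right"])
    return vertical / horizontal

mar_closed = mouth_aspect_ratio(
    {"top": (0, 1), "bottom": (0, -1), "left": (-5, 0), "right": (5, 0)}
)
mar_yawn = mouth_aspect_ratio(
    {"top": (0, 4), "bottom": (0, -4), "left": (-5, 0), "right": (5, 0)}
)
```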
15. The system according to claim 10, wherein the fifth extraction unit specifically comprises:
a second processing unit, configured to perform digital image processing on the eye movement feature signal;
a seventh extraction subunit, configured to extract eye movement parameters based on the processing result;
and an eighth extraction subunit, configured to extract a feature value below the eye movement index fatigue warning value based on the eye movement parameters.
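The claim does not name the eye movement parameters; a minimal sketch, assuming PERCLOS (percentage of eye closure), one of the best-validated ocular fatigue measures, computed from per-frame eye-state flags:

```python
def perclos(eye_closed_flags):
    """PERCLOS: fraction of video frames in which the eyes are (mostly)
    closed. Input is a per-frame sequence of 0 (open) / 1 (closed) flags."""
    return sum(eye_closed_flags) / len(eye_closed_flags)

p = perclos([0, 0, 1, 1, 0, 1, 0, 0, 0, 1])  # 4 closed frames out of 10
```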
16. The system of claim 9, further comprising:
a comparison unit, configured to compare the determined fatigue state with a fatigue state threshold;
and an early warning intervention unit, configured to trigger early warning intervention when the fatigue state exceeds the fatigue state threshold.
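The threshold comparison of claim 16 can be sketched in a few lines; the 0.6 threshold and the returned labels are illustrative, since the patent specifies only the comparison and the triggering of intervention:

```python
def check_fatigue(fatigue_state: float, threshold: float = 0.6) -> str:
    """Compare the fused fatigue-state score against a fatigue state
    threshold (value illustrative) and decide whether to trigger
    early warning intervention."""
    return "intervene" if fatigue_state > threshold else "ok"

decision = check_fatigue(0.72, threshold=0.6)
```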
CN201710397928.1A 2017-05-27 2017-05-27 High-speed rail dispatcher fatigue state evaluation method and system Active CN107233103B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201710397928.1A CN107233103B (en) 2017-05-27 2017-05-27 High-speed rail dispatcher fatigue state evaluation method and system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201710397928.1A CN107233103B (en) 2017-05-27 2017-05-27 High-speed rail dispatcher fatigue state evaluation method and system

Publications (2)

Publication Number Publication Date
CN107233103A CN107233103A (en) 2017-10-10
CN107233103B true CN107233103B (en) 2020-11-20

Family

ID=59985289

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201710397928.1A Active CN107233103B (en) 2017-05-27 2017-05-27 High-speed rail dispatcher fatigue state evaluation method and system

Country Status (1)

Country Link
CN (1) CN107233103B (en)

Families Citing this family (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108852380A (en) * 2018-07-09 2018-11-23 南京邮电大学 Fatigue, mood analysis method based on ECG signal
CN109431498A (en) * 2018-11-14 2019-03-08 天津大学 Wearable multi-modal physiological driver's condition monitoring system
US11813060B2 (en) 2020-06-29 2023-11-14 Lear Corporation System and method for biometric evoked response monitoring and feedback
CN112450933B (en) * 2020-11-10 2022-09-20 东北电力大学 Driving fatigue monitoring method based on multiple types of characteristics of human body
CN113509189A (en) * 2021-07-07 2021-10-19 科大讯飞股份有限公司 Learning state monitoring method and related equipment thereof
CN113951903B (en) * 2021-10-29 2022-07-08 西南交通大学 High-speed railway dispatcher overload state identification method based on electroencephalogram data determination
CN114081491B (en) * 2021-11-15 2023-04-25 西南交通大学 Fatigue prediction method for high-speed railway dispatcher based on electroencephalogram time sequence data measurement
CN115359545B (en) * 2022-10-19 2023-01-24 深圳海星智驾科技有限公司 Staff fatigue detection method and device, electronic equipment and storage medium

Family Cites Families (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN101090482B (en) * 2006-06-13 2010-09-08 唐琎 Driver fatigue monitoring system and method based on image process and information mixing technology
CN101540090B (en) * 2009-04-14 2011-06-15 华南理工大学 Driver fatigue monitoring method based on multivariate information fusion
CN101872171B (en) * 2009-04-24 2012-06-27 中国农业大学 Driver fatigue state recognition method and system based on information fusion
US20140375785A1 (en) * 2013-06-19 2014-12-25 Raytheon Company Imaging-based monitoring of stress and fatigue
EP3158392B1 (en) * 2014-06-23 2020-03-04 Honda Motor Co., Ltd. System and method for responding to driver state
CN104952210B (en) * 2015-05-15 2018-01-05 南京邮电大学 A kind of fatigue driving state detecting system and method based on decision making level data fusion
CN106580349B (en) * 2016-12-07 2020-01-21 中国民用航空总局第二研究所 Controller fatigue detection method and device and controller fatigue response method and device

Also Published As

Publication number Publication date
CN107233103A (en) 2017-10-10

Similar Documents

Publication Publication Date Title
CN107233103B (en) High-speed rail dispatcher fatigue state evaluation method and system
Clayson et al. How does noise affect amplitude and latency measurement of event‐related potentials (ERPs)? A methodological critique and simulation study
Tiwari et al. Automated diagnosis of epilepsy using key-point-based local binary pattern of EEG signals
Zhang et al. Recognition of mental workload levels under complex human–machine collaboration by using physiological features and adaptive support vector machines
Xie et al. Wavelet-based sparse functional linear model with applications to EEGs seizure detection and epilepsy diagnosis
EP3876191B1 (en) Estimator generation device, monitoring device, estimator generation method, estimator generation program
Boroujeni et al. Diagnosis of attention deficit hyperactivity disorder using non‐linear analysis of the EEG signal
EP3143933B1 (en) Method for estimating a mental state, in particular a workload, and related apparatus
KR20120046230A (en) State detecting device, state detecting method, and program
US11561959B2 (en) Method and system for automatic anomaly detection in data
CN103690160B (en) Electroencephalogram feature extraction method based on non-Gaussian time sequence model
WO2019040675A1 (en) System and method for racing data analysis using telemetry data and wearable sensor data
EP3359022A1 (en) Method and system for pre-processing of an eeg signal for cognitive load measurement
Akbari et al. Recognizing seizure using Poincaré plot of EEG signals and graphical features in DWT domain
Sinha et al. Statistical feature analysis for EEG baseline classification: Eyes Open vs Eyes Closed
Karpov et al. Extreme value theory inspires explainable machine learning approach for seizure detection
Thilagaraj et al. Identification of Drivers Drowsiness Based on Features Extracted from EEG Signal Using SVM Classifier
JP6458387B2 (en) Bioelectric noise identification system and bioelectric noise removal system
JP2003310564A (en) Automatic brain wave analyzing apparatus and method
Goshvarpour et al. Classification of heart rate signals during meditation using Lyapunov exponents and entropy
JP2012161379A5 (en)
CN111938594B (en) Epilepsia electroencephalogram signal anomaly monitoring system and equipment based on hierarchical graph model
Mesin et al. Prognostic value of EEG indexes for the Glasgow outcome scale of comatose patients in the acute phase
Dinh et al. Eeg peak detection in cognitive conflict processing using summit navigator and clustering-based ranking
Wang et al. Mental workload recognition by combining wavelet packet transform and kernel spectral regression techniques

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant