CN113243919A - Train driver fatigue state identification and monitoring system - Google Patents

Train driver fatigue state identification and monitoring system

Info

Publication number
CN113243919A
CN113243919A (application CN202110357293.9A)
Authority
CN
China
Prior art keywords
fatigue
train driver
data
ecg
mouth
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202110357293.9A
Other languages
Chinese (zh)
Inventor
于颖慧
丁泓九
朱海燕
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai University of Engineering Science
Original Assignee
Shanghai University of Engineering Science
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai University of Engineering Science filed Critical Shanghai University of Engineering Science
Priority to CN202110357293.9A priority Critical patent/CN113243919A/en
Publication of CN113243919A publication Critical patent/CN113243919A/en
Pending legal-status Critical Current

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • A61B5/0033 Features or image-related aspects of imaging apparatus classified in A61B5/00, e.g. for MRI, optical tomography or impedance tomography apparatus; arrangements of imaging apparatus in a room
    • A61B5/004 Features or image-related aspects of imaging apparatus classified in A61B5/00 adapted for image acquisition of a particular organ or body part
    • A61B5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/72 Signal processing specially adapted for physiological signals or for diagnostic purposes
    • A61B5/7203 Signal processing specially adapted for physiological signals or for diagnostic purposes for noise prevention, reduction or removal
    • A61B5/7235 Details of waveform analysis
    • A61B5/7264 Classification of physiological signals or data, e.g. using neural networks, statistical classifiers, expert systems or fuzzy systems
    • A61B5/7267 Classification of physiological signals or data involving training the classification device
    • A61B5/74 Details of notification to user or communication with user or patient; user input means
    • A61B5/7405 Details of notification to user or communication with user or patient using sound
    • A61B5/746 Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms

Abstract

The invention relates to a train driver fatigue state identification and monitoring system, comprising: a monitoring terminal for acquiring an ECG signal, a face characteristic image and PERCLOS data of a train driver; a cloud platform for identifying the fatigue state of the train driver according to the acquired signals and for storing, displaying and updating the fatigue state result and issuing reminders about it; and a remote client that obtains real-time fatigue state information and historical data by logging in to a front-end page. Compared with the prior art, the invention improves the accuracy and reliability of the fatigue identification result.

Description

Train driver fatigue state identification and monitoring system
Technical Field
The invention relates to the technical field of rail transit train driving safety, in particular to a train driver fatigue state recognition and monitoring system.
Background
As operators at a key post in metro systems, train drivers' perception and judgment of on-site conditions and their improvised responses in emergencies directly affect the running safety of the train, while prolonged, repetitive, monotonous driving and an unavoidable shift system expose subway operators to safety risks caused by fatigued driving. To address this problem, developing a train driver fatigue monitoring system is of great significance for accident prevention. However, most existing methods rely on only a single information source, such as the driver's facial expression, behavioural characteristics or physiological characteristics, and therefore have limitations in accuracy and reliability: a wider mouth opening does not necessarily imply fatigue and can be re-evaluated against heart rate variability and eye opening degree, and the acquisition of facial features depends strongly on the driving environment.
Disclosure of Invention
The invention aims to overcome the defects of the prior art by providing a train driver fatigue state identification and monitoring system which fuses heart rate variability, eye opening degree and mouth opening degree and uses a support vector machine to construct a multi-feature fatigue identification model, markedly improving the accuracy and reliability of the fatigue identification result.
The purpose of the invention can be realized by the following technical scheme:
a train driver fatigue state identification and monitoring system, the system includes:
the monitoring terminal is used for acquiring an ECG signal, a face characteristic image and PERCLOS data of a train driver;
the cloud platform is used for identifying the fatigue state of the train driver according to the acquired signals, and for storing, displaying and updating the fatigue state result and issuing reminders about it;
and the remote client acquires real-time fatigue state information and historical data through logging in a front-end page.
The cloud platform comprises:
the back-end processing module is used for processing the received ECG signal, face characteristic image and PERCLOS data, extracting the fatigue indexes of the ECG, the eyes and the mouth respectively based on ECG technology and face recognition technology, constructing a multi-feature fatigue recognition model with a support vector machine (SVM), substituting the fatigue indexes of the ECG, eyes and mouth measured in real time into the model to obtain a prediction result, and sending the obtained prediction result to a database;
the database is used for acquiring and storing train information and driver information and storing a prediction result received from the cloud platform;
and the front-end page interacts with the database in real time and provides fatigue state display, reminders and historical data query channels for users who meet the login requirements.
In the back-end processing module, the process of extracting the fatigue index of the ECG comprises the following steps:
11) acquiring ECG data according to the acquired ECG signal of the train driver;
12) performing wavelet noise reduction and R-peak detection on the ECG data in sequence;
13) calculating time domain indexes from the processed ECG data;
14) and inputting the time domain indexes into the multi-feature fatigue recognition model constructed with the SVM; a minimal sketch of steps 12) and 13) is given below.
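By way of illustration only, the following Python sketch outlines steps 12) and 13), assuming a raw single-lead ECG array sampled at fs Hz. It uses PyWavelets for the wavelet noise reduction and scipy.signal.find_peaks for the R-peak detection; the wavelet family, threshold rule and peak-detection parameters are illustrative assumptions rather than values prescribed by the disclosure.

```python
import numpy as np
import pywt
from scipy.signal import find_peaks

def wavelet_denoise(ecg, wavelet="db6", level=4):
    """Soft-threshold wavelet denoising of a 1-D ECG signal (step 12)."""
    coeffs = pywt.wavedec(ecg, wavelet, level=level)
    # Universal threshold estimated from the finest-scale detail coefficients.
    sigma = np.median(np.abs(coeffs[-1])) / 0.6745
    thr = sigma * np.sqrt(2 * np.log(len(ecg)))
    coeffs[1:] = [pywt.threshold(c, thr, mode="soft") for c in coeffs[1:]]
    return pywt.waverec(coeffs, wavelet)[: len(ecg)]

def detect_r_peaks(ecg, fs):
    """Locate R peaks (step 12) and return RR intervals in seconds for step 13)."""
    peaks, _ = find_peaks(
        ecg,
        distance=int(0.4 * fs),                      # refractory period of roughly 0.4 s
        height=np.mean(ecg) + 0.5 * np.std(ecg),     # crude amplitude gate, illustrative only
    )
    rr = np.diff(peaks) / fs                          # RR intervals in seconds
    return peaks, rr
```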
The process of extracting fatigue indexes of eyes and mouths comprises the following steps:
21) extracting human face characteristic points from the collected human face characteristic images;
22) calculating the opening degree of eyes and the opening degree of mouth according to the extracted characteristic points of the face;
23) and inputting the calculated eye opening and mouth opening into a multi-feature fatigue recognition model constructed by using an SVM.
The multi-feature fatigue recognition model constructed with the SVM is trained using pre-stored data, and its kernel function is a Gaussian kernel function. The pre-stored data comprise the RMSSD, SDSD, SDNN, Mean_NN, Std_hr, Mean_hr, eye aspect ratio and mouth aspect ratio measured by the monitoring terminal.
The monitoring terminal includes:
the ECG signal monitoring terminal is used for acquiring ECG data of a train driver;
the human face index monitoring terminal is used for acquiring a human face characteristic image of a train driver;
the eye tracker is used for collecting PERCLOS data of a train driver;
and the wireless transmission module transmits each acquired data to the cloud platform for processing.
The front end page includes:
the login module is used for providing a user login platform, and opening the display module, the reminding module and the query module for users meeting login requirements;
the display module is used for displaying various information stored in the database in real time;
the reminding module is used for raising an alarm according to the fatigue state determined by the cloud platform;
and the query module is used for providing a query channel of various historical data.
Further, the ECG signal monitoring terminal adopts an intelligent bracelet, and the human face index monitoring terminal adopts a network camera.
Further, the database adopts a MySQL database.
Compared with the prior art, the invention combines multiple information sources based on electrocardiography and face image recognition and performs fatigue recognition using heart rate variability, eye opening degree and mouth opening degree as fatigue indexes; unlike conventional fatigue recognition based on a single information source, this improves the accuracy and reliability of the fatigue recognition result. The fatigue state recognition and monitoring system can monitor the fatigue state of a train driver in real time and give an audible early warning, thereby raising the safety level of subway operation.
Drawings
FIG. 1 is a schematic block diagram of the train driver fatigue state recognition and monitoring system in an embodiment;
FIG. 2 is a schematic flow chart of the fatigue state identification performed by the cloud platform in the embodiment;
FIG. 3 is a network topology diagram of the train driver fatigue state identification and monitoring system in the embodiment;
FIG. 4 is a schematic diagram of the facial feature points in the embodiment.
Detailed Description
The invention is described in detail below with reference to the figures and specific embodiments. It is to be understood that the embodiments described are only a few embodiments of the present invention, and not all embodiments. All other embodiments, which can be obtained by a person skilled in the art without any inventive step based on the embodiments of the present invention, shall fall within the scope of protection of the present invention.
Examples
As shown in FIG. 1, the present invention relates to a train driver fatigue state recognition and monitoring system. By collecting the electrocardiographic characteristics (RMSSD, SDSD, SDNN, Mean_NN, Std_hr, Mean_hr) through the ECG monitoring terminal, collecting the facial characteristics (eye aspect ratio and mouth aspect ratio) through a network camera, and using these characteristics as the feature values of a support vector machine for fatigue prediction, the system overcomes the limitations in accuracy and reliability of methods based on a single information source: for example, a larger degree of mouth opening does not necessarily mean fatigue and can be re-evaluated against heart rate variability and eye opening degree, and the acquisition of facial characteristics also depends on the driving environment (e.g., day, night, weather). Specifically, the system comprises a monitoring terminal, a cloud platform and a remote client.
The monitoring terminal is used for acquiring the ECG, eye and mouth characteristic data.
A cloud platform comprising:
the back-end processing module is used for processing the received data, extracting the fatigue indexes of the ECG, the eyes and the mouth respectively based on electrocardiographic (ECG) and face recognition technology, substituting the fatigue indexes of the ECG, eyes and mouth measured in real time into a multi-feature fatigue recognition model constructed with an SVM to obtain a prediction result, and sending the obtained result to a database;
the database is used for acquiring and storing train information and driver information and storing data received from the cloud platform;
and the front-end page interacts with the database in real time and provides fatigue state display, reminders and historical data query channels for users who meet the login requirements.
And the remote client logs in a front-end page and acquires the fatigue state information and the historical data in real time.
The monitoring terminal of the invention comprises:
the ECG index monitoring terminal is used for acquiring ECG data of a train driver;
the eye tracker is used for acquiring PERCLOS data of a train driver, wherein PERCLOS represents the time percentage of eye closure in unit time;
the human face index monitoring terminal is used for acquiring a human face characteristic image of a train driver;
and the wireless transmission module is used for transmitting each acquired data to the cloud platform for processing.
The front end page of the present invention includes:
the login module is used for providing a user login platform and opening the display module, the reminding module and the query module to users who meet the login requirements;
the display module is used for displaying various information stored in the database in real time;
the reminding module is used for raising an alarm according to the fatigue state determined by the cloud platform;
and the query module is used for querying various types of historical data.
Specifically, as shown in FIG. 3, a plurality of monitoring terminals can be provided, respectively used for acquiring the electrocardiosignals and facial images of different train drivers, performing data processing and uploading the data to the cloud platform. The cloud platform performs fatigue identification from the electrocardiographic data and the facial images, displaying the fatigue monitoring result on the web platform (the display module of the front-end page) on the one hand and warning fatigued personnel by voice (the reminding module of the front-end page) on the other. The remote client can obtain real-time data for every on-duty person on the front-end page, or retrieve historical data, through devices such as a desktop PC or a smart phone. The monitoring terminal mainly comprises an electrocardiographic sensor, a single-chip microcomputer and a network camera; the cloud platform is supported by a remote server and consists of a database and a web platform.
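For illustration only, the following Python sketch shows how a monitoring terminal's wireless transmission module might package one second of feature data and upload it to the cloud platform over HTTP. The endpoint URL, field names and JSON payload format are assumptions introduced here for the example; the patent itself only requires wireless transmission of the acquired data.

```python
import requests  # hypothetical transport; any wireless upload mechanism would serve

CLOUD_ENDPOINT = "http://cloud-platform.example/api/upload"  # placeholder URL, not from the patent

def upload_sample(driver_id, ecg_features, eye_ar, mouth_ar, timestamp):
    """Send one second of monitoring-terminal data to the cloud platform."""
    payload = {
        "driver_id": driver_id,
        "timestamp": timestamp,
        "ecg": ecg_features,            # dict: RMSSD, SDSD, SDNN, Mean_NN, Std_hr, Mean_hr
        "eye_aspect_ratio": eye_ar,
        "mouth_aspect_ratio": mouth_ar,
    }
    resp = requests.post(CLOUD_ENDPOINT, json=payload, timeout=5)
    resp.raise_for_status()             # surface transmission errors to the terminal
    return resp.json()
```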
In the processing of the cloud platform, firstly, ECG fatigue indexes are extracted and fatigue indexes of eyes and mouths are extracted. Wherein:
the extraction process of the fatigue index of the ECG sequentially comprises the steps of acquiring ECG data, performing wavelet denoising, detecting R peak data, calculating a time domain index and identifying fatigue. The ECG monitoring terminal comprises an electrocardio single chip microcomputer, an electrocardio sensor and a wireless transmission module, and in actual use, the ECG monitoring terminal can particularly select an intelligent bracelet to collect ECG data. Acquiring ECG data by using a lower computer background of an electrocardio monitoring program, and realizing filtering processing on an upper computer background; and analyzing the heart rate variability characteristics around the variation of the RR interval by a statistical discrete trend analysis method, and calculating time domain indexes of RMSSD, SDSD, SDNN, Mean _ NN, Std _ hr and Mean _ hr, namely the electrocardio indexes. The meanings of the indices are shown in the following table.
TABLE 1 Definitions of the electrocardiographic time domain indexes
RMSSD: root mean square of the successive differences between adjacent RR intervals
SDSD: standard deviation of the successive differences between adjacent RR intervals
SDNN: standard deviation of all normal-to-normal (NN) intervals
Mean_NN: mean of the NN intervals
Std_hr: standard deviation of the instantaneous heart rate
Mean_hr: mean of the instantaneous heart rate
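The Table 1 indexes can be computed directly from the RR-interval series produced by the R-peak detection step, as in the following sketch (a minimal illustration assuming rr is an array of RR intervals in seconds; the choice of milliseconds and of sample standard deviations is an assumption):

```python
import numpy as np

def time_domain_indices(rr):
    """Compute the Table 1 time domain indexes from RR intervals given in seconds."""
    rr_ms = np.asarray(rr, dtype=float) * 1000.0   # work in milliseconds
    diff = np.diff(rr_ms)                          # successive differences between adjacent intervals
    hr = 60000.0 / rr_ms                           # instantaneous heart rate in beats per minute
    return {
        "RMSSD":   float(np.sqrt(np.mean(diff ** 2))),
        "SDSD":    float(np.std(diff, ddof=1)),
        "SDNN":    float(np.std(rr_ms, ddof=1)),
        "Mean_NN": float(np.mean(rr_ms)),
        "Std_hr":  float(np.std(hr, ddof=1)),
        "Mean_hr": float(np.mean(hr)),
    }
```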
The extraction of the eye and mouth fatigue indexes comprises, in sequence, acquiring face image data, extracting facial feature points, calculating the eye aspect ratio and calculating the mouth aspect ratio. The face image data are collected by a network camera, and the PERCLOS data are measured by an eye tracker. A face recognition method is applied to the collected face images to identify the face and extract its feature points, preferably 68 facial feature points. The facial feature points are then used to locate the upper and lower eyelids and the upper and lower lips and to determine their key points, after which the eye aspect ratio and the mouth aspect ratio, i.e. the eye and mouth fatigue indexes, are calculated with the hypot function. When fatigue recognition is performed with facial features, the face must first be located. The face can be located through the 68 feature points: the dlib library is used to recognise the face in the image with its face detector, and the 68 facial feature points, shown in FIG. 4, are extracted.
1. Key points of the upper and lower eyelids and lips:
After the face is located, the eyes and the mouth need to be located. The eyes can be positioned from the upper and lower eyelids, and the mouth from the upper and lower lips. Therefore, the positions of feature points 37-42 and 43-48 in the figure above are obtained and the key points of the upper and lower eyelids are calculated; likewise, the positions of feature points 49-62 are obtained and the key points of the upper and lower lips are calculated.
2. Calculate eye and mouth aspect ratio:
after positioning the eyes and mouth, it is necessary to acquire the eye and mouth state. And obtaining coordinates of two sides of eyes on the face, coordinates of upper and lower eyelids of the eyes, coordinates of two sides of a mouth and coordinates of upper and lower mouth covers of the mouth by using points on the facial makeup feature map, and simultaneously calculating coordinates of a middle point. And connecting the left eye and the right eye with the upper line and the lower line, connecting the left mouth and the upper line with the lines, and calculating the length of the line outgoing section by using a hypot function to obtain the length-width ratio of the eyes and the length-width ratio of the mouth.
The multi-feature fatigue recognition model is constructed with the SVM, and the ECG, eye and mouth fatigue indexes measured in real time are substituted into the model to obtain a prediction result; the flow is shown in FIG. 2. The model is trained with pre-stored data, and the chosen kernel function is a Gaussian kernel function. The pre-stored data are the RMSSD, SDSD, SDNN, Mean_NN, Std_hr and Mean_hr measured by the electrocardiographic hardware and the eye aspect ratio and mouth aspect ratio measured by the network camera. Specifically, as shown in FIG. 2, the previously collected RMSSD, SDSD, SDNN, Mean_NN, Std_hr, Mean_hr, eye aspect ratio and mouth aspect ratio are used as the feature values of the support vector machine, and PERCLOS is used as its label for training, establishing a one-to-one correspondence between feature values and label. At test time, only the driver's real-time RMSSD, SDSD, SDNN, Mean_NN, Std_hr, Mean_hr, eye aspect ratio and mouth aspect ratio need to be collected and substituted into the model to obtain the PERCLOS label, where 1 represents fatigue and 0 represents non-fatigue. The newly acquired data are then added to the training set of the support vector machine, and the accuracy of the model increases as the training set grows.
Existing research shows that fatigue is indicated when a driver's PERCLOS exceeds 0.15, so PERCLOS can serve as a fatigue criterion; however, measuring the PERCLOS index requires very precise equipment, is highly intrusive for the driver, and is difficult to apply in practice. The innovative idea of the invention is to use the easily obtained electrocardiographic and facial indexes in place of the PERCLOS index to measure fatigue. During the first data acquisition, the electrocardiographic device measures RMSSD, SDSD, SDNN, Mean_NN, Std_hr and Mean_hr every second, the network camera measures the eye aspect ratio and mouth aspect ratio every second, and the eye tracker measures the PERCLOS data; values greater than 0.15 are recorded as 1 and values less than or equal to 0.15 as 0. In this way there is a one-to-one correspondence between RMSSD, SDSD, SDNN, Mean_NN, Std_hr, Mean_hr, eye aspect ratio and mouth aspect ratio on the one hand and PERCLOS on the other.
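A minimal sketch of this training and prediction flow is shown below, using scikit-learn's SVC with an RBF (Gaussian) kernel. The feature ordering, the 0.15 threshold and the label convention (1 = fatigued, 0 = not fatigued) follow the text; the use of scikit-learn, the feature-scaling step and the hyperparameter values are illustrative assumptions.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Feature order used throughout this sketch (an assumed convention).
FEATURES = ["RMSSD", "SDSD", "SDNN", "Mean_NN", "Std_hr", "Mean_hr",
            "eye_aspect_ratio", "mouth_aspect_ratio"]

def build_model():
    """Gaussian-kernel SVM; the scaling step is an added, assumed preprocessing choice."""
    return make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0, gamma="scale"))

def make_labels(perclos):
    """Label rule from the text: PERCLOS > 0.15 -> 1 (fatigued), otherwise 0."""
    return (np.asarray(perclos) > 0.15).astype(int)

def train(model, X_train, perclos_train):
    """Fit on the pre-stored feature matrix (one row of 8 values per second of data)."""
    model.fit(np.asarray(X_train), make_labels(perclos_train))
    return model

def predict_fatigue(model, sample):
    """`sample` holds the 8 real-time feature values in FEATURES order; returns 1 or 0."""
    return int(model.predict(np.asarray(sample).reshape(1, -1))[0])
```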
By logging in to the front-end page from a desktop PC, smart phone or similar device, the remote client can access the information of each on-duty person; the available functions include user login, real-time fatigue state display, sound reminders and historical data query.
As a preferred scheme, the cloud platform of this embodiment comprises an electrocardiographic monitoring program developed with LabVIEW, a face recognition program developed in Python, and a fatigue recognition model developed in Python. Further, the electrocardiographic monitoring program acquires the ECG data collected by the electrocardiographic hardware to obtain the electrocardiographic feature values; the face recognition program acquires the facial feature values collected by the face image hardware; and the fatigue recognition model is the multi-feature fatigue recognition model constructed with the SVM. Further, the front-end page shows the driver's name, work number, team, shift and fatigue state.
As a preferred scheme, the database adopts a MySQL database, and the stored content of the database comprises the name, the work number, the group, the shift, the electrocardio characteristic value, the face characteristic value and the fatigue state of a train driver.
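For illustration, the stored fields listed above could map onto a MySQL table as in the following Python sketch (using the mysql-connector-python driver); the database name, table name, column names and connection parameters are assumptions chosen to mirror the description, not values given by the patent.

```python
import mysql.connector  # mysql-connector-python; any MySQL driver would do

DDL = """
CREATE TABLE IF NOT EXISTS driver_fatigue_record (
    id            BIGINT AUTO_INCREMENT PRIMARY KEY,
    driver_name   VARCHAR(64),
    work_number   VARCHAR(32),
    team          VARCHAR(32),
    shift         VARCHAR(32),
    rmssd   DOUBLE, sdsd   DOUBLE, sdnn    DOUBLE,
    mean_nn DOUBLE, std_hr DOUBLE, mean_hr DOUBLE,
    eye_ar  DOUBLE, mouth_ar DOUBLE,
    fatigue_state TINYINT,                      -- 1 = fatigued, 0 = not fatigued
    recorded_at   TIMESTAMP DEFAULT CURRENT_TIMESTAMP
)
"""

def init_db():
    """Create the record table if absent and return an open connection."""
    conn = mysql.connector.connect(host="localhost", user="monitor",
                                   password="***", database="fatigue_db")
    cur = conn.cursor()
    cur.execute(DDL)
    conn.commit()
    cur.close()
    return conn
```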
The invention combines multiple information sources based on electrocardiography and face image recognition and performs fatigue recognition using heart rate variability, eye opening degree and mouth opening degree as fatigue indexes; unlike existing fatigue recognition based on a single information source, this improves the accuracy and reliability of the fatigue recognition result. The fatigue state recognition and monitoring system can monitor the fatigue state of a train driver in real time and give an audible early warning, thereby raising the safety level of subway operation.
While the invention has been described with reference to specific embodiments, the invention is not limited thereto, and those skilled in the art can easily conceive of various equivalent modifications or substitutions within the technical scope of the invention. Therefore, the protection scope of the present invention shall be subject to the protection scope of the claims.

Claims (10)

1. A train driver fatigue state identification and monitoring system, characterized by comprising:
the monitoring terminal is used for acquiring an ECG signal, a face characteristic image and PERCLOS data of a train driver;
the cloud platform is used for identifying the fatigue state of the train driver according to the acquired signals, and for storing, displaying and updating the fatigue state result and issuing reminders about it;
and the remote client acquires real-time fatigue state information and historical data through logging in a front-end page.
2. The train driver fatigue status identification and monitoring system of claim 1, wherein the cloud platform comprises:
the back-end processing module is used for processing the received ECG signal, face characteristic image and PERCLOS data, extracting the fatigue indexes of the ECG, the eyes and the mouth respectively based on ECG technology and face recognition technology, constructing a multi-feature fatigue recognition model with a support vector machine (SVM), substituting the fatigue indexes of the ECG, eyes and mouth measured in real time into the model to obtain a prediction result, and sending the obtained prediction result to a database;
the database is used for acquiring and storing train information and driver information and storing a prediction result received from the cloud platform;
and the front-end page interacts with the database in real time and provides fatigue state display, reminders and historical data query channels for users who meet the login requirements.
3. The train driver fatigue status recognition and monitoring system of claim 2, wherein the process of extracting fatigue indicators of ECG comprises the steps of:
11) acquiring ECG data according to the acquired ECG signal of the train driver;
12) performing wavelet noise reduction and R-peak detection on the ECG data in sequence;
13) calculating time domain indexes from the processed ECG data;
14) and inputting the time domain indexes into a multi-feature fatigue recognition model constructed by the SVM.
4. The train driver fatigue status recognition and monitoring system of claim 3, wherein the process of extracting eye and mouth fatigue indicators comprises the steps of:
21) extracting human face characteristic points from the collected human face characteristic images;
22) calculating the opening degree of eyes and the opening degree of mouth according to the extracted characteristic points of the face;
23) and inputting the calculated eye opening and mouth opening into a multi-feature fatigue recognition model constructed by using an SVM.
5. The train driver fatigue state recognition and monitoring system of claim 4, wherein the multi-feature fatigue recognition model constructed with the SVM is trained using pre-stored data, and its kernel function is a Gaussian kernel function.
6. The train driver fatigue status identification and monitoring system of claim 5, wherein the pre-stored data includes RMSSD, SDSD, SDNN, Mean_NN, Std_hr, Mean_hr, eye aspect ratio and mouth aspect ratio measured by the monitoring terminal.
7. The system for identifying and monitoring the fatigue status of train drivers as claimed in claim 1, wherein the monitoring terminal comprises:
the ECG signal monitoring terminal is used for acquiring ECG data of a train driver;
the human face index monitoring terminal is used for acquiring a human face characteristic image of a train driver;
the eye tracker is used for collecting PERCLOS data of a train driver;
and the wireless transmission module transmits each acquired data to the cloud platform for processing.
8. The train driver fatigue status identification and monitoring system of claim 2, wherein the front end page comprises:
the login module is used for providing a user login platform, and opening the display module, the reminding module and the query module for users meeting login requirements;
the display module is used for displaying various information stored in the database in real time;
the reminding module is used for raising an alarm according to the fatigue state determined by the cloud platform;
and the query module is used for providing a query channel of various historical data.
9. The train driver fatigue state identification and monitoring system of claim 7, wherein the ECG signal monitoring terminal employs an intelligent bracelet, and the human face index monitoring terminal employs a web camera.
10. The train driver fatigue status identification and monitoring system of claim 2, wherein the database is a MySQL database.
CN202110357293.9A 2021-04-01 2021-04-01 Train driver fatigue state identification and monitoring system Pending CN113243919A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202110357293.9A CN113243919A (en) 2021-04-01 2021-04-01 Train driver fatigue state identification and monitoring system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202110357293.9A CN113243919A (en) 2021-04-01 2021-04-01 Train driver fatigue state identification and monitoring system

Publications (1)

Publication Number Publication Date
CN113243919A true CN113243919A (en) 2021-08-13

Family

ID=77220203

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202110357293.9A Pending CN113243919A (en) 2021-04-01 2021-04-01 Train driver fatigue state identification and monitoring system

Country Status (1)

Country Link
CN (1) CN113243919A (en)

Patent Citations (10)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20030146841A1 (en) * 2000-08-29 2003-08-07 Winfried Koenig Method and device for diagnosing in a motor vehicle a driver's fitness drive
CN101236695A (en) * 2008-03-05 2008-08-06 中科院嘉兴中心微系统所分中心 Driver status estimation system based on vehicle mounted sensor network
CN103714660A (en) * 2013-12-26 2014-04-09 苏州清研微视电子科技有限公司 System for achieving fatigue driving judgment on basis of image processing and fusion between heart rate characteristic and expression characteristic
CN103824420A (en) * 2013-12-26 2014-05-28 苏州清研微视电子科技有限公司 Fatigue driving identification system based on heart rate variability non-contact measuring
CN106599821A (en) * 2016-12-07 2017-04-26 中国民用航空总局第二研究所 Controller fatigue detection method and system based on BP neural network
CN207106343U (en) * 2017-08-26 2018-03-16 山西省交通科学研究院 A kind of commercial vehicle driver tired driving state recognition and prior-warning device
CN112241658A (en) * 2019-07-17 2021-01-19 青岛大学 Fatigue driving early warning system and method based on depth camera
CN112395900A (en) * 2019-08-12 2021-02-23 天津大学青岛海洋技术研究院 Fatigue driving state detection algorithm based on YOLOv3 algorithm
CN110731787A (en) * 2019-09-26 2020-01-31 首都师范大学 fatigue state causal network method based on multi-source data information
CN112132475A (en) * 2020-09-27 2020-12-25 上海应用技术大学 Driver driving safety performance assessment method and system

Similar Documents

Publication Publication Date Title
EP3698707B1 (en) Electrocardiogram information dynamic monitoring system, computer program and computer readable storage medium
US11013470B2 (en) Detecting abnormalities in ECG signals
CN111166357A (en) Fatigue monitoring device system with multi-sensor fusion and monitoring method thereof
WO2001091627B1 (en) System and device for multi-scale analysis and representation of electrocardiographic data
US10368792B2 (en) Method for detecting deception and predicting interviewer accuracy in investigative interviewing using interviewer, interviewee and dyadic physiological and behavioral measurements
CN112365978A (en) Method and device for establishing early risk assessment model of tachycardia event
CN109691994A (en) A kind of rhythm of the heart analysis method based on electrocardiogram
CN112258790A (en) Rail transit train driver on-duty fatigue level and body state detection system
CN104207769A (en) Electrocardiosignal detection system
CN112309552A (en) AI-based bracelet-type intelligent whole-course radiotherapy safety management system and method
US20240099639A1 (en) System for measuring heart rate
CN109567832A (en) A kind of method and system of the angry driving condition of detection based on Intelligent bracelet
CN110755091A (en) Personal mental health monitoring system and method
CN115089179A (en) Psychological emotion insights analysis method and system
CN107516019A (en) Noninvasive health forecast system and method
CN113243919A (en) Train driver fatigue state identification and monitoring system
CN109620263B (en) Physical sign safety analysis method for post workers in enterprises and public institutions
Sadek et al. Sensor data quality processing for vital signs with opportunistic ambient sensing
JP2879663B2 (en) Fetal monitoring device
CN113017633B (en) Intelligent mental analysis and evaluation method and system based on human body characteristic data
CN113040773A (en) Data acquisition and processing method
CN114492656A (en) Fatigue degree monitoring system based on computer vision and sensor
CN113724853A (en) Intelligent medical system based on deep learning
WO2022201639A1 (en) Biological data evaluation server, biological data evaluation system, and biological data evaluation method
RU129680U1 (en) SYSTEM FOR DETERMINING THE FUNCTIONAL STATE OF A PEOPLE GROUP

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication
Application publication date: 20210813