CN109389806B - Fatigue driving detection early warning method, system and medium based on multi-information fusion - Google Patents


Info

Publication number
CN109389806B
Authority
CN
China
Prior art keywords
fatigue, pulse, characteristic, support vector machine
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN201811325931.3A
Other languages
Chinese (zh)
Other versions
CN109389806A (en)
Inventor
杨立才
边军
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shandong University
Original Assignee
Shandong University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shandong University filed Critical Shandong University
Priority to CN201811325931.3A priority Critical patent/CN109389806B/en
Publication of CN109389806A publication Critical patent/CN109389806A/en
Application granted granted Critical
Publication of CN109389806B publication Critical patent/CN109389806B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08B - SIGNALLING OR CALLING SYSTEMS; ORDER TELEGRAPHS; ALARM SYSTEMS
    • G08B21/00 - Alarms responsive to a single specified undesired or abnormal condition and not otherwise provided for
    • G08B21/02 - Alarms for ensuring the safety of persons
    • G08B21/06 - Alarms for ensuring the safety of persons indicating a condition of sleep, e.g. anti-dozing alarms
    • A - HUMAN NECESSITIES
    • A61 - MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B - DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 - Measuring for diagnostic purposes; Identification of persons
    • A61B5/02 - Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/0205 - Simultaneously evaluating both cardiovascular conditions and different types of body conditions, e.g. heart and respiratory condition
    • A61B5/103 - Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 - Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1126 - Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B5/1128 - Measuring movement of the entire body or parts thereof using a particular sensing technique using image analysis
    • A61B5/16 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 - Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators
    • A61B5/24 - Detecting, measuring or recording bioelectric or biomagnetic signals of the body or parts thereof
    • A61B5/316 - Modalities, i.e. specific diagnostic methods
    • A61B5/369 - Electroencephalography [EEG]
    • A61B5/68 - Arrangements of detecting, measuring or recording means, e.g. sensors, in relation to patient
    • A61B5/6887 - Arrangements of detecting, measuring or recording means mounted on external non-worn devices, e.g. non-medical devices
    • A61B5/6893 - Cars

Abstract

The invention discloses a fatigue driving detection and early-warning method, system and medium based on multi-information fusion. A high-definition camera mounted in the automobile cab acquires facial image information of the driver in the driving state; the acquired images are used for identity recognition and for eye-feature extraction and analysis. Pulse data are acquired through an electronic bracelet fitted with a photoelectric pulse sensor and subjected to feature extraction and analysis, and steering-wheel steering data are acquired through a steering wheel angle sensor. The extracted features are fused, the MCU judges whether the driver is in a fatigue driving state, and a corresponding warning is given. Compared with the prior art, the method fuses facial information, pulse data, steering wheel angle and driving duration, minimises interference with the driver during data acquisition, and achieves higher fatigue-judgment accuracy through multi-source information processing.

Description

Fatigue driving detection early warning method, system and medium based on multi-information fusion
Technical Field
The disclosure relates to the field of traffic safety protection, in particular to a fatigue driving detection early warning method, a system and a medium based on multi-information fusion.
Background
With the continuous development of the global economy, traffic volume keeps growing and private car ownership keeps rising rapidly, so traffic accidents are increasing. As one of the major factors inducing traffic accidents, fatigue driving has become a major hidden danger to traffic safety.
Driving fatigue refers to the imbalance of physiological and psychological functions that a driver develops after long, uninterrupted driving or when driving while not fully alert, objectively manifested as a decline in driving skill. A driver who has slept poorly or insufficiently is likely to become fatigued when driving for a long time. Driving fatigue affects the driver's attention, sensation, perception, thinking, judgment, awareness, decision-making and movement. Fatigue is divided into active and passive fatigue: fatigue caused by insufficient sleep or illness is called active fatigue, while fatigue caused by a tedious, under-stimulating work environment is called passive fatigue. When driving continues after fatigue sets in, the driver feels drowsy and weak, with tired limbs, blurred vision, inability to concentrate, slowed reactions and impaired judgment, even absent-mindedness and momentary memory lapses; misoperation can then cause loss of vehicle control and easily lead to serious traffic safety accidents.
According to statistics, about 100,000 traffic accidents in the United States are caused by driver fatigue every year, and in China more than 30% of traffic accidents each year are attributed to fatigue driving; such accidents are often disastrous and seriously endanger people's lives and property. It is therefore important to research and design fatigue driving detection devices to improve the active safety of automobiles.
Many scholars have studied methods of detecting and preventing driver fatigue; the approaches adopted so far fall mainly into the following two categories:
(1) Vehicle information as the main detection object. Whether the driver is in a fatigue state is judged indirectly by collecting vehicle running-state information such as speed changes and driving trajectory. The drawbacks are low recognition accuracy and strong interference from road conditions during data collection and analysis, so the driver's driving state cannot be identified accurately.
(2) The driver as the main detection object. The driver's fatigue is judged by acquiring physiological data such as brain waves, pulse, heart rate and respiration, or by capturing facial images with a camera and analysing the state of the driver's eyes, mouth and so on. The drawbacks are that such data are difficult to acquire without disturbing normal driving: the driver generally has to wear sensor-equipped earphones, glasses or hats, which interferes with normal driving, adds safety risks and raises driver privacy concerns.
In summary, the main problems of the prior art in fatigue driving recognition are a single signal source, low recognition accuracy and poor practicality.
Disclosure of Invention
To overcome the deficiencies of the prior art, the present disclosure provides a fatigue driving detection and early-warning method, system and medium based on multi-information fusion, which offers high recognition accuracy and strong practicality.
in a first aspect, the present disclosure provides a fatigue driving detection early warning method based on multi-information fusion;
the fatigue driving detection early warning method based on multi-information fusion comprises the following steps:
extracting human eye state characteristics, pulse characteristics and steering wheel rotation characteristics of a driver;
and carrying out feature fusion and fatigue recognition on the human eye state features, the pulse features and the steering wheel rotation features based on an SVM-DS algorithm, and early warning on a fatigue recognition result.
In some possible implementations, the method further includes: collecting the driver's facial image through the high-definition camera mounted on the automobile rearview mirror and identifying the driver's identity from the facial image. The benefit is that, for commercial vehicles such as those of a taxi company, drivers may change shifts; by identifying the driver, the driving record can be uploaded to the monitoring server in real time for remote monitoring, which facilitates driver identity management and identification of the responsible driver in hit-and-run accidents caused by fatigue driving.
In some possible implementations, the specific steps of acquiring the state characteristics of the human eyes of the driver are as follows:
a step (101): acquiring a facial image of a driver in a driving state through a high-definition camera mounted on an automobile rearview mirror;
a step (102): realizing face positioning of the face image in an YCbCr space based on a skin color face detection algorithm to obtain a face positioning image;
step (103): on the face positioning image, realizing the positioning of the human eye region according to the gray scale integral projection;
a step (104): calculating the eye-opening duration, eye-closing duration, blink frequency and the proportion of eye closure per unit time, PERCLOS (percentage of eyelid closure over time), from the located eye region;
a step (105): the characteristic that the eye opening duration exceeds a set threshold is regarded as a fatigue characteristic; the characteristic that the eye opening duration is less than or equal to a set threshold value is regarded as a non-fatigue characteristic;
the characteristic that the eye closing time exceeds a set threshold value is regarded as a fatigue characteristic; the characteristic that the eye closing duration is less than or equal to a set threshold value is regarded as a non-fatigue characteristic;
taking the characteristic that the blink frequency is smaller than a set threshold value as a fatigue characteristic; the characteristic that the blink frequency is larger than or equal to a set threshold value is regarded as a non-fatigue characteristic;
regarding a characteristic in which the proportion of eye closure per unit time, PERCLOS, exceeds a set threshold as a fatigue characteristic, and one in which PERCLOS is less than or equal to the set threshold as a non-fatigue characteristic;
dividing fatigue characteristics into a training set and a testing set; non-fatigue features are also divided into training and test sets.
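The threshold tests of step (105) and the train/test split can be sketched as follows. The patent does not give the threshold values, so the numbers below are illustrative placeholders; the 80/20 split ratio is likewise an assumption. PERCLOS above its threshold is treated as fatigue, following the usual PERCLOS criterion.

```python
import random

# Hypothetical thresholds -- the patent leaves the actual values unspecified.
THRESHOLDS = {
    "eye_open_s":  2.0,   # eye-opening duration above this -> fatigue
    "eye_close_s": 0.5,   # eye-closing duration above this -> fatigue
    "blink_hz":    0.2,   # blink frequency below this -> fatigue
    "perclos":     0.4,   # PERCLOS above this -> fatigue
}

def label_sample(eye_open_s, eye_close_s, blink_hz, perclos):
    """Label each eye feature as fatigue (1) or non-fatigue (0) per the rules above."""
    return {
        "eye_open_s":  int(eye_open_s  > THRESHOLDS["eye_open_s"]),
        "eye_close_s": int(eye_close_s > THRESHOLDS["eye_close_s"]),
        "blink_hz":    int(blink_hz    < THRESHOLDS["blink_hz"]),
        "perclos":     int(perclos     > THRESHOLDS["perclos"]),
    }

def split(samples, train_frac=0.8, seed=0):
    """Divide labelled samples into a training set and a test set."""
    rng = random.Random(seed)
    shuffled = samples[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]
```

The same labelling-plus-split pattern applies to the pulse and steering-wheel features described later.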
Mounting the high-definition camera on the rearview mirror prevents the camera's mounting position from intruding on the driver's operating space and from blocking the driver's field of view.
In some possible implementations, the specific steps of step (103) are:
step (1031): in the face-positioning image, select a horizontal band whose bounds are fixed fractions of the image height h (the exact bounds are given by formulas rendered as images in the original and not reproduced here); within it find the lowest point b_m of the horizontal grey-level integral projection, and take the region with image ordinate in [b_m - 30, b_m + 30] as the approximate eye region;
step (1032): calculate the vertical grey-level integral projection function of the approximate eye-region image, normalise it, and compute the abscissa of the lowest point of the eyeball, recorded as a_m, obtaining the eye location (a_m, b_m);
step (1033): from the located lowest-point coordinates (a_m, b_m), take 22 pixels to each side horizontally and 16 pixels above and below to determine the eye region; perform histogram equalisation on the eye-region image, then binarise it, finally obtaining a two-dimensional feature map for eye-feature calculation;
step (1034): record the height l_y and width l_x of the two-dimensional feature map, and use the height-to-width ratio L = l_y / l_x as the eye-openness value;
step (1035): normalise the eye-openness values of all eye-region images to [0, 1]; an eye with openness below 20% is considered closed, from which the eye-opening duration, eye-closing duration, blink frequency and the proportion of eye closure per unit time, PERCLOS, are further calculated.
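Steps (1034) and (1035) reduce to a few lines. This sketch assumes per-frame eye boxes are already available; the `eye_boxes` input, the frame rate handling, and the blink-counting rule (a blink equals one maximal run of closed frames) are illustrative assumptions, not stated in the patent.

```python
def openness_values(eye_boxes):
    """Eye openness L = l_y / l_x of each binarised eye region (step 1034)."""
    return [ly / lx for (ly, lx) in eye_boxes]

def normalise(vals):
    """Normalise openness values to [0, 1] (step 1035)."""
    lo, hi = min(vals), max(vals)
    span = (hi - lo) or 1.0
    return [(v - lo) / span for v in vals]

def eye_metrics(openness, fps, closed_thresh=0.20):
    """Per-frame closed-eye test (openness < 20%) plus PERCLOS and blink rate."""
    closed = [o < closed_thresh for o in openness]
    perclos = sum(closed) / len(closed)       # fraction of frames with eye closed
    # a blink = a maximal run of consecutive closed frames
    blinks = sum(1 for i, c in enumerate(closed)
                 if c and (i == 0 or not closed[i - 1]))
    blink_hz = blinks * fps / len(closed)     # blinks per second
    return perclos, blink_hz
```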
In some possible implementations, the pulse characteristics of the driver are collected:
pulse data are obtained through a wireless bracelet with a photoelectric pulse sensor, filtering processing is carried out on the pulse data based on wavelet transformation, and pulse features are extracted.
The wireless bracelet includes a control chip connected respectively to a power supply module, a pulse acquisition module, a vibration module, a liquid crystal display module, a Bluetooth module and a GPS module. The pulse acquisition module is a photoelectric pulse sensor; the liquid crystal display module is a liquid crystal screen used to display time information.
In some possible implementation manners, pulse data is acquired through a wireless bracelet with a photoelectric pulse sensor, filtering processing is carried out on the pulse data based on wavelet transformation, and the specific steps of extracting pulse features are as follows:
step (200): decomposing the signal acquired by the photoelectric pulse sensor with the db6 wavelet; the coefficient matrix obtained by wavelet transform of the noisy signal is denoted ω(m, n), where m is the dilation (scale) factor and n the translation factor;
step (201): calculating the correlation coefficient R(m, n), the product of the wavelet coefficients at adjacent scales, R(m, n) = ω(m, n) · ω(m+1, n);
step (202): normalising the correlation coefficient R(m, n):

N_R(m, n) = R(m, n) · sqrt( Σ_n ω²(m, n) / Σ_n R²(m, n) ), n ∈ Z

where N_R(m, n) denotes the normalised correlation coefficient matrix and Z the set of integers.
Step (203): comparison of NR(m,n)The absolute value of (d) and the absolute value of ω (m, n); if N is presentR(m,n)Is large, ω (m, n) is considered to be derived from the original signal, and ω (m, n) is given to the reconstructed signal function ωf(m, n) and zero ω (m, n); if the absolute value of ω (m, n) is large, ω (m, n) is considered to be from the noise signal, and ω (m, n) is retained;
a step (204): calculating the ratio λ of the mean square error of the noise to its unbiased estimate σ(m, n):

MSE(m) = sqrt( (1/N) · Σ_n ω²(m, n) )

λ = MSE(m) / σ(m, n), σ(m, n) = sqrt( (1/(N − L)) · Σ_n ω²(m, n) )

where L denotes the number of zeroed points and N the number of coefficients at scale m;
step (205): if λ > 1, iterate steps (200) to (204); if λ ≤ 1, the reconstructed signal ω_f(m, n) is obtained and denoising is complete;
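A minimal sketch of the correlation test of steps (201) to (203) at a single scale, assuming the coefficient arrays for two adjacent scales have already been produced by a db6 decomposition (which this sketch does not perform; a library such as PyWavelets would normally supply it). The normalisation rescales R so that its energy matches that of ω, a common form of this spatial-correlation denoising algorithm and an assumption here.

```python
import math

def correlation_pass(w_m, w_m1):
    """One pass of the scale-correlation test at scale m.

    w_m  : wavelet coefficients of the noisy signal at scale m
    w_m1 : coefficients at the adjacent, coarser scale m+1
    Returns (w_f, w_rest): coefficients assigned to the reconstruction,
    and coefficients retained (noise-dominated) for the next iteration.
    """
    # step (201): correlation = product of coefficients at adjacent scales
    r = [a * b for a, b in zip(w_m, w_m1)]
    # step (202): rescale R so its energy matches the energy of w_m
    pw = sum(c * c for c in w_m)
    pr = sum(c * c for c in r) or 1.0
    n_r = [c * math.sqrt(pw / pr) for c in r]
    # step (203): correlation-dominated points -> signal, others -> retain
    w_f, w_rest = [], []
    for c, nc in zip(w_m, n_r):
        if abs(nc) >= abs(c):      # correlation dominates: from the signal
            w_f.append(c)
            w_rest.append(0.0)
        else:                      # coefficient dominates: treat as noise, retain
            w_f.append(0.0)
            w_rest.append(c)
    return w_f, w_rest
```

In the full algorithm this pass is repeated, with the λ test of steps (204) and (205) deciding when to stop.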
step (206): extracting the position of a main wave crest in a reconstructed signal:
selecting an orthogonal wavelet as the wavelet basis and performing three-layer wavelet decomposition of the pulse signal by orthogonal wavelet transform; extracting the decomposed third-layer high-frequency coefficients and reconstructing the third-layer high-frequency signal from them; detecting, with an adaptive threshold method, the maximum point within each period of the third-layer high-frequency coefficients; mapping each detected maximum point back to the original signal as a reference point; and, taking M points before and after it as the search range, detecting the maximum point of the original signal within that range, this point being the main-wave peak position of the pulse;
calculating the time difference between adjacent main-wave peaks and recording it as x_i, i a positive integer; the mean of the x_i is the main-wave-peak mean, and their standard deviation is the main-wave-peak standard deviation;
transforming the pulse signal into the frequency domain by FFT, computing the power HF of its high-frequency component and the power LF of its low-frequency component, and taking their quotient as the ratio of the high-frequency to low-frequency power of the pulse;
taking the position of the main wave crest of the pulse, the mean value of the main wave crest, the standard deviation of the main wave crest and the ratio of the high-frequency power and the low-frequency power of the pulse as the pulse characteristics;
the pulse characteristic that the main wave crest position of the pulse exceeds a set threshold value is regarded as fatigue characteristic; the pulse characteristic that the position of the main wave crest of the pulse is lower than a set threshold value is regarded as a non-fatigue characteristic;
taking the pulse characteristic that the mean value of the main wave peak exceeds a set threshold value as fatigue characteristic; taking the pulse characteristic that the mean value of the main wave peak is lower than a set threshold as a non-fatigue characteristic;
taking the pulse characteristic that the standard deviation of the main wave crest exceeds a set threshold value as a fatigue characteristic; taking the pulse characteristic that the standard deviation of the main wave crest is lower than a set threshold value as a non-fatigue characteristic;
regarding the pulse characteristics of which the ratio of the high frequency power to the low frequency power of the pulse exceeds a set threshold as fatigue characteristics; regarding the pulse characteristics of which the ratio of the high frequency power and the low frequency power of the pulse is lower than a set threshold as non-fatigue characteristics;
dividing fatigue characteristics into a training set and a testing set; non-fatigue features are also divided into training and test sets.
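The peak-interval statistics and the high-/low-frequency power ratio can be sketched as below, assuming peak detection has already produced the peak times. The band edges follow common heart-rate-variability conventions and are assumptions, since the patent does not state them; the plain DFT stands in for the FFT mentioned in the text.

```python
import math

def peak_intervals(peak_times):
    """Mean and standard deviation of the interval x_i between adjacent
    main-wave peaks (peak_times in seconds, already detected)."""
    x = [b - a for a, b in zip(peak_times, peak_times[1:])]
    mean = sum(x) / len(x)
    std = math.sqrt(sum((v - mean) ** 2 for v in x) / len(x))
    return mean, std

def hf_lf_ratio(signal, fs, lf=(0.04, 0.15), hf=(0.15, 0.4)):
    """Ratio of high-frequency to low-frequency power of the pulse signal,
    via a plain DFT; band edges are assumed HRV conventions."""
    n = len(signal)
    hf_p = lf_p = 0.0
    for k in range(1, n // 2):
        f = k * fs / n
        re = sum(s * math.cos(2 * math.pi * k * i / n) for i, s in enumerate(signal))
        im = sum(-s * math.sin(2 * math.pi * k * i / n) for i, s in enumerate(signal))
        p = re * re + im * im
        if lf[0] <= f < lf[1]:
            lf_p += p
        elif hf[0] <= f < hf[1]:
            hf_p += p
    return hf_p / lf_p if lf_p else float("inf")
```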
In some possible implementations, the driver's steering wheel rotation characteristics are collected:
acquiring the rotation frequency and the rotation angle of a steering wheel through a rotation angle sensor;
the characteristic that the rotating frequency of the steering wheel exceeds a set threshold value is regarded as fatigue characteristic; the characteristic that the rotation frequency of the steering wheel is lower than a set threshold value is regarded as a non-fatigue characteristic;
the characteristic that the rotating angle exceeds a set threshold value is regarded as fatigue characteristic; the characteristic that the rotating angle is lower than a set threshold value is regarded as a non-fatigue characteristic;
dividing fatigue characteristics into a training set and a testing set; non-fatigue features are also divided into training and test sets.
In some possible implementation manners, the specific steps of performing early warning on the fatigue identification result are as follows:
if the driver is judged to be fatigue driving, the equipment warns the driver in a voice mode and a bracelet vibration mode;
and if the vehicle is used by a taxi company or fleet, uploading the vehicle's driving information to a server.
In some possible implementation manners, the specific steps of performing feature fusion and fatigue recognition on the human eye state feature, the pulse feature and the steering wheel rotation feature based on the SVM-DS algorithm are as follows:
inputting fatigue characteristics and non-fatigue characteristics of the training set into a support vector machine, and training the support vector machine to obtain a trained support vector machine;
inputting the features of the test set into the support vector machines and calculating the posterior probability p_i of each support vector machine;
Inputting the characteristics of the test set into the support vector machines to obtain confusion matrixes of the support vector machines;
calculating the local credibility of the corresponding support vector machine based on each confusion matrix;
when a support vector machine identifies a certain sample x as belonging to class w_i, the reliability of the result it outputs is PC(w_i);
based on the posterior probability p_i of each support vector machine and its output reliability PC(w_i), the BPA (basic probability assignment) used in decision fusion is calculated:
m_l(w_i) = p_i × PC(w_i)

where m_l(w_i) denotes the probability mass that classifier l assigns to sample x belonging to class w_i;
and obtaining a final fatigue identification result through DS fusion.
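A minimal sketch of the decision fusion for two classifiers over the frame {fatigue, alert, Θ}: each BPA is built as m_l(w_i) = p_i × PC(w_i), with the unassigned remainder given to Θ (ignorance), and the two BPAs are combined with Dempster's rule. The class names and the handling of the residual mass are assumptions; the patent fuses more than two classifiers, which would simply chain this combination.

```python
def bpa(posterior, reliability):
    """BPA for one SVM: m(w_i) = p_i * PC(w_i); the rest goes to Theta."""
    m = {c: p * reliability[c] for c, p in posterior.items()}
    m["Theta"] = max(0.0, 1.0 - sum(m.values()))
    return m

def ds_combine(m1, m2):
    """Dempster's rule for two BPAs over {fatigue, alert, Theta}."""
    classes = ("fatigue", "alert")
    comb = {}
    for a in classes:
        # intersections yielding {a}: (a, a), (a, Theta), (Theta, a)
        comb[a] = m1[a] * m2[a] + m1[a] * m2["Theta"] + m1["Theta"] * m2[a]
    comb["Theta"] = m1["Theta"] * m2["Theta"]
    # conflicting mass: the two singletons disagree
    conflict = m1["fatigue"] * m2["alert"] + m1["alert"] * m2["fatigue"]
    k = 1.0 - conflict
    return {c: v / k for c, v in comb.items()}
```

The fused verdict is fatigue when the fused mass on "fatigue" exceeds that on "alert".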
In a second aspect, the present disclosure further provides a fatigue driving detection and early warning system based on multi-information fusion;
fatigue driving detection early warning system based on multi-information fusion includes:
a data acquisition module: collecting human eye state characteristics, pulse characteristics and steering wheel rotation characteristics of a driver;
the identification early warning module: and carrying out feature fusion and fatigue recognition on the human eye state features, the pulse features and the steering wheel rotation features based on an SVM-DS algorithm, and early warning on a fatigue recognition result.
In a third aspect, the present disclosure also provides an electronic device, including a memory, a processor, and computer instructions stored in the memory and executed on the processor, where the computer instructions, when executed by the processor, implement the method in any possible implementation manner of the first aspect.
In a fourth aspect, the present disclosure also provides a computer-readable storage medium for storing computer instructions which, when executed by a processor, perform the steps of the method in any possible implementation of the first aspect.
Compared with the prior art, the beneficial effects of the present disclosure are:
Feature fusion and fatigue recognition are performed on the eye-state, pulse, steering-wheel-rotation and driving-duration features based on the SVM-DS algorithm, and early warning is given on the fatigue recognition result. Fusing and recognising multiple features improves the accuracy of fatigue detection, and interference with the driver is minimised during data acquisition.
Drawings
The accompanying drawings, which are incorporated in and constitute a part of this application, illustrate embodiments of the application and, together with the description, serve to explain the application and are not intended to limit the application.
FIG. 1 is an overall flow chart of the present disclosure;
FIG. 2 is a schematic diagram of the calculation of P80 (the proportion of time per unit time during which the eyelid covers more than 80% of the pupil area, i.e. the eye is counted as closed) according to the present disclosure;
fig. 3 is a schematic diagram of a hardware structure of a wireless bracelet module according to the present disclosure;
FIG. 4 is a schematic diagram of a pulse signal acquisition circuit according to the present disclosure;
FIG. 5 is a schematic view of a pulse capture image according to the present disclosure;
fig. 6 is a model schematic diagram of a fatigue identification algorithm of the present disclosure.
Detailed Description
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
The present disclosure provides a vehicle-mounted safety device that automatically identifies whether the driver has entered a fatigue state and gives early warning. The overall flow is shown in FIG. 1 and FIG. 6; the main algorithm flow comprises sensor data acquisition and preprocessing, feature extraction, fusion of the three kinds of features, and fatigue identification and warning. The implementation steps are as follows:
Firstly, facial image information of the driver in the driving state is acquired through a high-definition infrared camera arranged in the automobile cab. Since the driver observes vehicles behind through the rearview mirror while driving, mounting the camera on the rearview mirror does not interfere with the driver's line of sight, making it the best position for capturing driver video information.
And carrying out face positioning.
The image is converted into YCbCr space for clustering calculation and skin-colour segmentation. Y_max and Y_min are the maximum and minimum of the clustering region; from test results the parameters are: Y_max = 153, Y_min = 31, Cb_max = 128, Cb_min = 113, Cr_max = 153, Cr_min = 126.
Pixels whose Y, Cb and Cr components fall within these ranges are identified as a potential face region; otherwise they are identified as non-face. The image is skin-colour segmented by this rule for a preliminary judgment of the face position: potential face areas are set to white, non-face areas to black, and the processed image is binarised.
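The segmentation rule can be sketched as below, using the threshold values from the embodiment above. The RGB-to-YCbCr conversion uses the common BT.601 full-range form, which the patent does not specify, so treat it as an assumption.

```python
# Thresholds from the embodiment; conversion coefficients are assumed BT.601.
Y_RANGE, CB_RANGE, CR_RANGE = (31, 153), (113, 128), (126, 153)

def rgb_to_ycbcr(r, g, b):
    """Full-range BT.601 RGB -> YCbCr conversion (an assumed convention)."""
    y  =       0.299    * r + 0.587    * g + 0.114    * b
    cb = 128 - 0.168736 * r - 0.331264 * g + 0.5      * b
    cr = 128 + 0.5      * r - 0.418688 * g - 0.081312 * b
    return y, cb, cr

def skin_mask(pixels):
    """Binarise: potential-face pixels (all components in range) -> 1, else 0."""
    mask = []
    for (r, g, b) in pixels:
        y, cb, cr = rgb_to_ycbcr(r, g, b)
        ok = (Y_RANGE[0] <= y <= Y_RANGE[1]
              and CB_RANGE[0] <= cb <= CB_RANGE[1]
              and CR_RANGE[0] <= cr <= CR_RANGE[1])
        mask.append(1 if ok else 0)
    return mask
```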
In order to make the image recognition more accurate, the image is morphologically processed. As basic operation in morphological operation, corrosion operation and expansion operation have the functions of noise reduction filtering, feature extraction, boundary detection, contrast enhancement and the like.
The erosion operation trims boundary points and filters out small, meaningless regions. Denoting the binary image as A, erosion with the structuring element B is expressed as:

A ⊖ B = { x | (B)_x ⊆ A }
The dilation operation expands the outer boundary, merging regions that were partially mis-cut. Dilation of the image A with the structuring element B is expressed as:

A ⊕ B = { x | (B̂)_x ∩ A ≠ ∅ }

where B̂ is the reflection of B.
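Erosion and dilation on a binary image can be sketched directly from their set definitions. The cross-shaped structuring element is an illustrative choice, since the patent does not specify B.

```python
CROSS = ((0, 0), (0, 1), (0, -1), (1, 0), (-1, 0))  # assumed structuring element

def erode(img, se=CROSS):
    """Binary erosion: a pixel survives only if B translated to it fits
    entirely inside the foreground set."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            out[i][j] = int(all(
                0 <= i + di < h and 0 <= j + dj < w and img[i + di][j + dj]
                for di, dj in se))
    return out

def dilate(img, se=CROSS):
    """Binary dilation: a pixel is set if B translated to it hits the set."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for i in range(h):
        for j in range(w):
            out[i][j] = int(any(
                0 <= i + di < h and 0 <= j + dj < w and img[i + di][j + dj]
                for di, dj in se))
    return out
```

Erosion followed by dilation (an opening) removes isolated noise pixels while roughly preserving larger regions, which is the effect the text describes; in practice an image library such as OpenCV would be used instead.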
the following post-processing is carried out:
(1) filtering out a region of which the area of the connected region is smaller than 15 × 15;
(2) let P be the width-to-height ratio of the bounding rectangle of the face region; P should be neither too large nor too small, so only regions with P in the range [0.4, 1.2] are retained.
(3) perform the dilation operation on the image again to obtain a binary image with a single connected domain containing the face region, used as a mask template.
And carrying out human eye positioning.
The determined face region is converted to a greyscale image, denoised with median filtering, and its overall grey level is adjusted by histogram equalisation, stretching it over the range [0, 255] so that the pixel counts in each interval tend to be equal, the grey distribution becomes more uniform, and the influence of illumination is reduced.
Let I(a, b) be the grey value at the point with coordinates (a, b) on the image. Over the ranges [a1, a2] and [b1, b2], the horizontal grey-level integral projection function H(b) and the vertical integral projection function V(a) are:

H(b) = ∫_{a1}^{a2} I(a, b) da

V(a) = ∫_{b1}^{b2} I(a, b) db

In a digital image, their discrete forms are:

H(b) = Σ_{a=a1}^{a2} I(a, b)

V(a) = Σ_{b=b1}^{b2} I(a, b)
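The discrete projection functions can be sketched in a few lines; the indexing convention `img[a][b]` for I(a, b) is an assumption made for illustration.

```python
def h_projection(img, a1, a2):
    """Horizontal grey-level integral projection H(b) = sum over a in [a1, a2]
    of I(a, b); here I(a, b) is taken as img[a][b]."""
    return [sum(img[a][b] for a in range(a1, a2 + 1))
            for b in range(len(img[0]))]

def v_projection(img, b1, b2):
    """Vertical grey-level integral projection V(a) = sum over b in [b1, b2]
    of I(a, b)."""
    return [sum(img[a][b] for b in range(b1, b2 + 1))
            for a in range(len(img))]
```

The eye rows and the eyeball column are then located at the minima of the (normalised) projections, since the eye region is darker than the surrounding skin.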
The image range is reduced to the band of the original image from
Figure BDA0001858758500000086
to
Figure BDA0001858758500000087
(h is the image height), and the lowest point b_m of the horizontal projection within it is found. To fully retain the effective information of the eye region, the rows with ordinate in [b_m − 30, b_m + 30] are taken as the candidate region. The vertical gray integral projection function of this region is calculated and normalized, and the abscissa of the lowest point of the eyeball is found as a_m, giving the eye location [a_m, b_m].
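The projection-based eye localization described above can be sketched as follows, assuming the equalized face region is available as a NumPy array. The ±30-row band follows the text; the toy image with a single dark blob standing in for the eye is invented for illustration:

```python
import numpy as np

def locate_eye(gray):
    """Coarse eye localization by gray-level integral projection.
    gray: 2-D uint8 array of the (equalized) face region.
    Returns (a_m, b_m): column and row of the projection minima."""
    # Horizontal projection H(b): sum the gray levels of each row.
    # The eye band is dark, so it appears as a minimum of H.
    H = gray.sum(axis=1).astype(float)
    b_m = int(np.argmin(H))
    # Keep a band around b_m so the whole eye is retained
    # (+/-30 rows as in the text; clamp at the image border).
    top, bot = max(b_m - 30, 0), min(b_m + 30, gray.shape[0])
    band = gray[top:bot, :]
    # Vertical projection V(a) inside the band, normalized to [0, 1];
    # the dark pupil column is its minimum.
    V = band.sum(axis=0).astype(float)
    V = (V - V.min()) / (V.max() - V.min() + 1e-9)
    a_m = int(np.argmin(V))
    return a_m, b_m

# Toy image: bright background with one dark blob as a stand-in "eye".
img = np.full((100, 100), 200, dtype=np.uint8)
img[40:50, 60:70] = 10
a_m, b_m = locate_eye(img)
```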
The schematic diagram of PERCLOS P80 parameter calculation is shown in FIG. 2, and the calculation formula is as follows:
$P = \dfrac{t_3 - t_2}{t_4 - t_1} \times 100\%$
Definition of P80: the eye is considered closed when the eyelid covers more than 80% of the pupil, and P is the proportion of time per unit time that the eye is closed by this criterion. t_1 to t_4 denote the four time points of the eyelid covering 80% of the pupil, the eye closing, the eye opening, and the eyelid again covering 80% of the pupil.
From the located eye coordinate, 20 pixels are taken on each side horizontally and 15 pixels above and below, so that the region contains the whole eye; histogram equalization is then performed, and finally binarization yields a two-dimensional feature map for eye-feature calculation. Its height l_y and width l_x are recorded, and their ratio L is used as the eye-opening value. All opening values are normalized to [0, 1], and features such as eye-opening duration, eye-closing duration, blink frequency, and PERCLOS are extracted.
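Per unit time, the PERCLOS feature reduces to the fraction of frames classified as closed. A minimal sketch, assuming per-frame opening values already normalized to [0, 1] and the 20% closure criterion stated in claim 3:

```python
def perclos(openness, thresh=0.2):
    """PERCLOS: fraction of frames in which the eye counts as closed.
    openness: per-frame eye-opening values normalized to [0, 1];
    a frame with openness < thresh (20% here) counts as closed."""
    closed = [o < thresh for o in openness]
    return sum(closed) / len(closed)

# 10 frames, 3 of them below the closure threshold -> PERCLOS = 0.3
seq = [0.8, 0.7, 0.1, 0.05, 0.6, 0.9, 0.15, 0.8, 0.75, 0.7]
p = perclos(seq)
```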
And (II) acquiring pulse information through a wireless bracelet with a photoelectric pulse sensor.
The pulse signal acquisition terminal uses an STM8S207R8 chip as the controller. The hardware structure of the bracelet is shown in FIG. 3: the control chip is connected to a power module, a pulse acquisition module, a vibration module, a liquid crystal display module, a Bluetooth module, and a GPS module. A photoelectric heart rate sensor acquires the pulse signal, which after A/D conversion is transmitted to the host computer over wireless Bluetooth.
A pulse signal acquisition module is formed from a YK1303 photoelectric heart rate sensor, an HR6706 heart rate chip, and related components; the signal acquisition schematic is shown in FIG. 4. Pin 1 of the photoelectric heart rate sensor YK1303 is connected to the 3.3 V supply VCC through resistor R8; pin 2 of the YK1303 is grounded; pin 3 of the YK1303 is connected to VCC through resistor R10; pin 4 of the YK1303 is connected to VCC; pin 5 of the YK1303 is grounded; pin 6 of the YK1303 is grounded through resistor R11, is also connected to VCC through resistor R12 and capacitor C18, and is further connected to pin 2 of the HR6706 heart rate chip through capacitor C16 and resistor R9 in sequence. Pin 1 of the HR6706 is connected to its pin 6 through capacitor C15 and resistor R6 in sequence; pin 2 of the HR6706 is connected to its pin 1 through resistors R18 and R7 in sequence; pin 2 of the HR6706 is grounded through capacitor C14 and resistor R5; pin 1 of the HR6706 is grounded through resistor R5; pin 3 of the HR6706 is connected to VCC through capacitor C18 and is connected to its pin 5; pin 4 of the HR6706 is grounded; pin 6 of the HR6706 is connected to its pin 7 through capacitor C17 and resistor R13 connected in parallel; pin 7 of the HR6706 is grounded through resistors R16 and R17 in sequence; pin 8 of the HR6706 is connected to VCC. The connection point between resistors R16 and R17 serves as the signal output point, delivered as a digital signal to the bracelet controller STM8.
The heart rate sensor senses human pulse information by photoplethysmography (PPG), extracts it, and finally outputs a pulse waveform. The HR6707 is a heart rate IC designed to work with the YK1303P heart rate sensor, outputting the pulse wave to the MCU for A/D conversion. FIG. 5 shows an acquired pulse image.
The acquired signal is decomposed with a db6 wavelet; ω(m, n) is the coefficient matrix of the noise-containing signal after wavelet transformation, where m is the scale factor and n is the translation factor.
The correlation coefficient R(m, n) is the product of the wavelet coefficients at decomposition scale m and the adjacent scale, and R(m, n) is normalized:
$N_{R(m,n)} = R(m,n)\sqrt{\dfrac{\sum_{n} \omega^{2}(m,n)}{\sum_{n} R^{2}(m,n)}}$
The absolute values of N_{R(m,n)} and ω(m, n) are compared. If |N_{R(m,n)}| is larger, ω(m, n) is considered to come from the original signal: it is assigned to the reconstruction coefficients ω_f(m, n) and then zeroed in ω(m, n). If |ω(m, n)| is larger, ω(m, n) is considered to come from the noise signal and is retained.
Figure BDA0001858758500000102
wherein L represents the number of zeroed points. The ratio λ of the mean square error of the noise to its unbiased estimate is obtained; if λ > 1, the process from step (200) to step (204) is iterated. Finally ω_f(m, n) is reconstructed, and the denoising is complete.
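The coefficient-classification step of this correlation denoising (steps 200 to 204) can be sketched as below. The wavelet decomposition itself is omitted; the coefficient array layout, the normalization constant, the λ iteration, and the synthetic spike data are all assumptions for illustration:

```python
import numpy as np

def correlate_classify(w, m):
    """One pass of adjacent-scale correlation classification at scale m:
    coefficients whose normalized correlation dominates their own
    magnitude are treated as signal, moved to the reconstruction array
    wf and zeroed in w; the rest are kept as noise for later passes.
    w: (scales, N) array of wavelet detail coefficients (assumed given)."""
    R = w[m] * w[m + 1]                      # adjacent-scale correlation
    # Normalize so N_R is energy-comparable with w[m]
    NR = R * np.sqrt(np.sum(w[m] ** 2) / (np.sum(R ** 2) + 1e-12))
    wf = np.zeros_like(w[m])
    signal = np.abs(NR) >= np.abs(w[m])
    wf[signal] = w[m][signal]                # signal -> reconstruction
    w[m][signal] = 0.0                       # ...and zeroed in w
    return wf

rng = np.random.default_rng(0)
N = 256
spike = np.zeros(N)
spike[100] = 5.0                             # a feature correlated across scales
w = np.stack([spike + 0.1 * rng.standard_normal(N),
              spike + 0.1 * rng.standard_normal(N)])
wf = correlate_classify(w, 0)
```

The spike, being present at both scales, correlates strongly and is pulled into `wf`; the uncorrelated noise coefficients stay behind in `w`.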
The dominant-wave (main peak) positions are extracted from the reconstructed signal. To simplify the calculation, the orthogonal wavelet Coiflet1, which has good symmetry, is selected as the wavelet basis, and an orthogonal wavelet transform is used. The specific steps are:
(1) performing three-layer wavelet decomposition on the pulse signal with Coiflet1 as the wavelet basis;
(2) extracting the decomposed third-layer high-frequency coefficients separately and reconstructing the third-layer high-frequency signal from them;
(3) detecting the maximum point within each period of the third-layer high-frequency coefficients using an adaptive threshold method;
(4) taking each maximum point detected in step (3) as a reference point and mapping it back to the original signal; in the original signal, 100 points before and after are taken as the search range, and the maximum of the original signal within that range is the position of the pulse's main wave crest.
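Step (4) above, refining each coarse reference point against the original signal within a ±100-point window, can be sketched as follows. The synthetic pulse-like signal and the deliberately offset reference points are illustrative assumptions:

```python
import numpy as np

def dominant_peaks(signal, coarse_peaks, win=100):
    """Refine coarse peak locations (e.g. maxima found in the third-layer
    high-frequency reconstruction) against the original signal: search
    +/-win samples around each reference point for the true maximum."""
    refined = []
    for p in coarse_peaks:
        lo, hi = max(p - win, 0), min(p + win, len(signal))
        refined.append(lo + int(np.argmax(signal[lo:hi])))
    return refined

# Synthetic pulse-like signal: sharp positive humps every 250 samples.
t = np.arange(2000)
sig = np.maximum(0, np.sin(2 * np.pi * t / 250)) ** 3
coarse = [70, 320, 570]              # deliberately offset references
peaks = dominant_peaks(sig, coarse)  # snaps to the true crests near 62, 312, 562
```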
The time difference between two adjacent main waves is found and recorded as x_i (i = 1, 2, 3, …, n); the mean of x_i is the mean main-wave interval, and the standard deviation of x_i is its standard deviation. The pulse signal is transformed into the frequency domain through FFT, and the powers HF and LF of its high-frequency (0.15–0.4 Hz) and low-frequency (0.04–0.15 Hz) components are respectively calculated; comparing the two gives the ratio of the pulse's high-frequency and low-frequency power.
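The band-power computation can be sketched with an FFT estimate. The sampling rate, the LF/HF ordering of the ratio, and the synthetic two-tone signal are assumptions for illustration:

```python
import numpy as np

def lf_hf_ratio(x, fs):
    """Band powers of a pulse-derived signal via FFT:
    LF = 0.04-0.15 Hz, HF = 0.15-0.4 Hz. Returns (LF, HF, LF/HF);
    the LF/HF ordering of the ratio is an assumed convention."""
    X = np.fft.rfft(x - np.mean(x))
    f = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(X) ** 2
    lf = psd[(f >= 0.04) & (f < 0.15)].sum()
    hf = psd[(f >= 0.15) & (f <= 0.4)].sum()
    return lf, hf, lf / hf

fs = 4.0                        # 4 Hz series
t = np.arange(0, 300, 1 / fs)   # 5 minutes of samples
# Synthetic series: strong 0.1 Hz (LF) plus weaker 0.25 Hz (HF) component.
x = 2.0 * np.sin(2 * np.pi * 0.1 * t) + 1.0 * np.sin(2 * np.pi * 0.25 * t)
lf, hf, ratio = lf_hf_ratio(x, fs)   # amplitude ratio 2:1 -> power ratio 4:1
```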
(III) The rotation frequency and angle data of the steering wheel are acquired through a rotation angle sensor.
During normal driving, the steering wheel's rotation frequency and angle stay within a normal range. When the driver is fatigued, attention lapses and reaction slows; the steering-wheel rotation frequency drops noticeably, and even abrupt small-angle corrections of the wheel may appear. The rotation frequency f and rotation angle a are recorded by a rotation angle sensor mounted at the steering wheel's rotation axis.
And (IV) feature fusion and fatigue identification.
DS synthesis algorithm based on matrix analysis
For the case in which n features jointly identify one target, the basic credible distribution values m_ij that the n features give to the m mutually independent target classes, together with the uncertainty probability θ_i of each feature, are written as the matrix

$\begin{pmatrix} m_{11} & m_{12} & \cdots & m_{1m} & \theta_1 \\ \vdots & \vdots & & \vdots & \vdots \\ m_{n1} & m_{n2} & \cdots & m_{nm} & \theta_n \end{pmatrix}$
For one feature, the basic credible distribution values m_ij over the m target states and the uncertainty probability θ_i must sum to 1, so the elements of each row of the matrix satisfy the normalization condition

$m_{i1} + m_{i2} + \cdots + m_{im} + \theta_i = 1 \quad (i = 1, 2, \ldots, n)$
Multiplying one row of the matrix by another (as an outer product) yields a new (m + 1) × (m + 1) matrix R:

$R = \begin{pmatrix} m_{i1}m_{j1} & \cdots & m_{i1}m_{jm} & m_{i1}\theta_j \\ \vdots & & \vdots & \vdots \\ m_{im}m_{j1} & \cdots & m_{im}m_{jm} & m_{im}\theta_j \\ \theta_i m_{j1} & \cdots & \theta_i m_{jm} & \theta_i\theta_j \end{pmatrix}$
where the uncertainty (conflict) factor k is the sum of the off-diagonal elements of the leading m × m sub-matrix of R, i.e.

$k = \sum_{p=1}^{m}\ \sum_{\substack{q=1 \\ q \neq p}}^{m} m_{ip}\, m_{jq}$
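The matrix-form DS combination of two pieces of evidence over singleton states plus an uncertainty mass can be sketched as follows. This is a minimal illustrative implementation; the example mass values are invented:

```python
import numpy as np

def ds_combine(row1, row2):
    """Dempster-Shafer combination of two evidence rows over m singleton
    states plus an 'uncertain' mass theta (last element). Each row is
    [m_1, ..., m_m, theta] and sums to 1. Matrix form: R = outer product;
    conflict k = off-diagonal sum of the leading m x m block; a combined
    singleton mass pools the diagonal entry with the theta row/column,
    normalized by 1 - k."""
    r1, r2 = np.asarray(row1, float), np.asarray(row2, float)
    m = len(r1) - 1
    R = np.outer(r1, r2)
    k = R[:m, :m].sum() - np.trace(R[:m, :m])   # conflicting mass
    fused = np.empty(m + 1)
    for j in range(m):
        fused[j] = (R[j, j] + R[j, m] + R[m, j]) / (1 - k)
    fused[m] = R[m, m] / (1 - k)                # combined uncertainty
    return fused

# Two sources that both lean toward state 0 (say, "fatigued"):
e1 = [0.6, 0.2, 0.2]    # m(A0)=0.6, m(A1)=0.2, theta=0.2
e2 = [0.7, 0.1, 0.2]
fused = ds_combine(e1, e2)   # agreement sharpens m(A0), shrinks theta
```

Note how agreement between the two sources raises the fused confidence in state 0 above either input while the uncertainty mass shrinks, which is the behavior the fusion stage relies on.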
1. Single-feature identification.
On the basis of image preprocessing, three types of features are extracted, and a support vector machine (SVM) performs a preliminary identification on each of the three feature types.
2. BPA function construction.
Probability modeling of the fatigue recognition by the one-versus-one multi-class SVM is carried out by combining the probability results of the multiple binary classifiers. The probability of each pairwise class is estimated with a sigmoid function, i.e.
$r_{ij} \approx p(y = i \mid y = i \text{ or } j,\; x)$
The posterior probability p_i is then:
Figure BDA0001858758500000121
The learning sample set is then tested to obtain the identification accuracy q_i, after which the BPA function can be defined as:

m_j(A) = p_i q_i
3. Decision fusion and judgment rules.
Let A_i (i = 1, 2, 3, 4) be the driving states and A_w the target state;
After obtaining, under the frame of discernment θ, each piece of evidence's confidence in A_i and the evidence uncertainty m_j(θ), the classification decision follows these rules:
① m(A_w) = max{m(A_i)}: the state with the maximum confidence is the target state.
② m(A_w) − m(A_i) > ε1 (ε1 > 0): the confidence of the target state must exceed that of every other state by more than a set threshold.
③ m(A_w) − m(θ) > ε2 (ε2 > 0): the confidence of the target state must be greater than the uncertainty confidence assignment value.
④ m(θ) < ε3 (ε3 > 0): the uncertainty confidence assignment value must be less than a set threshold; the uncertainty in the evidence for the target state cannot be too large.
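The four decision rules can be sketched as a small gate function over the fused masses. The threshold values ε1 to ε3 used here are illustrative assumptions, not values specified by the patent:

```python
def decide(masses, m_theta, eps1=0.1, eps2=0.1, eps3=0.3):
    """Apply the four decision rules to fused confidences.
    masses: list of m(A_i) for the candidate driving states;
    m_theta: uncertainty mass m(theta). Thresholds eps1-eps3 are
    illustrative. Returns the winning state index, or None if any
    rule fails (no confident decision)."""
    w = max(range(len(masses)), key=lambda i: masses[i])   # rule 1
    others = [masses[i] for i in range(len(masses)) if i != w]
    if any(masses[w] - m < eps1 for m in others):          # rule 2
        return None
    if masses[w] - m_theta < eps2:                         # rule 3
        return None
    if m_theta >= eps3:                                    # rule 4
        return None
    return w

# A clear winner passes all four gates and is accepted as the target state.
state = decide([0.65, 0.15, 0.10, 0.05], m_theta=0.05)
```

A near-tie between the top two states, or a large uncertainty mass, yields `None` instead of forcing a decision.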
And (V) warning the driver and uploading data.
If the driver is judged to be fatigued, the MCU sends a command to the bracelet over Bluetooth to make it vibrate and alert the driver, and simultaneously issues a voice alarm through an external speaker. If the vehicle driving data are used by a taxi company or fleet, the MCU also uploads the driving data and the judgment result to the server.
The above description is only a preferred embodiment of the present application and is not intended to limit it; various modifications and changes may be made by those skilled in the art. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present application shall fall within its protection scope.

Claims (9)

1. The fatigue driving detection early warning method based on multi-information fusion is characterized by comprising the following steps:
extracting human eye state characteristics, pulse characteristics and steering wheel rotation characteristics of a driver;
carrying out feature fusion and fatigue recognition on the human eye state features, the pulse features and the steering wheel rotation features based on an SVM-DS algorithm, and early warning on a fatigue recognition result;
the specific steps of carrying out feature fusion and fatigue recognition on the human eye state feature, the pulse feature and the steering wheel rotation feature based on the SVM-DS algorithm are as follows:
inputting fatigue characteristics and non-fatigue characteristics of the training set into a support vector machine, and training the support vector machine to obtain a trained support vector machine;
inputting the characteristics of the test set into the support vector machine, and calculating the posterior probability P_i of each support vector machine;
Inputting the characteristics of the test set into the support vector machines to obtain confusion matrixes of the support vector machines;
calculating the local credibility of the corresponding support vector machine based on each confusion matrix;
when the support vector machine identifies a certain sample x as class w_i, the reliability of the result output by the support vector machine is PC(w_i);
based on the posterior probability P_i of each support vector machine and the output reliability PC(w_i), the BPA for decision fusion is calculated:
m_l(w_i) = P_i × PC(w_i)
wherein m_l(w_i) denotes the probability assignment of sample x to class w_i by classifier l;
and obtaining a final fatigue identification result through DS fusion.
2. The fatigue driving detection and early warning method based on multi-information fusion as claimed in claim 1, wherein the specific steps of collecting the human eye state characteristics of the driver are as follows:
a step (101): acquiring a facial image of a driver in a driving state through a high-definition camera mounted on an automobile rearview mirror;
a step (102): realizing face positioning of the face image in the YCbCr space based on a skin-color face detection algorithm to obtain a face positioning image;
step (103): on the face positioning image, realizing the positioning of the human eye region according to the gray scale integral projection;
a step (104): calculating the eye opening duration, the eye closing duration, the blink frequency, and the eye-closure proportion PERCLOS per unit time according to the positioned human eye area;
a step (105): the characteristic that the eye opening duration exceeds a set threshold is regarded as a fatigue characteristic; the characteristic that the eye opening duration is less than or equal to a set threshold value is regarded as a non-fatigue characteristic;
the characteristic that the eye closing time exceeds a set threshold value is regarded as a fatigue characteristic; the characteristic that the eye closing duration is less than or equal to a set threshold value is regarded as a non-fatigue characteristic;
taking the characteristic that the blink frequency is smaller than a set threshold value as a fatigue characteristic; the characteristic that the blink frequency is larger than or equal to a set threshold value is regarded as a non-fatigue characteristic;
regarding a characteristic in which the proportion PERCLOS of eye closure in the unit time is less than a set threshold as a fatigue characteristic, and a characteristic in which the proportion PERCLOS of eye closure in the unit time is greater than a set threshold as a non-fatigue characteristic;
dividing fatigue characteristics into a training set and a testing set; non-fatigue features are also divided into training and test sets.
3. The fatigue driving detection and early warning method based on multi-information fusion as claimed in claim 2, wherein the specific steps of the step (103) are as follows:
step (1031): selecting the band of the face-positioning image from
Figure FDA0002471317430000021
to
Figure FDA0002471317430000022
(H is the height of the face-positioning image), obtaining the lowest point b_m of the image within the band from
Figure FDA0002471317430000023
to
Figure FDA0002471317430000024
, and taking the image ordinate range [b_m − 30, b_m + 30] as the approximate eye region;
step (1032): calculating the vertical gray integral projection function of the approximate eye-region image, normalizing it, and finding the abscissa of the lowest point of the eyeball as a_m, obtaining [a_m, b_m];
Step (1033): from the located nadir coordinates [ a ]m,bm]22 pixels are taken from the left side and the right side respectively, 16 pixels are taken from the upper side and the lower side respectively, then a human eye area is determined, histogram equalization is carried out on an image of the human eye area, binarization processing is carried out, and finally a two-dimensional feature map for human eye feature calculation is obtained;
step (1034): recording the height l of a two-dimensional profileyAnd width lxThe ratio L of height to width is used as the eye opening and closing value;
and (1035) normalizing the eye opening degrees of all the human eye area images to [0,1], considering that the human eyes are in a closed state when the eye opening degrees are less than 20%, and further calculating the eye opening time, the eye closing time, the blinking frequency and the proportion PERC L OS of the closed eyes in unit time.
4. The fatigue driving detection and early warning method based on multi-information fusion as claimed in claim 1, wherein the pulse characteristics of the driver are collected:
pulse data are obtained through a wireless bracelet with a photoelectric pulse sensor, filtering processing is carried out on the pulse data based on wavelet transformation, and pulse features are extracted.
5. The fatigue driving detection and early warning method based on multi-information fusion as claimed in claim 4, wherein pulse data is obtained through a wireless bracelet with a photoelectric pulse sensor, the pulse data is filtered based on wavelet transformation, and the specific steps of extracting pulse features are as follows:
step (200): for the signal acquired by the photoelectric pulse sensor, decomposition is performed with a db6 wavelet; ω(m, n) is the coefficient matrix obtained after wavelet transformation of the noise-containing signal, where m is the scale factor and n is the translation factor;
step (201): calculating a correlation coefficient R(m, n), where R(m, n) represents the product of the wavelet coefficients at decomposition scale m and the adjacent scale;
step (202): carrying out normalization processing on the correlation coefficient R (m, n):
$N_{R(m,n)} = R(m,n)\sqrt{\dfrac{\sum_{n} \omega^{2}(m,n)}{\sum_{n} R^{2}(m,n)}}$
wherein N_{R(m,n)} represents the normalized correlation coefficient matrix and Z represents the integers;
step (203): comparing the absolute value of N_{R(m,n)} with the absolute value of ω(m, n); if |N_{R(m,n)}| is larger, ω(m, n) is considered to come from the original signal, is assigned to the reconstructed signal function ω_f(m, n), and is then zeroed; if |ω(m, n)| is larger, ω(m, n) is considered to come from the noise signal and is retained;
step (204): calculating the mean square error of the noise
Figure FDA0002471317430000032
and its ratio λ to the unbiased estimate σ(m, n):
Figure FDA0002471317430000033
wherein L represents the number of zeroed points;
step (205): if λ > 1, iterating the process from step (200) to step (204); if λ ≤ 1, obtaining the reconstructed signal ω_f(m, n) and completing the denoising;
step (206): extracting the position of a main wave crest in a reconstructed signal:
selecting orthogonal wavelets as wavelet bases, and performing three-layer wavelet decomposition on pulse information by adopting an orthogonal wavelet transform method; extracting the decomposed third-layer high-frequency coefficient, and reconstructing a third-layer high-frequency signal by using the third-layer high-frequency coefficient; in the high-frequency coefficient of the third layer, a self-adaptive threshold method is adopted to detect the maximum value point in each period range; taking the detected maximum value point as a reference point and corresponding to the original signal; in the original signal, respectively taking M points before and after as a search range, and detecting a maximum value point of the original signal in the range, wherein the point is the main wave crest position of the pulse;
the time difference between two adjacent main-wave peaks is found and recorded as x_i, i being a positive integer; the mean of x_i is the mean of the main-wave-peak intervals, and the standard deviation of x_i is their standard deviation;
transforming the pulse signal into the frequency domain through FFT, respectively calculating the power HF of the high-frequency component and the power LF of the low-frequency component of the transformed pulse signal, and comparing HF with LF to obtain the ratio of the high-frequency and low-frequency power of the pulse;
taking the position of the main wave crest of the pulse, the mean value of the main wave crest, the standard deviation of the main wave crest and the ratio of the high-frequency power and the low-frequency power of the pulse as the pulse characteristics;
the pulse characteristic that the main wave crest position of the pulse exceeds a set threshold value is regarded as fatigue characteristic; the pulse characteristic that the position of the main wave crest of the pulse is lower than a set threshold value is regarded as a non-fatigue characteristic;
taking the pulse characteristic that the mean value of the main wave peak exceeds a set threshold value as fatigue characteristic; taking the pulse characteristic that the mean value of the main wave peak is lower than a set threshold as a non-fatigue characteristic;
taking the pulse characteristic that the standard deviation of the main wave crest exceeds a set threshold value as a fatigue characteristic; taking the pulse characteristic that the standard deviation of the main wave crest is lower than a set threshold value as a non-fatigue characteristic;
regarding the pulse characteristics of which the ratio of the high frequency power to the low frequency power of the pulse exceeds a set threshold as fatigue characteristics; regarding the pulse characteristics of which the ratio of the high frequency power and the low frequency power of the pulse is lower than a set threshold as non-fatigue characteristics;
dividing fatigue characteristics into a training set and a testing set; non-fatigue features are also divided into training and test sets.
6. The fatigue driving detection and early warning method based on multi-information fusion as claimed in claim 1, wherein the rotation characteristics of the steering wheel of the driver are collected:
acquiring the rotation frequency and the rotation angle of a steering wheel through a rotation angle sensor;
the characteristic that the rotating frequency of the steering wheel exceeds a set threshold value is regarded as fatigue characteristic; the characteristic that the rotation frequency of the steering wheel is lower than a set threshold value is regarded as a non-fatigue characteristic;
the characteristic that the rotating angle exceeds a set threshold value is regarded as fatigue characteristic; the characteristic that the rotating angle is lower than a set threshold value is regarded as a non-fatigue characteristic;
dividing fatigue characteristics into a training set and a testing set; non-fatigue features are also divided into training and test sets.
7. Fatigue driving detection early warning system based on multi-information fusion, characterized by includes:
a data acquisition module: collecting human eye state characteristics, pulse characteristics and steering wheel rotation characteristics of a driver;
the identification early warning module: carrying out feature fusion and fatigue recognition on the human eye state features, the pulse features and the steering wheel rotation features based on an SVM-DS algorithm, and early warning on a fatigue recognition result;
the specific steps of carrying out feature fusion and fatigue recognition on the human eye state feature, the pulse feature and the steering wheel rotation feature based on the SVM-DS algorithm are as follows:
inputting fatigue characteristics and non-fatigue characteristics of the training set into a support vector machine, and training the support vector machine to obtain a trained support vector machine;
inputting the characteristics of the test set into the support vector machine, and calculating the posterior probability P_i of each support vector machine;
Inputting the characteristics of the test set into the support vector machines to obtain confusion matrixes of the support vector machines;
calculating the local credibility of the corresponding support vector machine based on each confusion matrix;
when the support vector machine identifies a certain sample x as class w_i, the reliability of the result output by the support vector machine is PC(w_i);
based on the posterior probability P_i of each support vector machine and the output reliability PC(w_i), the BPA for decision fusion is calculated:
m_l(w_i) = P_i × PC(w_i)
wherein m_l(w_i) denotes the probability assignment of sample x to class w_i by classifier l;
and obtaining a final fatigue identification result through DS fusion.
8. An electronic device comprising a memory and a processor and computer instructions stored on the memory and executable on the processor, the computer instructions when executed by the processor performing the steps of the method of any of claims 1 to 6.
9. A computer-readable storage medium storing computer instructions which, when executed by a processor, perform the steps of the method of any one of claims 1 to 6.
CN201811325931.3A 2018-11-08 2018-11-08 Fatigue driving detection early warning method, system and medium based on multi-information fusion Active CN109389806B (en)

Publications (2)

Publication Number Publication Date
CN109389806A CN109389806A (en) 2019-02-26
CN109389806B true CN109389806B (en) 2020-07-24




Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant