CN110781872A - Driver fatigue grade recognition system with bimodal feature fusion

Info

Publication number: CN110781872A
Authority: CN (China)
Prior art keywords: steering wheel, fatigue, value, formula, driving
Legal status: Pending
Application number: CN201911401097.6A
Other languages: Chinese (zh)
Inventors: 张宇, 冯鹏翔, 王磊, 陆林
Current assignee: South Sagittarius Integration Co Ltd
Original assignee: South Sagittarius Integration Co Ltd
Application filed by South Sagittarius Integration Co Ltd

Classifications

    • G06V 20/597: Recognising the driver's state or behaviour, e.g. attention or drowsiness (G Physics; G06 Computing; G06V Image or video recognition or understanding; G06V 20/59 Context or environment of the image inside of a vehicle)
    • G06N 3/08: Learning methods (G06N Computing arrangements based on specific computational models; G06N 3/00 based on biological models; G06N 3/02 Neural networks)
    • G07C 5/0841: Registering performance data (G07C Checking-devices; G07C 5/00 Registering or indicating the working of vehicles; G07C 5/08 performance data)

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Evolutionary Computation (AREA)
  • Computing Systems (AREA)
  • Biomedical Technology (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Artificial Intelligence (AREA)
  • General Engineering & Computer Science (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Health & Medical Sciences (AREA)
  • Multimedia (AREA)
  • Traffic Control Systems (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

A driver fatigue grade recognition system with fusion of bimodal features comprises an image acquisition module, a facial behavior recognition module and a driving recorder host, wherein the image acquisition module and the driving recorder host are both connected with the facial behavior recognition module, and the driving recorder host is also connected with a CAN bus of a vehicle; the image acquisition module is used for acquiring a face dynamic image of a driver; the facial behavior recognition module is used for recognizing eye closing characteristics and mouth opening and closing characteristics from the acquired dynamic human face image; the driving recorder host is used for acquiring and analyzing vehicle operation information from a vehicle CAN bus, calculating driving behavior characteristics through the vehicle operation information, and analyzing the fatigue level of a driver through fusion of eye closing characteristics, mouth opening and closing characteristics and driving behavior characteristics. The invention overcomes the limitation of a single information source, fully considers the correlation and complementarity of each information source, and has more accurate fatigue grade analysis.

Description

Driver fatigue grade recognition system with bimodal feature fusion
Technical Field
The invention relates to the field of fatigue driving prediction, in particular to a driver fatigue grade recognition system with bimodal feature fusion.
Background
Data from the National Bureau of Statistics show that in each of the past five years the number of traffic accidents in China exceeded 120,000, with truck accidents being especially serious: in 2016 there were 50,400 road traffic accidents for which trucks were responsible, causing 25,000 deaths and 46,800 injuries. The truck accident rate is higher than that of ordinary motor vehicles, and the resulting losses are also above average. Traffic accidents caused by fatigue driving inflict major losses on people's lives and property every year, and various studies show that about 20% of all road accidents are associated with fatigue, and up to 50% on some roads. A sampling survey of freight vehicle drivers by the relevant Chinese departments showed that 84% of freight vehicle drivers drive more than 8 hours per day on average, 40% of them more than 12 hours, and that 64% of freight vehicles are staffed with only one driver.
Therefore, a fatigue driving detection system can help prevent accidents caused by driver drowsiness. At present, single fatigue driving detection methods on the market have many defects. When driving behavior feature analysis is used, the bus data come in many types and large volumes, the characteristic behaviors of fatigue driving are extremely difficult to identify and extract, and each driver's driving habits differ, so intelligent identification cannot be achieved. With facial state recognition, actions such as eye closing, yawning, and making a phone call can be effectively recognized, but the method fails under poor lighting in the cabin or when the camera is occluded.
Disclosure of Invention
In order to solve the technical problems, the invention provides a driver fatigue level identification system with bimodal feature fusion, which combines eye closing feature, mouth opening and closing feature and driving behavior feature fusion to analyze the driver fatigue level so as to overcome the limitation of a single information source and fully consider the correlation and complementarity of each information source.
The technical scheme of the invention is as follows:
a driver fatigue grade recognition system with fusion of bimodal features comprises an image acquisition module, a facial behavior recognition module and a driving recorder host, wherein the image acquisition module and the driving recorder host are both connected with the facial behavior recognition module, and the driving recorder host is also connected with a CAN bus of a vehicle;
the image acquisition module is used for acquiring a face dynamic image of a driver;
the facial behavior recognition module is used for recognizing eye closing characteristics and mouth opening and closing characteristics from the acquired dynamic human face image;
the driving recorder host is used for acquiring and analyzing vehicle operation information from a vehicle CAN bus, calculating driving behavior characteristics through the vehicle operation information, and analyzing the fatigue level of a driver through fusion of eye closing characteristics, mouth opening and closing characteristics and driving behavior characteristics.
Wherein the eye closing characteristics comprise a maximum eye closing time, a blink frequency, and a percentage of eye closing time, and the mouth opening and closing characteristics comprise a maximum mouth opening time, a yawning frequency, and a percentage of mouth opening time;
the driving recorder host comprises a vehicle operation information acquisition module and a driving behavior feature calculation module. The vehicle operation information acquisition module is used for acquiring and analyzing vehicle operation information from the vehicle CAN bus, and the driving behavior feature calculation module is used for calculating the driving behavior features from the vehicle operation information. The vehicle operation information comprises the steering wheel angle SA and the steering wheel angular velocity SAR, and the driving behavior features comprise the steering wheel angle absolute mean SAMEAN, the steering wheel angle standard deviation SASTD, the steering wheel angle lower quartile mean SAQ1MEAN, the steering wheel angle upper quartile mean SAQ3MEAN, the steering wheel angle entropy SE, the steering wheel angular velocity absolute value mean SAVMEAN, the steering wheel angular velocity standard deviation SAVSTD, the zero speed percentage PNS, and the accumulated driving duration;
the driving recorder host further comprises a fatigue prediction neural network model, and the fatigue grade of the driver is analyzed through fusion of eye closing characteristics, mouth opening and closing characteristics and driving behavior characteristics, wherein the method specifically comprises the following steps:
forming a fused feature vector set X by fusing the eye closing features, mouth opening and closing features and driving behavior features, where X = {x1, x2, …, x15}: x1 is the maximum eye closing time, x2 the blink frequency, x3 the percentage of eye closing time, x4 the maximum mouth opening time, x5 the yawning frequency, x6 the percentage of mouth opening time, x7 the steering wheel angle absolute mean SAMEAN, x8 the steering wheel angle standard deviation SASTD, x9 the steering wheel angle lower quartile mean SAQ1MEAN, x10 the steering wheel angle upper quartile mean SAQ3MEAN, x11 the steering wheel angle entropy SE, x12 the steering wheel angular velocity absolute value mean SAVMEAN, x13 the steering wheel angular velocity standard deviation SAVSTD, x14 the zero speed percentage PNS, and x15 the accumulated driving duration;
the fatigue prediction neural network model judges fatigue driving probability by using a full connection layer, the input of the model is a fusion feature vector set in a fusion window, the fusion feature vector set and a weight vector w of the full connection layer carry out vector product operation, the vector product is input to a Sigmoid activation function, and the fatigue probability value y between 0 and 1 is output through the Sigmoid activation function, and the specific formula is as follows:
y = σ(w · X) = 1 / (1 + e^(−w·X))
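The fully connected layer above can be sketched in a few lines of Python. This is a minimal illustration, not the patent's implementation; the bias term b is an assumption, since the text only mentions the weight vector w:

```python
import math

def fatigue_probability(x, w, b=0.0):
    """Fatigue probability y = sigmoid(w . x + b) for the fused
    15-dimensional feature vector x and fully connected weights w."""
    z = sum(xi * wi for xi, wi in zip(x, w)) + b
    return 1.0 / (1.0 + math.exp(-z))
```

The sigmoid maps the vector product into (0, 1), so the output can be read directly as a probability.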
further, the image acquisition module comprises a visible light camera and an infrared camera; the visible light camera is used for collecting a face dynamic image of the driver in the daytime, and the infrared camera is used for collecting the face dynamic image of the driver at night.
Further, the time window corresponding to the maximum eye closing time, the percentage of eye closing time, the maximum mouth opening time and the percentage of mouth opening time is 10 seconds, and the time window corresponding to the blink frequency and the yawning frequency is 60 seconds.
Further, the driving behavior feature calculation module calculates the driving behavior feature through the vehicle operation information, specifically:
The steering wheel angle absolute mean SAMEAN is the average of the absolute steering wheel angles, calculated by formula one:

Formula one: SAMEAN = (1/N) · Σ_{i=1}^{N} |SA_i|

where N is the number of steering wheel angle samples and SA_i is the i-th steering wheel angle sample;
The steering wheel angle standard deviation SASTD is calculated by formula two:

Formula two: SASTD = sqrt( (1/(N−1)) · Σ_{i=1}^{N} (SA_i − SA_m)² )

where SA_m is the mean steering wheel angle, calculated by formula three:

Formula three: SA_m = (1/N) · Σ_{i=1}^{N} SA_i
Arrange the N steering wheel angle samples from smallest to largest; the value at the one-quarter position is the steering wheel angle lower quartile SAQ1 and the value at the three-quarters position is the upper quartile SAQ3. The lower quartile mean SAQ1MEAN is the mean of the samples smaller than SAQ1, and the upper quartile mean SAQ3MEAN is the mean of the samples larger than SAQ3;
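Formulas one to three and the quartile means can be sketched in Python. The exact quartile index convention is an assumption, since the text only describes the one-quarter and three-quarters positions:

```python
import math

def steering_angle_features(sa):
    """SAMEAN, SASTD, SAQ1MEAN, SAQ3MEAN for a window of steering
    wheel angle samples sa (list of floats)."""
    n = len(sa)
    samean = sum(abs(a) for a in sa) / n                           # formula one
    sa_m = sum(sa) / n                                             # formula three
    sastd = math.sqrt(sum((a - sa_m) ** 2 for a in sa) / (n - 1))  # formula two
    ordered = sorted(sa)
    saq1 = ordered[n // 4]        # value at the one-quarter position (assumed index)
    saq3 = ordered[(3 * n) // 4]  # value at the three-quarters position (assumed index)
    below = [a for a in sa if a < saq1]
    above = [a for a in sa if a > saq3]
    saq1mean = sum(below) / len(below) if below else saq1
    saq3mean = sum(above) / len(above) if above else saq3
    return samean, sastd, saq1mean, saq3mean
```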
The steering wheel angle entropy SE reflects the degree of disorder and randomness of the driver's operation of the steering wheel: the larger SE is, the greater the randomness of the driver's steering operation and the higher the driver's degree of fatigue. SE is calculated from the probability of occurrence of the steering wheel angle prediction deviation. First, the predicted steering wheel angle θ_p(n) is calculated according to formula four:

Formula four: θ_p(n) = θ(n−1) + [θ(n−1) − θ(n−2)] + (1/2) · {[θ(n−1) − θ(n−2)] − [θ(n−2) − θ(n−3)]}
Then the steering wheel angle prediction deviation e_n is calculated as the difference between the actual steering wheel angle θ(n) and the predicted value θ_p(n), as in formula five:

Formula five: e_n = θ(n) − θ_p(n)
The steering wheel angle prediction deviation e_n obeys a normal distribution N(μ, σ²). The deviation e_n is divided into 9 intervals: (−∞, −5μ], (−5μ, −2.5μ], (−2.5μ, −μ], (−μ, −0.5μ], (−0.5μ, 0.5μ), [0.5μ, μ), [μ, 2.5μ), [2.5μ, 5μ), [5μ, +∞). The probability value p_i of the deviation falling in each interval is then calculated, and finally the steering wheel angle entropy SE is obtained from formula six:

Formula six: SE = −Σ_{i=1}^{9} p_i · log_9(p_i)
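A sketch of formulas four to six in Python, assuming μ > 0 so the nine bin edges are increasing; the bin multiples follow the text:

```python
import math

def steering_entropy(theta, mu):
    """Steering wheel angle entropy SE from a sequence of angle
    samples theta; mu scales the nine prediction-error intervals."""
    errors = []
    for i in range(3, len(theta)):
        d1 = theta[i - 1] - theta[i - 2]
        d2 = theta[i - 2] - theta[i - 3]
        # formula four: second-order extrapolation of the angle
        pred = theta[i - 1] + d1 + 0.5 * (d1 - d2)
        errors.append(theta[i] - pred)        # formula five: e_n
    edges = [-5 * mu, -2.5 * mu, -mu, -0.5 * mu, 0.5 * mu, mu, 2.5 * mu, 5 * mu]
    counts = [0] * 9
    for e in errors:
        counts[sum(e > edge for edge in edges)] += 1
    total = len(errors)
    # formula six: SE = -sum p_i * log9(p_i) over the 9 intervals
    return -sum((c / total) * math.log(c / total, 9) for c in counts if c)
```

A perfectly predictable steering trace puts all deviations in one bin and yields SE = 0; with 9 bins and log base 9, SE is bounded above by 1.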
The steering wheel angular velocity absolute value mean SAVMEAN and standard deviation SAVSTD reflect fluctuation in the vehicle's steering; they are calculated by substituting the steering wheel angular velocity SAR for the steering wheel angle SA in formulas one and two;
The zero speed percentage PNS detects the operating characteristic of the steering wheel being held continuously still, and is calculated by formula seven:

Formula seven: PNS = (n / N) × 100%
where N is the total number of angular velocity samples in the selected time and n is the number of those samples whose angular velocity lies within ±0.1°/s.
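Formula seven in Python; the ±0.1°/s threshold is taken from the text:

```python
def zero_speed_percentage(sav, threshold=0.1):
    """PNS: percentage of angular velocity samples within +/- threshold deg/s."""
    n = sum(1 for v in sav if abs(v) <= threshold)
    return n / len(sav) * 100.0
```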
Further, in the training process of the fatigue prediction neural network model, a cross-entropy function is adopted as the loss function Em. Let the training set be N sample pairs <X_i, O_i>, where X_i is the fused feature vector set of the i-th window sample and O_i is the label corresponding to it. O_i takes the value 1 or 0, where 1 represents fatigue driving and 0 represents non-fatigue driving: O_i is 1 when the fused feature vector set of the i-th window sample corresponds to fatigue driving, and 0 when it corresponds to non-fatigue driving. The loss function Em is calculated as:

Em = −(1/N) · Σ_{i=1}^{N} [ O_i · ln(y_i) + (1 − O_i) · ln(1 − y_i) ]

where y_i is the fatigue probability the model outputs for X_i.
The training set is divided into mini-batches used as the input of each iteration, and model training proceeds by stochastic gradient descent over multiple iterations until the loss function converges, yielding the trained fatigue prediction neural network model.
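The training procedure can be sketched as follows. This is a minimal illustration: the learning rate, batch size, and epoch count are hypothetical, as the patent does not specify them:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train(samples, labels, lr=0.5, epochs=300, batch=2):
    """Mini-batch stochastic gradient descent on the cross-entropy
    loss Em for the single fully connected layer (no bias)."""
    dim = len(samples[0])
    w = [0.0] * dim
    order = list(range(len(samples)))
    rng = random.Random(0)
    for _ in range(epochs):
        rng.shuffle(order)
        for start in range(0, len(order), batch):
            chunk = order[start:start + batch]
            grad = [0.0] * dim
            for i in chunk:
                y = sigmoid(sum(wj * xj for wj, xj in zip(w, samples[i])))
                err = y - labels[i]   # gradient of Em w.r.t. the pre-activation
                for j in range(dim):
                    grad[j] += err * samples[i][j]
            for j in range(dim):
                w[j] -= lr * grad[j] / len(chunk)
    return w
```

For the sigmoid/cross-entropy pair the gradient of Em with respect to the pre-activation collapses to (y − O_i), which keeps the update rule simple.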
Further, the driving recorder host further comprises a fatigue grade judging module, wherein the fatigue grade judging module is used for judging the fatigue grade of the driver according to the fatigue probability value, and specifically comprises the following steps:
when the fatigue probability value is less than 0.6, judging that the vehicle is in non-fatigue driving;
when the fatigue probability value is more than or equal to 0.6 and less than 0.9, judging the driver is light fatigue driving;
and when the fatigue probability value is more than or equal to 0.9, judging the deep fatigue driving.
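The three-level judgment rule above, in code form:

```python
def fatigue_grade(p):
    """Map the fatigue probability value to the driver's fatigue grade."""
    if p < 0.6:
        return "non-fatigue driving"
    if p < 0.9:
        return "light fatigue driving"
    return "deep fatigue driving"
```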
The invention has the following beneficial effects:
1. compared with the existing method for analyzing the fatigue level of the driver only through the driving behavior characteristics, the method for analyzing the fatigue level of the driver based on the eye closing characteristics, the mouth opening and closing characteristics and the driving behavior characteristics is combined to analyze the fatigue level of the driver, so that the limitation of a single information source is overcome, the relevance and the complementarity of each information source are fully considered, and the fatigue level analysis is more accurate.
2. The fatigue prediction method predicts the fatigue probability value through the fatigue prediction neural network model based on the obtained eye closing characteristics, mouth opening and closing characteristics and driving behavior characteristics, judges the fatigue grade through the fatigue probability value, and has strong real-time performance of fatigue prediction.
3. Through analysis of fatigue driving, the invention determines the parameters that best reflect driving behavior as the driving behavior features, namely the steering wheel angle absolute mean SAMEAN, the steering wheel angle standard deviation SASTD, the steering wheel angle lower quartile mean SAQ1MEAN, the steering wheel angle upper quartile mean SAQ3MEAN, the steering wheel angle entropy SE, the steering wheel angular velocity absolute value mean SAVMEAN, the steering wheel angular velocity standard deviation SAVSTD, and the zero speed percentage PNS. All of these can be calculated from just the steering wheel angle SA and the steering wheel angular velocity SAR, which avoids a large amount of unnecessary data acquisition and related computation; compared with the prior art, the fatigue grade analysis is more accurate.
Drawings
FIG. 1 is a schematic block diagram of a driver fatigue level identification system with a bimodal feature fusion according to an embodiment of the present invention;
FIG. 2 is a schematic diagram of a sliding time window provided by an embodiment of the present invention;
fig. 3 is a fatigue prediction neural network model according to an embodiment of the present invention.
Detailed Description
The technical solutions in the embodiments of the present invention will be clearly and completely described below with reference to the drawings in the embodiments of the present invention, and it is obvious that the described embodiments are only a part of the present invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the embodiments given herein without making any creative effort, shall fall within the protection scope of the present invention.
As shown in fig. 1, the invention provides a driver fatigue level recognition system with fusion of bimodal features, which comprises an image acquisition module, a facial behavior recognition module and a driving recorder host, wherein the image acquisition module and the driving recorder host are both connected with the facial behavior recognition module, and the driving recorder host is also connected with a CAN bus of a vehicle;
the image acquisition module is used for acquiring a face dynamic image of a driver;
the facial behavior recognition module is used for recognizing eye closing characteristics and mouth opening and closing characteristics from the acquired dynamic human face image;
the driving recorder host is used for acquiring and analyzing vehicle operation information from a vehicle CAN bus, calculating driving behavior characteristics through the vehicle operation information, and analyzing the fatigue level of a driver through fusion of eye closing characteristics, mouth opening and closing characteristics and driving behavior characteristics.
The driving recorder host comprises a CAN bus communication module, and the CAN bus communication module is connected with a vehicle CAN bus to acquire and analyze vehicle operation information in the vehicle CAN bus.
The eye closing characteristic and the mouth opening and closing characteristic are used as visual processing information characteristics of the driver, and the method and the device fuse the visual processing information and the driving behavior characteristics of the driver to judge whether the driver is in a fatigue driving state or not so as to overcome the limitation of a single information source, fully consider the relevance and complementarity of each information source and analyze the fatigue grade more accurately.
Preferably, the image acquisition module comprises a visible light camera and an infrared camera; the visible light camera is used for collecting a face dynamic image of the driver in the daytime, and the infrared camera is used for collecting the face dynamic image of the driver at night.
Preferably, the driving recorder host is further configured to fuse the features of the two modalities, namely the visual processing information features and the driving behavior features, using a feature parameter group fusion method based on a sliding time window, as shown in fig. 2. Let the current time be t and the optimal time window of feature parameter x_i be T_i. If the next extraction time of x_i is t + Δt, the time window slides forward to the period [t + Δt − T_i, t + Δt], and the proportion of repeated data between consecutive windows is (T_i − Δt)/T_i. The invention selects the sliding step Δt = 4 s.
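The overlap between consecutive window positions can be sketched as follows (Δt = 4 s by default, per the text):

```python
def window_overlap_ratio(t_i, dt=4.0):
    """Proportion of repeated data (T_i - dt) / T_i between consecutive
    positions of a sliding window of length t_i moved forward by dt seconds."""
    return max(0.0, (t_i - dt) / t_i)
```

For the 10 s windows this gives 60% overlap per step, and for the 60 s windows over 93%, so short-window features refresh much faster than the frequency features.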
Preferably, the eye closure characteristics include a maximum closed-eye time within a time window, a frequency of blinking within a time window, and a percentage of closed-eye time within a time window, and the mouth opening and closing characteristics include a maximum mouth opening time within a time window, a frequency of yawning within a time window, and a percentage of mouth opening time within a time window.
Wherein the time window corresponding to the maximum eye closing time, the percentage of eye closing time, the maximum mouth opening time and the percentage of mouth opening time is 10 seconds, and the time window corresponding to the blink frequency and the yawning frequency is 60 seconds.
Preferably, the facial behavior recognition module recognizes the eye closing feature and the mouth opening and closing feature through a built-in face recognition algorithm.
Preferably, when the opening range of the driver's mouth is detected to reach one third of the face range and lasts for more than 2 s, a yawn is judged; and when the driver's eyes are detected to remain closed for 1.5 s, an eye-closing event is judged.
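The duration rules can be sketched over per-frame detections. The frame rate and boolean inputs are hypothetical; the per-frame mouth/eye states are assumed to come from the face recognition algorithm:

```python
def detect_events(frames, fps=10):
    """frames: list of (mouth_open, eyes_closed) booleans per video frame.
    Returns (yawn_detected, eye_closure_detected) using the thresholds
    from the text: mouth open for more than 2 s, eyes closed for 1.5 s."""
    def longest_run_seconds(flags):
        best = run = 0
        for f in flags:
            run = run + 1 if f else 0
            best = max(best, run)
        return best / fps
    yawn = longest_run_seconds([m for m, _ in frames]) > 2.0
    eye_closure = longest_run_seconds([e for _, e in frames]) >= 1.5
    return yawn, eye_closure
```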
Preferably, the driving recorder host comprises a vehicle operation information acquisition module and a driving behavior feature calculation module. The vehicle operation information acquisition module is configured to acquire and analyze vehicle operation information from the vehicle CAN bus, and the driving behavior feature calculation module is configured to calculate the driving behavior features from the vehicle operation information. The vehicle operation information includes the steering wheel angle SA and the steering wheel angular velocity SAR, and the driving behavior features include the steering wheel angle absolute mean SAMEAN, the steering wheel angle standard deviation SASTD, the steering wheel angle lower quartile mean SAQ1MEAN, the steering wheel angle upper quartile mean SAQ3MEAN, the steering wheel angle entropy SE, the steering wheel angular velocity absolute value mean SAVMEAN, the steering wheel angular velocity standard deviation SAVSTD, the zero speed percentage PNS, and the accumulated driving duration.
The driving behavior feature calculation module specifically calculates the driving behavior features through vehicle operation information as follows:
The steering wheel angle absolute mean SAMEAN is the average of the absolute steering wheel angles, calculated by formula one:

Formula one: SAMEAN = (1/N) · Σ_{i=1}^{N} |SA_i|

where N is the number of steering wheel angle samples and SA_i is the i-th steering wheel angle sample;
The steering wheel angle standard deviation SASTD is calculated by formula two:

Formula two: SASTD = sqrt( (1/(N−1)) · Σ_{i=1}^{N} (SA_i − SA_m)² )

where SA_m is the mean steering wheel angle, calculated by formula three:

Formula three: SA_m = (1/N) · Σ_{i=1}^{N} SA_i
Arrange the N steering wheel angle samples from smallest to largest; the value at the one-quarter position is the steering wheel angle lower quartile SAQ1 and the value at the three-quarters position is the upper quartile SAQ3. The lower quartile mean SAQ1MEAN is the mean of the samples smaller than SAQ1, and the upper quartile mean SAQ3MEAN is the mean of the samples larger than SAQ3;
The steering wheel angle entropy SE reflects the degree of disorder and randomness of the driver's operation of the steering wheel: the larger SE is, the greater the randomness of the driver's steering operation and the higher the driver's degree of fatigue. SE is calculated from the probability of occurrence of the steering wheel angle prediction deviation. First, the predicted steering wheel angle θ_p(n) is calculated according to formula four:

Formula four: θ_p(n) = θ(n−1) + [θ(n−1) − θ(n−2)] + (1/2) · {[θ(n−1) − θ(n−2)] − [θ(n−2) − θ(n−3)]}
Then the steering wheel angle prediction deviation e_n is calculated as the difference between the actual steering wheel angle θ(n) and the predicted value θ_p(n), as in formula five:

Formula five: e_n = θ(n) − θ_p(n)

The steering wheel angle prediction deviation e_n obeys a normal distribution N(μ, σ²). The deviation e_n is divided into 9 intervals: (−∞, −5μ], (−5μ, −2.5μ], (−2.5μ, −μ], (−μ, −0.5μ], (−0.5μ, 0.5μ), [0.5μ, μ), [μ, 2.5μ), [2.5μ, 5μ), [5μ, +∞). The probability value p_i of the deviation falling in each interval is then calculated, and finally the steering wheel angle entropy SE is obtained from formula six:

Formula six: SE = −Σ_{i=1}^{9} p_i · log_9(p_i)
The steering wheel angular velocity absolute value mean SAVMEAN and standard deviation SAVSTD reflect fluctuation in the vehicle's steering; they are calculated by substituting the steering wheel angular velocity SAR for the steering wheel angle SA in formulas one and two;
The zero speed percentage PNS detects the operating characteristic of the steering wheel being held continuously still, and is calculated by formula seven:

Formula seven: PNS = (n / N) × 100%
where N is the total number of angular velocity samples in the selected time and n is the number of those samples whose angular velocity lies within ±0.1°/s.
Preferably, the driving recorder host comprises a fatigue prediction neural network model, and the analysis of the fatigue level of the driver through the fusion of the eye closing feature, the mouth opening and closing feature and the driving behavior feature specifically comprises the following steps:
forming a fused feature vector set X by fusing the eye closing features, mouth opening and closing features and driving behavior features, where X = {x1, x2, …, x15}: x1 is the maximum eye closing time, x2 the blink frequency, x3 the percentage of eye closing time, x4 the maximum mouth opening time, x5 the yawning frequency, x6 the percentage of mouth opening time, x7 the steering wheel angle absolute mean SAMEAN, x8 the steering wheel angle standard deviation SASTD, x9 the steering wheel angle lower quartile mean SAQ1MEAN, x10 the steering wheel angle upper quartile mean SAQ3MEAN, x11 the steering wheel angle entropy SE, x12 the steering wheel angular velocity absolute value mean SAVMEAN, x13 the steering wheel angular velocity standard deviation SAVSTD, x14 the zero speed percentage PNS, and x15 the accumulated driving duration;
constructing a fatigue prediction neural network model, wherein the fatigue prediction neural network model uses a full connection layer to judge fatigue driving probability, as shown in fig. 3, the input of the model is a fusion feature vector set in a fusion window, the fusion feature vector set and a weight vector w of the full connection layer perform vector product operation, the vector product is input to a Sigmoid activation function (represented as σ in fig. 3), and a fatigue probability value y between 0 and 1 is output through the Sigmoid activation function, and the specific formula is as follows:
y = σ(w · X) = 1 / (1 + e^(−w·X))
preferably, in the training process of the fatigue prediction neural network model, a cross-entropy (cross-entropy) function is adopted as the loss function Em, so that a training set is N sample pairs<X i,O i>In which X iIs the fused feature vector set of the ith window sample, O iIs a label corresponding to the fused feature vector set of the ith window sample, O iThe value is 1 or 0, 1 represents fatigue driving, 0 represents non-fatigue driving, and when the fusion feature vector set of the ith window sample corresponds to fatigue driving, O iThe value is 1, and when the fusion feature vector set of the ith window sample corresponds to fatigue driving, O iWith a value of 0, the calculation formula of the loss function Em is as follows:
Figure 609589DEST_PATH_IMAGE014
wherein the content of the first and second substances,
Figure 561365DEST_PATH_IMAGE015
and dividing the training set into small batches to be used as input of each iteration in the training process, and performing model training by using a random gradient descent optimization algorithm through multiple iterations until a loss function is converged to obtain the trained fatigue prediction neural network model.
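The mini-batch stochastic gradient descent training described above can be sketched as follows; the learning rate, epoch count and batch size are illustrative assumptions, and the gradient uses the standard sigmoid-plus-cross-entropy form dEm/dw = (y − O)·X:

```python
import math
import random

def sigmoid(z):
    return 1.0 / (1.0 + math.exp(-z))

def train_fatigue_model(samples, dim, lr=0.1, epochs=200, batch_size=4, seed=0):
    """Mini-batch SGD on the cross-entropy loss Em.

    samples -- list of (X_i, O_i) pairs; O_i is 1 for fatigue driving, 0 otherwise
    dim     -- dimension of the fused feature vector (15 in the patent)
    Returns the learned weight vector w of the fully connected layer.
    """
    rng = random.Random(seed)
    w = [0.0] * dim
    data = list(samples)
    for _ in range(epochs):
        rng.shuffle(data)                      # new mini-batch split each epoch
        for start in range(0, len(data), batch_size):
            batch = data[start:start + batch_size]
            grad = [0.0] * dim
            for x, o in batch:
                y = sigmoid(sum(wi * xi for wi, xi in zip(w, x)))
                for j in range(dim):           # gradient of Em for this sample
                    grad[j] += (y - o) * x[j]
            for j in range(dim):               # averaged mini-batch update
                w[j] -= lr * grad[j] / len(batch)
    return w
```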
The training set is obtained as follows: under a plurality of scenes in which the driver is determined to be in fatigue driving, a set of fused feature vector sets is formed by fusing the eye closing features, mouth opening and closing features and driving behavior features of the corresponding driver; and under a plurality of scenes in which the driver is determined to be in non-fatigue driving, a set of fused feature vector sets is formed by fusing the same features of the corresponding driver.
Preferably, the driving recorder host further comprises a fatigue level judgment module, and the fatigue level judgment module is used for judging the driver fatigue level according to the fatigue probability value, the fatigue level judgment rule being as follows:
when the fatigue probability value is less than 0.6, non-fatigue driving is judged;
when the fatigue probability value is greater than or equal to 0.6 and less than 0.9, light fatigue driving is judged;
and when the fatigue probability value is greater than or equal to 0.9, deep fatigue driving is judged.
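The three-level rule reads directly as a threshold function; the sketch below uses illustrative names:

```python
def fatigue_level(y):
    """Map the fatigue probability value y (0..1) to a fatigue level."""
    if y < 0.6:
        return "non-fatigue driving"
    if y < 0.9:                       # 0.6 <= y < 0.9
        return "light fatigue driving"
    return "deep fatigue driving"     # y >= 0.9
```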
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (6)

1. A driver fatigue grade recognition system with dual-modal feature fusion is characterized by comprising an image acquisition module, a facial behavior recognition module and a driving recorder host, wherein the image acquisition module and the driving recorder host are both connected with the facial behavior recognition module, and the driving recorder host is also connected with a CAN bus of a vehicle;
the image acquisition module is used for acquiring a face dynamic image of a driver;
the facial behavior recognition module is used for recognizing eye closing characteristics and mouth opening and closing characteristics from the acquired dynamic human face image;
the driving recorder host is used for acquiring and analyzing vehicle operation information from a vehicle CAN bus, calculating driving behavior characteristics through the vehicle operation information, and analyzing the fatigue level of a driver through fusion of eye closing characteristics, mouth opening and closing characteristics and driving behavior characteristics;
wherein the eye closing characteristics comprise a maximum eye closure time, a blink frequency and a percentage of eye closure time, and the mouth opening and closing characteristics comprise a maximum mouth opening time, a yawning frequency and a percentage of mouth opening time;
the driving recorder host comprises a vehicle operation information acquisition module and a driving behavior feature calculation module, wherein the vehicle operation information acquisition module is used for acquiring and analyzing vehicle operation information from the vehicle CAN bus, and the driving behavior feature calculation module is used for calculating driving behavior features from the vehicle operation information; the vehicle operation information comprises a steering wheel angle SA and a steering wheel angular velocity SAR, and the driving behavior features comprise a steering wheel angle absolute mean value SAMEAN, a steering wheel angle standard deviation SASTD, a steering wheel angle lower quartile mean value SAQ1MEAN, a steering wheel angle upper quartile mean value SAQ3MEAN, a steering wheel corner entropy SE, a steering wheel angular velocity absolute mean value SAVMEAN, a steering wheel angular velocity standard deviation SAVSTD, a zero speed percentage PNS and an accumulated driving duration;
the driving recorder host further comprises a fatigue prediction neural network model, and the fatigue grade of the driver is analyzed through fusion of eye closing characteristics, mouth opening and closing characteristics and driving behavior characteristics, wherein the method specifically comprises the following steps:
forming a fusion feature vector set X by fusing the eye closing features, the mouth opening and closing features and the driving behavior features, wherein X = { X1, X2, …, X15 }, X1 is the longest eye closing time, X2 is the blink frequency, X3 is the percentage eye closing time, X4 is the longest mouth opening time, X5 is the yawning frequency, X6 is the percentage mouth opening time, X7 is the steering wheel angle absolute mean value SAMEAN, X8 is the steering wheel angle standard deviation SASTD, X9 is the steering wheel angle lower quartile mean value SAQ1MEAN, X10 is the steering wheel angle upper quartile mean value SAQ3MEAN, X11 is the steering wheel corner entropy SE, X12 is the steering wheel angular velocity absolute mean value SAVMEAN, X13 is the steering wheel angular velocity standard deviation SAVSTD, X14 is the zero speed percentage PNS, and X15 is the accumulated driving duration;
the fatigue prediction neural network model judges fatigue driving probability by using a full connection layer, the input of the model is a fusion feature vector set in a fusion window, the fusion feature vector set and a weight vector w of the full connection layer carry out vector product operation, the vector product is input to a Sigmoid activation function, and the fatigue probability value y between 0 and 1 is output through the Sigmoid activation function, and the specific formula is as follows:
$$y = \sigma(w \cdot X) = \frac{1}{1 + e^{-w \cdot X}}$$
2. the bimodal feature fused driver fatigue level recognition system as claimed in claim 1, wherein the image acquisition module comprises a visible light camera and an infrared camera; the visible light camera is used for collecting a face dynamic image of the driver in the daytime, and the infrared camera is used for collecting the face dynamic image of the driver at night.
3. The bimodal feature fused driver fatigue level identification system as claimed in claim 1, wherein the time window for the maximum eye closure time, the percentage of eye closure time, the maximum mouth opening time and the percentage of mouth opening time is 10 seconds, and the time window for the blink frequency and the yawning frequency is 60 seconds.
4. The bimodal feature fused driver fatigue level recognition system as claimed in claim 1, wherein the driving behavior feature calculation module calculates the driving behavior feature from the vehicle operation information, specifically:
the absolute steering wheel angle mean value SAMEAN is an average value of absolute steering wheel angles, and a calculation formula is shown as formula one:
the formula one is as follows:
$$SAMEAN = \frac{1}{N}\sum_{i=1}^{N}\left|SA_i\right|$$
wherein N is the number of steering wheel angle samples, and SA_i is the ith steering wheel angle sample;
the calculation formula of the steering wheel angle standard deviation SASTD is shown as formula two:
the formula two is as follows:
$$SASTD = \sqrt{\frac{1}{N-1}\sum_{i=1}^{N}\left(SA_i - SA_m\right)^2}$$
wherein SA_m is the mean value of the steering wheel angle samples, and its calculation formula is shown as formula three:
the formula three is as follows:
$$SA_m = \frac{1}{N}\sum_{i=1}^{N} SA_i$$
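Formulas one to three can be sketched as follows; the N−1 denominator in the standard deviation is an assumption, since the patent's image formula is not legible:

```python
import math

def samean(sa):
    """Formula one: mean of the absolute steering wheel angles."""
    return sum(abs(a) for a in sa) / len(sa)

def sastd(sa):
    """Formula two: standard deviation of the steering wheel angle around
    the mean SA_m of formula three (N-1 denominator is an assumption)."""
    n = len(sa)
    sa_m = sum(sa) / n                       # formula three
    return math.sqrt(sum((a - sa_m) ** 2 for a in sa) / (n - 1))
```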
arranging the N values in the steering wheel angle sample from small to large, wherein the value at the one-quarter position is the lower quartile value SAQ1 of the steering wheel angle and the value at the three-quarter position is the upper quartile value SAQ3 of the steering wheel angle; the lower quartile mean value SAQ1MEAN of the steering wheel angle is the mean value of the samples smaller than the lower quartile value SAQ1, and the upper quartile mean value SAQ3MEAN of the steering wheel angle is the mean value of the samples larger than the upper quartile value SAQ3;
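SAQ1MEAN / SAQ3MEAN can be sketched as follows under a simple reading of the quartile positions; the N/4 and 3N/4 indices used here are an assumption, since quartile index conventions vary:

```python
def quartile_means(sa):
    """Mean of the samples below the lower quartile SAQ1 and above the
    upper quartile SAQ3 of the sorted steering wheel angle samples."""
    s = sorted(sa)
    n = len(s)
    saq1 = s[n // 4]            # value at the one-quarter position
    saq3 = s[(3 * n) // 4]      # value at the three-quarter position
    low = [a for a in s if a < saq1]
    high = [a for a in s if a > saq3]
    saq1mean = sum(low) / len(low) if low else float(saq1)
    saq3mean = sum(high) / len(high) if high else float(saq3)
    return saq1mean, saq3mean
```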
the steering wheel corner entropy SE reflects the disorder degree and randomness of the driver's operation of the steering wheel: the larger the steering wheel corner entropy SE, the larger the randomness of the driver's operation of the steering wheel and the higher the fatigue degree of the driver. The steering wheel corner entropy SE is calculated from the probability of occurrence of the steering wheel angle prediction deviation, and the predicted steering wheel angle value θp(n) is calculated according to formula four:
The formula four is as follows:
$$\theta_p(n) = \theta(n-1) + \left[\theta(n-1) - \theta(n-2)\right] + \frac{1}{2}\left\{\left[\theta(n-1) - \theta(n-2)\right] - \left[\theta(n-2) - \theta(n-3)\right]\right\}$$
then, the steering wheel angle prediction deviation e_n is calculated from the difference between the actual steering wheel angle value θ(n) and the predicted steering wheel angle value θp(n), and the calculation formula is the following formula five:
the formula five is as follows:
$$e_n = \theta(n) - \theta_p(n)$$
the steering wheel angle prediction deviation e_n obeys a normal distribution N(μ, σ²); the prediction deviation e_n is divided into 9 intervals, (−∞, −5μ], (−5μ, −2.5μ], (−2.5μ, −μ], (−μ, −0.5μ], (−0.5μ, 0.5μ), [0.5μ, μ), [μ, 2.5μ), [2.5μ, 5μ), [5μ, +∞), then the probability value p_i of each interval is calculated, and finally the steering wheel corner entropy SE is calculated according to formula six:
the formula six is as follows:
$$SE = -\sum_{i=1}^{9} p_i \log_9 p_i$$
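Formulas four to six can be sketched as follows. The second-order Taylor predictor and the base-9 logarithm are the classic steering-entropy choices and are assumptions here, not taken verbatim from the patent:

```python
import math

def steering_entropy(theta, mu):
    """Steering wheel corner entropy SE.

    theta -- steering wheel angle samples theta(n)
    mu    -- scale parameter delimiting the 9 deviation intervals
    """
    deviations = []
    for n in range(3, len(theta)):
        d1 = theta[n - 1] - theta[n - 2]
        d2 = theta[n - 2] - theta[n - 3]
        theta_p = theta[n - 1] + d1 + 0.5 * (d1 - d2)  # formula four (assumed)
        deviations.append(theta[n] - theta_p)          # formula five: e_n
    edges = [-5 * mu, -2.5 * mu, -mu, -0.5 * mu, 0.5 * mu, mu, 2.5 * mu, 5 * mu]
    counts = [0] * 9
    for e in deviations:
        counts[sum(e >= b for b in edges)] += 1        # locate the interval of e_n
    total = len(deviations)
    # formula six: SE = -sum p_i log_9 p_i (base 9 for 9 intervals, assumed)
    return -sum((c / total) * math.log(c / total, 9) for c in counts if c)
```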
the mean value SAVMEAN and the standard deviation SAVSTD of the steering wheel angular velocity absolute value reflect the fluctuation condition of the vehicle, and are calculated by replacing the steering wheel angle SA in formula one and formula two with the steering wheel angular velocity SAR;
the zero-speed percentage PNS detects the operation characteristic that the steering wheel is held continuously still, and the calculation formula is shown as formula seven:
the formula seven is as follows:
$$PNS = \frac{n}{N} \times 100\%$$
wherein N is the total number of angular velocity samples in the selected time, and n is the number of angular velocity samples within ±0.1°/s among the total samples.
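Formula seven is a direct ratio; the ±0.1°/s band below is taken from the claim text, and the function name is illustrative:

```python
def pns(sav):
    """Formula seven: zero-speed percentage PNS.

    sav -- steering wheel angular velocity samples (deg/s) in the window
    """
    n_still = sum(1 for v in sav if abs(v) <= 0.1)  # samples within +/-0.1 deg/s
    return 100.0 * n_still / len(sav)
```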
5. The bimodal feature fused driver fatigue level identification system according to claim 1, wherein in the training process of the fatigue prediction neural network model, a cross-entropy function is adopted as the loss function Em; the training set is N sample pairs <Xi, Oi>, wherein Xi is the fused feature vector set of the ith window sample and Oi is the label corresponding to that set, Oi taking the value 1 for fatigue driving and 0 for non-fatigue driving; the calculation formula of the loss function Em is as follows:
$$E_m = -\frac{1}{N}\sum_{i=1}^{N}\left[O_i \ln y_i + (1 - O_i)\ln\left(1 - y_i\right)\right]$$
wherein $y_i = \sigma(w \cdot X_i)$ is the fatigue probability predicted for the ith window sample;
and dividing the training set into small batches to be used as input of each iteration in the training process, and performing model training by using a random gradient descent optimization algorithm through multiple iterations until a loss function is converged to obtain the trained fatigue prediction neural network model.
6. The bimodal feature fused driver fatigue level recognition system as claimed in claim 1, wherein the driving recorder host further comprises a fatigue level judgment module, and the fatigue level judgment module is configured to judge a driver fatigue level according to a fatigue probability value, specifically:
when the fatigue probability value is less than 0.6, non-fatigue driving is judged;
when the fatigue probability value is greater than or equal to 0.6 and less than 0.9, light fatigue driving is judged;
and when the fatigue probability value is greater than or equal to 0.9, deep fatigue driving is judged.
CN201911401097.6A 2019-12-31 2019-12-31 Driver fatigue grade recognition system with bimodal feature fusion Pending CN110781872A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201911401097.6A CN110781872A (en) 2019-12-31 2019-12-31 Driver fatigue grade recognition system with bimodal feature fusion


Publications (1)

Publication Number Publication Date
CN110781872A true CN110781872A (en) 2020-02-11

Family

ID=69394801

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201911401097.6A Pending CN110781872A (en) 2019-12-31 2019-12-31 Driver fatigue grade recognition system with bimodal feature fusion

Country Status (1)

Country Link
CN (1) CN110781872A (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111806230A (en) * 2020-06-17 2020-10-23 宁波智翔信息技术有限公司 Anti-fatigue protection method and protection system based on truck driver
CN112381015A (en) * 2020-11-19 2021-02-19 联通智网科技有限公司 Fatigue degree identification method, device and equipment
CN112686103A (en) * 2020-12-17 2021-04-20 浙江省交通投资集团有限公司智慧交通研究分公司 Vehicle-road cooperative fatigue driving monitoring system
CN113312948A (en) * 2020-03-26 2021-08-27 香港生产力促进局 Method, equipment and system for detecting drowsiness by using deep learning model
CN117975665A (en) * 2024-03-28 2024-05-03 钧捷智能(深圳)有限公司 DMS driver fatigue grade identification system

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105787510A (en) * 2016-02-26 2016-07-20 华东理工大学 System and method for realizing subway scene classification based on deep learning
CN105956548A (en) * 2016-04-29 2016-09-21 奇瑞汽车股份有限公司 Driver fatigue state detection method and device
EP3279700A1 (en) * 2016-08-04 2018-02-07 Nuctech Company Limited Security inspection centralized management system
CN108230619A (en) * 2016-12-14 2018-06-29 贵港市瑞成科技有限公司 Method for detecting fatigue driving based on multi-feature fusion
CN110096957A (en) * 2019-03-27 2019-08-06 苏州清研微视电子科技有限公司 The fatigue driving monitoring method and system merged based on face recognition and Activity recognition
CN110119714A (en) * 2019-05-14 2019-08-13 济南浪潮高新科技投资发展有限公司 A kind of Driver Fatigue Detection and device based on convolutional neural networks


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
NIU QINGNING: "Research on Fatigue Driving Detection Method Based on Information Fusion", China Doctoral Dissertations Full-text Database, Engineering Science and Technology II *


Similar Documents

Publication Publication Date Title
CN110781873A (en) Driver fatigue grade identification method based on bimodal feature fusion
CN110781872A (en) Driver fatigue grade recognition system with bimodal feature fusion
CN111274881B (en) Driving safety monitoring method and device, computer equipment and storage medium
CN110901385B (en) Active speed limiting method based on fatigue state of driver
CN108469806B (en) Driving right transfer method in alternating type man-machine common driving
CN109147279B (en) Driver fatigue driving monitoring and early warning method and system based on Internet of vehicles
CN108372785A (en) A kind of non-security driving detection device of the automobile based on image recognition and detection method
US20220175287A1 (en) Method and device for detecting driver distraction
CN202257856U (en) Driver fatigue-driving monitoring device
CN110895662A (en) Vehicle overload alarm method and device, electronic equipment and storage medium
CN108091132B (en) Traffic flow prediction method and device
CN112700470A (en) Target detection and track extraction method based on traffic video stream
CN111738337B (en) Driver distraction state detection and identification method in mixed traffic environment
CN109740477A (en) Study in Driver Fatigue State Surveillance System and its fatigue detection method
CN111242484A (en) Vehicle risk comprehensive evaluation method based on transition probability
CN107563346A (en) One kind realizes that driver fatigue sentences method for distinguishing based on eye image processing
CN110992709A (en) Active speed limiting system based on fatigue state of driver
CN110588658A (en) Method for detecting risk level of driver based on comprehensive model
CN112381015A (en) Fatigue degree identification method, device and equipment
CN110562261A (en) Method for detecting risk level of driver based on Markov model
Shi et al. Real-time driving risk assessment using deep learning with XGBoost
CN115782905A (en) Automatic driving vehicle driving safety degree quantification system
CN115797403A (en) Traffic accident prediction method and device, storage medium and electronic device
Corcoran et al. Traffic risk assessment: A two-stream approach using dynamic-attention
CN111563468A (en) Driver abnormal behavior detection method based on attention of neural network

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
RJ01 Rejection of invention patent application after publication

Application publication date: 20200211