CN112220480B - Driver state detection system based on millimeter wave radar and camera fusion and vehicle - Google Patents

Driver state detection system based on millimeter wave radar and camera fusion and vehicle

Info

Publication number
CN112220480B
CN112220480B
Authority
CN
China
Prior art keywords
driver
fatigue
nodding
threshold
module
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Active
Application number
CN202011132636.3A
Other languages
Chinese (zh)
Other versions
CN112220480A (en)
Inventor
赵林峰
曹琴星
蔡必鑫
严明月
陈无畏
胡延平
刘晟山
王宁
张曼玲
王天元
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hefei University of Technology
Original Assignee
Hefei University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hefei University of Technology filed Critical Hefei University of Technology
Priority to CN202011132636.3A priority Critical patent/CN112220480B/en
Publication of CN112220480A publication Critical patent/CN112220480A/en
Application granted granted Critical
Publication of CN112220480B publication Critical patent/CN112220480B/en
Active legal-status Critical Current
Anticipated expiration legal-status Critical

Classifications

    • A HUMAN NECESSITIES
    • A61 MEDICAL OR VETERINARY SCIENCE; HYGIENE
    • A61B DIAGNOSIS; SURGERY; IDENTIFICATION
    • A61B5/00 Measuring for diagnostic purposes; Identification of persons
    • A61B5/0059 Measuring for diagnostic purposes using light, e.g. diagnosis by transillumination, diascopy, fluorescence
    • A61B5/0077 Devices for viewing the surface of the body, e.g. camera, magnifying lens
    • A61B5/02 Detecting, measuring or recording pulse, heart rate, blood pressure or blood flow; Combined pulse/heart-rate/blood pressure determination; Evaluating a cardiovascular condition not otherwise provided for, e.g. using combinations of techniques provided for in this group with electrocardiography or electroauscultation; Heart catheters for measuring blood pressure
    • A61B5/024 Detecting, measuring or recording pulse rate or heart rate
    • A61B5/103 Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
    • A61B5/11 Measuring movement of the entire body or parts thereof, e.g. head or hand tremor, mobility of a limb
    • A61B5/1103 Detecting eye twinkling
    • A61B5/1116 Determining posture transitions
    • A61B5/1126 Measuring movement of the entire body or parts thereof using a particular sensing technique
    • A61B5/1128 Measuring movement using a particular sensing technique using image analysis
    • A61B5/16 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state
    • A61B5/18 Devices for psychotechnics; Testing reaction times; Devices for evaluating the psychological state for vehicle drivers or machine operators

Landscapes

  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Engineering & Computer Science (AREA)
  • General Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Public Health (AREA)
  • Surgery (AREA)
  • Veterinary Medicine (AREA)
  • Pathology (AREA)
  • Physics & Mathematics (AREA)
  • Biomedical Technology (AREA)
  • Heart & Thoracic Surgery (AREA)
  • Medical Informatics (AREA)
  • Molecular Biology (AREA)
  • Animal Behavior & Ethology (AREA)
  • Physiology (AREA)
  • Dentistry (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Cardiology (AREA)
  • Ophthalmology & Optometry (AREA)
  • Child & Adolescent Psychology (AREA)
  • Developmental Disabilities (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
  • Psychiatry (AREA)
  • Hospice & Palliative Care (AREA)
  • Psychology (AREA)
  • Social Psychology (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Educational Technology (AREA)
  • Traffic Control Systems (AREA)

Abstract

The invention discloses a driver state detection system based on the fusion of a millimeter wave radar and a camera, and a vehicle equipped with the system. The driver state detection system comprises a data acquisition module and a data processing system. The data processing system comprises a data processing module, a driver fatigue judging module and a driving state judging module. The driver fatigue judging module comprises a facial fatigue judging sub-module, a head fatigue judging sub-module and a driver fatigue judging sub-module. The millimeter wave radar and an infrared camera capture the driver's heartbeat, expression and movements in real time, so that the system can detect whether the driver is fatigued. When fatigue is detected, it is quantified into a fatigue degree, giving the driver's fatigue level a precise numerical value and allowing the driver's driving ability to be judged reasonably. This provides a convincing, acceptable basis for concluding that the driver can no longer effectively control the vehicle, which helps the driver accept and trust the invention.

Description

Driver state detection system based on millimeter wave radar and camera fusion and vehicle
Technical Field
The invention relates to the field of driving state detection of drivers, in particular to a driver state detection system based on millimeter wave radar and camera fusion and a vehicle with the driver state detection system.
Background
With the rapid development of the automobile industry, traffic accidents have also become more frequent, and most accidents are caused by human factors such as fatigue driving, sudden illness of the driver, or driver distraction. Preventing and reducing traffic accidents caused by abnormal driver states therefore hinges on detecting and identifying those abnormal states. At present, driver state detection mostly relies either on fatigue detection with a vision sensor or on health monitoring of heartbeat, blood pressure and the like with contact devices. Detection with a single sensor captures little feature information and recognizes only a single type of state, while contact sensors are cumbersome to wear and uncomfortable for the driver.
The prior patent application CN111179552A (published on May 19, 2020) discloses a driver state monitoring method and system based on multi-sensor fusion. It judges that the driver is in a first fatigue state when the collected eye-closing behavior lasts longer than an eye-closing threshold, judges a second fatigue state when the collected yawning behavior lasts longer than a yawning threshold, and judges a third fatigue state when the collected and processed heart rate information exceeds a fatigue threshold. Each of the three fatigue states is assigned a weight, the weighted states are summed, the fatigue level (mild, moderate or severe) is determined from the sum, and the driver is then reminded by voice broadcast and/or a vibrating seat according to the fatigue level.
However, current fatigue judgments are not very accurate, which makes the driver's experience poor: drivers dislike the in-vehicle reminders and often force the function off, so it becomes almost useless in practice and hinders adoption of the technology. In addition, heart rate information mainly reflects exercise intensity and physical abnormality; because fatigue tolerance differs from person to person, heart rate expresses fatigue poorly and easily leads to misjudgment of fatigue.
Disclosure of Invention
In order to solve the technical problem that conventional driver state detection is difficult to popularize because of its low accuracy, the invention provides a driver state detection system based on the fusion of a millimeter wave radar and a camera, and a vehicle equipped with the system.
The invention departs from conventional driver state detection by applying different analyses to the collected data, so that it can detect whether the driver is fatigued. When fatigue is detected, it is quantified into a fatigue degree, giving the driver's fatigue level a precise numerical value, allowing the driving ability to be judged reasonably, and providing a convincing, acceptable basis for concluding that the driver can no longer effectively control the vehicle, which helps the driver accept and trust the invention. In addition, an existing vehicle can be upgraded simply by installing the millimeter wave radar, the infrared camera and the data processing system (either as an independent control board or as software loaded onto the vehicle's own control board), so driver state detection is achieved without replacing the vehicle; the cost is low and the solution is easy to popularize.
The invention is realized by adopting the following technical scheme: a driver state detection system based on millimeter wave radar and camera fusion, comprising:
the data acquisition module comprises a millimeter wave radar and an infrared camera; the millimeter wave radar is used for detecting a front-back movement distance FB of the driver's forehead or chin and recording the time over which that movement occurs, and for detecting a left-right movement distance LR of the driver's forehead or chin and recording the time over which that movement occurs; the infrared camera is used for detecting the driver's facial organs by acquiring images of the driver's face; and
the data processing system comprises a data processing module, a driver fatigue judging module and a driving state judging module;
the data processing module compares the front-back movement distance FB with a front-back movement threshold Tfb; if FB is greater than Tfb, it judges that the driver has made a front-back nod (counted as one front-back nod), defines the time over which FB occurred as the front-back nod duration T1, and divides FB by T1 to obtain the front-back nod speed F1; it counts the number of front-back nods N1 in a unit time T, together with the duration T1 and speed F1 of each front-back nod. The data processing module likewise compares the left-right movement distance LR with a left-right movement threshold Tlr; if LR is greater than Tlr, it judges that the driver has made a left-right nod (counted as one left-right nod), defines the time over which LR occurred as the left-right nod duration T2, and divides LR by T2 to obtain the left-right nod speed F2; it counts the number of left-right nods N2 in the unit time T, together with the duration T2 and speed F2 of each left-right nod. The data processing module also locates the eye region from the facial organs, judges the eyes-closed state at successive sampling moments for the located eye region, counts one eye closure each time the eyes close, and counts the eye closures in the unit time T to obtain the blink frequency FE. The data processing module further locates the mouth region from the facial organs and judges whether the mouth is open; if the mouth is open, it judges whether the mouth is covered and, if so, counts one yawn; if the mouth region cannot be located, it detects the driver's hand and, if a hand is detected, judges that the hand is covering the mouth and counts one yawn; it then counts the number of yawns FM in the unit time T;
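The following is a minimal Python sketch (not the patent's implementation) of how radar-measured head displacements could be turned into the nod statistics just described. The threshold values and the example numbers are assumptions for illustration only.

```python
# Sketch: accumulate nod count N, per-nod durations and per-nod speeds from
# radar-detected head displacements, as described in the data processing module.
from dataclasses import dataclass, field
from typing import List

@dataclass
class NodStats:
    count: int = 0                                          # nods N in the unit time T
    durations: List[float] = field(default_factory=list)    # nod duration for each nod
    speeds: List[float] = field(default_factory=list)       # speed = distance / duration

def update_nod_stats(stats: NodStats, distance_m: float, duration_s: float,
                     movement_threshold_m: float) -> None:
    """Count one nod when the measured displacement exceeds the movement threshold."""
    if distance_m > movement_threshold_m and duration_s > 0:
        stats.count += 1
        stats.durations.append(duration_s)
        stats.speeds.append(distance_m / duration_s)

# Example with hypothetical numbers: a 0.06 m front-back forehead movement over 0.5 s,
# against a hypothetical front-back movement threshold Tfb of 0.04 m.
front_back = NodStats()
update_nod_stats(front_back, distance_m=0.06, duration_s=0.5, movement_threshold_m=0.04)
```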
The driver fatigue judging module comprises a facial fatigue judging sub-module, a head fatigue judging sub-module and a driver fatigue judging sub-module;
the facial fatigue judging sub-module judges whether the driver is in a facial fatigue state, defined as facial fatigue FF, according to the blink frequency FE and the yawning frequency FM; facial fatigue FF=1 is judged when the following conditions are satisfied simultaneously:
(1) The blink frequency FE is greater than a blink frequency threshold FE; and
(2) The yawning frequency FM is greater than a yawning frequency threshold FM;
the head fatigue judging sub-module judges whether the driver is in a head fatigue state, defined as head fatigue FH, according to the front-back nod count N1, the front-back nod speed F1, the left-right nod count N2 and the left-right nod speed F2; head fatigue FH=1 is judged when the following conditions are satisfied simultaneously:
(1) The front-back nod duration T1 is greater than a front-back nod duration threshold T1, or the left-right nod duration T2 is greater than a left-right nod duration threshold T2;
(2) The front-back nod count N1 is greater than a front-back nod count threshold N1, or the left-right nod count N2 is greater than a left-right nod count threshold N2; and
(3) The front-back nod speed F1 is greater than a front-back nod speed threshold F1, or the left-right nod speed F2 is greater than a left-right nod speed threshold F2; and
the driver fatigue judging sub-module judges that the driver is fatigued when facial fatigue FF=1 or head fatigue FH=1;
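A minimal Python sketch of the facial-fatigue (FF), head-fatigue (FH) and overall fatigue decisions described above; all threshold parameters are placeholders, not values from the patent.

```python
# Sketch of the boolean fatigue decisions: FF, FH and driver fatigue.
def facial_fatigue(fe: float, fm: float, fe_thr: float, fm_thr: float) -> bool:
    """FF = 1 when blink frequency and yawning frequency both exceed their thresholds."""
    return fe > fe_thr and fm > fm_thr

def head_fatigue(t1: float, t2: float, n1: int, n2: int, f1: float, f2: float,
                 t1_thr: float, t2_thr: float, n1_thr: int, n2_thr: int,
                 f1_thr: float, f2_thr: float) -> bool:
    """FH = 1 when the duration, count and speed conditions all hold (front-back OR left-right)."""
    duration_ok = t1 > t1_thr or t2 > t2_thr
    count_ok = n1 > n1_thr or n2 > n2_thr
    speed_ok = f1 > f1_thr or f2 > f2_thr
    return duration_ok and count_ok and speed_ok

def driver_fatigued(ff: bool, fh: bool) -> bool:
    """Driver fatigue is declared when either FF = 1 or FH = 1."""
    return ff or fh
```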
the driving state judging module comprises a comprehensive fatigue degree calculating sub-module and a driving capacity judging sub-module;
the comprehensive fatigue degree calculation submodule calculates the comprehensive fatigue degree FD of the driver:
FD = b1*ΔFE + b2*ΔFM + b3*ΔT + b4*ΔN + b5*ΔF
where ΔFE is the absolute difference between the driver's blink frequency FE and the blink frequency threshold FE; ΔFM is the absolute difference between the yawning frequency FM and the yawning frequency threshold FM under driver fatigue; ΔT is the larger of the absolute difference between the front-back nod duration T1 and its threshold and the absolute difference between the left-right nod duration T2 and its threshold under driver fatigue; ΔN is the larger of the absolute difference between the front-back nod count N1 and its threshold and the absolute difference between the left-right nod count N2 and its threshold under driver fatigue; ΔF is the larger of the absolute difference between the front-back nod speed F1 and its threshold and the absolute difference between the left-right nod speed F2 and its threshold under driver fatigue; and b1, b2, b3, b4, b5 are the weight coefficients of ΔFE, ΔFM, ΔT, ΔN, ΔF in the calculation; and
the driving capability judging sub-module judges that the driver is in a non-driving state when the comprehensive fatigue degree FD is greater than a comprehensive fatigue degree threshold FD.
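A minimal Python sketch of the comprehensive fatigue degree FD and the driving-ability check above; the weight coefficients and the threshold are left as parameters rather than fixed values.

```python
# Sketch: FD = b1*dFE + b2*dFM + b3*dT + b4*dN + b5*dF, where each d* is the absolute
# difference (or the larger of two absolute differences) defined in the text.
def comprehensive_fatigue_degree(d_fe: float, d_fm: float, d_t: float,
                                 d_n: float, d_f: float,
                                 b1: float, b2: float, b3: float,
                                 b4: float, b5: float) -> float:
    return b1 * d_fe + b2 * d_fm + b3 * d_t + b4 * d_n + b5 * d_f

def unfit_to_drive_by_fatigue(fd: float, fd_threshold: float) -> bool:
    """The driver is judged to be in a non-driving state when FD exceeds its threshold."""
    return fd > fd_threshold
```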
As a further improvement of the above scheme, the data processing system further includes a body abnormality judging module, which judges whether the driver's body is abnormal according to the driver's average heart rate HR, the shielding duration TH of the head each time the driver yawns, and the driver's expression, and judges that the driver's body is abnormal when the following conditions are satisfied simultaneously:
(1) The average heart rate HR is greater than an upper heart rate threshold HR2 or less than a lower heart rate threshold HR1;
(2) The shielding duration TH is greater than a shielding threshold TH; and
(3) The driver's expression is an abnormal expression.
Preferably, the millimeter wave radar is further used for detecting the number of chest-cavity fluctuations B of the driver in the unit time T;
the data processing module also calculates the chest fluctuation rate B/min as B/T, and obtains the driver's average heart rate HR from the chest fluctuation rate as HR = K*(B/T), where K is a conversion coefficient.
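A minimal Python sketch of this heart-rate estimate. The example numbers and the value of the conversion coefficient K are assumptions; the patent states only that K is obtained experimentally.

```python
# Sketch: HR = K * (B / T), with B chest-cavity fluctuations counted over the unit time T.
def average_heart_rate(chest_fluctuations: int, unit_time_min: float, k: float) -> float:
    rate_per_min = chest_fluctuations / unit_time_min   # B/T, the chest fluctuation rate
    return k * rate_per_min                             # HR = K * (B/T)

# Example with hypothetical numbers: 70 fluctuations in 1 minute and K = 1.0.
hr = average_heart_rate(chest_fluctuations=70, unit_time_min=1.0, k=1.0)
```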
Preferably, when the data processing module judges that the hand is covering the mouth and counts one yawn, it also records how long the hand covers the mouth; the duration of each such covering is the corresponding shielding duration TH.
Preferably, the data processing module further recognizes the driver's facial expression from the facial image by using a convolutional neural network and a trained expression library, thereby classifying the driver's expression as normal or abnormal.
Preferably, the driving state judging module further comprises a comprehensive abnormality degree calculating sub-module;
the comprehensive anomaly degree calculation sub-module calculates the comprehensive anomaly degree AD of the driver:
AD = a1*ΔHR + a2*ΔTH
where ΔHR is the absolute difference between the average heart rate HR and the heart rate lower limit threshold HR1, or between HR and the heart rate upper limit threshold HR2, under the driver's body abnormality; ΔTH is the absolute difference between the shielding duration TH and the shielding threshold TH under the driver's body abnormality; and a1 and a2 are the weight coefficients of ΔHR and ΔTH in the calculation;
the driving capability judging sub-module also judges that the driver is in a non-driving state when the comprehensive anomaly degree AD is greater than a comprehensive anomaly degree threshold AD.
Further, a1 and a2 are 0.6 and 0.4, respectively.
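A minimal Python sketch of the comprehensive anomaly degree AD and the corresponding driving-ability check, using the weight values a1 = 0.6 and a2 = 0.4 given above; the AD threshold is a placeholder.

```python
# Sketch: AD = a1*dHR + a2*dTH and the threshold comparison.
def comprehensive_anomaly_degree(d_hr: float, d_th: float,
                                 a1: float = 0.6, a2: float = 0.4) -> float:
    return a1 * d_hr + a2 * d_th

def unfit_to_drive_by_anomaly(ad: float, ad_threshold: float) -> bool:
    """The driver is judged to be in a non-driving state when AD exceeds its threshold."""
    return ad > ad_threshold
```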
As a further improvement of the above, b1, b2, b3, b4, b5 are 0.3, 0.1, 0.2, 0.1, respectively.
The invention also provides a vehicle equipped with the driver state detection system and an intelligent auxiliary system. The driver state detection system is any of the above driver state detection systems based on the fusion of the millimeter wave radar and the camera; when it judges that the driver cannot effectively control the vehicle, the intelligent auxiliary system is started so that the vehicle enables its automatic driving function.
As a further improvement of the above scheme, the vehicle is also provided with an alarm system; when the driver state detection system judges that the driver cannot effectively control the vehicle, the alarm system is started so that the vehicle issues an alarm prompt to the driver.
The invention uses the millimeter wave radar and the infrared camera to capture the driver's heartbeat, expression and movement information in real time; analyzes the driver's current driving state from information such as heart rate variation, facial organ actions and head actions; and judges the driving state. If the driver is currently unfit to operate the vehicle, the system communicates with the vehicle's intelligent auxiliary system to perform the corresponding operation. If the driver is in a fatigue state, the intelligent auxiliary system activates the warning function of the alarm system, using sound, vibration, or sound combined with vibration, to remind the driver to drive safely; if an emergency occurs, the intelligent auxiliary system can take over control of the vehicle, pull over and stop, and raise an alarm.
Drawings
Fig. 1 is a flowchart of a driving state detection method for a driver according to embodiment 1 of the present invention.
Fig. 2 is a flowchart of a method for calculating the average heart rate HR used in the driving state detection method of fig. 1.
Fig. 3 is a flowchart of a method for calculating the shielding time length TH used in the driving state detection method in fig. 1.
Fig. 4 is a flowchart of a method for determining abnormality of the body of the driver used in the driving state detection method of fig. 1.
Fig. 5 is a schematic structural diagram of a driver state detection system based on millimeter wave radar and camera fusion according to embodiment 2 of the present invention.
FIG. 6 is a flow chart of a data processing method of the data processing module of the driver status detection system of FIG. 5.
Fig. 7 is a flow chart of a driver state detection method of the driver state detection system of fig. 5.
Fig. 8 is a flowchart of a driving ability determination method for a driver provided in embodiment 3 of the present invention.
Fig. 9 is a block diagram of a driver status detection system based on millimeter wave radar and camera fusion according to embodiment 4 of the present invention.
Fig. 10 is a flowchart of a processing method of the radar processing module in fig. 9.
FIG. 11 is a flowchart of a processing method of the vision processing module in FIG. 9.
Fig. 12 is a physiological abnormal state analysis diagram of the driver state detection system of fig. 9.
Fig. 13 is a fatigue state analysis diagram of the driver state detection system in fig. 9.
Fig. 14 is a driving ability analysis chart of the driver state detection system in fig. 9.
Detailed Description
The present invention will be described in further detail with reference to the drawings and examples, in order to make the objects, technical solutions and advantages of the present invention more apparent. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the scope of the invention.
Example 1
Referring to fig. 1, this embodiment discloses a flowchart of a driver driving state detection method, also referred to as a driver state detection method. The method comprises three main steps: 1. collect data; 2. judge whether the driver's body is abnormal; 3. if the driver's body is abnormal, detect the driver's driving state. If the driver's body is normal, data acquisition is repeated and the next round of driving state detection is performed.
The data acquisition mainly collects the driver's average heart rate HR in a unit time T, the shielding duration TH of the head each time the driver yawns, and the driver's expression.
Referring to fig. 2, in this embodiment, the average heart rate HR is calculated as follows: acquire the number of chest-cavity fluctuations B of the driver in the unit time T (which can be detected by the millimeter wave radar); calculate the chest fluctuation rate B/min as B/T; and obtain the driver's average heart rate HR from the chest fluctuation rate as HR = K*(B/T), where K is a conversion coefficient that can be determined experimentally.
Referring to fig. 3, in this embodiment, the shielding duration TH is calculated as follows: detect the driver's facial organs and locate the mouth region; for the located mouth region, judge whether the mouth is open; if the mouth is open, judge whether the mouth is covered and, if so, count one yawn while recording how long the mouth is covered; if the mouth region cannot be located, detect the driver's hand and, if a hand is detected, judge that the hand is covering the mouth, count one yawn, and record how long the hand covers the mouth; finally, count the number of yawns FM in the unit time T, where the duration of each covering of the mouth is the corresponding shielding duration TH. The facial organs can be identified with the infrared camera, which detects them from acquired images of the driver's face; existing face recognition techniques meet the requirements of the invention, so they are not described in detail here.
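A minimal Python sketch of this yawn-counting flow, processed one camera frame at a time. The per-frame flags (mouth located, mouth open, mouth occluded, hand detected) are assumed inputs; how they are obtained from the image is outside this sketch.

```python
# Sketch: count one yawn when an occlusion of the mouth begins, and record the
# shielding duration TH when the occlusion ends.
from typing import List, Optional, Tuple

def update_yawn_state(mouth_located: bool, mouth_open: bool, mouth_occluded: bool,
                      hand_detected: bool, yawn_count: int,
                      occlusion_start: Optional[float], now_s: float,
                      occlusion_durations: List[float]) -> Tuple[int, Optional[float]]:
    """Process one frame's analysis results; return updated (yawn_count, occlusion_start)."""
    occluding = (mouth_located and mouth_open and mouth_occluded) or \
                (not mouth_located and hand_detected)
    if occluding and occlusion_start is None:
        yawn_count += 1           # a new yawn begins: count it once
        occlusion_start = now_s   # start timing the shielding duration TH
    elif not occluding and occlusion_start is not None:
        occlusion_durations.append(now_s - occlusion_start)  # record TH for this yawn
        occlusion_start = None
    return yawn_count, occlusion_start
```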
With the infrared camera, the driver's expression can also be recognized from the facial image. The driver's expression is judged as follows: acquire a facial image of the driver; use a convolutional neural network and a trained expression library to recognize the driver's facial expression from the image, thereby classifying the expression as normal or abnormal.
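A minimal sketch of this classification step, assuming a PyTorch environment. The architecture, input size and weights are illustrative placeholders only; they are not the patent's trained expression library.

```python
# Sketch: a small CNN mapping a 64x64 grayscale face crop to "normal" vs "abnormal".
import torch
import torch.nn as nn

class ExpressionCNN(nn.Module):
    def __init__(self, num_classes: int = 2):  # 0 = normal, 1 = abnormal
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
        )
        self.classifier = nn.Linear(32 * 16 * 16, num_classes)  # for 64x64 grayscale input

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        h = self.features(x)
        return self.classifier(h.flatten(1))

def is_abnormal_expression(model: ExpressionCNN, face_64x64_gray: torch.Tensor) -> bool:
    """Classify one 1x64x64 face tensor; True means an abnormal (e.g. pained) expression."""
    with torch.no_grad():
        logits = model(face_64x64_gray.unsqueeze(0))  # add batch dimension
        return bool(logits.argmax(dim=1).item() == 1)
```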
Judging whether the driver's body is abnormal mainly means judging from the average heart rate HR, the shielding duration TH and the driver's expression. Referring to fig. 4, in this embodiment, the driver's body is judged abnormal when the following conditions are satisfied simultaneously:
(1) The average heart rate HR is greater than an upper heart rate threshold HR2 or less than a lower heart rate threshold HR1;
(2) The shielding duration TH is greater than a shielding threshold TH; and
(3) The driver's expression is an abnormal expression. An abnormal expression may be, for example, a pained expression.
The method for detecting the driving state when the driver's body is abnormal comprises: calculating the driver's comprehensive anomaly degree AD; when AD is greater than a comprehensive anomaly degree threshold AD, judging that the driver is in a non-driving state. Conversely, when AD does not exceed the threshold, the driver is judged to be in a normal driving state.
Comprehensive anomaly degree AD: AD = a1*ΔHR + a2*ΔTH
where ΔHR is the absolute difference between the average heart rate HR and the heart rate lower limit threshold HR1, or between HR and the heart rate upper limit threshold HR2, under the driver's body abnormality; ΔTH is the absolute difference between the shielding duration TH and the shielding threshold TH under the driver's body abnormality; and a1 and a2 are the weight coefficients of ΔHR and ΔTH in the calculation. In this embodiment, a1 and a2 are 0.6 and 0.4, respectively.
The invention departs from conventional driver state detection by applying different analyses to the collected data, so that it can detect whether the driver's body is abnormal. When an abnormality is detected, it is quantified into a degree of body abnormality, giving the abnormality a precise numerical value, allowing the driver's driving ability to be judged reasonably, and providing a convincing, acceptable basis for concluding that the driver can no longer effectively control the vehicle, which helps the driver accept and trust the invention.
The driving state detection method of this embodiment can be implemented in software and structured as a driving state detection device. The driving state detection device comprises a data acquisition module, a body abnormality judging module, a driver fatigue judging module and a driving state judging module.
The data acquisition module is used for acquiring the driver's driving data, for example the average heart rate HR in a unit time T, the shielding duration TH of the head each time the driver yawns, and the driver's expression. In other embodiments (e.g. Examples 2, 3 and 4) it also collects the number of front-back nods N1 of the driver in the unit time T, the duration T1 and speed F1 of each front-back nod, the number of left-right nods N2 in the unit time T, and the duration T2 and speed F2 of each left-right nod, as well as the driver's blink frequency FE and yawning frequency FM in the unit time T.
The body abnormality judging module is used for judging whether the driver's body is abnormal. In this embodiment, it judges according to the average heart rate HR, the shielding duration TH and the driver's expression, and judges the driver's body abnormal when the following conditions are satisfied simultaneously:
(1) The average heart rate HR is greater than an upper heart rate threshold HR2 or less than a lower heart rate threshold HR1;
(2) The shielding duration TH is greater than a shielding threshold TH; and
(3) The driver's expression is an abnormal expression.
The driver fatigue judging module is used for judging whether the driver is fatigued, and comprises a facial fatigue judging sub-module and a driver fatigue judging sub-module. The facial fatigue judging sub-module judges whether the driver is in a facial fatigue state, defined as facial fatigue FF, according to the blink frequency FE and the yawning frequency FM; facial fatigue FF=1 is judged when the following conditions are satisfied simultaneously:
(1) The blink frequency FE is greater than a blink frequency threshold FE; and
(2) The yawning frequency FM is greater than a yawning frequency threshold FM.
The driver fatigue judging sub-module judges whether the driver is fatigued: driver fatigue is judged when facial fatigue FF=1 or head fatigue FH=1.
The driving state judging module comprises a comprehensive anomaly degree calculating sub-module and a driving capability judging sub-module. The comprehensive anomaly degree calculating sub-module calculates the driver's comprehensive anomaly degree AD:
AD = a1*ΔHR + a2*ΔTH
where ΔHR is the absolute difference between the average heart rate HR and the heart rate lower limit threshold HR1, or between HR and the heart rate upper limit threshold HR2, under the driver's body abnormality; ΔTH is the absolute difference between the shielding duration TH and the shielding threshold TH under the driver's body abnormality; and a1 and a2 are the weight coefficients of ΔHR and ΔTH in the calculation.
The driving capability judging sub-module judges the driver's driving capability: when the comprehensive anomaly degree AD is greater than a comprehensive anomaly degree threshold AD, it judges that the driver cannot effectively control the vehicle.
The driving state detection method of this embodiment can be applied to an existing vehicle provided with a driver state detection system, an intelligent auxiliary system and an alarm system. The driver state detection system adopts the driving state detection method of this embodiment; when it judges that the driver cannot effectively control the vehicle, the intelligent auxiliary system is started so that the vehicle enables its automatic driving function, and the alarm system can also be activated so that the vehicle issues an alarm prompt to the driver.
The driver state detection system can upgrade an existing vehicle simply by installing the millimeter wave radar, the infrared camera and the detection system itself (either as an independent control board or as software loaded onto the vehicle's own control board). Driver state detection is thus achieved without replacing the vehicle, so the cost is low and the solution is easy to popularize.
Example 2
Please refer to fig. 5, which is a schematic diagram of a driver status detection system based on millimeter wave radar and camera fusion according to the present embodiment. The driver state detection system comprises a data acquisition module and a data processing system.
The data acquisition module comprises a millimeter wave radar and an infrared camera. The millimeter wave radar is used for detecting a front-back movement distance FB of the driver's forehead or chin and recording the time over which that movement occurs, and for detecting a left-right movement distance LR of the driver's forehead or chin and recording the time over which that movement occurs. The infrared camera is used for detecting the driver's facial organs by acquiring images of the driver's face.
The data processing system comprises a data processing module, a driver fatigue judging module and a driving state judging module.
Referring to fig. 6, the data processing module compares the front-back movement distance FB with a front-back movement threshold Tfb. If FB is greater than Tfb, it judges that the driver has made a front-back nod (counted as one front-back nod), defines the time over which FB occurred as the front-back nod duration T1, and divides FB by T1 to obtain the front-back nod speed F1; it counts the number of front-back nods N1 in the unit time T, together with the duration T1 and speed F1 of each front-back nod.
The data processing module likewise compares the left-right movement distance LR with a left-right movement threshold Tlr. If LR is greater than Tlr, it judges that the driver has made a left-right nod (counted as one left-right nod), defines the time over which LR occurred as the left-right nod duration T2, and divides LR by T2 to obtain the left-right nod speed F2; it counts the number of left-right nods N2 in the unit time T, together with the duration T2 and speed F2 of each left-right nod.
Referring to fig. 7, the data processing module also locates the eye region from the facial organs, judges the eyes-closed state at successive sampling moments for the located eye region, counts one eye closure each time the eyes close, and counts the eye closures in the unit time T to obtain the blink frequency FE.
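A minimal Python sketch of this blink counting. The per-sample eyes-closed flag is an assumed input from the eye-localization step; an open-to-closed transition between successive samples counts as one eye closure.

```python
# Sketch: derive the blink frequency FE from a stream of eyes-closed samples.
from typing import Iterable

def blink_frequency(eyes_closed_per_sample: Iterable[bool], unit_time_min: float) -> float:
    closures = 0
    previously_closed = False
    for closed in eyes_closed_per_sample:
        if closed and not previously_closed:   # open -> closed transition = one closure
            closures += 1
        previously_closed = closed
    return closures / unit_time_min            # FE: eye closures per unit time T
```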
The data processing module further locates the mouth region from the facial organs and judges whether the mouth is open. If the mouth is open, it judges whether the mouth is covered and, if so, counts one yawn; if the mouth region cannot be located, it detects the driver's hand and, if a hand is detected, judges that the hand is covering the mouth and counts one yawn; it then counts the number of yawns FM in the unit time T. When the data processing module judges that the hand is covering the mouth and counts one yawn, it also records how long the hand covers the mouth; the duration of each such covering is the corresponding shielding duration TH, as shown in fig. 3.
The driver fatigue judging module comprises a facial fatigue judging sub-module, a head fatigue judging sub-module and a driver fatigue judging sub-module.
The facial fatigue judging sub-module judges whether the driver is in a facial fatigue state, defined as facial fatigue FF, according to the blink frequency FE and the yawning frequency FM; facial fatigue FF=1 is judged when the following conditions are satisfied simultaneously:
(1) The blink frequency FE is greater than a blink frequency threshold FE; and
(2) The yawning frequency FM is greater than a yawning frequency threshold FM.
The head fatigue judging sub-module judges whether the driver is in a head fatigue state, defined as head fatigue FH, according to the front-back nod count N1, the front-back nod speed F1, the left-right nod count N2 and the left-right nod speed F2; head fatigue FH=1 is judged when the following conditions are satisfied simultaneously (see also fig. 8):
(1) The front-back nod duration T1 is greater than a front-back nod duration threshold T1, or the left-right nod duration T2 is greater than a left-right nod duration threshold T2;
(2) The front-back nod count N1 is greater than a front-back nod count threshold N1, or the left-right nod count N2 is greater than a left-right nod count threshold N2;
(3) The front-back nod speed F1 is greater than a front-back nod speed threshold F1, or the left-right nod speed F2 is greater than a left-right nod speed threshold F2.
The driver fatigue judging sub-module judges that the driver is fatigued when facial fatigue FF=1 or head fatigue FH=1.
The driving state judging module comprises a comprehensive fatigue degree calculating sub-module and a driving capability judging sub-module. The comprehensive fatigue degree calculating sub-module calculates the driver's comprehensive fatigue degree FD:
FD = b1*ΔFE + b2*ΔFM + b3*ΔT + b4*ΔN + b5*ΔF
where ΔFE is the absolute difference between the driver's blink frequency FE and the blink frequency threshold FE; ΔFM is the absolute difference between the yawning frequency FM and the yawning frequency threshold FM under driver fatigue; ΔT is the larger of the absolute difference between the front-back nod duration T1 and its threshold and the absolute difference between the left-right nod duration T2 and its threshold under driver fatigue; ΔN is the larger of the absolute difference between the front-back nod count N1 and its threshold and the absolute difference between the left-right nod count N2 and its threshold under driver fatigue; ΔF is the larger of the absolute difference between the front-back nod speed F1 and its threshold and the absolute difference between the left-right nod speed F2 and its threshold under driver fatigue; and b1, b2, b3, b4, b5 are the weight coefficients of ΔFE, ΔFM, ΔT, ΔN, ΔF in the calculation. In this embodiment, b1, b2, b3, b4, b5 are 0.3, 0.1, 0.2, 0.1, respectively.
The driving capability judging sub-module judges that the driver is in a non-driving state when the comprehensive fatigue degree FD is greater than a comprehensive fatigue degree threshold FD.
The invention departs from conventional driver state detection by applying different analyses to the collected data, so that it can detect whether the driver is fatigued. When fatigue is detected, it is quantified into a fatigue degree, giving the driver's fatigue level a precise numerical value, allowing the driving ability to be judged reasonably, and providing a convincing, acceptable basis for concluding that the driver can no longer effectively control the vehicle, which helps the driver accept and trust the invention. In addition, an existing vehicle can be upgraded simply by installing the millimeter wave radar, the infrared camera and the data processing system (either as an independent control board or as software loaded onto the vehicle's own control board), so driver state detection is achieved without replacing the vehicle; the cost is low and the solution is easy to popularize.
In other embodiments, the data processing system may further include a body abnormality judging module. The body abnormality judging module judges whether the driver's body is abnormal according to the driver's average heart rate HR, the shielding duration TH of the head each time the driver yawns, and the driver's expression, and judges the driver's body abnormal when the following conditions are satisfied simultaneously (as shown in fig. 4):
(1) The average heart rate HR is greater than an upper heart rate threshold HR2 or less than a lower heart rate threshold HR1;
(2) The shielding duration TH is greater than a shielding threshold TH; and
(3) The driver's expression is an abnormal expression.
For the average heart rate HR, the millimeter wave radar can be used to detect the number of chest-cavity fluctuations B of the driver in the unit time T; the data processing module then calculates the chest fluctuation rate B/min as B/T and obtains the driver's average heart rate HR from the chest fluctuation rate as HR = K*(B/T), where K is the conversion coefficient (as shown in fig. 2).
That is, the method of Example 1 can also be incorporated into this embodiment. The driver state detection system of this embodiment can of course be installed in a vehicle. The vehicle is provided with the driver state detection system, an intelligent auxiliary system and an alarm system. The driver state detection system may be the millimeter-wave-radar-and-camera-fusion system of this embodiment; when it judges that the driver cannot effectively control the vehicle, the intelligent auxiliary system is started so that the vehicle enables its automatic driving function, and the alarm system can also be activated so that the vehicle issues an alarm prompt to the driver.
Example 3
Please refer to fig. 8, which is a flowchart of a driving ability determination method of a driver according to the present embodiment. The driving ability judging method includes the steps of:
1. collecting data;
2. judging whether the body of the driver is abnormal;
3. judging whether the driver is tired;
4. the driving ability of the driver is judged.
In the data acquisition step, the following are collected: the driver's average heart rate HR in a unit time T; the number of front-back nods N1 in the unit time T, together with the duration T1 and speed F1 of each front-back nod; the number of left-right nods N2 in the unit time T, together with the duration T2 and speed F2 of each left-right nod; the driver's blink frequency FE and yawning frequency FM in the unit time T; the shielding duration TH of the head each time the driver yawns; and the driver's expression. The calculation of the average heart rate HR is described in Example 1 and shown in fig. 2; the nod count N1, duration T1 and speed F1 and the nod count N2, duration T2 and speed F2 are described in Example 2 and shown in fig. 6.
In the step of judging whether the driver's body is abnormal, the judgment is made from the average heart rate HR, the shielding duration TH and the driver's expression; the driver's body is judged abnormal when the following conditions are satisfied simultaneously (see the description in Example 1 and fig. 4):
(1) The average heart rate HR is greater than an upper heart rate threshold HR2 or less than a lower heart rate threshold HR1;
(2) The shielding duration TH is greater than a shielding threshold TH; and
(3) The driver's expression is an abnormal expression.
In the step of judging whether the driver is fatigued, the driver fatigue judging method comprises the following steps.
Step one: judge whether the driver is in a facial fatigue state, defined as facial fatigue FF, according to the blink frequency FE and the yawning frequency FM; facial fatigue FF=1 is judged when the following conditions are satisfied simultaneously (see the description in Example 2 and fig. 7):
(1) The blink frequency FE is greater than a blink frequency threshold FE; and
(2) The yawning frequency FM is greater than a yawning frequency threshold FM.
Step two: judge whether the driver is in a head fatigue state, defined as head fatigue FH, according to the front-back nod count N1, the front-back nod speed F1, the left-right nod count N2 and the left-right nod speed F2; head fatigue FH=1 is judged when the following conditions are satisfied simultaneously:
(1) The front-back nod duration T1 is greater than a front-back nod duration threshold T1, or the left-right nod duration T2 is greater than a left-right nod duration threshold T2;
(2) The front-back nod count N1 is greater than a front-back nod count threshold N1, or the left-right nod count N2 is greater than a left-right nod count threshold N2;
(3) The front-back nod speed F1 is greater than a front-back nod speed threshold F1, or the left-right nod speed F2 is greater than a left-right nod speed threshold F2.
Step three: when facial fatigue FF=1 or head fatigue FH=1, the driver is judged to be fatigued.
In the step of judging the driver's driving capability, the driving capability judging method comprises the following steps.
Step one: calculate the driver's comprehensive anomaly degree AD:
AD = a1*ΔHR + a2*ΔTH
where ΔHR is the absolute difference between the average heart rate HR and the heart rate lower limit threshold HR1, or between HR and the heart rate upper limit threshold HR2, under the driver's body abnormality; ΔTH is the absolute difference between the shielding duration TH and the shielding threshold TH under the driver's body abnormality; and a1 and a2 are the weight coefficients of ΔHR and ΔTH in the calculation.
Step two: calculate the driver's comprehensive fatigue degree FD:
FD = b1*ΔFE + b2*ΔFM + b3*ΔT + b4*ΔN + b5*ΔF
where ΔFE is the absolute difference between the driver's blink frequency FE and the blink frequency threshold FE; ΔFM is the absolute difference between the yawning frequency FM and the yawning frequency threshold FM under driver fatigue; ΔT is the larger of the absolute difference between the front-back nod duration T1 and its threshold and the absolute difference between the left-right nod duration T2 and its threshold under driver fatigue; ΔN is the larger of the absolute difference between the front-back nod count N1 and its threshold and the absolute difference between the left-right nod count N2 and its threshold under driver fatigue; ΔF is the larger of the absolute difference between the front-back nod speed F1 and its threshold and the absolute difference between the left-right nod speed F2 and its threshold under driver fatigue; and b1, b2, b3, b4, b5 are the weight coefficients of ΔFE, ΔFM, ΔT, ΔN, ΔF in the calculation.
Step three: judge the driver's driving capability. When the comprehensive anomaly degree AD is greater than a comprehensive anomaly degree threshold AD, or the comprehensive fatigue degree FD is greater than a comprehensive fatigue degree threshold FD, it is judged that the driver cannot effectively control the vehicle.
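A minimal Python sketch of step three, combining the two quantities computed above; the threshold values are placeholders.

```python
# Sketch: the driver is judged unable to effectively control the vehicle when either
# the comprehensive anomaly degree AD or the comprehensive fatigue degree FD exceeds
# its threshold.
def cannot_control_vehicle(ad: float, fd: float,
                           ad_threshold: float, fd_threshold: float) -> bool:
    return ad > ad_threshold or fd > fd_threshold
```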
The driving capability judging method of this embodiment is applicable to a safe driving method for a vehicle, which comprises the following steps:
use the driving state detection method of Example 1 or the driver state detection system of Example 2 to judge whether the driver can effectively control the vehicle;
when it is judged that the driver cannot effectively control the vehicle, start the vehicle's intelligent auxiliary system so that the vehicle enables its automatic driving function, and turn on the vehicle's driver alarm prompt function.
The driving ability determination method of the present embodiment may be implemented by a driving ability determination device of a driver. The driving ability determination device includes the following modules.
1. And the data acquisition module is used for acquiring driving data of a driver. The driving data is as described above and will not be described in detail here.
2. The body abnormality determination module is used for determining whether the body of the driver is abnormal, and the method for determining the body abnormality of the driver is described above and is not further described herein.
3. The driver fatigue judging module is used for judging whether the driver is fatigued, and comprises a facial fatigue judging sub-module, a head fatigue judging sub-module and a driver fatigue judging sub-module. The facial fatigue judging sub-module judges whether the driver is in a facial fatigue state, defined as facial fatigue FF, according to the blink frequency FE and the yawning frequency FM; the judging method for facial fatigue FF=1 is described above and is not repeated here. The head fatigue judging sub-module judges whether the driver is in a head fatigue state; the judging method is described above and is not repeated here. The driver fatigue judging sub-module judges that the driver is fatigued when facial fatigue FF=1 or head fatigue FH=1.
4. The driving capability judging module comprises a comprehensive anomaly degree calculating sub-module, a comprehensive fatigue degree calculating sub-module and a driving capability judging sub-module. The comprehensive anomaly degree calculating sub-module calculates the driver's comprehensive anomaly degree AD; the comprehensive fatigue degree calculating sub-module calculates the driver's comprehensive fatigue degree FD; and the driving capability judging sub-module judges the driver's driving capability: when the comprehensive anomaly degree AD is greater than a comprehensive anomaly degree threshold AD, or the comprehensive fatigue degree FD is greater than a comprehensive fatigue degree threshold FD, it judges that the driver cannot effectively control the vehicle.
The driving capability judging device can be applied to a safe driving system of a vehicle to achieve safe driving. The safe driving system comprises an intelligent auxiliary system and an alarm system. When the driving capability judging device judges that the driver cannot effectively control the vehicle, the intelligent auxiliary system is started so that the vehicle enables its automatic driving function, and the alarm system can also be activated so that the vehicle issues an alarm prompt to the driver.
Example 4
Referring to fig. 9, a block diagram of a driver status detection system based on millimeter wave radar and camera fusion is shown. The driver state detection system comprises a driver state detection module, an information processing module and a decision module. The driver state detection module comprises a radar processing module and a vision processing module, and the information processing module is divided into driver state analysis and driver driving capability analysis.
The radar processing module comprises a millimeter wave radar, which detects the thoracic cavity fluctuation rate of the driver and the overall state of the driver's head to obtain the driver's heart rate, nodding duration, nodding times and nodding speed information, which are used as inputs of the information processing module.
The vision processing module comprises an infrared camera, and detects facial organ actions and hand actions of a driver through the infrared camera to obtain blink frequency, yawning frequency, shielding time of hands on the chest or the head and facial expression information of the driver, and the information is used as input of the information processing module.
The information processing module analyzes the driver's state according to the data received from the driver state detection module and judges whether the driver's state is abnormal; the abnormal situations are divided into physiological abnormality and fatigue. If the driver's state is abnormal, the driver's driving capability is analyzed according to the abnormality information to determine whether the driver can still effectively control the vehicle in the abnormal state, and the analysis result is sent to the decision module.
The decision module executes the corresponding alarm operation according to the received analysis result of the information processing module: when the driver has a physiological abnormality or a fatigue problem but can still effectively control the vehicle, the driver is reminded by a voice warning; when the driver has a physiological abnormality or a fatigue problem and cannot effectively control the vehicle, the vehicle alarm device sounds an alarm and the system takes over the driver's authority to control the vehicle.
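For concreteness, the decision logic above can be summarized in a minimal Python sketch; the function name and the returned action labels are illustrative assumptions rather than terms from the patent.

# Minimal sketch of the decision module logic described above.
# The function name and string return values are illustrative assumptions.

def decide_action(abnormal_or_fatigued: bool, can_control_vehicle: bool) -> str:
    """Map the information processing module's analysis result to an action."""
    if not abnormal_or_fatigued:
        return "no_action"                # driver state is normal
    if can_control_vehicle:
        return "voice_warning"            # remind the driver by voice
    return "alarm_and_take_over"          # alarm and take over vehicle control

# Example: fatigued driver who can no longer control the vehicle effectively
print(decide_action(abnormal_or_fatigued=True, can_control_vehicle=False))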
The driver state detection method of the driver state detection system mainly includes the following steps.
Step one: the millimeter wave radar detects the thoracic cavity fluctuation rate and the overall head state of the driver, obtains the heart rate, nodding duration, nodding times and nodding speed information of the driver, and sends the information to the information processing module.
Step two: the infrared camera detects the facial organ actions and the hand actions of the driver, obtains the blink frequency, the yawning frequency, the shielding time of hands on the chest or the head and the facial expression information of the driver, and sends the information to the information processing module.
Step three: the information processing module analyzes the state of the driver according to the data received from the radar processing module and the vision processing module to obtain the type of the driver's abnormal state, then analyzes the driver's driving capability to obtain a driving capability judgment result, and finally sends the analysis result to the decision module.
Step four: according to the received analysis result of the information processing module, the decision module issues a voice warning to remind the driver, or sounds an alarm and takes over the driver's authority to operate the vehicle.
Fig. 10 is a flowchart of the processing method of the radar processing module of the driver state detection system. The processing method of the radar processing module is mainly divided into two parts, namely detection of the driver's heart rate and detection of the driver's nodding information. In the present embodiment, Y in the flowcharts indicates yes and N indicates no.
The steps of driver heart rate detection mainly comprise the following (a minimal sketch follows these steps):
step one: Counting the number of thoracic cavity undulations B of the driver in unit time by the millimeter wave radar;
step two: Calculating the thoracic cavity fluctuation rate B/min of the driver;
step three: Calculating the heart rate HR of the driver from the thoracic cavity fluctuation rate, and taking the heart rate HR as an input of the information processing module.
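A minimal sketch of this heart-rate step, assuming the undulation count B and the unit time T are already provided by the radar; the conversion coefficient K corresponds to the coefficient mentioned in claim 3, but its value here is a placeholder.

# Sketch of driver heart-rate estimation from chest undulations (assumed inputs).

def heart_rate_from_chest(b_count: int, unit_time_min: float, k: float = 1.0) -> float:
    """Estimate heart rate HR from the chest undulation count B over unit time T.

    b_count:       number of chest undulations B counted by the millimeter wave radar
    unit_time_min: unit time T in minutes
    k:             conversion coefficient K (placeholder value; not given in the patent)
    """
    fluctuation_rate = b_count / unit_time_min   # thoracic cavity fluctuation rate B/min
    return k * fluctuation_rate                  # HR = K * (B / T)

# Example: 70 undulations counted in 1 minute
print(heart_rate_from_chest(b_count=70, unit_time_min=1.0))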
The driver nodding information detection steps mainly comprise the following (a sketch follows these steps):
step one: Detecting the front-back movement distance FB and the left-right movement distance LR of the driver's forehead and chin through the millimeter wave radar;
step two: Comparing the front-back movement distance FB with the threshold Tfb; if FB is greater than the threshold Tfb, judging that the driver has performed a front-back nodding action;
step three: Comparing the left-right movement distance LR with the threshold Tlr; if LR is greater than the threshold Tlr, judging that the driver has performed a left-right nodding action;
step four: Counting the front-back nodding duration T1, the number of front-back nods N1 in unit time and the front-back nodding speed F1, and taking them as inputs of the information processing module;
step five: Counting the left-right nodding duration T2, the number of left-right nods N2 in unit time and the left-right nodding speed F2, and taking them as inputs of the information processing module.
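The nodding-detection steps can be sketched as below; the threshold values and the per-movement input format are assumptions made for illustration.

# Sketch of front-back / left-right nodding detection from radar displacements.
# T_FB, T_LR and the input format are illustrative assumptions.

T_FB = 3.0   # front-back movement threshold Tfb (cm, assumed)
T_LR = 3.0   # left-right movement threshold Tlr (cm, assumed)

def detect_nods(samples):
    """samples: list of (fb_distance, lr_distance, duration_s) per detected head movement.

    Returns (N1, T1_list, F1_list, N2, T2_list, F2_list): counts, durations and speeds
    of front-back and left-right nods within the unit time covered by the samples.
    """
    n1, t1, f1 = 0, [], []
    n2, t2, f2 = 0, [], []
    for fb, lr, dt in samples:
        if fb > T_FB:                 # front-back nod detected
            n1 += 1
            t1.append(dt)             # front-back nodding duration T1
            f1.append(fb / dt)        # front-back nodding speed F1 = FB / T1
        if lr > T_LR:                 # left-right nod detected
            n2 += 1
            t2.append(dt)             # left-right nodding duration T2
            f2.append(lr / dt)        # left-right nodding speed F2 = LR / T2
    return n1, t1, f1, n2, t2, f2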
FIG. 11 is a flow chart of the processing method of the vision processing module of the driver state detection system. The processing method of the vision processing module mainly comprises four parts, namely driver blink frequency detection, driver yawning frequency detection, detection of the duration for which the driver's hands shield the chest or head, and driver facial expression detection.
The steps of detecting the driver's blink frequency mainly comprise the following (a sketch follows these steps):
step one: the infrared camera detects the facial organs of the driver and positions the eyes;
step two: judging the eye closing state;
step three: counting the eye closing times in unit time, and calculating the blink frequency FE as the input of the information processing module.
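A minimal sketch of the blink-frequency computation, assuming a per-frame eye-closed flag has already been produced by the camera pipeline.

# Sketch of blink frequency FE from a stream of per-frame eye-closed flags.
# The frame stream and the closed-eye flags are assumed inputs.

def blink_frequency(eye_closed_flags, unit_time_min: float) -> float:
    """Count closed-eye episodes (a run of consecutive closed frames = one blink)
    and return the blink frequency FE in blinks per unit time."""
    blinks = 0
    previously_closed = False
    for closed in eye_closed_flags:
        if closed and not previously_closed:   # transition open -> closed = one blink
            blinks += 1
        previously_closed = closed
    return blinks / unit_time_min              # blink frequency FE

# Example: three blinks observed over one minute of frames
flags = [False, True, True, False, False, True, False, True, True, False]
print(blink_frequency(flags, unit_time_min=1.0))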
The steps of detecting the driver's yawning frequency mainly comprise the following (a sketch follows these steps):
step one: Detecting the facial organs of the driver by the infrared camera, and locating the mouth;
step two: If the mouth cannot be located, detecting the driver's hands;
step three: Judging the open or closed state of the mouth; if the mouth is open, proceeding to step four, otherwise returning to step one;
step four: Judging whether a hand shields the mouth, counting the number of mouth openings and the number of shieldings, and calculating the yawning frequency FM as an input of the information processing module.
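The yawning-frequency logic can be sketched as below; the per-frame observation fields are assumed intermediate outputs of the camera pipeline.

# Sketch of yawning frequency FM following the mouth / hand-occlusion logic above.
# The per-frame observation fields are assumed intermediate detections.

def yawning_frequency(frames, unit_time_min: float) -> float:
    """frames: list of dicts with keys 'mouth_located', 'mouth_open', 'hand_detected'.
    A yawn is counted when the mouth is open, or when the mouth cannot be located
    but a hand (assumed to be shielding the mouth) is detected."""
    yawns = 0
    in_yawn = False
    for f in frames:
        if f["mouth_located"]:
            yawning_now = f["mouth_open"]      # open mouth (possibly shielded by a hand)
        else:
            yawning_now = f["hand_detected"]   # mouth hidden: hand assumed to shield it
        if yawning_now and not in_yawn:        # count each yawn episode once
            yawns += 1
        in_yawn = yawning_now
    return yawns / unit_time_min               # yawning frequency FM

# Example: one yawn episode spanning the last two frames -> 1.0
frames = [{"mouth_located": True, "mouth_open": False, "hand_detected": False},
          {"mouth_located": True, "mouth_open": True,  "hand_detected": False},
          {"mouth_located": False, "mouth_open": False, "hand_detected": True}]
print(yawning_frequency(frames, unit_time_min=1.0))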
The steps of detecting the duration for which the driver's hands shield the chest or head mainly comprise the following (a short sketch follows these steps):
step one: The infrared camera identifies the hands;
step two: Judging whether a hand shields the chest or the head;
step three: Counting the shielding duration TH of the driver's hands over the chest or head, and taking the shielding duration TH as an input of the information processing module.
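A small sketch of the shielding-duration computation, assuming per-frame occlusion flags and a known frame interval.

# Sketch of the shielding duration TH of the driver's hands over the chest or head.
# Per-frame occlusion flags and the frame interval are assumed inputs.

def shielding_duration(occluded_flags, frame_interval_s: float) -> float:
    """Return the total time TH (in seconds) during which the hands shield
    the chest or head, given one boolean flag per camera frame."""
    return sum(occluded_flags) * frame_interval_s   # TH = occluded frames * frame time

# Example: 45 occluded frames at 30 fps -> 1.5 s
print(shielding_duration([True] * 45 + [False] * 15, frame_interval_s=1 / 30))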
Driver facial expression detection mainly obtains a facial image of the driver through the infrared camera and recognizes the driver's facial expression using a convolutional neural network and a trained expression library, so that the expression is classified into two categories, normal and abnormal. Abnormal expressions are mainly expressions such as tension or pain that occur during sudden illness or accidents.
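The patent only states that a convolutional neural network with a trained expression library performs this two-class (normal/abnormal) classification; the sketch below is one possible minimal PyTorch model, with the layer sizes and the 48x48 grayscale input being assumptions.

# Minimal two-class expression classifier sketch (normal vs. abnormal).
# Architecture, input size (1x48x48 grayscale) and layer widths are assumptions;
# the patent only specifies a convolutional neural network and a trained expression library.
import torch
import torch.nn as nn

class ExpressionNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2),  # 48 -> 24
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(), nn.MaxPool2d(2), # 24 -> 12
        )
        self.classifier = nn.Linear(32 * 12 * 12, 2)  # logits: [normal, abnormal]

    def forward(self, x):
        x = self.features(x)
        return self.classifier(x.flatten(1))

# Example forward pass on one face crop
model = ExpressionNet()
logits = model(torch.randn(1, 1, 48, 48))
is_abnormal = logits.argmax(dim=1).item() == 1
print(is_abnormal)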
Fig. 12 is a physiological abnormal state analysis chart of the driver state detection system. The physiological abnormal state analysis mainly analyzes the heart rate HR of the driver, the shielding duration TH of the chest or the head by the hands of the driver, and the facial expression of the driver.
The driver state is determined to be abnormal when the following conditions are all satisfied at the same time (a minimal sketch of this check follows the list):
(1) The driver heart rate HR is greater than a threshold HR2 or less than a threshold HR1;
(2) The shielding duration TH of the driver's hands on the chest or head is greater than the threshold TH;
(3) The driver's facial expression is abnormal.
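A minimal sketch of this combined physiological-abnormality check; the threshold values used are placeholders, not values disclosed in the patent.

# Sketch of the physiological abnormal-state decision: all three conditions must hold.
# Threshold values HR1, HR2, TH_THRESH are placeholders, not values from the patent.

HR1, HR2 = 50.0, 110.0   # lower / upper heart-rate thresholds (assumed, beats per minute)
TH_THRESH = 3.0          # hand-occlusion duration threshold th (assumed, seconds)

def is_physiologically_abnormal(hr: float, th: float, expression_abnormal: bool) -> bool:
    heart_rate_abnormal = hr > HR2 or hr < HR1          # condition (1)
    occlusion_too_long = th > TH_THRESH                 # condition (2)
    return heart_rate_abnormal and occlusion_too_long and expression_abnormal  # and (3)

print(is_physiologically_abnormal(hr=130.0, th=4.0, expression_abnormal=True))  # True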
Fig. 13 is a fatigue state analysis chart of the driver state detection system, which mainly analyzes the driver's blink frequency FE, yawning frequency FM, facial expression, front-back and left-right nodding durations T1 and T2, front-back and left-right nodding times N1 and N2, and front-back and left-right nodding speeds F1 and F2.
The driver state is determined to be fatigued when any one of the following conditions is satisfied (a combined sketch of this fatigue decision follows the conditions below):
(1) The driver face information determination result is fatigue (FF=1);
(2) The driver head information determination result is fatigue (FH=1).
The driver's facial information is determined to indicate fatigue only when the following conditions are satisfied simultaneously:
(1) The driver blink frequency FE is greater than the threshold FE;
(2) The driver yawning frequency FM is greater than a threshold FM;
The driver's head information is determined to indicate fatigue only when the following conditions are satisfied simultaneously:
(1) The driver's front-back nodding duration T1 is greater than a threshold T1 or the left-right nodding duration T2 is greater than a threshold T2;
(2) The driver's front-back nodding times N1 is greater than a threshold N1 or the left-right nodding times N2 is greater than a threshold N2;
(3) The driver's front-back nodding speed F1 is greater than a threshold F1 or the left-right nodding speed F2 is greater than a threshold F2.
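Putting the facial and head fatigue conditions together gives the following sketch; all threshold values are placeholders.

# Sketch of the driver fatigue decision: facial fatigue FF or head fatigue FH.
# All threshold values below are placeholders, not values disclosed in the patent.

FE_TH, FM_TH = 20.0, 3.0          # blink / yawn frequency thresholds (per minute, assumed)
T1_TH, T2_TH = 1.0, 1.0           # nodding duration thresholds (s, assumed)
N1_TH, N2_TH = 5, 5               # nodding count thresholds (per unit time, assumed)
F1_TH, F2_TH = 10.0, 10.0         # nodding speed thresholds (cm/s, assumed)

def facial_fatigue(fe: float, fm: float) -> bool:
    return fe > FE_TH and fm > FM_TH                   # FF = 1

def head_fatigue(t1, t2, n1, n2, f1, f2) -> bool:
    return ((t1 > T1_TH or t2 > T2_TH)                 # duration condition
            and (n1 > N1_TH or n2 > N2_TH)             # count condition
            and (f1 > F1_TH or f2 > F2_TH))            # speed condition, FH = 1

def driver_fatigued(fe, fm, t1, t2, n1, n2, f1, f2) -> bool:
    return facial_fatigue(fe, fm) or head_fatigue(t1, t2, n1, n2, f1, f2)

print(driver_fatigued(25.0, 4.0, 0.5, 1.5, 6, 2, 12.0, 3.0))  # True (facial fatigue)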
Fig. 14 is a driving ability analysis chart of the driver state detection system. The driving ability analysis judges the driver's driving capability in the two abnormal states by analyzing the driver's comprehensive physiological anomaly degree AD and comprehensive fatigue degree FD.
The comprehensive physiological anomaly degree calculation formula of the driver is as follows:
AD = a1*ΔHR + a2*ΔTH
where ΔHR represents the absolute value of the difference between the driver's heart rate HR and the threshold HR1 or HR2, and ΔTH represents the absolute value of the difference between the duration TH for which the driver's hands shield the chest or head and the threshold TH. The values of HR and TH are those obtained when the driver state is determined to be abnormal. a1 and a2 represent the weight coefficients of ΔHR and ΔTH in the calculation, taken as 0.6 and 0.4 respectively, and are determined according to their importance to the driving capability judgment.
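For concreteness, the comprehensive physiological anomaly degree can be computed as in the sketch below; the thresholds are placeholders, while the weights 0.6 and 0.4 are those stated above.

# Sketch of the comprehensive physiological anomaly degree AD = a1*ΔHR + a2*ΔTH.
# Thresholds are placeholders; the weights 0.6 / 0.4 are the ones stated in the text.

HR1, HR2, TH_THRESH = 50.0, 110.0, 3.0   # assumed threshold values
A1, A2 = 0.6, 0.4                        # weight coefficients a1, a2 from the text

def anomaly_degree(hr: float, th: float) -> float:
    # ΔHR: distance of HR beyond whichever heart-rate threshold was violated
    delta_hr = abs(hr - HR2) if hr > HR2 else abs(hr - HR1)
    delta_th = abs(th - TH_THRESH)       # ΔTH
    return A1 * delta_hr + A2 * delta_th

print(anomaly_degree(hr=130.0, th=4.0))  # 0.6*20 + 0.4*1 = 12.4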
The comprehensive fatigue degree calculation formula of the driver is as follows:
FD = b1*ΔFE + b2*ΔFM + b3*ΔT + b4*ΔN + b5*ΔF
where ΔFE represents the absolute value of the difference between the driver's blink frequency FE and the threshold FE; ΔFM represents the absolute value of the difference between the driver's yawning frequency FM and the threshold FM; ΔT represents the larger of the absolute value of the difference between the front-back nodding duration T1 and the threshold T1 and the absolute value of the difference between the left-right nodding duration T2 and the threshold T2; ΔN represents the larger of the absolute value of the difference between the front-back nodding times N1 and the threshold N1 and the absolute value of the difference between the left-right nodding times N2 and the threshold N2; and ΔF represents the larger of the absolute value of the difference between the front-back nodding speed F1 and the threshold F1 and the absolute value of the difference between the left-right nodding speed F2 and the threshold F2. The values FE, FM, T1, T2, N1, N2, F1 and F2 are those obtained when the driver state is determined to be fatigued. b1, b2, b3, b4 and b5 represent the weight coefficients of ΔFE, ΔFM, ΔT, ΔN and ΔF in the calculation, taken as 0.3, 0.1, 0.2, 0.1 respectively, and are determined according to their importance to the driving capability judgment.
The driving capability analysis judges, according to the calculated comprehensive physiological anomaly degree AD or comprehensive fatigue degree FD, whether the driver in that state can effectively control the vehicle, reaching one of two conclusions: the driver can effectively control the vehicle, or the driver cannot effectively control the vehicle.
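A corresponding sketch of the comprehensive fatigue degree and the final driving-ability decision; the weight tuple and the two thresholds are left as caller-supplied parameters, and the example values are illustrative only.

# Sketch of the comprehensive fatigue degree FD and the driving-ability decision.
# The weights (b1..b5) and the thresholds are supplied by the caller;
# the example values below are illustrative only.

def fatigue_degree(d_fe, d_fm, d_t, d_n, d_f, weights):
    """FD = b1*ΔFE + b2*ΔFM + b3*ΔT + b4*ΔN + b5*ΔF, with weights = (b1..b5)."""
    b1, b2, b3, b4, b5 = weights
    return b1 * d_fe + b2 * d_fm + b3 * d_t + b4 * d_n + b5 * d_f

def can_control_vehicle(ad, fd, ad_threshold, fd_threshold):
    """The driver is judged unable to effectively control the vehicle if either the
    comprehensive anomaly degree or the comprehensive fatigue degree exceeds its threshold."""
    return not (ad > ad_threshold or fd > fd_threshold)

# Example with purely illustrative numbers (equal weights used for illustration only)
fd = fatigue_degree(5.0, 1.0, 0.5, 2.0, 3.0, weights=(0.2, 0.2, 0.2, 0.2, 0.2))
print(can_control_vehicle(ad=12.4, fd=fd, ad_threshold=10.0, fd_threshold=5.0))  # False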
The foregoing description of the preferred embodiments of the invention is not intended to be limiting, but rather is intended to cover all modifications, equivalents, and alternatives falling within the spirit and principles of the invention.

Claims (10)

1. A driver state detection system based on millimeter wave radar and camera fusion, comprising:
the data acquisition module comprises a millimeter wave radar and an infrared camera; the millimeter wave radar is used for detecting a front-back movement distance FB of the forehead or the chin of the driver and recording corresponding time length of the front-back movement distance FB, and is also used for detecting a left-right movement distance LR of the forehead or the chin of the driver and recording corresponding time length of the left-right movement distance LR; the infrared camera is used for detecting the facial organs of the driver by acquiring the facial images of the driver; and
the data processing system comprises a data processing module, a driver fatigue judging module and a driving state judging module;
the data processing module compares the front-back movement distance FB with a front-back movement threshold Tfb; if the front-back movement distance FB is greater than the front-back movement threshold Tfb, the data processing module judges that the driver has performed a front-back nodding action, counts one front-back nod, defines the corresponding time length of the front-back movement distance FB as a front-back nodding time length T1, divides the front-back movement distance FB by the front-back nodding time length T1 to obtain a front-back nodding speed F1, and counts the front-back nodding times N1 in unit time T together with the front-back nodding time length T1 and the front-back nodding speed F1 of each front-back nod; the data processing module compares the left-right movement distance LR with a left-right movement threshold Tlr; if the left-right movement distance LR is greater than the left-right movement threshold Tlr, the data processing module judges that the driver has performed a left-right nodding action, counts one left-right nod, defines the corresponding time length of the left-right movement distance LR as a left-right nodding time length T2, divides the left-right movement distance LR by the left-right nodding time length T2 to obtain a left-right nodding speed F2, and counts the left-right nodding times N2 in unit time T together with the left-right nodding time length T2 and the left-right nodding speed F2 of each left-right nod; the data processing module locates the eyes according to the facial organs, judges the eye-closing state at successive sampling moments for the located eyes, counts one eye closure each time the eyes close, and counts the number of eye closures in unit time T to obtain the blink frequency FE; the data processing module locates the mouth according to the facial organs and judges whether the mouth is open; if the mouth is open, it judges whether a hand shields the mouth and counts one yawn; if the mouth cannot be located, it detects the driver's hand and, if a hand is detected, judges that the hand shields the mouth and counts one yawn; the number of yawns in unit time T is counted to obtain the yawning frequency FM;
The driver fatigue judging module comprises a facial fatigue judging sub-module, a head fatigue judging sub-module and a driver fatigue judging sub-module;
the facial fatigue judging sub-module judges whether the driver is in a facial fatigue state according to the blink frequency FE and the yawning frequency FM, the facial fatigue state is defined as facial fatigue FF, and facial fatigue FF=1 is judged when the following conditions are satisfied simultaneously:
(1) The blink frequency FE is greater than a blink frequency threshold FE; and
(2) The yawning frequency FM is greater than a yawning frequency threshold FM;
the head fatigue judging sub-module judges whether the driver is in a head fatigue state according to the front-back nodding time length T1, the front and rear nodding times N1, the front and rear nodding speed F1, the left-right nodding time length T2, the left and right nodding times N2 and the left and right nodding speed F2; the head fatigue state is defined as head fatigue FH, and head fatigue FH=1 is judged when the following conditions are satisfied simultaneously:
(1) The front-back nodding time length T1 is greater than a front-back nodding time length threshold T1 or the left-right nodding time length T2 is greater than a left-right nodding time length threshold T2;
(2) The front and rear nodding times N1 is larger than a front and rear nodding times threshold N1 or the left and right nodding times N2 is larger than a left and right nodding times threshold N2; and
(3) The front and rear nodding speed F1 is greater than a front and rear nodding speed threshold F1 or the left and right nodding speed F2 is greater than a left and right nodding speed threshold F2; and
The driver fatigue judging sub-module judges driver fatigue when facial fatigue FF=1 or head fatigue FH=1;
the driving state judging module comprises a comprehensive fatigue degree calculating sub-module and a driving capacity judging sub-module;
the comprehensive fatigue degree calculation submodule calculates the comprehensive fatigue degree FD of the driver:
FD = b1*ΔFE + b2*ΔFM + b3*ΔT + b4*ΔN + b5*ΔF
where ΔFE represents the absolute value of the difference between the driver blink frequency FE and the threshold FE; ΔFM represents the absolute value of the difference between the yawning frequency FM under driver fatigue and the yawning frequency threshold FM; ΔT represents the larger of the absolute value of the difference between the front-back nodding time length T1 and the front-back nodding time length threshold T1 and the absolute value of the difference between the left-right nodding time length T2 and the left-right nodding time length threshold T2 under driver fatigue; ΔN represents the larger of the absolute value of the difference between the front and rear nodding times N1 and the front and rear nodding times threshold N1 and the absolute value of the difference between the left and right nodding times N2 and the left and right nodding times threshold N2 under driver fatigue; ΔF represents the larger of the absolute value of the difference between the front and rear nodding speed F1 and the front and rear nodding speed threshold F1 and the absolute value of the difference between the left and right nodding speed F2 and the left and right nodding speed threshold F2 under driver fatigue; b1, b2, b3, b4, b5 respectively represent the weight coefficients of ΔFE, ΔFM, ΔT, ΔN, ΔF in the calculation; and
The driving capability judging sub-module judges that the driver is in a non-driving state when the comprehensive fatigue degree FD is greater than a comprehensive fatigue degree threshold FD.
2. The driver state detection system based on millimeter wave radar and camera fusion according to claim 1, wherein the data processing system further comprises a body abnormality judging module, and the body abnormality judging module judges whether the driver's body is abnormal according to the driver's average heart rate HR, the shielding duration TH of the driver's head each time the driver yawns, and the driver's expression, and judges that the driver's body is abnormal when the following conditions are all satisfied:
(1) The average heart rate HR is greater than an upper heart rate threshold HR2 or less than a lower heart rate threshold HR1;
(2) The shielding duration TH is greater than a shielding threshold TH; and
(3) The driver's expression is an abnormal expression.
3. The driver state detection system based on millimeter wave radar and camera fusion according to claim 2, wherein the millimeter wave radar is further configured to detect the number B of thoracic cavity undulations of the driver per unit time T;
the data processing module further calculates the thoracic cavity fluctuation rate B/min = B/T, and obtains the average heart rate HR of the driver from the thoracic cavity fluctuation rate as HR = K*(B/T), where K is a conversion coefficient.
4. The driver state detection system based on millimeter wave radar and camera fusion according to claim 2, wherein, at the moment the data processing module judges that the hand shields the mouth and counts one yawn, the data processing module records the time length for which the hand shields the mouth, and the time length of each such shielding is the corresponding shielding duration TH.
5. The driver state detection system based on millimeter wave radar and camera fusion according to claim 2, wherein the data processing module further recognizes the driver's facial expression from the facial image by using a convolutional neural network and a trained expression library, so as to recognize the driver's expression as a normal expression or an abnormal expression.
6. The driver state detection system based on millimeter wave radar and camera fusion of claim 2, wherein the driving state determination module further comprises a comprehensive abnormality degree calculation sub-module;
the comprehensive anomaly degree calculation sub-module calculates the comprehensive anomaly degree AD of the driver:
AD = a1*ΔHR + a2*ΔTH
where ΔHR represents the absolute value of the difference between the average heart rate HR and the heart rate lower limit threshold HR1, or the absolute value of the difference between the average heart rate HR and the heart rate upper limit threshold HR2, when the driver's body is abnormal; ΔTH represents the absolute value of the difference between the shielding duration TH and the shielding threshold TH when the driver's body is abnormal; and a1 and a2 respectively represent the weight coefficients of ΔHR and ΔTH in the calculation;
The driving ability judging sub-module also judges that the driver is in a non-driving state when the integrated anomaly degree AD is greater than an integrated anomaly degree threshold value AD.
7. The driver state detection system based on millimeter wave radar and camera fusion according to claim 6, wherein a1 and a2 are 0.6 and 0.4, respectively.
8. The driver state detection system based on millimeter wave radar and camera fusion according to claim 1, wherein b1, b2, b3, b4, b5 are 0.3, 0.1, 0.2, 0.1, respectively.
9. A vehicle, provided with a driver state detection system and an intelligent auxiliary system, wherein the driver state detection system is the driver state detection system based on millimeter wave radar and camera fusion according to any one of claims 1 to 8, and when the driver state detection system judges that the driver cannot effectively control the vehicle, the intelligent auxiliary system is started so that the vehicle activates an automatic driving function.
10. The vehicle of claim 9, wherein the vehicle is further equipped with an alarm system, and the alarm system is also activated when the driver state detection system determines that the driver is not able to effectively control the vehicle, such that the vehicle turns on the driver alarm prompt function.
CN202011132636.3A 2020-10-21 2020-10-21 Driver state detection system based on millimeter wave radar and camera fusion and vehicle Active CN112220480B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202011132636.3A CN112220480B (en) 2020-10-21 2020-10-21 Driver state detection system based on millimeter wave radar and camera fusion and vehicle

Publications (2)

Publication Number Publication Date
CN112220480A CN112220480A (en) 2021-01-15
CN112220480B true CN112220480B (en) 2023-08-04

Family

ID=74108908

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202011132636.3A Active CN112220480B (en) 2020-10-21 2020-10-21 Driver state detection system based on millimeter wave radar and camera fusion and vehicle

Country Status (1)

Country Link
CN (1) CN112220480B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114041767A (en) * 2021-10-11 2022-02-15 宁波春建电子科技有限公司 Heart rate detection method based on depth camera and millimeter wave radar
CN114132326A (en) * 2021-11-26 2022-03-04 北京经纬恒润科技股份有限公司 Method and device for processing fatigue driving
CN114506335A (en) * 2022-01-13 2022-05-17 四川豪智融科技有限公司 Driver fatigue and health monitoring system for automatic driving
CN115227247B (en) * 2022-07-20 2023-12-26 中南大学 Fatigue driving detection method, system and storage medium based on multisource information fusion
CN115294730A (en) * 2022-08-11 2022-11-04 深圳市锐驰曼科技发展有限公司 Fatigue driving monitoring system based on AI video and millimeter wave radar fusion technology
CN118356176B (en) * 2024-06-19 2024-08-16 大连海事大学 Wireless sign mining method suitable for non-contact fatigue detection

Patent Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR20110059019A (en) * 2009-11-27 2011-06-02 현대자동차주식회사 Sening system of driver's sleepiness and tireness
CN110077414A (en) * 2019-04-04 2019-08-02 合肥思艾汽车科技有限公司 A kind of vehicle driving safety support method and system based on driver status monitoring
CN110276273A (en) * 2019-05-30 2019-09-24 福建工程学院 Merge the Driver Fatigue Detection of facial characteristics and the estimation of image pulse heart rate

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
基于面部特征的驾驶员疲劳智能预警系统 (Intelligent early-warning system for driver fatigue based on facial features); 姜晓艳; 赵娟; 张然; 吴俊芹; 吴晓飞; 巩金海; 数码世界 (Digital World), No. 09; full text *

Also Published As

Publication number Publication date
CN112220480A (en) 2021-01-15

Similar Documents

Publication Publication Date Title
CN112220480B (en) Driver state detection system based on millimeter wave radar and camera fusion and vehicle
CN110276273B (en) Driver fatigue detection method integrating facial features and image pulse heart rate estimation
Kang Various approaches for driver and driving behavior monitoring: A review
Assari et al. Driver drowsiness detection using face expression recognition
Lee et al. Real-time physiological and vision monitoring of vehicle driver for non-intrusive drowsiness detection
CN112434611B (en) Early fatigue detection method and system based on eye movement subtle features
CN112220481B (en) Driver driving state detection method and safe driving method thereof
Wu et al. Reasoning-based framework for driving safety monitoring using driving event recognition
CN112829767B (en) Automatic driving control system and method based on monitoring of misoperation of driver
CN105788176A (en) Fatigue driving monitoring and prompting method and system
CN109367539A (en) A kind of intelligence system detecting fatigue driving
CN112208544B (en) Driving capability judgment method for driver, safe driving method and system thereof
US20220036101A1 (en) Methods, systems and computer program products for driver monitoring
Liu et al. A review of driver fatigue detection: Progress and prospect
JP3480483B2 (en) Arousal level estimation device
Kumar et al. Detecting distraction in drivers using electroencephalogram (EEG) signals
Flores-Monroy et al. Visual-based real time driver drowsiness detection system using CNN
Yin et al. A driver fatigue detection method based on multi-sensor signals
CN117333853A (en) Driver fatigue monitoring method and device based on image processing and storage medium
Chiou et al. Abnormal driving behavior detection using sparse representation
Zhan et al. A Review of Driver Fatigue Detection and Warning Based on Multi-Information Fusion
CN113901866A (en) Fatigue driving early warning method based on machine vision
Ahir et al. Driver inattention monitoring system: A review
CN115610430A (en) Intelligent automobile safe driving auxiliary system and method
Srivastava Driver's drowsiness identification using eye aspect ratio with adaptive thresholding

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant