CN112494045A - Driver fatigue detection method and device - Google Patents
- Publication number
- CN112494045A (application number CN202011481838.9A)
- Authority
- CN
- China
- Prior art keywords
- fatigue
- driver
- eyes
- eye
- image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/165—Evaluating the state of mind, e.g. depression, anxiety
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/163—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state by tracking eye movement, gaze, or pupil change
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/16—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state
- A61B5/18—Devices for psychotechnics; Testing reaction times ; Devices for evaluating the psychological state for vehicle drivers or machine operators
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/72—Signal processing specially adapted for physiological signals or for diagnostic purposes
- A61B5/7235—Details of waveform analysis
- A61B5/725—Details of waveform analysis using specific filters therefor, e.g. Kalman or adaptive filters
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/746—Alarms related to a physiological condition, e.g. details of setting alarm thresholds or avoiding false alarms
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/174—Facial expression recognition
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/18—Eye characteristics, e.g. of the iris
- G06V40/19—Sensors therefor
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2503/00—Evaluating a particular growth phase or type of persons or animals
- A61B2503/20—Workers
- A61B2503/22—Motor vehicles operators, e.g. drivers, pilots, captains
Landscapes
- Health & Medical Sciences (AREA)
- Engineering & Computer Science (AREA)
- Life Sciences & Earth Sciences (AREA)
- Physics & Mathematics (AREA)
- General Health & Medical Sciences (AREA)
- Molecular Biology (AREA)
- Animal Behavior & Ethology (AREA)
- Surgery (AREA)
- Public Health (AREA)
- Medical Informatics (AREA)
- Heart & Thoracic Surgery (AREA)
- Biomedical Technology (AREA)
- Pathology (AREA)
- Biophysics (AREA)
- Psychiatry (AREA)
- Veterinary Medicine (AREA)
- Social Psychology (AREA)
- Theoretical Computer Science (AREA)
- Hospice & Palliative Care (AREA)
- Educational Technology (AREA)
- Developmental Disabilities (AREA)
- Child & Adolescent Psychology (AREA)
- General Physics & Mathematics (AREA)
- Multimedia (AREA)
- Psychology (AREA)
- Human Computer Interaction (AREA)
- Physiology (AREA)
- Ophthalmology & Optometry (AREA)
- Oral & Maxillofacial Surgery (AREA)
- Artificial Intelligence (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Signal Processing (AREA)
- Emergency Alarm Devices (AREA)
- Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)
- Traffic Control Systems (AREA)
Abstract
The invention relates to a driver fatigue detection method and device. The method comprises the following steps: acquiring facial information of a driver in real time; analyzing the acquired facial information and performing fatigue classification; and reminding the driver according to the fatigue classification. The invention tracks and predicts the eye positions, analyzes the eye information to calculate eye features and the facial expression features conveyed by the mouth state, sets light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features, and issues alarms of different levels, thereby realizing fatigue detection.
Description
Technical Field
The invention relates to the technical field of communication, in particular to a method and a device for detecting fatigue of a driver.
Background
The current driver fatigue detection research methods can be divided into two main categories:
a) Starting from the driver's own characteristics: physiological parameters or visual features of the driver are acquired through dedicated equipment, and a corresponding pattern recognition technique classifies the different feature patterns exhibited in the normal and fatigued states, so as to detect whether fatigue has occurred;
b) Indirectly judging whether the driver is fatigued from the behavior of the vehicle. In such techniques, sensors acquire various parameters of the vehicle during driving, and fatigue is inferred from abnormal driving conditions, such as the vehicle crossing a lane marking, exceeding the speed limit, or following the preceding vehicle too closely.
A number of fatigue detection systems already exist, including the following:
a) DAS, developed by the International University of Australia, is now commercially available. A human-eye tracking system mounted on the instrument panel monitors the driver; driver performance is monitored, grip-force feedback from the steering wheel is used, and road-tracking deviation is fed back to the driver. The road-tracking module is intended to detect violations such as the vehicle suddenly crossing a lane marking or the road edge; when an abnormality is detected the system issues a warning and can brake through the steering wheel. Seat-vibration warnings tied to the degree of lane deviation are also used to prompt the driver to correct the lane deviation.
b) The European Union completed the AWAKE project in 2004. The feature states used include eyelid movement, grip-force change and road tracking, together with braking actions and steering-wheel position; these are combined to counteract traffic risk.
c) The Copilot monitoring system was developed by Carnegie Mellon University. It measures eyelid movement using PERCLOS; the monitoring system is compact and works well, providing an effective research tool.
d) The faceLAB system developed by the Seeing Machines research group monitors driver behaviour and can detect fatigue, distraction and similar states. Video images are obtained with a pair of cameras and matched left to right to obtain the three-dimensional position of each feature. The three-dimensional head position is located by least-squares optimization; the faceLAB software processes the eye-gaze data in parallel, locates the iris centre, determines the gaze direction from the gaze vector, calculates the eye opening and blink frequency, and monitors the eyelids. faceLAB has proven very effective in simulated driving and gives good results under low light, large head movements and gaze-direction tracking, even when the driver wears sunglasses. These systems use a variety of visual features together with vehicle behaviour to detect fatigue, and adopt several alarm modes to remind the driver to drive safely. At present there are relatively few research results of this kind in China.
The existing problems and development trends are as follows:
Because of individual differences among drivers and large variations in the driving environment, such as lighting and road surface, current fatigue detection algorithms are mostly based on simulated driving environments; the next step is to move to real driving environments so that fatigue detection technology can be widely applied commercially and reduce the traffic accidents caused by fatigued driving. The types of features that can be obtained directly, whether from the driver or from vehicle behaviour, are limited, and the directly extracted surface feature data contain considerable redundancy. Therefore, on the one hand fatigue features should be mined and the parameters that best characterize fatigue should be extracted with advanced signal processing methods; on the other hand, signal fusion should combine multiple fatigue feature parameters to detect the driver's fatigue state, overcoming the influence of space, illumination and so on and improving the real-time performance and accuracy of the detection algorithm. A driver fatigue detection system should also have a fatigue decision function. Because different people show fatigue differently, the detection system needs intelligence, self-learning and reasoning capabilities: during an initial driving stage the system trains itself on the driver's data, learns that driver's fatigue characteristics, and selects the detection method best suited to that driver, so that it can adapt to each driver.
At present, fatigue detection equipment at home and abroad mainly focuses on the driver's eyes and does not judge the mouth shape, so missed and erroneous judgments occur easily. In addition, existing fatigue driving equipment can only detect the eye positions and features in real time and cannot predict them, so it cannot predict the driver's fatigue state. Finally, existing equipment detects the eye positions and features by a pixel-comparison method on odd and even image frames, with low detection speed and low detection accuracy.
Disclosure of Invention
In order to overcome the technical defects in the prior art, the invention provides a driver fatigue detection method and device. Using a Kalman filter and the Mean-shift algorithm with a continuous-detection and short-time-tracking strategy, the method tracks and predicts the eye positions, analyzes the eye information to calculate eye features and the facial expression features conveyed by the mouth state, sets light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features, and issues alarms of different levels to realize fatigue detection, effectively solving the problems described in the background.
In order to solve the above technical problems, the technical scheme of the driver fatigue detection method and device provided by the invention is as follows:
in a first aspect, an embodiment of the invention discloses a method for detecting fatigue of a driver, which comprises the following steps:
acquiring face information of a driver in real time;
analyzing the obtained face information, and performing fatigue classification;
and reminding the driver according to the fatigue classification.
In any of the above schemes, preferably, the facial information of the driver includes the eye state and the mouth state; an infrared light source is used to collect images, the eye state is detected accurately by differencing odd and even image frames, the eye positions are tracked and predicted with a Kalman filter and the Mean-shift algorithm using a continuous-detection and short-time-tracking strategy, and the eye information is analyzed to calculate the eye features and the facial expression features conveyed by the mouth state.
In any of the above schemes, preferably, the driver fatigue detection method further includes tracking and predicting the eye positions, analyzing the eye information to calculate the eye features and the facial expression features conveyed by the mouth state, and realizing fatigue detection by setting light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features and issuing alarms of different levels.
In any of the above schemes, preferably, the method further includes judging whether the driver is fatigued from whether the driver's eyes are open or closed, and judging whether the feature values and position values of the eyes and mouth reach set values; if they do, an alarm device is controlled to give an alarm reminding the driver to rest and to drive safely.
In any of the above schemes, preferably, the infrared light source consists of two groups of infrared diodes, an inner ring and an outer ring, evenly distributed on coplanar, coaxial rings; the two rings are switched freely, which reduces the restriction on the camera's field of view and enables infrared detection of the eyes.
In any of the above schemes, preferably, the driver fatigue detection method further includes an eye positioning analysis algorithm, specifically including the following steps:
eye detection: after the infrared light illuminates the face, a CCD camera acquires a video image of the face, and the difference of the reflection images at the two wavelengths is computed to detect the eyes;
because of noise, the difference image must be preprocessed: histogram equalization is performed, widening the gray levels that contain many pixels and compressing the gray levels that contain few;
the histogram of the original image is thereby transformed into a uniform distribution, and the image is then converted into a binary image.
In any of the above schemes, preferably, the driver fatigue detection method further includes an eye tracking algorithm, and specifically includes the following steps:
eye tracking, which combines the Mean-Shift algorithm and a Kalman filter:
if the eyes are detected with the Kalman filter, the detected result is adopted directly and the current image and the eye positions are recorded;
if the eyes are not detected, the Mean-Shift algorithm is initialized with the most recently recorded image and eye positions and searches the current image for the eye region by matching; if detection with the Kalman filter continues to fail in subsequent images, tracking is continued by the Mean-Shift algorithm.
In any of the above schemes, preferably, analyzing the eye information includes analyzing eyelid movement parameters, extracting the driver's eyelid feature information in real time, and calculating the driver's fatigue value with the PERCLOS (percentage of eye closure) algorithm and an average eye-closing-speed algorithm, where PERCLOS is the proportion of time within a given period during which the eyes are closed.
In any of the above schemes, preferably, reminding the driver according to the fatigue classification includes a cold-air alarm and a non-toxic irritant-gas alarm: if the driver is judged to be in a fatigue state, a sound alarm is activated, and if the continuous sound alarm is not switched off within a preset time, the cold-air alarm and the non-toxic irritant-gas alarm are instructed to spray cold air and non-toxic irritant gas to keep the driver awake.
The invention uses a Kalman filter and the Mean-shift algorithm with a continuous-detection and short-time-tracking strategy to track and predict the eye positions, analyzes the eye information to calculate eye features and the facial expression features conveyed by the mouth state, sets light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features, and issues alarms of different levels to realize fatigue detection.
In a second aspect, an apparatus for preventing fatigued driving comprises:
the acquisition module is used for acquiring the facial information of the driver in real time;
the analysis module is used for analyzing the acquired facial information and performing fatigue classification; for tracking and predicting the eye positions, analyzing the eye information to calculate the eye features and the facial expression features conveyed by the mouth state, and realizing fatigue detection by setting light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features and issuing alarms of different levels; and for judging whether the driver is fatigued from whether the driver's eyes are open or closed, judging whether the feature values and position values of the eyes and mouth reach set values, and, if so, controlling the alarm device to give an alarm reminding the driver to rest and to drive safely;
and the reminding module is used for reminding the driver according to the fatigue classification.
The invention uses a Kalman filter and the Mean-shift algorithm with a continuous-detection and short-time-tracking strategy to track and predict the eye positions, analyzes the eye information to calculate eye features and the facial expression features conveyed by the mouth state, sets light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features, and issues alarms of different levels to realize fatigue detection.
Drawings
The drawings are included to provide a further understanding of the invention, and are incorporated in and constitute a part of this specification.
FIG. 1 is a schematic diagram of a driver fatigue detection method according to the present invention;
FIG. 2 is a schematic diagram of the principle of bright (upper) and dark (lower) pupil effects in the driver fatigue detection method according to the present invention;
fig. 3 is a schematic diagram of the judgment principle in the driver fatigue detection method according to the present invention.
Fig. 4 is a schematic view of an arrangement of infrared light sources in the driver fatigue detection method according to the present invention.
Fig. 5 is a schematic view of an apparatus for preventing fatigue driving according to the present invention.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, the present invention is described in further detail below with reference to the accompanying drawings and embodiments. It should be understood that the specific embodiments described herein are merely illustrative of the invention and are not intended to limit the invention.
It will be understood that when an element is referred to as being "secured to" or "disposed on" another element, it can be directly on the other element or be indirectly on the other element. When an element is referred to as being "connected to" another element, it can be directly connected to the other element or be indirectly connected to the other element.
In the description of the present invention, it is to be understood that the terms "length", "width", "upper", "lower", "front", "rear", "left", "right", "vertical", "horizontal", "top", "bottom", "inner", "outer", and the like, indicate orientations or positional relationships based on the orientations or positional relationships illustrated in the drawings, and are used merely for convenience in describing the present invention and for simplicity in description, and do not indicate or imply that the devices or elements referred to must have a particular orientation, be constructed in a particular orientation, and be operated, and thus, are not to be construed as limiting the present invention.
Furthermore, the terms "first" and "second" are used for descriptive purposes only and are not to be construed as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the present invention, "a plurality" means two or more unless specifically defined otherwise.
For better understanding of the above technical solutions, the technical solutions of the present invention will be described in detail below with reference to the drawings and the detailed description of the present invention.
Embodiment:
in a first aspect, as shown in fig. 1, an embodiment of the present invention discloses a method for detecting fatigue of a driver, where the method includes the following steps:
step 1: the face information of the driver is acquired in real time.
As shown in Fig. 2, the invention collects micro-expressions of the driver's whole face, including the eye state and the mouth state. Based on the marked difference in the reflectivity of human eyes at two infrared wavelengths (850 nm / 950 nm), an infrared light source is used to collect images, the eye state is detected accurately by differencing odd and even image frames, the eye positions are tracked and predicted with a Kalman filter and the Mean-shift algorithm using a continuous-detection and short-time-tracking strategy, and the eye information is analyzed to calculate the eye features and the facial expression features conveyed by the mouth state.
Step 2: the acquired face information is analyzed and fatigue classification is performed.
As shown in Fig. 3, the invention tracks and predicts the eye positions, analyzes the eye information to calculate the eye features and the facial expression features conveyed by the mouth state, sets light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features, and issues alarms of different levels, thereby realizing fatigue detection.
As shown in Fig. 2, a CCD camera based on infrared detection is used, for example; the camera passes the infrared detection image through a video input decoder to a digital signal processor (DSP), which performs pattern recognition and processing on the detected image.
Step 3: the driver is reminded according to the fatigue classification.
The digital signal processor is connected to the alarm device. Whether the driver is fatigued is judged from whether the driver's eyes are open or closed; it is further judged whether the feature values and position values of the eyes and mouth reach the set values, and if so the alarm device gives an alarm reminding the driver to rest and to drive safely.
As shown in Fig. 4, in the above infrared-detection CCD camera the infrared light source consists of two groups of infrared diodes, an inner ring and an outer ring, evenly distributed on coplanar, coaxial rings; the two rings are switched freely, which reduces the restriction on the camera's field of view. The two groups of diodes emit infrared light at 850 nm and 950 nm respectively, enabling infrared detection of the eyes.
In the invention, the eyes are detected by infrared detection as follows:
The human eye reflects infrared light of different wavelengths to different degrees: at 850 nm the retina reflects about 90% of the incident light, while at 950 nm it reflects only about 40%; within the range of 880 ± 80 nm the other parts of the face reflect infrared light essentially equally. Computing the difference between the images obtained when the face is illuminated with 850 nm and with 950 nm infrared light therefore detects the eyes accurately. The optimal size of the LED ring mounted on the lens surface can only be determined empirically; turning on the outer ring of diodes produces the dark-pupil effect, and turning on the inner ring produces the bright-pupil effect.
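As a rough sketch of this difference operation (not the patent's exact implementation), the snippet below assumes two synchronized 8-bit gray frames, one captured as a bright-pupil image and one as a dark-pupil image, and uses OpenCV; the Otsu threshold is an added assumption.

```python
import cv2
import numpy as np

def eye_difference_image(bright_frame, dark_frame):
    """Difference of the bright-pupil (850 nm) and dark-pupil (950 nm) frames.

    The retina reflects far more infrared at 850 nm, while the rest of the
    face reflects both wavelengths almost equally, so after subtraction mainly
    the pupils remain. The result is equalized and Otsu-binarized."""
    diff = cv2.absdiff(bright_frame, dark_frame)          # 8-bit gray frames assumed
    diff = cv2.equalizeHist(diff)
    _, binary = cv2.threshold(diff, 0, 255, cv2.THRESH_BINARY | cv2.THRESH_OTSU)
    pupil_pixels = np.column_stack(np.nonzero(binary))    # (row, col) pupil candidates
    return binary, pupil_pixels
```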
The invention also comprises a positioning analysis and tracking algorithm for the eyes, which comprises the following steps:
step 11: and (4) detecting eyes.
Eye detection is completed with the infrared detection method described above: after the infrared light illuminates the face, a CCD camera acquires a video image of the face, and the difference of the reflection images at the two wavelengths is computed, which detects the eyes. Because of noise, the difference image must be preprocessed: histogram equalization is first performed, widening the gray levels that contain many pixels and compressing the gray levels that contain few; the histogram of the original image is thereby transformed into a uniform distribution, and the image is then converted into a binary image.
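The equalize-then-binarize preprocessing described here can be written out directly; this NumPy sketch implements the standard cumulative-histogram equalization and a fixed-threshold binarization. The 8-bit input and the threshold value of 128 are assumptions for illustration, not values given in the patent.

```python
import numpy as np

def equalize_and_binarize(gray, threshold=128):
    """Histogram-equalize an 8-bit gray image, then binarize it.

    Equalization maps each gray level through the normalized cumulative
    histogram, which widens heavily populated gray levels and compresses
    sparsely populated ones, as described above."""
    hist = np.bincount(gray.ravel(), minlength=256)
    cdf = hist.cumsum()
    cdf_min = cdf[cdf > 0].min()
    denom = max(gray.size - cdf_min, 1)
    # Standard equalization transform: level v -> round(255 * (cdf(v) - cdf_min) / denom)
    lut = np.clip(np.round(255.0 * (cdf - cdf_min) / denom), 0, 255).astype(np.uint8)
    equalized = lut[gray]
    binary = (equalized >= threshold).astype(np.uint8) * 255
    return equalized, binary
```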
Step 12: eye tracking.
The Mean-Shift algorithm and a Kalman filter are combined in a continuous-detection and short-time-tracking strategy: when the eyes are detected with the Kalman filter, the detected result is adopted directly and the current image and the eye positions are recorded; if the eyes are not detected, the Mean-Shift algorithm is initialized with the most recently recorded image and eye positions and searches the current image for the eye region by matching, and if detection with the Kalman filter continues to fail in subsequent images, tracking is continued by the Mean-Shift algorithm.
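The switch between continuous detection and short-time tracking can be sketched with OpenCV's KalmanFilter and meanShift as below. The constant-velocity state layout (x, y, vx, vy), the gray-level histogram back-projection used for Mean-Shift, and the noise settings are illustrative assumptions rather than details fixed by the patent; `detection` stands in for the output of the infrared difference detector (integer pixel coordinates).

```python
import cv2
import numpy as np

class EyeTracker:
    """Continuous detection with a Kalman filter, falling back to Mean-Shift
    for short-time tracking when detection fails (illustrative sketch)."""

    def __init__(self):
        # Constant-velocity model: state = (x, y, vx, vy), measurement = (x, y).
        self.kf = cv2.KalmanFilter(4, 2)
        self.kf.transitionMatrix = np.array([[1, 0, 1, 0],
                                             [0, 1, 0, 1],
                                             [0, 0, 1, 0],
                                             [0, 0, 0, 1]], np.float32)
        self.kf.measurementMatrix = np.array([[1, 0, 0, 0],
                                              [0, 1, 0, 0]], np.float32)
        self.kf.processNoiseCov = np.eye(4, dtype=np.float32) * 1e-2
        self.last_window = None     # (x, y, w, h) of the last detected eye region
        self.last_hist = None       # gray-level histogram of that region

    def update(self, gray, detection):
        """detection: (x, y, w, h) from the difference-image detector, or None."""
        self.kf.predict()
        if detection is not None:
            # Detection succeeded: adopt it, correct the filter, record image info.
            x, y, w, h = detection
            center = np.array([[x + w / 2.0], [y + h / 2.0]], np.float32)
            self.kf.correct(center)
            roi = gray[y:y + h, x:x + w]
            self.last_hist = cv2.calcHist([roi], [0], None, [32], [0, 256])
            cv2.normalize(self.last_hist, self.last_hist, 0, 255, cv2.NORM_MINMAX)
            self.last_window = detection
            return detection
        if self.last_window is None:
            return None                      # nothing recorded yet, cannot track
        # Detection failed: initialize Mean-Shift from the last recorded region
        # and search the current image for the best-matching eye region.
        backproj = cv2.calcBackProject([gray], [0], self.last_hist, [0, 256], 1)
        criteria = (cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 10, 1.0)
        _, self.last_window = cv2.meanShift(backproj, self.last_window, criteria)
        return self.last_window
```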
In the invention, the analysis of the eye information includes analyzing eyelid movement parameters. After a person becomes fatigued, the eyes open and close more slowly, the eye-movement frequency decreases, the eyelids tend to close and the gaze becomes dull; these characteristics are extracted to obtain the eyelid feature information, and the driver's fatigue value is calculated with the PERCLOS algorithm and an average eye-closing-speed algorithm, where PERCLOS is the proportion of time within a given period during which the eyes are closed.
In the specific experiments three metrics are used, P70, P80 and EyeMea (EM) (a short PERCLOS computation sketch is given after this list):
p70: the percentage of time during which the eye-closure area exceeds 70%;
p80: the percentage of time during which the eye-closure area exceeds 80%; this is the most commonly used indicator, and P80 is believed to reflect the degree of human fatigue best.
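A minimal sketch of the PERCLOS measure described above, computed over a per-frame eye-openness sequence; the openness values are assumed to come from the eye-feature extraction, and a frame counting as closed when openness is at or below 20% corresponds to the P80 criterion.

```python
def perclos(openness, closed_threshold=0.2):
    """PERCLOS: fraction of frames in which the eye is considered closed.

    `openness` is a sequence of per-frame eye-openness ratios in [0, 1];
    a frame counts as closed when openness <= closed_threshold, i.e. the
    eye is at least 80 % closed (the P80 criterion)."""
    if len(openness) == 0:
        return 0.0
    closed = sum(1 for o in openness if o <= closed_threshold)
    return closed / len(openness)

# Example: 30 frames, 6 of them closed -> PERCLOS = 0.2
print(perclos([1.0] * 24 + [0.1] * 6))
```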
In the invention, the driver fatigue detection method also includes face pose analysis. The driver's face pose is evaluated from the eye position information: the offsets Δx = x2 − x1 and Δy = y2 − y1 are computed, where (x1, y1) and (x2, y2) are the coordinates of the left and right eyes, and the pose angle θ is obtained from them (a small computation sketch follows this list);
when the face turns to the left, θ > 8°;
when the face points straight ahead, |θ| ≤ 8°;
when the face turns to the right, θ < −8°.
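One way to read the relation above is to take the angle defined by the two eye positions. The sketch below assumes θ = arctan(Δy/Δx) in degrees, which is an interpretation of the garbled formula rather than text given verbatim in the patent; the example coordinates are invented.

```python
import math

def face_pose_angle(left_eye, right_eye):
    """Angle θ (degrees) between the eyes, assumed here to be arctan(Δy/Δx)."""
    (x1, y1), (x2, y2) = left_eye, right_eye
    return math.degrees(math.atan2(y2 - y1, x2 - x1))

def face_direction(theta, limit=8.0):
    """Classify the face pose with the ±8° thresholds given above."""
    if theta > limit:
        return "left"
    if theta < -limit:
        return "right"
    return "straight ahead"

print(face_direction(face_pose_angle((100, 120), (160, 110))))  # about -9.5 deg -> "right"
```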
in the invention, the method for detecting the fatigue of the driver also comprises the evaluation and tracking of the eye gazing direction, the gazing information of the person comprises the gazing direction of the face and the gazing direction of the eyes, and compared with the gazing direction of the eyes, the visual field of the face in the gazing direction is larger, so that the human face is gazed for a long time as generalized gazing, the eyes are gazed for a long time as narrow gazing, and the two gazing information are comprehensively considered in the design. The generalized gaze is obtained by calculating the head posture information, and the narrowly-defined gaze is obtained by calculating the geometric parameters of the pupils of the eyes shot by the camera. Setting delta x and delta y as displacement deflection parameters of pupil-reflex effect; r is the ratio of the major and minor axes of the pupil reflection image ellipse; θ is the azimuth of the ellipse; gx, gy are coordinate functions of the pupil reflected light image. Δ x and Δ y reflect the functional relationship between the reflected image and the through-hole, which is the narrowly defined gaze information. r represents face rotation in the antiphase plane, the ratio is 1 when the face is right ahead, and the ratio becomes larger or smaller when the face moves up and down, left and right. The angle theta is the face motion direction in the plane of the optical axis of the camera objective lens, and the pupil centroid coordinates in the plane (gx, gy).
With these six parameters, a mathematical mapping of the eye gaze direction is constructed with a generalized regression neural network, and the driver's gaze information is computed; if the driver gazes in one direction for longer than a time limit, a fatigue state can be inferred.
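A generalized regression neural network is essentially Nadaraya-Watson kernel regression over stored training samples. The sketch below maps the six gaze parameters (Δx, Δy, r, θ, gx, gy) to a gaze target; the training data, the spread parameter sigma and the two-dimensional (yaw, pitch) output are illustrative assumptions, not details from the patent.

```python
import numpy as np

class GRNN:
    """Minimal generalized regression neural network (kernel regression)."""

    def __init__(self, sigma=0.5):
        self.sigma = sigma
        self.X = None   # training inputs, one six-parameter gaze vector per row
        self.Y = None   # training targets, e.g. (yaw, pitch) gaze angles

    def fit(self, X, Y):
        self.X = np.asarray(X, dtype=float)
        self.Y = np.asarray(Y, dtype=float)

    def predict(self, x):
        d2 = np.sum((self.X - np.asarray(x, dtype=float)) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * self.sigma ** 2))       # pattern-layer activations
        return (w[:, None] * self.Y).sum(axis=0) / w.sum()

# Toy usage with made-up samples: [dx, dy, r, theta, gx, gy] -> (yaw, pitch)
net = GRNN(sigma=1.0)
net.fit([[0, 0, 1.0, 0, 320, 240], [5, 0, 1.2, 10, 360, 240]],
        [[0.0, 0.0], [15.0, 0.0]])
print(net.predict([2.5, 0, 1.1, 5, 340, 240]))          # roughly halfway between the targets
```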
In the invention, the driver fatigue detection method also includes mouth-state analysis; the state information of the mouth can be used to describe facial expression features. The lower half of the face is thresholded, mainly with an iterative threshold-selection algorithm, to overcome the problem that the small difference between lip colour and skin colour prevents a complete mouth contour from being obtained in the binary image: the captured image is converted to a gray-scale image and automatic thresholding is applied, yielding the contour information of the mouth region.
Because the gray value of the lips is lighter than that of the face, a hole region appears after binarization, together with holes and noise caused by the nostrils; therefore, after the mouth-region contour information is obtained, the image is denoised, the contour of the hole region produced by the mouth is represented by its minimum enclosing rectangle, and finally the mouth corners are located with the classical Harris corner-detection method. Corner detection is performed on the original gray-scale image and only within the mouth region. The corner-detection algorithm is as follows:
R = det(M) − k·tr²(M)
where M(x, y) = [Iu²(x, y), Iuv(x, y); Ivu(x, y), Iv²(x, y)]; Iu(x, y) and Iv(x, y) are the partial derivatives of the gray level at the image point (x, y) in the u and v directions, and Iuv(x, y) is the second-order mixed partial derivative; k is a constant; tr(M) is the trace of the matrix M and det(M) is its determinant.
The degree of mouth opening is determined by the ratio of the height between the upper and lower lips of the opened mouth to the width between the left and right mouth corners, and the driver's fatigue degree is calculated from the frequency of yawning.
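Applying the corner response R = det(M) − k·tr²(M) to the mouth region could look like the OpenCV sketch below; the response threshold, the value of k, and the way corner extremes are turned into mouth width and height are illustrative assumptions.

```python
import cv2
import numpy as np

def mouth_openness(gray_mouth_roi, k=0.04, response_frac=0.01):
    """Locate mouth-corner candidates with Harris corners and return the
    opening ratio height / width used as the yawning feature."""
    roi = np.float32(gray_mouth_roi)
    response = cv2.cornerHarris(roi, blockSize=2, ksize=3, k=k)   # R = det(M) - k*tr(M)^2
    ys, xs = np.where(response > response_frac * response.max())
    if xs.size == 0:
        return 0.0
    width = xs.max() - xs.min()        # distance between left and right mouth corners
    height = ys.max() - ys.min()       # distance between upper and lower lip corners
    return float(height) / float(width) if width > 0 else 0.0
```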
The driver fatigue detection method also includes cold-air and non-toxic irritant-gas alarms: when the device judges that the driver is in a fatigue state, a sound alarm is started, and if the continuous sound alarm is not switched off within 10 seconds, the cold-air and non-toxic irritant-gas alarm mechanism is triggered and sprays cold air and non-toxic irritant gas to keep the driver awake. The sound-alarm module consists of a loudspeaker and a stop-alarm control switch; the triggered cold-air and non-toxic irritant-gas alarms consist of a cold-air device and a gas device.
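The escalation from the sound alarm to the cold-air and irritant-gas alarms can be sketched as a small timer-based state machine. The 10-second window comes from the text above; the `speaker`, `air_valve` and `gas_valve` objects and their method names are placeholders for the actual alarm hardware.

```python
import time

class AlarmController:
    """Escalate from a sound alarm to the cold-air / irritant-gas alarms
    if the driver does not switch the sound alarm off within 10 s."""

    ESCALATE_AFTER_S = 10.0

    def __init__(self, speaker, air_valve, gas_valve):
        self.speaker, self.air_valve, self.gas_valve = speaker, air_valve, gas_valve
        self.sound_started_at = None

    def on_fatigue_detected(self):
        if self.sound_started_at is None:
            self.sound_started_at = time.monotonic()
            self.speaker.start()                    # placeholder device call

    def on_driver_acknowledged(self):
        """Driver pressed the stop-alarm control switch."""
        self.speaker.stop()
        self.sound_started_at = None

    def tick(self):
        """Call periodically; trigger the stronger alarms after the timeout."""
        if (self.sound_started_at is not None
                and time.monotonic() - self.sound_started_at > self.ESCALATE_AFTER_S):
            self.air_valve.spray_cold_air()         # placeholder device call
            self.gas_valve.spray_irritant_gas()     # placeholder device call
```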
The multi-feature fatigue measurement technique in the device accurately extracts the driver's eye features under infrared illumination and measures the eyelid-movement parameters (blink speed, eye opening and closing time, and pupil geometry) to calculate and judge the degree of fatigue; it also extracts the feature parameters of the driver's mouth and measures the facial expression features of a fatigued face. Driver fatigue detection is realized through this multi-feature detection strategy, which detects the driver's fatigue state accurately in real time and is of great significance for traffic safety.
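Combining the eye, mouth and gaze features into the three-level classification (light, moderate, excessive fatigue) that drives the graded alarms might look like the sketch below. All thresholds and the scoring scheme are assumptions for illustration; the patent does not specify numeric criteria.

```python
def classify_fatigue(perclos_value, yawns_per_minute, gaze_fixed_seconds):
    """Fuse PERCLOS, yawning frequency and prolonged gaze into a fatigue level."""
    score = 0
    score += 2 if perclos_value >= 0.40 else 1 if perclos_value >= 0.15 else 0
    score += 2 if yawns_per_minute >= 3 else 1 if yawns_per_minute >= 1 else 0
    score += 1 if gaze_fixed_seconds >= 5 else 0
    if score >= 4:
        return "excessive fatigue"   # strongest alarm level
    if score >= 2:
        return "moderate fatigue"
    if score >= 1:
        return "light fatigue"
    return "awake"

print(classify_fatigue(0.25, 2, 6))   # score 3 -> "moderate fatigue"
```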
As shown in Fig. 5, in a second aspect, an apparatus for preventing fatigued driving comprises:
the acquisition module is used for acquiring the facial information of the driver in real time;
the analysis module is used for analyzing the acquired facial information and performing fatigue classification; for tracking and predicting the eye positions, analyzing the eye information to calculate the eye features and the facial expression features conveyed by the mouth state, and realizing fatigue detection by setting light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features and issuing alarms of different levels; and for judging whether the driver is fatigued from whether the driver's eyes are open or closed, judging whether the feature values and position values of the eyes and mouth reach set values, and, if so, controlling the alarm device to give an alarm reminding the driver to rest and to drive safely;
and the reminding module is used for reminding the driver according to the fatigue classification.
The invention uses a Kalman filter and the Mean-shift algorithm with a continuous-detection and short-time-tracking strategy to track and predict the eye positions, analyzes the eye information to calculate eye features and the facial expression features conveyed by the mouth state, sets light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features, and issues alarms of different levels to realize fatigue detection.
Although the present invention has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that various changes, modifications and substitutions can be made without departing from the spirit and scope of the invention as defined by the appended claims. Any modification, equivalent replacement, or improvement made within the spirit and principle of the present invention should be included in the protection scope of the present invention.
Claims (10)
1. A driver fatigue detection method, characterized in that the method comprises the steps of:
acquiring face information of a driver in real time;
analyzing the obtained face information, and performing fatigue classification;
and reminding the driver according to the fatigue classification.
2. The method as claimed in claim 1, wherein the facial information of the driver includes the eye state and the mouth state, the images are collected with an infrared light source, the eye state is detected accurately by differencing odd and even image frames, the eye positions are tracked and predicted with a Kalman filter and the Mean-shift algorithm using a continuous-detection and short-time-tracking strategy, and the eye information is analyzed to calculate the eye features and the facial expression features conveyed by the mouth state.
3. The method for detecting driver fatigue as claimed in claim 2, further comprising tracking and predicting the eye positions, analyzing the eye information to calculate the eye features and the facial expression features conveyed by the mouth state, and realizing fatigue detection by setting light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features and issuing alarms of different levels.
4. The method for detecting driver fatigue as claimed in claim 3, further comprising judging whether the driver is fatigued from whether the driver's eyes are open or closed, judging whether the feature values and position values of the eyes and mouth reach set values, and, if so, controlling an alarm device to give an alarm reminding the driver to rest and to drive safely.
5. The method for detecting driver fatigue as claimed in claim 4, wherein the infrared light source consists of two groups of infrared diodes, an inner ring and an outer ring, evenly distributed on coplanar, coaxial rings; the two rings are switched freely, which reduces the restriction on the camera's field of view and enables infrared detection of the eyes.
6. The method for detecting driver fatigue according to claim 5, further comprising an eye positioning analysis algorithm, specifically comprising the steps of:
eye detection: after the infrared light illuminates the face, a CCD camera acquires a video image of the face, and the difference of the reflection images at the two wavelengths is computed to detect the eyes;
because of noise, the difference image must be preprocessed: histogram equalization is performed, widening the gray levels that contain many pixels and compressing the gray levels that contain few;
the histogram of the original image is thereby transformed into a uniform distribution, and the image is then converted into a binary image.
7. The method for detecting driver fatigue according to claim 6, further comprising an eye tracking algorithm, comprising the steps of:
eye tracking, which combines the Mean-Shift algorithm and a Kalman filter,
wherein if the eyes are detected with the Kalman filter, the detected result is adopted directly and the current image and the eye positions are recorded;
and if the eyes are not detected, the Mean-Shift algorithm is initialized with the most recently recorded image and eye positions and searches the current image for the eye region by matching, and if detection with the Kalman filter continues to fail in subsequent images, tracking is continued by the Mean-Shift algorithm.
8. The method of claim 7, wherein analyzing the eye information comprises analyzing eyelid movement parameters, extracting the driver's eyelid feature information in real time, and calculating the driver's fatigue value with the PERCLOS algorithm and an average eye-closing-speed algorithm, where PERCLOS is the proportion of time within a given period during which the eyes are closed.
9. The method as claimed in claim 8, wherein reminding the driver according to the fatigue classification comprises a cold-air alarm and a non-toxic irritant-gas alarm: if the driver is judged to be in a fatigue state, a sound alarm is activated, and if the continuous sound alarm is not switched off within a preset time, the cold-air alarm and the non-toxic irritant-gas alarm are instructed to spray cold air and non-toxic irritant gas to keep the driver awake.
10. A driver fatigue detecting device, characterized by comprising:
the acquisition module is used for acquiring the facial information of the driver in real time;
the analysis module is used for analyzing the acquired facial information and performing fatigue classification; for tracking and predicting the eye positions, analyzing the eye information to calculate the eye features and the facial expression features conveyed by the mouth state, and realizing fatigue detection by setting light-fatigue, moderate-fatigue and excessive-fatigue criteria for these features and issuing alarms of different levels; and for judging whether the driver is fatigued from whether the driver's eyes are open or closed, judging whether the feature values and position values of the eyes and mouth reach set values, and, if so, controlling the alarm device to give an alarm reminding the driver to rest and to drive safely;
and the reminding module is used for reminding the driver according to the fatigue classification.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011481838.9A CN112494045A (en) | 2020-12-15 | 2020-12-15 | Driver fatigue detection method and device |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202011481838.9A CN112494045A (en) | 2020-12-15 | 2020-12-15 | Driver fatigue detection method and device |
Publications (1)
Publication Number | Publication Date |
---|---|
CN112494045A true CN112494045A (en) | 2021-03-16 |
Family
ID=74972269
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202011481838.9A Pending CN112494045A (en) | 2020-12-15 | 2020-12-15 | Driver fatigue detection method and device |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN112494045A (en) |
Patent Citations (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN104224204A (en) * | 2013-12-24 | 2014-12-24 | 烟台通用照明有限公司 | Driver fatigue detection system on basis of infrared detection technology |
CN104240446A (en) * | 2014-09-26 | 2014-12-24 | 长春工业大学 | Fatigue driving warning system on basis of human face recognition |
CN108446600A (en) * | 2018-02-27 | 2018-08-24 | 上海汽车集团股份有限公司 | A kind of vehicle driver's fatigue monitoring early warning system and method |
CN108875642A (en) * | 2018-06-21 | 2018-11-23 | 长安大学 | A kind of method of the driver fatigue detection of multi-index amalgamation |
Cited By (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN115593375A (en) * | 2022-12-16 | 2023-01-13 | 广汽埃安新能源汽车股份有限公司(Cn) | Vehicle emergency braking method, device, equipment and computer readable medium |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| PB01 | Publication | |
| SE01 | Entry into force of request for substantive examination | |
| RJ01 | Rejection of invention patent application after publication | Application publication date: 20210316 |