CN116513199A - Driver fatigue monitoring method integrating visual recognition and vehicle signals - Google Patents
Driver fatigue monitoring method integrating visual recognition and vehicle signals
- Publication number
- CN116513199A CN116513199A CN202310274925.4A CN202310274925A CN116513199A CN 116513199 A CN116513199 A CN 116513199A CN 202310274925 A CN202310274925 A CN 202310274925A CN 116513199 A CN116513199 A CN 116513199A
- Authority
- CN
- China
- Prior art keywords
- driver
- vehicle
- fatigue
- monitoring
- vehicle signals
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Pending
Classifications
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W50/00—Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
- B60W50/08—Interaction between the driver and the control system
- B60W50/14—Means for informing the driver, warning the driver or prompting a driver intervention
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/161—Detection; Localisation; Normalisation
- G06V40/165—Detection; Localisation; Normalisation using facial parts and geometric relationships
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
- B60W2040/0827—Inactivity or incapacity of driver due to sleepiness
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2520/00—Input parameters relating to overall vehicle dynamics
- B60W2520/10—Longitudinal speed
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/229—Attention level, e.g. attentive to driving, reading or sleeping
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2540/00—Input parameters relating to occupants
- B60W2540/30—Driving style
Abstract
The invention belongs to the technical field of driver-state detection and relates to a driver fatigue monitoring method integrating visual recognition and vehicle signals. The strategy automatically prioritizes among direct fatigue monitoring, indirect fatigue monitoring, and a fallback fatigue warning, yielding a judgment closer to the actual driver state. The availability of the facial feature points and vehicle signals is checked continuously, so that once a recoverable fault clears, the system automatically returns to the higher-reliability monitoring mode.
Description
Technical Field
The invention belongs to the technical field of driver-state detection and relates to a driver fatigue monitoring method integrating visual recognition and vehicle signals.
Background
A driver in a fatigued state suffers reduced judgment, slower reactions, and a higher risk of mishandling the vehicle; continuing to drive in this state can lead to traffic accidents. Research attributes 10%-25% of road collisions worldwide to fatigued driving, which occurs most readily in long-distance, monotonous driving scenarios. To help keep drivers compliant and safe, driver monitoring systems (Driver Monitoring System, DMS) have been developed.
A driver monitoring system is an active safety technology: sensing, communication, decision, and actuation devices installed on the vehicle monitor the driver and the driving environment in real time and issue prompts when fatigue, distraction, or similar behavior is confirmed. Existing methods fall into three categories: detection based on the driver's physiological signals, detection based on the driver's physiological characteristics, and detection based on the vehicle's running state.
Detection based on the driver's physiological signals judges fatigue from the deviation of indices such as EEG, EMG, pulse, and respiration from their normal values. However, monitoring these signals usually requires the driver to wear additional equipment, and the driver's emotional state can strongly bias the fatigue judgment, so this approach is of limited use in passenger-vehicle fatigue monitoring.
Detection based on the driver's physiological characteristics is a direct fatigue alarm strategy: non-contact machine vision captures physiological response characteristics such as the degree of eye opening and closing, gaze direction, mouth state, and head position. Blink amplitude, blink frequency, and average eye-closure time can all be used directly to detect fatigue; PERCLOS (Percent of Eye Closure), the proportion of time the eyes are closed within a given window, is widely regarded in the industry as the most accurate real-time fatigue index, and yawning is likewise a strong fatigue cue. These characteristics allow intuitive, real-time monitoring of the driver's state, but the result is easily affected by lighting and individual physiology.
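As a sketch of how such a PERCLOS index could be computed from a stream of per-frame eye-openness measurements (the frame rate, window length, and both thresholds below are illustrative assumptions, not values given in this patent):

```python
from collections import deque

class PerclosMonitor:
    """Rolling PERCLOS: the fraction of recent frames in which the eyes
    are considered closed (openness below a threshold)."""

    def __init__(self, fps=30, window_s=60.0,
                 closed_thresh=0.2, fatigue_thresh=0.15):
        # Ring buffer of per-frame closed/open flags over the window.
        self.window = deque(maxlen=int(fps * window_s))
        self.closed_thresh = closed_thresh    # openness below this counts as closed
        self.fatigue_thresh = fatigue_thresh  # PERCLOS above this flags fatigue

    def update(self, eye_openness):
        """eye_openness: 0.0 (fully closed) .. 1.0 (fully open)."""
        self.window.append(eye_openness < self.closed_thresh)
        return self.perclos()

    def perclos(self):
        if not self.window:
            return 0.0
        return sum(self.window) / len(self.window)

    def is_fatigued(self):
        return self.perclos() > self.fatigue_thresh
```

In practice the per-frame openness value would come from the infrared camera's eye landmarks (for example an eye-aspect-ratio measure); the bounded deque keeps memory constant and the computation cheap per frame.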
Detection based on the vehicle's running state, also called an indirect fatigue alarm strategy, infers the driver's fatigue from how the vehicle is being controlled. It relies on signals from the forward camera and on-board sensors, with judgments commonly based on lane position, steering-wheel holding torque, yaw angle, and control frequency. The result, however, is affected by road conditions, driving habits, and the analysis algorithm, so its accuracy is limited.
Disclosure of Invention
To overcome these problems, the invention provides a driver fatigue monitoring method integrating visual recognition and vehicle signals, which combines the advantages of each existing monitoring mode while compensating for their weaknesses, minimizing the misjudgments caused by any single method.
A driver fatigue monitoring method integrating visual recognition and vehicle signals comprises the following steps:
Step one: installing, in the cab, an infrared camera capable of visually monitoring and identifying the position of the driver's head, and installing, at the front end outside the vehicle body, a forward camera capable of acquiring lane-line information;
Step two: if the vehicle speed is greater than a preset activation threshold, further analyzing the driving style to judge whether the driver is in a fatigued driving state, and starting a process of self-calibrating the driver's current head-forward direction;
Step three: if the driver's facial feature points captured by the infrared camera are usable and the vehicle signals are valid, entering a direct fatigue alarm strategy;
Step four: if the facial feature points are unusable but the vehicle signals are valid, entering an indirect fatigue alarm strategy;
Step five: if the facial feature points are unusable and the vehicle signals are invalid, falling back to the traditional fatigue warning, i.e., triggering a warning reminder after a fixed driving time.
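The five steps above amount to a priority arbitration among monitoring modes. A minimal sketch of that arbitration, assuming an illustrative activation speed and assuming that the case the steps do not enumerate (face usable but signals invalid) also falls back to the timer-based reminder:

```python
from enum import Enum

class Strategy(Enum):
    OFF = "monitoring inactive"            # below the activation speed
    DIRECT = "direct fatigue alarm"        # facial features usable, signals valid
    INDIRECT = "indirect fatigue alarm"    # vehicle signals valid only
    TIMER = "fixed driving-time reminder"  # traditional fallback warning

def select_strategy(speed_kph, face_usable, signals_valid,
                    activation_kph=60.0):
    """Choose the fatigue-monitoring strategy for the current cycle.

    activation_kph is an illustrative threshold, not a value from the patent.
    """
    if speed_kph <= activation_kph:
        return Strategy.OFF
    if face_usable and signals_valid:
        return Strategy.DIRECT
    if signals_valid:
        return Strategy.INDIRECT
    return Strategy.TIMER
```

Because the inputs are re-evaluated every cycle, the arbitration climbs back to the direct strategy as soon as a recoverable fault, such as a temporarily occluded face, clears.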
The driving-style analysis in step two covers the frequency of the driver's steering inputs within the lane lines, the change of the vehicle yaw angle, and the number of lane departures; if, within a unit time, the steering-input frequency is too low and the lane-departure count is too high, the driver is recorded as being in a fatigued driving state.
"Too low" means the steering-input frequency is lower than a preset frequency; "too high" means the lane-departure count is higher than a preset count.
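The step-two rule can be sketched as a simple conjunctive test; the numeric thresholds below are illustrative assumptions (the patent leaves them as preset values):

```python
def fatigued_by_driving_style(steering_inputs_per_min, lane_departures,
                              min_steering_freq=4.0, max_departures=2):
    """Record a fatigued driving state when, over the observation window,
    steering corrections are too infrequent AND lane departures too frequent.
    Threshold values are illustrative, not from the patent."""
    too_low = steering_inputs_per_min < min_steering_freq
    too_high = lane_departures > max_departures
    return too_low and too_high
```

Note the conjunction: infrequent steering alone (as on a straight, empty highway) does not trigger the fatigue record unless lane keeping has also degraded.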
The head-forward-direction self-calibration in step two records the face orientation during driver head motions lasting longer than N seconds, and records the head direction held for more than a certain time as the forward direction; this reference is then used to judge, in real time, the angle by which the head deviates from it at each moment of subsequent driving. Longer-held orientations continue to be collected: after the initial self-calibration, any direction the driver holds longer than the previous record is marked as the new forward direction.
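The self-calibration rule can be sketched as follows, with head orientation reduced to a single yaw angle for clarity and the N-second threshold kept as a parameter (both simplifications are assumptions for illustration):

```python
class HeadForwardCalibrator:
    """Self-calibrates the driver's head-forward direction: among head
    orientations held longer than n_seconds, the longest-held one becomes
    the reference; a later, longer hold replaces it."""

    def __init__(self, n_seconds=3.0):
        self.n_seconds = n_seconds
        self.forward = None   # calibrated forward yaw angle (degrees)
        self.best_hold = 0.0  # longest qualifying hold seen so far

    def observe_hold(self, yaw_deg, held_s):
        """Report that the head stayed near yaw_deg for held_s seconds."""
        if held_s > self.n_seconds and held_s > self.best_hold:
            self.best_hold = held_s
            self.forward = yaw_deg

    def deviation(self, yaw_deg):
        """Angle by which the current head pose deviates from the
        calibrated reference, or None before the first calibration."""
        if self.forward is None:
            return None
        return abs(yaw_deg - self.forward)
```

The `deviation` value is what the direct strategy would compare against a distraction or drowsiness threshold on every frame.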
In step three, "the driver's facial feature points are usable" means that the face, eye, mouth, and nose contours of the driver can all be identified simultaneously in the facial image captured by the infrared camera.
In step three, "the vehicle signals are valid" means that all of the vehicle signals (lane line, steering-wheel angle, steering-wheel angular velocity, and vehicle yaw rate) can be acquired stably.
In step four, "the facial feature points are unusable" means that the face, eye, mouth, and nose contours cannot all be identified simultaneously in the facial image captured by the infrared camera.
In step five, "the vehicle signals are invalid" means that any one of the lane line, steering-wheel angle, steering-wheel angular velocity, or vehicle yaw rate cannot be acquired stably.
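The validity definitions above can be expressed as a freshness test over the required signal set; the staleness bound below is an illustrative assumption, since the patent only requires that the signals "can be acquired stably":

```python
REQUIRED_SIGNALS = ("lane_line", "steering_angle",
                    "steering_angular_velocity", "yaw_rate")

def vehicle_signals_valid(latest_rx, now, max_age_s=0.5):
    """Valid only if every required signal was received recently.

    latest_rx maps signal name -> timestamp of its last reception;
    max_age_s is an assumed staleness bound, not a value from the patent.
    """
    return all(sig in latest_rx and now - latest_rx[sig] <= max_age_s
               for sig in REQUIRED_SIGNALS)
```

A single stale or missing signal invalidates the whole set, matching "any one of ... cannot be stably acquired", which is what pushes the method from the direct or indirect strategy down to the timer fallback.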
The beneficial effects of the invention are as follows:
The detection method based on multi-feature information fusion combines the advantages of the different fatigue monitoring modes while compensating for their weaknesses, minimizing the misjudgments of any single method.
The strategy automatically prioritizes among direct fatigue monitoring, indirect fatigue monitoring, and the fallback fatigue warning, yielding a judgment closer to the actual driver state. The availability of the facial feature points and vehicle signals is checked continuously, so that once a recoverable fault clears, the system automatically returns to the higher-reliability monitoring mode.
Detailed Description
The present invention will be described in further detail with reference to examples. It is to be understood that the specific embodiments described herein are merely illustrative of the invention and are not limiting thereof.
In the description of the present invention, unless explicitly stated and limited otherwise, the terms "mounted," "connected," and "fixed" are to be construed broadly: a connection may be fixed, detachable, or integral; mechanical or electrical; direct or indirect through an intermediate medium; and may denote internal communication between two elements or an interaction relationship between them. The specific meaning of these terms in the present invention will be understood case by case by those of ordinary skill in the art.
In the present invention, unless expressly stated or limited otherwise, a first feature "above" or "below" a second feature may include both the first and second features being in direct contact, as well as the first and second features not being in direct contact but being in contact with each other through additional features therebetween. Moreover, a first feature being "above," "over" and "on" a second feature includes the first feature being directly above and obliquely above the second feature, or simply indicating that the first feature is higher in level than the second feature. The first feature being "under", "below" and "beneath" the second feature includes the first feature being directly under and obliquely below the second feature, or simply means that the first feature is less level than the second feature.
In the description of this embodiment, the terms "upper," "lower," "left," "right," and the like are used merely for convenience of description and simplicity of operation; they do not indicate or imply that the devices or elements in question must have a particular orientation or be constructed and operated in a particular orientation, and thus are not to be construed as limiting the invention. Furthermore, the terms "first," "second," and the like are used merely to distinguish between descriptions and are not to be understood as indicating relative importance.
Example 1
A driver fatigue monitoring method integrating visual recognition and vehicle signals comprises the following steps:
Step one: installing, in the cab, an infrared camera capable of visually monitoring and identifying the position of the driver's head, and installing, at the front end outside the vehicle body, a forward camera capable of acquiring lane-line information;
Step two: if the vehicle speed is greater than the preset activation threshold of the fatigue monitoring system, further analyzing the driving style to judge whether the driver is in a fatigued driving state, and starting a process of self-calibrating the driver's current head-forward direction;
Step three: if the driver's facial feature points captured by the infrared camera are usable and the vehicle signals are valid, entering a direct fatigue alarm strategy;
Step four: if the facial feature points are unusable but the vehicle signals are valid, entering an indirect fatigue alarm strategy;
Step five: if the facial feature points are unusable and the vehicle signals are invalid, falling back to the traditional fatigue warning, i.e., triggering a warning reminder after a fixed driving time.
The driving-style analysis in step two covers the frequency of the driver's steering inputs within the lane lines, the change of the vehicle yaw angle, and the number of lane departures, and feeds the indirect fatigue alarm strategy mentioned later: if, within a unit time, the steering-input frequency is too low and the lane-departure count is too high, the driver is recorded as being in a fatigued driving state.
"Too low" means the steering-input frequency is lower than a preset frequency; "too high" means the lane-departure count is higher than a preset count.
The head-forward-direction self-calibration in step two records the face orientation during driver head motions lasting longer than N seconds, and records the head direction held for more than a certain time as the forward direction; this reference is then used to judge, in real time, the angle by which the head deviates from it at each moment of subsequent driving. Longer-held orientations continue to be collected: after the initial self-calibration, any direction the driver holds longer than the previous record is marked as the new forward direction.
In step three, "the driver's facial feature points are usable" means that the face, eye, mouth, and nose contours of the driver can all be identified simultaneously in the facial image captured by the infrared camera.
In step three, "the vehicle signals are valid" means that all of the verified vehicle signals (lane line, steering-wheel angle, steering-wheel angular velocity, and vehicle yaw rate) can be acquired stably.
In step four, "the facial feature points are unusable" means that the face, eye, mouth, and nose contours cannot all be identified simultaneously in the facial image captured by the infrared camera.
In step five, "the vehicle signals are invalid" means that any one of the verified vehicle signals (lane line, steering-wheel angle, steering-wheel angular velocity, or vehicle yaw rate) cannot be acquired stably.
Example 2
A driver fatigue monitoring method integrating visual recognition and vehicle signals requires the vehicle to be equipped with an infrared camera capable of visually monitoring and identifying the driver's position, a forward camera, and controllers such as the electronic steering control unit.
The specific implementation logic comprises the following steps:
After the vehicle starts, the controllers power up and self-check, and the cameras are started: the infrared camera collects the target driver's current facial feature points, the forward camera collects track information such as lane lines, and the controllers collect vehicle information such as steering-wheel torque and vehicle speed.
If the vehicle speed is greater than the preset activation threshold of the fatigue monitoring system, the driving style is further analyzed and the self-calibration of the driver's current head-forward direction is started. The driving-style analysis covers lane-departure degree, steering-wheel operation frequency, yaw-angle change, and the like; the head-forward-direction self-calibration records the face orientation during actions lasting longer than N seconds and continues to collect longer-held orientations.
If the facial feature points are usable and the vehicle signals are valid, the direct fatigue alarm strategy is entered.
If the facial feature points are unusable, for example because a mask is worn incorrectly, the system checks whether the vehicle signals are usable and executes the indirect fatigue alarm strategy.
If neither is usable, the system falls back to the traditional fatigue warning, triggering a reminder after a fixed driving time.
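The implementation logic of this example, including the timer fallback, can be sketched as a per-cycle decision. Because availability is rechecked every cycle, the system automatically returns to the direct strategy when a recoverable fault clears; the activation speed and timer limit below are illustrative assumptions, not patent values:

```python
def monitoring_cycle(speed_kph, face_usable, signals_valid, driving_minutes,
                     activation_kph=60.0, timer_limit_min=240):
    """One pass of the Example 2 logic, returning the action for this cycle.

    Thresholds (activation_kph, timer_limit_min) are illustrative assumptions.
    """
    if speed_kph <= activation_kph:
        return "inactive"
    if face_usable and signals_valid:
        return "direct-strategy"
    if signals_valid:
        return "indirect-strategy"
    # Neither source usable: traditional reminder after a fixed driving time.
    if driving_minutes >= timer_limit_min:
        return "timer-alert"
    return "timer-counting"
```

Calling this every cycle with fresh camera and bus readings reproduces the fallback chain direct → indirect → timer, and the recovery path back up once the fault disappears.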
The whole judgment process relies on the in-cabin camera's recognition of facial information, the forward camera's recognition of lane lines, the bus signals of the vehicle's EPS, EMS, and BCM, and the ECU module of the fatigue monitoring system. The vehicle speed collected by the sensing module is a necessary condition for starting fatigue monitoring and judgment; the driving analysis and self-calibration flows start only after this condition is met. The detailed direct and indirect fatigue alarm strategies, which comprehensively judge the related signals sent by the EPS, BCM, and EMS systems, are not part of the technical solution of this invention.
The key point of protection of the invention is the completeness of the strategy. Specifically, a redundancy-judgment scheme is added for all fatigue-state scenarios, and different fatigue-judgment strategies can be invoked according to the actual scenario, so the result has more complete coverage. The detailed direct and indirect fatigue alarm strategies can be defined and optimized according to user requirements. The highlight is the switching among multiple fatigue-judgment strategies, rather than a single fatigue alarm strategy that fuses multiple signals into one unified output.
The preferred embodiments of the present invention have been described in detail above, but the scope of the invention is not limited to the specific details of those embodiments. Within the technical spirit of the invention, a person skilled in the art may make equivalent substitutions or changes to the technical solution and its inventive concept, and such simple modifications all fall within the scope of protection of the invention.
In addition, the specific features described in the above embodiments may be combined in any suitable manner, and in order to avoid unnecessary repetition, various possible combinations are not described further.
Moreover, any combination of the various embodiments of the invention can be made without departing from the spirit of the invention, which should also be considered as disclosed herein.
Claims (8)
1. A driver fatigue monitoring method integrating visual recognition and vehicle signals, comprising the following steps:
Step one: installing, in the cab, an infrared camera capable of visually monitoring and identifying the position of the driver's head, and installing, at the front end outside the vehicle body, a forward camera capable of acquiring lane-line information;
Step two: if the vehicle speed is greater than a preset activation threshold, further analyzing the driving style to judge whether the driver is in a fatigued driving state, and starting a process of self-calibrating the driver's current head-forward direction;
Step three: if the driver's facial feature points captured by the infrared camera are usable and the vehicle signals are valid, entering a direct fatigue alarm strategy;
Step four: if the facial feature points are unusable but the vehicle signals are valid, entering an indirect fatigue alarm strategy;
Step five: if the facial feature points are unusable and the vehicle signals are invalid, falling back to the traditional fatigue warning, i.e., triggering a warning reminder after a fixed driving time.
2. The driver fatigue monitoring method integrating visual recognition and vehicle signals according to claim 1, wherein the driving-style analysis in step two covers the frequency of the driver's steering inputs within the lane lines, the change of the vehicle yaw angle, and the number of lane departures, and wherein, if within a unit time the steering-input frequency is too low and the lane-departure count is too high, the driver is recorded as being in a fatigued driving state.
3. The driver fatigue monitoring method integrating visual recognition and vehicle signals according to claim 2, wherein "too low" means the steering-input frequency is lower than a preset frequency, and "too high" means the lane-departure count is higher than a preset count.
4. The driver fatigue monitoring method integrating visual recognition and vehicle signals according to claim 1, wherein the head-forward-direction self-calibration in step two records the face orientation during driver head motions lasting longer than N seconds, records the head direction held for more than a certain time as the forward direction, and uses this reference to judge in real time the angle by which the head deviates from it at each moment of subsequent driving; longer-held orientations continue to be collected, i.e., after the initial self-calibration, any direction the driver holds longer than the previous record is marked as the new forward direction.
5. The driver fatigue monitoring method integrating visual recognition and vehicle signals according to claim 1, wherein "the driver's facial feature points are usable" in step three means that the face, eye, mouth, and nose contours of the driver can all be identified simultaneously in the facial image captured by the infrared camera.
6. The driver fatigue monitoring method integrating visual recognition and vehicle signals according to claim 1, wherein "the vehicle signals are valid" in step three means that all of the vehicle signals (lane line, steering-wheel angle, steering-wheel angular velocity, and vehicle yaw rate) can be acquired stably.
7. The driver fatigue monitoring method integrating visual recognition and vehicle signals according to claim 1, wherein "the facial feature points are unusable" in step four means that the face, eye, mouth, and nose contours cannot all be identified simultaneously in the facial image captured by the infrared camera.
8. The driver fatigue monitoring method integrating visual recognition and vehicle signals according to claim 1, wherein "the vehicle signals are invalid" in step five means that any one of the lane line, steering-wheel angle, steering-wheel angular velocity, or vehicle yaw rate cannot be acquired stably.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202310274925.4A CN116513199A (en) | 2023-03-21 | 2023-03-21 | Driver fatigue monitoring method integrating visual recognition and vehicle signals |
Publications (1)
Publication Number | Publication Date |
---|---|
CN116513199A true CN116513199A (en) | 2023-08-01 |
Family
ID=87394771
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202310274925.4A Pending CN116513199A (en) | 2023-03-21 | 2023-03-21 | Driver fatigue monitoring method integrating visual recognition and vehicle signals |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN116513199A (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN117261930A (en) * | 2023-11-21 | 2023-12-22 | 北京大也智慧数据科技服务有限公司 | Early warning method and device for fatigue driving |
CN117261930B (en) * | 2023-11-21 | 2024-04-19 | 北京大也智慧数据科技服务有限公司 | Early warning method and device for fatigue driving |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
CN107640152B (en) | Reliability control device and method for a lane keeping assist system | |
CN107832748B (en) | Shared automobile driver replacement system and method | |
CN105197011B (en) | Driver hazard index management system and method | |
CN107697069A (en) | Intelligent control method for fatigued driving of an automobile | |
CN112289003B (en) | Method for monitoring end-of-driving behavior of fatigue driving and active safety driving monitoring system | |
CN110472556B (en) | Monocular vision-based driver attention state analysis system and analysis method | |
Feng et al. | An on-board system for detecting driver drowsiness based on multi-sensor data fusion using Dempster-Shafer theory | |
CN107323338A (en) | Vehicle turn light control system, control method and vehicle | |
CN105303830A (en) | Driving behavior analysis system and analysis method | |
CN102985302A (en) | Vehicle emergency escape device | |
CN106571015A (en) | Driving behavior data collection method based on Internet | |
CN110203202A (en) | Lane-change assistance early-warning method and device based on driver intention recognition | |
US20200070848A1 (en) | Method and System for Initiating Autonomous Drive of a Vehicle | |
CN106467057B (en) | Lane departure warning method, apparatus and system | |
CN110422169A (en) | Vehicle control method, system and device | |
CN116513199A (en) | Driver fatigue monitoring method integrating visual recognition and vehicle signals | |
CN108021875A (en) | Personalized driver fatigue monitoring and early-warning method | |
CN104691333B (en) | Coach driver fatigue state evaluation method based on lane departure frequency | |
CN108583430A (en) | Control system and judgment method for detecting and reminding of driver distraction | |
CN111645694B (en) | Driver driving state monitoring system and method based on attitude estimation | |
CN110126730A (en) | Vehicle lane change based reminding method and system | |
CN114872713A (en) | Device and method for monitoring abnormal driving state of driver | |
CN106157537A (en) | Anti-fatigue-driving system | |
CN105599607A (en) | Device and method for starting vehicle based on driver state | |
KR20180059224A (en) | Apparatus for controlling stop of vehicle and method thereof |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PB01 | Publication | ||
SE01 | Entry into force of request for substantive examination | ||