CN112417921A - Driver condition detection system - Google Patents

Driver condition detection system

Info

Publication number
CN112417921A
CN112417921A (application CN201910767813.6A)
Authority
CN
China
Prior art keywords
driver
face
condition detection
hidden
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910767813.6A
Other languages
Chinese (zh)
Inventor
杨海琴
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to CN201910767813.6A priority Critical patent/CN112417921A/en
Publication of CN112417921A publication Critical patent/CN112417921A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V20/00 Scenes; Scene-specific elements
    • G06V20/50 Context or environment of the image
    • G06V20/59 Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
    • G06V20/597 Recognising the driver's state or behaviour, e.g. attention or drowsiness

Abstract

The driver condition detection system includes a driver monitor camera that captures the face of the driver of a vehicle and generates a face image of the driver, and a driver condition detection section configured to detect the condition of the driver based on the face image. If a facial part of the driver is hidden in the face image, the driver condition detection section is configured to detect the condition of the driver based on the facial parts of the driver that are not hidden in the face image. The facial parts of the driver are the driver's mouth, nose, right eye, and left eye.

Description

Driver condition detection system
Technical Field
The present invention relates to a driver condition detection system.
Background
Known in the past is an apparatus that detects the condition of a vehicle driver using an image capturing device provided in the vehicle. For example, Japanese Patent Publication No. 2016-057839A describes calculating the driver's face direction and the angular velocity of the face direction from an image of the driver captured by the image capturing device, and estimating the face direction from the angular velocity of the face direction if the driver's face direction is not within the image capturing range of the image capturing device.
Disclosure of Invention
Technical problem
However, even if the direction of the driver's face is within the image capturing range of the image capturing device, a facial part of the driver may be hidden in the image of the driver's face, and the accuracy of detecting the driver's condition may be degraded. For example, if the steering wheel is located between the image capturing device and the driver, the driver's face will be hidden behind the spokes or the like of the steering wheel as the amount of rotation of the steering wheel becomes greater. Further, the face of the driver may be hidden by the hand, arm, or the like of the driver.
It is therefore an object of the present invention to provide a driver condition detection system capable of suppressing a decrease in the accuracy of driver condition detection due to a facial part of the driver being hidden in the face image of the driver.
Solution to the problem
The present invention is summarized as follows.
(1) A driver condition detection system comprising: a driver monitor camera that captures the face of a driver of a vehicle and generates a face image of the driver; and a driver condition detection section configured to detect the condition of the driver based on the face image; wherein, if a facial part of the driver is hidden in the face image, the driver condition detection section is configured to detect the condition of the driver based on the facial parts of the driver that are not hidden in the face image, the facial parts of the driver being the mouth, nose, right eye, and left eye.
(2) The driver condition detection system described in (1) above, wherein the condition of the driver is a face direction of the driver.
(3) The driver condition detection system described in (2) above, wherein, if a facial part of the driver is hidden in the face image, the driver condition detection section is configured to detect the direction of the driver's face based on the facial parts of the driver that are not hidden in the face image and the face image from before the facial part of the driver became hidden.
(4) The driver condition detection system described in (3) above, wherein the driver condition detection section is configured to determine that the direction of the driver's face has not changed from before the facial part of the driver became hidden if the amount of change in the state of the facial parts of the driver that are not hidden in the face image is less than or equal to a threshold value, and not to detect the direction of the driver's face if the amount of change is greater than the threshold value.
(5) The driver condition detection system described in (3) above, wherein the driver condition detection section is configured to determine that the direction of the driver's face has not changed from before the facial part of the driver became hidden if the amounts of change in the position and size of the facial parts of the driver that are not hidden in the face image are less than or equal to threshold values, and not to detect the direction of the driver's face if the amount of change of at least one of the position and the size is greater than its threshold value.
(6) The driver condition detection system described in any one of (2) to (5) above, wherein the driver condition detection section is configured to detect the direction of the driver's face by matching the face image with face shape data.
(7) The driver condition detection system described in (6) above, wherein the driver condition detection section is configured to detect the direction of the driver's face by matching the facial parts in the face image with the facial parts in the face shape data.
(8) The driver condition detection system described in (7) above, wherein, if a facial part of the driver is hidden in the face image, the driver condition detection section is configured not to perform matching between the facial part of the driver hidden in the face image and the corresponding facial part in the face shape data.
Advantageous effects
According to the present invention, it is possible to suppress a decrease in the accuracy of driver condition detection due to a facial part of the driver being hidden in the face image of the driver.
Drawings
Fig. 1 is a block diagram showing the configuration of a driver condition detection system according to the first embodiment of the present invention.
Fig. 2 is a view schematically showing the interior of a vehicle in which a driver condition detection system is installed.
Fig. 3 is a flowchart showing a control routine of a process for detecting the condition of the driver in the first embodiment.
Fig. 4 is a view schematically showing face shape data when the direction of the driver's face is 0 °.
Fig. 5 is a view schematically showing face shape data when the direction of the face of the driver is 10 °.
Fig. 6 is a view showing the driver's face image and the result of matching the face image with the face shape data whose face direction is 10°.
Fig. 7 is a flowchart showing a control routine of a process for detecting a driver condition in the second embodiment.
Description of the embodiments
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings. Note that in the following explanation, similar component elements are given the same reference numerals.
First embodiment
Next, with reference to fig. 1 to 6, a first embodiment of the present invention will be explained. Fig. 1 is a block diagram showing the configuration of the driver condition detection system according to the first embodiment of the present invention. The driver condition detection system 1 is mounted on a vehicle and detects the condition of the driver of the vehicle. The driver condition detection system 1 is provided with a driver monitor camera 10 and an electronic control unit (ECU) 20.
Fig. 2 is a view schematically showing the interior of a vehicle in which the driver condition detection system is installed. The driver monitor camera 10 captures the face of the driver of the vehicle 80 to generate a face image of the driver. The driver monitor camera 10 is installed in the vehicle. Specifically, as shown in fig. 2, the driver monitor camera 10 is mounted on the top of a steering column 81 of the vehicle 80. In fig. 2, the image capturing range of the driver monitor camera 10 is shown by a broken line. Note that the driver monitor camera 10 may be mounted at an instrument panel, an instrument panel cover, or the like of the vehicle 80.
The driver monitor camera 10 is composed of a camera and a projector. For example, the camera is a CMOS (complementary metal oxide semiconductor) camera or a CCD (charge coupled device) camera, and the projector is an LED (light emitting diode). Further, so that the face of the driver can be captured without making the driver feel uncomfortable even at low-luminance times such as at night, the projector may be a near-infrared LED. For example, the projector consists of two near-infrared LEDs arranged on both sides of the camera. In addition, the camera may be equipped with a filter such as a visible-light cut filter. The driver face image generated by the driver monitor camera 10 is transmitted from the driver monitor camera 10 to the ECU 20.
The electronic control unit (ECU) 20 is a microcomputer in which components such as a central processing unit (CPU), read only memory (ROM), random access memory (RAM), an input port, and an output port are interconnected by a bidirectional bus. In the present embodiment, one ECU 20 is provided, but a plurality of ECUs may be provided for different functions. The ECU 20 includes a driver condition detection section 21 that detects the condition of the driver based on the face image of the driver generated by the driver monitor camera 10.
As shown in fig. 2, the steering wheel 82 is located between the driver monitor camera 10 and the driver. In this case, the driver monitor camera 10 captures the driver through the steering wheel 82. Therefore, when the amount of rotation of the steering wheel 82 (i.e., the steering angle of the steering wheel 82) becomes large, the spoke portions and the like of the steering wheel 82 will obstruct the capture of the driver's face.
Specifically, if the steering wheel 82 is turned rightward (clockwise) as viewed from the driver's side, the driver's face will be hidden by the steering wheel 82 in the order of mouth, nose, left eye, and right eye as the amount of turning increases. On the other hand, if the steering wheel 82 is turned leftward (counterclockwise) as viewed from the driver's side, the driver's face will be hidden by the steering wheel 82 in the order of mouth, nose, right eye, and left eye as the amount of turning increases.
Further, the driver's hand, arm, or the like may hide a facial part of the driver. Therefore, even if the direction of the driver's face is within the image capturing range of the driver monitor camera 10, sometimes a facial part of the driver is hidden in the image of the driver's face, thereby reducing the accuracy of driver condition detection.
Therefore, in the present embodiment, when a facial part of the driver (the mouth, nose, right eye, or left eye) is hidden in the face image of the driver, the driver condition detection section 21 detects the condition of the driver from the facial parts of the driver that are not hidden in the face image. Therefore, even if a facial part of the driver is hidden in the face image of the driver, the condition of the driver can be detected, and it is possible to suppress a decrease in driver condition detection accuracy caused by a facial part of the driver being hidden in the face image of the driver.
Next, referring to the flowchart of fig. 3, the control performed by the driver condition detection system 1 will be described in detail. Fig. 3 is a flowchart showing a control routine of the process for detecting the condition of the driver in the first embodiment. The present control routine is repeatedly executed by the ECU 20. In the present control routine, the condition of the driver of the vehicle 80 is detected.
First, in step S101, the driver condition detection section 21 acquires the face image of the driver generated by the driver monitor camera 10.
Next, in step S102, the driver condition detection section 21 detects the facial parts (mouth, nose, right eye, and left eye) of the driver from the face image of the driver acquired in step S101. For example, the driver condition detection section 21 identifies the face region in the face image of the driver and detects the mouth, nose, right eye, and left eye by extracting their feature points. If the feature points of a facial part cannot be extracted, that facial part is not detected.
Next, in step S103, the driver condition detection section 21 determines whether a part of the facial parts of the driver is hidden in the face image of the driver. For example, if a facial part of the driver could not be detected in step S102, the driver condition detection section 21 determines that that facial part is hidden.
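For illustration, the following minimal Python sketch shows one way steps S102 and S103 could be organized in software. The feature-point extractor extract_landmarks and the data structures are assumptions made for this sketch; the description above does not fix a particular extraction algorithm.

    from dataclasses import dataclass
    from typing import Dict, List, Optional, Tuple

    FACIAL_PARTS = ("mouth", "nose", "right_eye", "left_eye")
    Point = Tuple[float, float]

    @dataclass
    class FacialPart:
        name: str
        points: List[Point]  # extracted feature points in image coordinates

    def extract_landmarks(face_image) -> Dict[str, List[Point]]:
        """Hypothetical feature-point extractor (e.g. a trained landmark model).
        A facial part whose feature points cannot be extracted is absent from the result."""
        raise NotImplementedError  # placeholder; no specific algorithm is prescribed

    def detect_facial_parts(face_image) -> Dict[str, Optional[FacialPart]]:
        """Step S102: detect the mouth, nose, right eye, and left eye from the face image."""
        landmarks = extract_landmarks(face_image)
        return {name: FacialPart(name, landmarks[name]) if name in landmarks else None
                for name in FACIAL_PARTS}

    def hidden_parts(parts: Dict[str, Optional[FacialPart]]) -> List[str]:
        """Step S103: a facial part whose feature points were not extracted is treated as hidden."""
        return [name for name, part in parts.items() if part is None]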
If it is determined in step S103 that a part of the facial parts is not hidden, the control routine proceeds to step S104. In step S104, the driver condition detection section 21 determines whether all the facial parts of the driver are hidden in the face image of the driver. For example, if none of the driver's mouth, nose, right eye, and left eye could be detected, the driver condition detection section 21 determines that all the facial parts of the driver are hidden.
If it is determined in step S104 that not all of the facial parts are hidden, the control routine proceeds to step S105. In step S105, the driver condition detection section 21 detects the condition of the driver. Specifically, the driver condition detection section 21 detects the eye-opening degree and the face direction of the driver. For example, the driver condition detection section 21 extracts feature points of the upper and lower eyelids of the right and left eyes to detect the opening degree of both eyes.
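As one concrete (and assumed) form of the eye-opening calculation described above, the mean vertical gap between corresponding upper- and lower-eyelid feature points can be normalized by the eye width; the description only states that eyelid feature points are used, so the normalization is an illustrative choice.

    import math
    from typing import Sequence, Tuple

    Point = Tuple[float, float]

    def eye_opening_degree(upper_eyelid: Sequence[Point],
                           lower_eyelid: Sequence[Point],
                           eye_corners: Tuple[Point, Point]) -> float:
        """Dimensionless eye-opening degree from eyelid feature points.
        Dividing by the eye width keeps the value independent of how close the
        driver sits to the camera (an assumption of this sketch)."""
        gaps = [math.dist(u, l) for u, l in zip(upper_eyelid, lower_eyelid)]
        width = math.dist(eye_corners[0], eye_corners[1])
        if not gaps or width <= 0.0:
            return 0.0
        return (sum(gaps) / len(gaps)) / width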
Further, the driver condition detection section 21 detects the direction of the driver's face by, for example, the following method. The driver condition detection section 21 detects the direction of the driver's face by matching the face image of the driver with face shape data. For example, the face shape data is 3D face shape data. The driver condition detection section 21 stores in advance the face shape data for when the driver faces forward, that is, when the driver's face direction is 0°. The face shape data may be face shape data of a standard person, or may be acquired for each driver. For example, the face shape data is stored in the ROM or RAM of the ECU 20. Further, the driver condition detection section 21 generates face shape data for when the direction of the driver's face is changed, based on the face shape data for when the driver faces forward.
Fig. 4 is a view schematically showing the face shape data when the direction of the driver's face is 0°. Fig. 5 is a view schematically showing the face shape data when the direction of the driver's face is 10°. The driver condition detection section 21 matches the facial parts in the driver's face image with the facial parts in the face shape data, and detects, as the driver's face direction, the face direction of the face shape data at which the matching rate of the two is maximum. Note that the driver condition detection section 21 may instead detect the direction of the driver's face from the positional relationship between the face region and the facial parts (mouth, nose, right eye, and left eye).
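A sketch of the matching-based face-direction detection described above, assuming a fixed set of candidate face-direction angles and a simple point-distance matching rate; neither is specified in the description, and project_parts (rotating and projecting the 3D face shape data) is left as a placeholder.

    import math
    from typing import Dict, List, Tuple

    Point = Tuple[float, float]
    CANDIDATE_ANGLES = range(-60, 61, 5)  # candidate face directions in degrees (assumed)

    def project_parts(face_shape_data, angle_deg: float) -> Dict[str, List[Point]]:
        """Generate 2D feature points of each facial part from the stored 3D face
        shape data (frontal, 0 degrees) rotated to the candidate face direction."""
        raise NotImplementedError  # geometry omitted here; placeholder

    def matching_rate(image_points: List[Point], model_points: List[Point],
                      tol: float = 5.0) -> float:
        """Fraction of feature points whose model position lies within tol pixels
        of the detected position (one simple matching metric; an assumption)."""
        if not image_points or not model_points:
            return 0.0
        hits = sum(1 for p, q in zip(image_points, model_points) if math.dist(p, q) <= tol)
        return hits / min(len(image_points), len(model_points))

    def detect_face_direction(detected: Dict[str, List[Point]], face_shape_data) -> float:
        """Return the candidate angle whose face shape data best matches the face image."""
        best_angle, best_rate = 0.0, -1.0
        for angle in CANDIDATE_ANGLES:
            model = project_parts(face_shape_data, angle)
            rate = sum(matching_rate(detected[n], model[n]) for n in detected) / len(detected)
            if rate > best_rate:
                best_angle, best_rate = float(angle), rate
        return best_angle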
After step S105, the current control routine ends. On the other hand, if it is determined in step S104 that all the facial parts are hidden, the current control routine also ends. In this case, the driver condition detection section 21 does not detect the condition of the driver.
Further, if it is determined in step S103 that a part of the facial parts is hidden, the control routine proceeds to step S106. In step S106, the driver condition detection section 21 detects the condition of the driver based on the facial parts of the driver that are not hidden in the face image of the driver. For example, if the driver's mouth, nose, and right eye are hidden, the driver condition detection section 21 detects the eye-opening degree and face direction of the driver from the driver's left eye. Specifically, the driver condition detection section 21 extracts the feature points of the upper and lower eyelids of the left eye and detects the opening degree of the left eye. Further, the driver condition detection section 21 matches the left eye in the driver's face image with the left eye in the face shape data, and detects, as the driver's face direction, the face direction of the face shape data at which the matching rate of the two is maximum. Note that, if both eyes of the driver are hidden, the driver condition detection section 21 does not detect the opening degree of the driver's eyes.
Fig. 6 shows the driver's face image and the result of matching the face image with the face shape data whose face direction is 10°. In the face image shown in fig. 6, the mouth, nose, and right eye of the driver are hidden by the steering wheel 82. In this example, the matching rate of the left eye is maximum when the face direction of the face shape data is 10°. Therefore, the driver's face direction is calculated to be 10°. On the other hand, the matching rates of the mouth, nose, and right eye are less than 50%. However, since the mouth, nose, and right eye are not used to detect the driver's face direction, the driver's face direction can still be detected. Note that, although the matching rates of the mouth, nose, and right eye hidden by the steering wheel 82 are calculated in this example, these matching rates need not be calculated. That is, if a facial part of the driver is hidden in the face image of the driver, the driver condition detection section 21 need not match the hidden facial part with the corresponding facial part in the face shape data. In this way, the processing load of the driver condition detection section 21 can be reduced.
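Continuing the previous sketch (and reusing its CANDIDATE_ANGLES, project_parts, and matching_rate helpers), the matching loop can be restricted to the facial parts that are not hidden, as in the fig. 6 example; skipping the hidden parts both avoids meaningless matching rates and reduces the processing load.

    from typing import Dict, List, Set, Tuple

    Point = Tuple[float, float]

    def detect_face_direction_partial(detected: Dict[str, List[Point]],
                                      face_shape_data,
                                      hidden: Set[str]) -> float:
        """Detect the face direction using only the facial parts that are not hidden.
        The caller guarantees that at least one part is visible (the routine ends
        earlier when all parts are hidden)."""
        visible = [name for name in detected if name not in hidden]
        best_angle, best_rate = 0.0, -1.0
        for angle in CANDIDATE_ANGLES:
            model = project_parts(face_shape_data, angle)
            rate = sum(matching_rate(detected[n], model[n]) for n in visible) / len(visible)
            if rate > best_rate:
                best_angle, best_rate = float(angle), rate
        return best_angle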
After step S106, the current control routine ends. Note that, in step S103, the driver condition detection section 21 may match the face image of the driver with the face shape data for each face-direction angle and determine that a facial part is hidden if the matching rate of that facial part is smaller than a predetermined value for all angles of the face direction. For example, the predetermined value is 50%.
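The alternative hidden-part determination mentioned in the note above could look as follows, again reusing the helpers from the earlier matching sketch; the 50% threshold is the predetermined value given in the text.

    HIDDEN_RATE_THRESHOLD = 0.5  # "predetermined value" of 50% from the text

    def part_is_hidden(part_name, detected, face_shape_data) -> bool:
        """A facial part is judged hidden when its matching rate stays below the
        threshold for every candidate face-direction angle."""
        for angle in CANDIDATE_ANGLES:
            model = project_parts(face_shape_data, angle)
            if matching_rate(detected.get(part_name, []), model[part_name]) >= HIDDEN_RATE_THRESHOLD:
                return False
        return True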
Further, the driver condition detection section 21 may determine that the driver is dozing off and issue a warning to the driver if the detected eye-opening degree of the driver is less than or equal to a predetermined value. Also, if the detected direction of the driver's face is outside a predetermined range, the driver condition detection section 21 may determine that the driver is looking aside and issue a warning to the driver. For example, the driver condition detection section 21 may issue a visual or audible warning to the driver through a human-machine interface (HMI) provided on the vehicle 80. The human-machine interface is an interface for inputting and outputting information between the driver and the vehicle 80. For example, the human-machine interface includes a display that displays text or image information, a speaker that generates sound, operation buttons with which the driver performs input operations, a touch panel, a microphone, and the like.
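A sketch of the warning logic just described; the threshold values and the hmi_warn stub are assumptions for illustration, since only a visual or audible warning through the HMI is mentioned.

    from typing import Optional

    EYE_OPENING_THRESHOLD = 0.2            # drowsiness threshold (assumed value)
    FACE_DIRECTION_RANGE = (-30.0, 30.0)   # degrees treated as facing forward (assumed)

    def hmi_warn(message: str) -> None:
        """Placeholder for the vehicle HMI (e.g. show text on the display, play a sound)."""
        print(f"[HMI WARNING] {message}")

    def check_driver_condition(eye_opening: Optional[float],
                               face_direction: Optional[float]) -> None:
        """Issue warnings based on the detected driver condition (None = not detected)."""
        if eye_opening is not None and eye_opening <= EYE_OPENING_THRESHOLD:
            hmi_warn("Driver appears to be dozing off.")
        if face_direction is not None and not (
                FACE_DIRECTION_RANGE[0] <= face_direction <= FACE_DIRECTION_RANGE[1]):
            hmi_warn("Driver appears to be looking aside.")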
Second embodiment
The configuration and control of the driver condition detection system according to the second embodiment are substantially similar to those of the driver condition detection system according to the first embodiment except for the points explained below. For this reason, the following will focus on the portions of the second embodiment of the present invention different from the first embodiment.
For example, if many facial parts are hidden in the face image of the driver, it is difficult to detect a slight change in the direction of the driver's face from the non-hidden facial parts alone. On the other hand, the driver's face direction could be detected accurately before the facial part became hidden. Therefore, in the second embodiment, if a facial part of the driver is hidden in the face image of the driver, the driver condition detection section 21 detects the direction of the driver's face based on the facial parts of the driver that are not hidden in the face image and the face image from before the facial part of the driver became hidden.
If the direction of the driver's face does not change after a facial part of the driver becomes hidden, the position and size of the facial parts that are not hidden in the face image of the driver do not change much. Therefore, if the amount of change in the state of the facial parts of the driver that are not hidden in the face image is less than or equal to a threshold value, the driver condition detection section 21 determines that the direction of the driver's face has not changed from before the facial part of the driver became hidden. On the other hand, if the amount of change is larger than the threshold value, the driver condition detection section 21 does not detect the direction of the driver's face. The state of a facial part is, for example, the position or size of the facial part. Due to the above control, it is possible to further suppress a decrease in the detection accuracy of the driver's face direction caused by a facial part of the driver being hidden in the face image of the driver.
Note that the state of a facial part of the driver may be both the position and the size of the facial part. In this case, if the amounts of change in the position and size of the facial parts of the driver that are not hidden in the face image are less than or equal to the threshold values, the driver condition detection section 21 determines that the direction of the driver's face has not changed from before the facial part of the driver became hidden. On the other hand, if the amount of change of at least one of the position and the size of the facial parts of the driver that are not hidden in the face image is larger than its threshold value, the driver condition detection section 21 does not detect the direction of the driver's face.
Fig. 7 is a flowchart showing a control routine of the process for detecting the condition of the driver in the second embodiment. The present control routine is repeatedly executed by the ECU 20. In the present control routine, the condition of the driver of the vehicle 80 is detected. Steps S201 to S205 are similar to steps S101 to S105 in fig. 3, and therefore the description thereof is omitted.
If it is determined in step S203 that a part of the facial parts is hidden, the control routine proceeds to step S206. In step S206, the driver condition detection section 21 determines whether the amount of change pv in the position of the facial parts of the driver that are not hidden in the face image of the driver is less than or equal to a threshold value pth. The threshold value pth of each facial part is set in advance by experiments or calculations.
If it is determined in step S206 that the amount of change pv is less than or equal to the threshold value pth, the control routine proceeds to step S207. In step S207, the driver condition detection section 21 determines whether the amount of change sv in the size of the facial parts of the driver that are not hidden in the face image of the driver is less than or equal to a threshold value sth. The threshold value sth of each facial part is set in advance by experiments or calculations.
If it is determined in step S207 that the amount of change sv is less than or equal to the threshold value sth, the control routine proceeds to step S208. In step S208, the driver condition detection section 21 detects the condition of the driver. Specifically, the driver condition detection section 21 detects, as the driver's current face direction, the face direction of the driver detected before the facial part of the driver became hidden. That is, the driver condition detection section 21 detects the direction of the driver's face last detected in step S205 as the driver's current face direction. In other words, the driver condition detection section 21 determines that the direction of the driver's face has not changed from before the facial part of the driver became hidden. Further, if at least one eye of the driver is not hidden, the driver condition detection section 21 detects the opening degree of the driver's eyes based on the non-hidden eye. After step S208, the current control routine ends.
On the other hand, if it is determined in step S206 that the amount of change pv is larger than the threshold value pth, or if it is determined in step S207 that the amount of change sv is larger than the threshold value sth, the current control routine ends. Note that, if at least one eye of the driver is not hidden, the driver condition detection section 21 may detect the opening degree of the driver's eyes based on the non-hidden eye after step S206 or after step S207. Further, in the present control routine, step S206 or step S207 may be omitted.
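A sketch of steps S206 to S208: if every non-hidden facial part has changed in position by at most pth and in size by at most sth since before the part became hidden, the previously detected face direction is reused; otherwise no face direction is detected in this cycle. The per-part threshold values and the part-state representation are assumptions for illustration.

    import math
    from dataclasses import dataclass
    from typing import Dict, Optional

    @dataclass
    class PartState:
        x: float      # position of the facial part in the face image
        y: float
        size: float   # e.g. area of the facial part's bounding box

    PTH = {"mouth": 10.0, "nose": 8.0, "right_eye": 6.0, "left_eye": 6.0}     # position thresholds (assumed)
    STH = {"mouth": 30.0, "nose": 20.0, "right_eye": 15.0, "left_eye": 15.0}  # size thresholds (assumed)

    def face_direction_when_partially_hidden(
            current: Dict[str, PartState],        # states of the non-hidden parts now
            before_hiding: Dict[str, PartState],  # states of the same parts before hiding
            last_face_direction: float) -> Optional[float]:
        """Steps S206 to S208: reuse the last detected face direction only if the
        non-hidden parts have barely moved or changed size; otherwise return None
        (the face direction is not detected in this cycle)."""
        for name, state in current.items():
            prev = before_hiding[name]
            pv = math.hypot(state.x - prev.x, state.y - prev.y)  # change in position
            sv = abs(state.size - prev.size)                     # change in size
            if pv > PTH[name] or sv > STH[name]:
                return None
        return last_face_direction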

Claims (10)

1. A driver condition detection system comprising: a driver monitor camera that captures the face of a driver of a vehicle and generates a face image of the driver; and a driver condition detection section configured to detect a condition of the driver based on the face image, wherein, if a facial part of the driver is hidden in the face image, the driver condition detection section is configured to detect the condition of the driver based on the facial parts of the driver that are not hidden in the face image, the facial parts of the driver being the mouth, nose, right eye, and left eye of the driver.
2. The driver condition detection system according to claim 1, wherein the condition of the driver is a face direction of the driver.
3. The driver condition detection system according to claim 2, wherein, if a facial part of the driver is hidden in the face image, the driver condition detection section is configured to detect the direction of the driver's face based on the facial parts of the driver that are not hidden in the face image and the face image from before the facial part of the driver became hidden.
4. The driver condition detection system according to claim 3, wherein the driver condition detection section is configured to determine that the direction of the driver's face has not changed from before the facial part of the driver became hidden if the amount of change in the state of the facial parts of the driver that are not hidden in the face image is less than or equal to a threshold value, and not to detect the direction of the driver's face if the amount of change is greater than the threshold value.
5. The driver condition detection system according to claim 3, wherein the driver condition detection section is configured to determine that the direction of the driver's face has not changed from before the facial part of the driver became hidden if the amounts of change in the position and size of the facial parts of the driver that are not hidden in the face image are less than or equal to threshold values, and not to detect the direction of the driver's face if the amount of change of at least one of the position and the size is larger than its threshold value.
6. The driver condition detection system according to claim 2, wherein the driver condition detection section is configured to detect the direction of the driver's face by matching the face image with face shape data.
7. The driver condition detection system according to claim 3, wherein the driver condition detection section is configured to detect the direction of the driver's face by matching the face image with face shape data.
8. The driver condition detection system according to claim 4, wherein the driver condition detection section is configured to detect the direction of the driver's face by matching the face image with face shape data.
9. The driver condition detection system according to claim 5, wherein the driver condition detection section is configured to detect the direction of the driver's face by matching the face image with face shape data.
10. The driver condition detection system according to claim 6, wherein the driver condition detection section is configured to detect the direction of the driver's face by matching the facial parts in the face image with the facial parts in the face shape data.
CN201910767813.6A 2019-08-20 2019-08-20 Driver condition detection system Pending CN112417921A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910767813.6A CN112417921A (en) 2019-08-20 2019-08-20 Driver condition detection system

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910767813.6A CN112417921A (en) 2019-08-20 2019-08-20 Driver condition detection system

Publications (1)

Publication Number Publication Date
CN112417921A true CN112417921A (en) 2021-02-26

Family

ID=74779991

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910767813.6A Pending CN112417921A (en) 2019-08-20 2019-08-20 Driver condition detection system

Country Status (1)

Country Link
CN (1) CN112417921A (en)

Similar Documents

Publication Publication Date Title
US10970572B2 (en) Driver condition detection system
US10800424B2 (en) Driver condition detection system
US10604160B2 (en) Driver condition detection system
JP6497158B2 (en) Display device, moving body
US20180232588A1 (en) Driver state monitoring device
US20180204079A1 (en) Control system of vehicle
JP7047821B2 (en) Driving support device
JP6870660B2 (en) Driver monitoring device
WO2014034065A1 (en) Moving body warning device and moving body warning method
JP2020052827A (en) Occupant modeling device, occupant modeling method, and occupant modeling program
JP2016115117A (en) Determination device and determination method
US20220309808A1 (en) Driver monitoring device, driver monitoring method, and driver monitoring-use computer program
KR20210120398A (en) Electronic device displaying image by using camera monitoring system and the method for operation the same
US10525981B2 (en) Driver condition detection system
JP2016115120A (en) Opened/closed eye determination device and opened/closed eye determination method
KR20110062651A (en) Lane departure warning system
JP6572538B2 (en) Downward view determination device and downward view determination method
CN112417921A (en) Driver condition detection system
JP6962141B2 (en) Driver status detector
JP7046748B2 (en) Driver status determination device and driver status determination method
JP2008162550A (en) External environment display device
JP6852407B2 (en) Driver status detector
JP2016115119A (en) Eye opening/closing determination device and eye opening/closing determination method
WO2013114871A1 (en) Driving assistance device and driving assistance method
JP2021152729A (en) Yawn detection device

Legal Events

Date Code Title Description
PB01 Publication
WD01 Invention patent application deemed withdrawn after publication (application publication date: 20210226)