JP2012018622A - Driving-support device and method - Google Patents


Info

Publication number
JP2012018622A
Authority
JP
Japan
Prior art keywords
area
movement
information
detection
motion
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2010156952A
Other languages
Japanese (ja)
Other versions
JP5587068B2 (en)
Inventor
Khiat Abdelaziz
Takahiro Watanabe
Kazuma Yamamoto
Original Assignee
Nissan Motor Co Ltd
Oki Electric Ind Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nissan Motor Co Ltd and Oki Electric Ind Co Ltd
Priority to JP2010156952A
Publication of JP2012018622A
Application granted
Publication of JP5587068B2
Legal status: Active
Anticipated expiration

Abstract

PROBLEM TO BE SOLVED: To provide a driving support device and method capable of suppressing false detection of a driver's hand movement.

SOLUTION: A face detection/face feature tracking unit 11 detects a face area, and a light environment determination unit 12 determines the lighting environment around the face area. A motion detection/contrast countermeasure unit 13 detects motion information from a motion detection area around the driver's face in video taken by a camera 1 and thereby detects a hand area. If the brightness changes rapidly, a light change detection unit 14 suspends determination of a hand movement event. A motion direction determination/event detection unit 15 classifies the motion information into first motion direction information (from outside to inside with respect to the periphery of the driver's face) and second motion direction information (from inside to outside). If the second motion direction information is detected after the first motion direction information, it is determined that a hand movement event of the driver has occurred.

Description

  The present invention relates to a driving support apparatus and method for presenting information for supporting driving of a driver.

  Conventionally, a technique for detecting the movement of a driver's hand, as in Patent Document 1 below, is known. In Patent Document 1, the movement of the driver's hand is detected for menu selection and for changing how a head-up display is seen.

JP 2009-104297 A

  However, the technique of Patent Document 1 may erroneously detect the movement of the driver's hand. For example, when the driver's face moves or when another occupant moves, such movement may be indistinguishable from movement of the driver's hand and be erroneously detected as a hand movement.

  Therefore, the present invention has been proposed in view of the above-described circumstances, and an object thereof is to provide a driving support apparatus and method that can suppress erroneous detection of a driver's hand movement.

  The present invention detects motion information from a motion detection region around the driver's face, classifies the motion information into first movement direction information (from the outside to the inside with respect to the periphery of the driver's face) and second movement direction information (from the inside to the outside), and determines that a movement event of the driver's hand has occurred when the second movement direction information is detected after the first movement direction information is detected.

  According to the present invention, a hand movement event is determined to have occurred only when the first movement direction information (from the outside to the inside) is detected for the face area and then the second movement direction information (from the inside to the outside) is detected. It is therefore possible to suppress erroneous detection of a hand movement event when the driver's face moves or when another occupant moves.

FIG. 1 is a block diagram showing an outline of the driving support device shown as an embodiment of the present invention.
FIG. 2 illustrates an installation example of the camera system in the driving support device.
FIG. 3 is a block diagram showing the functional configuration of the hand movement detection device in the driving support device.
FIG. 4 shows the face proximity area and the face remote area set in a camera image.
FIG. 5 is a time chart showing detection signals when a hand enters and leaves the face proximity area and the face remote area.
FIG. 6 shows the relationship between camera-image brightness and the binarization threshold: (a) the night environment, (b) the tunnel environment, and (c) the daytime environment.
FIG. 7 shows changing the light environment determination areas according to the face area.
FIG. 8 shows setting a first face peripheral area and a second face peripheral area in a camera image.
FIG. 9 is another figure showing setting a first face peripheral area and a second face peripheral area in a camera image.
FIG. 10 shows the process of generating a motion mask image in the motion detection/contrast countermeasure unit.
FIG. 11 shows, in (A)(a) to (f), camera images when the brightness changes temporarily, and, in (B)(a) to (f), the motion mask images at the times of those camera images.
FIG. 12 shows, in (A)(a) to (f), camera images when a hand region appears, and, in (B)(a) to (f), the motion mask images at the times of those camera images.
FIG. 13 shows the brightness changing sharply and temporarily.

  Hereinafter, embodiments of the present invention will be described with reference to the drawings.

  A driving support apparatus shown as an embodiment of the present invention is configured as shown in FIG. 1, for example.

  The driving support device includes a camera 1 as an imaging unit, an infrared light source 2, and a hand movement detection device 3.

  The camera 1 captures an area including the periphery of the driver's face. The camera 1 supplies, for example, a camera image including a face area irradiated by the infrared light source 2 to the hand motion detection device 3 every predetermined time. The infrared light source 2 irradiates an area including the driver's face with infrared rays. The infrared light source 2 is used to supplement the brightness of the face area when the brightness of the driver's face area is insufficient. Needless to say, the camera 1 is not limited to an infrared camera, and may be a camera of another type.

  This driving support device outputs a hand movement detection signal when the hand movement detection device 3 detects the movement of the driver's hand based on the camera image. The hand movement detection signal is supplied, for example, to a driver state monitoring device (not shown). There, the signal can be used, for example, to recognize the driver's arousal state and to output a warning sound from a speaker or the like urging the driver to take a rest according to that state. The hand movement detection signal can also be used to control the operation of the vehicle, or be transmitted to an information center outside the vehicle.

  For example, as shown in FIG. 2, the driving support device is provided at the driver's seat. The camera system 302, including the camera 1 and the infrared light source 2, is installed in the vicinity of the steering wheel 300, near the instrument 301 visible to the driver. The camera system 302 can thereby photograph at least the driver's face and its surroundings. The camera image captured by the camera 1 is supplied to the hand movement detection device 3.

  The hand movement detection device 3 includes functional units as shown in FIG. 3, for example. Although the hand movement detection device 3 is actually composed of a ROM, a RAM, a CPU, and the like, its functions, realized when the CPU executes a hand movement detection program stored in the ROM, are described here as functional blocks.

  For example, as shown in FIG. 4, the hand movement detection device 3 sets a face proximity area 201 around the driver's face area in the camera image 100 captured by the camera 1, and a face remote area 202 located outside the face proximity area. The face proximity area 201 consists of upper, lower, right, and left areas 201U, 201D, 201R, and 201L, and the face remote area 202 consists of upper, lower, right, and left areas 202U, 202D, 202R, and 202L.

  Here, consider the occurrence of a hand movement event in which the driver moves a hand toward the face and then moves it away again.

  The hand movement detection device 3 detects movement of the driver's hand in a first direction, from the outside to the inside, and in a second direction, from the inside to the outside. When movement in the second direction occurs after movement in the first direction, the hand movement detection device 3 detects signals as shown in FIG. 5.

  According to FIG. 5, a signal S1 indicating that a hand has entered the face remote area 202 is detected at time t1, and then a signal S2 indicating that the hand has entered the face proximity area 201 is detected at time t2. Further, a signal S3 indicating that the hand has entered the face proximity area 201 is detected at a subsequent time t3, and a signal S4 indicating that the hand has entered the face remote area 202 is detected at a subsequent time t4.

  As described above, when a hand movement event occurs, the hand movement detection device 3 can classify the detected motion into first movement direction information, from the outside to the inside with respect to the periphery of the driver's face, and second movement direction information, from the inside to the outside. The hand movement detection device 3 then determines that a movement event of the driver's hand has occurred when the second movement direction information is detected after the first movement direction information is detected (event determination means). Hereinafter, a specific configuration for detecting such a hand movement event will be described.
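The four-signal sequence of FIG. 5 can be sketched as a small state machine. This is an illustrative reading of the event-determination logic, not code from the patent; the signal names "remote" and "proximity" are assumptions.

```python
class HandEventDetector:
    """Detects a hand-movement event: inward motion (remote -> proximity)
    followed by outward motion (proximity -> remote), as in FIG. 5."""

    def __init__(self):
        self.prev = None          # last area the hand was detected in
        self.inward_seen = False  # has the first motion direction been detected?

    def feed(self, area):
        """area: "remote" or "proximity" (which region the hand entered).
        Returns True when a complete hand-movement event is recognized."""
        event = False
        if self.prev == "remote" and area == "proximity":
            self.inward_seen = True   # first motion direction (outside -> inside)
        elif self.prev == "proximity" and area == "remote" and self.inward_seen:
            event = True              # second motion direction -> hand event
            self.inward_seen = False
        self.prev = area
        return event
```

Fed the S1–S4 sequence of FIG. 5 (`remote, proximity, proximity, remote`), the detector fires only on the final signal; an outward motion with no preceding inward motion fires nothing, which is why face or passenger movement alone does not trigger an event.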

  The hand motion detection device 3 includes a face detection / face feature tracking unit 11, a light environment determination unit 12, a motion detection / contrast countermeasure unit 13, a light change detection unit 14, and a motion direction determination / event detection unit 15.

  The face detection/face feature tracking unit 11 is supplied with the camera image 100 from the camera 1 and detects the face area included in it. The face area information is composed of a plurality of coordinate data and is specified as, for example, a rectangular area. Further, the face detection/face feature tracking unit 11 extracts features of the face area and tracks them, thereby detecting movement of the face area. The face detection/face feature tracking unit 11 supplies the face area information, i.e. the coordinate data representing the face area, to the light environment determination unit 12 and the motion detection/contrast countermeasure unit 13.

  The light environment determination unit 12 is supplied with the camera image 100 from the camera 1 and the face area information from the face detection/face feature tracking unit 11. Based on the camera image 100, the light environment determination unit 12 determines the light environment around the driver, i.e. whether it is daytime, nighttime, or a tunnel, from the brightness of the in-vehicle ceiling area in the image. Based on the determined light environment, the light environment determination unit 12 sets the binarization threshold with which the subsequent motion detection/contrast countermeasure unit 13 identifies a hand region in the motion detection region outside the face region in the camera image 100 (threshold setting means).

  The light environment determination unit 12 specifies a region outside the face region obtained from the face region information as a motion detection region in which motion information is detected by the motion detection / contrast countermeasure unit 13. Then, the light environment determination unit 12 sets a binarization threshold value for detecting motion information based on the lightness in the motion detection area (threshold setting means).

  The light environment determination unit 12 recognizes the time zone (nighttime, daytime, tunnel) based on the brightness of the in-vehicle ceiling area in the video captured by the camera 1 (time zone recognition means). Based on the recognized time zone, it sets the binarization threshold with which the motion detection/contrast countermeasure unit 13 detects motion information from the brightness in the motion detection area (threshold setting means).

  Specifically, as shown in FIG. 6, the light environment determination unit 12 provides light environment determination regions 101R and 101L in the region corresponding to the ceiling in the camera image 100. These regions are chosen so that their brightness hardly changes with the travel of the vehicle but does change with the light environment. The vertical length CH and horizontal length CW of the light environment determination regions 101R and 101L are preset to predetermined values. The regions are set based on the mounting position and angle of view of the camera 1, the positional relationship between the camera 1 and a typical driver's face, and the like.

  In the night environment, as shown in FIG. 6(a), the average brightness of the light environment determination areas 101R and 101L is low, so the light environment determination unit 12 sets the binarization threshold for determining the hand area to a low value. In the tunnel environment, as shown in FIG. 6(b), the average brightness is higher than in the night environment and lower than in the daytime environment, so the binarization threshold is set to a value higher than that of the night environment. In the daytime environment, as shown in FIG. 6(c), the average brightness is the highest, so the binarization threshold is set to a value higher than that of the tunnel environment.
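The environment classification and threshold selection above can be sketched as follows. The brightness boundaries and threshold values are illustrative assumptions; the patent gives only the qualitative ordering (night < tunnel < daytime).

```python
def classify_light_environment(avg_brightness):
    """Map the average brightness (0-255) of the ceiling-side light
    environment determination regions 101R/101L to an environment label.
    The boundary values 60 and 140 are assumed, not from the patent."""
    if avg_brightness < 60:
        return "night"
    elif avg_brightness < 140:
        return "tunnel"
    return "daytime"

# Placeholder thresholds obeying night < tunnel < daytime, as described
BINARIZATION_THRESHOLD = {"night": 20, "tunnel": 40, "daytime": 60}

def threshold_for(avg_brightness):
    """Binarization threshold for hand-region detection in this environment."""
    return BINARIZATION_THRESHOLD[classify_light_environment(avg_brightness)]
```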

  The light environment determination areas 101R and 101L may also vary depending on the face area. For example, as shown in FIG. 7, when the driver tilts his or her face, an overlapping area 103 between the face area 102 and the light environment determination areas 101R and 101L occurs. In this case, the light environment determination unit 12 uses only the portions of the light environment determination areas 101R and 101L that lie outside the face area 102. Thereby, the light environment determination unit 12 avoids misjudging the light environment from the brightness of the face area 102, determines the light environment more accurately, and sets the binarization threshold accordingly.

  Furthermore, the light environment determination unit 12 may set a plurality of binarization thresholds instead of a single binarization threshold in order to detect a hand region outside the face region 102.

  For example, as shown in FIG. 8, a first face peripheral area 104 adjacent to the outside of the face area 102 and a second face peripheral area 105 adjacent to the outside of the first face peripheral area 104 are set in the camera image 100. The light environment determination unit 12 sets a first binarization threshold for the first face peripheral area 104 and a second binarization threshold for the second face peripheral area 105. When the infrared light source 2 irradiates the driver's face with infrared light in the night environment, the vicinity of the driver's face is also irradiated. Therefore, the light environment determination unit 12 sets the first binarization threshold higher than the second binarization threshold. Thereby, a hand area appearing in the first face peripheral area 104 or the second face peripheral area 105 can be detected with high accuracy.

  As another example, as shown in FIG. 9, a first face peripheral area 106 adjacent to the window side of the face area 102 and a second face peripheral area 107 adjacent to the vehicle interior side of the face area 102 are set in the camera image 100. The light environment determination unit 12 sets a first binarization threshold for the first face peripheral area 106 and a second binarization threshold for the second face peripheral area 107. In a daytime environment in which natural light strikes the driver's face from the window side, the average brightness is high on the window side and low on the vehicle interior side. Therefore, the light environment determination unit 12 sets the first binarization threshold higher than the second binarization threshold. As a result, hand areas appearing in the first face peripheral area 106 and the second face peripheral area 107 can be detected with high accuracy.

  The motion detection/contrast countermeasure unit 13 is supplied with the camera image 100 from the camera 1, the face area information from the face detection/face feature tracking unit 11, and the binarization threshold from the light environment determination unit 12. The motion detection/contrast countermeasure unit 13 treats the area of the camera image 100 outside the face area detected by the face detection/face feature tracking unit 11 as the motion detection area. In addition, when the face detection/face feature tracking unit 11 detects movement of the face area, the motion detection/contrast countermeasure unit 13 moves the motion detection area accordingly.

  The motion detection/contrast countermeasure unit 13 detects a hand region from the motion detection region, i.e. the region of the camera image 100 excluding the face region 102. In doing so, it determines that a region of the motion detection area whose brightness exceeds the binarization threshold is a hand region, and that a region whose brightness is below the threshold is not. The motion detection/contrast countermeasure unit 13 supplies to the light change detection unit 14 motion mask information in which the hand area has the value "1" and the area other than the hand area is masked (value = 0).

  As shown in FIG. 10, when a camera image 100 is newly input, the motion detection/contrast countermeasure unit 13 performs a filtering process on it (step ST1). This filtering process produces a camera image 100a from which locally high-brightness regions have been removed.

  Next, the motion detection/contrast countermeasure unit 13 performs a foreground extraction process on the filtered camera image 100a (step ST3). This foreground extraction process takes the difference between the filtered camera image 100a and the background image created in the background extraction process (step ST2) of the previous frame, yielding a motion image 100c in which the moving areas are extracted. The motion detection/contrast countermeasure unit 13 then compares each pixel value of the motion image with the binarization threshold set by the light environment determination unit 12, and determines pixels whose values exceed the threshold to be the hand region (step ST4). It thereby generates a motion mask image 100d consisting of the hand region 100d(1) with value "1" and the mask region 100d(0) with value "0". Further, in the background extraction process (step ST2) shown in FIG. 10, the current background image is updated by a weighted average with the filtered camera image 100a, creating the background image used in the next frame's processing.
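Steps ST2 to ST4 can be illustrated with a minimal pure-Python sketch of the difference, binarization, and background-update loop. The filtering step (ST1) is omitted, and the function and parameter names (including the update weight `alpha`) are assumptions, not identifiers from the patent.

```python
def motion_mask(frame, background, threshold, alpha=0.1):
    """frame, background: 2-D lists of brightness values of equal size.
    Returns (mask, new_background), where mask is 1 for hand-candidate
    pixels and new_background is the weighted-average update for the
    next frame's processing."""
    h, w = len(frame), len(frame[0])
    mask = [[0] * w for _ in range(h)]
    new_bg = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            # foreground extraction (ST3): difference against the background
            diff = abs(frame[y][x] - background[y][x])
            # binarization with the environment-dependent threshold (ST4)
            mask[y][x] = 1 if diff > threshold else 0
            # weighted-average background update for the next frame (ST2)
            new_bg[y][x] = (1 - alpha) * background[y][x] + alpha * frame[y][x]
    return mask, new_bg
```

A small `alpha` makes the background adapt slowly, so a hand moving through the scene stands out in the difference image while gradual lighting drift is absorbed into the background.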

  In the nighttime environment, the entire camera image 100 has low contrast, and the vicinity of the face tends to be brightened by the infrared light applied to the face area 102 by the infrared light source 2. Therefore, it is desirable that the motion detection/contrast countermeasure unit 13 use different binarization thresholds for the first face peripheral area 104 and the second face peripheral area 105 around the face area 102, as shown in FIG. 8. This counters the tendency of the difference value to vary with position in the camera image 100, even for the same hand region or background (motion detection region), depending on the light environment and the characteristics of the camera 1. It is therefore desirable that the motion detection/contrast countermeasure unit 13 use a binarization threshold that depends on both the light environment and the position in the camera image 100.

  In the night environment, since the infrared light from the infrared light source 2 is directed at the face area 102, the motion detection area around the face area 102 becomes brighter, and areas farther from the face area 102 become darker. Therefore, even for a hand region appearing in the same motion detection region, the difference value at the edge of the camera image 100 is smaller than the difference value around the face region 102. Furthermore, as a general characteristic of the camera 1, the contrast is lower at the edge of the camera image 100 than at its center, which likewise makes the difference value at the edge smaller than that around the face region 102.

In consideration of this night environment and the characteristics of the camera 1, when the light environment determination unit 12 determines that the environment is nighttime, the motion detection/contrast countermeasure unit 13 sets the binarization threshold nighttimeNoFaceTH for the area other than the face region 102 and the binarization threshold nighttimeFaceTH for the face region 102 so that
nighttimeNoFaceTH ≤ nighttimeFaceTH.
Thereby, a hand region outside the face region 102 can be detected with high accuracy regardless of the illumination of the infrared light source 2 and the characteristics of the camera 1.

Similarly, in the tunnel and daytime environments, infrared light from the infrared light source 2 is also directed at the face area 102, but its influence is smaller than in the night environment. Still, due to the characteristics of the camera 1, the contrast is lower at the edge of the camera image 100 than at the face area 102. Taking this into account, the motion detection/contrast countermeasure unit 13 sets the binarization threshold tunnelNoFaceTH for the motion detection area (outside the face area 102) in the tunnel environment, the binarization threshold tunnelFaceTH for the face area 102 in the tunnel environment, the binarization threshold daytimeNoFaceTH for the motion detection area in the daytime environment, and the binarization threshold daytimeFaceTH for the face area 102 in the daytime environment so that
tunnelNoFaceTH ≤ tunnelFaceTH
daytimeNoFaceTH ≤ daytimeFaceTH.
Thereby, even if the contrast of the motion detection area toward the edge of the image is lower than that of the face area 102, the hand area can be detected with high accuracy owing to the lower binarization threshold.

Furthermore, considering the lighting conditions of each light environment, setting the binarization thresholds so that
nighttimeNoFaceTH ≤ tunnelNoFaceTH ≤ daytimeNoFaceTH
allows the hand region to be detected effectively regardless of the light environment. However, it is not strictly necessary to set the binarization thresholds in this way.
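For illustration, the threshold orderings above can be captured as a table plus a consistency check. The numeric values are placeholders; only the inequalities come from the text.

```python
# Hypothetical threshold table obeying the orderings in the text:
# noFace <= face within each environment, and nighttime <= tunnel <= daytime
# for the motion-detection (non-face) thresholds. Values are placeholders.
THRESHOLDS = {
    ("nighttime", "noFace"): 15, ("nighttime", "face"): 25,
    ("tunnel",    "noFace"): 30, ("tunnel",    "face"): 40,
    ("daytime",   "noFace"): 50, ("daytime",   "face"): 60,
}

def orderings_hold(t):
    """Check both ordering constraints described in the text."""
    within_env = all(t[(e, "noFace")] <= t[(e, "face")]
                     for e in ("nighttime", "tunnel", "daytime"))
    across_env = (t[("nighttime", "noFace")] <= t[("tunnel", "noFace")]
                  <= t[("daytime", "noFace")])
    return within_env and across_env
```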

  Further, the motion detection / contrast countermeasure unit 13 may set different binarization threshold values for each region of the camera image 100 as shown in FIGS.

As shown in FIG. 8, when the first face peripheral area 104 and the second face peripheral area 105 are set around the face area 102, the binarization threshold FaceTH for the face area 102, the binarization threshold NoFaceTH1 for the first face peripheral area 104, and the binarization threshold NoFaceTH2 for the second face peripheral area 105 are set so that
NoFaceTH2 ≤ NoFaceTH1 ≤ FaceTH.
Thereby, even though the contrast decreases from the face region 102 through the first face peripheral region 104 to the second face peripheral region 105 toward the edge of the camera image 100, the motion detection/contrast countermeasure unit 13 can detect a hand region in the first face peripheral region 104 with high accuracy, and likewise in the lower-contrast second face peripheral region 105.

In daytime environments, the contrast may be high on the window side and low on the vehicle interior side. In such an environment, as shown in FIG. 9, the motion detection/contrast countermeasure unit 13 sets the binarization threshold NoFaceTH3 for the first face peripheral region 106 on the window side of the face area 102 and the binarization threshold NoFaceTH4 for the second face peripheral region 107 on the vehicle interior side so that
NoFaceTH4 ≤ NoFaceTH3 ≤ FaceTH.
Thereby, the motion detection/contrast countermeasure unit 13 can accurately detect hand areas in the first face peripheral area 106 and, despite the lower contrast, in the second face peripheral area 107.

  The light change detection unit 14 is supplied with the motion mask information from the motion detection/contrast countermeasure unit 13. The light change detection unit 14 detects changes in the light environment in which the brightness increases sharply, and determines whether an apparent hand region in the motion mask information may have been caused by such a sharp increase. When a sharp brightness change has occurred, the light change detection unit 14 cancels the motion mask information and does not supply it to the motion direction determination/event detection unit 15. Thereby, when a sharp increase in brightness is detected, the light change detection unit 14 stops the process of determining that a driver's hand movement event has occurred.

  Specifically, consider a scene in which the camera images 100 of FIG. 11(A)(a) to (f) are supplied in time series and the motion detection/contrast countermeasure unit 13 supplies the corresponding motion mask images 100d of FIG. 11(B)(a) to (f) to the light change detection unit 14. When a change in the light environment that sharply increases the brightness occurs, the hand region 100d(1) recognized in the motion mask image 100d appears only over a few frames. Further, the area of the hand region 100d(1) is very small compared to the area of the mask region 100d(0).

  On the other hand, when the camera images 100 of FIG. 12(A)(a) to (f) are acquired in a situation where the driver is moving a hand, the area of the hand region 100d(1) is large and the hand region 100d(1) appears over many frames (a long time), as shown in FIG. 12(B)(a) to (f).

  As described above, when the area of the hand region 100d(1) is small and the number of frames in which it appears is small (a short time), the light change detection unit 14 determines that the brightness has increased sharply and does not supply the motion mask image 100d to the motion direction determination/event detection unit 15.
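The distinction drawn in FIGS. 11 and 12, where a small, short-lived region indicates a light change while a large, persistent region indicates a hand, can be sketched as a simple rule. The area-ratio and frame-count limits are assumed values, not figures from the patent.

```python
def is_light_change(mask_frames, area_ratio_limit=0.02, max_frames=3):
    """mask_frames: list of 2-D 0/1 motion masks for the consecutive frames
    in which hand-candidate pixels appeared. Returns True when the candidate
    region is small and brief enough to be treated as a brightness flash,
    so the masks should be suppressed rather than passed downstream."""
    if len(mask_frames) > max_frames:
        return False  # persists over many frames: behaves like a hand
    for mask in mask_frames:
        total = sum(len(row) for row in mask)       # all pixels in the mask
        hand = sum(sum(row) for row in mask)        # pixels flagged as hand
        if hand / total > area_ratio_limit:
            return False  # region too large to be a flash
    return True  # small and brief: treat as a sharp light change
```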

  Further, as shown in FIG. 13, the light change detection unit 14 may obtain the average brightness of the light environment determination regions 101R and 101L in the camera image 100 to detect a sharp change in brightness. For example, in a scene where sunlight suddenly enters from the window side, the average brightness of the light environment determination area 101L approaches the upper limit value. In such a scene as well, when the brightness changes sharply in the light environment determination regions 101R and 101L, the light change detection unit 14 does not supply the motion mask image 100d acquired from the motion detection/contrast countermeasure unit 13 to the motion direction determination/event detection unit 15.

  As a result, the light change detection unit 14 can suppress erroneous detection of the hand region caused by supplying the motion mask image 100d generated when the brightness change suddenly occurs to the motion direction determination / event detection unit 15.

  The motion mask images 100d are supplied to the motion direction determination / event detection unit 15 via the light change detection unit 14. Referring to the temporally continuous motion mask images 100d, the motion direction determination / event detection unit 15 classifies the movement of the hand region 100d(1) into first movement direction information, in which the hand region 100d(1) moves from the face remote region 202 to the face proximity region 201 as shown in FIG. 5, and second movement direction information, in which the hand region 100d(1) moves from the face proximity region 201 to the face remote region 202.

  As illustrated in FIG. 5, when the second movement direction information (S3 → S4) is detected after the first movement direction information (S1 → S2), the movement direction determination / event detection unit 15 recognizes this series of movements of the hand region 100d(1) as a hand movement event (event determination means).
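The S1 → S2 → S3 → S4 sequence of FIG. 5 amounts to a small state machine: the hand first enters (first movement direction information), then leaves (second movement direction information). A minimal sketch, with observation names that are assumptions for the example:

```python
# Minimal state machine mirroring FIG. 5: an event fires only when the
# hand leaves (proximity -> remote) AFTER having entered (remote ->
# proximity). Observations arriving in any other order do not fire.
ENTER = "remote_to_proximity"   # first movement direction information
LEAVE = "proximity_to_remote"   # second movement direction information

def detect_hand_event(observations):
    """Return True once LEAVE is observed after an earlier ENTER."""
    entered = False
    for obs in observations:
        if obs == ENTER:
            entered = True            # S1 -> S2: hand has entered
        elif obs == LEAVE and entered:
            return True               # S3 -> S4: hand movement event
    return False
```

Requiring both directions in order is what filters out one-way motions such as the face moving or another occupant's hand passing through the frame.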

  When it recognizes a hand movement event, the movement direction determination / event detection unit 15 outputs a hand movement detection signal to the outside. The face proximity area 201 and the face remote area 202 may be represented by predetermined coordinate data and may be moved dynamically according to the movement of the face area 102.

  Here, the movement direction determination / event detection unit 15 determines that the hand area 100d(1) has entered the face proximity area 201 or the face remote area 202 when the occupation ratio (%) of the hand area 100d(1) in that area is equal to or greater than a predetermined value. When the hand area 100d(1) enters the face proximity area 201 or the face remote area 202, its occupation ratio with respect to the entire area is at first low, gradually increases, and then decreases as the hand passes through the area. By detecting the point at which the occupation ratio of the hand area 100d(1) is maximized in each area, the movement direction determination / event detection unit 15 can determine which area the hand entered first and therefore in which direction the hand moved. Thereby, the movement direction determination / event detection unit 15 can detect both the movement direction in which the hand moves from the outside of the camera image 100 to the inside and the direction in which the hand moves from the inside of the camera image 100 to the outside.
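A sketch of this occupation-ratio test, assuming per-frame pixel counts are available for each area; `ENTRY_RATIO` and the region names are illustrative assumptions.

```python
# Hypothetical occupation-ratio logic: an area counts as "entered" once
# the hand's pixel share reaches ENTRY_RATIO, and the area whose ratio
# reaches the threshold earliest tells the direction of travel.
ENTRY_RATIO = 0.3  # assumed entry threshold

def occupation_ratio(hand_pixels, region_pixels):
    """Fraction of an area's pixels covered by the hand region 100d(1)."""
    return hand_pixels / region_pixels

def first_entered(ratio_series_by_region):
    """ratio_series_by_region: {region_name: [ratio per frame]}.

    Return the region whose occupation ratio first reaches ENTRY_RATIO,
    or None if the hand never enters any region.
    """
    best = None  # (frame_index, region_name) of the earliest entry
    for name, series in ratio_series_by_region.items():
        for i, r in enumerate(series):
            if r >= ENTRY_RATIO:
                if best is None or i < best[0]:
                    best = (i, name)
                break
    return best[1] if best else None
```

If the remote area is entered before the proximity area, the motion corresponds to the first movement direction information (outside to inside), and vice versa.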

  Detection of the second movement direction information (S3 → S4) after the first movement direction information (S1 → S2) indicates that the driver has made a motion of touching his or her face or head. Here, the face proximity areas 201 and face remote areas 202 are provided above, below, to the right of, and to the left of the face area 102. The direction in which the hand region 100d(1) enters therefore may not coincide with the direction in which it exits, and in such a case the detection of the hand movement event becomes ambiguous.

  To address this, the movement direction determination / event detection unit 15 may set a weight for each face proximity area 201 and face remote area 202. The weights are set so as to match the behavior of the driver touching his or her face or head with a hand. For example, the weights of the right-side face proximity area 201R and face remote area 202R and of the left-side face proximity area 201L and face remote area 202L are set higher than those of the upper face proximity area 201U and face remote area 202U and of the lower face proximity area 201D and face remote area 202D. This is because a hand movement event in which a hand enters and exits from the right or left side is more probable than one in which a hand enters from above or below.

  As a result, in the highly weighted face proximity areas 201R and 201L and face remote areas 202R and 202L, the movement direction determination / event detection unit 15 readily determines that a hand has entered the area even when the occupation ratio of the hand area 100d(1) is low. That is, the movement direction determination / event detection unit 15 can easily detect a hand movement event in which a hand enters and exits from the right or left side.
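One way to realize this weighting, sketched here under the assumption that a higher weight simply lowers the effective entry threshold; the weight values themselves are illustrative, not from the patent.

```python
# Illustrative per-region weighting: side regions get a lower effective
# entry threshold than the top and bottom regions, reflecting the higher
# probability of a hand entering from the left or right.
BASE_THRESHOLD = 0.3            # assumed base occupation-ratio threshold
WEIGHTS = {
    "right": 1.5, "left": 1.5,  # weighted higher -> easier to trigger
    "upper": 1.0, "lower": 1.0,
}

def entered(region, ratio):
    """A higher weight lowers the occupation ratio needed to count as entry."""
    return ratio >= BASE_THRESHOLD / WEIGHTS[region]
```

With these assumed values, an occupation ratio of 0.25 counts as an entry on the left or right but not on the top or bottom.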

  As described above in detail, according to the driving support apparatus shown as the embodiment of the present invention, as shown in FIG. 5, the motion information of the hand region 100d(1) is classified into first movement direction information, from the outside toward the inside with respect to the periphery of the face region 102, and second movement direction information, from the inside toward the outside; when the second movement direction information is detected after the first movement direction information is detected, it is determined that a movement event of the driver's hand has occurred. Thereby, this driving support device can suppress erroneously detecting, as a hand movement event, movement of the driver's face or movement of another occupant. In addition, since this driving support device sets the background of the face area 102 as the motion detection area and detects the direction of the hand region 100d(1), a hand movement event can be detected with simple processing and a small processing load.

  Further, according to this driving support device, since the area outside the face area detected by the face detection / face feature tracking unit 11 is recognized as the motion detection area, the face area 102 can be excluded from the motion detection area. Furthermore, the motion detection area outside the face area 102 is moved in response to the movement of the face area detected by the face detection / face feature tracking unit 11. Thereby, movements of the head, which are not movements of the hand, can be excluded from the hand area 100d(1), the detection rate of the hand area 100d(1) can be improved, and erroneous detection of hand movement events can be suppressed.

  Furthermore, according to this driving support device, an area outside the face area 102 detected by the face detection / face feature tracking unit 11 is specified as the motion detection area, and a binarization threshold value for detecting motion information is set based on the brightness in the motion detection area. Thereby, even when the light environment in the vehicle fluctuates, as between a night environment, a daytime environment, and a tunnel environment, an appropriate binarization threshold value is set, the hand region 100d(1) can be detected accurately, and erroneous detection of hand movement events can be suppressed.
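The brightness-adaptive threshold could be as simple as a monotone mapping from the mean brightness of the motion detection area to a frame-difference threshold, so that darker scenes (night, tunnel) use a lower threshold. The linear mapping and its constants below are assumptions for illustration only.

```python
# Hedged sketch: derive a binarization threshold for frame differencing
# from the mean brightness (0-255) of the motion detection area.
def binarization_threshold(mean_brightness, lo=10, hi=50):
    """Map mean brightness linearly onto [lo, hi]; darker -> lower threshold."""
    mean_brightness = max(0.0, min(255.0, mean_brightness))  # clamp input
    return lo + (hi - lo) * mean_brightness / 255.0
```

Pixels whose inter-frame difference exceeds this threshold would then be marked as motion in the mask image 100d.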

  Furthermore, according to this driving support device, the time zone is recognized based on the lightness in the light environment determination regions 101R and 101L, which correspond to the ceiling region in the vehicle in the camera image 100, and the binarization threshold is set based on the lightness in the motion detection region. As a result, the light environment in the passenger compartment can be determined accurately, an appropriate binarization threshold can be set, the hand region 100d(1) can be detected accurately, and erroneous detection of hand movement events can be suppressed.
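A minimal classifier for the time-zone recognition step, assuming the ceiling-region brightness is already averaged; the category names and brightness cut-offs are assumed values, not from the patent.

```python
# Illustrative time-zone recognition from the average brightness of the
# ceiling-region light environment determination areas 101R/101L.
def recognize_time_zone(ceiling_brightness):
    """Classify the in-vehicle light environment (assumed cut-offs)."""
    if ceiling_brightness < 40:
        return "night"
    if ceiling_brightness < 120:
        return "tunnel_or_dusk"
    return "daytime"
```

The recognized category would then select or bias the binarization threshold used in the motion detection area.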

  Furthermore, according to this driving support device, when a change in the light environment in which the brightness sharply increases is detected, the process of determining that a movement event of the driver's hand has occurred is suspended. This suppresses erroneous detection of a hand movement event when the brightness changes sharply.

  The above-described embodiment is merely an example of the present invention. Therefore, the present invention is not limited to the above-described embodiment, and, as a matter of course, various modifications depending on the design and the like are possible without departing from the technical idea according to the present invention.

DESCRIPTION OF SYMBOLS: 1 camera; 2 infrared light source; 3 detection apparatus; 11 face detection / face feature tracking unit; 12 light environment determination unit; 13 motion detection / contrast countermeasure unit; 14 light change detection unit; 15 motion direction determination / event detection unit

Claims (7)

  1. Imaging means for imaging an area including the periphery of the driver's face;
    A motion detection means for detecting motion information of a hand region from a motion detection region around the driver's face out of the video imaged by the imaging means;
    Event determination means for classifying the movement information of the hand area detected by the motion detection means into first movement direction information, from the outside to the inside with respect to the periphery of the driver's face, and second movement direction information, from the inside to the outside, and for determining that a movement event of the driver's hand has occurred when the second movement direction information is detected after the first movement direction information is detected; a driving support apparatus comprising the above.
  2. A face area detecting means for detecting the driver's face area in the video imaged by the imaging means;
    The driving support device according to claim 1, wherein the motion detection unit recognizes a region outside the face region detected by the face region detection unit as the motion detection region.
  3. A face area detecting means for detecting the driver's face area in the video imaged by the imaging means;
    The motion detection means moves a motion detection area outside the face area detected by the face area detection means in response to movement of the face area detected by the face area detection means. The driving support device according to claim 1.
  4. A face area detecting means for detecting the driver's face area in the video imaged by the imaging means;
    Threshold setting means for specifying an area outside the face area detected by the face area detection means as the motion detection area in which the motion detection means detects motion information, and for setting a binarization threshold value for detecting the motion information based on brightness in the motion detection area; the driving support apparatus according to claim 1, further comprising the above.
  5. A time zone recognizing means for recognizing a time zone based on brightness in a ceiling area in a vehicle among images captured by the imaging means;
    Threshold setting means for setting a binarization threshold value for detecting motion information based on lightness in the motion detection area, in accordance with the time zone recognized by the time zone recognition means; the driving support device according to claim 1, further comprising the above.
  6. Light environment detection means for detecting a change in the light environment in which the brightness sharply increases,
    wherein the event determination means stops the process of determining that a movement event of the driver's hand has occurred when the light environment detection means detects that the brightness has sharply increased; the driving support device according to claim 1.
  7. From the video imaged by the imaging means, detecting motion information from the motion detection area around the driver's face,
    Classifying the detected movement information into first movement direction information from outside to inside and second movement direction information from inside to outside with respect to the periphery of the driver's face;
    A driving support method, wherein when the second movement direction information is detected after the first movement direction information is detected, it is determined that a movement event of a driver's hand has occurred.
JP2010156952A 2010-07-09 2010-07-09 Driving support apparatus and method Active JP5587068B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2010156952A JP5587068B2 (en) 2010-07-09 2010-07-09 Driving support apparatus and method

Publications (2)

Publication Number Publication Date
JP2012018622A true JP2012018622A (en) 2012-01-26
JP5587068B2 JP5587068B2 (en) 2014-09-10

Family

ID=45603813

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2010156952A Active JP5587068B2 (en) 2010-07-09 2010-07-09 Driving support apparatus and method

Country Status (1)

Country Link
JP (1) JP5587068B2 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9104243B2 (en) 2013-09-25 2015-08-11 Hyundai Motor Company Vehicle operation device
WO2019159364A1 (en) * 2018-02-19 2019-08-22 三菱電機株式会社 Passenger state detection device, passenger state detection system, and passenger state detection method
US10719697B2 (en) 2016-09-01 2020-07-21 Mitsubishi Electric Corporation Gesture judgment device, gesture operation device, and gesture judgment method

Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005228163A (en) * 2004-02-13 2005-08-25 Omron Corp Control system, and control operation determining method

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
CSNG200202299003; Kazuyuki Imagawa et al., "Real-time palm tracking in sign-language video considering occlusion by the face", IEICE Technical Report PRMU97-104 to 110, Pattern Recognition and Media Understanding, Vol. 97, No. 251, 1997-09-12, pp. 15-22, The Institute of Electronics, Information and Communication Engineers *
JPN6014005611; Kazuyuki Imagawa et al., "Real-time palm tracking in sign-language video considering occlusion by the face", IEICE Technical Report PRMU97-104 to 110, Pattern Recognition and Media Understanding, Vol. 97, No. 251, 1997-09-12, pp. 15-22, The Institute of Electronics, Information and Communication Engineers *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US9104243B2 (en) 2013-09-25 2015-08-11 Hyundai Motor Company Vehicle operation device
KR101550604B1 (en) * 2013-09-25 2015-09-07 현대자동차 주식회사 Vehicle operation device
US10719697B2 (en) 2016-09-01 2020-07-21 Mitsubishi Electric Corporation Gesture judgment device, gesture operation device, and gesture judgment method
WO2019159364A1 (en) * 2018-02-19 2019-08-22 三菱電機株式会社 Passenger state detection device, passenger state detection system, and passenger state detection method

Also Published As

Publication number Publication date
JP5587068B2 (en) 2014-09-10

Legal Events

Date Code Title Description
20130530 A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621)
20140130 A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007)
20140212 A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131)
20140401 A521 Written amendment (JAPANESE INTERMEDIATE CODE: A523)
TRDD Decision of grant or rejection written
20140708 A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01)
20140723 A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61)
R150 Certificate of patent or registration of utility model; Ref document number: 5587068; Country of ref document: JP (JAPANESE INTERMEDIATE CODE: R150)