US20190163997A1 - Control system - Google Patents
Control system
- Publication number
- US20190163997A1 (application US16/174,776)
- Authority
- US
- United States
- Prior art keywords
- driver
- face
- changed
- information
- reference image
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Granted
Classifications
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/59—Context or environment of the image inside of a vehicle, e.g. relating to seat occupancy, driver state or inner lighting conditions
- G06V20/597—Recognising the driver's state or behaviour, e.g. attention or drowsiness
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/10—Human or animal bodies, e.g. vehicle occupants or pedestrians; Body parts, e.g. hands
- G06V40/16—Human faces, e.g. facial parts, sketches or expressions
- G06V40/172—Classification, e.g. identification
-
- G06K9/00845—
-
- G06K9/00288—
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V40/00—Recognition of biometric, human-related or animal-related patterns in image or video data
- G06V40/50—Maintenance of biometric data or enrolment thereof
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W40/00—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models
- B60W40/08—Estimation or calculation of non-directly measurable driving parameters for road vehicle drive control systems not related to the control of a particular sub unit, e.g. by using mathematical models related to drivers or passengers
- B60W2040/0818—Inactivity or incapacity of driver
-
- B—PERFORMING OPERATIONS; TRANSPORTING
- B60—VEHICLES IN GENERAL
- B60W—CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
- B60W2420/00—Indexing codes relating to the type of sensors based on the principle of their operation
- B60W2420/40—Photo, light or radio wave sensitive means, e.g. infrared sensors
- B60W2420/403—Image sensing, e.g. optical camera
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0231—Control of position or course in two dimensions specially adapted to land vehicles using optical position detecting means
-
- G—PHYSICS
- G05—CONTROLLING; REGULATING
- G05D—SYSTEMS FOR CONTROLLING OR REGULATING NON-ELECTRIC VARIABLES
- G05D1/00—Control of position, course, altitude or attitude of land, water, air or space vehicles, e.g. using automatic pilots
- G05D1/02—Control of position or course in two dimensions
- G05D1/021—Control of position or course in two dimensions specially adapted to land vehicles
- G05D1/0257—Control of position or course in two dimensions specially adapted to land vehicles using a radar
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06V—IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
- G06V20/00—Scenes; Scene-specific elements
- G06V20/50—Context or environment of the image
- G06V20/56—Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
Definitions
- The present disclosure relates to a control system for a vehicle.
- U.S. Pat. No. 8,587,440 discloses a conventional vehicle control system configured to use a driver monitor camera to capture an image of the driver and confirm the driver's face based on that image.
- If the condition of a driver is monitored based on images captured by a driver monitor camera, it is first necessary to detect (learn) the positions of the driver's two eyes or nose, the distance between the two eyes, etc. from a captured image, set that image of the driver's face as a reference (a "reference image"), and then compare the reference image against the images of the driver captured from moment to moment by the camera. In this way the driver's condition, such as the direction in which the driver's face is oriented, is monitored.
- This reference image of the driver's face need only be set once each time the driver changes. While the driver's condition is being monitored, however, the driver may change posture so that his or her face temporarily leaves the field of view of the driver monitor camera, or the face may be temporarily concealed by an object or by the driver's hand.
- In such cases, the driver's face temporarily can no longer be recognized.
- Further, in a vehicle in which automated driving is performed, the driver may conceivably be changed during the automated driving. The above-mentioned conventional control system therefore cannot judge whether a temporary failure to recognize the driver's face was caused by a change of driver or merely by a change of posture of the same driver.
- The present disclosure was made with this problem in mind. Its object is to prevent the reference image from being reset, and the driver's condition from becoming unmonitorable during the reset period, when the driver has not actually been changed.
- According to one aspect of the present disclosure, there is provided a control system for controlling a vehicle provided with a driver monitor camera capturing an image of a face of a driver, comprising: a reference image setting part configured to set a reference image of the face of the driver based on the image captured by the driver monitor camera; a monitoring part configured to monitor a condition of the driver based on the image captured by the driver monitor camera and the reference image; a first judging part configured to judge whether the face of the driver can no longer be recognized while the condition of the driver is being monitored; a second judging part configured to judge whether the driver has been changed when the face of the driver can no longer be recognized; and a reference image resetting part configured to reset the reference image of the face of the driver based on the image captured by the driver monitor camera when it is judged that the driver has been changed.
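As a rough structural sketch of the parts recited above (Python is used here only for illustration; all class and method names are invented, and the face "features" stand in for the learned eye/nose positions):

```python
from typing import Optional

class DriverConditionControlSystem:
    """Sketch of the recited parts: reference image setting, monitoring,
    first/second judging, and reference image resetting."""

    def __init__(self) -> None:
        self.reference: Optional[dict] = None  # the set reference image (features)

    # Reference image setting part
    def set_reference(self, face_features: dict) -> None:
        self.reference = face_features

    # First judging part: the face can no longer be recognized
    def face_lost(self, face_features: Optional[dict]) -> bool:
        return face_features is None

    # Second judging part: was the driver changed? (criteria stubbed here;
    # the embodiment uses vehicle speed, shift position, seatbelt, etc.)
    def driver_changed(self, vehicle_stopped: bool, not_in_running_range: bool) -> bool:
        return vehicle_stopped and not_in_running_range

    # Reference image resetting part
    def reset_reference(self, face_features: dict) -> None:
        self.reference = face_features

    # Monitoring part: compare the current image against the reference
    def monitor(self, face_features: Optional[dict]) -> Optional[str]:
        if self.face_lost(face_features) or self.reference is None:
            return None  # condition cannot be judged this frame
        drift = abs(face_features["eye_distance"] - self.reference["eye_distance"])
        return "front" if drift < 5.0 else "turned"
```

The point of the division is visible in the sketch: a lost face alone never triggers a reset; only the second judging part deciding the driver changed does.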
- FIG. 1 is a schematic view of the configuration of an automated driving system for a vehicle according to one embodiment of the present disclosure.
- FIG. 2 is a schematic external view of a host vehicle mounting an automated driving system according to one embodiment of the present disclosure.
- FIG. 3 is a schematic internal view of a host vehicle mounting an automated driving system according to one embodiment of the present disclosure.
- FIG. 4 is a flow chart for explaining driver condition monitoring control according to one embodiment of the present disclosure.
- FIG. 1 is a schematic view of the configuration of an automated driving system 100 of a vehicle according to one embodiment of the present disclosure.
- FIG. 2 is a schematic view of the appearance of a host vehicle 1 mounting the automated driving system 100 according to the present embodiment.
- FIG. 3 is a schematic view of the inside of the host vehicle 1 mounting the automated driving system 100 according to the present embodiment.
- the automated driving system 100 is provided with a surrounding environment information acquiring device 10 , a host vehicle information acquiring device 20 , a driver information acquiring device 30 , a map database 40 , a storage device 50 , a human-machine interface (below, referred to as an “HMI”) 60 , a navigation system 70 , and an electronic control unit 80 .
- the surrounding environment information acquiring device 10 is a device for acquiring information relating to obstacles in the surroundings of the host vehicle (for example, buildings, moving vehicles such as vehicles in front of it and in back of it on the road and oncoming vehicles, stopped vehicles, the curb, fallen objects, pedestrians, etc.) and the weather and other such surrounding environmental conditions of the host vehicle 1 (below, referred to as the “surrounding environment information”).
- the surrounding environment information acquiring device 10 is provided with a LIDAR (laser imaging detection and ranging) device 11 , milliwave radar sensors 12 , an external camera 13 , illuminance sensor 14 , rain sensor 15 , and outside information receiving device 16 .
- the LIDAR device 11 uses laser beams to detect the road and obstacles in the host vehicle surroundings. As shown in FIG. 2 , in the present embodiment, the LIDAR device 11 is, for example, attached to the roof of the host vehicle 1 . The LIDAR device 11 successively fires laser beams toward the overall surroundings of the host vehicle 1 and measures the distances to the road and host vehicle surroundings from the reflected light. Further, the LIDAR device 11 uses the results of measurement as the basis to generate 3D images of the road and obstacles in the overall surroundings of the host vehicle 1 and sends information of the generated 3D images to the electronic control unit 80 .
- the locations of attachment and number of units of the LIDAR device 11 are not particularly limited so long as the information necessary for generating a three-dimensional image can be acquired.
- they may also be divided and attached to the grilles or to the insides of the headlights or brake lights and other such lights of the host vehicle 1 or may be divided and attached to parts of the body (frame) of the host vehicle 1 .
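The ranging step mentioned above (measuring distance from the reflected light) is ordinarily a time-of-flight calculation; a minimal sketch of that arithmetic, not taken from the patent:

```python
SPEED_OF_LIGHT_M_S = 299_792_458.0

def lidar_range_m(round_trip_time_s: float) -> float:
    """Distance to the reflecting surface: the laser pulse travels out
    and back, so the one-way range is half the round-trip path."""
    return SPEED_OF_LIGHT_M_S * round_trip_time_s / 2.0
```

For example, a reflection arriving 200 ns after firing corresponds to a surface roughly 30 m away.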
- the milliwave radar sensors 12 utilize electromagnetic waves to detect obstacles in the host vehicle surroundings at a farther distance than the LIDAR device 11 .
- the milliwave radar sensors 12 are attached to the front bumper and rear bumper of the host vehicle 1 .
- the milliwave radar sensors 12 emit electromagnetic waves to the surroundings of the host vehicle 1 (in the present embodiment, the front, rear, and sides of the host vehicle 1 ) and use the reflected waves to measure the distances to obstacles in the host vehicle surroundings and the relative speed with the obstacles. Further, the milliwave radar sensors 12 send the results of measurement as host vehicle surrounding information to the electronic control unit 80 .
- the locations of attachment and number of the milliwave radar sensors 12 are not particularly limited so long as the necessary host vehicle surrounding information can be acquired.
- they may also be attached to the grilles or to the insides of the headlights or brake lights and other such lights of the host vehicle 1 or may be attached to parts of the body (frame) of the host vehicle 1 .
- the external camera 13 captures an image of the area in front of the host vehicle 1 .
- the external camera 13 is, for example, attached to the center part of the front of the roof of the host vehicle 1 .
- the external camera 13 processes the captured image of the area in front of the host vehicle to detect information on obstacles in front of the host vehicle, the width of the lane of the road driven on and the road shape, road signs, white lines, the state of traffic lights, and other road information in the area in front of the host vehicle, the yaw angle (relative direction of vehicle with respect to lane driven on), the offset position of the vehicle from the center of the lane driven on, and other such driving information of the host vehicle 1 , rain or snow or fog and other such weather information of the host vehicle surroundings, etc. Further, the external camera 13 sends the detected image information to the electronic control unit 80 .
- the locations of attachment and number of the external cameras 13 are not particularly limited so long as the area in front of the host vehicle 1 can be captured.
- they may also be attached to the upper center part of the inner surface of the windshield inside the host vehicle.
- the illuminance sensor 14 detects the illuminance in the host vehicle surroundings. As shown in FIG. 2 , in the present embodiment, the illuminance sensor 14 is, for example, attached to the top surface of the instrument panel of the host vehicle. The illuminance sensor 14 sends the detected illuminance information of the host vehicle surroundings to the electronic control unit 80 .
- the rain sensor 15 detects the presence of rainfall and the amount of rainfall. As shown in FIG. 2 , in the present embodiment, the rain sensor 15 is, for example, attached to the top of the center of the front surface of the front glass of the host vehicle 1 .
- the rain sensor 15 fires light generated by a built-in light emitting diode toward the front surface of the front glass and measures the change in the reflected light at that time so as to detect the presence of rainfall, the amount of rainfall, and other rainfall information. Further, the rain sensor 15 sends the detected rainfall information to the electronic control unit 80 .
- The outside information receiving device 16 receives congestion information, weather information (rain, snow, fog, wind speed, and other information), and other such outside road information sent from a traffic information communication system center or other outside communication center.
- the outside information receiving device 16 sends the received outside information to the electronic control unit 80 .
- the host vehicle information acquiring device 20 is a device for acquiring information relating to the state of the host vehicle 1 such as the speed or acceleration, posture, and current position of the host vehicle 1 (below, referred to as the “host vehicle information”). As shown in FIG. 1 , the host vehicle information acquiring device 20 according to the present embodiment is provided with a vehicle speed sensor 21 , acceleration sensor 22 , yaw rate sensor 23 , GPS receiver 24 , and door operation sensor 25 .
- the vehicle speed sensor 21 is a sensor for detecting the speed of the host vehicle 1 .
- the vehicle speed sensor 21 sends the detected vehicle speed information of the host vehicle 1 to the electronic control unit 80 .
- the acceleration sensor 22 is a sensor for detecting the acceleration of the host vehicle 1 at the time of accelerating or the time of braking.
- the acceleration sensor 22 sends the detected acceleration information of the host vehicle 1 to the electronic control unit 80 .
- The yaw rate sensor 23 is a sensor for detecting the posture of the host vehicle 1. More specifically, it detects the speed of change of the yaw angle when the host vehicle 1 turns, that is, the rotational angular speed (yaw rate) about the vertical axis of the host vehicle 1.
- the yaw rate sensor 23 sends the detected posture information of the host vehicle 1 to the electronic control unit 80 .
- the GPS receiver 24 receives signals from three or more GPS satellites to identify the longitude and latitude of the host vehicle 1 and detect the current position of the host vehicle 1 .
- the GPS receiver 24 sends the detected current position information of the host vehicle 1 to the electronic control unit 80 .
- the door operation sensor 25 is a sensor for detecting operation of a door of the host vehicle 1 .
- the door operation sensor 25 sends the detected door operation information to the electronic control unit 80 .
- the driver information acquiring device 30 is a device for acquiring information relating to the condition of the driver of the host vehicle 1 (below, referred to as the “driver information”). As shown in FIG. 1 and FIG. 3 , the driver information acquiring device 30 according to the present embodiment comprises a driver monitor camera 31 , steering wheel touch sensor 32 , sitting sensor 33 , seatbelt buckle sensor 34 , and seat position sensor 35 .
- the driver monitor camera 31 is attached to the top surface of the steering column and captures the external appearance of the driver.
- The driver monitor camera 31 processes the captured video of the driver by image processing to detect external appearance information of the driver, such as the driver's expression and posture. Further, it sends the detected external appearance information of the driver to the electronic control unit 80.
- the steering wheel touch sensor 32 is attached to the steering wheel.
- the steering wheel touch sensor 32 detects whether the driver is gripping the steering wheel and sends the detected information on the gripping of the steering wheel to the electronic control unit 80 .
- the sitting sensor 33 is provided at the bottom of the surface of the seat.
- the sitting sensor 33 detects the load applied to the surface of the seat and sends sitting information indicating if the driver is sitting on the seat to the electronic control unit 80 .
- the seatbelt buckle sensor 34 is housed inside the seatbelt buckle and sends seatbelt fastening information indicating if the seatbelt is connected to the seatbelt buckle, that is, if the driver has fastened his or her seatbelt, to the electronic control unit 80 .
- the seat position sensor 35 is a sensor detecting a seat position (front-back position of the seat) and sends seat position information indicating if the driver has moved the seat to the electronic control unit 80 .
- the map database 40 is a database relating to map information. This map database 40 is for example stored in a hard disk drive (HDD) mounted in the vehicle.
- the map information includes positional information on the roads, information on the road shapes (for example, curves or straight stretches, curvature of curves, etc.), positional information on the intersections and turn-off points, information on the road types, etc.
- the storage device 50 stores a road map designed for automated driving.
- This automated driving road map is prepared by the electronic control unit 80 based on the 3D images generated by the LIDAR device 11 and is constantly or periodically updated by the electronic control unit 80.
- the HMI 60 is an interface for input and output of information between the driver or a vehicle passenger and the automated driving system 100 .
- the HMI 60 according to the present embodiment is provided with an information device 61 for providing the driver with various types of information, a microphone 62 for recognizing the voice of the driver, and a touch panel or operating buttons or other input operating device 63 for the driver to perform an input operation.
- the information device 61 is provided with a display 611 for displaying text information or image information and a speaker 612 for generating sound.
- the navigation system 70 is a system for guiding the host vehicle 1 to a destination set by the driver through the HMI 60 .
- the navigation system 70 sets the driving route until the destination based on the current position information of the host vehicle 1 detected by the GPS receiver 24 and map information of the map database 40 and sends information relating to the set driving route as navigation information to the electronic control unit 80 .
- the electronic control unit 80 is a microcomputer comprised of components connected with each other by a bidirectional bus such as a central processing unit (CPU), read only memory (ROM), random access memory (RAM), input port, and output port.
- The electronic control unit 80 is provided with an automated driving control part 90 and is configured to be able to automatically perform driving operations relating to acceleration, steering, and braking when the driver switches from the manual driving mode (in which the driver performs these driving operations) to the automated driving mode.
- The automated driving control part 90 is provided with a target driving route setting part 91, a target driving line setting part 92, and a driving operation part 93.
- the target driving route setting part 91 sets the target driving route of the vehicle in the automated driving mode. Specifically, the target driving route setting part 91 sets the driving route up to the destination included in the navigation information as the target driving route.
- The target driving line setting part 92 sets the target driving line for driving along a driving lane on the target driving route. Specifically, it sets as the target driving line a line that allows the road ahead of the host vehicle to be traveled at a speed suited to the road situation (degree of congestion, road shape, road surface conditions, etc.), based on obstacle information for the host vehicle's surroundings (vehicles ahead, fallen objects, etc.), road information for the road ahead such as the width of the driving lane and the road shape, and speed information of the host vehicle.
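The "suitable speed corresponding to the road situation" is not specified further here; as one hedged illustration of such a mapping, with all inputs, weights, and thresholds invented for the example:

```python
def suitable_speed(limit_kmh: float, congestion: float,
                   curvature_per_m: float, wet: bool) -> float:
    """Pick a target speed for the road ahead from the road situation.
    congestion is 0..1; curvature is 1/m; values are illustrative only."""
    speed = limit_kmh
    speed *= 1.0 - 0.5 * min(max(congestion, 0.0), 1.0)  # slow down in traffic
    if curvature_per_m > 0.01:   # a sharp curve caps the speed
        speed = min(speed, 40.0)
    if wet:
        speed *= 0.8             # extra margin on a wet road surface
    return max(speed, 0.0)
```

A real target-line planner would of course couple speed with the geometric line itself; this only shows the kind of situation-to-speed reduction the paragraph describes.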
- the driving operation part 93 automatically performs driving operations relating to acceleration, steering, and braking so that the vehicle runs along the target driving line. Specifically, the driving operation part 93 controls the various control parts required for driving operations relating to acceleration, steering, and braking based on the surrounding environment information, host vehicle information, and, as necessary, driver information and various other types of information and automatically performs driving operations of the vehicle.
- the electronic control unit 80 is provided with a driving assistance part 94 in addition to the automated driving control part 90 and is configured to be able to automatically perform various types of driving assistance aimed at securing the safety of the driver during the manual driving mode and automated driving mode.
- To provide such driving assistance, the driving assistance part 94 is provided with a driver condition monitoring part 95 for monitoring the condition of the driver based on the image captured by the driver monitor camera 31, that is, the above-mentioned external appearance information of the driver. It is configured, for example, to prompt the driver to pay attention when the driver is driving distracted or is lax in monitoring the surroundings, and otherwise to perform driving assistance suited to the driver's condition.
- The driver condition monitoring part 95 is configured to first detect (learn) the positions of the driver's two eyes or nose, the distance between the two eyes, etc. based on the image captured by the driver monitor camera 31 (that is, the external appearance information of the driver), set an image of the driver's face as a reference (reference image), and then compare the reference image with the face images captured at any moment by the driver monitor camera 31 to monitor driver conditions such as the direction in which the driver's face or line of sight is facing.
- The driver condition monitoring part 95 is provided with a reference image setting part 951 for setting a reference image of the face of the driver based on the image captured by the driver monitor camera 31 and a monitoring part 952 for monitoring the condition of the driver based on the captured image and the reference image.
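The learn-then-compare procedure can be sketched as follows. The landmark inputs, the 0.9 ratio threshold, and the geometry are illustrative assumptions; an actual implementation would obtain the landmarks from a face detector:

```python
import math

def learn_reference(landmarks: dict) -> dict:
    """Set the reference from a frontal frame: eye positions, nose position,
    and the inter-eye distance, as the description above lists."""
    lx, ly = landmarks["left_eye"]
    rx, ry = landmarks["right_eye"]
    return {
        "eye_distance": math.hypot(rx - lx, ry - ly),
        "eye_midpoint_x": (lx + rx) / 2.0,
        "nose": landmarks["nose"],
    }

def face_direction(reference: dict, landmarks: dict) -> str:
    """Compare a current frame against the reference: when the head yaws,
    the apparent inter-eye distance shrinks and the nose shifts sideways."""
    lx, ly = landmarks["left_eye"]
    rx, ry = landmarks["right_eye"]
    ratio = math.hypot(rx - lx, ry - ly) / reference["eye_distance"]
    if ratio > 0.9:
        return "front"
    nose_x = landmarks["nose"][0]
    return "left" if nose_x < reference["eye_midpoint_x"] else "right"
```

The same comparison fails cleanly when no landmarks are found at all, which is the "face can no longer be recognized" case handled next.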
- As explained above, the reference image need only be set once when the driver changes. While the driver's condition is being monitored, however, the driver may change posture so that his or her face temporarily leaves the field of view of the driver monitor camera 31, or an object or the driver's hand may temporarily conceal the face.
- In such cases the driver's face temporarily becomes unable to be recognized. In addition, in a vehicle in which automated driving is performed, the driver may conceivably be changed during the automated driving, and in this case as well the face temporarily becomes unable to be recognized.
- If the driver has in fact been changed and the reference image is not reset, the driver's condition is liable to be monitored against the reference image of the previous driver and, as a result, cannot be monitored normally.
- Therefore, the driver condition monitoring part 95 is configured to judge whether the driver has been changed when the face of the driver can no longer be recognized while the driver's condition is being monitored, and to reset the reference image of the face of the driver only when judging that the driver has been changed.
- Specifically, the driver condition monitoring part 95 is further provided with a first judging part 953 for judging whether the face of the driver can no longer be recognized based on the image captured by the driver monitor camera, a second judging part 954 for judging whether the driver has been changed when the face of the driver can no longer be recognized, and a reference image resetting part 955 for resetting the reference image of the face of the driver based on the image captured by the driver monitor camera when it is judged that the driver has been changed.
- FIG. 4 is a flow chart for explaining the driver monitoring control according to the present embodiment.
- The electronic control unit 80 repeatedly performs the present routine at a predetermined processing interval during operation of the vehicle.
- In step S1, the electronic control unit 80 acquires external appearance information of the driver.
- In step S2, the electronic control unit 80 judges whether a reference image setting flag F1 has been set to "1".
- The reference image setting flag F1 is a flag whose initial value is "0"; it is set to "1" when the reference image is first set or is reset.
- The electronic control unit 80 proceeds to the processing of step S3 if the reference image setting flag F1 is "0".
- The electronic control unit 80 proceeds to the processing of step S7 if the reference image setting flag F1 is "1".
- In step S3, the electronic control unit 80 judges whether the reference image can be set. Specifically, it judges whether the positions of the driver's two eyes or nose, the distance between the two eyes, etc. can be detected from the external appearance information of the driver, that is, whether the face of the driver can be recognized based on that information. If the face can be recognized, the electronic control unit 80 judges that the reference image can be set and proceeds to the processing of step S4.
- The electronic control unit 80 proceeds to the processing of step S6 when the face of the driver cannot be recognized based on the external appearance information, for example, because the face does not fit inside the field of view of the driver monitor camera 31 or part of the face is concealed behind a hand or an object.
- In step S4, the electronic control unit 80 sets the image of the face of the driver recognized from the external appearance information as the reference image of the driver.
- In step S5, the electronic control unit 80 sets the reference image setting flag F1 to "1".
- In step S6, the electronic control unit 80 provides the driver with text or voice information such as "Please turn your face to the camera" through, for example, the information device 61, prompting the driver to let the driver monitor camera 31 capture his or her face.
- In step S7, the electronic control unit 80 recognizes the face of the driver based on the external appearance information of the driver.
- In step S8, the electronic control unit 80 judges whether the face of the driver could be recognized.
- The electronic control unit 80 proceeds to the processing of step S9 if it could recognize the face of the driver based on the external appearance information of the driver.
- The electronic control unit 80 proceeds to the processing of step S10 if it could not recognize the face of the driver.
- the electronic control unit 80 compares the reference image and the image of the face of the driver recognized based on the external appearance information of the driver so as to monitor the driver condition.
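The comparison in this monitoring step can be pictured with a small sketch. The patent does not specify the comparison algorithm, so the landmark dictionaries, the eye-distance cue, and the 0.85 threshold below are illustrative assumptions only:

```python
# Illustrative only: a shrinking apparent distance between the eyes,
# relative to the reference image, is used here as a rough cue that the
# driver's face has turned away from the camera.

def eye_distance(landmarks):
    """Euclidean distance between the two detected eye positions."""
    (lx, ly), (rx, ry) = landmarks["left_eye"], landmarks["right_eye"]
    return ((rx - lx) ** 2 + (ry - ly) ** 2) ** 0.5

def is_facing_forward(reference, current, threshold=0.85):
    """Compare the current face landmarks against the reference image's."""
    ratio = eye_distance(current) / eye_distance(reference)
    return ratio >= threshold
```

In practice the monitoring part would derive richer cues (nose position, line of sight) from the same reference comparison, but the ratio test conveys the idea.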
- the electronic control unit 80 judges if the vehicle speed is a predetermined vehicle speed Vp (for example 10 km/h) or more.
- the electronic control unit 80 can judge that the possibility of the driver having been changed is low if the vehicle speed is the predetermined vehicle speed Vp or more, since the vehicle runs at a certain vehicle speed not only during manual driving but also during automated driving, and the driver is not normally changed while the vehicle is moving. In this case, it ends the current processing so as not to reset the reference image.
- the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S11 if the vehicle speed is less than the predetermined vehicle speed Vp.
- the electronic control unit 80 judges if the shift position is a running range (D range or R range). This is because a driver will generally shift to the parking range (P range) or the neutral range (N range) before changing drivers, so the possibility of a driver taking over while the shift position is in a running range is considered to be low. Therefore, the electronic control unit 80 judges that the possibility of the driver having been changed is low and ends the current processing so as not to reset the reference image if the shift position is a running range. On the other hand, the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S12 if the shift position is not a running range.
- the electronic control unit 80 judges if the driver is wearing his or her seatbelt based on the seatbelt fastening information.
- the electronic control unit 80 judges that the possibility of the driver having been changed is low and ends the current processing so that the reference image is not reset if the driver is wearing his or her seatbelt.
- the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S13 if the driver is not wearing his or her seatbelt.
- the electronic control unit 80 judges if the door has been operated based on the door operating information.
- the electronic control unit 80 judges that the possibility of the driver having been changed is low and ends the current processing so that the reference image is not reset if the door has not been operated.
- the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S14 if the door has been operated.
- at step S14, the electronic control unit 80 judges if the driver is sitting in the seat based on the sitting information.
- the electronic control unit 80 judges that the possibility of the driver having been changed is low if the driver is sitting in the seat and ends the current processing so that the reference image is not reset.
- the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S15 if the driver is not sitting in the seat.
- at step S15, the electronic control unit 80 judges if the seat position has been changed based on the seat position information. This is because when the driver has been changed, the possibility of the seat position having been changed is high. The electronic control unit 80 judges that the possibility of the driver having been changed is low and ends the current processing so that the reference image is not reset if the seat position has not been changed. On the other hand, the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S16 if the seat position has been changed.
- at step S16, the electronic control unit 80 returns the reference image setting flag F1 to “0” to reset the reference image, since the possibility of the driver having been changed cannot be ruled out.
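The chain of judgments in steps S10 to S15 can be summarized in a short sketch. The `Vehicle` fields and the 10 km/h value for Vp are hypothetical stand-ins for the sensor information named above, not part of the disclosure:

```python
# Sketch of steps S10-S15: the reference image is reset (step S16) only
# when every check leaves open the possibility that the driver was changed.
from dataclasses import dataclass

@dataclass
class Vehicle:
    speed_kmh: float
    shift_position: str        # "P", "R", "N", or "D"
    seatbelt_fastened: bool
    door_operated: bool
    driver_sitting: bool
    seat_position_changed: bool

VP_KMH = 10.0  # predetermined vehicle speed Vp of step S10

def should_reset_reference(v: Vehicle) -> bool:
    """Return True (step S16: reset the reference image) only when no
    check rules out the possibility that the driver has been changed."""
    if v.speed_kmh >= VP_KMH:           # S10: vehicle moving -> low possibility
        return False
    if v.shift_position in ("D", "R"):  # S11: running range -> low possibility
        return False
    if v.seatbelt_fastened:             # S12: belt still fastened
        return False
    if not v.door_operated:             # S13: door was never operated
        return False
    if v.driver_sitting:                # S14: seat stayed occupied
        return False
    if not v.seat_position_changed:     # S15: seat position unchanged
        return False
    return True                         # S16: possibility cannot be ruled out
```

As the text notes below, the order of the checks may be changed and any subset of them suffices, so the early-return chain is only one possible arrangement.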
- an electronic control unit 80 for controlling a vehicle provided with a driver monitor camera 31 capturing an image of the face of the driver, comprising a reference image setting part 951 setting a reference image of the face of the driver based on the image captured by the driver monitor camera 31, a monitoring part 952 monitoring a condition of the driver based on the image captured by the driver monitor camera 31 and the reference image, a first judging part 953 judging if the face of the driver can no longer be recognized when monitoring the condition of the driver, a second judging part 954 judging if the driver has been changed when the face of the driver can no longer be recognized, and a reference image resetting part 955 resetting the reference image of the face of the driver based on the image captured by the driver monitor camera 31 when it is judged that the driver has been changed.
- in the present embodiment, it is judged if the driver has been changed, and the reference image of the face of the driver is reset only when it is judged that the driver has been changed, so it is possible to prevent the reference image from being reset, and the driver condition from becoming unable to be monitored during the resetting period, despite the driver not having been changed.
- the second judging part 954 can be configured so as to judge that the driver has not been changed when the face of the driver can no longer be recognized if, for example, the vehicle speed is a predetermined vehicle speed Vp or more. Further, in addition to this, it may, for example, be configured so as to judge that the driver has not been changed when the driver is wearing his or her seatbelt, when the shift position is in a running range, when the door has not been operated, when the driver is sitting in the seat, or when the seat position has not been changed.
- whether or not the driver has been changed was judged in steps S10 to S15 of the flow chart of FIG. 4, but the order of these judgments may be changed as appropriate. Further, it is sufficient to perform at least one of the processings of steps S10 to S15. Furthermore, other processing for judging if the driver has been changed may be added.
Abstract
Description
- This application claims priority based on Japanese Patent Application No. 2017-225751 filed with the Japan Patent Office on Nov. 24, 2017, the entire contents of which are incorporated into the present specification by reference.
- The present disclosure relates to a control system for a vehicle.
- The specification of U.S. Pat. No. 8,587,440 discloses a conventional control system for a vehicle configured to use a driver monitor camera to capture an image of a driver and recognize the face of the driver based on the captured image.
- To monitor the condition of a driver based on the image of the driver captured by a driver monitor camera, it is necessary to first detect (learn) the positions of the two eyes or the position of the nose of the driver, the distance between the two eyes, etc. based on the image of the driver captured by the driver monitor camera, set the resulting image of the face of the driver as a reference (reference image), and then compare that reference image with the images of the driver captured at any time by the driver monitor camera. In this way, driver conditions such as the direction in which the face of the driver is oriented are monitored.
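As a rough illustration of this learning step, the following sketch detects the eye and nose positions and records the distance between the two eyes; the `detect_landmarks` callable and the returned field names are hypothetical, since the disclosure does not name a particular detector:

```python
# Hypothetical sketch of setting the reference image: the detector and
# field names are illustrative assumptions, not part of the disclosure.

def try_set_reference(frame, detect_landmarks):
    """Detect the two eyes and the nose; return a reference record,
    or None when the face cannot be recognized in this frame."""
    landmarks = detect_landmarks(frame)  # e.g. {"left_eye": (x, y), ...}
    if landmarks is None:
        return None  # face not recognizable -> reference cannot be set
    (lx, ly) = landmarks["left_eye"]
    (rx, ry) = landmarks["right_eye"]
    eye_dist = ((rx - lx) ** 2 + (ry - ly) ** 2) ** 0.5
    return {"landmarks": landmarks, "eye_distance": eye_dist}
```

The returned record plays the role of the reference image against which later frames are compared.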
- This reference image of the face of the driver need only be set one time when the driver changes, but while the driver condition is being monitored, sometimes the driver will change his or her posture resulting in the face of the driver ending up temporarily departing from the inside of the field of view of the driver monitor camera or resulting in the face of the driver ending up temporarily concealed by an object or the hand of the driver.
- In such a case, the face of the driver will temporarily no longer be able to be recognized. Further, in the case of a vehicle driven by automated driving, for example, the driver may conceivably be changed during the automated driving. For this reason, in the above-mentioned conventional control system for a vehicle, it is not possible to judge whether a temporary failure to recognize the face of the driver was caused by the driver having been changed or merely by a change of posture etc. of a driver who has not been changed.
- To set the reference image, as explained above, it is necessary to detect the positions of the two eyes or the position of the nose of the driver, the distance between the two eyes, etc., so a certain amount of time is required. A change in posture etc. can cause the face of the driver to temporarily become unrecognizable even though the driver has not been changed. If, because of this, the system were configured to set a new reference image whenever the driver monitor camera next captures the face of the driver, there is the problem that the driver condition can no longer be monitored during the period when the reference image is being set.
- The present disclosure was made focusing on this problem and has as its object to keep the reference image from being reset, and the driver condition from becoming unable to be monitored during the resetting period, despite the driver not having been changed.
- To solve the above technical problem, according to one aspect of the present disclosure, there is provided a control system for controlling a vehicle provided with a driver monitor camera capturing an image of a face of a driver, comprising a reference image setting part configured so as to set a reference image of the face of the driver based on the image captured by the driver monitor camera, a monitoring part configured so as to monitor a condition of the driver based on the image captured by the driver monitor camera and the reference image, a first judging part configured so as to judge if the face of the driver can no longer be recognized when monitoring the condition of the driver, a second judging part configured so as to judge if the driver has been changed when the face of the driver can no longer be recognized, and a reference image resetting part configured to reset the reference image of the face of the driver based on the image captured by the driver monitor camera when it is judged that the driver has changed.
- According to this aspect of the present disclosure, it is possible to keep a reference image from being reset and the driver condition from no longer being able to be monitored during the reset period despite the driver not having been changed.
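As a structural illustration only, the parts enumerated in this aspect can be arranged as in the following sketch; the callables stand in for the face recognition and driver-change judgment logic, which the claim leaves abstract:

```python
# Structural sketch of the claimed parts (951-955); the injected callables
# are placeholders, not a prescribed implementation.

class DriverConditionMonitor:
    def __init__(self, recognize_face, driver_changed):
        self.recognize_face = recognize_face  # image -> face data, or None
        self.driver_changed = driver_changed  # () -> bool
        self.reference = None

    def on_frame(self, frame):
        face = self.recognize_face(frame)
        if face is None:                      # first judging part 953
            if self.driver_changed():         # second judging part 954
                self.reference = None         # reference image resetting part 955
            return None
        if self.reference is None:
            self.reference = face             # reference image setting part 951
        return (self.reference, face)         # monitoring part 952 compares these
```

Because the reference is cleared only when `driver_changed()` reports a change, a brief recognition failure from a posture change leaves the existing reference, and monitoring, intact.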
- FIG. 1 is a schematic view of the configuration of an automated driving system for a vehicle according to one embodiment of the present disclosure.
- FIG. 2 is a schematic external view of a host vehicle mounting an automated driving system according to one embodiment of the present disclosure.
- FIG. 3 is a schematic internal view of a host vehicle mounting an automated driving system according to one embodiment of the present disclosure.
- FIG. 4 is a flow chart for explaining driver condition monitoring control according to one embodiment of the present disclosure.
- Below, referring to the drawings, embodiments of the present invention will be explained in detail. Note that, in the following explanation, similar component elements will be assigned the same reference notations.
- FIG. 1 is a schematic view of the configuration of an automated driving system 100 of a vehicle according to one embodiment of the present invention. FIG. 2 is a schematic view of the appearance of a host vehicle 1 mounting the automated driving system 100 according to the present embodiment. FIG. 3 is a schematic view of the inside of the host vehicle 1 mounting the automated driving system 100 according to the present embodiment.
- As shown in FIG. 1, the automated driving system 100 according to the present embodiment is provided with a surrounding environment information acquiring device 10, a host vehicle information acquiring device 20, a driver information acquiring device 30, a map database 40, a storage device 50, a human-machine interface (below, referred to as an “HMI”) 60, a navigation system 70, and an electronic control unit 80.
- The surrounding environment information acquiring device 10 is a device for acquiring information relating to obstacles in the surroundings of the host vehicle (for example, buildings, moving vehicles such as vehicles in front of it and in back of it on the road and oncoming vehicles, stopped vehicles, the curb, fallen objects, pedestrians, etc.) and the weather and other such surrounding environmental conditions of the host vehicle 1 (below, referred to as the “surrounding environment information”). As shown in FIG. 1 to FIG. 3, the surrounding environment information acquiring device 10 according to the present embodiment is provided with a LIDAR (laser imaging detection and ranging) device 11, milliwave radar sensors 12, an external camera 13, an illuminance sensor 14, a rain sensor 15, and an outside information receiving device 16.
- The LIDAR
device 11 uses laser beams to detect the road and obstacles in the host vehicle surroundings. As shown in FIG. 2, in the present embodiment, the LIDAR device 11 is, for example, attached to the roof of the host vehicle 1. The LIDAR device 11 successively fires laser beams toward the overall surroundings of the host vehicle 1 and measures the distances to the road and host vehicle surroundings from the reflected light. Further, the LIDAR device 11 uses the results of measurement as the basis to generate 3D images of the road and obstacles in the overall surroundings of the host vehicle 1 and sends information of the generated 3D images to the electronic control unit 80.
- Note that, the locations of attachment and number of units of the LIDAR device 11 are not particularly limited so long as the information necessary for generating a three-dimensional image can be acquired. For example, they may also be divided and attached to the grilles or to the insides of the headlights or brake lights and other such lights of the host vehicle 1 or may be divided and attached to parts of the body (frame) of the host vehicle 1.
- The
milliwave radar sensors 12 utilize electromagnetic waves to detect obstacles in the host vehicle surroundings at a farther distance than the LIDAR device 11. As shown in FIG. 2, in the present embodiment, the milliwave radar sensors 12, for example, are attached to the front bumper and rear bumper of the host vehicle 1. The milliwave radar sensors 12 emit electromagnetic waves to the surroundings of the host vehicle 1 (in the present embodiment, the front, rear, and sides of the host vehicle 1) and use the reflected waves to measure the distances to obstacles in the host vehicle surroundings and the relative speed with the obstacles. Further, the milliwave radar sensors 12 send the results of measurement as host vehicle surrounding information to the electronic control unit 80.
- Note that, the locations of attachment and number of the milliwave radar sensors 12 are not particularly limited so long as the necessary host vehicle surrounding information can be acquired. For example, they may also be attached to the grilles or to the insides of the headlights or brake lights and other such lights of the host vehicle 1 or may be attached to parts of the body (frame) of the host vehicle 1.
- The
external camera 13 captures an image of the area in front of the host vehicle 1. As shown in FIG. 2, in the present embodiment, the external camera 13 is, for example, attached to the center part of the front of the roof of the host vehicle 1. The external camera 13 processes the captured image of the area in front of the host vehicle to detect information on obstacles in front of the host vehicle, the width of the lane of the road driven on and the road shape, road signs, white lines, the state of traffic lights, and other road information in the area in front of the host vehicle, the yaw angle (relative direction of the vehicle with respect to the lane driven on), the offset position of the vehicle from the center of the lane driven on, and other such driving information of the host vehicle 1, rain or snow or fog and other such weather information of the host vehicle surroundings, etc. Further, the external camera 13 sends the detected image information to the electronic control unit 80.
- Note that, the locations of attachment and number of the external cameras 13 are not particularly limited so long as the area in front of the host vehicle 1 can be captured. For example, they may also be attached to the upper center part of the inner surface of the windshield inside the host vehicle.
- The
illuminance sensor 14 detects the illuminance in the host vehicle surroundings. As shown in FIG. 2, in the present embodiment, the illuminance sensor 14 is, for example, attached to the top surface of the instrument panel of the host vehicle. The illuminance sensor 14 sends the detected illuminance information of the host vehicle surroundings to the electronic control unit 80.
- The rain sensor 15 detects the presence of rainfall and the amount of rainfall. As shown in FIG. 2, in the present embodiment, the rain sensor 15 is, for example, attached to the top of the center of the front surface of the front glass of the host vehicle 1. The rain sensor 15 fires light generated by a built-in light emitting diode toward the front surface of the front glass and measures the change in the reflected light at that time so as to detect the presence of rainfall, the amount of rainfall, and other rainfall information. Further, the rain sensor 15 sends the detected rainfall information to the electronic control unit 80.
- The outside information receiving device 16, for example, receives congestion information, weather information (rain, snow, fog, wind speed, and other information), and other outside information sent from a traffic information communication system center or other outside communication center. The outside information receiving device 16 sends the received outside information to the electronic control unit 80.
- The host vehicle
information acquiring device 20 is a device for acquiring information relating to the state of the host vehicle 1 such as the speed or acceleration, posture, and current position of the host vehicle 1 (below, referred to as the “host vehicle information”). As shown in FIG. 1, the host vehicle information acquiring device 20 according to the present embodiment is provided with a vehicle speed sensor 21, an acceleration sensor 22, a yaw rate sensor 23, a GPS receiver 24, and a door operation sensor 25.
- The vehicle speed sensor 21 is a sensor for detecting the speed of the host vehicle 1. The vehicle speed sensor 21 sends the detected vehicle speed information of the host vehicle 1 to the electronic control unit 80.
- The acceleration sensor 22 is a sensor for detecting the acceleration of the host vehicle 1 at the time of accelerating or the time of braking. The acceleration sensor 22 sends the detected acceleration information of the host vehicle 1 to the electronic control unit 80.
- The yaw rate sensor 23 is a sensor for detecting the posture of the host vehicle 1. More specifically, it detects the speed of change of the yaw angle at the time the host vehicle 1 turns, that is, the rotational angular speed (yaw rate) about the vertical axis of the host vehicle 1. The yaw rate sensor 23 sends the detected posture information of the host vehicle 1 to the electronic control unit 80.
- The GPS receiver 24 receives signals from three or more GPS satellites to identify the longitude and latitude of the host vehicle 1 and detect the current position of the host vehicle 1. The GPS receiver 24 sends the detected current position information of the host vehicle 1 to the electronic control unit 80.
- The door operation sensor 25 is a sensor for detecting operation of a door of the host vehicle 1. The door operation sensor 25 sends the detected door operation information to the electronic control unit 80.
- The driver
information acquiring device 30 is a device for acquiring information relating to the condition of the driver of the host vehicle 1 (below, referred to as the “driver information”). As shown in FIG. 1 and FIG. 3, the driver information acquiring device 30 according to the present embodiment comprises a driver monitor camera 31, a steering wheel touch sensor 32, a sitting sensor 33, a seatbelt buckle sensor 34, and a seat position sensor 35.
- The driver monitor camera 31 is attached to the top surface of the steering column and captures the external appearance of the driver. The driver monitor camera 31 processes the captured video of the driver by image processing to thereby detect the external appearance information of the driver such as the expression or posture of the driver. Further, the driver monitor camera 31 sends the detected external appearance information of the driver to the electronic control unit 80.
- The steering wheel touch sensor 32 is attached to the steering wheel. The steering wheel touch sensor 32 detects whether the driver is gripping the steering wheel and sends the detected information on the gripping of the steering wheel to the electronic control unit 80.
- The sitting sensor 33 is provided at the bottom of the surface of the seat. The sitting sensor 33 detects the load applied to the surface of the seat and sends sitting information indicating if the driver is sitting on the seat to the electronic control unit 80.
- The seatbelt buckle sensor 34 is housed inside the seatbelt buckle and sends seatbelt fastening information indicating if the seatbelt is connected to the seatbelt buckle, that is, if the driver has fastened his or her seatbelt, to the electronic control unit 80.
- The seat position sensor 35 is a sensor detecting the seat position (front-back position of the seat) and sends seat position information indicating if the driver has moved the seat to the electronic control unit 80.
- The
map database 40 is a database relating to map information. This map database 40 is, for example, stored in a hard disk drive (HDD) mounted in the vehicle. The map information includes positional information on the roads, information on the road shapes (for example, curves or straight stretches, curvature of curves, etc.), positional information on the intersections and turn-off points, information on the road types, etc.
- The storage device 50 stores a road map designed for automated driving. The automated driving use road map is prepared by the electronic control unit 80 based on the 3D images generated by the LIDAR device 11 and is constantly or periodically updated by the electronic control unit 80.
- The HMI 60 is an interface for input and output of information between the driver or a vehicle passenger and the automated driving system 100. The HMI 60 according to the present embodiment is provided with an information device 61 for providing the driver with various types of information, a microphone 62 for recognizing the voice of the driver, and a touch panel, operating buttons, or other input operating device 63 for the driver to perform an input operation.
- The information device 61 is provided with a display 611 for displaying text information or image information and a speaker 612 for generating sound.
- The navigation system 70 is a system for guiding the host vehicle 1 to a destination set by the driver through the HMI 60. The navigation system 70 sets the driving route to the destination based on the current position information of the host vehicle 1 detected by the GPS receiver 24 and the map information of the map database 40 and sends information relating to the set driving route as navigation information to the electronic control unit 80.
- The
electronic control unit 80 is a microcomputer comprised of components connected with each other by a bidirectional bus, such as a central processing unit (CPU), read only memory (ROM), random access memory (RAM), an input port, and an output port.
- The electronic control unit 80 is provided with an automated driving control part 90 and is configured to be able to automatically perform driving operations relating to acceleration, steering, and braking to make the vehicle run as automated driving when the driver switches from the manual driving mode (a mode where driving operations relating to acceleration, steering, and braking are performed by the driver) to the automated driving mode. Specifically, the automated driving control part 90 is configured provided with a target driving route setting part 91, a target driving line setting part 92, and a driving operation part 93.
- The target driving route setting part 91 sets the target driving route of the vehicle in the automated driving mode. Specifically, the target driving route setting part 91 sets the driving route up to the destination included in the navigation information as the target driving route.
- The target driving line setting part 92 sets the target driving line when driving along a driving lane on the target driving route. Specifically, the target driving line setting part 92 sets as the target driving line the driving line enabling the road in front of the host vehicle to be traveled at a suitable speed corresponding to the road situation (degree of congestion, road shape, road surface conditions, etc.) based on obstacle information of the surroundings of the host vehicle (information on vehicles in front, fallen objects, etc.), road information on the road in front of the host vehicle such as the width of the driving lane or the road shape, and speed information of the host vehicle.
- The driving operation part 93 automatically performs driving operations relating to acceleration, steering, and braking so that the vehicle runs along the target driving line. Specifically, the driving operation part 93 controls the various control parts required for driving operations relating to acceleration, steering, and braking based on the surrounding environment information, the host vehicle information, and, as necessary, the driver information and various other types of information, and automatically performs the driving operations of the vehicle.
- Further, the
electronic control unit 80 is provided with a driving assistance part 94 in addition to the automated driving control part 90 and is configured to be able to automatically perform various types of driving assistance aimed at securing the safety of the driver during the manual driving mode and the automated driving mode.
- The driving assistance part 94 according to the present embodiment, for providing such driving assistance, is provided with a driver condition monitoring part 95 for monitoring the condition of the driver based on the image of the driver captured by the driver monitor camera 31, that is, the above-mentioned external appearance information of the driver, and, for example, is configured so as to be able to prompt the driver to pay attention when the driver is driving distracted, being lax in monitoring the surroundings, etc., and otherwise perform driving assistance suited to the condition of the driver.
- Here, the driver condition monitoring part 95 according to the present embodiment is configured to first detect (learn) the positions of the two eyes or the position of the nose of the driver, the distance between the two eyes, etc. based on the image captured by the driver monitor camera 31 (that is, the external appearance information of the driver), set an image of the face of the driver serving as a reference (reference image), and compare the reference image and the image of the face of the driver captured at any moment by the driver monitor camera 31 (that is, the external appearance information of the driver sent at any time) to thereby monitor in which direction the face or line of sight of the driver is facing and other driver conditions.
- That is, the driver condition monitoring part 95 is configured provided with a reference image setting part 951 for setting a reference image of the face of the driver based on the image captured by the driver monitor camera 31 and a monitoring part 952 for monitoring the condition of the driver based on the image captured by the driver monitor camera 31 and the reference image.
- Here, the reference image may be set just once when the driver changes, but while monitoring the driver condition, the driver may change his or her posture, resulting in the face of the driver temporarily leaving the field of view of the driver monitor camera 31, or an object or a hand of the driver may cause the face of the driver to be temporarily concealed.
- Therefore, if ending up configuring the driver
condition monitoring part 95 in the above way, that is, configuring it provided with only the referenceimage setting part 951 and monitoringpart 952, it would not be possible to judge if the reason why the face of the driver temporarily could no longer be recognized was due to a change in the posture of the driver etc. or was due to the driver having been changed. - If the reason why the face of the driver temporarily could no longer be recognized was due to the driver having been changed, unless resetting the reference image, the driver condition is liable to be monitored based on the reference image of the other driver of before the change. As a result, the driver condition is liable to be unable to be monitored normally.
- On the other hand, to set the reference image of the face of the driver, as explained above, it is necessary to detect the positions of the two eyes or position of the nose of the driver, the distance between the two eyes, etc., so a certain extent of time becomes necessary. For this reason, if configuring the system so as to set the reference image of the face of the driver when next captured by the
driver monitor camera 31 after a change in posture etc. causes the face of the driver to temporarily become unable to be recognized despite the driver not having been changed, it will no longer be possible to monitor the driver condition during the period for setting the reference image. - Therefore, in the present embodiment, the driver
condition monitoring part 95 was configured so as to judge if the driver has been changed when the face of the driver can no longer be recognized while monitoring the driver condition and so as to reset the reference image of the face of the driver only when judging that the driver has been changed. - That is, the driver
condition monitoring part 95 was configured to be further provided with a first judgingpart 953 for judging if the face of the driver can no longer be recognized based on the image captured by the driver monitor camera, a second judgingpart 954 for judging if the driver has been changed when the face of the driver can no longer be recognized, and a referenceimage resetting part 955 for resetting the reference image of the face of the driver based on the image captured by the driver monitor camera when it is judged that the driver has changed. Below, the driver condition monitoring control according to the present embodiment will be explained. -
FIG. 4 is a flow chart for explaining the driver monitoring control according to the present embodiment. The electronic control unit 80 repeatedly performs the present routine at a predetermined processing interval during operation of the vehicle. - At step S1, the
electronic control unit 80 acquires external appearance information of the driver. - At step S2, the
electronic control unit 80 judges if a reference image setting flag F1 has been set to “1”. The reference image setting flag F1 is a flag whose initial value is “0” and which is set to “1” when the reference image is first set or is reset. The electronic control unit 80 proceeds to the processing of step S3 if the reference image setting flag F1 is “0”. On the other hand, the electronic control unit 80 proceeds to the processing of step S7 if the reference image setting flag F1 is “1”. - At step S3, the
electronic control unit 80 judges if the reference image can be set. Specifically, the electronic control unit 80 judges if the positions of the two eyes or the position of the nose of the driver, the distance between the two eyes, etc. can be detected from the external appearance information of the driver, that is, if it can recognize the face of the driver based on the external appearance information of the driver. If the electronic control unit 80 can recognize the face of the driver based on the external appearance information of the driver, it judges that the reference image can be set and proceeds to the processing of step S4. On the other hand, the electronic control unit 80 proceeds to the processing of step S6 when the face of the driver cannot be recognized based on the external appearance information of the driver, for example, because the face of the driver does not fit inside the field of view of the driver monitor camera 31 or because part of the face of the driver is concealed behind a hand or an object. - At step S4, the
electronic control unit 80 sets the image of the face of the driver recognized based on the external appearance information of the driver as the reference image of the driver. - At step S5, the
electronic control unit 80 sets the reference image setting flag F1 to “1”. - At step S6, the
electronic control unit 80 provides the driver with text information or voice information such as “Please turn your face to the camera” through, for example, the information device 61, prompting the driver to have his or her face captured by the driver monitor camera 31. - At step S7, the
electronic control unit 80 recognizes the face of the driver based on the external appearance information of the driver. - At step S8, the
electronic control unit 80 judges if the face of the driver could be recognized. The electronic control unit 80 proceeds to the processing of step S9 if it could recognize the face of the driver based on the external appearance information of the driver. On the other hand, the electronic control unit 80 proceeds to the processing of step S10 if it could not recognize the face of the driver. - At step S9, the
electronic control unit 80 compares the reference image and the image of the face of the driver recognized based on the external appearance information of the driver so as to monitor the driver condition. - At step S10, the
electronic control unit 80 judges if the vehicle speed is a predetermined vehicle speed Vp (for example 10 km/h) or more. If the vehicle speed is the predetermined vehicle speed Vp or more, the vehicle is traveling at a certain speed, whether during manual driving or during automated driving, so the electronic control unit 80 judges that the possibility of the driver having been changed is low and ends the current processing without resetting the reference image. On the other hand, if the vehicle speed is less than the predetermined vehicle speed Vp, the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S11. - At step S11, the
electronic control unit 80 judges if the shift position is a running range (D range or R range). This is because a driver will generally shift to the parking range (P range) or neutral range (N range) when changing drivers, and the possibility of driving being taken over while in a running range is considered to be low. Therefore, the electronic control unit 80 judges that the possibility of the driver having been changed is low and ends the current processing without resetting the reference image if the shift position is a running range. On the other hand, the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S12 if the shift position is not a running range. - At step S12, the
electronic control unit 80 judges if the driver is wearing his or her seatbelt based on the seatbelt fastening information. The electronic control unit 80 judges that the possibility of the driver having been changed is low and ends the current processing without resetting the reference image if the driver is wearing his or her seatbelt. On the other hand, the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S13 if the driver is not wearing his or her seatbelt. - At step S13, the
electronic control unit 80 judges if the door has been operated based on the door operating information. The electronic control unit 80 judges that the possibility of the driver having been changed is low and ends the current processing without resetting the reference image if the door has not been operated. On the other hand, the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S14 if the door has been operated. - At step S14, the
electronic control unit 80 judges if the driver is sitting in the seat based on the sitting information. The electronic control unit 80 judges that the possibility of the driver having been changed is low and ends the current processing without resetting the reference image if the driver is sitting in the seat. On the other hand, the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S15 if the driver is not sitting in the seat. - At step S15, the
electronic control unit 80 judges if the seat position has been changed based on the seat position information. This is because when the driver has been changed, the possibility of the seat position having been changed is high. The electronic control unit 80 judges that the possibility of the driver having been changed is low and ends the current processing without resetting the reference image if the seat position has not been changed. On the other hand, the electronic control unit 80 judges that the possibility of the driver having been changed cannot be ruled out and proceeds to the processing of step S16 if the seat position has been changed. - At step S16, the
electronic control unit 80 returns the reference image setting flag F1 to “0” to reset the reference image, since the possibility of the driver having been changed cannot be ruled out. - According to the present embodiment explained above, there is provided an electronic control unit 80 (control system) for controlling a vehicle provided with a
driver monitor camera 31 capturing an image of a face of the driver, comprising a reference image setting part 951 setting a reference image of the face of the driver based on the image captured by the driver monitor camera 31, a monitoring part 952 monitoring a condition of the driver based on the image captured by the driver monitor camera 31 and the reference image, a first judging part 953 judging if the face of the driver can no longer be recognized when monitoring the condition of the driver, a second judging part 954 judging if the driver has been changed when the face of the driver can no longer be recognized, and a reference image resetting part 955 resetting the reference image of the face of the driver based on the image captured by the driver monitor camera 31 when it is judged that the driver has been changed. - In this way, according to the present embodiment, it is judged if the driver has been changed, and the reference image of the face of the driver is reset only when it is judged that the driver has been changed, so it is possible to prevent the reference image from being reset, and the driver condition from becoming unable to be monitored during the reset period, despite the driver not having been changed.
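As a rough illustration of the control flow of FIG. 4 summarized above, the routine can be sketched as follows. This is a hypothetical Python sketch; the data structures, signal names, and return values are assumptions for illustration, not the patent's implementation.

```python
# Hypothetical sketch of the FIG. 4 routine (steps S1 to S16).
# All names, thresholds, and data structures are illustrative assumptions.

VP_KMH = 10  # predetermined vehicle speed Vp (example value from the text)

def driver_monitoring_routine(ecu, appearance, vehicle):
    # S1: external appearance information is acquired (passed in as `appearance`)
    if ecu["flag_f1"] == 0:                      # S2: reference image not yet set
        face = appearance.get("face")
        if face is not None:                     # S3: face can be recognized
            ecu["reference_image"] = face        # S4: set the reference image
            ecu["flag_f1"] = 1                   # S5: mark the reference image as set
            return "reference_set"
        return "prompt_driver"                   # S6: "Please turn your face to the camera"

    face = appearance.get("face")                # S7: recognize the face
    if face is not None:                         # S8: recognition succeeded
        return "monitoring"                      # S9: compare with the reference image

    # S10 to S15: face lost; any condition below means a driver change is unlikely
    if (vehicle["speed_kmh"] >= VP_KMH           # S10: vehicle is moving
            or vehicle["shift"] in ("D", "R")    # S11: shift is in a running range
            or vehicle["seatbelt_fastened"]      # S12: seatbelt still worn
            or not vehicle["door_operated"]      # S13: door never operated
            or vehicle["driver_seated"]          # S14: driver still in the seat
            or not vehicle["seat_moved"]):       # S15: seat position unchanged
        return "keep_reference"
    ecu["flag_f1"] = 0                           # S16: reset the reference image
    return "reference_reset"
```

Note how resetting only flips the flag F1 back to “0”: the next pass through steps S2 to S5 then sets a fresh reference image for the new driver.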
- The second judging
part 954 can be configured to judge that the driver has not been changed when the face of the driver can no longer be recognized if, for example, the vehicle speed is a predetermined vehicle speed Vp or more. In addition, it may be configured to judge that the driver has not been changed when the driver is wearing his or her seatbelt, when the shift position is in a running range, when the door has not been operated, when the driver is sitting in the seat, or when the seat position has not been changed. - By configuring the second judging
part 954 in this way, it is possible to accurately judge whether the driver has been changed. - Above, embodiments of the present disclosure were explained, but the above embodiments show only some examples of application of the present disclosure. They are not intended to limit the technical scope of the present disclosure to the specific constitutions of the embodiments.
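The second judging part described above amounts to a single predicate over the vehicle-state signals: the driver is judged not to have been changed if any one of the listed conditions holds. The following Python sketch is illustrative only; the signal names are assumptions.

```python
# Hypothetical sketch of the second judging part 954. Each condition, when
# true, indicates that the driver has NOT been changed.
VP_KMH = 10  # predetermined vehicle speed Vp (example value from the text)

def driver_not_changed(state):
    return (state["speed_kmh"] >= VP_KMH       # vehicle speed is Vp or more
            or state["shift"] in ("D", "R")    # shift position is a running range
            or state["seatbelt_fastened"]      # seatbelt is still worn
            or not state["door_operated"]      # door has not been operated
            or state["driver_seated"]          # driver is still sitting in the seat
            or not state["seat_moved"])        # seat position has not been changed
```

Only when every condition fails, i.e. this predicate returns false, would the reference image be reset.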
- For example, in the above embodiments, whether or not the driver has been changed was judged from step S10 to step S15 of the flow chart of
FIG. 4, but the order of judgment may be suitably changed. Further, it is sufficient to perform at least one of the processing steps from step S10 to step S15. Furthermore, other processing for judging if the driver has been changed may be added.
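Because the judgments of steps S10 to S15 are independent of one another, they can be expressed as a list of checks that may be reordered, reduced to any non-empty subset, or extended with additional checks. A hypothetical sketch, with illustrative signal names:

```python
# Hypothetical sketch: each check returns True when it indicates that the
# driver has NOT been changed. The list may be reordered or subsetted.
CHECKS = [
    lambda s: s["speed_kmh"] >= 10,        # S10: vehicle speed Vp or more
    lambda s: s["shift"] in ("D", "R"),    # S11: running range
    lambda s: s["seatbelt_fastened"],      # S12: seatbelt worn
    lambda s: not s["door_operated"],      # S13: door not operated
    lambda s: s["driver_seated"],          # S14: driver seated
    lambda s: not s["seat_moved"],         # S15: seat position unchanged
]

def should_reset_reference(state, checks=CHECKS):
    # Reset the reference image only when no check indicates "driver not changed".
    return not any(check(state) for check in checks)
```

Adding a further judgment, for example one based on an ignition-off event, would simply append another callable to the list.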
Claims (7)
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2017225751A JP6996253B2 (en) | 2017-11-24 | 2017-11-24 | Vehicle control device |
JP2017-225751 | 2017-11-24 |
Publications (2)
Publication Number | Publication Date |
---|---|
US20190163997A1 true US20190163997A1 (en) | 2019-05-30 |
US10896338B2 US10896338B2 (en) | 2021-01-19 |
Family
ID=66634507
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US16/174,776 Active 2039-04-05 US10896338B2 (en) | 2017-11-24 | 2018-10-30 | Control system |
Country Status (3)
Country | Link |
---|---|
US (1) | US10896338B2 (en) |
JP (1) | JP6996253B2 (en) |
CN (1) | CN109987101A (en) |
Cited By (4)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210109540A1 (en) * | 2017-03-14 | 2021-04-15 | Aptiv Technologies Limited | Angle finding for a detector having a paired staggered array |
US11325609B2 (en) * | 2019-10-08 | 2022-05-10 | Subaru Corporation | Vehicle driving assist system |
US11373283B2 (en) * | 2018-05-31 | 2022-06-28 | Toyota Jidosha Kabushiki Kaisha | Object monitoring device |
US11443532B2 (en) * | 2018-12-28 | 2022-09-13 | Honda Motor Co., Ltd. | Capturing neutral face expression apparatus and method |
Families Citing this family (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP7363758B2 (en) * | 2020-12-23 | 2023-10-18 | 株式会社デンソー | Condition monitoring device and condition monitoring program |
CN113392718A (en) * | 2021-05-21 | 2021-09-14 | 海南师范大学 | Shared automobile travel management system and method based on block chain |
Citations (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20110068934A1 (en) * | 2009-09-22 | 2011-03-24 | Automotive Research & Test Center | Method and system for monitoring driver |
US20190122060A1 (en) * | 2016-05-30 | 2019-04-25 | Aisin Seiki Kabushiki Kaisha | Alarm device |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP3769442B2 (en) * | 2000-02-15 | 2006-04-26 | ナイルス株式会社 | Eye state detection device |
JP2002254955A (en) * | 2001-02-28 | 2002-09-11 | Matsushita Electric Ind Co Ltd | Travel warning guide device |
JP2002303526A (en) * | 2001-04-04 | 2002-10-18 | Matsushita Electric Ind Co Ltd | Travel guiding device |
US6496117B2 (en) * | 2001-03-30 | 2002-12-17 | Koninklijke Philips Electronics N.V. | System for monitoring a driver's attention to driving |
JP2008213585A (en) * | 2007-03-01 | 2008-09-18 | Toyota Motor Corp | Drunk-driving prevention device |
JP2009085728A (en) * | 2007-09-28 | 2009-04-23 | Hitachi Ltd | Rest period reporting device |
JP2009180705A (en) * | 2008-02-01 | 2009-08-13 | Aisin Aw Co Ltd | Navigation device, navigation method, and program |
JP2010204304A (en) * | 2009-03-02 | 2010-09-16 | Panasonic Corp | Image capturing device, operator monitoring device, method for measuring distance to face |
JP2011113503A (en) * | 2009-11-30 | 2011-06-09 | Aisin Aw Co Ltd | Driving support device |
JP6107595B2 (en) * | 2013-10-23 | 2017-04-05 | 株式会社デンソー | Condition monitoring device |
- 2017-11-24 JP JP2017225751A patent/JP6996253B2/en active Active
- 2018-10-30 US US16/174,776 patent/US10896338B2/en active Active
- 2018-11-23 CN CN201811405376.5A patent/CN109987101A/en active Pending
Also Published As
Publication number | Publication date |
---|---|
JP2019096117A (en) | 2019-06-20 |
JP6996253B2 (en) | 2022-01-17 |
US10896338B2 (en) | 2021-01-19 |
CN109987101A (en) | 2019-07-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US10915100B2 (en) | Control system for vehicle | |
US10579056B2 (en) | Control system for vehicle | |
US11327485B2 (en) | Control device for vehicle, and vehicle | |
US11529964B2 (en) | Vehicle automated driving system | |
US10896338B2 (en) | Control system | |
US10427690B2 (en) | Control system | |
US9747800B2 (en) | Vehicle recognition notification apparatus and vehicle recognition notification system | |
US10067506B2 (en) | Control device of vehicle | |
US10120378B2 (en) | Vehicle automated driving system | |
US9507345B2 (en) | Vehicle control system and method | |
US10099692B2 (en) | Control system for vehicle | |
US20180237008A1 (en) | Control device for vehicle | |
US20160137127A1 (en) | Road sign information display system and method in vehicle | |
US10232772B2 (en) | Driver assistance system | |
US10906542B2 (en) | Vehicle detection system which classifies valid or invalid vehicles | |
US11897482B2 (en) | Autonomous vehicle control for changing grip completion status in real-time to prevent unnecessary warning issuance | |
JP2018185555A (en) | Vehicle controller | |
US20230141584A1 (en) | Apparatus for displaying at least one virtual lane line based on environmental condition and method of controlling same | |
TWM482526U (en) | Navigation monitoring device |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
AS | Assignment |
Owner name: TOYOTA JIDOSHA KABUSHIKI KAISHA, JAPAN Free format text: ASSIGNMENT OF ASSIGNORS INTEREST;ASSIGNORS:MATSUMURA, TAKESHI;SATO, JUN;HARA, KENICHIROH;AND OTHERS;REEL/FRAME:047364/0109 Effective date: 20181010 |
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: LARGE ENTITY |
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
STPP | Information on status: patent application and granting procedure in general |
Free format text: PUBLICATIONS -- ISSUE FEE PAYMENT VERIFIED |
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |