WO2022030191A1 - In-vehicle device and driving assistance method

In-vehicle device and driving assistance method

Info

Publication number
WO2022030191A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
state
unit
driver
state information
Prior art date
Application number
PCT/JP2021/026182
Other languages
French (fr)
Japanese (ja)
Inventor
克志 古澤
Original Assignee
フォルシアクラリオン・エレクトロニクス株式会社
Priority date
Filing date
Publication date
Application filed by フォルシアクラリオン・エレクトロニクス株式会社 filed Critical フォルシアクラリオン・エレクトロニクス株式会社
Publication of WO2022030191A1

Classifications

    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 - Traffic control systems for road vehicles
    • G08G 1/16 - Anti-collision systems

Definitions

  • The present invention relates to an in-vehicle device and a driving support method.
  • Patent Document 1 discloses a driving support device that determines the state of other vehicles around the own vehicle, such as the presence and amount of acceleration or deceleration, the steering state, the lighting state, and the presence of obstacles, determines the possibility of road rage behavior against the own vehicle based on the determined state of the other vehicles, and, based on the determination result, performs notification control for the driver of the own vehicle and/or behavior control of the own vehicle.
  • The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an in-vehicle device and a driving support method capable of notifying the driver of a vehicle, before road rage behavior occurs, that the vehicle may be subjected to road rage.
  • The in-vehicle device of the present invention is an in-vehicle device mounted on a vehicle, and includes: a first state information acquisition unit that acquires running state information indicating the running state of the vehicle; a second state information acquisition unit that acquires driver state information indicating the state of the driver of the vehicle; a state determination unit that determines, based on the information acquired by the first state information acquisition unit and the second state information acquisition unit, whether or not the running of the vehicle is in a specific state that may pose a danger to other vehicles around the vehicle; an image acquisition unit that, when the state determination unit determines that the vehicle is in the specific state, acquires an image taken at the time of occurrence in which the surroundings of the vehicle are photographed; a distance acquisition unit that acquires distance information indicating the distance between the vehicle and other vehicles around the vehicle; a detection unit that, when another vehicle whose distance indicated by the distance information is equal to or less than a set value is detected, compares a captured image of the other vehicle with the image taken at the time of occurrence and detects the other vehicle from the image taken at the time of occurrence; and a notification control unit that, when the other vehicle is detected by the detection unit, causes a notification unit mounted on the vehicle to execute a first notification operation for alerting the driver.
  • According to the present invention, it is possible to notify the driver of the vehicle, before road rage behavior occurs, that the vehicle may be subjected to road rage.
  • FIG. 1 is a block diagram showing a configuration of a warning system.
  • FIG. 2 is a flowchart showing a procedure for detecting a specific state.
  • FIG. 3 is a flowchart showing the operation at the time of warning.
  • FIG. 1 is a block diagram showing a configuration of the driving support system 1.
  • Hereinafter, the vehicle equipped with the driving support system 1 is referred to as the own vehicle 5, and vehicles other than the own vehicle 5 are referred to as other vehicles.
  • The driving support system 1 includes a vehicle exterior photographing unit 10, a vehicle interior photographing unit 20, a sonar unit 30, a biosensor 40, a vehicle speed sensor 50, an acceleration sensor 60, an operation unit 70, a display unit 80, an audio output unit 90, and an in-vehicle device 100.
  • The vehicle exterior photographing unit 10 includes four cameras: a front camera 11, a rear camera 13, a left side camera 15, and a right side camera 17.
  • The front camera 11 photographs the area in front of the own vehicle 5.
  • The rear camera 13 photographs the area behind the own vehicle 5.
  • The left side camera 15 photographs the left side of the own vehicle 5.
  • The right side camera 17 photographs the right side of the own vehicle 5.
  • This embodiment describes a case where the vehicle exterior photographing unit 10 includes four cameras, but the number of cameras photographing the outside of the vehicle is arbitrary.
  • The front camera 11, the rear camera 13, the left side camera 15, and the right side camera 17 each include an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) and a data processing circuit that generates an image from the light receiving state of the image sensor.
  • The angles of view of the front camera 11, the rear camera 13, the left side camera 15, and the right side camera 17 are adjusted so that a 360° range around the own vehicle 5 can be photographed.
  • The front camera 11, the rear camera 13, the left side camera 15, and the right side camera 17 each photograph their shooting range at a predetermined frame rate to generate captured images.
  • A captured image from the vehicle exterior photographing unit 10 is hereinafter referred to as an image taken outside the vehicle.
  • The images taken outside the vehicle generated by each of the front camera 11, the rear camera 13, the left side camera 15, and the right side camera 17 are output to the in-vehicle device 100.
  • The in-vehicle device 100 temporarily stores the images taken outside the vehicle input from the vehicle exterior photographing unit 10 in the storage unit 110 included in the in-vehicle device 100.
  • The vehicle interior photographing unit 20 includes a camera 25 that photographs the interior of the own vehicle 5.
  • The camera 25 also includes an image sensor such as a CCD or CMOS and a data processing circuit that generates an image from the light receiving state of the image sensor.
  • The camera 25 is installed on, for example, the dashboard, the rear-view mirror, or the ceiling, and its orientation, angle of view, and installation position are set so that the driver seated in the driver's seat can be photographed.
  • The camera 25 photographs the interior of the vehicle at a predetermined frame rate to generate captured images.
  • A captured image taken by the vehicle interior photographing unit 20 is hereinafter referred to as an indoor photographed image.
  • The vehicle interior photographing unit 20 outputs the generated indoor photographed images to the in-vehicle device 100.
  • The in-vehicle device 100 temporarily stores the indoor photographed images input from the vehicle interior photographing unit 20 in the storage unit 110.
  • The sonar unit 30 is mounted at a plurality of locations, such as the front, rear, left side, and right side of the own vehicle 5, and uses ultrasonic waves to detect the direction and distance of obstacles such as other vehicles, pedestrians, and structures existing around the own vehicle 5.
  • The sonar unit 30 outputs sonar information indicating the direction and distance of an obstacle to the in-vehicle device 100.
  • The in-vehicle device 100 temporarily stores the sonar information input from the sonar unit 30 in the storage unit 110.
  • The biosensor 40 is a sensor that detects biometric information of the driver seated in the driver's seat.
  • The biometric information is information representing the state of biological phenomena such as heartbeat, respiration, brain waves, pulse, blood pressure, sweating, and body temperature.
  • This embodiment describes the case where the biosensor 40 is a heart rate sensor provided on the seat belt and the biometric information represents the driver's heart rate, but the biosensor 40 may instead detect other biometric information such as respiration, brain waves, pulse, blood pressure, sweating, or body temperature.
  • The biosensor 40 outputs the detected biometric information of each occupant to the in-vehicle device 100.
  • The in-vehicle device 100 temporarily stores the biometric information input from the biosensor 40 in the storage unit 110.
  • The vehicle speed sensor 50 is a sensor that detects the vehicle speed of the own vehicle 5.
  • The vehicle speed sensor 50 outputs vehicle speed information indicating the detected vehicle speed of the own vehicle 5 to the in-vehicle device 100.
  • The acceleration sensor 60 is a sensor that detects the acceleration of the own vehicle 5.
  • The acceleration sensor 60 outputs acceleration information indicating the detected acceleration of the own vehicle 5 to the in-vehicle device 100.
  • The in-vehicle device 100 temporarily stores the input vehicle speed information and acceleration information in the storage unit 110.
  • The operation unit 70 functions as a reception unit that accepts operations by the occupants.
  • The operation unit 70 includes hardware such as switches and buttons, and outputs an operation signal corresponding to the hardware that received the operation to the in-vehicle device 100.
  • The display unit 80 includes a touch panel 85.
  • The touch panel 85 includes a display panel, such as a liquid crystal panel or an organic EL panel, and a touch sensor that detects touch operations on the display panel.
  • A display image is displayed on the display panel under the control of the in-vehicle device 100.
  • The touch sensor detects the position on the display panel touched by an occupant of the own vehicle 5 and outputs coordinate information indicating the detected position to the in-vehicle device 100.
  • The audio output unit 90 includes an audio processing unit 91 and a speaker 93.
  • The audio processing unit 91 includes a D/A converter, a volume circuit, an amplifier circuit, and the like; it converts the audio data input from the in-vehicle device 100 from digital to analog with the D/A converter, adjusts the volume level with the volume circuit, amplifies the signal with the amplifier circuit, and outputs it as sound from the speaker 93.
  • The in-vehicle device 100 is a computer device including a storage unit 110 and a processor 130.
  • A "specific state" is a state in which the own vehicle 5 may pose a danger to other vehicles around the own vehicle 5.
  • When the in-vehicle device 100 detects the specific state of the own vehicle 5, it photographs the other vehicles traveling around the own vehicle 5 at that time, and when another vehicle whose distance from the own vehicle 5 is equal to or less than a preset set distance is later detected, it determines whether that other vehicle was traveling around the own vehicle 5 when the "specific state" was detected.
  • When the other vehicle was traveling around the own vehicle 5 at the time the "specific state" was detected, the in-vehicle device 100 executes a notification operation to alert the driver, thereby assisting the driver's driving operations.
  • The storage unit 110 includes non-volatile memory such as flash memory or an EEPROM (Electrically Erasable Programmable Read-Only Memory). The storage unit 110 may also include volatile memory such as RAM (Random Access Memory) in addition to the non-volatile memory.
  • The storage unit 110 stores a control program executed by the processor 130. The storage unit 110 also temporarily stores the images taken outside the vehicle, the indoor photographed images, the biometric information, the vehicle speed information, and the acceleration information.
  • The processor 130 is composed of a CPU (Central Processing Unit) or a microcomputer such as an MCU (Micro Controller Unit) or an MPU (Micro Processor Unit) equipped with a CPU. The in-vehicle device 100 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
  • The in-vehicle device 100 includes a first state information acquisition unit 131, a second state information acquisition unit 132, a state detection unit 133, a driving tendency determination unit 134, an image acquisition unit 135, a distance acquisition unit 136, a target vehicle detection unit 137, and a notification control unit 138. These functional blocks represent functions of the in-vehicle device 100 realized by the processor 130 executing the control program stored in the storage unit 110.
  • The first state information acquisition unit 131 acquires the running state information indicating the running state of the own vehicle 5.
  • The first state information acquisition unit 131 acquires the images taken outside the vehicle, the vehicle speed information, and the acceleration information from the storage unit 110 as the running state information.
  • The first state information acquisition unit 131 outputs the acquired images taken outside the vehicle, vehicle speed information, and acceleration information to the state detection unit 133.
  • The second state information acquisition unit 132 acquires the driver state information indicating the state of the driver of the own vehicle 5.
  • The second state information acquisition unit 132 acquires the indoor photographed images and the biometric information from the storage unit 110 as the driver state information.
  • The second state information acquisition unit 132 outputs the acquired indoor photographed images and biometric information to the state detection unit 133.
  • The state detection unit 133 receives the images taken outside the vehicle, the vehicle speed information, and the acceleration information from the first state information acquisition unit 131, and the indoor photographed images and the biometric information from the second state information acquisition unit 132. Based on this input information, the state detection unit 133 detects a specific state in which the own vehicle 5 may adversely affect the running of other vehicles.
  • The state detection unit 133 corresponds to the state determination unit of the present invention, and determines the specific state based on the running state of the own vehicle 5, a specific emotion of the driver, and a specific motion of the driver. That is, when the state detection unit 133 detects a particular running state of the own vehicle 5, a specific emotion of the driver, or a specific motion of the driver, it determines that the running of the own vehicle 5 is in a specific state that may pose a danger to other vehicles around the own vehicle 5.
  • The running states of the own vehicle 5 that are detected include interrupted driving (cutting in), sudden acceleration or deceleration, and low-speed traveling.
  • The driver's specific motions include a forward-leaning posture and gestures such as hitting the steering wheel or the instrument panel.
  • The driver's specific emotions include emotions that adversely affect safe driving, such as impatience, irritation, and tension.
  • The state detection unit 133 detects interrupted driving by the own vehicle 5 based on the images taken outside the vehicle.
  • Interrupted driving by the own vehicle 5 means that the own vehicle 5 changes its course into the path immediately ahead of another moving vehicle.
  • The state detection unit 133 detects a lane change based on the images of the road surface taken outside the vehicle and, when a lane change is detected, detects interrupted driving based on the distance to other vehicles traveling around the own vehicle 5.
  • The state detection unit 133 detects sudden acceleration or deceleration of the own vehicle 5 by comparing the acceleration indicated by the acceleration information with a preset threshold value, and detects low-speed running of the own vehicle 5 by comparing the vehicle speed indicated by the vehicle speed information with a preset threshold value.
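As a rough illustration of the threshold comparisons described above, the following Python sketch checks acceleration and vehicle speed against preset values. It is not part of the patent; the threshold numbers and the data structure are assumptions for illustration only.

```python
from dataclasses import dataclass

# Assumed threshold values; the patent does not specify concrete numbers.
SUDDEN_ACCEL_THRESHOLD_MS2 = 3.0   # |acceleration| at or above this counts as sudden accel/decel
LOW_SPEED_THRESHOLD_KMH = 20.0     # speeds at or below this (while moving) count as low-speed running

@dataclass
class RunningState:
    speed_kmh: float
    acceleration_ms2: float

def is_sudden_accel_or_decel(state: RunningState) -> bool:
    """Compare the magnitude of the acceleration with the preset threshold."""
    return abs(state.acceleration_ms2) >= SUDDEN_ACCEL_THRESHOLD_MS2

def is_low_speed_running(state: RunningState) -> bool:
    """Compare the vehicle speed with the preset threshold (vehicle must be moving)."""
    return 0.0 < state.speed_kmh <= LOW_SPEED_THRESHOLD_KMH
```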
  • The state detection unit 133 also detects the driver's posture and gestures from the captured images of the vehicle interior photographing unit 20 to detect a specific state.
  • The postures detected by the state detection unit 133 include a forward-leaning posture, and the gestures include the act of hitting the steering wheel or the instrument panel.
  • The state detection unit 133 detects the driver's face image from the captured images of the vehicle interior photographing unit 20 and detects a specific emotion of the driver based on the detected face image.
  • Specific emotions are emotions that adversely affect driving, such as impatience, irritation, and tension.
  • The state detection unit 133 detects a specific emotion of the driver based on master images stored in the storage unit 110.
  • These master images are face images of the driver taken in advance, one for each specific emotion such as impatience, irritation, and tension.
  • The state detection unit 133 compares the face image captured by the vehicle interior photographing unit 20 with the master images to detect a specific emotion of the driver.
  • The state detection unit 133 may also detect a specific emotion of the driver by using the biometric information from the biosensor 40 in addition to the captured image of the vehicle interior photographing unit 20. For example, when the state detection unit 133 detects an angry emotion as a specific emotion from the image taken by the vehicle interior photographing unit 20 and the heart rate indicated by the biometric information of the biosensor 40 is equal to or higher than a preset value, it may determine that the driver is feeling angry.
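A minimal sketch of how the face-image comparison could be combined with the heart-rate check is given below. The matching function and the heart-rate threshold are assumptions; the patent only states that the captured face image is compared with pre-registered master images and that the heart rate is compared with a preset value, without naming an algorithm.

```python
HEART_RATE_THRESHOLD_BPM = 100.0  # assumed preset value

def matches_master_image(face_image, master_image) -> bool:
    """Hypothetical placeholder for any face-comparison method (e.g. template matching)."""
    raise NotImplementedError

def detect_specific_emotion(face_image, master_images: dict, heart_rate_bpm: float):
    """Return the name of a detected specific emotion, or None.

    master_images maps emotion names (e.g. "anger", "irritation", "tension")
    to face images of the driver taken in advance for that emotion.
    """
    for emotion, master in master_images.items():
        if not matches_master_image(face_image, master):
            continue
        # For anger, additionally require the heart rate to be at or above the
        # preset value before concluding that the driver is feeling angry.
        if emotion == "anger" and heart_rate_bpm < HEART_RATE_THRESHOLD_BPM:
            continue
        return emotion
    return None
```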
  • The driving tendency determination unit 134 counts the number of detections for each detection item detected by the state detection unit 133 as a specific state, such as interrupted driving, sudden acceleration, sudden deceleration, low-speed running, specific emotions, and specific motions.
  • The storage unit 110 stores a counter value for each detection item.
  • The driving tendency determination unit 134 adds 1 to the counter value of the corresponding detection item each time the state detection unit 133 detects a specific state.
  • The driving tendency determination unit 134 determines the driving tendency of the driver based on the counter values of the detection items stored in the storage unit 110. For example, the driving tendency determination unit 134 compares the counter value of each detection item with a threshold value set for that detection item and finds the detection items whose counter values are equal to or greater than the threshold values. The driving tendency determination unit 134 determines the driving tendency of the driver based on the detection items found in this way.
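The per-item counters and threshold comparison can be pictured roughly as follows; the item names and threshold values are assumptions for illustration, since the patent only says that a threshold is set for each detection item.

```python
from collections import Counter

# Assumed per-item thresholds.
DETECTION_THRESHOLDS = {
    "interrupted_driving": 5,
    "sudden_accel_decel": 10,
    "low_speed_running": 5,
    "specific_emotion": 5,
    "specific_motion": 5,
}

detection_counters = Counter()

def record_detection(item: str) -> None:
    """Add 1 to the counter of the corresponding detection item."""
    detection_counters[item] += 1

def items_over_threshold() -> list:
    """Return the detection items whose count value is at or above the threshold."""
    return [item for item, count in detection_counters.items()
            if count >= DETECTION_THRESHOLDS.get(item, float("inf"))]
```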
  • The image acquisition unit 135 causes the vehicle exterior photographing unit 10 to perform photographing when the state detection unit 133 detects a specific state.
  • The vehicle exterior photographing unit 10 photographs the outside of the vehicle for a preset time to generate captured images.
  • The image acquisition unit 135 stores the images taken outside the vehicle that are input from the vehicle exterior photographing unit 10 in the storage unit 110.
  • An image taken by the vehicle exterior photographing unit 10 when the state detection unit 133 detects a specific state is referred to as an image taken at the time of occurrence.
  • The distance acquisition unit 136 acquires distance information indicating the distance between the own vehicle 5 and other vehicles existing around the own vehicle 5.
  • The distance acquisition unit 136 acquires the sonar information of the sonar unit 30 from the storage unit 110 as the distance information.
  • The distance acquisition unit 136 outputs the acquired sonar information to the target vehicle detection unit 137.
  • Alternatively, the distance acquisition unit 136 may detect the distance to another vehicle based on the captured images acquired by the image acquisition unit 135 and generate the distance information.
  • For example, the cameras included in the vehicle exterior photographing unit 10 may be configured as stereo cameras, and the distance from the own vehicle 5 to another vehicle may be detected based on the images taken by the stereo cameras.
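For the stereo-camera alternative mentioned above, distance can be derived with the standard pinhole-stereo relation depth = focal length × baseline / disparity. The sketch below is illustrative only; the focal length, baseline, and disparity values are assumptions, not values from the patent.

```python
def stereo_distance_m(focal_length_px: float, baseline_m: float, disparity_px: float) -> float:
    """Depth from a rectified stereo pair: Z = f * B / d."""
    if disparity_px <= 0:
        raise ValueError("disparity must be positive")
    return focal_length_px * baseline_m / disparity_px

# Example: 700 px focal length, 0.30 m baseline, 20 px disparity -> 10.5 m
print(stereo_distance_m(700.0, 0.30, 20.0))
```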
  • The target vehicle detection unit 137 corresponds to the detection unit of the present invention.
  • The distance information acquired by the distance acquisition unit 136 is input to the target vehicle detection unit 137.
  • The target vehicle detection unit 137 detects another vehicle whose distance from the own vehicle 5 is equal to or less than the set distance based on the distance indicated by the input distance information.
  • The detected other vehicle is referred to as the target vehicle.
  • The target vehicle detection unit 137 outputs information such as the direction of the detected target vehicle to the image acquisition unit 135 and instructs the image acquisition unit 135 to photograph the target vehicle.
  • The image acquisition unit 135 causes the vehicle exterior photographing unit 10 to perform photographing according to the instruction from the target vehicle detection unit 137 and outputs the captured image of the target vehicle to the target vehicle detection unit 137.
  • The target vehicle detection unit 137 compares the input captured image of the target vehicle with the image taken at the time of occurrence and determines whether the target vehicle appears in the image taken at the time of occurrence. That is, it determines whether the target vehicle was traveling around the own vehicle 5 when the specific state was detected.
  • The target vehicle detection unit 137 outputs a first notification to the notification control unit 138 when the target vehicle appears in the image taken at the time of occurrence.
  • The first notification is a notification indicating that a vehicle whose distance from the own vehicle 5 is equal to or less than the set distance, and which was traveling around the own vehicle 5 when the specific state was detected, has been detected.
  • The target vehicle detection unit 137 outputs a second notification to the notification control unit 138 when the target vehicle does not appear in the image taken at the time of occurrence.
  • The second notification is a notification indicating that a vehicle whose distance from the own vehicle 5 is equal to or less than the set distance has been detected.
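The decision made by the target vehicle detection unit can be summarized as in the sketch below. The set distance and the image-comparison function are assumptions; the patent only states that the two images are compared.

```python
SET_DISTANCE_M = 10.0  # assumed set distance

def appears_in_occurrence_image(occurrence_image, target_vehicle_image) -> bool:
    """Hypothetical placeholder for comparing the photographed target vehicle
    against the image taken when the specific state occurred."""
    raise NotImplementedError

def classify_notification(distance_m: float, occurrence_image, target_vehicle_image):
    """Return "first", "second", or None depending on the comparison result."""
    if distance_m > SET_DISTANCE_M:
        return None       # no target vehicle within the set distance
    if appears_in_occurrence_image(occurrence_image, target_vehicle_image):
        return "first"    # target vehicle was nearby when the specific state was detected
    return "second"       # nearby vehicle, but not linked to the specific state
```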
  • The notification control unit 138 controls the operation of the display unit 80 and the audio output unit 90, which serve as the notification unit.
  • The notification control unit 138 corresponds to the notification control unit and the display control unit of the present invention.
  • When the first notification is input from the target vehicle detection unit 137, the notification control unit 138 controls the display unit 80 and the audio output unit 90 to execute the first notification operation.
  • As the first notification operation, the notification control unit 138 displays a warning image on the display unit 80 and outputs a predetermined warning sound from the audio output unit 90.
  • The warning image indicates, for example, that another vehicle whose inter-vehicle distance is equal to or less than the set distance has been detected and that the detected other vehicle was traveling around the own vehicle 5 when the specific state was detected.
  • When the second notification is input from the target vehicle detection unit 137, the notification control unit 138 controls the display unit 80 to execute the second notification operation.
  • In the second notification operation, the level of alerting is set lower than in the first notification operation. Therefore, when the second notification is input from the target vehicle detection unit 137, the notification control unit 138 causes the display unit 80 to display a warning image without outputting a warning sound from the audio output unit 90. This warning image indicates, for example, that another vehicle whose inter-vehicle distance is equal to or less than the set distance has been detected.
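A sketch of the two alert levels follows, with hypothetical display and audio interfaces standing in for the display unit 80 and the audio output unit 90; the message strings are illustrative, not taken from the patent.

```python
def execute_notification(level: str, display, audio) -> None:
    """Perform the first or second notification operation."""
    if level == "first":
        # First notification: warning image plus a predetermined warning sound.
        display.show_warning("A vehicle that was nearby when the specific state "
                             "was detected is within the set distance.")
        audio.play_warning_sound()
    elif level == "second":
        # Second notification: lower alert level, warning image only.
        display.show_warning("A vehicle is within the set distance.")
```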
  • The notification control unit 138 also causes the display unit 80 to display the driving tendency determined by the driving tendency determination unit 134. For example, when the driving tendency determination unit 134 determines that the driver tends to accelerate suddenly with high frequency, the notification control unit 138 causes the display unit 80 to display guidance notifying the driver that the frequency of sudden acceleration tends to be high. Likewise, when the driving tendency determination unit 134 determines that the driver tends to take actions expressing anger while driving, the notification control unit 138 causes the display unit 80 to display guidance notifying the driver that the frequency of actions expressing anger is high.
  • The in-vehicle device 100 determines whether or not the own vehicle 5 is traveling (step S1).
  • The in-vehicle device 100 determines whether or not the own vehicle 5 is traveling based on the vehicle speed information.
  • The in-vehicle device 100 waits until the own vehicle 5 starts traveling.
  • When the own vehicle 5 is traveling, the in-vehicle device 100 acquires the images taken outside the vehicle (step S2).
  • The in-vehicle device 100 detects interrupted driving based on the acquired images taken outside the vehicle (step S3). The in-vehicle device 100 may also detect interrupted driving by referring to the acceleration information of the own vehicle 5.
  • When interrupted driving is detected (step S3 / YES), the in-vehicle device 100 determines that a specific state has been detected, causes the vehicle exterior photographing unit 10 to perform photographing, and acquires the image taken at the time of occurrence (step S4).
  • The in-vehicle device 100 stores the detection time at which the specific state was detected and the image taken at the time of occurrence in the storage unit 110 in association with each other (step S5). The in-vehicle device 100 also increments the counter that counts the number of detections of interrupted driving by 1 (step S6).
  • When the determination in step S3 is negative (step S3 / NO), or after the processing of step S6, the in-vehicle device 100 acquires the vehicle speed information detected by the vehicle speed sensor 50 (step S7).
  • The in-vehicle device 100 compares the acquired vehicle speed information with a preset threshold value and determines whether or not the own vehicle 5 is in a low-speed running state and whether this low-speed running state has continued for a set time or longer (step S8).
  • When the in-vehicle device 100 determines that the low-speed running state has continued for the set time or longer (step S8 / YES), it determines that a specific state has been detected, causes the vehicle exterior photographing unit 10 to perform photographing, and acquires the image taken at the time of occurrence (step S9).
  • The in-vehicle device 100 stores the detection time at which the specific state was detected and the image taken at the time of occurrence in the storage unit 110 in association with each other (step S10). The in-vehicle device 100 also increments the counter that counts the number of detections of low-speed running by 1 (step S11).
  • When the determination in step S8 is negative (step S8 / NO), or after the processing of step S11, the in-vehicle device 100 acquires the acceleration information detected by the acceleration sensor 60 (step S12).
  • The in-vehicle device 100 compares the acquired acceleration information with a preset threshold value to detect sudden acceleration or deceleration of the own vehicle 5 (step S13).
  • When the in-vehicle device 100 detects sudden acceleration or deceleration of the own vehicle 5 (step S13 / YES), it determines that a specific state has been detected, causes the vehicle exterior photographing unit 10 to perform photographing, and acquires the image taken at the time of occurrence (step S14).
  • The in-vehicle device 100 stores the detection time at which the specific state was detected and the image taken at the time of occurrence in the storage unit 110 in association with each other (step S15). The in-vehicle device 100 also increments the counter that counts the number of detections of sudden acceleration or deceleration by 1 (step S16).
  • When the determination in step S13 is negative (step S13 / NO), or after the processing of step S16, the in-vehicle device 100 acquires the indoor photographed image of the vehicle interior photographing unit 20 and the biometric information (step S17).
  • The in-vehicle device 100 detects a specific motion or a specific emotion of the driver based on the acquired indoor photographed image and biometric information (step S18).
  • When the in-vehicle device 100 detects a forward-leaning posture of the driver or a gesture such as hitting the steering wheel or the instrument panel, it determines that a specific motion has been detected.
  • The in-vehicle device 100 determines that a specific emotion has been detected when the estimated emotion is an emotion such as impatience, irritation, or tension.
  • When a specific motion or a specific emotion is detected (step S18 / YES), the in-vehicle device 100 causes the vehicle exterior photographing unit 10 to perform photographing and acquires the image taken at the time of occurrence (step S19). The in-vehicle device 100 stores the detection time at which the specific motion or specific emotion was detected and the image taken at the time of occurrence in the storage unit 110 in association with each other (step S20). The in-vehicle device 100 also increments the counter corresponding to the detected specific state by 1 (step S21).
  • When the determination in step S18 is negative (step S18 / NO), or after the processing of step S21, the in-vehicle device 100 determines whether or not there is a counter whose count value is equal to or greater than the threshold value (step S22).
  • When there is no counter whose count value is equal to or greater than the threshold value (step S22 / NO), the in-vehicle device 100 returns to the determination of step S1. When there is a counter whose count value is equal to or greater than the threshold value (step S22 / YES), the in-vehicle device 100 determines the driving tendency of the driver based on the specific state associated with that counter and displays the determined driving tendency on the display unit 80 (step S23).
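Condensing the FIG. 2 procedure (steps S1 to S23) into a Python sketch gives the loop below. The device object and its methods are hypothetical wrappers around the sensors and units described above; only the control flow follows the description, and it is not the patent's implementation.

```python
import time

def specific_state_detection_loop(device):
    while True:
        if not device.is_vehicle_running():                      # step S1
            time.sleep(0.1)
            continue
        outside_images = device.get_outside_images()             # step S2
        if device.detect_interrupted_driving(outside_images):    # step S3
            device.save_occurrence_image("interrupted_driving")  # steps S4-S5
            device.increment_counter("interrupted_driving")      # step S6
        if device.low_speed_continued_for_set_time():            # steps S7-S8
            device.save_occurrence_image("low_speed_running")    # steps S9-S10
            device.increment_counter("low_speed_running")        # step S11
        if device.detect_sudden_accel_or_decel():                # steps S12-S13
            device.save_occurrence_image("sudden_accel_decel")   # steps S14-S15
            device.increment_counter("sudden_accel_decel")       # step S16
        item = device.detect_specific_motion_or_emotion()        # steps S17-S18
        if item is not None:
            device.save_occurrence_image(item)                   # steps S19-S20
            device.increment_counter(item)                       # step S21
        for tendency in device.counters_over_threshold():        # step S22
            device.display_driving_tendency(tendency)            # step S23
```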
  • The in-vehicle device 100 determines whether or not a specific state has been detected (step S31).
  • When a specific state has been detected, the in-vehicle device 100 acquires the sonar information of the sonar unit 30 (step S32).
  • The in-vehicle device 100 detects another vehicle whose distance from the own vehicle 5 is equal to or less than the set distance based on the acquired sonar information (step S33). When no other vehicle within the set distance is detected (step S33 / NO), the in-vehicle device 100 returns to the determination of step S31.
  • When another vehicle within the set distance is detected (step S33 / YES), the in-vehicle device 100 determines whether or not the own vehicle 5 is traveling (step S34), in order to exclude cases where the inter-vehicle distance has shortened because of waiting at a traffic light or the like. When the own vehicle 5 is not traveling (step S34 / NO), the in-vehicle device 100 returns to the determination of step S31.
  • When the own vehicle 5 is traveling (step S34 / YES), the in-vehicle device 100 sets the other vehicle detected in step S33 as the target vehicle, instructs the vehicle exterior photographing unit 10 to photograph the target vehicle (step S35), and acquires the captured image of the target vehicle.
  • The in-vehicle device 100 then reads out the images taken at the time of occurrence from the storage unit 110.
  • The in-vehicle device 100 refers to the detection times stored in association with the images taken at the time of occurrence and reads out the images taken at the time of occurrence whose detection times fall within a predetermined time before the current time.
  • The in-vehicle device 100 compares the read images taken at the time of occurrence with the captured image of the target vehicle and determines whether or not the target vehicle appears in any of the images taken at the time of occurrence (step S36).
  • When the target vehicle appears in an image taken at the time of occurrence (step S36 / YES), the in-vehicle device 100 executes the first notification operation (step S37).
  • As the first notification operation, the notification control unit 138 displays a warning image on the display unit 80 and outputs a predetermined warning sound from the audio output unit 90. When the target vehicle does not appear in the images taken at the time of occurrence (step S36 / NO), the in-vehicle device 100 executes the second notification operation (step S38). As the second notification operation, the notification control unit 138 causes the display unit 80 to display a warning image.
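Similarly, the FIG. 3 warning procedure (steps S31 to S38) can be condensed as follows, again with hypothetical device and notifier interfaces; this is a sketch of the described control flow, not the patent's implementation.

```python
def warning_loop(device, notifier):
    while True:
        if not device.specific_state_detected():                      # step S31
            continue
        sonar_info = device.get_sonar_info()                          # step S32
        target = device.find_vehicle_within_set_distance(sonar_info)  # step S33
        if target is None:
            continue                                                  # step S33 / NO
        if not device.is_vehicle_running():                           # step S34: exclude stops such as waiting at a light
            continue
        target_image = device.photograph(target)                      # step S35
        occurrence_images = device.load_recent_occurrence_images()
        if any(device.appears_in(img, target_image)
               for img in occurrence_images):                         # step S36
            notifier.first_notification()                             # step S37: warning image and warning sound
        else:
            notifier.second_notification()                            # step S38: warning image only
```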
  • The in-vehicle device 100 of the present embodiment includes the first state information acquisition unit 131, the second state information acquisition unit 132, the state detection unit 133, the image acquisition unit 135, the distance acquisition unit 136, the target vehicle detection unit 137, and the notification control unit 138.
  • The first state information acquisition unit 131 acquires the running state information indicating the running state of the own vehicle 5.
  • The second state information acquisition unit 132 acquires the driver state information indicating the state of the driver of the own vehicle 5. Based on the information acquired by the first state information acquisition unit 131 and the second state information acquisition unit 132, the state detection unit 133 determines whether or not the running of the own vehicle 5 is in a specific state that may pose a danger to other vehicles around the own vehicle 5.
  • The image acquisition unit 135 acquires the image taken at the time of occurrence, in which the surroundings of the own vehicle 5 are photographed.
  • The distance acquisition unit 136 acquires the distance information indicating the distance between the own vehicle 5 and other vehicles around the own vehicle 5.
  • When another vehicle whose distance indicated by the distance information is equal to or less than the set value is detected, the target vehicle detection unit 137 compares the captured image of the other vehicle with the image taken at the time of occurrence and detects the other vehicle from the image taken at the time of occurrence.
  • When the other vehicle is detected by the target vehicle detection unit 137, the notification control unit 138 causes the display unit 80 and the audio output unit 90 mounted on the own vehicle 5 to execute the first notification operation for alerting the driver.
  • Therefore, when a specific state in which the own vehicle 5 may pose a danger to other vehicles around it is detected, the in-vehicle device 100 detects the distance between the own vehicle 5 and the other vehicles existing around the own vehicle 5 at the time the specific state was detected, and executes the first notification operation for alerting the driver when the detected distance is equal to or less than the set value. It is therefore possible to notify the driver of the vehicle, before road rage behavior occurs, that the vehicle may be subjected to road rage.
  • The second state information acquisition unit 132 acquires the driver's biometric information and estimates the driver's emotion as the driver state information based on the acquired biometric information.
  • The state detection unit 133 determines the specific state based on the driver's emotion estimated by the second state information acquisition unit 132. It is therefore possible to improve the estimation accuracy of the driver's emotion and to improve the detection accuracy of the specific state.
  • The second state information acquisition unit 132 also acquires a captured image of the interior of the own vehicle 5 and acquires information indicating the driver's motion based on the acquired captured image as the driver state information.
  • The state detection unit 133 determines the specific state based on the driver's motion estimated by the second state information acquisition unit 132. For example, the driver's emotion is estimated by detecting, as the driver's motion, a forward-leaning posture or the act of hitting the steering wheel or the instrument panel, and the specific state is detected based on the estimated emotion, so the detection accuracy of the specific state can be improved.
  • Further, when a vehicle within the set value of the own vehicle 5 is detected but the target vehicle does not appear in the image taken at the time of occurrence, the notification control unit 138 executes a second notification operation having a lower level of alerting than the first notification operation. Therefore, even if the other vehicle was not present around the own vehicle 5 when the specific state was detected, the driver can be alerted when the distance from the other vehicle to the own vehicle 5 becomes equal to or less than the set value.
  • The in-vehicle device 100 also includes the driving tendency determination unit 134, which counts the number of times the state detection unit 133 detects a specific state based on the running state information and the number of times it detects a specific state based on the driver state information, and determines the driving tendency of the driver based on the counted numbers.
  • The notification control unit 138 causes the display unit 80 to display the driving tendency determined by the driving tendency determination unit 134. It is therefore possible to determine the driving tendency of the driver based on the number of detections of the specific state and to notify the driver of the determined driving tendency.
  • The configuration of the in-vehicle device 100 shown in FIG. 1 is a schematic representation in which the functions of the in-vehicle device 100 are classified according to the main processing content, and the configuration of the in-vehicle device 100 can be divided into further blocks according to the processing content. A functional block may also be configured so that one block shown in FIG. 1 executes further processing. The processing of each block may be executed by a single piece of hardware or by a plurality of pieces of hardware, and may be realized by a single program or by a plurality of programs.
  • The processing units of the flowcharts shown in FIGS. 2 and 3 are divided according to the main processing content in order to make the processing of the in-vehicle device 100 easy to understand, and the present invention is not limited by the way the processing units are divided or by their names. The processing of the in-vehicle device 100 can also be divided into more processing units according to the processing content, or one processing unit can be divided so as to include further processing. The processing order of the above flowcharts is not limited to the illustrated example.
  • When the driving support method of the present invention is realized by a computer, the program to be executed by the computer can also be configured in the form of a recording medium or a transmission medium for transmitting the program.
  • As the recording medium, a magnetic or optical recording medium or a semiconductor memory device can be used.
  • Specific examples of the recording medium include portable recording media such as a flexible disk, an HDD (Hard Disk Drive), a CD-ROM (Compact Disc Read-Only Memory), a DVD, a Blu-ray (registered trademark) Disc, a magneto-optical disk, a flash memory, and a card-type recording medium, as well as fixed recording media.
  • The recording medium may also be a non-volatile storage device such as a RAM, a ROM, or an HDD that is an internal storage device included in the display device.
  • 1 Driving support system, 5 Own vehicle, 10 Vehicle exterior photographing unit, 11 Front camera, 13 Rear camera, 15 Left side camera, 17 Right side camera, 20 Vehicle interior photographing unit, 25 Camera, 30 Sonar unit, 40 Biosensor, 50 Vehicle speed sensor, 60 Acceleration sensor, 70 Operation unit, 80 Display unit, 85 Touch panel, 90 Audio output unit, 91 Audio processing unit, 93 Speaker, 100 In-vehicle device, 110 Storage unit, 130 Processor, 131 First state information acquisition unit, 132 Second state information acquisition unit, 133 State detection unit, 134 Driving tendency determination unit, 135 Image acquisition unit, 136 Distance acquisition unit, 137 Target vehicle detection unit, 138 Notification control unit

Landscapes

  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Traffic Control Systems (AREA)

Abstract

Provided is an in-vehicle device that is capable of notifying a driver of the likelihood of suffering road rage prior to the occurrence of road rage behavior. This in-vehicle device 100 comprises: a state detection unit 133 which determines whether or not a host vehicle 5 is in a specific state that could pose a risk to other vehicles around the host vehicle 5; an image acquisition unit 135 which, in the case when occurrence of the specific state is determined, acquires an image captured of the surroundings of the host vehicle 5 at the time of the occurrence; a distance acquisition unit 136 which acquires distance information indicative of the distances to the other vehicles around the host vehicle 5; a target vehicle detection unit 137 which, in the case when the distance to any of the other vehicles indicated by the distance information is detected to be equal to or less than a set value, detects said vehicle from the image captured at the time of the occurrence; and a notification control unit 138 which, in the case when said vehicle has been detected by the target vehicle detection unit 137, causes a display unit 80 or an audio output unit 90 mounted in the host vehicle 5 to execute a first notification action.

Description

In-vehicle device and driving support method
 The present invention relates to an in-vehicle device and a driving support method.
 Conventionally, technologies for detecting road rage driving and preventing accidents and trouble with following vehicles are known. For example, Patent Document 1 discloses a driving support device that determines the state of other vehicles around the own vehicle, such as the presence and amount of acceleration or deceleration, the steering state, the lighting state, and the presence of obstacles, determines the possibility of road rage behavior against the own vehicle based on the determined state of the other vehicles, and, based on the determination result, performs notification control for the driver of the own vehicle and/or behavior control of the own vehicle.
Japanese Unexamined Patent Publication No. 2006-205773
 However, when the possibility of road rage behavior against the own vehicle is determined based on the state of other vehicles, the determination may be delayed, and there is a concern that the driver will be notified only after the road rage behavior has occurred.
 The present invention has been made in view of the above circumstances, and an object of the present invention is to provide an in-vehicle device and a driving support method capable of notifying the driver of a vehicle, before road rage behavior occurs, that the vehicle may be subjected to road rage.
 This specification includes the entire contents of Japanese Patent Application No. 2020-133511 filed on August 6, 2020.
 In order to solve the above problems, the in-vehicle device of the present invention is an in-vehicle device mounted on a vehicle, and includes: a first state information acquisition unit that acquires running state information indicating the running state of the vehicle; a second state information acquisition unit that acquires driver state information indicating the state of the driver of the vehicle; a state determination unit that determines, based on the information acquired by the first state information acquisition unit and the second state information acquisition unit, whether or not the running of the vehicle is in a specific state that may pose a danger to other vehicles around the vehicle; an image acquisition unit that, when the state determination unit determines that the vehicle is in the specific state, acquires an image taken at the time of occurrence in which the surroundings of the vehicle are photographed; a distance acquisition unit that acquires distance information indicating the distance between the vehicle and other vehicles around the vehicle; a detection unit that, when another vehicle whose distance indicated by the distance information is equal to or less than a set value is detected, compares a captured image of the other vehicle with the image taken at the time of occurrence and detects the other vehicle from the image taken at the time of occurrence; and a notification control unit that, when the other vehicle is detected by the detection unit, causes a notification unit mounted on the vehicle to execute a first notification operation for alerting the driver.
 According to the present invention, it is possible to notify the driver of the vehicle, before road rage behavior occurs, that the vehicle may be subjected to road rage.
FIG. 1 is a block diagram showing the configuration of a warning system. FIG. 2 is a flowchart showing a procedure for detecting a specific state. FIG. 3 is a flowchart showing the operation at the time of warning.
 Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.
 FIG. 1 is a block diagram showing the configuration of the driving support system 1. Hereinafter, the vehicle equipped with the driving support system 1 is referred to as the own vehicle 5, and vehicles other than the own vehicle 5 are referred to as other vehicles.
 The driving support system 1 includes a vehicle exterior photographing unit 10, a vehicle interior photographing unit 20, a sonar unit 30, a biosensor 40, a vehicle speed sensor 50, an acceleration sensor 60, an operation unit 70, a display unit 80, an audio output unit 90, and an in-vehicle device 100.
 The vehicle exterior photographing unit 10 includes four cameras: a front camera 11, a rear camera 13, a left side camera 15, and a right side camera 17. The front camera 11 photographs the area in front of the own vehicle 5. The rear camera 13 photographs the area behind the own vehicle 5. The left side camera 15 photographs the left side of the own vehicle 5. The right side camera 17 photographs the right side of the own vehicle 5. This embodiment describes a case where the vehicle exterior photographing unit 10 includes four cameras, but the number of cameras photographing the outside of the vehicle is arbitrary.
 The front camera 11, the rear camera 13, the left side camera 15, and the right side camera 17 each include an image sensor such as a CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) and a data processing circuit that generates an image from the light receiving state of the image sensor.
 The angles of view of the front camera 11, the rear camera 13, the left side camera 15, and the right side camera 17 are adjusted so that a 360° range around the own vehicle 5 can be photographed. The front camera 11, the rear camera 13, the left side camera 15, and the right side camera 17 each photograph their shooting range at a predetermined frame rate to generate captured images. A captured image from the vehicle exterior photographing unit 10 is hereinafter referred to as an image taken outside the vehicle. The images taken outside the vehicle generated by each of the front camera 11, the rear camera 13, the left side camera 15, and the right side camera 17 are output to the in-vehicle device 100. The in-vehicle device 100 temporarily stores the images taken outside the vehicle input from the vehicle exterior photographing unit 10 in the storage unit 110 included in the in-vehicle device 100.
 The vehicle interior photographing unit 20 includes a camera 25 that photographs the interior of the own vehicle 5. The camera 25 also includes an image sensor such as a CCD or CMOS and a data processing circuit that generates an image from the light receiving state of the image sensor. The camera 25 is installed on, for example, the dashboard, the rear-view mirror, or the ceiling, and its orientation, angle of view, and installation position are set so that the driver seated in the driver's seat can be photographed. The camera 25 photographs the interior of the vehicle at a predetermined frame rate to generate captured images. A captured image taken by the vehicle interior photographing unit 20 is hereinafter referred to as an indoor photographed image. The vehicle interior photographing unit 20 outputs the generated indoor photographed images to the in-vehicle device 100. The in-vehicle device 100 temporarily stores the indoor photographed images input from the vehicle interior photographing unit 20 in the storage unit 110.
 The sonar unit 30 is mounted at a plurality of locations, such as the front, rear, left side, and right side of the own vehicle 5, and uses ultrasonic waves to detect the direction and distance of obstacles such as other vehicles, pedestrians, and structures existing around the own vehicle 5. The sonar unit 30 outputs sonar information indicating the direction and distance of an obstacle to the in-vehicle device 100. The in-vehicle device 100 temporarily stores the sonar information input from the sonar unit 30 in the storage unit 110.
 The biosensor 40 is a sensor that detects biometric information of the driver seated in the driver's seat. The biometric information is information representing the state of biological phenomena such as heartbeat, respiration, brain waves, pulse, blood pressure, sweating, and body temperature. This embodiment describes the case where the biosensor 40 is a heart rate sensor provided on the seat belt and the biometric information represents the driver's heart rate, but the biosensor 40 may instead detect other biometric information such as respiration, brain waves, pulse, blood pressure, sweating, or body temperature. The biosensor 40 outputs the detected biometric information of each occupant to the in-vehicle device 100. The in-vehicle device 100 temporarily stores the biometric information input from the biosensor 40 in the storage unit 110.
 The vehicle speed sensor 50 is a sensor that detects the vehicle speed of the own vehicle 5. The vehicle speed sensor 50 outputs vehicle speed information indicating the detected vehicle speed of the own vehicle 5 to the in-vehicle device 100.
 The acceleration sensor 60 is a sensor that detects the acceleration of the own vehicle 5. The acceleration sensor 60 outputs acceleration information indicating the detected acceleration of the own vehicle 5 to the in-vehicle device 100.
 The in-vehicle device 100 temporarily stores the input vehicle speed information and acceleration information in the storage unit 110.
 The operation unit 70 functions as a reception unit that accepts operations by the occupants. The operation unit 70 includes hardware such as switches and buttons, and outputs an operation signal corresponding to the hardware that received the operation to the in-vehicle device 100.
 The display unit 80 includes a touch panel 85. The touch panel 85 includes a display panel, such as a liquid crystal panel or an organic EL panel, and a touch sensor that detects touch operations on the display panel. A display image is displayed on the display panel under the control of the in-vehicle device 100. The touch sensor detects the position on the display panel touched by an occupant of the own vehicle 5 and outputs coordinate information indicating the detected position to the in-vehicle device 100.
 The audio output unit 90 includes an audio processing unit 91 and a speaker 93.
 The audio processing unit 91 includes a D/A converter, a volume circuit, an amplifier circuit, and the like; it converts the audio data input from the in-vehicle device 100 from digital to analog with the D/A converter, adjusts the volume level with the volume circuit, amplifies the signal with the amplifier circuit, and outputs it as sound from the speaker 93.
 The in-vehicle device 100 is a computer device including a storage unit 110 and a processor 130. When the in-vehicle device 100 detects a "specific state", that is, a state in which the own vehicle 5 may pose a danger to other vehicles around it, it photographs the other vehicles traveling around the own vehicle 5 at the time the specific state is detected. When another vehicle whose inter-vehicle distance from the own vehicle 5 is equal to or less than a preset set distance is later detected, the in-vehicle device 100 determines whether that other vehicle was traveling around the own vehicle 5 when the specific state was detected. If it was, the in-vehicle device 100 executes a notification operation that alerts the driver, thereby assisting the driver's driving operation.
 The storage unit 110 includes non-volatile memory such as flash memory or EEPROM (Electrically Erasable Programmable Read-Only Memory). The storage unit 110 may also include volatile memory such as RAM (Random Access Memory) in addition to the non-volatile memory.
 The storage unit 110 stores a control program executed by the processor 130. The storage unit 110 also temporarily stores exterior captured images, interior captured images, biological information, vehicle speed information, and acceleration information.
 The processor 130 is configured by a CPU (Central Processing Unit) or by a microcomputer such as an MCU (Micro Controller Unit) or MPU (Micro Processor Unit) equipped with a CPU. The in-vehicle device 100 may also be realized by an integrated circuit such as an ASIC (Application Specific Integrated Circuit) or an FPGA (Field Programmable Gate Array).
 The in-vehicle device 100 includes a first state information acquisition unit 131, a second state information acquisition unit 132, a state detection unit 133, a driving tendency determination unit 134, an image acquisition unit 135, a distance acquisition unit 136, a target vehicle detection unit 137, and a notification control unit 138. These functional blocks represent functions of the in-vehicle device 100 realized by the processor 130 executing the control program stored in the storage unit 110.
 The first state information acquisition unit 131 acquires traveling state information indicating the traveling state of the own vehicle 5. As the traveling state information, the first state information acquisition unit 131 acquires exterior captured images, vehicle speed information, and acceleration information from the storage unit 110, and outputs them to the state detection unit 133.
 The second state information acquisition unit 132 acquires driver state information indicating the state of the driver of the own vehicle 5. As the driver state information, the second state information acquisition unit 132 acquires interior captured images and biological information from the storage unit 110, and outputs them to the state detection unit 133.
 The state detection unit 133 receives the exterior captured images, vehicle speed information, and acceleration information from the first state information acquisition unit 131, and the interior captured images and biological information from the second state information acquisition unit 132. Based on these inputs, the state detection unit 133 detects a specific state in which the own vehicle 5 may adversely affect the traveling of other vehicles.
 The state detection unit 133 corresponds to the state determination unit of the present invention, and determines the specific state based on the traveling state of the own vehicle 5, specific emotions of the driver, and specific actions of the driver. That is, when the state detection unit 133 detects such a traveling state, a specific emotion, or a specific action, it determines that the traveling of the own vehicle 5 is in a specific state that may pose a danger to other vehicles around the own vehicle 5. The traveling states include cut-in driving, sudden acceleration, sudden deceleration, and low-speed traveling. The driver's specific actions include leaning forward and gestures such as hitting the steering wheel or the instrument panel. The driver's specific emotions include emotions that adversely affect safe driving, such as impatience, irritation, and tension.
 The state detection unit 133 detects cut-in driving of the own vehicle 5 based on the exterior captured images. Cut-in driving of the own vehicle 5 means that the own vehicle 5 changes course into the path ahead of another traveling vehicle. For example, the state detection unit 133 detects a lane change based on exterior captured images of the road surface and, when a lane change is detected, detects cut-in driving based on the distance to other vehicles traveling around the own vehicle 5; the sketch below illustrates one way this check could be expressed.
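For illustration only, the following is a minimal sketch of a cut-in check under stated assumptions: a lane-change flag and the gaps to surrounding vehicles are assumed to be computed elsewhere from the exterior images, and the gap threshold is a hypothetical value, not one given in the specification.

```python
def detect_cut_in(lane_change_detected: bool,
                  gaps_to_surrounding_vehicles_m: list,
                  cut_in_gap_threshold_m: float = 10.0) -> bool:
    """Flag a cut-in when a lane change coincides with a short gap to
    another vehicle (assumed inputs, illustrative threshold)."""
    if not lane_change_detected:
        return False
    # Any surrounding vehicle closer than the threshold makes the lane
    # change a potential cut-in in front of that vehicle.
    return any(gap <= cut_in_gap_threshold_m
               for gap in gaps_to_surrounding_vehicles_m)
```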
 The state detection unit 133 also compares the acceleration indicated by the acceleration information with a preset threshold value to detect sudden acceleration or sudden deceleration of the own vehicle 5, and compares the vehicle speed indicated by the vehicle speed information with a preset threshold value to detect low-speed traveling of the own vehicle 5.
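The threshold comparisons could be sketched as follows; the numeric thresholds are assumptions for illustration and are not values disclosed in the specification.

```python
# Illustrative thresholds (assumptions, not values from the specification).
SUDDEN_ACCEL_THRESHOLD_MPS2 = 3.0   # |acceleration| above this is "sudden"
LOW_SPEED_THRESHOLD_KMH = 20.0      # speed at or below this counts as low-speed travel

def is_sudden_accel_or_decel(acceleration_mps2: float) -> bool:
    """Sudden acceleration or deceleration: |a| reaches the threshold."""
    return abs(acceleration_mps2) >= SUDDEN_ACCEL_THRESHOLD_MPS2

def is_low_speed(vehicle_speed_kmh: float) -> bool:
    """Low-speed traveling: the vehicle is moving but at or below the threshold."""
    return 0.0 < vehicle_speed_kmh <= LOW_SPEED_THRESHOLD_KMH
```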
 Further, the state detection unit 133 detects the driver's posture and gestures from the captured images of the vehicle interior photographing unit 20 to detect the specific state. The postures detected by the state detection unit 133 include leaning forward, and the gestures include acts such as hitting the steering wheel or the instrument panel.
 The state detection unit 133 also detects the driver's face image from the captured images of the vehicle interior photographing unit 20 and detects specific emotions of the driver based on the detected face image. Specific emotions are emotions that adversely affect driving, such as impatience, irritation, and tension.
 For example, the state detection unit 133 detects the driver's specific emotions based on master images stored in the storage unit 110. Each master image is a face image of the driver captured in advance for each specific emotion such as impatience, irritation, or tension. The state detection unit 133 compares the face image captured by the vehicle interior photographing unit 20 with the master images to detect the driver's specific emotion.
 Further, the state detection unit 133 may detect the driver's specific emotions using the biological information from the biosensor 40 in addition to the captured images of the vehicle interior photographing unit 20. For example, when anger is detected as a specific emotion from the captured image of the vehicle interior photographing unit 20 and the heart rate indicated by the biological information of the biosensor 40 is equal to or higher than a preset value, the state detection unit 133 may determine that the driver is feeling anger.
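A minimal sketch of combining the two sources is shown below; the per-emotion similarity scores against the master images and both thresholds are hypothetical stand-ins for whatever comparison the actual implementation performs.

```python
from typing import Dict, Optional

def detect_specific_emotion(master_similarity: Dict[str, float],
                            heart_rate_bpm: float,
                            similarity_threshold: float = 0.8,
                            heart_rate_threshold_bpm: float = 100.0) -> Optional[str]:
    """Return the detected specific emotion, or None.

    master_similarity maps an emotion label (e.g. "anger", "impatience") to a
    similarity score between the current face image and the master image
    registered for that emotion (assumed to be computed elsewhere).
    """
    if not master_similarity:
        return None
    # Pick the emotion whose master image matches best.
    emotion, score = max(master_similarity.items(), key=lambda kv: kv[1])
    if score < similarity_threshold:
        return None
    # For anger, additionally require an elevated heart rate from the biosensor.
    if emotion == "anger" and heart_rate_bpm < heart_rate_threshold_bpm:
        return None
    return emotion
```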
 The driving tendency determination unit 134 counts the number of detections for each detection item that the state detection unit 133 detects as a specific state, such as cut-in driving, sudden acceleration, sudden deceleration, low-speed traveling, specific emotions, and specific actions. The storage unit 110 stores a counter value for each detection item. Each time the state detection unit 133 detects a specific state, the driving tendency determination unit 134 adds 1 to the counter value of the corresponding detection item.
 The driving tendency determination unit 134 determines the driver's driving tendency based on the counter values of the detection items stored in the storage unit 110. For example, the driving tendency determination unit 134 compares the counter value of each detection item with a threshold value set for that detection item, identifies the detection items whose counter values are equal to or greater than the threshold, and determines the driver's driving tendency based on the identified detection items.
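One way the counters and threshold comparison could be realized is sketched below; the item names and per-item thresholds are assumptions for illustration.

```python
from collections import Counter

# Illustrative per-item thresholds (assumptions, not disclosed values).
TENDENCY_THRESHOLDS = {
    "cut_in": 5, "sudden_accel_decel": 10, "low_speed": 10,
    "specific_action": 5, "specific_emotion": 5,
}

detection_counts = Counter()  # in the device these counts live in the storage unit

def record_detection(item: str) -> None:
    """Increment the counter for a detection item each time it is detected."""
    detection_counts[item] += 1

def judge_driving_tendency() -> list:
    """Return the detection items whose counts have reached their thresholds."""
    return [item for item, threshold in TENDENCY_THRESHOLDS.items()
            if detection_counts[item] >= threshold]
```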
 When the state detection unit 133 detects a specific state, the image acquisition unit 135 causes the vehicle exterior photographing unit 10 to perform photographing. The vehicle exterior photographing unit 10 photographs the outside of the vehicle for a preset period and generates exterior captured images. These images capture the other vehicles present around the own vehicle 5 when the specific state occurred. The image acquisition unit 135 stores the exterior captured images input from the vehicle exterior photographing unit 10 in the storage unit 110. An exterior captured image taken by the vehicle exterior photographing unit 10 when the state detection unit 133 detects a specific state is referred to as an occurrence-time captured image.
 The distance acquisition unit 136 acquires distance information indicating the distance between the own vehicle 5 and other vehicles present around the own vehicle 5. The distance acquisition unit 136 acquires the sonar information of the sonar unit 30 from the storage unit 110 as the distance information and outputs it to the target vehicle detection unit 137.
 The distance acquisition unit 136 may also detect the distance to another vehicle based on the captured images acquired by the image acquisition unit 135 and generate the distance information. For example, the camera of the vehicle exterior photographing unit 10 may be configured as a stereo camera, and the distance from the own vehicle 5 to another vehicle may be detected based on the images captured by the stereo camera.
 The target vehicle detection unit 137 corresponds to the detection unit of the present invention. The distance information acquired by the distance acquisition unit 136 is input to the target vehicle detection unit 137. Based on the distance indicated by the input distance information, the target vehicle detection unit 137 detects another vehicle whose distance from the own vehicle 5 is equal to or less than the set distance. The detected other vehicle is hereinafter referred to as the target vehicle. When a target vehicle is detected, the target vehicle detection unit 137 outputs information such as the bearing of the detected target vehicle to the image acquisition unit 135 and instructs it to photograph the target vehicle. The image acquisition unit 135 causes the vehicle exterior photographing unit 10 to photograph according to the instruction of the target vehicle detection unit 137 and outputs the captured exterior image to the target vehicle detection unit 137.
 When the exterior captured image of the target vehicle is input from the image acquisition unit 135, the target vehicle detection unit 137 compares it with the occurrence-time captured images and determines whether the target vehicle appears in the occurrence-time captured images, that is, whether the target vehicle was traveling around the own vehicle 5 when the specific state was detected.
 If the target vehicle appears in an occurrence-time captured image, the target vehicle detection unit 137 outputs a first notification to the notification control unit 138. The first notification indicates that a vehicle whose distance from the own vehicle 5 is equal to or less than the set distance has been detected and that this vehicle was traveling around the own vehicle 5 when the specific state was detected.
 If the target vehicle does not appear in the occurrence-time captured images, the target vehicle detection unit 137 outputs a second notification to the notification control unit 138. The second notification indicates that a vehicle whose distance from the own vehicle 5 is equal to or less than the set distance has been detected.
 The notification control unit 138 controls the operation of the display unit 80 and the audio output unit 90, which serve as notification units. The notification control unit 138 corresponds to the notification control unit and the display control unit of the present invention. When the first notification is input from the target vehicle detection unit 137, the notification control unit 138 controls the display unit 80 and the audio output unit 90 to execute a first notification operation. As the first notification operation, the notification control unit 138 causes the display unit 80 to display a warning image and the audio output unit 90 to output a predetermined warning sound. The warning image indicates, for example, that another vehicle whose inter-vehicle distance is equal to or less than the set distance has been detected, and that the detected vehicle was traveling around the own vehicle 5 when the specific state was detected.
 When the second notification is input from the target vehicle detection unit 137, the notification control unit 138 controls the display unit 80 to execute a second notification operation. The second notification operation is set to a lower alert level than the first notification operation. Therefore, when the second notification is input, the notification control unit 138 causes the display unit 80 to display a warning image without outputting a warning sound from the audio output unit 90. The warning image indicates, for example, that another vehicle whose inter-vehicle distance is equal to or less than the set distance has been detected.
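Expressed as code, the two alert levels might be dispatched as follows; show_warning_image() and play_warning_sound() are placeholders for the interfaces of the display unit 80 and the audio output unit 90, not real APIs.

```python
def show_warning_image(text: str) -> None:
    print(f"[DISPLAY] {text}")        # placeholder for the display unit 80

def play_warning_sound() -> None:
    print("[SPEAKER] warning sound")  # placeholder for the audio output unit 90

def notify(level: str) -> None:
    """First notification: warning image plus sound. Second: image only."""
    if level == "first_notification":
        show_warning_image("A nearby vehicle was present when your risky "
                           "driving was detected. Please drive carefully.")
        play_warning_sound()
    elif level == "second_notification":
        show_warning_image("A vehicle is within the set following distance.")
```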
 Further, the notification control unit 138 causes the display unit 80 to display the driving tendency determined by the driving tendency determination unit 134. For example, when the driving tendency determination unit 134 determines that the driver tends to accelerate suddenly with high frequency, the notification control unit 138 causes the display unit 80 to display guidance notifying the driver of that tendency. Likewise, when the driving tendency determination unit 134 determines that the driver frequently behaves in a way that expresses anger while driving, the notification control unit 138 causes the display unit 80 to display guidance notifying the driver of that tendency.
 Next, the operation of the in-vehicle device 100 will be described with reference to the flowcharts shown in FIGS. 2 and 3.
 First, the in-vehicle device 100 determines whether the own vehicle 5 is traveling (step S1), based on the vehicle speed information. If the own vehicle 5 is not traveling (step S1/NO), the in-vehicle device 100 waits until the own vehicle 5 starts traveling.
 If the own vehicle 5 is traveling (step S1/YES), the in-vehicle device 100 acquires exterior captured images (step S2) and detects cut-in driving based on the acquired images (step S3). The in-vehicle device 100 may also refer to the acceleration information of the own vehicle 5 when detecting cut-in driving. When cut-in driving is detected (step S3/YES), the in-vehicle device 100 determines that a specific state has been detected, causes the vehicle exterior photographing unit 10 to perform photographing, and acquires an occurrence-time captured image (step S4).
 Next, the in-vehicle device 100 stores the detection time of the specific state and the occurrence-time captured image in the storage unit 110 in association with each other (step S5), and increments by 1 the counter that counts the number of detections of cut-in driving (step S6).
 If the determination in step S3 is negative (step S3/NO), or after the processing of step S6, the in-vehicle device 100 acquires the vehicle speed information detected by the vehicle speed sensor 50 (step S7). The in-vehicle device 100 compares the acquired vehicle speed information with a preset threshold value and determines whether the own vehicle 5 is traveling at low speed and whether this low-speed state has continued for a set time or longer (step S8). When it determines that the low-speed state has continued for the set time or longer (step S8/YES), the in-vehicle device 100 determines that a specific state has been detected, causes the vehicle exterior photographing unit 10 to perform photographing, and acquires an occurrence-time captured image (step S9).
 Next, the in-vehicle device 100 stores the detection time of the specific state and the occurrence-time captured image in the storage unit 110 in association with each other (step S10), and increments by 1 the counter that counts the number of detections of low-speed traveling (step S11).
 If the determination in step S8 is negative (step S8/NO), or after the processing of step S11, the in-vehicle device 100 acquires the acceleration information detected by the acceleration sensor 60 (step S12). The in-vehicle device 100 compares the acquired acceleration information with a preset threshold value to detect sudden acceleration or sudden deceleration of the own vehicle 5 (step S13).
 When sudden acceleration or sudden deceleration of the own vehicle 5 is detected (step S13/YES), the in-vehicle device 100 determines that a specific state has been detected, causes the vehicle exterior photographing unit 10 to perform photographing, and acquires an occurrence-time captured image (step S14).
 Next, the in-vehicle device 100 stores the detection time of the specific state and the occurrence-time captured image in the storage unit 110 in association with each other (step S15), and increments by 1 the counter that counts the number of detections of sudden acceleration or sudden deceleration (step S16).
 If the determination in step S13 is negative (step S13/NO), or after the processing of step S16, the in-vehicle device 100 acquires the interior captured images of the vehicle interior photographing unit 20 and the biological information (step S17). The in-vehicle device 100 detects the driver's specific actions and specific emotions based on the acquired interior captured images and biological information (step S18). When it detects the driver leaning forward or gestures such as hitting the steering wheel or the instrument panel, the in-vehicle device 100 determines that a specific action has been detected. When the estimated emotion is impatience, irritation, tension, or the like, the in-vehicle device 100 determines that a specific emotion has been detected.
 When a specific action or a specific emotion is detected (step S18/YES), the in-vehicle device 100 causes the vehicle exterior photographing unit 10 to perform photographing and acquires an occurrence-time captured image (step S19). The in-vehicle device 100 stores the detection time of the specific action or specific emotion and the occurrence-time captured image in the storage unit 110 in association with each other (step S20), and increments by 1 the counter corresponding to the detected specific state (step S21).
 If the determination in step S18 is negative (step S18/NO), or after executing the processing of step S21, the in-vehicle device 100 determines whether any counter has a count value equal to or greater than its threshold (step S22). If no counter has a count value equal to or greater than its threshold (step S22/NO), the in-vehicle device 100 returns to the determination of step S1. If there is a counter whose count value is equal to or greater than its threshold (step S22/YES), the in-vehicle device 100 determines the driver's driving tendency based on the specific state associated with that counter and causes the display unit 80 to display the determined driving tendency (step S23).
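Put together, one monitoring cycle of FIG. 2 could be organized as in the following sketch. All of the callables are placeholder stand-ins for the sensors, detectors, and storage described above; the wiring at the bottom only demonstrates that the loop runs, it is not the disclosed implementation.

```python
import time

def monitoring_cycle(is_running, detectors, capture_image, save, increment,
                     items_over_threshold, display_tendency) -> None:
    """One pass of the FIG. 2 loop (placeholder interfaces, not the real API)."""
    if not is_running():
        return  # step S1: do nothing until the vehicle is running
    for item, detect in detectors.items():          # S3, S8, S13, S18
        if detect():
            image = capture_image()                 # occurrence-time image (S4/S9/S14/S19)
            save(item, time.time(), image)          # detection time + image (S5/S10/S15/S20)
            increment(item)                         # per-item counter (S6/S11/S16/S21)
    for item in items_over_threshold():             # S22
        display_tendency(item)                      # S23

# Trivial demo wiring with stand-in callables.
counts = {}
monitoring_cycle(
    is_running=lambda: True,
    detectors={"cut_in": lambda: False, "low_speed": lambda: True},
    capture_image=lambda: "exterior_frame",
    save=lambda item, t, img: None,
    increment=lambda item: counts.update({item: counts.get(item, 0) + 1}),
    items_over_threshold=lambda: [i for i, c in counts.items() if c >= 1],
    display_tendency=lambda item: print(f"tendency: frequent {item}"),
)
```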
 Next, the operation of the in-vehicle device 100 from the detection of a specific state to the notification operation will be described with reference to the flowchart shown in FIG. 3.
 First, the in-vehicle device 100 determines whether a specific state has been detected (step S31). When a specific state has been detected (step S31/YES), the in-vehicle device 100 acquires the sonar information of the sonar unit 30 (step S32).
 Based on the acquired sonar information, the in-vehicle device 100 detects other vehicles whose distance from the own vehicle 5 is equal to or less than the set distance (step S33). If no such vehicle is detected (step S33/NO), the in-vehicle device 100 returns to the determination of step S31.
 If another vehicle within the set distance is detected (step S33/YES), the in-vehicle device 100 determines whether the own vehicle 5 is traveling (step S34), in order to exclude cases in which the inter-vehicle distance has become short because of waiting at a traffic light or the like. If the own vehicle 5 is not traveling (step S34/NO), the in-vehicle device 100 returns to the determination of step S31.
 If the own vehicle 5 is traveling (step S34/YES), the in-vehicle device 100 treats the other vehicle detected in step S33 as the target vehicle, instructs the vehicle exterior photographing unit 10 to photograph the target vehicle (step S35), and acquires the exterior captured image in which the target vehicle is photographed.
 Next, the in-vehicle device 100 reads the occurrence-time captured images from the storage unit 110. Referring to the detection times stored in association with the occurrence-time captured images, the in-vehicle device 100 reads only the occurrence-time captured images whose elapsed time from the detection time to the current time is within a predetermined time. The in-vehicle device 100 compares the read occurrence-time captured images with the exterior captured image of the target vehicle and determines whether the target vehicle appears in the occurrence-time captured images (step S36). If the target vehicle appears in an occurrence-time captured image (step S36/YES), the in-vehicle device 100 executes the first notification operation (step S37): the notification control unit 138 causes the display unit 80 to display a warning image and the audio output unit 90 to output a predetermined warning sound.
 If the target vehicle does not appear in the occurrence-time captured images (step S36/NO), the in-vehicle device 100 executes the second notification operation (step S38): the notification control unit 138 causes the display unit 80 to display a warning image.
 As described above, the in-vehicle device 100 of the present embodiment includes the first state information acquisition unit 131, the second state information acquisition unit 132, the state detection unit 133, the image acquisition unit 135, the distance acquisition unit 136, the target vehicle detection unit 137, and the notification control unit 138.
 The first state information acquisition unit 131 acquires traveling state information indicating the traveling state of the own vehicle 5.
 The second state information acquisition unit 132 acquires driver state information indicating the state of the driver of the own vehicle 5.
 Based on the information acquired by the first state information acquisition unit 131 and the second state information acquisition unit 132, the state detection unit 133 determines whether the traveling of the own vehicle 5 is in a specific state that may pose a danger to other vehicles around the own vehicle 5.
 When the state detection unit 133 determines that the vehicle is in the specific state, the image acquisition unit 135 acquires occurrence-time captured images in which the surroundings of the own vehicle 5 are photographed.
 The distance acquisition unit 136 acquires distance information indicating the distance between the own vehicle 5 and other vehicles around the own vehicle 5.
 When another vehicle whose distance indicated by the distance information is equal to or less than the set value is detected, the target vehicle detection unit 137 compares a captured image of that other vehicle with the occurrence-time captured images and detects the other vehicle in the occurrence-time captured images.
 When the other vehicle is detected by the target vehicle detection unit 137, the notification control unit 138 causes the display unit 80 and the audio output unit 90 mounted on the own vehicle 5 to execute the first notification operation, which alerts the driver.
 Accordingly, when the own vehicle 5 detects a specific state that may pose a danger to other vehicles around it, the in-vehicle device 100 detects the distance between the own vehicle 5 and the other vehicles that were present around the own vehicle 5 at the time of detection, and executes the first notification operation to alert the driver when the detected distance is equal to or less than the set value. The driver of the vehicle can therefore be notified that there is a possibility of being tailgated before the tailgating occurs.
 The second state information acquisition unit 132 also acquires the driver's biological information and, based on the acquired biological information, estimates the driver's emotions as the driver state information.
 The state detection unit 133 determines the specific state based on the driver's emotions estimated by the second state information acquisition unit 132.
 This improves the accuracy of estimating the driver's emotions and therefore the accuracy of detecting the specific state.
 The second state information acquisition unit 132 acquires captured images of the interior of the own vehicle 5 and, based on the acquired images, acquires information indicating the driver's actions as the driver state information.
 The state detection unit 133 determines the specific state based on the driver's actions estimated by the second state information acquisition unit 132.
 For example, by detecting actions of the driver such as leaning forward or hitting the steering wheel or instrument panel, the driver's emotions are estimated, and the specific state is detected based on the estimated emotions, which improves the accuracy of detecting the specific state.
 When the target vehicle detection unit 137 cannot detect the other vehicle in the occurrence-time captured images, the notification control unit 138 causes a second notification operation, whose alert level is lower than that of the first notification operation, to be executed.
 Therefore, even for another vehicle that was not present around the own vehicle 5 when the specific state was detected, the driver can be alerted when the distance to the own vehicle 5 is equal to or less than the set value.
 The in-vehicle device 100 also includes the driving tendency determination unit 134, which counts the number of times the state detection unit 133 detects the specific state based on the traveling state information and the number of times it detects the specific state based on the driver state information, and determines the driver's driving tendency based on the counted numbers.
 The notification control unit 138 causes the display unit 80 to display the driving tendency determined by the driving tendency determination unit 134.
 Therefore, the driver's driving tendency can be determined based on the number of detections of the specific state, and the determined driving tendency can be communicated to the driver.
 The embodiment described above merely illustrates one aspect of the present invention, and can be arbitrarily modified and applied without departing from the gist of the present invention.
 For example, the configuration of the in-vehicle device 100 shown in FIG. 1 is a schematic diagram in which the functions of the in-vehicle device 100 are classified according to the main processing content; the configuration of the in-vehicle device 100 may be divided into further blocks according to the processing content. A functional block may also be configured so that one block shown in FIG. 1 executes further processing. The processing of each block may be executed by a single piece of hardware or by multiple pieces of hardware, and may be realized by a single program or by multiple programs.
 The processing units of the flowcharts shown in FIGS. 2 and 3 are divided according to the main processing content in order to make the processing of the in-vehicle device 100 easy to understand; the present invention is not limited by the way the processing units are divided or by their names. The processing of the in-vehicle device 100 may be divided into further processing units according to the processing content, and a single processing unit may be divided so as to include further processing. The processing order of the above flowcharts is not limited to the illustrated examples.
 When the driving assistance method of the present invention is realized by a computer, the program to be executed by the computer can be provided in the form of a recording medium or a transmission medium for transmitting the program. A magnetic or optical recording medium or a semiconductor memory device can be used as the recording medium. Specific examples of the recording medium include a flexible disk, an HDD (Hard Disk Drive), a CD-ROM (Compact Disc Read-Only Memory), a DVD, a Blu-ray (registered trademark) Disc, and a magneto-optical disk. The recording medium may also be a portable or fixed recording medium such as a flash memory or a card-type recording medium. The recording medium may also be a non-volatile storage device such as a RAM, ROM, or HDD that is an internal storage device of the display device.
1 Driving assistance system
5 Vehicle
10 Vehicle exterior photographing unit
11 Front camera
13 Rear camera
15 Left side camera
17 Right side camera
20 Vehicle interior photographing unit
25 Camera
30 Sonar unit
40 Biosensor
50 Vehicle speed sensor
60 Acceleration sensor
70 Operation unit
85 Touch panel
90 Audio output unit
91 Audio processing unit
93 Speaker
100 In-vehicle device
110 Storage unit
130 Processor
131 First state information acquisition unit
132 Second state information acquisition unit
133 State detection unit
134 Driving tendency determination unit
135 Image acquisition unit
136 Distance acquisition unit
137 Target vehicle detection unit
138 Notification control unit

Claims (6)

  1.  An in-vehicle device mounted on a vehicle, the in-vehicle device comprising:
     a first state information acquisition unit that acquires traveling state information indicating a traveling state of the vehicle;
     a second state information acquisition unit that acquires driver state information indicating a state of a driver of the vehicle;
     a state determination unit that determines, based on the information acquired by the first state information acquisition unit and the second state information acquisition unit, whether the traveling of the vehicle is in a specific state that may pose a danger to other vehicles around the vehicle;
     an image acquisition unit that acquires an occurrence-time captured image in which the surroundings of the vehicle are photographed when the state determination unit determines that the vehicle is in the specific state;
     a distance acquisition unit that acquires distance information indicating a distance between the vehicle and another vehicle around the vehicle;
     a detection unit that, when another vehicle whose distance indicated by the distance information is equal to or less than a set value is detected, compares a captured image of the other vehicle with the occurrence-time captured image and detects the other vehicle in the occurrence-time captured image; and
     a notification control unit that, when the other vehicle is detected by the detection unit, causes a notification unit mounted on the vehicle to execute a first notification operation that alerts the driver.
  2.  The in-vehicle device according to claim 1, wherein
     the second state information acquisition unit acquires biological information of the driver and, based on the acquired biological information, estimates the driver's emotions as the driver state information, and
     the state determination unit determines the specific state based on the driver's emotions estimated by the second state information acquisition unit.
  3.  The in-vehicle device according to claim 1 or 2, wherein
     the second state information acquisition unit acquires a captured image of the interior of the vehicle and, based on the acquired captured image, acquires information indicating the driver's actions as the driver state information, and
     the state determination unit determines the specific state based on the driver's actions estimated by the second state information acquisition unit.
  4.  The in-vehicle device according to any one of claims 1 to 3, wherein the notification control unit causes a second notification operation, whose alert level is lower than that of the first notification operation, to be executed when the detection unit cannot detect the other vehicle in the occurrence-time captured image.
  5.  The in-vehicle device according to any one of claims 1 to 4, further comprising:
     a tendency determination unit that counts the number of times the state determination unit detects the specific state based on the traveling state information and the number of times the state determination unit detects the specific state based on the driver state information, and determines the driver's driving tendency based on the counted numbers; and
     a display control unit that causes a display unit to display the driving tendency determined by the tendency determination unit.
  6.  A driving assistance method comprising:
     a step of acquiring traveling state information indicating a traveling state of a vehicle;
     a step of acquiring driver state information indicating a state of a driver of the vehicle;
     a step of determining, based on the acquired traveling state information and driver state information, whether the traveling of the vehicle is in a specific state that may pose a danger to other vehicles around the vehicle;
     a step of acquiring an occurrence-time captured image in which the surroundings of the vehicle are photographed when it is determined that the vehicle is in the specific state;
     a step of acquiring distance information indicating a distance between the vehicle and another vehicle around the vehicle;
     a step of, when another vehicle whose distance indicated by the distance information is equal to or less than a set value is detected, comparing a captured image of the other vehicle with the occurrence-time captured image and detecting the other vehicle in the occurrence-time captured image; and
     a step of causing a notification unit mounted on the vehicle to execute a first notification operation that alerts the driver when the other vehicle is detected in the occurrence-time captured image.
PCT/JP2021/026182 2020-08-06 2021-07-12 In-vehicle device and driving assistance method WO2022030191A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2020-133511 2020-08-06
JP2020133511A JP2022029911A (en) 2020-08-06 2020-08-06 On-vehicle device and operation support method

Publications (1)

Publication Number Publication Date
WO2022030191A1 true WO2022030191A1 (en) 2022-02-10

Family

ID=80117933

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2021/026182 WO2022030191A1 (en) 2020-08-06 2021-07-12 In-vehicle device and driving assistance method

Country Status (2)

Country Link
JP (1) JP2022029911A (en)
WO (1) WO2022030191A1 (en)

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2010015451A (en) * 2008-07-04 2010-01-21 Toyota Motor Corp Driver's state recording apparatus
JP2018045303A (en) * 2016-09-12 2018-03-22 株式会社デンソー Driving assist system
JP2020067765A (en) * 2018-10-23 2020-04-30 三菱電機株式会社 Guide device, guide system, guide method and program

Also Published As

Publication number Publication date
JP2022029911A (en) 2022-02-18

Similar Documents

Publication Publication Date Title
JP5082834B2 (en) Aside look detection device and method, and program
US11027746B2 (en) Drive mode switching control device, method and program
JP2009040107A (en) Image display control device and image display control system
JP6462629B2 (en) Driving support device and driving support program
JP6103383B2 (en) Vehicle driving evaluation system
JP2003123196A (en) Device and program for monitoring circumference of vehicle
JP2010033106A (en) Driver support device, driver support method, and driver support processing program
KR20150043659A (en) Method for alarming status of vehicle and Apparatus for the same
JP2017182776A (en) Vehicle periphery monitoring apparatus and computer program
JPWO2011114638A1 (en) Vehicle perimeter monitoring apparatus and vehicle perimeter monitoring method
TWI557003B (en) Image based intelligent security system for vehicle combined with sensor
US20180162274A1 (en) Vehicle side-rear warning device and method using the same
CN111645674B (en) Vehicle control device
CN113232659A (en) Blind spot information acquisition device and method, vehicle, and recording medium having program recorded thereon
JP6811743B2 (en) Safe driving support device
KR20120046645A (en) Appratus and method for vehicle drive assistance
WO2022030191A1 (en) In-vehicle device and driving assistance method
KR20160129984A (en) Drowsiness detection apparatus and method
JP2023002810A (en) output device
KR101666343B1 (en) System for processing integrated data of the ultra sonic sensors and camera on vehicle
JP2015191437A (en) Driving support device
KR20200082463A (en) Video recording apparatus and operating method for the same
JP2020013369A (en) Operation support device, operation support system, and operation support program
KR102296213B1 (en) Video recording apparatus and operating method thereof
US20230415574A1 (en) Parking assistance system and parking assistance method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21853217

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21853217

Country of ref document: EP

Kind code of ref document: A1