WO2018230245A1 - Traveling support device, control program, and computer-readable non-transitory tangible recording medium - Google Patents

Traveling support device, control program, and computer-readable non-transitory tangible recording medium

Info

Publication number
WO2018230245A1
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle
driving
stationary
stationary object
front
Application number
PCT/JP2018/019062
Other languages
French (fr)
Japanese (ja)
Inventor
希 北川
Original Assignee
DENSO CORPORATION
Priority to JP2017117934A (published as JP2019001314A)
Application filed by DENSO CORPORATION
Publication of WO2018230245A1

Classifications

    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 - Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B60 - VEHICLES IN GENERAL
    • B60W - CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00 - Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08 - Interaction between the driver and the control system
    • B60W50/14 - Means for informing the driver, warning the driver or prompting a driver intervention
    • B - PERFORMING OPERATIONS; TRANSPORTING
    • B62 - LAND VEHICLES FOR TRAVELLING OTHERWISE THAN ON RAILS
    • B62D - MOTOR VEHICLES; TRAILERS
    • B62D6/00 - Arrangements for automatically controlling steering depending on driving conditions sensed and responded to, e.g. control circuits
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01C - MEASURING DISTANCES, LEVELS OR BEARINGS; SURVEYING; NAVIGATION; GYROSCOPIC INSTRUMENTS; PHOTOGRAMMETRY OR VIDEOGRAMMETRY
    • G01C21/00 - Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00
    • G01C21/26 - Navigation; Navigational instruments not provided for in preceding groups G01C1/00-G01C19/00 specially adapted for navigation in a road network
    • G - PHYSICS
    • G08 - SIGNALLING
    • G08G - TRAFFIC CONTROL SYSTEMS
    • G08G1/00 - Traffic control systems for road vehicles
    • G08G1/16 - Anti-collision systems

Abstract

The traveling support device is used in a vehicle capable of being switched between automated driving and manual driving, and is provided with: a stationary object detection part for detecting a stationary object ahead of the vehicle; and a request processing part for making a takeover request, which asks the driver of the vehicle to take over from automated driving and thus switch to manual driving, on the basis of the stationary object detection part detecting a stationary object ahead of the vehicle while the vehicle is traveling by automated driving on a road without a passing lane.

Description

Driving support device, control program, and computer-readable non-transitory tangible recording medium

Cross-Reference to Related Applications

This application is based on Japanese Patent Application No. 2017-117934 filed on June 15, 2017, the contents of which are incorporated herein by reference.

The present disclosure relates to a driving support device and a control program that support driving of a vehicle.

Automated driving that automatically controls the acceleration, deceleration, and steering of a vehicle is conventionally known. For example, Patent Document 1 discloses a technique in which, during inter-vehicle distance control traveling, when a preceding vehicle traveling ahead of the host vehicle is slower than the host vehicle's set speed, it is determined whether or not the preceding vehicle can be overtaken, and the preceding vehicle is overtaken by automatic maneuvering. In the technique disclosed in Patent Document 1, when there is no overtaking lane on the road on which the vehicle travels, the vehicle follows the preceding vehicle without overtaking.

However, the technique disclosed in Patent Document 1 does not assume a case where a stationary object such as a parked vehicle is present on a road without an overtaking lane. Even if the preceding vehicle in Patent Document 1 is read as a parked vehicle, the host vehicle follows the speed of that vehicle when there is no overtaking lane, stopping behind every such vehicle, and smooth travel is hindered.

JP 2003-63273 A

The present disclosure is intended to provide a driving support device and a control program that make smoother travel on a road without an overtaking lane easier to achieve in a vehicle that performs automatic driving.

In a first aspect of the present disclosure, a travel support device used in a vehicle capable of switching between automatic driving and manual driving includes a stationary object detection unit that detects a stationary object in front of the vehicle, and a request processing unit that, based on the stationary object detection unit detecting a stationary object in front of the vehicle while the vehicle is traveling by automatic driving on a road without an overtaking lane, makes a driving change request asking the driver of the vehicle to change from automatic driving to manual driving.

According to the driving support device described above, a driving change request to manual driving is made based on the stationary object detection unit detecting a stationary object in front of the vehicle while the vehicle is traveling by automatic driving on a road without an overtaking lane, so that driving can be switched to manual driving. Therefore, instead of the vehicle stopping by automatic driving every time a stationary object appears ahead on a road without an overtaking lane, the driver can pass the stationary object by manual driving while confirming safety. Smoother travel on a road without an overtaking lane thus becomes possible. In addition, since the change to manual driving is made through a driving change request, the difficulty of passing a stationary object in front of the vehicle by automatic driving while avoiding oncoming vehicles can easily be avoided. As a result, smoother travel on a road without an overtaking lane can be achieved more easily.

In a second aspect of the present disclosure, a control program for controlling a driving support device used in a vehicle capable of switching between automatic driving and manual driving causes a computer to function as a stationary object detection unit that detects a stationary object in front of the vehicle, and as a request processing unit that, based on the stationary object detection unit detecting a stationary object in front of the vehicle while the vehicle is traveling by automatic driving on a road without an overtaking lane, makes a driving change request asking the driver of the vehicle to change from automatic driving to manual driving.

According to the above control program, a driving change request to manual driving is made based on the stationary object detection unit detecting a stationary object in front of the vehicle while the vehicle is traveling by automatic driving on a road without an overtaking lane, so that driving can be switched to manual driving. Therefore, instead of the vehicle stopping by automatic driving every time a stationary object appears ahead on a road without an overtaking lane, the driver can pass the stationary object by manual driving while confirming safety. Smoother travel on a road without an overtaking lane thus becomes possible. In addition, since the change to manual driving is made through a driving change request, the difficulty of passing a stationary object in front of the vehicle by automatic driving while avoiding oncoming vehicles can easily be avoided. As a result, smoother travel on a road without an overtaking lane can be achieved more easily.

In a third aspect of the present disclosure, a computer-readable non-transitory tangible recording medium includes instructions executed by a computer that controls a driving support device used in a vehicle capable of switching between automatic driving and manual driving, the instructions comprising: detecting a stationary object in front of the vehicle; and, based on a stationary object being detected in front of the vehicle while the vehicle is traveling by automatic driving on a road without an overtaking lane, making a driving change request asking the driver of the vehicle to change from automatic driving to manual driving.

According to the computer-readable non-transitory tangible recording medium, a driving change request to manual driving is made based on a stationary object being detected in front of the vehicle while the vehicle is traveling by automatic driving on a road without an overtaking lane, so that driving can be switched to manual driving. Therefore, instead of the vehicle stopping by automatic driving every time a stationary object appears ahead on a road without an overtaking lane, the driver can pass the stationary object by manual driving while confirming safety. Smoother travel on a road without an overtaking lane thus becomes possible. In addition, since the change to manual driving is made through a driving change request, the difficulty of passing a stationary object in front of the vehicle by automatic driving while avoiding oncoming vehicles can easily be avoided. As a result, smoother travel on a road without an overtaking lane can be achieved more easily.

The above and other objects, features, and advantages of the present disclosure will become more apparent from the following detailed description with reference to the accompanying drawings, in which:
FIG. 1 is a diagram showing an example of a schematic configuration of the driving support system;
FIG. 2 is a diagram showing an example of a schematic configuration of the automatic driving ECU;
FIG. 3 is a flowchart showing an example of the flow of the process related to detection of a front stationary vehicle in the automatic driving ECU according to the first embodiment;
FIG. 4 is a flowchart showing an example of the flow of the TOR presence/absence determination process in the automatic driving ECU according to the first embodiment;
FIG. 5 is a diagram for explaining the selective use of the TOR in the process related to detection of a front stationary vehicle;
FIG. 6 is a diagram for explaining the travel of each vehicle when the post-stop request function is selected and when the pre-stop request function is selected in the process related to detection of a front stationary vehicle;
FIG. 7 is a flowchart showing an example of the flow of the process related to detection of a front stationary object in the automatic driving ECU according to the second embodiment;
FIG. 8 is a flowchart showing an example of the flow of the TOR presence/absence determination process in the automatic driving ECU according to the third embodiment;
FIG. 9 is a flowchart showing an example of the flow of the TOR presence/absence determination process in the automatic driving ECU according to the fourth embodiment.

A plurality of embodiments of the present disclosure will be described with reference to the drawings. For convenience of explanation, parts having the same functions as parts shown in drawings used in the preceding description may be given the same reference numerals across the embodiments, and their description may be omitted. For parts denoted by the same reference numerals, the description in the other embodiments can be referred to.

(Embodiment 1)
<Schematic configuration of the driving support system 1>
Hereinafter, Embodiment 1 of the present disclosure will be described with reference to the drawings. A travel support system 1 shown in FIG. 1 is used in a vehicle such as an automobile, and includes an automatic driving ECU 10, an ADAS (Advanced Driver Assistance Systems) locator 20, a vehicle control ECU 30, a periphery monitoring sensor 40, and an HMI (Human Machine Interface) system 50. The automatic driving ECU 10, the ADAS locator 20, the vehicle control ECU 30, and the HMI system 50 may be connected to an in-vehicle LAN, for example. Hereinafter, the vehicle using the driving support system 1 is referred to as the host vehicle.

The ADAS locator 20 includes a GNSS (Global Navigation Satellite System) receiver 21, an inertial sensor 22, and a map database (hereinafter, map DB) 23 storing map data. The GNSS receiver 21 receives positioning signals from a plurality of artificial satellites. The inertial sensor 22 includes, for example, a three-axis gyro sensor and a three-axis acceleration sensor. The map DB 23 is a non-volatile memory and stores map data such as link data, node data, road shapes, and structures. The map data may include a three-dimensional map made up of point clouds of feature points of road shapes and structures.

The link data consists of a unique number identifying the link, the link length, the link direction, link shape information, the node coordinates of the start and end of the link, and road attribute data. The road attributes include the road name, road type, road width, number of lanes, speed limit, and the like. The node data consists of data such as a node ID, node coordinates, a node name, a node type, the link IDs of the links connected to the node, and an intersection type. The map data of structures includes traffic lights, pedestrian crossings, and the like.

The ADAS locator 20 sequentially measures the vehicle position of the host vehicle by combining the positioning signals received by the GNSS receiver 21 with the measurement results of the inertial sensor 22. The travel distance obtained from the pulse signals sequentially output from a wheel speed sensor of the host vehicle may also be used for positioning the vehicle position. The measured vehicle position is output to the in-vehicle LAN. The ADAS locator 20 also reads map data from the map DB 23 and outputs it to the in-vehicle LAN. The map data may be obtained from outside the vehicle using a communication module. Further, the ADAS locator 20 may be configured to sequentially specify the vehicle position of the host vehicle with respect to the three-dimensional map instead of relying on the GNSS receiver 21.

The vehicle control ECU 30 is an electronic control device that performs acceleration/deceleration control and steering control of the host vehicle. The vehicle control ECU 30 includes a steering ECU that performs steering control, a power unit control ECU and a brake ECU that perform acceleration/deceleration control, and the like. The vehicle control ECU 30 acquires detection signals output from vehicle state sensors mounted on the host vehicle, such as an accelerator position sensor, a brake pedal force sensor, a steering angle sensor, and a wheel speed sensor, and outputs control signals to each travel control device, such as an electronically controlled throttle, a brake actuator, and an EPS (Electric Power Steering) motor. Further, the vehicle control ECU 30 can output the detection signals of the above vehicle state sensors to the in-vehicle LAN.

The periphery monitoring sensor 40 detects obstacles such as pedestrians, animals other than humans, bicycles, motorcycles, other vehicles, objects fallen on the road, guardrails, curbs, and trees, as well as road markings such as travel lane lines and stop lines. The periphery monitoring sensor 40 is, for example, a periphery monitoring camera that captures a predetermined range around the host vehicle, or a sensor that transmits an exploration wave to a predetermined range around the host vehicle, such as a millimeter-wave radar, sonar, or LIDAR (Light Detection and Ranging / Laser Imaging Detection and Ranging). The periphery monitoring camera sequentially outputs captured images to the automatic driving ECU 10 as sensing information. A sensor that transmits an exploration wave, such as sonar, millimeter-wave radar, or LIDAR, sequentially outputs to the automatic driving ECU 10, as sensing information, the scanning results based on the received signals obtained when the reflected waves from obstacles are received.

As shown in FIG. 1, the HMI system 50 includes an HCU (Human Machine Interface Control Unit) 51, an operation device 52, a DSM (Driver Status Monitor) 53, a display device 54, and an audio output device 55. The HMI system 50 receives input operations from the driver, monitors the driver state of the driver of the host vehicle, and presents information to the driver of the host vehicle. The driver state is a physical or psychological state of the driver of the host vehicle. The physical state is not limited to the driver's physical condition and may include, for example, a state in which the driver is looking aside.

The operation device 52 is a group of switches operated by the driver of the host vehicle. The operation device 52 is used to perform various settings. For example, the operation device 52 includes a steering switch provided in a spoke portion of the steering wheel of the host vehicle, a touch switch integrated with the display device 54, and the like.

As an example, the DSM 53 includes a near-infrared light source, a near-infrared camera, and a control unit that controls them. The DSM 53 is arranged, for example, on the steering column cover in a posture in which the near-infrared camera faces the driver's seat of the host vehicle. The DSM 53 may be arranged at another position, such as on the upper surface of the instrument panel, as long as it can image the face of the driver seated in the driver's seat of the host vehicle.

The DSM 53 uses a near-infrared camera to photograph the driver's head irradiated with near-infrared light from a near-infrared light source. The image captured by the near-infrared camera is analyzed by the control unit. The control unit detects the driver state such as the driver's face direction, line-of-sight direction, and sleepiness from the captured image. The detected driver state is output to the HCU 51.

As an example, the DSM 53 detects parts such as the face contour, eyes, nose, and mouth by image recognition processing from a captured image of the driver's face taken by the near-infrared camera (hereinafter, face image), and detects the driver's face orientation from the relative positional relationship of these parts. Further, the DSM 53 may detect the driver's pupil and corneal reflection from the face image by image recognition processing and detect the line-of-sight direction with respect to a reference position in the vehicle interior based on the positional relationship between the detected pupil and corneal reflection. The reference position may be, for example, the installation position of the near-infrared camera. The line-of-sight direction may also be detected taking the face orientation into account.

Furthermore, the DSM 53 detects eye closure by calculating the change in eyelid shape detected from the face image as a degree of eye opening. The degree of drowsiness (hereinafter, drowsiness level) is then detected from the temporal change in the degree of eye closure, the shape features of the facial parts, their temporal changes, and the like. The DSM 53 may also be configured to detect driver states other than drowsiness, such as concentration, tension, anxiety, and comfort or discomfort, from the shape features of the facial parts detected from the face image, their temporal changes, and the like.
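
The drowsiness-level estimation from the degree of eye opening described above can be illustrated with a brief sketch. This is a hypothetical, assumption-based example rather than the DSM's actual algorithm: it assumes a per-frame eye-opening degree in the range 0 to 1 is already available from the face-image analysis and derives a coarse PERCLOS-style drowsiness level from the fraction of recent frames in which the eyes are mostly closed.

```python
from collections import deque

class DrowsinessEstimator:
    """Hypothetical sketch: drowsiness level from the temporal change of the eye-opening degree."""

    def __init__(self, window_frames=900, closed_threshold=0.2):
        # 900 frames is roughly 30 s at 30 fps; all values here are illustrative assumptions.
        self.closed_threshold = closed_threshold
        self.history = deque(maxlen=window_frames)

    def update(self, eye_opening_degree: float) -> int:
        """eye_opening_degree: 0.0 (fully closed) to 1.0 (fully open), one sample per frame."""
        self.history.append(eye_opening_degree < self.closed_threshold)
        closed_ratio = sum(self.history) / len(self.history)  # PERCLOS-like closed-eye ratio
        # Map the closed-eye ratio to a coarse drowsiness level (0 = alert .. 3 = very drowsy).
        if closed_ratio < 0.15:
            return 0
        if closed_ratio < 0.30:
            return 1
        if closed_ratio < 0.50:
            return 2
        return 3
```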

The display device 54 displays display information such as various images and text for information notification on the display screen based on the image data acquired from the HCU 51. Examples of the display device 54 include a combination meter display, CID (Center Information Display), and HUD (Head-Up Display). The display of the combination meter is disposed, for example, in front of the driver's seat. The CID is arranged above the center cluster. The HUD projects image light based on the image data acquired from the HCU 51 onto a projection area defined by the front windshield, so that a virtual image of this image is superimposed on a part of the foreground so that the driver can visually recognize the image. The projection member on which the HUD projects light is not limited to the front windshield, and may be a light transmissive combiner.

Examples of the audio output device 55 include an audio speaker. The audio speaker is disposed, for example, in the lining of the door of the own vehicle. The audio speaker presents information directed to the driver by the reproduced sound.

The HCU 51 is configured mainly by a microcomputer including a processor, a memory, an I / O, and a bus for connecting them, and is connected to the operation device 52, the DSM 53, the display device 54, the audio output device 55, and the in-vehicle LAN. The HCU 51 executes various processes related to the functions of the HMI system 50 by executing the control program stored in the memory.

The automatic driving ECU 10 is mainly configured by a microcomputer including a processor, a memory, an I/O, and a bus connecting them, and is connected to the periphery monitoring sensor 40 and the in-vehicle LAN. The memory referred to here is a non-transitory tangible storage medium that stores computer-readable programs and data in a non-transitory manner. The non-transitory tangible storage medium is realized by a semiconductor memory, a magnetic disk, or the like. The automatic driving ECU 10 implements various functions by executing a control program stored in the memory. A plurality of processors may be used. The automatic driving ECU 10 executes functions related to driving support, such as an automatic driving function for automatically driving the host vehicle. This automatic driving ECU 10 corresponds to a driving support device.

<Schematic configuration of the automatic driving ECU 10>
Here, a schematic configuration of the automatic driving ECU 10 will be described with reference to FIG. 2. As shown in FIG. 2, the automatic driving ECU 10 includes a traveling environment recognition unit 100, a support unit 110, an ECU communication unit 120, an HCU communication unit 130, and a control selection determination unit 140 as functional blocks. Some or all of the functions executed by the automatic driving ECU 10 may be configured in hardware by one or a plurality of ICs. Further, some or all of the functional blocks of the automatic driving ECU 10 may be realized by a combination of software executed by a processor and hardware members.

The traveling environment recognition unit 100 recognizes the traveling environment of the host vehicle from the vehicle position and map data acquired from the ADAS locator 20, the sensing information acquired from the periphery monitoring sensor 40, and the like. As an example, the traveling environment recognition unit 100 recognizes the shape and movement state of objects around the host vehicle within the sensing range of the periphery monitoring sensor 40 from its sensing information and, by combining this with the vehicle position and the map data, generates a virtual space in which the actual traveling environment is reproduced in three dimensions. The traveling environment recognition unit 100 may also recognize, as the traveling environment, the distance to obstacles including vehicles around the host vehicle, the relative speed of the obstacles with respect to the host vehicle, and the like from the sensing information acquired from the periphery monitoring sensor 40.

Further, the traveling environment recognition unit 100 recognizes a stationary object in front of the host vehicle as part of the traveling environment. The traveling environment recognition unit 100 therefore corresponds to a stationary object detection unit. The traveling environment recognition unit 100 detects a stationary object in front of the host vehicle up to a distance corresponding to the sensing distance of the periphery monitoring sensor 40. The stationary object in front of the vehicle referred to here may be a stationary object in the travel lane ahead of the host vehicle. Whether an object is stationary may be determined from the temporal change in the position of the detected object. The travel lane of the host vehicle may be recognized based on the travel lane lines detected by the periphery monitoring sensor 40. Furthermore, the traveling environment recognition unit 100 detects whether or not the stationary object in front of the host vehicle is a vehicle. As an example, a known image recognition technique such as pattern matching may be used to distinguish whether or not a stationary object in front of the host vehicle is a vehicle (hereinafter, stationary vehicle).
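
As a rough illustration of identifying the stationary state from the temporal change in a detected object's position, the following sketch flags an object as stationary when the position tracked over a short window moves less than a small tolerance. The object representation and the tolerance value are assumptions for illustration, not the traveling environment recognition unit's actual implementation.

```python
import math

def is_stationary(track, min_samples=5, tolerance_m=0.5):
    """track: list of (x, y) positions of one detected object in a world frame, oldest first.
    Returns True when the object has barely moved over the tracked window (assumed criterion)."""
    if len(track) < min_samples:
        return False  # not enough history to judge
    (x0, y0), (x1, y1) = track[0], track[-1]
    displacement = math.hypot(x1 - x0, y1 - y0)
    return displacement < tolerance_m
```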

In addition, when the traveling environment recognition unit 100 distinguishes and recognizes a stationary vehicle, it also distinguishes whether or not the stationary vehicle is a parked vehicle. A parked vehicle referred to here is a vehicle whose left and right turn signal lamps are both lit (hereinafter, hazard lamps), or a vehicle that is stationary without any lighting device, including the hazard lamps, being lit. As an example, the traveling environment recognition unit 100 may distinguish and detect a stationary vehicle in front of the host vehicle as a parked vehicle when the color of the hazard lamps is recognized in its image, or when the color of no lighting device is recognized.
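
The distinction drawn above between a parked vehicle and a merely stopped vehicle could be expressed as in the following sketch, which assumes that the lamp states have already been extracted from the camera image; the data structure is illustrative only.

```python
from dataclasses import dataclass

@dataclass
class StationaryVehicleLamps:
    hazard_lamps_on: bool  # both left and right turn signal lamps recognized as lit
    any_lamp_lit: bool     # any lighting device (brake, tail, turn signal) recognized as lit

def is_parked_vehicle(lamps: StationaryVehicleLamps) -> bool:
    """Per the description: hazards on, or stationary with no lighting device lit at all."""
    return lamps.hazard_lamps_on or not lamps.any_lamp_lit
```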

The support unit 110 executes functions related to driving support of the host vehicle. As shown in FIG. 2, the support unit 110 includes an automatic driving function unit 111 and an MRM (Minimum Risk Manoeuvres) function unit 112 as sub-function blocks that execute functions related to driving support of the host vehicle.

The automatic driving function unit 111 executes an automatic driving function for performing automatic driving. In other words, the automatic driving function unit 111 performs the driving operation of the host vehicle in place of the driver by automatically performing acceleration / deceleration control and steering control of the host vehicle in cooperation with the vehicle control ECU 30. Based on the driving environment recognized by the driving environment recognition unit 100, the automatic driving function unit 111 generates a driving plan for driving the vehicle by automatic driving.

For example, as a medium- to long-term travel plan, a recommended route for directing the host vehicle to a destination set by the driver or the like is generated. A schedule of planned driving changes from automatic driving to manual driving by the driver is set mainly based on this medium- to long-term travel plan. The automatic driving function unit 111 also generates short-term travel plans for traveling along the recommended route. As specific examples, execution of steering for a lane change, acceleration or deceleration for speed adjustment, and steering and braking for obstacle avoidance is determined. The automatic driving function unit 111 then performs automatic driving by performing acceleration/deceleration control and steering control of the host vehicle in cooperation with the vehicle control ECU 30 in accordance with the generated travel plans. As the automatic driving, the automatic driving function unit 111 performs automatic driving in which both acceleration/deceleration control and steering control of the host vehicle are performed automatically. The automatic driving performed by the automatic driving function unit 111 can be changed to manual driving.

The MRM function unit 112 executes an MRM function that automatically evacuates (stops) the host vehicle when a driving change request to manual driving has been made while automatic driving is continued but the driver has not performed a driving operation by the limit for the change to manual driving. The driving change request to manual driving made from the driving support system 1 to the driver is hereinafter referred to as TOR (Take Over Request). As an example of the MRM function, the host vehicle can be automatically stopped by performing at least acceleration/deceleration control of the host vehicle in cooperation with the vehicle control ECU 30, based on the driving environment recognized by the traveling environment recognition unit 100.

The ECU communication unit 120 performs information output processing to the vehicle control ECU 30 and information acquisition processing from the vehicle control ECU 30. The ECU communication unit 120 generates vehicle control information instructing acceleration/deceleration and steering in accordance with the functions related to travel support executed by the support unit 110, and sequentially outputs it to the vehicle control ECU 30 together with driving state information indicating the operating state of automatic driving. Further, the ECU communication unit 120 can sequentially acquire state information indicating the control state of each travel control device from the vehicle control ECU 30 and correct the content of the vehicle control information.

The ECU communication unit 120 has a vehicle state acquisition unit 121 as a sub-function block. The vehicle state acquisition unit 121 sequentially acquires the signals output from the vehicle state sensors as vehicle state information. The vehicle state acquisition unit 121 may also sequentially acquire detection information indicating the gripping state of the steering wheel, detected, for example, by a grip sensor provided on the steering wheel. The vehicle state information and the detection information are provided to a driving mode selection unit 141 described later and are used when driving is switched from automatic driving to manual driving.

The HCU communication unit 130 performs information output processing to the HCU 51 and information acquisition processing from the HCU 51. As shown in FIG. 2, the HCU communication unit 130 includes a setting acquisition unit 131, a driving change request unit 132, a driver state acquisition unit 133, and a presentation processing unit 134 as sub-function blocks.

The setting acquisition unit 131 acquires, from the HCU 51, setting information on the settings made by the driver through operation input via the operation device 52. Examples of the setting information include a setting of whether or not the automatic driving function is used, and a setting for switching between a pre-stop request function and a post-stop request function, which are described below. The pre-stop request function is a function that, when a TOR is performed, performs the TOR before stopping the host vehicle by automatic driving. The post-stop request function is a function that, when a TOR is performed, performs the TOR after stopping the host vehicle by automatic driving.

The driving change request unit 132 generates change request information for requesting a driving change from automatic driving to manual driving and outputs it to the HCU 51. The driving change request unit 132 then causes a TOR requesting the driving change to manual driving to be made to the driver through control of the display device 54 and/or the audio output device 55 in cooperation with the HCU 51. The driving change request unit 132 causes the TOR to be performed according to the determination result of whether or not to perform the TOR in the TOR determination unit 144 described later. The driving change request unit 132 corresponds to a request processing unit.

The driver state acquisition unit 133 acquires the driver state of the driver of the host vehicle. The driver state acquisition unit 133 may be configured to sequentially acquire the driver state detected by the DSM 53 from the HCU 51. The presentation processing unit 134 causes information to be presented to the driver of the host vehicle through control of the display device 54 and/or the audio output device 55 in cooperation with the HCU 51.

The control selection determination unit 140 performs processing related to the selection of control according to conditions. As shown in FIG. 2, the control selection determination unit 140 includes a driving mode selection unit 141, a function selection unit 142, a manual driving determination unit 143, and a TOR determination unit 144 as sub-function blocks.

The driving mode selection unit 141 switches the driving mode of the host vehicle among a plurality of predefined driving modes by controlling the change of the operating state of the functions related to driving support of the host vehicle. The plurality of driving modes switched by the driving mode selection unit 141 include an automatic evacuation mode and an immediate stop mode in addition to a manual driving mode in which manual driving is performed and an automatic driving mode in which automatic driving is performed.

In the manual driving mode, the automatic driving function is stopped and the driver controls the travel of the host vehicle. The vehicle control ECU 30, while acquiring driving state information indicating the manual driving mode, generates control signals according to the vehicle state information acquired from the vehicle state sensors and outputs them to the travel control devices. In the automatic driving mode, the running automatic driving function controls the travel of the host vehicle. The vehicle control ECU 30, while acquiring driving state information indicating the automatic driving mode, generates control signals according to the vehicle control information acquired from the automatic driving function unit 111 and outputs them to the travel control devices. As an example, the driving mode selection unit 141 switches between selecting the manual driving mode and the automatic driving mode according to the setting, in the setting information acquired by the setting acquisition unit 131, of whether or not the automatic driving function is used.

The automatic evacuation mode is a specific aspect of the automatic driving mode. In the automatic evacuation mode, the MRM function is executed to automatically evacuate the host vehicle when a driving change request to manual driving has been made while automatic driving is continued but no driving operation by the driver has been determined by the manual driving determination unit 143 by the limit for the change to manual driving. The immediate stop mode is another aspect of the automatic driving mode. In the immediate stop mode, the host vehicle is automatically stopped before the TOR is performed when driving is to be changed to manual driving.

In the automatic driving mode, the driving mode selection unit 141 switches from the automatic driving mode to the manual driving mode when the automatic driving section ends, based on the medium- to long-term travel plan generated by the automatic driving function unit 111. This is hereinafter referred to as a planned driving change. In the planned driving change, the TOR is performed while automatic driving is continued, and the mode may be switched to the manual driving mode when, for example, a driving operation by the driver is determined by the manual driving determination unit 143 within a set time such as 10 seconds. On the other hand, when no driving operation by the driver is determined by the manual driving determination unit 143 within the set time, the mode may be switched to the automatic evacuation mode. The set time referred to here can be set arbitrarily as long as the automatic driving section is not exceeded without switching to the manual driving mode.
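
A minimal sketch of the planned driving change described above (issue the TOR, wait up to a set time for a driver operation, otherwise fall back to the automatic evacuation mode) might look as follows. The function and mode names are illustrative assumptions, not the patent's code.

```python
import time
from enum import Enum, auto

class DriveMode(Enum):
    AUTOMATIC = auto()
    MANUAL = auto()
    AUTO_EVACUATION = auto()  # MRM fallback

def planned_driving_change(issue_tor, driver_operation_detected, set_time_s=10.0):
    """issue_tor(): presents the take-over request via the display and/or audio output device.
    driver_operation_detected(): returns True once e.g. a steering-wheel grip is confirmed."""
    issue_tor()
    deadline = time.monotonic() + set_time_s
    while time.monotonic() < deadline:        # automatic driving continues in the meantime
        if driver_operation_detected():
            return DriveMode.MANUAL           # hand the driving over to the driver
        time.sleep(0.1)
    return DriveMode.AUTO_EVACUATION          # MRM: automatically stop (evacuate) the vehicle
```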

In addition, the driving mode selection unit 141 switches from the automatic driving mode to the manual driving mode when a situation in which it is desirable to stop automatic driving suddenly arises in the automatic driving mode. This is hereinafter referred to as an unplanned driving change. Situations that suddenly arise and in which it is desirable to stop automatic driving include cases where the course must be changed to avoid an obstacle ahead but changing course by automatic driving is difficult, cases where the sensing accuracy decreases, and cases where an abnormality of the periphery monitoring sensor 40 is detected.

A case where the course must be changed to avoid an obstacle ahead but changing course by automatic driving is difficult is, for example, a case where a stationary vehicle in front of the host vehicle is detected while the host vehicle is traveling by automatic driving on a road without an overtaking lane. Hereinafter, this situation is referred to as detection of a front stationary vehicle on a road without an overtaking lane. This is because, in automatic driving, when passing the side of a stationary vehicle in front of the host vehicle on a road without an overtaking lane, it is difficult to confirm that the host vehicle will not meet an oncoming vehicle. Details are described below.

For example, consider a case where a parked vehicle with a length of 12 m and a width of 2.5 m is located in front of the host vehicle on a road with one lane in each direction, a lane width of 3.2 m, and a speed limit of 50 km/h. Assume that the host vehicle travels at 50 km/h (that is, 13.9 m/s), the oncoming vehicle travels at 60 km/h (that is, 16.7 m/s), and the maximum lateral acceleration of the host vehicle is 0.2 G. The sensing distance in front of the host vehicle of the periphery monitoring sensor 40 is assumed to be 100 m, which is longer than the sensing distance of known automated vehicles. An oncoming vehicle 101 m ahead, beyond the sensing distance in front of the host vehicle, cannot be detected by the periphery monitoring sensor 40. Since the relative speed between the host vehicle and the oncoming vehicle is 30.6 m/s, the grace time until the 101 m gap between the host vehicle and the oncoming vehicle closes to 0 m is about 3.2 s. The host vehicle therefore needs to leave its lane, pass the side of the parked vehicle, and return to its lane within 3.2 s. On the other hand, the time required to change lanes at 0.2 G, pass the side of the 12 m long parked vehicle, and return to the original lane at 0.2 G is calculated to be 5.4 s. Here, the calculation assumes lateral acceleration of 0.2 G over a lateral movement of 1.6 m, followed by lateral deceleration of 0.2 G over a lateral movement of 1.6 m. Therefore, with a sensing distance in front of the host vehicle of about 100 m, it is difficult to automatically pass the side of a stationary vehicle in front of the host vehicle on a road without an overtaking lane.
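
The figures in the preceding example can be reproduced approximately with the short calculation below. The lateral-maneuver model (accelerate laterally at 0.2 G over 1.6 m, then decelerate over 1.6 m for each lane change, plus the time to clear the 12 m parked vehicle) is one reading of the text; depending on how the lateral and longitudinal phases are assumed to overlap, the total comes out near the 5.4 s stated above, and in any case well beyond the roughly 3.2 s grace time, which is the point of the example.

```python
G = 9.8  # m/s^2

# Figures taken from the example above
own_speed = 50 / 3.6         # about 13.9 m/s
oncoming_speed = 60 / 3.6    # about 16.7 m/s
oncoming_gap = 101.0         # m, just beyond the 100 m sensing distance
parked_length = 12.0         # m
lat_accel = 0.2 * G          # 0.2 G lateral acceleration
lat_half = 1.6               # m: accelerate laterally over 1.6 m, then decelerate over 1.6 m

# Grace time until the 101 m gap to the still-undetected oncoming vehicle closes
closing_speed = own_speed + oncoming_speed   # about 30.6 m/s
grace_time = oncoming_gap / closing_speed    # about 3.3 s (the text states 3.2 s)

# One lane change: lateral acceleration phase plus lateral deceleration phase
t_half = (2 * lat_half / lat_accel) ** 0.5   # about 1.28 s
lane_change = 2 * t_half                     # about 2.56 s
pass_time = parked_length / own_speed        # about 0.86 s to clear the 12 m vehicle

# Upper bound with no overlap of phases: out, pass, back in (the text states about 5.4 s)
maneuver_time = 2 * lane_change + pass_time  # about 6.0 s

print(f"grace time {grace_time:.1f} s vs maneuver time {maneuver_time:.1f} s")
```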

Moreover, even if an attempt is made to detect a distant oncoming vehicle by acquiring position information and speed information from the oncoming vehicle through vehicle-to-vehicle communication, there is the difficulty that the oncoming vehicle must be equipped with a communication module capable of vehicle-to-vehicle communication.

The function selection unit 142 selects the function to be executed from the pre-stop request function and the post-stop request function according to the setting, in the setting information acquired by the setting acquisition unit 131, for switching between the pre-stop request function and the post-stop request function. The function selection unit 142 corresponds to a selection unit. As an example, the pre-stop request function may be selected by default, and the post-stop request function may be selected when the setting is switched to the post-stop request function by an operation input to the operation device 52.

As another example, the post-stop request function may be selected when the driver state acquired by the driver state acquisition unit 133 is a state in which it is estimated to be difficult for the driver to respond quickly to the TOR, while the pre-stop request function may be selected when a quick response by the driver to the TOR is not estimated to be difficult. The state in which it is estimated to be difficult for the driver to respond quickly to the TOR may be a driver state such as a tense state, an anxious state, a drowsy state, or looking aside. According to this, when the driver state is estimated to make a quick response to the TOR difficult, the function selection unit 142 selects the post-stop request function, so the TOR is performed after the host vehicle stops. In other words, even a driver for whom a quick response is difficult can respond to the TOR with a margin.
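
The selection between the pre-stop and post-stop request functions according to the driver state could be sketched as below. The state labels, and the choice to let the driver-state estimate override the preset (as suggested in the next paragraph), are assumptions for illustration.

```python
from enum import Enum, auto

class RequestFunction(Enum):
    PRE_STOP = auto()   # perform the TOR before stopping the host vehicle
    POST_STOP = auto()  # stop the host vehicle first, then perform the TOR

# Driver states for which a quick response to the TOR is presumed difficult
SLOW_RESPONSE_STATES = {"tension", "anxiety", "drowsy", "looking_aside"}

def select_request_function(driver_state: str, preset: RequestFunction) -> RequestFunction:
    """Prefer the post-stop request when the driver is unlikely to respond quickly,
    regardless of the preset; otherwise follow the driver's own setting."""
    if driver_state in SLOW_RESPONSE_STATES:
        return RequestFunction.POST_STOP
    return preset
```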

In addition, when the driving mode selection unit 141 performs an unplanned driving change to manual driving, it is preferable that the function selection unit 142 use the most recent driver state already acquired sequentially by the driver state acquisition unit 133, rather than newly acquiring the driver state, in order to suppress processing delay. If the driver state acquired by the driver state acquisition unit 133 is a state in which it is estimated to be difficult for the driver to respond quickly to the TOR, the function selection unit 142 may select the post-stop request function regardless of the information preset according to the operation input received from the driver via the operation device 52.

The manual driving determination unit 143 determines whether or not there is a driving operation by the driver related to the driving change from automatic driving to manual driving. As an example, when the detection information indicating the gripping state of the steering wheel acquired by the vehicle state acquisition unit 121 indicates that the steering wheel is gripped, it may be determined that there is a driving operation by the driver. A configuration may also be adopted in which the presence of a driving operation by the driver is determined based on a state other than the gripping state of the steering wheel. When driving is switched from automatic driving to manual driving, the driving mode selection unit 141 switches from the automatic driving mode to the manual driving mode when the manual driving determination unit 143 determines that there is a driving operation by the driver.

The TOR determination unit 144 determines whether or not to perform the TOR when driving is switched from automatic driving to manual driving. In a planned driving change, the TOR determination unit 144 may determine whether or not to perform the TOR based on the driving change schedule. As an example, a configuration may be adopted in which the TOR is determined to be performed when the remaining distance and remaining travel time to the point determined as the driving change point in the driving change schedule reach predetermined values. The predetermined values here can be set arbitrarily.
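
For the planned driving change, the decision to issue the TOR once the margin to the scheduled change point becomes small could look like the sketch below. The threshold values, and whether one or both of the distance and time margins must be reached, are illustrative assumptions.

```python
def should_issue_planned_tor(remaining_distance_m: float,
                             remaining_time_s: float,
                             distance_threshold_m: float = 500.0,
                             time_threshold_s: float = 30.0) -> bool:
    """Issue the TOR once the margin to the scheduled driving change point is small enough
    (here: either margin, as an assumption)."""
    return (remaining_distance_m <= distance_threshold_m
            or remaining_time_s <= time_threshold_s)
```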

The TOR determination unit 144 also determines whether or not to perform the TOR in an unplanned driving change. In situations where continuing automatic driving is not preferable, such as when the sensing accuracy of the periphery monitoring sensor 40 decreases or when an abnormality of the periphery monitoring sensor 40 is detected, the TOR may simply be performed. On the other hand, in situations where the host vehicle can be stopped while automatic driving is continued, such as when a front stationary vehicle is detected on a road without an overtaking lane, whether or not to perform the TOR is determined according to the driving environment recognized by the traveling environment recognition unit 100. Details of the determination of whether or not to perform the TOR when a front stationary vehicle is detected on a road without an overtaking lane are described later.

<Regarding the process related to detection of a front stationary vehicle in the automatic driving ECU 10>
Next, an example of the flow of the processing related to detection of a front stationary vehicle on a road without an overtaking lane in the automatic driving ECU 10 (hereinafter, process related to detection of a front stationary vehicle) will be described using the flowchart of FIG. 3. A control program for realizing the functions related to this process corresponds to the control program of the present disclosure. The flowchart of FIG. 3 may be configured to start, for example, when the host vehicle starts automatic driving.

First, in step S1, when the traveling environment recognition unit 100 detects a stationary vehicle in front of the host vehicle (YES in S1), the process proceeds to step S3. On the other hand, when the traveling environment recognition unit 100 has not detected a stationary vehicle in front of the host vehicle (NO in S1), the process proceeds to step S2. In step S2, if it is the end timing of the process related to detection of a front stationary vehicle (YES in S2), the process is ended. Otherwise (NO in S2), the process returns to S1 and is repeated. The end timing of the process related to detection of a front stationary vehicle includes, for example, when the ignition power of the host vehicle is turned off and when driving is switched from automatic driving to manual driving.

In step S3, when the traveling environment recognition unit 100 recognizes that the host vehicle is traveling on a road without an overtaking lane (YES in S3), the process proceeds to step S5. On the other hand, when the traveling environment recognition unit 100 recognizes that the host vehicle is traveling on a road with an overtaking lane (NO in S3), the process proceeds to step S4. A road with one lane in each direction may also be used in place of the road without an overtaking lane referred to here.

In step S4, the automatic driving function unit 111 continues automatic driving, automatically changes lanes to the overtaking lane, and the process proceeds to S2. When automatically changing lanes to the overtaking lane, the automatic driving function unit 111 automatically stops the host vehicle as necessary and changes lanes after letting vehicles approaching from behind pass.

In step S5, the TOR determination unit 144 performs a TOR presence / absence determination process and proceeds to step S6. Here, the TOR presence / absence determination process will be described with reference to the flowchart of FIG.

First, in step S51, when, in the traveling environment recognized by the traveling environment recognition unit 100, the distance from the detected stationary vehicle to the intersection ahead of it is within a threshold (YES in S51), the process proceeds to step S52. The intersection ahead of the stationary vehicle may be the nearest intersection in front of the stationary vehicle. The threshold referred to here is a threshold for excluding, as being presumed to have stopped for reasons unrelated to the intersection, a stationary vehicle that is far from the intersection, and it can be set arbitrarily. On the other hand, when the distance from the detected stationary vehicle to the intersection ahead of it is not within the threshold (NO in S51), the process proceeds to step S55.

In step S52, when the traveling environment recognition unit 100 detects a stationary vehicle ahead of the host vehicle as a parked vehicle (YES in S52), the process proceeds to step S53. On the other hand, when the traveling environment recognition unit 100 detects that the stationary vehicle ahead of the host vehicle is not a parked vehicle (NO in S52), the process proceeds to step S54. In step S53, the TOR determination unit 144 determines that TOR is to be performed, and proceeds to step S6. On the other hand, in step S54, the TOR determination unit 144 determines that TOR is not performed, and proceeds to step S6.

In step S55, when, in the traveling environment recognized by the traveling environment recognition unit 100, the distance from the detected stationary vehicle to the traffic signal ahead of it is within a threshold (YES in S55), the process proceeds to S52. The traffic signal ahead of the stationary vehicle may be the traffic signal immediately in front of the stationary vehicle provided for the traffic on the road where the stationary vehicle is located. The threshold here is a threshold for excluding, as being presumed to have stopped for reasons unrelated to the traffic signal, a stationary vehicle that is far from the traffic signal, and it can be set arbitrarily. On the other hand, when the distance from the detected stationary vehicle to the traffic signal ahead of it is not within the threshold (NO in S55), the process proceeds to step S56.

In step S56, when, in the traveling environment recognized by the traveling environment recognition unit 100, the distance from the detected stationary vehicle to the pedestrian crossing ahead of it is within a threshold (YES in S56), the process proceeds to S52. The pedestrian crossing ahead of the stationary vehicle may be the nearest pedestrian crossing in front of the stationary vehicle. The threshold here is a threshold for excluding, as being presumed to have stopped for reasons unrelated to the pedestrian crossing, a stationary vehicle that is far from the pedestrian crossing, and it can be set arbitrarily. On the other hand, when the distance from the detected stationary vehicle to the pedestrian crossing ahead of it is not within the threshold (NO in S56), the process proceeds to step S57.

In step S57, the TOR determination unit 144 determines that the TOR is to be performed, and the process proceeds to step S6. Note that the order of the processing of S51, S55, and S56 in the flowchart of FIG. 4 may be changed. Further, only some of the processing of S51, S55, and S56 in the flowchart of FIG. 4 may be performed.
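
Putting steps S51 to S57 together, the TOR presence/absence determination for a detected front stationary vehicle could be sketched roughly as follows. This is a simplified reading of the flowchart of FIG. 4; the distance thresholds and the data structure are assumptions.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class DetectedStationaryVehicle:
    is_parked: bool                          # hazard lamps on, or no lamp lit (see above)
    dist_to_intersection_m: Optional[float]  # None when no intersection ahead is known
    dist_to_signal_m: Optional[float]
    dist_to_crosswalk_m: Optional[float]

def tor_required(v: DetectedStationaryVehicle,
                 intersection_th=30.0, signal_th=30.0, crosswalk_th=30.0) -> bool:
    """S51/S55/S56: is the stationary vehicle near an intersection, traffic signal, or
    pedestrian crossing?  If so (S52), the TOR is requested only for a parked vehicle (S53),
    not for a vehicle presumably just waiting in traffic (S54).  Otherwise the TOR is
    requested (S57)."""
    near_feature = (
        (v.dist_to_intersection_m is not None and v.dist_to_intersection_m <= intersection_th)
        or (v.dist_to_signal_m is not None and v.dist_to_signal_m <= signal_th)
        or (v.dist_to_crosswalk_m is not None and v.dist_to_crosswalk_m <= crosswalk_th)
    )
    if near_feature:
        return v.is_parked  # S53 (perform TOR) / S54 (do not perform TOR)
    return True             # S57: far from all of these features, so perform the TOR
```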

Returning to FIG. 3, if it is determined in S5 that the TOR is to be performed (YES in S6), the process proceeds to step S8. On the other hand, if it is determined in S5 that the TOR is not to be performed (NO in S6), the process proceeds to step S7. In step S7, the automatic driving function unit 111 continues automatic driving, automatically stops the host vehicle behind the detected stationary vehicle while leaving an inter-vehicle distance, and the process proceeds to S2. After the host vehicle has automatically stopped behind the detected stationary vehicle with an inter-vehicle distance, when that stationary vehicle starts moving, the automatic driving function unit 111 causes the host vehicle to automatically follow the vehicle that has started.

In step S8, when the function selection unit 142 selects the pre-stop request function (YES in S8), the process proceeds to step S9. On the other hand, when the post-stop request function is selected by the function selection unit 142 (NO in S8), the process proceeds to step S14. In step S9, the driving change request unit 132 performs TOR under the control of the display device 54 and / or the audio output device 55 in cooperation with the HCU 51, and proceeds to step S10.

In step S10, the presentation processing unit 134 presents information as necessary. For example, the presentation processing unit 134 may be configured to present information requesting the driver to pass the side of the stationary vehicle in front of the host vehicle by manual driving. According to this, the driver can easily recognize that the side of the stationary vehicle in front of the host vehicle should be passed by manual driving, and can smoothly move to that action.

When the distance from the detected stationary vehicle to the intersection ahead of it is within the threshold and the stationary vehicle in front of the host vehicle is detected as a parked vehicle, the presentation processing unit 134 may be configured to present information calling the driver's attention to the intersection. As an example, information indicating that the host vehicle is approaching the intersection may be displayed.

Further, when the distance from the detected stationary vehicle to the traffic signal ahead of it is within the threshold and the stationary vehicle in front of the host vehicle is detected as a parked vehicle, the presentation processing unit 134 may be configured to present information calling the driver's attention to the traffic signal. As an example, information prompting the driver to pay attention to the color of the traffic signal may be presented.

In addition, when the distance from the detected stationary vehicle to the pedestrian crossing ahead of it is within the threshold and the stationary vehicle in front of the host vehicle is detected as a parked vehicle, the presentation processing unit 134 may be configured to present information prompting the driver to watch for pedestrians crossing. According to these configurations, the driver can easily recognize what should be watched and can pass the side of the stationary vehicle in front of the host vehicle by manual driving while paying attention to it. The presentation processing unit 134 may also be configured not to perform the above information presentation, but performing it is preferable from the viewpoint of assisting the driver.

In step S11, when the manual driving determination unit 143 determines that there is a driving operation by the driver (YES in S11), the driving mode selection unit 141 switches from the automatic driving mode to the manual driving mode, manual driving by the driver is started, and the process related to detection of the front stationary vehicle is ended. On the other hand, when it is determined that there is no driving operation by the driver (NO in S11), the process proceeds to step S12.

When the host vehicle has passed the side of the stationary vehicle by manual driving and returned to the travel lane it was in before passing the stationary vehicle, the driving mode selection unit 141 may switch from the manual driving mode to the automatic driving mode, and the automatic driving function unit 111 may resume automatic driving. According to this, smoother travel becomes easier to achieve to the extent that the host vehicle automatically returns to automatic driving. Whether the host vehicle has passed the side of the stationary vehicle by manual driving and returned to the travel lane it was in before passing may be determined by the control selection determination unit 140 based on the position of the stationary vehicle relative to the host vehicle and the travel lane of the host vehicle recognized by the traveling environment recognition unit 100.
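
A rough sketch of the check for resuming automatic driving (the host vehicle has passed the side of the stationary vehicle by manual driving and returned to its original travel lane) is given below. The coordinate convention (longitudinal position relative to the host vehicle, lane identifiers) and the clearance value are assumptions.

```python
def passed_and_returned(stationary_rel_x_m: float,
                        current_lane_id: int,
                        original_lane_id: int,
                        clearance_m: float = 5.0) -> bool:
    """stationary_rel_x_m: longitudinal position of the stationary vehicle relative to the
    host vehicle (negative once it is behind).  Automatic driving is resumed only when the
    stationary vehicle is safely behind and the host vehicle is back in its original lane."""
    return stationary_rel_x_m < -clearance_m and current_lane_id == original_lane_id
```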

In step S12, if the limit for switching to manual driving has been reached (YES in S12), the process proceeds to step S13. The limit for switching to manual driving may be that the elapsed time since the TOR was performed has reached a specified time, or that the distance between the detected stationary vehicle and the host vehicle has decreased to a specified distance. On the other hand, when the limit for switching to manual driving has not been reached (NO in S12), the process returns to S11 and the processing is repeated.
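For illustration, the limit check in S12 might combine both criteria mentioned above; the constants and signal names below are assumptions, not values given in the embodiment.

```python
TAKEOVER_TIMEOUT_S = 10.0    # assumed specified time elapsed after TOR
MIN_GAP_TO_OBJECT_M = 60.0   # assumed specified distance to the stationary vehicle

def takeover_limit_reached(elapsed_since_tor_s, gap_to_stationary_m):
    """S12: stop waiting for the driver when either limit criterion is met."""
    return (elapsed_since_tor_s >= TAKEOVER_TIMEOUT_S
            or gap_to_stationary_m <= MIN_GAP_TO_OBJECT_M)
```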

In step S13, the driving mode selection unit 141 switches from the automatic driving mode to the automatic evacuation mode, the MRM function unit 112 executes the MRM function, and the host vehicle is automatically stopped with a predetermined inter-vehicle distance from the detected stationary vehicle. Then, the related process at the time of detecting a front stationary vehicle is ended. The predetermined inter-vehicle distance here can be set arbitrarily as long as it is an inter-vehicle distance estimated to be safe.

In step S14, which is performed when the post-stop request function is selected in S8, the driving mode selection unit 141 switches from the automatic driving mode to the immediate stop mode, and the automatic driving function unit 111 automatically stops the host vehicle with a predetermined inter-vehicle distance from the detected stationary vehicle.

In step S15, when the detected stationary vehicle moves within the set time after the host vehicle is stopped in S14 (YES in S15), the process proceeds to step S16. The stationary vehicle that has moved becomes the preceding vehicle of the host vehicle. On the other hand, if the detected stationary vehicle has not moved within the set time after the host vehicle is stopped in S14 (NO in S15), the process proceeds to step S17. Whether or not the detected stationary vehicle has moved may be recognized by the traveling environment recognition unit 100. The set time here can be set arbitrarily, and may be set, for example, to the time estimated to be required for a temporary stop at a stop position.

In step S16, the automatic driving function unit 111 continues automatic driving without performing TOR, follows the preceding vehicle that has started moving, and the process proceeds to S2. Therefore, even when the post-stop request function is selected, if the stationary vehicle ahead of the host vehicle starts moving within the set time after the host vehicle stops, automatic driving can be continued without performing TOR. Accordingly, even when the post-stop request function is selected, TOR is not performed every time the host vehicle stops, which reduces the annoyance of performing TOR more often than necessary.

If the distance from the detected stationary vehicle to the traffic signal ahead of that stationary vehicle is within the threshold and the automatic driving ECU 10 can acquire information on the lighting cycle of this traffic signal via a communication module or the like, the waiting time from the red signal to the green signal may be used as the set time in S15, based on the lighting cycle information. In that case, even when the post-stop request function is selected, TOR need not be performed for a stop on the order of waiting for a signal if the distance from the detected stationary vehicle to the traffic signal ahead of it is within the threshold. Therefore, when the stationary vehicle is merely stopped waiting for a signal, automatic driving can be continued without performing TOR, and the annoyance of performing TOR more often than necessary can be reduced.
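A sketch of how the set time in S15 could be chosen when lighting-cycle information is available; the fallback value and parameter names are assumptions made for illustration.

```python
DEFAULT_SET_TIME_S = 15.0  # assumed fallback: typical duration of a temporary stop

def choose_set_time(signal_distance_m, threshold_m, seconds_until_green=None):
    """Return how long S15 should wait for the stationary vehicle to move.

    seconds_until_green is assumed to be derived from the lighting-cycle
    information received over the communication module; when it is unavailable
    or the signal is beyond the threshold, fall back to the default set time.
    """
    if seconds_until_green is not None and signal_distance_m <= threshold_m:
        return max(seconds_until_green, 0.0)
    return DEFAULT_SET_TIME_S
```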

In step S17, the driving change request unit 132 performs TOR, and the process proceeds to step S18. In step S18, the presentation processing unit 134 presents information as necessary, in the same manner as in S10.

As shown in the flowchart of FIG. 4, the automatic driving ECU 10 performs TOR based on the detection of a stationary object ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane, regardless of whether there is an oncoming vehicle.

As described above, according to the configuration of the first embodiment, when the traveling environment recognition unit 100 detects a stationary object ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane, TOR is performed, making it possible to switch to manual driving. Therefore, instead of the host vehicle stopping every time it encounters a stationary vehicle ahead on a road without an overtaking lane during automatic driving, the driver can pass the stationary vehicle by manual driving while confirming safety. This enables smoother travel on a road without an overtaking lane. In addition, because the driving change request makes the switch to manual driving possible, the difficulty described above of passing a stationary vehicle ahead of the host vehicle while avoiding oncoming vehicles by automatic driving can be easily avoided. As a result, smoother travel on a road without an overtaking lane is achieved more easily.

Here, how the presence or absence of TOR is used in the related process at the time of detecting a front stationary vehicle will be described with reference to FIG. 5. FIG. 5 illustrates a case where a stationary object ahead of the host vehicle is detected while the host vehicle is traveling by automatic driving on a road with one lane in each direction, and where the distance between the stationary vehicle and each of an intersection, a traffic signal, and a pedestrian crossing is within the threshold. In FIG. 5, the case where an intersection, a traffic signal, and a pedestrian crossing are all present is described as an example, but the same applies when only some of them are present. In FIG. 5, HV denotes the host vehicle and OV denotes the stationary vehicle.

As shown in FIG. 5, even when a stationary vehicle ahead of the host vehicle is detected while the host vehicle is traveling by automatic driving on a road with one lane in each direction, and the distance between the stationary vehicle and the intersection, traffic signal, and pedestrian crossing is within the threshold, TOR is performed if the stationary vehicle is a parked vehicle. On the other hand, TOR is not performed if the stationary vehicle is not a parked vehicle. This is because a vehicle that is stationary near an intersection, traffic signal, or pedestrian crossing but is not a parked vehicle is likely to be only temporarily stopped and to start moving within a short time, so that even if automatic driving is continued, it is unlikely to hinder the smooth travel of the host vehicle.

For example, if a stationary vehicle is temporarily stopped at an intersection without a traffic signal, it is likely to start moving before the host vehicle reaches the stop location, so it does not hinder the smooth travel of the host vehicle. Likewise, if a stationary vehicle is waiting for a signal at an intersection with a traffic signal, the host vehicle must wait for the signal as well, so stopping behind the stationary vehicle as if waiting for the signal does not hinder the smooth travel of the host vehicle. In this way, according to the related process at the time of detecting a front stationary vehicle, even when a stationary object ahead of the host vehicle is detected while the host vehicle is traveling by automatic driving on a road without an overtaking lane, TOR is not performed in situations where omitting it is unlikely to hinder the smooth travel of the host vehicle. Therefore, the frequency of driving changes to manual driving can be reduced, lightening the burden that manual driving places on the driver.
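Read together with the flow of FIG. 4, the TOR decision described above might be expressed as in the following sketch; the predicate and field names are illustrative assumptions, not the embodiment's.

```python
def should_perform_tor(stationary_vehicle, nearby_features, threshold_m):
    """Embodiment 1 style decision: skip TOR only for a non-parked vehicle that
    is stationary within threshold_m of an intersection, traffic signal, or
    pedestrian crossing (i.e., one likely to start moving again soon)."""
    near_waiting_spot = any(
        f.kind in ("intersection", "traffic_light", "crosswalk")
        and f.distance_from(stationary_vehicle) <= threshold_m
        for f in nearby_features
    )
    if near_waiting_spot and not stationary_vehicle.is_parked:
        return False  # probably a temporary stop; continue automatic driving
    return True       # parked vehicle, or no plausible reason to be waiting
```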

Next, with reference to FIG. 6, the travel of the host vehicle when the post-stop request function is selected and when the pre-stop request function is selected in the related process at the time of detecting a front stationary vehicle will be described. FIG. 6 illustrates, as an example, a case where a stationary vehicle ahead of the host vehicle is detected and TOR is performed while the host vehicle is traveling by automatic driving on a road with one lane in each direction. In the example of FIG. 6, it is assumed that the stationary vehicle does not start moving. In FIG. 6, HV denotes the host vehicle and OV denotes the stationary vehicle.

As shown in FIG. 6, when the post-stop request function is selected by the function selection unit 142, TOR is not performed and automatic driving continues when the traveling environment recognition unit 100 detects the stationary vehicle. Then, when the distance between the host vehicle and the stationary vehicle reaches the distance at which the host vehicle can stop by automatic driving while securing a predetermined inter-vehicle distance from the stationary vehicle, deceleration starts automatically, and the host vehicle is automatically stopped with the predetermined inter-vehicle distance from the stationary vehicle. TOR is performed after the host vehicle has stopped automatically. When the manual driving determination unit 143 then determines that a driving operation by the driver has occurred, the mode is switched to the manual driving mode. After the switch to the manual driving mode, the driver starts the vehicle by manual driving and passes the side of the stationary vehicle while confirming safety. With this behavior, even a driver who would feel rushed by being asked to take over driving in a short time can settle down and shift to manual driving after the host vehicle stops.
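As a kinematic illustration of when the automatic deceleration in this post-stop flow would have to begin, a constant-deceleration sketch is shown below; the deceleration limit is an assumed comfort value, not one specified in the embodiment.

```python
COMFORT_DECEL_MPS2 = 2.0  # assumed deceleration limit for automatic braking

def should_start_decelerating(gap_to_stationary_m, desired_gap_m, speed_mps,
                              decel_mps2=COMFORT_DECEL_MPS2):
    """True when the remaining distance to the desired stopping point is no
    more than the constant-deceleration braking distance v^2 / (2a)."""
    braking_distance_m = speed_mps ** 2 / (2.0 * decel_mps2)
    return gap_to_stationary_m - desired_gap_m <= braking_distance_m
```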

On the other hand, as shown in FIG. 6, when the pre-stop request function is selected by the function selection unit 142, TOR is performed immediately when the traveling environment recognition unit 100 detects the stationary vehicle, and automatic driving continues until the manual driving determination unit 143 determines that a driving operation by the driver has occurred. After the switch to the manual driving mode, the driver operates the direction indicator by manual driving and starts steering at a timing at which the course can be changed while confirming safety. The driver then passes the side of the stationary vehicle by manual driving and returns to the travel lane ahead of the stationary vehicle. With this behavior, smooth travel is possible as long as the driver can shift to manual driving without the host vehicle stopping.

(Embodiment 2)
In the first embodiment, the configuration for the case where the stationary object is a stationary vehicle has been described, but the configuration is not necessarily limited thereto. For example, a configuration that also covers the case where the stationary object is an object other than a vehicle (hereinafter, the second embodiment) will be described. The driving support system 1 of the second embodiment is the same as the driving support system 1 of the first embodiment except that part of the processing in the automatic driving ECU 10 differs. Specifically, the difference is that, in addition to the related process at the time of detecting a front stationary vehicle, the automatic driving ECU 10 performs a related process at the time of detecting a front stationary object, which relates to the detection by the traveling environment recognition unit 100 of a stationary object ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane.

Here, with reference to FIG. 7, an example of the flow of the related process at the time of detecting a front stationary object in the automatic driving ECU 10 of the second embodiment will be described. A control program for realizing the functions related to this process also corresponds to the control program of the present disclosure. The flowchart of FIG. 7 may be configured to start, for example, when the host vehicle starts automatic driving.

First, in step S101, when the traveling environment recognition unit 100 detects a stationary object ahead of the host vehicle (YES in S101), the process proceeds to step S103. On the other hand, when the traveling environment recognition unit 100 has not detected a stationary object ahead of the host vehicle (NO in S101), the process proceeds to step S102.

Note that the traveling environment recognition unit 100 may be configured to detect, as the target of this process, a stationary object that requires the host vehicle to run out into the oncoming lane in order to continue traveling on a road without an overtaking lane. This is because a stationary object that does not require the host vehicle to run out into the oncoming lane can be passed by continuing automatic driving, which allows smoother travel. Whether a stationary object requires running out into the oncoming lane may be determined from whether the lateral width occupied by the stationary object in the travel lane of the host vehicle is such that passing it with a certain clearance, for example 1 m, would require the host vehicle to run out into the oncoming lane. In addition, the condition that the height of the stationary object from the road surface is a height that must be avoided may be added to the conditions for determining that the stationary object requires running out into the oncoming lane.
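An illustrative sketch of that lateral-clearance test follows; the lane geometry fields and the assumed height limit are assumptions, and only the 1 m clearance comes from the example above.

```python
CLEARANCE_M = 1.0         # example clearance kept from the stationary object
MIN_AVOID_HEIGHT_M = 0.3  # assumed height above which the object must be avoided

def requires_entering_oncoming_lane(obj_width_in_lane_m, lane_width_m,
                                    host_width_m, obj_height_m):
    """True when passing the object with CLEARANCE_M would push the host
    vehicle across the lane boundary, and the object is tall enough that it
    cannot simply be ignored."""
    if obj_height_m < MIN_AVOID_HEIGHT_M:
        return False  # low obstacle; assumed not to need avoidance
    free_width = lane_width_m - obj_width_in_lane_m
    return free_width < host_width_m + CLEARANCE_M
```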

In step S102, when it is the end timing of the related process at the time of detecting a front stationary object (YES in S102), the related process at the time of detecting a front stationary object is ended. On the other hand, when it is not the end timing (NO in S102), the process returns to S101 and is repeated. The end timing of the related process at the time of detecting a front stationary object may be when the ignition power of the host vehicle is turned off or when driving is switched from automatic driving to manual driving.

In step S103, when the traveling environment recognition unit 100 detects that the stationary object ahead of the host vehicle is a vehicle (YES in S103), the process proceeds to step S104. On the other hand, when the traveling environment recognition unit 100 detects that the stationary object ahead of the host vehicle is not a vehicle (NO in S103), the process proceeds to step S105.

In step S104, the related process at the time of detecting a front stationary vehicle described in the first embodiment is performed, and the related process at the time of detecting a front stationary object is ended. The processing from step S105 to step S113 is the same as the processing from S8 to S14 of the related process at the time of detecting a front stationary vehicle, with the stationary vehicle in S8 to S14 replaced by a stationary object.

In step S109, the presentation processing unit 134 may be configured to present information requesting that the driver pass the side of the stationary object ahead of the host vehicle by manual driving. Also in the related process at the time of detecting a front stationary object, the driving mode selection unit 141 may switch from the manual driving mode to the automatic driving mode and the automatic driving function unit 111 may resume automatic driving when the host vehicle passes the side of the stationary object by manual driving and returns to the travel lane ahead of the stationary object.

In step S114, which follows S113, the driving change request unit 132 performs TOR, and the process proceeds to step S115. In step S115, the presentation processing unit 134 presents information as necessary, in the same manner as in S109. In the related process at the time of detecting a front stationary object, when the detected stationary object is not a vehicle, TOR is performed without waiting for the elapse of the set time described for S15, even if the post-stop request function is selected. In other words, in the related process at the time of detecting a front stationary object, TOR is performed when the detected stationary object is not a vehicle. That is, TOR is performed when the traveling environment recognition unit 100 detects a stationary object other than a vehicle ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane.

The difficulty of changing course by automatic driving also applies when the traveling environment recognition unit 100 detects a stationary object other than a vehicle ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane. Therefore, the configuration of the second embodiment, like that of the first embodiment, makes smoother travel on a road without an overtaking lane easier to achieve.

(Embodiment 3)
Further, the following configuration of the third embodiment may be adopted for the TOR presence/absence determination process in the related process at the time of detecting a front stationary vehicle. Here, the configuration of the third embodiment will be described. The driving support system 1 of the third embodiment is the same as the driving support system 1 of the embodiments described above except that the TOR presence/absence determination process in the related process at the time of detecting a front stationary vehicle differs.

Here, an example of the flow of the TOR presence/absence determination process in the automatic driving ECU 10 of the third embodiment will be described with reference to FIG. 8. The flowchart of FIG. 8 starts when the traveling environment recognition unit 100 recognizes, in S3 of the related process at the time of detecting a front stationary vehicle, that the host vehicle is traveling on a road without an overtaking lane.

First, in step S201, when the traveling environment recognition unit 100 detects that the stationary vehicle ahead of the host vehicle is a parked vehicle (YES in S201), the process proceeds to step S202. On the other hand, when the traveling environment recognition unit 100 detects that the stationary vehicle ahead of the host vehicle is not a parked vehicle (NO in S201), the process proceeds to step S203.

In step S202, the TOR determination unit 144 determines that TOR is to be performed, and the process proceeds to S6 of the related process at the time of detecting a front stationary vehicle. On the other hand, in step S203, the TOR determination unit 144 determines that TOR is not to be performed, and the process proceeds to S6 of the related process at the time of detecting a front stationary vehicle. That is, in the third embodiment, TOR is performed when the traveling environment recognition unit 100 detects a parked vehicle ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane. In other words, in the third embodiment, when the traveling environment recognition unit 100 detects a parked vehicle ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane, TOR is performed regardless of the position of the parked vehicle relative to any intersection, traffic signal, or pedestrian crossing.

When the traveling environment recognition unit 100 detects a parked vehicle ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane, the parked vehicle, unlike a vehicle that is merely stopped, is unlikely to start moving within a short time. Therefore, performing TOR and having the driver pass the side of the parked vehicle by manual driving is unlikely to hinder the smooth travel of the host vehicle. This holds regardless of the location of the parked vehicle, such as whether it is close to an intersection, traffic signal, or pedestrian crossing. Therefore, the configuration of the third embodiment also makes smoother travel on a road without an overtaking lane easier to achieve.

(Embodiment 4)
Further, the following configuration of the fourth embodiment may be adopted for the TOR presence/absence determination process in the related process at the time of detecting a front stationary vehicle. Here, the configuration of the fourth embodiment will be described. The driving support system 1 of the fourth embodiment is the same as the driving support system 1 of the embodiments described above except that the TOR presence/absence determination process in the related process at the time of detecting a front stationary vehicle differs.

Here, an example of the flow of the TOR presence/absence determination process in the automatic driving ECU 10 of the fourth embodiment will be described with reference to FIG. 9. The flowchart of FIG. 9 starts when the traveling environment recognition unit 100 recognizes, in S3 of the related process at the time of detecting a front stationary vehicle, that the host vehicle is traveling on a road without an overtaking lane.

In addition, when distinguishing and recognizing a stationary vehicle, the traveling environment recognition unit 100 of the fourth embodiment also distinguishes whether or not the stationary vehicle is a stopped vehicle. Here, a stopped vehicle refers to a vehicle whose brake lamps are lit. As an example, when the lamp color of the brake lamps is recognized in the captured image of the stationary vehicle ahead of the host vehicle, the traveling environment recognition unit 100 may detect the vehicle as a stopped vehicle.
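One illustrative way to realize that brake-lamp check on a camera crop of the stationary vehicle's rear is sketched below; the thresholds and the red-pixel heuristic are assumptions and not the recognition method specified in the embodiment.

```python
import numpy as np

RED_RATIO_THRESHOLD = 0.02  # assumed fraction of bright red pixels meaning "lamps lit"

def brake_lamps_lit(rear_crop_rgb: np.ndarray) -> bool:
    """Heuristic sketch: count saturated red pixels in the rear-view crop of
    the stationary vehicle and compare against an assumed threshold."""
    r = rear_crop_rgb[..., 0].astype(np.float32)
    g = rear_crop_rgb[..., 1].astype(np.float32)
    b = rear_crop_rgb[..., 2].astype(np.float32)
    red_mask = (r > 180) & (r > 1.5 * g) & (r > 1.5 * b)
    return red_mask.mean() > RED_RATIO_THRESHOLD
```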

First, in step S301, when the traveling environment recognition unit 100 detects that the stationary vehicle ahead of the host vehicle is a stopped vehicle (YES in S301), the process proceeds to step S302. On the other hand, when the traveling environment recognition unit 100 detects that the stationary vehicle ahead of the host vehicle is not a stopped vehicle (NO in S301), the process proceeds to step S303.

In step S302, the TOR determination unit 144 determines that TOR is not to be performed, and the process proceeds to S6 of the related process at the time of detecting a front stationary vehicle. On the other hand, in step S303, the TOR determination unit 144 determines that TOR is to be performed, and the process proceeds to S6 of the related process at the time of detecting a front stationary vehicle. That is, in the fourth embodiment, TOR is not performed when the traveling environment recognition unit 100 detects a stopped vehicle ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane.

When the traveling environment recognition unit 100 detects a stopped vehicle ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane, the stopped vehicle, unlike a parked vehicle, is likely to start moving within a short time. Therefore, continuing automatic driving without performing TOR is unlikely to hinder the smooth travel of the host vehicle. Accordingly, the configuration of the fourth embodiment also makes smoother travel on a road without an overtaking lane easier to achieve.

(Embodiment 5)
In the embodiments described above, when the function selection unit 142 selects the post-stop request function, TOR is performed after at least waiting for the set time to elapse after the host vehicle stops, but the configuration is not limited to this. For example, when the function selection unit 142 selects the post-stop request function, TOR may be performed without waiting for the set time to elapse after the host vehicle stops. In other words, when the function selection unit 142 selects the post-stop request function, TOR may be performed without waiting for the set time to elapse, regardless of whether the stationary vehicle detected by the traveling environment recognition unit 100 moves within the set time after the host vehicle stops.

(Embodiment 6)
In the embodiments described above, TOR is performed based on the detection of a stationary object ahead of the host vehicle while the host vehicle is traveling by automatic driving on a road without an overtaking lane, regardless of the presence of an oncoming vehicle, but the configuration is not limited to this. For example, when a stationary object ahead of the host vehicle is detected while the host vehicle is traveling by automatic driving on a road without an overtaking lane and the traveling environment recognition unit 100 can detect oncoming vehicles, the automatic driving ECU 10 may determine whether the host vehicle can pass the side of the stationary object without obstructing the passage of the oncoming vehicle, and TOR may be omitted when it is determined that the host vehicle can pass.

Whether the host vehicle can pass the side of the stationary object ahead without obstructing the traffic of the oncoming vehicle may be determined by the automatic driving ECU 10 based on the speed of the host vehicle, the speed of the oncoming vehicle, the relative position of the host vehicle and the oncoming vehicle, and the relative position of the host vehicle and the stationary object. As an example, assuming that the oncoming vehicle travels at a constant speed, it may be determined that passing is possible if the host vehicle can pass the side of the stationary object at the acceleration allowed for automatic driving and return to its own lane while keeping a sufficient margin from the oncoming vehicle. When it is determined that passing is possible and TOR is not performed, the host vehicle may be configured to pass the side of the stationary object automatically by continuing automatic driving.
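A simplified constant-speed/constant-acceleration sketch of that feasibility check follows; the parameter names and the safety margin are assumptions made for illustration, not values from the embodiment.

```python
import math

SAFETY_MARGIN_S = 2.0  # assumed time margin required before the oncoming vehicle arrives

def can_pass_without_obstructing(host_speed_mps, host_accel_mps2,
                                 passing_length_m, oncoming_speed_mps,
                                 oncoming_distance_m):
    """Assume the oncoming vehicle keeps a constant speed while the host
    vehicle accelerates at the acceleration allowed for automatic driving.
    The pass is treated as feasible if the host vehicle clears the passing
    section at least SAFETY_MARGIN_S before the oncoming vehicle reaches
    the stationary object."""
    if oncoming_speed_mps <= 0.0:
        return True  # no approaching oncoming vehicle
    # Time for the host vehicle to cover the passing section: s = v*t + a*t^2/2.
    a, v, s = host_accel_mps2, host_speed_mps, passing_length_m
    if a > 0.0:
        time_to_pass = (-v + math.sqrt(v * v + 2.0 * a * s)) / a
    else:
        time_to_pass = s / max(v, 0.1)
    time_until_oncoming_arrives = oncoming_distance_m / oncoming_speed_mps
    return time_to_pass + SAFETY_MARGIN_S <= time_until_oncoming_arrives
```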

According to the configuration of the sixth embodiment, even when a stationary object ahead of the host vehicle is detected while the host vehicle is traveling by automatic driving on a road without an overtaking lane, if the automatic driving ECU 10 determines that the host vehicle can pass the side of the stationary object by automatic driving, the host vehicle passes the side of the stationary object automatically without TOR being performed, which enables smoother travel on a road without an overtaking lane.

(Embodiment 7)
In the embodiments described above, the DSM 53 is used to detect the driver state, but the configuration is not necessarily limited thereto. The driver state may also be detected using a vehicle signal, such as a change in the steering angle detected by the steering angle sensor of the host vehicle, or a biosensor such as a pulse wave sensor, an electrocardiogram sensor, or a respiration sensor. Accordingly, driver states other than those detectable with the DSM 53 may also be targeted.

The biosensor may be provided in the host vehicle, for example in the steering wheel or the driver's seat, or may be provided in a wearable device worn by the driver. When the biosensor is provided in a wearable device worn by the driver, the HCU 51 may acquire the detection result of the biosensor via, for example, wireless communication.

(Embodiment 8)
In the embodiments described above, the function selection unit 142 selects the post-stop request function according to the driver state acquired by the driver state acquisition unit 133, but the configuration is not necessarily limited thereto. For example, the automatic driving ECU 10 may be configured without the driver state acquisition unit 133, in which case the function selection unit 142 does not select the post-stop request function according to the driver state.

(Embodiment 9)
In the embodiments described above, the function selection unit 142 can select between the pre-stop request function and the post-stop request function, but the configuration is not necessarily limited thereto. For example, the automatic driving ECU 10 may be configured without the function selection unit 142 and fixed so that only one of the pre-stop request function and the post-stop request function can be executed.

(Embodiment 10)
In the embodiments described above, the automatic driving ECU 10 is configured as a single ECU, but the configuration is not necessarily limited thereto and may be realized by a plurality of ECUs. In addition, although the automatic driving ECU 10, the HCU 51, and the vehicle control ECU 30 are described as separate ECUs, this is not necessarily the case. For example, the automatic driving ECU 10 may take on some or all of the functions of the HCU 51, or may take on some or all of the functions of the vehicle control ECU 30.

Here, the flowcharts described in this application, or the processing of the flowcharts, are composed of a plurality of sections (also referred to as steps), and each section is expressed, for example, as S1. Each section can be divided into a plurality of subsections, and a plurality of sections can be combined into one section. Each section configured in this way can also be referred to as a device, module, or means.

While the present disclosure has been described based on the embodiments, it is understood that the present disclosure is not limited to those embodiments and structures. The present disclosure encompasses various modifications and variations within an equivalent range. In addition, various combinations and forms, as well as other combinations and forms including only one element, more elements, or fewer elements, are within the scope and spirit of the present disclosure.

Claims (19)

  1. A driving support device used in a vehicle capable of switching between automatic driving and manual driving, the driving support device comprising:
    a stationary object detection unit (100) for detecting a stationary object in front of the vehicle; and
    a request processing unit (132) that makes a driving change request requesting the driver of the vehicle to change from the automatic driving to the manual driving, based on the stationary object detection unit detecting a stationary object in front of the vehicle while the vehicle is traveling by the automatic driving on a road without an overtaking lane.
  2. The driving support device according to claim 1, wherein, when making the driving change request, the request processing unit makes the driving change request before the vehicle is stopped by the automatic driving with respect to the stationary object.
  3. The driving support device according to claim 1, wherein, when making the driving change request, the request processing unit makes the driving change request after the vehicle is stopped by the automatic driving with respect to the stationary object.
  4. The driving support device according to claim 1, further comprising a selection unit (142) capable of selecting whether, when the request processing unit makes the driving change request, the driving change request is made before the vehicle is stopped by the automatic driving with respect to the stationary object or after the vehicle is stopped by the automatic driving with respect to the stationary object.
  5. The driving support device according to claim 3 or 4, wherein the request processing unit makes the driving change request when the stationary object does not move within a set time after the vehicle is stopped by the automatic driving with respect to the stationary object, and
    the request processing unit does not make the driving change request when the stationary object moves within the set time.
  6. The driving support device according to any one of claims 3 to 5, further comprising a presentation processing unit (134) for presenting information to the driver of the vehicle,
    wherein, when the request processing unit makes the driving change request after the vehicle is stopped by the automatic driving with respect to the stationary object, the presentation processing unit presents information requesting that the driver pass the side of the stationary object by the manual driving.
  7. The driving support device according to any one of claims 3 to 6, further comprising an automatic driving function unit (111) for performing the automatic driving,
    wherein the automatic driving function unit resumes the automatic driving when, after the driving change request is made, the vehicle passes the side of the stationary object by the manual driving and returns to the travel lane ahead of the stationary object.
  8. The driving support device according to any one of claims 1 to 7, wherein the stationary object detection unit detects a stationary object that requires the vehicle to run out into an oncoming lane in order to continue traveling on a road without an overtaking lane.
  9. The driving support device according to any one of claims 1 to 8, wherein the stationary object detection unit detects, as the stationary object, a stationary vehicle in front of the vehicle, and
    the request processing unit makes the driving change request based on the stationary object detection unit detecting a stationary vehicle in front of the vehicle while the vehicle is traveling by the automatic driving on a road without an overtaking lane.
  10. The driving support device according to claim 9, wherein the request processing unit does not make the driving change request, based on a distance from the stationary vehicle to a traffic light or an intersection in front of the stationary vehicle being within a threshold, even when the stationary object detection unit detects a stationary vehicle in front of the vehicle while the vehicle is traveling by the automatic driving on a road without an overtaking lane.
  11. The driving support device according to claim 10, wherein the stationary object detection unit detects, as the stationary object, a stationary vehicle in front of the vehicle while distinguishing whether it is a parked vehicle, and
    the request processing unit makes the driving change request when the stationary vehicle is the parked vehicle, even in a case where the stationary object detection unit detects a stationary vehicle in front of the vehicle while the vehicle is traveling by the automatic driving on a road without an overtaking lane and the distance from the stationary vehicle to the traffic light or the intersection in front of the stationary vehicle is within the threshold.
  12. The driving support device according to claim 11, further comprising a presentation processing unit (134) for presenting information to the driver of the vehicle,
    wherein, in a case where the stationary object detection unit detects a stationary vehicle in front of the vehicle while the vehicle is traveling by the automatic driving on a road without an overtaking lane, the distance from the stationary vehicle to a target that is the traffic light or the intersection in front of the stationary vehicle is within the threshold, and the stationary vehicle is the parked vehicle, the presentation processing unit presents to the driver, when the request processing unit makes the driving change request, information indicating that attention should be paid to the target.
  13. The driving support device according to any one of claims 9 to 12, wherein the request processing unit does not make the driving change request, based on a distance from the stationary vehicle to a pedestrian crossing in front of the stationary vehicle being within a threshold, even when the stationary object detection unit detects a stationary vehicle in front of the vehicle while the vehicle is traveling by the automatic driving on a road without an overtaking lane.
  14. The driving support device according to claim 13, wherein the stationary object detection unit detects, as the stationary object, a stationary vehicle in front of the vehicle while distinguishing whether it is a parked vehicle, and
    the request processing unit makes the driving change request when the stationary vehicle is the parked vehicle, even if the distance from the stationary vehicle to the pedestrian crossing in front of the stationary vehicle is within the threshold.
  15. The driving support device according to claim 14, further comprising a presentation processing unit (134) for presenting information to the driver of the vehicle,
    wherein, in a case where the stationary object detection unit detects a stationary vehicle in front of the vehicle while the vehicle is traveling by the automatic driving on a road without an overtaking lane, the distance from the stationary vehicle to the pedestrian crossing in front of the stationary vehicle is within the threshold, and the stationary vehicle is the parked vehicle, the presentation processing unit presents to the driver, when the request processing unit makes the driving change request, information indicating that attention should be paid to passers-by on the pedestrian crossing.
  16. The driving support device according to claim 9, wherein the stationary object detection unit detects, as the stationary object, a stationary vehicle in front of the vehicle while distinguishing whether it is a parked vehicle, and
    the request processing unit makes the driving change request when the stationary vehicle is the parked vehicle, in a case where the stationary object detection unit detects a stationary vehicle in front of the vehicle while the vehicle is traveling by the automatic driving on a road without an overtaking lane.
  17. The driving support device according to claim 9, wherein the stationary object detection unit detects, as the stationary object, a stationary vehicle in front of the vehicle while distinguishing whether it is a stopped vehicle, and
    the request processing unit does not make the driving change request when the stationary object detection unit detects a stopped vehicle in front of the vehicle while the vehicle is traveling by the automatic driving on a road without an overtaking lane.
  18. A control program for controlling a driving support device used in a vehicle capable of switching between automatic driving and manual driving, the control program causing a computer to function as:
    a stationary object detection unit (100) for detecting a stationary object in front of the vehicle; and
    a request processing unit (132) that makes a driving change request requesting the driver of the vehicle to change from the automatic driving to the manual driving, based on the stationary object detection unit detecting a stationary object in front of the vehicle while the vehicle is traveling by the automatic driving on a road without an overtaking lane.
  19. A computer-readable non-transitory tangible recording medium comprising instructions to be executed by a computer that controls a driving support device used in a vehicle capable of switching between automatic driving and manual driving, the instructions comprising:
    detecting a stationary object in front of the vehicle (100); and
    making a driving change request requesting the driver of the vehicle to change from the automatic driving to the manual driving, based on a stationary object in front of the vehicle being detected while the vehicle is traveling by the automatic driving on a road without an overtaking lane (132).
PCT/JP2018/019062 2017-06-15 2018-05-17 Traveling support device, control program, and computer-readable non-transitory tangible recording medium WO2018230245A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2017117934A JP2019001314A (en) 2017-06-15 2017-06-15 Driving support equipment and control program
JP2017-117934 2017-06-15

Publications (1)

Publication Number Publication Date
WO2018230245A1 true WO2018230245A1 (en) 2018-12-20

Family

ID=64660758

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2018/019062 WO2018230245A1 (en) 2017-06-15 2018-05-17 Traveling support device, control program, and computer-readable non-transitory tangible recording medium

Country Status (2)

Country Link
JP (1) JP2019001314A (en)
WO (1) WO2018230245A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0819A (en) * 1988-10-13 1996-01-09 Kawasaki Heavy Ind Ltd Unmanned traveling device
JP2010033441A (en) * 2008-07-30 2010-02-12 Fuji Heavy Ind Ltd Vehicle driving support device
JP2011514580A (en) * 2008-02-08 2011-05-06 ダイムラー・アクチェンゲゼルシャフトDaimler AG Vehicle driver longitudinal and lateral guide assistance method and driver assistance system for implementing the method
JP2012051441A (en) * 2010-08-31 2012-03-15 Toyota Motor Corp Automatic operation vehicle control device
WO2015087395A1 (en) * 2013-12-10 2015-06-18 三菱電機株式会社 Travel controller
JP2016031660A (en) * 2014-07-29 2016-03-07 クラリオン株式会社 Vehicle control device

Also Published As

Publication number Publication date
JP2019001314A (en) 2019-01-10

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 18817744

Country of ref document: EP

Kind code of ref document: A1