DE102017219813A1 - A method and apparatus for assisting a driver of an own vehicle - Google Patents

A method and apparatus for assisting a driver of an own vehicle

Info

Publication number
DE102017219813A1
Authority
DE
Germany
Prior art keywords
vehicle
lane change
speed
determined
lane
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
DE102017219813.3A
Other languages
German (de)
Inventor
Toshiharu Sugawara
Heiko Altmannshofer
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Automotive Systems Ltd
Original Assignee
Hitachi Automotive Systems Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Automotive Systems Ltd filed Critical Hitachi Automotive Systems Ltd
Priority to DE102017219813.3A
Publication of DE102017219813A1
Pending legal-status Critical Current

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units, or advanced driver assistance systems for ensuring comfort, stability and safety or drive control systems for propelling or retarding the vehicle
    • B60W30/18Propelling the vehicle
    • B60W30/18009Propelling the vehicle related to particular drive situations
    • B60W30/18163Lane change; Overtaking manoeuvres
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/08Interaction between the driver and the control system
    • B60W50/14Means for informing the driver, warning the driver or prompting a driver intervention
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems
    • G08G1/167Driving aids for lane monitoring, lane changing, e.g. blind spot detection
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W50/00Details of control systems for road vehicle drive control not related to the control of a particular sub-unit, e.g. process diagnostic or vehicle driver interfaces
    • B60W50/02Ensuring safety in case of control system failures, e.g. by diagnosing, circumventing or fixing failures
    • B60W50/0205Diagnosing or detecting failures; Failure detection models
    • B60W2050/0215Sensor drifts or sensor failures
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2520/00Input parameters relating to overall vehicle dynamics
    • B60W2520/10Longitudinal speed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects
    • B60W2554/801Lateral distance
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2554/00Input parameters relating to objects
    • B60W2554/80Spatial relation or speed relative to objects
    • B60W2554/804Relative longitudinal speed
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W2555/00Input parameters relating to exterior conditions, not covered by groups B60W2552/00, B60W2554/00
    • B60W2555/20Ambient conditions, e.g. wind or rain

Abstract

The present disclosure relates to a method and apparatus for assisting a driver of an own vehicle having one or more sensors configured to detect other vehicles in the vicinity of the own vehicle. The method, performed by a driver assistance control unit of the own vehicle when a lane change operation for changing from a current lane on which the own vehicle is traveling to a target lane is to be performed, comprises: determining an available detection range covered by the one or more sensors of the own vehicle; determining a required detection range for performing automatic or semi-automatic control of the lane change operation by the driver assistance control unit of the own vehicle; and controlling the lane change operation of the own vehicle based on a comparison of the determined available detection range and the determined required detection range.

Description

  • The present disclosure relates to a method and apparatus for assisting a driver of an own vehicle having one or more sensors configured to detect other vehicles in the vicinity of the own vehicle.
  • BACKGROUND
  • Automatic merging systems (automatic lane change systems) that automatically control a vehicle to change lanes or to drive on merging lanes are among the key features of autonomous driving systems, which aim to reduce the driver's workload and the risk of traffic accidents. However, it is difficult to use automatic merging systems at blind merging sections where obstacles can block the detection range of the sensors of the controlled vehicle (own vehicle): when surrounding vehicles cannot be detected, potentially because the detection range is blocked, the automatic merging system is typically configured to assess the situation conservatively based on safety requirements. If an undetected following vehicle emerges from a blind spot at high speed while the own vehicle is still being controlled to perform the lane change, the following vehicle may need to brake hard, or there may be a high risk of a collision.
  • Japanese Patent Laid-Open Publication JP 2014-180986 A relates to a lane change assist system including an obstacle detection means that detects an obstacle present in the periphery of a vehicle, a white line detection means that detects a white line of a lane, a required detection area calculating means that calculates, based on the detected white line, a required detection area requested when lane change control is implemented, a shield decision means that judges whether a detection range of the obstacle detection means is smaller than the required detection area, a control amount calculation means that, when the detection range of the obstacle detection means is smaller than the required detection area, calculates a control amount of the vehicle required to cause the detection area of the obstacle detection means to become equal to or larger than the required detection area, and a vehicle control means that controls the vehicle on the basis of the calculated control amount.
  • In situations around blind merging sections, however, a problem arises: the system may cause the own vehicle to accelerate and/or change its lateral position to make the blind spot smaller, but sometimes the system cannot achieve a sufficient reduction of the blind spot before the end of the merging section, and driving control then has to be handed over to the driver at the end of the merging section, when the driver no longer has enough time to safely take over control of the vehicle. It is therefore very difficult for the driver to take over control correctly and safely in such situations.
  • In order to avoid the above problems, it is an object of the present invention to provide methods and systems for assisting drivers in situations of automatic lane change control or lane-merging control, and to improve the reliability and safety of an automatic merging system or automatic lane change system.
  • SUMMARY
  • To achieve the above object, according to the present invention, a method according to claim 1, an apparatus according to claim 19 and a computer program product according to claim 20 are proposed.
  • A major advantage of the present invention is that lane change and lane-following situations can be controlled automatically in a safer and more reliable manner, and that, in the event of a problem due to blind spot situations, control can be handed over to the driver more reliably and safely.
  • A major aspect of exemplary embodiments is that an actual detection range (the sensor range actually available) is calculated based on detection information, particularly preferably when no vehicle is detected in the vicinity of the own vehicle in the target lane (as this may indicate that an obstacle limits the detection range of the sensors of the own vehicle). Preferably, the required detection range for the automatic merging can also be calculated, for example on the basis of map data and/or the detection information. Further, in the case that the required detection range is not covered by the actual detection range, driving control can be handed over to the driver of the own vehicle efficiently, reliably and safely.
  • According to one aspect of the present invention, a method for assisting a driver of an own vehicle with one or more sensors configured to detect other vehicles in the vicinity of the own vehicle is proposed.
  • In exemplary embodiments, the method, which is preferably performed by a driver assistance control unit of the own vehicle, e.g. when a lane change operation for changing from a current lane on which the own vehicle is traveling to a target lane is to be performed, may include: determining an available detection range covered by the one or more sensors of the own vehicle; determining a required detection range for performing automatic or semi-automatic control of the lane change operation by the driver assistance control unit of the own vehicle; and/or controlling the lane change operation of the own vehicle based on a comparison of the determined available detection range and the determined required detection range. A minimal sketch of this comparison logic is given after this paragraph.
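  • As a purely illustrative, minimal Python sketch of the comparison described above (not part of the patent; all names and numbers are hypothetical placeholders):

```python
from dataclasses import dataclass

@dataclass
class LaneChangeDecision:
    perform_automatically: bool
    reason: str

def decide_lane_change(available_range_m: float, required_range_m: float) -> LaneChangeDecision:
    """Compare the detection range actually available behind the own vehicle in the
    target lane with the range required for an automatic lane change."""
    if available_range_m >= required_range_m:
        # The sensors cover everything the maneuver needs: perform it automatically.
        return LaneChangeDecision(True, "available detection range covers required range")
    # Otherwise hand control back to the driver, e.g. via a warning over the HMI.
    return LaneChangeDecision(False, "required detection range not covered; warn driver")

if __name__ == "__main__":
    print(decide_lane_change(available_range_m=80.0, required_range_m=120.0))
```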
  • In preferred exemplary embodiments, controlling the lane change operation of the own vehicle may be performed when the determined available detection range is equal to or greater than the determined required detection range and/or when the determined available detection range covers the determined required detection range.
  • In preferred exemplary embodiments, the method may include outputting, e.g. via a man-machine interface of the own vehicle, a warning message to the driver requesting that the driver take over control of the own vehicle.
  • In preferred exemplary embodiments, the required detection range may be determined based on a speed or velocity of the own vehicle.
  • In preferred exemplary embodiments, the required detection range may be determined based on an estimated length and/or an estimated time required for performing the lane change.
  • In preferred exemplary embodiments, the required detection range may be determined based on map data indicating the surrounding environment of the own vehicle and/or based on lane marker information detected by the one or more sensors of the own vehicle.
  • In preferred exemplary embodiments, the required detection range may be determined based on an estimated speed or velocity of a virtual vehicle.
  • In preferred exemplary embodiments, the required detection range may be determined based on an estimated maximum relative velocity of the virtual vehicle relative to the own vehicle, wherein the maximum relative velocity is preferably determined based on the estimated speed or velocity of the virtual vehicle and/or the speed or velocity of the own vehicle.
  • In preferred exemplary embodiments, the virtual vehicle may be assumed to travel in the target lane of the lane change operation outside of the determined available detection range.
  • In preferred exemplary embodiments, the estimated speed or velocity of the virtual vehicle may be determined based on a speed limit on the target lane.
  • In preferred exemplary embodiments, the estimated speed or velocity of the virtual vehicle may be determined based on a current weather condition, a time of day, a current season, and/or a current traffic condition.
  • In preferred exemplary embodiments, the estimated speed or velocity of the virtual vehicle may be determined based on statistical speed data indicating an average speed of vehicles on the target lane, or a statistically estimated maximum speed of vehicles traveling on the target lane; a sketch of such an estimation is given after this paragraph.
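  • As an illustration of the virtual-vehicle idea sketched in the preceding paragraphs, the following Python fragment combines a speed limit, optional statistical speed data and a weather-dependent factor into an assumed speed of an undetected following vehicle, and derives a required detection range from the resulting maximum relative speed and an estimated lane change duration. Every function name, parameter and numeric constant is a hypothetical placeholder, not a value taken from the patent.

```python
from typing import Optional

def estimate_virtual_vehicle_speed(speed_limit_kmh: float,
                                   statistical_max_kmh: Optional[float] = None,
                                   bad_weather: bool = False) -> float:
    """Assumed speed of an undetected (virtual) following vehicle in the target lane."""
    # Start from the legal speed limit of the target lane.
    v = speed_limit_kmh
    # Prefer a statistically observed maximum speed if it is higher.
    if statistical_max_kmh is not None:
        v = max(v, statistical_max_kmh)
    # In bad weather, traffic is assumed to be slower (placeholder factor).
    if bad_weather:
        v *= 0.8
    return v

def required_detection_range_m(own_speed_kmh: float,
                               virtual_speed_kmh: float,
                               lane_change_time_s: float) -> float:
    """Range behind the own vehicle that must be observable so that a virtual vehicle
    approaching at the maximum relative speed cannot reach the own vehicle before
    the lane change is completed."""
    max_relative_speed_ms = max(0.0, (virtual_speed_kmh - own_speed_kmh) / 3.6)
    return max_relative_speed_ms * lane_change_time_s

if __name__ == "__main__":
    v_virtual = estimate_virtual_vehicle_speed(speed_limit_kmh=130.0, statistical_max_kmh=150.0)
    print(required_detection_range_m(own_speed_kmh=80.0, virtual_speed_kmh=v_virtual,
                                     lane_change_time_s=5.0))
```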
  • In preferred exemplary embodiments, the method may further include determining a last lane change start point on the current lane ahead of the own vehicle, wherein the available detection range may preferably be determined as an estimated available detection range for when the own vehicle is located at the last lane change start point, and/or the required detection range may preferably be determined as the detection range required at the last lane change start point to perform the lane change operation.
  • In preferred exemplary embodiments, the method may further include determining a last lane change start point on the current lane ahead of the own vehicle; determining an estimated second available detection range for when the own vehicle is located at the last lane change start point; determining a second required detection range for performing automatic or semi-automatic control of the lane change operation by the driver assistance control unit of the own vehicle at the last lane change start point; and/or controlling the lane change operation of the own vehicle at the last lane change start point based on a comparison of the estimated second available detection range and the determined second required detection range.
  • In preferred exemplary embodiments, controlling the lane change operation of the own vehicle at the last lane change start point may be performed when the estimated second available detection range is equal to or greater than the determined second required detection range and/or when the estimated second available detection range covers the determined second required detection range.
  • In preferred exemplary embodiments, the available detection range may be determined based on previously stored information indicating an available detection range at a location in front of the own vehicle.
  • In preferred exemplary embodiments, the steps of determining the available detection range, determining the required detection range, and/or controlling the lane change operation of the own vehicle based on a comparison of the determined available detection range and the determined required detection range may be performed if no other vehicle following the own vehicle in the target lane is detected by the one or more sensors of the own vehicle when the lane change operation is to be performed.
  • In preferred exemplary embodiments, when another vehicle following the own vehicle in the target lane is detected by the one or more sensors of the own vehicle when the lane change operation is to be performed, the method may preferably further include: determining a relative distance between the own vehicle and the detected other vehicle; determining a relative speed between the own vehicle and the detected other vehicle; and/or controlling the lane change operation of the own vehicle when the determined relative distance and/or the determined relative speed between the own vehicle and the detected other vehicle satisfy a lane change condition.
  • According to another aspect of the present invention, an apparatus mountable on an own vehicle with one or more sensors configured to detect other vehicles in the vicinity of the own vehicle is proposed; the device preferably comprising a control unit configured to perform the method of any of the aspects or embodiments described above or described below.
  • According to another aspect of the present invention, there is provided a computer program product comprising a computer program having computer program instructions configured to cause a controller or processor to perform the steps of a method of any of the aspects or embodiments described above or described below.
  • Although certain exemplary aspects have been described above, it should be understood that such aspects merely illustrate and do not limit the broad invention, and that the exemplary aspects are not limited to the specific constructions and arrangements shown and described above, as various other changes, combinations, omissions, modifications and substitutions are possible in addition to those set forth in the paragraphs above.
  • Those skilled in the art will recognize that various adjustments, modifications, and / or a combination of the aspects just described may be configured. Therefore, it should be understood that other aspects than those specifically described herein may be practiced. Those skilled in the art, also in light of this disclosure, recognize that various aspects described herein may be combined to form other aspects of the present disclosure.
  • LIST OF FIGURES
    • 1 exemplifies a vehicle equipped with a driver assistance system according to exemplary embodiments;
    • 2 exemplifies a flowchart of a control process of a driver assistance system according to example embodiments;
    • 3A, 3B and 3C each show a different option for indicating a lane change intent to surrounding vehicles;
    • 4 exemplifies an example situation in which the vehicle is traveling on a road surrounded by three other vehicles;
    • 5 exemplifies a correlation map of the time required for lane change / merging as a function of the vehicle speed of the own vehicle;
    • 6 exemplifies a hatched range of allowable / possible lane changes over the distance between vehicles (horizontal axis) and the collision prediction time (vertical axis);
    • 7 exemplifies warning processing by illuminating a warning lamp on an instrument panel of the own vehicle and / or outputting a warning tone;
    • 8 exemplifies a flowchart of lane change control processing by the driver assistance control unit;
    • 9 exemplifies a situation in which an own vehicle should make a lane change, and a following vehicle in the target lane cannot be detected due to an obstacle (e.g., a wall) that limits the actual detection range;
    • 10 exemplifies a flowchart of an example of a process for determining a maximum relative speed;
    • 11 exemplifies an instrument panel of the own vehicle, which outputs a warning message to the driver;
    • 12 exemplifies a flowchart of another control process of a driver assistance system according to other example embodiments;
    • 13 exemplifies an overview of an exemplary merging section, e.g. of a highway; and
    • 14 exemplifies calculation / determination of the actual detection area on the basis of previously obtained data.
  • DETAILED DESCRIPTION OF THE DRAWINGS AND PREFERRED EXEMPLARY EMBODIMENTS
  • Hereinafter, preferred aspects and embodiments of the present invention will be described in detail with reference to the accompanying drawings. The same or similar features in various drawings and embodiments are designated by like reference numerals. Of course, the following detailed description with respect to various preferred aspects and preferred exemplary embodiments is not intended to limit the scope of the present invention.
  • 1 exemplifies a vehicle 100 equipped with a driver assistance system according to exemplary embodiments, e.g. a driver assistance system with an autonomous driving system.
  • In the context of the present disclosure, an autonomous driving system is a system configured to automatically (or at least semi-automatically) control vehicle driving operations, such as cruise control, overtaking maneuvers, lane change maneuvers, turning maneuvers, parking, etc., e.g. without influence of the driver or at least with very little influence or control by the driver.
  • Although the driver assistance system of exemplary embodiments of the present invention is optionally configured to control various different driving operations or maneuvers, and may be configured to control the vehicle fully autonomously, e.g. along a target route, exemplary embodiments relate in particular to the control of lane change driving operations in which the vehicle is operated to shift continuously from one lane of a road to another lane of the road, and particularly preferably to lane-merging operations, e.g. when a vehicle enters a highway or the like on an access lane that is to merge with a lane of the highway.
  • The vehicle 100 may be a vehicle with two, three, four or more wheels, and the vehicle may be powered by an internal combustion engine, an electric motor or a combination thereof. The vehicle may be driven by front-wheel drive, rear-wheel drive or four-wheel drive. By way of example, the vehicle 100 of 1 is shown as a four-wheeled car with a left front wheel "FL wheel", a right front wheel "FR wheel", a left rear wheel "RL wheel" and a right rear wheel "RR wheel".
  • Each of the wheels is exemplarily equipped with a respective one of the four brakes 16FL, 16FR, 16RL and 16RR (for example, equipped with brake cylinders, pistons, pads, etc.), and a respective wheel speed sensor is exemplarily provided for each of the four wheels, see the exemplary wheel speed sensors 22FL, 22FR, 22RL and 22RR in 1.
  • The driver assistance system of the vehicle 100 of 1 includes, for example, a driver assistance control unit 1 (driver assistance control unit) that is communicably connected with a steering control unit 8 (steering control unit) configured to control a steering control mechanism 10, a brake control unit 15 (brake control unit) configured to control a brake control mechanism 13, and a throttle control unit 19 (throttle control unit) configured to control a throttle control mechanism 20 of the vehicle 100.
  • In general, the driver assistance control unit 1 is configured to provide control values and/or control signals to the respective control units 8, 15 and 19 of the respective control mechanisms 10, 13 and 20, and the respective control units 8, 15 and 19 of the respective control mechanisms 10, 13 and 20 are configured to receive respective command values or command signals from the driver assistance control unit 1 through communication and to control the actuators of the control mechanisms 10, 13 and 20 based on the command values.
  • In some example embodiments, the driver assistance control unit 1 may be implemented as an autonomous driving control unit configured to autonomously and automatically control the respective control mechanisms 10, 13 and 20 so that driving operations are performed by the vehicle 100.
  • By way of example, the driver assistance control unit 1 may include a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), and an input/output unit, though these are not shown. A processing procedure for the vehicle driving assistance operations, e.g. as described below, may be stored in the ROM.
  • Further, map data (e.g., navigation map data or other map data indicating the geographic layout of a road system) may be stored in the ROM, and specific map data for the surroundings of the vehicle 100 can be extracted in accordance with an own vehicle position, which is calculated using a position sensor, e.g. based on satellite navigation systems such as data from a GPS sensor (not shown), the speeds of all wheels and/or a detected steering angle.
  • Although described in detail later, the driver assistance control unit 1 may be configured to calculate command values supplied to the respective control units 8, 15 and 19 of the respective control mechanisms 10, 13 and 20 for guiding the vehicle so as to effect control of desired driving operations, e.g. to make it possible to change a lane, e.g. on the basis of the relative distances and the relative speeds between the vehicle and the surrounding vehicles, which are detected by external detection sensors, when the lane is to be changed.
  • As mentioned above, the driver assistance control unit 1 operates on the basis of sensor information, including sensor information about the surrounding area of the vehicle 100. In the example of 1, the vehicle 100 exemplarily includes several sensors 2, 3, 4 and 5.
  • The sensors 2, 3, 4 and 5 may, for example, be equipped with sensor devices for detecting or sensing the outside of the vehicle, for detecting a surrounding region of the vehicle 100 and/or for detecting obstacles in the vicinity of the vehicle 100. Such sensors may be equipped with one or more cameras (including regular cameras such as CCD cameras, infrared cameras, and/or stereo cameras), one or more radars, and/or one or more lidars or ladars (laser radars), and so on.
  • By way of example, in 1 the sensors for detecting the outside of the vehicle include a camera 2 located at the front of the vehicle, laser radars 3 and 4 located on the right and left sides thereof, and a millimeter-wave radar 5 located at the rear thereof, thereby making it possible to detect relative distances and relative speeds between the vehicle 100 (own vehicle) and surrounding vehicles.
  • In exemplary embodiments, a combination of the above sensors is used as an example of the sensor structure, although the present invention is not limited thereto, and an ultrasonic sensor, a stereo camera, an infrared camera or combinations thereof and the like can be used together with or instead of the above sensors. Signals from the sensors can be supplied to the driver assistance control unit 1.
  • Further, as an example, the driver's input to a lane change assist input device 11 is delivered to the driver assistance control unit 1. By way of example, the lane change assist input device 11 may use the turn signals and/or flashing lights, and a lane change assist operation can be decided on the basis of their on and off information. The lane change assist input device 11, however, is not limited to the turn signals or flashing lights and may use a dedicated input device.
  • In general, the lane change assist input device 11 includes an input interface configured to receive an operation of the driver indicating a command of the driver and/or the intention of the driver to perform a lane change, or the like.
  • Summarizing the above, the driver assistance system according to exemplary embodiments may include multiple sensors 2, 3, 4 and 5 for recognizing or perceiving the outside of the vehicle; the steering control mechanism 10, the brake control mechanism 13 and the throttle control mechanism 20 for assisting a lane change on the basis of information detected by the sensors; the driver assistance control unit 1 for calculating command values that are fed to the actuators of the control mechanisms 10, 13 and 20; the steering control unit 8 for controlling the steering control mechanism 10 on the basis of the command value from the driver assistance control unit 1; the brake control unit 15 for controlling the brake control mechanism 13 on the basis of the command value to set the distribution of the braking force for each wheel; and the throttle control unit 19 for controlling the throttle control mechanism 20 on the basis of the command value to adjust an output torque of an engine. Furthermore, the driver assistance system of 1 exemplarily includes a warning device 23.
  • Further, the driver assistance control unit 1 is exemplarily supplied with sensor signals from a combined vehicle system sensor 14, which can, for example, detect a longitudinal acceleration, a lateral acceleration and a yaw rate, sensor signals from the wheel speed sensors 22FL to 22RR installed in the wheels, brake force commands from the driver assistance control unit 1, and/or sensor signals from a steering wheel angle detector 21 provided via the steering control unit 8.
  • Moreover, an output of the brake control unit 15 is exemplarily connected with the brake control mechanism 13, which may include a pump (not shown) and a control valve, and the brake control unit 15 can generate any braking force to apply to the wheels, regardless of the driver's brake pedal operation. The brake control unit 15 can detect spinning, drifting and wheel locking of the vehicle 100 based on the above information and can generate a braking force for the relevant wheels to suppress them, so that the handling and stability of the driver's operations or driving operations can be improved.
  • The driver assistance control unit 1 can transmit a brake command to the brake control unit 15, whereby any braking force can be generated in the vehicle 100. It should be noted that the present invention is not limited to the brake control unit and may use another actuator such as brake-by-wire or the like.
  • The brake control system of the driver assistance system of 1 exemplarily includes the brake control mechanism 13, which is communicatively connected with the brake control unit 15 configured to control the operation of the brakes 16FL, 16FR, 16RL and 16RR, e.g. on the basis of brake actuation control signals that are sent from the brake control unit 15 to the brake control mechanism 13 and/or to the respective brakes 16FL, 16FR, 16RL and 16RR, which apply the brakes on the basis of the brake actuation control signals.
  • By way of example, the brake control unit 15 outputs electrical control signals, and the brake control mechanism 13 can be implemented as an electronic control system with electronic actuators, but the brake control mechanism 13 may additionally or alternatively also have mechanical, hydraulic and/or pneumatic actuators.
  • By way of example, the vehicle 100 also includes a brake pedal 12 that can be operated by the vehicle driver, e.g. to influence or take over the vehicle control or the brake control of the vehicle 100. That is, the vehicle 100 is configured to enable the driver, via dedicated input devices such as the brake pedal 12, the steering wheel 6 or the accelerator pedal 17 of the vehicle 100, to influence or even take over the control of the vehicle 100.
  • The pedal force of the driver when pressing the brake pedal 12 may, for example, be boosted (e.g., doubled) by a brake booster (not shown) to produce an oil pressure according to the pedal force through a master cylinder (not shown). The generated fluid pressure can be supplied through the brake control mechanism 13 to the respective brake cylinders of the brakes 16FL to 16RR of the wheels. The brakes 16FL to 16RR of the wheels may consist of cylinders (not shown), pistons, brake pads and the like. The pistons may be driven by brake fluid supplied from the master cylinder (not shown), and the brake pads connected to the pistons may be pressed onto disc rotors. The disc rotors rotate together with the wheels (not shown). Consequently, a braking torque acting on the disc rotors becomes the braking force acting between the wheels and the road, so that a desired braking force can be exerted on the wheels in accordance with the driver's brake pedal operation.
  • By way of example, the brake control unit 15 may include a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), and/or an input/output unit, although these are not shown in detail in 1.
  • The steering control system of the driver assistance system of 1 exemplarily includes the steering control mechanism 10, which is communicably connected with the steering control unit 8 configured to control the operation of the steering control mechanism 10, which is exemplarily provided for the front wheels, e.g. on the basis of steering actuation control signals that are sent from the steering control unit 8 to the steering control mechanism 10.
  • By way of example, the steering control unit 8 outputs electrical control signals, and the steering control mechanism 10 can be implemented as an electrical control system with electric actuators, but the steering control mechanism 10 may additionally or alternatively also have mechanical, hydraulic and/or pneumatic actuators.
  • By way of example, the vehicle 100 also includes a steering wheel 6 that can be operated by the vehicle driver, e.g. to influence or take over the vehicle control or the steering control of the vehicle 100. That is, the vehicle 100 is configured to enable the driver, via dedicated input devices such as the steering wheel 6, the brake pedal 12 or the accelerator pedal 17 of the vehicle 100, to influence or even take over the control of the vehicle 100.
  • By way of example, the steering torque and/or a steering angle that the driver inputs via the steering wheel 6 may be detected by the steering torque detector 7 and/or the steering angle detector 21, respectively, and the steering control unit 8 may control a motor based on the detected information to generate an assist torque.
  • By way of example, the steering control unit 8 may include a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), and an input/output unit similar to the driver assistance control unit 1, although these are not shown in detail in 1. The steering control mechanism 10 can be operated by the resultant force of the driver's steering torque and the assist torque of the motor to turn the front wheels. On the other hand, a reaction force from the road surface may be transmitted to the steering control mechanism 10 in accordance with the rotation angle of the front wheels and then transmitted to the driver.
  • The steering control unit 8 can generate a torque by means of the motor 9 regardless of the steering operation of the driver and can thereby control the steering control mechanism 10. Consequently, the driver assistance control unit 1 can transmit a steering force command to the steering control unit 8 to thereby control the rotation of the front wheels to any rotation angle. However, the present invention is not limited to the use of a steering control unit and may use another actuator such as steering-by-wire or the like.
  • The throttle control system of the driver assistance system of 1 exemplarily includes the throttle control mechanism 20, which is communicably connected with the throttle control unit 19 configured to control the operation of the throttle control mechanism 20, e.g. on the basis of throttle actuation control signals that are sent from the throttle control unit 19 to the throttle control mechanism 20.
  • By way of example, the throttle control unit 19 outputs electrical control signals, and the throttle control mechanism 20 can be implemented as an electric control system with electric actuators, but the throttle control mechanism 20 may additionally or alternatively also have mechanical, hydraulic and/or pneumatic actuators. The throttle control mechanism 20 may further include a drive system of the vehicle 100, e.g. with an internal combustion engine and/or an electric drive motor.
  • By way of example, the vehicle 100 also includes an accelerator pedal 17 that can be operated by the vehicle driver, e.g. to influence or take over the vehicle control or the throttle control of the vehicle 100. That is, the vehicle 100 is configured to enable the driver, via dedicated input devices such as the accelerator pedal 17, the steering wheel 6 or the brake pedal 12 of the vehicle 100, to influence or even take over the control of the vehicle 100.
  • By way of example, the amount by which the driver depresses the accelerator pedal 17 can be detected by a stroke sensor 18 and input to the throttle control unit 19. The throttle control unit 19 may, for example, include a central processing unit (CPU), a read only memory (ROM), a random access memory (RAM), and an input/output unit similar to the driver assistance control unit 1, although these are not shown in detail in 1. The throttle control unit 19 may, for example, set a throttle opening according to the accelerator pedal travel amount in order to control the drive system of the throttle control mechanism 20. Consequently, the vehicle 100 can be accelerated according to the driver's accelerator pedal operation. Further, the throttle control unit may control the throttle opening independently of the accelerator pedal operation of the driver. Consequently, the driver assistance control unit 1 can transmit an acceleration command to the throttle control unit to generate any acceleration in the vehicle 100.
  • In summary, according to the above control operations, when the autonomous driving system exemplarily performs the automatic lane-merging control, the brake control mechanism and the throttle control mechanism can be controlled according to the circumstances of the surrounding vehicles in the vicinity of the vehicle 100 (own vehicle) so as to correct the speed of the vehicle 100, e.g. so that the vehicle can be guided to a position in which the lane can be changed. The system also controls the steering for the lane change by performing control of the steering control mechanism.
  • 2 exemplifies a flowchart of a control process of a driver assistance system according to exemplary embodiments. In particular, 2 is a flowchart showing, by way of example, a control process of automatic merging processing according to processing commands of a control program or control program part, e.g. stored in the memory of the driver assistance control unit 1.
  • First, the driver assistance control unit 1 judges whether a lane change / automatic merging is required or not (step S201). The driver assistance control unit 1 may, for example, judge that a lane change is required based on one or more of the following criteria: a lane change is required to follow a navigation route as determined by a navigation system of the vehicle; a lane change is required because the driver indicates an intention or request for a lane change, e.g. on the basis of an input to the lane change assist input device 11; a lane change is required because another vehicle driving in front of the own vehicle (vehicle 100) is to be overtaken to meet a target speed condition; a lane change is required for merging into another lane (e.g., at a freeway access point); a lane change is required because the current lane ends or due to road conditions, obstacles or the like; or a lane change is required to move into a lane designated for turning before a predetermined turn. A minimal sketch of such a check is given after this paragraph.
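  • As a purely illustrative sketch of step S201 (hypothetical flag names, not from the patent), the criteria listed above can be combined into a single boolean decision:

```python
from dataclasses import dataclass

@dataclass
class LaneChangeTriggers:
    navigation_route_requires_change: bool = False
    driver_requested_change: bool = False        # e.g. via the lane change assist input device 11
    overtake_needed_for_target_speed: bool = False
    merging_section_ahead: bool = False          # e.g. a freeway access lane
    current_lane_ends: bool = False
    turn_lane_required: bool = False

def lane_change_required(t: LaneChangeTriggers) -> bool:
    """Step S201: a lane change is required if any of the criteria holds."""
    return any((t.navigation_route_requires_change, t.driver_requested_change,
                t.overtake_needed_for_target_speed, t.merging_section_ahead,
                t.current_lane_ends, t.turn_lane_required))

if __name__ == "__main__":
    print(lane_change_required(LaneChangeTriggers(merging_section_ahead=True)))  # True
```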
  • If the assessment of step S201 is YES, the processing goes to step S202. If it is NO, it is judged that the lane change / automatic merging is not required, and the processing returns.
  • In processing step S202, the driver assistance control unit 1 performs the processing of notifying other vehicles in the surroundings of the intention of the vehicle 100 (own vehicle) to change lanes, as exemplarily shown in 3A, 3B and 3C.
  • 3A, 3B and 3C each show a different option for indicating a lane change intent to surrounding vehicles.
  • First, there is a method of turning on the turn signals (flashing lights) of the own vehicle 100, as exemplarily shown in 3A, to indicate a lane change intent to the surrounding vehicles, such as another vehicle 101.
  • Additionally or alternatively, there is a method of controlling the own vehicle 100 so that the own vehicle 100 is moved along the boundary to the adjacent lane to which the driver wants to change from the current lane, as exemplarily shown in 3B.
  • To execute the above control, the driver assistance control unit 1 may first detect the lane using information from the camera 2 arranged at the front of the vehicle. The driver assistance control unit 1 may then calculate a target steering angle or a target assist torque required for the above movement. The target steering angle or torque may be transmitted to the steering control unit 8. Consequently, the vehicle can be controlled so as to be moved along the boundary to the adjacent lane.
  • Additionally or alternatively, a method of reporting the lane change intent of the own vehicle 100 to other vehicles such as the vehicle 101 in step S202 is to transmit the lane change intent of the vehicle to other vehicles by vehicle-to-vehicle communication, as exemplarily shown in 3C.
  • Since, as described above, the lane change intention of the driver of the own vehicle 100 can be clearly transmitted to other vehicles through the processing of step S202, other vehicles such as the vehicle 101 in 3A to 3C can recognize the lane change intent of the vehicle, so that the lane change can be performed smoothly.
  • Returning to 2, the driver assistance control unit 1 next judges whether or not one or more following vehicles moving in the target lane of the intended lane change can be detected in the vicinity of the own vehicle 100 (step S203). If so, the processing continues with step S204, and if not, the processing continues with step S210.
  • That is, based on sensor information obtained by the sensors 2, 3, 4 and/or 5, it is checked in step S203 whether one or more vehicles in the surroundings of the own vehicle 100 can be detected driving within the target lane (i.e., the lane to which the path of the vehicle 100 should be changed and in which the vehicle 100 will travel after the lane change operation) next to or behind (at the rear of) the own vehicle 100.
  • As mentioned, if at least one following vehicle driving in the target lane of the intended lane change is detected in the vicinity of the own vehicle 100 in step S203, the process continues with step S204.
  • In step S204, the driver assistance control unit 1 determines or calculates relative distances and relative speeds between the own vehicle 100 and the detected surrounding vehicles on the basis of information from the sensors, such as the camera 2 for detecting the area in front of the vehicle, the laser radars 3 and 4 for detecting the areas to the sides of the vehicle, and the millimeter-wave radar 5 for detecting the area behind the vehicle, as exemplarily shown in 4.
  • 4 exemplifies an example situation in which the own vehicle 100 is traveling on a road, surrounded by three other vehicles 101, 102 and 103.
  • In 4, the own vehicle 100 exemplarily drives in the left lane of a road and is surrounded by three other vehicles 101, 102 and 103. The vehicle 103 exemplarily drives in the same lane as the own vehicle 100, in front of the own vehicle. The vehicle 101 exemplarily drives in the other lane (i.e., a potential target lane if a lane change is to be controlled) next to the own vehicle 100, and the vehicle 102 also exemplarily drives in the other lane (i.e., a potential target lane if a lane change is to be controlled) but behind the own vehicle 100.
  • In 4, the reference signs A2, A3, A4 and A5 exemplarily refer to the respective detection ranges of the sensors 2, 3, 4 and 5, which detect obstacles (e.g., vehicles) at the front (detection range A2 of the sensor 2) and rear (detection range A5 of the sensor 5) of the vehicle 100 as well as to the left (detection range A3 of the sensor 3) and right (detection range A4 of the sensor 4) of the vehicle 100. Among the vehicles in the surroundings of the own vehicle 100, the vehicle 102 is located in the sensor range A5 and can consequently be detected by the sensor 5; since the vehicle 101 is located in the sensor range A4, the vehicle 101 can be detected by the sensor 4; and since the vehicle 103 is located in the sensor range A2, the vehicle 103 can be detected by the sensor 2.
  • In the situation of 4, step S203 would exemplarily return YES, and the driver assistance control unit 1 would, in step S204, calculate the relative distance and the relative speed between the own vehicle and the vehicle 102, and potentially also the relative distance and the relative speed between the own vehicle and the vehicle 101, both of which are located in the potential target lane if a lane change is desired or intended.
  • The speed of the own vehicle 100 may, for example, be estimated based on information from the wheel speed sensors 22FL to 22RR. For example, the highest speed among the information of the four wheel speed sensors may be selected and set as the estimated vehicle speed. The vehicle speed estimation method is not limited to this, and another method using an average value of the wheel speed sensors may be used. A minimal sketch of both variants is given after this paragraph.
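  • The following short Python sketch illustrates the two estimation variants mentioned above (maximum or average of the four wheel speeds); the function and parameter names are hypothetical and the numbers are placeholders.

```python
from statistics import mean
from typing import Sequence

def estimate_vehicle_speed(wheel_speeds_kmh: Sequence[float], use_average: bool = False) -> float:
    """Estimate the own-vehicle speed V0 from the four wheel speed sensors
    (22FL, 22FR, 22RL, 22RR): highest wheel speed by default, or the average."""
    if len(wheel_speeds_kmh) != 4:
        raise ValueError("expected four wheel speed values")
    return mean(wheel_speeds_kmh) if use_average else max(wheel_speeds_kmh)

if __name__ == "__main__":
    speeds = [98.2, 98.5, 97.9, 98.4]  # km/h, placeholder values
    print(estimate_vehicle_speed(speeds))                    # 98.5 (maximum)
    print(estimate_vehicle_speed(speeds, use_average=True))  # 98.25 (average)
```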
  • The relative positions and relative speeds of the other vehicles may be expressed in a co-moving coordinate system, for example one whose origin is fixed at the center of gravity of the own vehicle 100, whose X-axis is exemplarily fixed in the direction of the front of the own vehicle 100 (for example, along the longitudinal axis of the vehicle 100), and whose Y-axis is exemplarily fixed toward the left side of the own vehicle 100.
  • The relative distances Xi (e.g., X1 for the vehicle 101, X2 for the vehicle 102 and X3 for the vehicle 103) and the relative speeds Vi (e.g., V1 for the vehicle 101, V2 for the vehicle 102 and V3 for the vehicle 103) between the center of gravity of the own vehicle 100 and the surrounding vehicles (i = 1, 2, 3, ..., i) in the X-axis direction at a time t can be expressed, for example, by the following expression:
$$\begin{cases} X_i(t) \\ V_i(t) = \dot{X}_i(t) \end{cases} \qquad (i = 1, 2, \dots, i) \tag{1}$$
  • The suffix i exemplarily denotes the i-th vehicle, e.g., i = 1 for the vehicle 101 and i = 2 for the vehicle 102. Further, the relative velocity Vi of the i-th vehicle is exemplarily defined such that the velocity is positive in the direction in which the i-th surrounding vehicle approaches the own vehicle 100.
  • Returning to 2, after determining the relative positions and the relative speeds of the other vehicles in step S204, the driver assistance control unit 1 exemplarily continues with calculating a collision risk in step S206, preferably for each detected surrounding vehicle, or at least for each detected surrounding vehicle in the target lane of the lane change if the lane is to be changed, on the basis of the respective determined relative positions and relative speeds.
  • By way of example, the vehicle speed of the own vehicle estimated above is applied to a predetermined correlation map of the vehicle speed V0 of the own vehicle 100 versus the time T1 required for the lane change / merging, as exemplarily shown in 5, to calculate the time T1 required for the lane change.
  • 5 exemplifies a correlation map that represents the time T1 required for the lane change / merging as a function of the vehicle speed V0 of the own vehicle 100.
  • The correlation map exemplarily shown in 5 is set such that the higher the vehicle speed V0 of the own vehicle 100 is, the shorter the time T1 required for the lane change is. Hence the time T1 required for the lane change (or lane merging) is shorter at high speeds and longer at low speeds, so that the time T1 required for the lane change can be calculated correctly according to the vehicle speed V0 of the own vehicle 100.
  • Such a correlation map may be predetermined, or may be used in other exemplary embodiments to determine an estimated time T1 required for the lane change as a function of the vehicle speed V0 of the own vehicle 100. Alternatively, the time required for a lane change may be calculated based on the vehicle speed and based on a detected lane width ahead of the own vehicle (e.g., based on a detected view of the camera 2 in front of the vehicle). A sketch of a look-up in such a map is given after this paragraph.
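  • One possible realization of such a correlation map is a simple look-up table with linear interpolation, as in the following Python sketch; the support points are invented placeholder values, not values from the patent.

```python
import bisect

# Hypothetical support points of the correlation map: (V0 in km/h, T1 in seconds).
# Higher own-vehicle speed -> shorter time required for the lane change.
CORRELATION_MAP = [(30.0, 8.0), (60.0, 6.0), (100.0, 4.5), (140.0, 3.5)]

def lane_change_time_s(v0_kmh: float) -> float:
    """Look up the lane change duration T1 for the own-vehicle speed V0,
    interpolating linearly between support points and clamping at the ends."""
    speeds = [v for v, _ in CORRELATION_MAP]
    times = [t for _, t in CORRELATION_MAP]
    if v0_kmh <= speeds[0]:
        return times[0]
    if v0_kmh >= speeds[-1]:
        return times[-1]
    j = bisect.bisect_right(speeds, v0_kmh)
    v_lo, v_hi = speeds[j - 1], speeds[j]
    t_lo, t_hi = times[j - 1], times[j]
    frac = (v0_kmh - v_lo) / (v_hi - v_lo)
    return t_lo + frac * (t_hi - t_lo)

if __name__ == "__main__":
    print(lane_change_time_s(80.0))  # interpolated between 6.0 s and 4.5 s -> 5.25 s
```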
  • Next, a distance between vehicles Xi^Gap at a time t + T1 to the i-th vehicle in the vicinity of the own vehicle 100, and an estimated time to collision (collision prediction time) Ti^ttc at the same time t + T1, representing the collision risk (after the time T1 required for the lane change) at the time t when the lane change is started, are calculated on the basis of the calculated time T1 required for the lane change by the following expressions:
$$X_i^{Gap}(t + T_1) = \left| X_i(t + T_1) \right| - \left( \tfrac{L_0}{2} + \tfrac{L_i}{2} \right) \tag{2}$$
$$T_i^{ttc}(t + T_1) = \begin{cases} \dfrac{X_i^{Gap}(t + T_1)}{V_i(t + T_1)} & (V_i > 0) \\ \infty & (V_i \le 0) \end{cases} \tag{3}$$
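  • The two expressions above translate directly into code; the following Python sketch (hypothetical names and placeholder numbers, with a simplified sign convention in which the relative distance is a magnitude and a positive relative speed means the other vehicle is approaching) also shows the propagation of the relative distance to time t + T1 described in the following paragraphs.

```python
import math

def predicted_relative_distance(x_i_now_m: float, v_i_ms: float, t1_s: float) -> float:
    """Estimate the magnitude of the relative distance X_i at time t + T1, assuming a
    constant relative speed V_i (positive = approaching) over the lane change duration T1."""
    return x_i_now_m - v_i_ms * t1_s

def gap_distance(x_i_t1_m: float, l0_m: float, li_m: float) -> float:
    """Expression (2): bumper-to-bumper gap between the own vehicle (length L0)
    and the i-th vehicle (length Li) at time t + T1."""
    return abs(x_i_t1_m) - (l0_m / 2.0 + li_m / 2.0)

def time_to_collision(gap_m: float, v_i_ms: float) -> float:
    """Expression (3): collision prediction time; infinite if the i-th vehicle
    is not approaching (V_i <= 0)."""
    return gap_m / v_i_ms if v_i_ms > 0.0 else math.inf

if __name__ == "__main__":
    x_t1 = predicted_relative_distance(x_i_now_m=40.0, v_i_ms=3.0, t1_s=5.0)  # 25.0 m
    gap = gap_distance(x_t1, l0_m=4.5, li_m=4.5)                              # 20.5 m
    print(gap, time_to_collision(gap, v_i_ms=3.0))                            # ~6.83 s
```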
  • In the above expressions, L0 represents the total length (in the longitudinal direction) of the own vehicle 100, and Li represents the length of the i-th vehicle in the vicinity of the own vehicle 100.
  • Thus, for an i-th vehicle, the distance between vehicles Xi^Gap at the instant t + T1 provides an estimated gap distance between the i-th vehicle and the own vehicle 100 after the completion of the lane change (or lane merging), based on the start time t of the lane change operation, the estimated duration T1 of the lane change process, and the expected relative distance Xi from the own vehicle 100 to the i-th vehicle at the time t + T1 at which the lane change is expected to be completed. The expected relative distance Xi from the own vehicle 100 to the i-th vehicle at time t + T1 can be estimated, for example, based on the relative distance Xi and the relative speed Vi of the i-th vehicle relative to the own vehicle 100 as calculated in step S204.
  • Further, the estimated collision prediction time Ti^ttc at the same time t + T1 exemplarily provides an estimate of the time until a collision between the i-th vehicle and the own vehicle 100 could occur after completing the lane change.
  • Next, the driver assistance control unit 1 judges whether the lane change can be performed or not (step S207 in 2) on the basis of the calculated distance between vehicles Xi^Gap(t+T1) and the collision prediction time Ti^ttc(t+T1), which are calculated by the expressions (2) and (3), e.g. for each of the surrounding vehicles, or at least for each of the surrounding vehicles driving in the target lane of the desired lane change.
  • It can be assumed, by way of example, that a lane change can be judged as possible when the calculated distance between vehicles Xi^Gap(t+T1) is greater than (or greater than or equal to) a relative distance threshold Xi^Gap_a (hereinafter referred to as the first predetermined value) and when the calculated collision prediction time Ti^ttc(t+T1) is greater than (or greater than or equal to) a threshold value Ti^ttc_a for the collision prediction time (hereinafter referred to as the second predetermined value).
  • For example, 6 exemplarily represents a hatched range of an allowable / possible lane change for pairs of the distance between vehicles Xi^Gap (horizontal axis) and the collision prediction time Ti^ttc (vertical axis).
  • The assessment standard for step S207 in 2 may preferably be set so that the lane change can be performed when the relative distances and the collision prediction times leave sufficient margin for all surrounding vehicles i, or at least for each surrounding vehicle i in the target lane, that is, when the following expressions are satisfied on the basis of the above-mentioned first and second thresholds Xi^Gap_a and Ti^ttc_a; if, for example, at least one of the conditions of expression (4) below is not fulfilled, it is instead judged in step S207 that the lane change cannot be performed.
$$X_i^{Gap}(t + T_1) > X_i^{Gap\_a}, \qquad T_i^{ttc} > T_i^{ttc\_a} \tag{4}$$
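  • Expression (4) can be checked per surrounding vehicle as in the following Python sketch, which reuses the hypothetical quantities from the earlier sketches; the threshold values are placeholders.

```python
import math
from dataclasses import dataclass
from typing import List

@dataclass
class SurroundingVehicle:
    gap_m: float   # X_i^Gap(t + T1): estimated gap after the lane change
    ttc_s: float   # T_i^ttc(t + T1): collision prediction time (math.inf if not approaching)

def lane_change_allowed(vehicles_in_target_lane: List[SurroundingVehicle],
                        gap_threshold_m: float = 10.0,
                        ttc_threshold_s: float = 6.0) -> bool:
    """Step S207 / expression (4): the lane change is allowed only if every relevant
    vehicle keeps both the gap and the collision prediction time above their thresholds."""
    return all(v.gap_m > gap_threshold_m and v.ttc_s > ttc_threshold_s
               for v in vehicles_in_target_lane)

if __name__ == "__main__":
    ok = lane_change_allowed([SurroundingVehicle(gap_m=25.0, ttc_s=8.0),
                              SurroundingVehicle(gap_m=18.0, ttc_s=math.inf)])
    print(ok)  # True -> proceed to step S209; otherwise warn the driver (step S208)
```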
  • Here, by way of example based on the example of 4 with the two vehicles 101 (i = 1) and 102 (i = 2) in the target lane, X1^Gap_a exemplarily denotes a relative distance threshold (hereinafter referred to as the first predetermined value) for judging whether the lane change can be performed with respect to the forward vehicle 101 in the target lane, and X2^Gap_a exemplarily denotes a threshold value for the relative distance (hereinafter referred to as the third predetermined value) for judging whether the lane change can be performed with respect to the rearward or following vehicle 102 in the target lane.
  • For example, it may be desired that the first and third predetermined values are given as distances (e.g., 7 m for the first predetermined value and 10 m for the third predetermined value) within which the lane change is considered not to be performed, regardless of the relative speed, if another vehicle exists within these relative distances when the lane is to be changed.
  • It should be noted that these predetermined thresholds may be predetermined, but need not be fixed, and in some example embodiments may be changed according to the vehicle speed V0 of the own vehicle 100 or by the driver. For example, the first and/or third thresholds may be calculated based on the vehicle speed V0 using a predetermined function or predetermined functions, e.g. such that the first and/or the third threshold is determined to be larger for higher vehicle speeds V0 and smaller for lower vehicle speeds V0 of the own vehicle 100.
  • On the other hand, by way of example based on the example of 4 with the two vehicles 101 (i = 1) and 102 (i = 2) in the target lane, the time parameter T1^ttc_a exemplarily denotes a threshold for the collision prediction time (hereinafter referred to as the second predetermined value) for judging whether the lane can be changed with respect to the forward vehicle 101 in the target lane, and the time parameter T2^ttc_a denotes a threshold for the collision prediction time (hereinafter referred to as the fourth predetermined value) for judging whether the lane can be changed with respect to the following vehicle 102 in the target lane.
  • Preferably, the second and fourth predetermined thresholds are time parameters (e.g., 5 seconds for the second predetermined value and 6 seconds for the fourth predetermined value) chosen such that the driver would sense the situation as dangerous if the calculated collision prediction time were to fall below them.
  • It should also be appreciated that these predetermined thresholds may be predetermined, but need not be fixed, and in some example embodiments may be changed / determined according to the vehicle speed V0 of the own vehicle 100 or by the driver. The second and/or fourth thresholds may, for example, be calculated based on the vehicle speed V0 using a predetermined function or predetermined functions, e.g. such that the second and/or the fourth threshold is determined to be larger for higher vehicle speeds V0 and smaller for lower vehicle speeds V0 of the own vehicle 100. A sketch of such speed-dependent thresholds is given after this paragraph.
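  • One simple way to realize such speed-dependent thresholds is a linear function of V0 with a lower clamp, as in the following Python sketch; the coefficients are invented placeholders and not taken from the patent.

```python
def speed_dependent_threshold(v0_kmh: float,
                              base_value: float,
                              slope_per_kmh: float,
                              minimum: float) -> float:
    """Threshold (gap distance or collision prediction time) that grows with the
    own-vehicle speed V0 and never drops below a minimum value."""
    return max(minimum, base_value + slope_per_kmh * v0_kmh)

if __name__ == "__main__":
    v0 = 120.0  # km/h
    gap_threshold_m = speed_dependent_threshold(v0, base_value=5.0, slope_per_kmh=0.05, minimum=7.0)
    ttc_threshold_s = speed_dependent_threshold(v0, base_value=4.0, slope_per_kmh=0.02, minimum=5.0)
    print(gap_threshold_m, ttc_threshold_s)  # 11.0 m, 6.4 s
```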
  • According to the exemplary assessment criterion of expression (4) above, when the lane change would be made in a state in which the collision prediction time is determined to be short (i.e. the respective relative speed is large) even though the relative distance is large, that is, if the own vehicle 100 could be overtaken by the following vehicle immediately after the lane change, it is preferably still judged in step S207, for safety reasons, that the lane cannot be changed.
  • Even if the determined relative speed is negative, that is, even if the respective i-th vehicle is moving farther away from the own vehicle 100, it can be judged in step S207 that the lane change cannot be performed when the relative distance is short, preferably in order to avoid a small gap between the vehicles after the lane change.
  • When it is judged in step S207 that the lane change can be performed (step S207 returns YES), e.g. according to the above judgment, the processing continues with step S209 in FIG. 2 and the desired lane change is carried out under the control of the driver assistance control unit 1.
  • If, on the other hand, the judgment of step S207 is NO, the processing continues with step S208 and a warning is output to the driver of the own vehicle 100.
  • It should be noted that the lane change feasibility judgment is not limited to that of FIG. 6 (or the above expression (4)); in other exemplary embodiments, the quantity defined on the horizontal axis of FIG. 6 may be replaced by the relative velocity Vi, for example.
  • In step S208, which is performed when step S207 returns NO, the driver assistance control unit 1 carries out a warning processing to output a warning to the driver of the own vehicle, for example as shown in FIG. 7.
  • FIG. 7 exemplifies a warning processing for illuminating a warning lamp on an instrument panel of the own vehicle 100 and/or emitting a warning tone. The warning by light and/or sound may vary in brightness or volume depending on the level of the calculated risk.
  • As shown in FIG. 7, the display size of a warning lamp of a warning device 8 (see also FIG. 1) and the volume of a warning sound may be changed according to the collision risks calculated in step S206, e.g. so that the driver is informed that the lane cannot be changed (step S208). The warning light display and/or the warning sound volume may be changed according to the collision risks, for example, so that the driver can anticipate when the lane change will become possible (a possible risk-to-intensity mapping is sketched below).
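  • The following is a minimal sketch of such a risk-dependent warning output; the mapping from risk to brightness and volume is an assumption for illustration only:
```python
# Sketch only: scaling warning lamp brightness and warning tone volume with the
# collision risk calculated in step S206 (risk normalized to [0, 1]).

def warning_output(risk):
    """Returns (lamp brightness in %, tone volume in %)."""
    risk = max(0.0, min(1.0, risk))
    brightness = 30.0 + 70.0 * risk   # dim indication at low risk, full brightness at high risk
    volume = 100.0 * risk             # silent at zero risk, loud at high risk
    return brightness, volume

print(warning_output(0.2))   # low risk  -> subtle warning
print(warning_output(0.9))   # high risk -> prominent warning
```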
  • Returning again to FIG. 2, if, on the other hand, step S207 returns YES, the desired lane change is carried out in step S209 under the control of the driver assistance control unit 1.
  • The driver assistance control unit 1 may, for example, perform the lane change control processing, e.g. as shown in FIG. 8.
  • FIG. 8 exemplifies a flowchart of a lane change control processing performed by the driver assistance control unit 1.
  • As exemplified in FIG. 8, a target path for the lane change is first calculated in a step S901, for example on the basis of obtained lane marking information, e.g. on the basis of the lane markings detected in front of the own vehicle 100 by the camera 2. Alternatively or additionally, the lane change target path may be determined on the basis of navigation information having map information about the lane positions of the road ahead of the detected vehicle position.
  • In step S902, a steering assist torque for following the target path is calculated on the basis of the target path determined in step S901, and the steering assist torque is sent to the steering control unit 8, which is instructed to steer the steering control mechanism 10 on the basis of the assist torque determined in step S902, so that the vehicle 100 performs the lane change under the control of the driver assistance control unit 1.
  • In step S903, the driver assistance control unit 1 judges whether the lane change is completed or not. When step S903 returns YES, the lane change control processing ends. When step S903 returns NO, the processing returns to step S901 (the overall loop is sketched below).
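  • The control loop S901 to S903 of FIG. 8 can be summarized by the following sketch; the callables stand in for the path planner, torque calculation, actuator interface and completion check, none of which are specified here:
```python
# Hypothetical rendering of the loop S901-S903; not the actual controller.

def lane_change_control(plan_target_path, compute_assist_torque, send_torque, lane_change_done):
    while True:
        path = plan_target_path()             # S901: target path from lane markings / map data
        torque = compute_assist_torque(path)  # S902: steering assist torque to follow the path
        send_torque(torque)                   #        command to the steering control unit
        if lane_change_done():                # S903: lane change completed?
            break                             # YES -> end of the lane change control processing
        # NO -> repeat from S901 with updated sensor information
```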
  • In the above, returning to FIG. 2, the processing was described for the case in which the driver assistance control unit 1 has detected at least one or more following vehicles in the vicinity of the own vehicle 100 driving in the target lane of the intended lane change (step S203 has returned YES).
  • Next, the processing will be described for the case in which the driver assistance control unit 1 has detected no following vehicle in the vicinity of the own vehicle 100 driving in the target lane of the intended lane change (step S203 has returned NO).
  • That is, if it is judged in step S203 that no following vehicles are detected in the target lane of the lane change, the detection area is calculated in a step S210. This is exemplified in FIG. 9.
  • FIG. 9 shows by way of example a situation in which the own vehicle 100 is to perform a lane change and a following vehicle 101 in the target lane cannot be detected due to an obstacle (e.g. a wall) that limits the actual detection range.
  • In particular, in the example of FIG. 9, the actual detection range available via the sensor(s) of the own vehicle 100 is limited insofar as the dashed area is not included in the actual detection area, since the dashed area cannot be detected due to the obstacle (e.g. a wall). The actual detection range may also be referred to as the actually available detection range and may be compared with the maximum available detection range (available when no obstacles limit the detection range of the sensor(s) of the own vehicle 100), the actually available detection area corresponding to the maximum available detection area when there are no obstacles in the detection area, whereas the actually available detection area may be smaller than the maximum available detection area if the detection area is limited by obstacles within it (e.g. in the case of 2D sensors such as radar).
  • For example, the actual detection range may be calculated on the basis of a radial maximum distance to one or more obstacles detected by the sensor(s) of the own vehicle 100 (an example of such a calculation is sketched below). It should be noted, however, that the calculation method of the actual detection range is not limited to the above method. For example, in the case of 3D sensors (e.g. a stereo camera or lidar devices), the detected road surface area may be set as the actual detection area. The actual detection area may also be expressed as an occupancy grid map.
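  • As a purely illustrative sketch (the representation and parameter values are assumptions), the actually available detection range can be approximated per bearing as the distance to the nearest detected obstacle, capped by the maximum sensor range:
```python
import math

def actual_detection_range(obstacles, max_range_m=120.0, angular_res_deg=5):
    """obstacles: list of (x, y) points in the vehicle frame [m].
    Returns a dict mapping bearing [deg] to the available range [m]."""
    ranges = {b: max_range_m for b in range(-90, 91, angular_res_deg)}
    for x, y in obstacles:
        bearing = round(math.degrees(math.atan2(y, x)) / angular_res_deg) * angular_res_deg
        if bearing in ranges:
            ranges[bearing] = min(ranges[bearing], math.hypot(x, y))
    return ranges

wall = [(20.0, float(y)) for y in range(-10, 11)]    # a wall roughly 20 m ahead
print(actual_detection_range(wall)[0])               # range straight ahead limited to ~20 m
```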
  • Returning to FIG. 2, a maximum relative speed is next calculated in step S211, e.g. on the basis of exemplary processing as shown in FIG. 10.
  • FIG. 10 exemplifies a flowchart of an example of a process for determining a maximum relative speed.
  • By way of example, the maximum relative speed or velocity is calculated in FIG. 10 by first determining/calculating the speed (or velocity) of the own vehicle 100 in a step S1101, e.g. on the basis of the speeds of all wheels, as explained above.
  • In step S1102, for example, the maximum speed of another, virtual vehicle is determined/calculated on the basis of a speed limit of the current road section, a time of day, a season, a road surface condition and/or accumulated speed data (such as statistical vehicle speed data) for that area, and so on. It should be noted that the method of estimating the maximum speed of a virtual vehicle is not limited to the above-mentioned aspects. The maximum speed of the virtual vehicle may be determined, for example, on the basis of or according to the speed limit of the current road segment, and a margin may be added (e.g. to cover the possibility that a following vehicle exceeds the speed limit by a certain speed or percentage, depending on traffic conditions, season, road conditions, etc.).
  • In other words, the current speed of the own vehicle 100 is determined in step S1101, and in step S1102 a maximum speed of an undetected potential following vehicle (virtual vehicle) traveling on the lane change target lane is estimated, for example on the basis of a speed limit of the current road section and/or on the basis of statistical speed data available for the current road section.
  • On the basis of the maximum speed or velocity of the virtual vehicle determined in step S1102 and the own vehicle speed or velocity determined in step S1101, the maximum relative speed between the own vehicle 100 and the other, virtual vehicle is calculated or determined in step S1103.
  • For example, assuming the following numbers are exemplary values only, if the own vehicle is on an on-ramp at an own vehicle speed of 80 km/h and the on-ramp merges into a highway with a speed limit of 100 km/h, the maximum relative speed between the own vehicle 100 and the other, virtual vehicle may be determined as 20 km/h on the basis of the difference between the own vehicle speed and the determined maximum speed of a virtual vehicle which is assumed to travel at the speed limit on the target lane of the highway. Taking into account, e.g., a safety margin of 10 km/h (assuming that the virtual vehicle may exceed the speed limit by 10 km/h), the maximum relative speed or velocity between the own vehicle 100 and the other, virtual vehicle may also be determined as 30 km/h on the basis of the difference between the own vehicle speed and the determined maximum speed of the virtual vehicle, which is assumed to travel at the speed limit on the target lane of the highway, while the safety margin is taken into account. Depending on the time of day, the expected traffic condition, season conditions, statistical data, etc., the maximum relative speed or velocity may be further adjusted for safety reasons (a corresponding calculation is sketched below).
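  • The determination of the maximum relative speed (steps S1101 to S1103 of FIG. 10) can be sketched as follows; the margin handling simply follows the numerical example above:
```python
# Sketch of steps S1101-S1103: relative speed of an undetected virtual following
# vehicle assumed to travel at the speed limit plus an optional safety margin.

def max_relative_speed_kmh(own_speed_kmh, speed_limit_kmh, margin_kmh=0.0):
    virtual_max_kmh = speed_limit_kmh + margin_kmh        # S1102: virtual vehicle maximum speed
    return max(0.0, virtual_max_kmh - own_speed_kmh)      # S1103: difference to own speed (S1101)

print(max_relative_speed_kmh(80, 100))        # 20 km/h, as in the example above
print(max_relative_speed_kmh(80, 100, 10))    # 30 km/h with a 10 km/h safety margin
```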
  • Returning to FIG. 2, the required detection range for the lane change and/or merging is then calculated or determined in step S212 on the basis of the maximum relative speed or velocity determined in step S211.
  • The required length for the lane change along the adjacent lane (target lane) may be calculated, for example, on the basis of the maximum relative speed or velocity determined in step S211. The required length may be calculated according to the relative speed: for example, the required length is determined larger when the relative speed or velocity is higher, and smaller when the relative speed or velocity is lower. Further, the required length may be based on the determined/estimated time T1 that is required for the lane change/merging, as discussed above.
  • By way of example, a required detection range which is estimated to have to be available in order to safely perform the lane change, i.e. which is regarded as a lane change/merging requirement, is calculated or determined on the basis of the determined required length, and may further be determined on the basis of map data and/or lane marking information detected by the sensor(s) of the own vehicle 100.
  • Since the required detection range can, by way of example, be calculated/determined on the basis of the shape of the adjacent traffic lane (target lane), e.g. on the basis of the map data and/or the lane marking information detected by the sensor(s) of the own vehicle 100, the above algorithm may also be applicable to merging or lane changes independently of the curve shape of the road. It should be noted, however, that the calculation method of the required detection range is not limited to the above method. For example, the required detection area may be calculated on the basis of the adjacent traffic lane and the traffic lane adjacent to the adjacent traffic lane (a simple length-based sketch follows below).
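  • Under the assumption that the required detection range can be reduced to a required length along the target lane, a minimal sketch is the distance that a virtual vehicle at the maximum relative speed covers during the estimated lane change/merging time T1, plus a buffer (the buffer and all numbers are illustrative only):
```python
# Sketch under assumptions: required detection length along the target lane.

def required_detection_length_m(max_rel_speed_kmh, t1_s, buffer_m=10.0):
    return (max_rel_speed_kmh / 3.6) * t1_s + buffer_m

print(required_detection_length_m(30.0, 6.0))   # 60.0 m for 30 km/h over an assumed T1 of 6 s
```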
  • In step S213, the driver assistance control unit 1 judges whether the required detection area determined in step S212 is covered by the actual detection range determined in step S210. In step S213, for example, it is determined whether or not the required detection area determined in step S212 is smaller than or equal to the actual detection range determined in step S210.
  • In other words, the driver assistance control unit 1 judges in step S213 whether or not a detection area requirement (detection area condition) is satisfied in the current lane change or merging situation.
  • When step S213 returns NO, the processing continues with step S214 and the driving task is handed over to the driver of the own vehicle 100. By a visual and/or audible warning or a handover request issued, for example, by a man-machine interface installed in the vehicle, the driver is instructed to take over control of the own vehicle 100, or at least warned that control of the own vehicle 100 must be taken over by manual control (the decision logic is sketched below).
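  • A minimal sketch of the decision of steps S213/S214, assuming the detection areas can be reduced to single lengths along the target lane (an assumption for illustration):
```python
# Sketch only: if the required detection range is not covered by the actually
# available detection range, control is handed over to the driver (S214);
# otherwise the automatic lane change (S209) may proceed.

def decide_lane_change(required_m, actual_m):
    if actual_m >= required_m:                    # S213 = YES
        return "perform automatic lane change (S209)"
    return "hand over control to the driver (S214)"

print(decide_lane_change(60.0, 35.0))   # detection condition violated -> handover
print(decide_lane_change(60.0, 80.0))   # condition satisfied -> automatic lane change
```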
  • FIG. 11, for example, exemplifies a dashboard of the own vehicle which outputs a warning message to the driver.
  • In addition to indicating when the driver must assume driving control, the warning message may include an indication of the reason why the driver needs to take over control of the vehicle, so that the driver can understand why manual control is required, e.g. with information as to why the automation level is reduced and/or when the driver must start manually controlling the vehicle.
  • When step S213 returns YES, the processing continues with step S209 and the driver assistance control unit 1 performs the lane change as described above by way of example, e.g. in conjunction with FIG. 8.
  • It should be noted that a major aspect of the present disclosure is described in conjunction with steps S210 to S214 in FIG. 2.
  • In other words, and summarizing the above, if no other vehicle (following vehicle) is detected in the target lane of the desired lane change/merging operation, which may be due to a limited actually available detection area, e.g. caused by an obstacle in the detection area, the required detection area that is needed to safely perform the lane change is determined, which aims, for example, at providing a safety margin even if a following vehicle approaches at high speed from outside the actual detection range in a blind merging situation.
  • Further, if the required detection range is not covered by the actual detection range, driving control is instead handed over to the driver. This aspect has the advantage of helping to avoid a sudden control handover to the driver, e.g. at the end of the merging area, so that the driver can take over vehicle control in an easier and safer way, even before the lane change or merging situation is initiated.
  • FIG. 12 exemplifies a flowchart of another control process of a driver assistance system according to other exemplary embodiments.
  • FIG. 12 exemplifies an automatic merging processing which may be stored in a memory of the driver assistance control unit 1, and the driver assistance control unit 1 may be configured to execute the corresponding control process.
  • In FIG. 12, the steps S1301 to S1314 are, for example, the same as steps S201 to S214 in FIG. 2 above.
  • When step S1313 returns NO, then, by way of example, a last merging start point (or last lane change start point) is calculated or determined in step S1315, e.g. on the basis of information detected by the camera 2 or other front sensor(s) of the own vehicle 100 and/or on the basis of map information.
  • The last merging start point exemplifies a position from which the own vehicle 100 can still merge safely or safely change lanes.
  • In step S1316, the actual detection area at the merging start point is then estimated on the basis of the current actual detection area and the determined merging start point.
  • In step S1317, the required detection area at the merging start point is calculated or determined on the basis of the required length, the map data and/or the lane marking information detected by the sensor(s) of the own vehicle 100 (e.g. similar to step S212, but relative to the determined last merging start point).
  • In step S1318, the driver assistance control unit 1 judges whether the required detection area at the merging start point is covered by the actual detection area at the merging start point (e.g. similar to step S213, but relative to the determined last merging start point).
  • When step S1318 returns YES, the processing continues with step S1319 in order to continue the merging maneuver (or lane change maneuver).
  • In step S1319, the vehicle may, for example, also be controlled so as to reduce a blind area in the actual detection area estimated at the last merging start point. This has the advantage that the possibility of performing the automatic merging or the automatic lane change can be increased.
  • When step S1318 returns NO, the processing continues with step S1314 and driving control is handed over to the driver, similarly to step S214 above.
  • As a result of the processing of steps S1315 to S1319, the vehicle may, for example, further perform the merging maneuver or the lane change maneuver under the condition that the merging detection requirement (detection condition), i.e. the judgment of step S1318, is satisfied at least at the merging start point or last merging start point (see the sketch below).
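  • The additional fallback of steps S1315 to S1318 can be summarized by the following sketch; the reduction of the detection areas to single lengths and the function names are assumptions for illustration only:
```python
# Sketch only: when the detection condition is not met at the current position,
# it is re-evaluated for the last merging start point further ahead (FIG. 12).

def merge_decision(actual_now_m, required_now_m,
                   actual_at_last_point_m, required_at_last_point_m):
    if actual_now_m >= required_now_m:                       # S1313 = YES
        return "perform merge / lane change now"
    # S1313 = NO -> evaluate the condition at the last merging start point (S1315-S1318)
    if actual_at_last_point_m >= required_at_last_point_m:   # S1318 = YES
        return "continue towards the last merging start point (S1319)"
    return "hand over control to the driver (S1314)"

print(merge_decision(35.0, 60.0, 70.0, 60.0))   # defer the decision to the last merging start point
```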
  • FIG. 13 exemplifies an overview of an exemplary merging section, e.g. onto a highway.
  • Although the actual detection range available to the sensor(s) of the own vehicle 100 upon entering the merging section is obstructed, for example by a wall or another obstacle, when the own vehicle 100 is at the beginning of the merging section, the driver assistance control unit 1 of the own vehicle 100 here compares, by way of example, the required detection range at the last merging start point with the actual detection range at the merging start point in exemplary embodiments according to FIG. 12. This has the advantage that the automatic merging/lane change operation can be performed depending on the situation.
  • FIG. 14 exemplifies a calculation/determination of the actual detection range on the basis of previously obtained data.
  • The upper section of FIG. 14 exemplifies the process of storing actual detection information with respect to the vehicle position in a database at a time A. The lower section of FIG. 14 exemplifies the process of extracting the previously stored actual detection information at a time B (later than time A). Such a database may be provided in the vehicle, or it may be located at a remote location, e.g. in a data center of a service provider, and such a center and/or such a database may communicate with the driver assistance control unit 1 via wireless communications.
  • The own vehicle 100 may, for example, extract the actual detection information and/or visibility information (e.g. low/medium/high visibility) for the merging section ahead before the own vehicle 100 reaches the merging section, in order to estimate the actual detection area at the last merging start point more accurately.
  • The information may also be provided to the driver via a man-machine interface. This is advantageous because such information is very useful to the driver, since the driver can prepare to take over control of the own vehicle 100 before the merging section is reached, or at least before the last merging start point is reached.
  • As previously mentioned, the database may be installed in the own vehicle 100 in exemplary embodiments, but the location and specific configuration of the database are not limited, and the database may be implemented in a data center in other exemplary embodiments, as previously mentioned. If the database is installed in a data center, other vehicles may use the same information through wireless communication, e.g. via vehicle-to-X communications. A possible data model for such a database is sketched below.
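  • As a purely illustrative sketch (the keying by a rounded position and the record contents are assumptions), the idea of FIG. 14 could be modeled as follows:
```python
# Sketch only: detection/visibility information recorded at time A, keyed by a
# rounded position, and looked up again at a later time B before the merging section.

detection_db = {}

def store_detection_info(position, info):
    """position: (lat, lon); info: e.g. {'visibility': 'low', 'range_m': 35.0}"""
    key = (round(position[0], 4), round(position[1], 4))   # roughly 10 m grid, an assumption
    detection_db[key] = info

def lookup_detection_info(position):
    key = (round(position[0], 4), round(position[1], 4))
    return detection_db.get(key)

store_detection_info((48.13710, 11.57540), {"visibility": "low", "range_m": 35.0})  # time A
print(lookup_detection_info((48.13712, 11.57541)))   # time B, approaching the same section
```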
  • As will be appreciated by one skilled in the art, the present invention, as described above and in the accompanying drawings, may be embodied as a method (e.g., a computer-implemented process or any other process), as an apparatus (including a device, a machine, a system, a computer program product and/or any other device), or as a combination of the foregoing.
  • Thus, embodiments of the present invention may take the form of an entirely hardware embodiment, an entirely software embodiment (including firmware, resident software, microcode, etc.) or an embodiment combining software and hardware aspects, which may generally be referred to herein as a "system". Further, embodiments of the present invention may take the form of a computer program product or a computer-readable medium having computer-executable program code embodied in the medium.
  • Embodiments of the present invention are described above with reference to flowchart illustrations and / or block diagrams of methods and apparatus. Of course, each block of the flowchart representations and / or block diagrams and / or combinations of blocks in the flowchart representations and / or block diagrams may be implemented by computer executable program code.
  • The computer-executable program code may be provided to a processor of a general-purpose computer, a special-purpose computer or another programmable data processing device, such as a control unit, to produce a particular machine, such that the program code, when executed via the processor of the computer or other programmable data processing device, creates means for implementing the functions/actions/outputs specified in the flowchart, block diagram block(s), figures and/or the written description. This computer-executable program code may also be stored in a computer-readable memory that can direct a computer or other programmable data processing device to function in a particular manner, such that the program code stored in the computer-readable memory produces an article of manufacture including instruction means which implement the functions/actions/outputs specified in the flowchart, block diagram block(s), figures and/or written description. The computer-executable program code may also be loaded onto a computer or other programmable data processing device to cause a series of operations to be performed on the computer or other programmable device in order to produce a computer-implemented process, such that the program code executed on the computer or other programmable device provides steps for implementing the functions/actions/outputs specified in the flowchart, block diagram block(s), figures and/or written description. Alternatively, the steps or actions implemented by the computer program may be combined with operator- or human-implemented steps or actions in order to carry out an embodiment of the invention.
  • It should also be noted that logic flows may be described herein to demonstrate various aspects of the invention, and the present invention should not be construed as limited to any particular logic flow or logic implementation. The described logic may be partitioned into different logic blocks (e.g., programs, modules, functions, or subroutines) without changing the overall results or otherwise departing from the true scope of the invention. Often, logic elements may be added, modified, omitted, performed in a different order, or implemented using different logic constructs (e.g., logic gates, loop primitives, conditional logic, and other logic constructs) without changing the overall results or otherwise departing from the true scope of the invention.
  • Although certain exemplary embodiments have been described and illustrated in the accompanying drawings, it is to be understood that such embodiments are merely illustrative of and not limiting of the broad invention, and that the embodiments of the invention are not limited to the specific constructions and arrangements shown and described since various other changes, combinations, omissions, modifications and substitutions are possible in addition to those set forth in the paragraphs above. Those skilled in the art will recognize that various adaptations, modifications, and / or a combination of the embodiments just described can be configured without departing from the scope and spirit of the invention. It is therefore to be understood that within the scope of the appended claims, the invention may be practiced otherwise than as specifically described herein. For example, unless expressly stated otherwise, the steps described herein may be performed by processes in sequences different from those described herein, and one or more steps may be combined, split, or performed concurrently. Those skilled in the art, in light of this disclosure, also recognize that various embodiments of the invention described herein can be combined to form other embodiments of the invention.
  • REFERENCES CITED IN THE DESCRIPTION
  • This list of documents cited by the applicant was generated automatically and is included solely for the reader's information. The list is not part of the German patent or utility model application. The DPMA assumes no liability for any errors or omissions.
  • Cited patent literature
    • JP 2014180986 A [0003]

Claims (20)

  1. A method for assisting a driver of an own vehicle having one or more sensors configured to detect other vehicles in the vicinity of the own vehicle, wherein the method, performed by a driver assistance control unit of the own vehicle when a lane change operation for departing from a current lane on which the vehicle is driving and changing to a target lane is to be performed, comprises: determining an available detection area which is detected by the one or more sensors of the own vehicle; determining a required detection area for performing automatic control or semi-automatic control of the lane change operation by the driver assistance control unit of the own vehicle; and controlling the lane change operation of the own vehicle based on a comparison of the determined available detection area and the determined required detection area.
  2. The method according to claim 1, characterized in that the controlling of the lane change operation of the own vehicle is performed when the determined available detection range is equal to or greater than the determined required detection range and/or when the determined available detection range covers the determined required detection range.
  3. Method according to Claim 1 or 2 characterized by outputting a warning message to the driver by a man-machine interface of the own vehicle requesting that the driver take over the control of the own vehicle.
  4. The method according to at least one of the preceding claims, characterized in that the required detection range is determined on the basis of a speed or velocity of the own vehicle.
  5. Method according to at least one of the preceding claims, characterized in that the required detection range is determined on the basis of an estimated required length and / or an estimated required time, which are estimated to be required for performing the lane change.
  6. The method according to at least one of the preceding claims, characterized in that the required detection range is determined on the basis of map data indicating a surrounding environment of the own vehicle and/or on the basis of lane marking information detected by one or more sensors of the own vehicle.
  7. The method according to at least one of the preceding claims, characterized in that the required detection range is determined on the basis of an estimated speed or velocity of a virtual vehicle.
  8. The method according to claims 6 and 7, characterized in that the required detection range is determined based on an estimated maximum relative speed or velocity of the virtual vehicle relative to the own vehicle, the maximum relative speed or velocity being determined based on the estimated speed or velocity of the virtual vehicle and the speed or velocity of the own vehicle.
  9. The method according to claim 7 or 8, characterized in that the virtual vehicle is estimated to be traveling in the target lane of the lane change operation outside the determined available detection area.
  10. The method according to at least one of claims 7 to 9, characterized in that the estimated speed or velocity of the virtual vehicle is determined based on a speed limit on the target lane.
  11. The method according to at least one of claims 7 to 10, characterized in that the estimated speed or velocity of the virtual vehicle is determined based on a current weather condition, a time of day, a current season and/or a current traffic condition.
  12. The method according to at least one of claims 7 to 11, characterized in that the estimated speed or velocity of the virtual vehicle is determined based on statistical speed data indicative of an average speed of vehicles on the target lane or a statistically estimated maximum speed of vehicles traveling on the target lane.
  13. The method according to at least one of the preceding claims, characterized in that it further comprises: determining a last lane change start point on the current lane in front of the own vehicle, wherein the available detection area is determined as the estimated available detection area for when the own vehicle is located at the last lane change start point, and the required detection area is determined as a required detection area which is required at the last lane change start point for performing the lane change operation.
  14. The method according to at least one of claims 1 to 13, characterized in that it further comprises: determining a last lane change start point on the current lane in front of the own vehicle; determining an estimated second available detection area for when the own vehicle is located at the last lane change start point; determining a second required detection area for performing automatic control or semi-automatic control of the lane change operation by the driver assistance control unit of the own vehicle at the last lane change start point; and controlling the lane change operation of the own vehicle at the last lane change start point on the basis of a comparison of the estimated second available detection area and the determined second required detection area.
  15. The method according to claim 14, characterized in that controlling the lane change operation of the own vehicle at the last lane change start point is performed when the determined estimated second available detection range is equal to or greater than the determined second required detection range and/or when the determined estimated second available detection range covers the determined second required detection range.
  16. Method according to at least one of the preceding claims, characterized in that the available detection area is determined on the basis of previously stored information indicating an available detection area of a location in front of the own vehicle.
  17. The method according to at least one of the preceding claims, characterized in that the steps of determining the available detection area, determining the required detection area and controlling the lane change operation of the own vehicle on the basis of a comparison of the determined available detection area and the determined required detection area are performed when no other vehicle following the own vehicle in the target lane is detected by the one or more sensors of the own vehicle when the lane change operation is to be performed.
  18. The method according to claim 17, characterized in that, when another vehicle following the own vehicle in the target lane is detected by the one or more sensors of the own vehicle when the lane change operation is to be performed, the method further comprises: determining a relative distance between the own vehicle and the detected other vehicle; determining a relative speed or velocity between the own vehicle and the detected other vehicle; and controlling the lane change operation of the own vehicle when the determined relative distance between the own vehicle and the detected other vehicle and/or the determined relative speed or velocity between the own vehicle and the detected other vehicle meet a lane change condition.
  19. An apparatus mountable to an own vehicle having one or more sensors configured to detect other vehicles in the vicinity of the subject vehicle, the apparatus comprising a controller configured to perform the method of any one of the preceding claims.
  20. A computer program product comprising a computer program having computer program instructions adapted to cause a control unit or processor to perform the steps of a method according to any one of claims 1 to 18.
DE102017219813.3A 2017-11-08 2017-11-08 A method and apparatus for assisting a driver of an own vehicle Pending DE102017219813A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
DE102017219813.3A DE102017219813A1 (en) 2017-11-08 2017-11-08 A method and apparatus for assisting a driver of an own vehicle

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
DE102017219813.3A DE102017219813A1 (en) 2017-11-08 2017-11-08 A method and apparatus for assisting a driver of an own vehicle
PCT/JP2018/037910 WO2019093061A1 (en) 2017-11-08 2018-10-11 Method and device for assisting driver of vehicle

Publications (1)

Publication Number Publication Date
DE102017219813A1 true DE102017219813A1 (en) 2019-05-09

Family

ID=66179275

Family Applications (1)

Application Number Title Priority Date Filing Date
DE102017219813.3A Pending DE102017219813A1 (en) 2017-11-08 2017-11-08 A method and apparatus for assisting a driver of an own vehicle

Country Status (2)

Country Link
DE (1) DE102017219813A1 (en)
WO (1) WO2019093061A1 (en)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004018681A1 (en) * 2004-04-17 2005-11-03 Daimlerchrysler Ag Collision avoidance between road vehicles travelling in opposite directions uses detectors to allow planned overtaking
JP2014180986A (en) 2013-03-21 2014-09-29 Toyota Motor Corp Lane change assist system
DE102014226462A1 (en) * 2014-12-18 2016-06-23 Honda Motor Co., Ltd. Adaptive driving control system with exceptional prediction
DE102016205142A1 (en) * 2016-03-29 2017-10-05 Volkswagen Aktiengesellschaft Methods, apparatus and computer program for initiating or performing a cooperative maneuver
EP3239960A1 (en) * 2014-12-26 2017-11-01 Hitachi Automotive Systems, Ltd. Vehicle control system

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5825239B2 (en) * 2012-10-09 2015-12-02 トヨタ自動車株式会社 Vehicle control device
EP3435354A4 (en) * 2016-03-25 2019-11-27 Hitachi Automotive Systems, Ltd. Vehicle control device

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE102004018681A1 (en) * 2004-04-17 2005-11-03 Daimlerchrysler Ag Collision avoidance between road vehicles travelling in opposite directions uses detectors to allow planned overtaking
JP2014180986A (en) 2013-03-21 2014-09-29 Toyota Motor Corp Lane change assist system
DE102014226462A1 (en) * 2014-12-18 2016-06-23 Honda Motor Co., Ltd. Adaptive driving control system with exceptional prediction
EP3239960A1 (en) * 2014-12-26 2017-11-01 Hitachi Automotive Systems, Ltd. Vehicle control system
DE102016205142A1 (en) * 2016-03-29 2017-10-05 Volkswagen Aktiengesellschaft Methods, apparatus and computer program for initiating or performing a cooperative maneuver

Also Published As

Publication number Publication date
WO2019093061A1 (en) 2019-05-16

Legal Events

Date Code Title Description
R012 Request for examination validly filed
R016 Response to examination communication