WO2016204213A1 - Vehicle control device and vehicle control method - Google Patents

Vehicle control device and vehicle control method

Info

Publication number
WO2016204213A1
WO2016204213A1 (PCT/JP2016/067896)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle control
vehicle
target
distance
detected
Prior art date
Application number
PCT/JP2016/067896
Other languages
English (en)
Japanese (ja)
Inventor
Yosuke ITO
Akinori MINEMURA
Shogo MATSUNAGA
Jun TSUCHIDA
Masayuki SHIMIZU
Wataru IKE
Original Assignee
DENSO CORPORATION
Toyota Motor Corporation
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016112096A external-priority patent/JP6539228B2/ja
Application filed by DENSO CORPORATION and Toyota Motor Corporation
Priority to US 15/736,661 (US10573180B2)
Priority to DE 112016002750.8 (DE112016002750B4)
Priority to CN 201680035181.9 (CN107848530B)
Publication of WO2016204213A1

Classifications

    • B PERFORMING OPERATIONS; TRANSPORTING
    • B60 VEHICLES IN GENERAL
    • B60W CONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00 Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08 Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09 Taking automatic action to avoid collision, e.g. braking and steering
    • B60W30/095 Predicting travel path or likelihood of collision
    • B60R VEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00 Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134 Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • B60T VEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
    • B60T7/00 Brake-action initiating means
    • B60T7/12 Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01S RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00 Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86 Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • G01S13/88 Radar or analogous systems specially adapted for specific applications
    • G01S13/93 Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G1/00 Traffic control systems for road vehicles
    • G08G1/16 Anti-collision systems

Definitions

  • the present invention relates to a vehicle control device and a vehicle control method for performing vehicle control on an object ahead of the host vehicle.
  • a known technique collates the radar target acquired by a radar sensor with the image target acquired by an image sensor and, when it is determined that the radar target and the image target originate from the same object, fuses the two to generate a new target (a fusion target).
  • with this technique, the recognition accuracy of an object such as a preceding vehicle ahead of the host vehicle can be improved, and vehicle control of the host vehicle with respect to the object can be performed appropriately by using the positional information of the object specified by the fusion target (see Patent Document 1).
  • the present invention has been made in view of the above, and its main object is to provide a vehicle control device and a vehicle control method capable of appropriately performing vehicle control on an object ahead of the host vehicle.
  • the vehicle control apparatus fuses first target information of an object ahead of the host vehicle, acquired as a reflected wave of a carrier wave, with second target information of the object, acquired by image processing of a captured image of the area ahead of the host vehicle, to generate a fusion target, and performs vehicle control of the host vehicle with respect to the object detected as the fusion target. The apparatus includes: a state determination unit that determines whether the object has transitioned from a state in which it is detected by the fusion target to a state in which it is detected only by the first target information; a distance determination unit that determines whether the distance to the object, at the time the state determination unit determines that the transition has occurred, is a predetermined short distance; and a vehicle control unit that performs vehicle control on the object when the distance determination unit determines that the distance to the object is the predetermined short distance.
  • when the second target information can no longer be acquired from the state in which the object is detected as a fusion target, the object transitions to a state in which it is detected only by the first target information. In that case, vehicle control for the object is performed on condition that the distance to the object is a predetermined short distance. Therefore, even after the object is no longer detected as a fusion target, it is possible to perform vehicle control on an object detected with high reliability.
  • brief description of the drawings: a schematic block diagram of the vehicle control apparatus in the first and second embodiments; a functional block diagram of the ECU in the first embodiment; an explanatory diagram of the relationship between the inter-vehicle distance and the image lost; a diagram showing the relationship between the relative speed and the time-to-collision; a flowchart of the vehicle control; a block diagram of the logic circuit for determining permission and prohibition of the PB; a functional block diagram of the ECU in the second embodiment; and a top view of the host vehicle approaching an object.
  • a vehicle control system 100 is mounted on a vehicle, detects an object existing ahead of the vehicle, and performs various controls to avoid or mitigate a collision with the object (a pre-crash safety system).
  • a vehicle equipped with the vehicle control system 100 is referred to as a host vehicle.
  • the vehicle control system 100 includes an ECU 10, various sensors 20, and a controlled object 30.
  • the various sensors 20 include an image sensor 21, a radar sensor 22, a vehicle speed sensor 23, and the like.
  • the image sensor 21 is a CCD camera, a monocular camera, a stereo camera or the like, and is installed near the upper end of the windshield of the host vehicle.
  • the image sensor 21 captures an image of a region extending over a predetermined range ahead of the host vehicle at predetermined time intervals. By subjecting the captured image to image processing, an object ahead of the host vehicle is acquired as target information (image target GT) and output to the ECU 10.
  • the image target GT includes, in addition to the distance and relative speed to the object in the traveling direction of the host vehicle, information such as the lateral width of the object and the lateral position indicating the position of the object in the vehicle width direction of the host vehicle. The ECU 10 therefore recognizes the image target GT as information having a predetermined width.
  • the radar sensor 22 detects an object ahead of the host vehicle as target information (radar target LT) using a directional electromagnetic wave such as a millimeter wave or a laser, and is attached to the front of the host vehicle with its optical axis facing forward.
  • the radar sensor 22 scans a region extending over a predetermined range ahead of the vehicle with a radar signal at predetermined time intervals, and receives the electromagnetic wave reflected from the surface of an object outside the vehicle, thereby acquiring the distance to the object, the relative speed, and the like as target information, which is output to the ECU 10.
  • the radar target LT includes information such as the distance to the object in the traveling direction of the host vehicle, the relative speed, and the lateral position indicating the position of the object in the vehicle width direction of the host vehicle.
  • the radar target LT corresponds to the first target information
  • the image target GT corresponds to the second target information.
  • the vehicle speed sensor 23 is provided on a rotating shaft that transmits power to the wheels of the host vehicle, and obtains the host vehicle speed that is the speed of the host vehicle based on the rotational speed of the rotating shaft.
  • the ECU 10 is an electronic control unit that controls the entire vehicle control system 100.
  • the ECU 10 is mainly composed of a CPU and includes a ROM, a RAM, and the like.
  • the ECU 10 fuses the image target GT and the radar target LT to detect an object (a preceding vehicle, a road obstacle, another vehicle, etc.) ahead of the host vehicle.
  • the position of the fusion target in the traveling direction of the host vehicle is specified based on the distance and relative speed of the radar target LT, and the position of the fusion target in the vehicle width direction of the host vehicle is specified based on the lateral width and lateral position of the image target GT.
  • by generating the fusion target from the radar target LT and the image target GT and specifying the position of the object by the fusion target, the position of the object is specified using whichever of the information acquired by the radar sensor 22 and the image sensor 21 has the higher accuracy, so the recognition accuracy of the position of the object can be improved.
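The division of roles described above can be sketched as follows. The class and field names, and the same-object matching tolerance `max_gap`, are illustrative assumptions, not values from the patent; the sketch only shows the idea of taking the longitudinal quantities from the radar target and the lateral quantities from the image target.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RadarTarget:           # first target information (radar target LT)
    distance: float          # longitudinal distance [m]
    relative_speed: float    # relative speed [m/s]
    lateral_position: float  # lateral offset from the host-vehicle axis [m]

@dataclass
class ImageTarget:           # second target information (image target GT)
    distance: float
    lateral_position: float
    width: float             # lateral width of the object [m]

@dataclass
class FusionTarget:
    distance: float          # taken from the radar target (higher accuracy)
    relative_speed: float
    lateral_position: float  # taken from the image target (higher accuracy)
    width: float

def fuse(lt: RadarTarget, gt: ImageTarget,
         max_gap: float = 2.0) -> Optional[FusionTarget]:
    """Fuse LT and GT into a fusion target when they plausibly come from
    the same object; `max_gap` is an assumed collation tolerance."""
    same_object = (abs(lt.distance - gt.distance) <= max_gap
                   and abs(lt.lateral_position - gt.lateral_position) <= max_gap)
    if not same_object:
        return None
    return FusionTarget(distance=lt.distance,
                        relative_speed=lt.relative_speed,
                        lateral_position=gt.lateral_position,
                        width=gt.width)
```

When the two targets do not collate, `fuse` returns `None`, which corresponds to the state in which no fusion target is generated.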
  • the ECU 10 performs well-known image processing such as template matching on the captured image acquired from the image sensor 21, thereby specifying the type of the object detected as the image target GT (another vehicle, a pedestrian, a road obstacle, etc.).
  • a plurality of dictionaries, which are image patterns indicating the features of each type of object, are stored in the ROM as templates for specifying the type of each object.
  • as the dictionaries, both a whole-body dictionary in which the features of the entire object are patterned and a half-body dictionary in which partial features of the object are patterned are stored.
  • information on the type of object recognized by the image sensor 21 is also input to the ECU 10.
  • the object recognition accuracy is improved by generating the fusion target on condition that the type of the object detected as the image target GT has been specified using the whole-body dictionary. When the type has been specified only using the half-body dictionary, the image target GT is not used for generating the fusion target.
  • the ECU 10 determines whether there is a possibility of collision between the object recognized as a fusion target and the host vehicle. Specifically, of the lateral position of the fusion target and the lateral position of the image target GT, the lateral position closest to the host vehicle is selected as the lateral position to be controlled. Then, based on the approach state between the selected lateral position of the object and the host vehicle, it is determined whether there is a possibility of collision between the host vehicle and the object.
  • when it is determined that there is a possibility of collision, a time-to-collision TTC (Time to Collision) is calculated, for example by dividing the distance in the traveling direction between the object and the host vehicle by the relative speed to the object.
  • here, the relative speed is obtained by subtracting the vehicle speed of the host vehicle from the vehicle speed of the preceding vehicle.
  • the TTC is an evaluation value indicating how many seconds later the host vehicle would collide with the object if it kept traveling at the current speed: the smaller the TTC, the higher the risk of collision, and the larger the TTC, the lower the risk of collision. Note that the TTC may be calculated in consideration of relative acceleration.
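The calculation above can be sketched as follows. The sign convention follows the text (relative speed is the preceding-vehicle speed minus the host-vehicle speed, so a negative value means the gap is closing); the quadratic handling of relative acceleration is one possible reading of the refinement the text merely mentions, and the function itself is illustrative.

```python
import math

def time_to_collision(distance: float, relative_speed: float,
                      relative_accel: float = 0.0) -> float:
    """TTC sketch: seconds until the longitudinal gap closes to zero.
    Returns math.inf when no collision is predicted."""
    v = -relative_speed   # closing speed (positive when approaching)
    a = -relative_accel   # closing acceleration
    if abs(a) < 1e-9:
        return distance / v if v > 0 else math.inf
    # Solve v*t + 0.5*a*t**2 = distance for the smallest positive t.
    disc = v * v + 2.0 * a * distance
    if disc < 0:
        return math.inf   # the gap never closes
    roots = [(-v + math.sqrt(disc)) / a, (-v - math.sqrt(disc)) / a]
    positive = [t for t in roots if t > 0]
    return min(positive) if positive else math.inf
```

For example, a 40 m gap closing at 8 m/s gives a TTC of 5 s, while a gap that is opening yields an infinite TTC.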
  • the ECU 10 compares the TTC and the operation timing of each controlled object 30, and if the TTC is equal to or less than the operation timing, the ECU 10 operates the corresponding controlled object 30.
  • if the TTC falls below the operation timing of the speaker, an alarm is issued to the driver by operating the speaker. If the TTC falls below the operation timing of the seat belt, the seat belt is retracted. If the TTC falls below the operation timing of the brake, the automatic brake is operated to reduce the collision speed. In this way, a collision between the host vehicle and the object is avoided or mitigated.
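The staged comparison can be sketched as follows. The device names come from the text, but the timing values and function names are illustrative assumptions; each actuator fires once the TTC drops to its own operation timing.

```python
def select_actions(ttc: float, timing: dict) -> list:
    """Compare the TTC against per-device operation timings and return
    the actions to trigger (staged escalation as the TTC shrinks)."""
    actions = []
    if ttc <= timing["speaker"]:
        actions.append("alarm")               # warn the driver via the speaker
    if ttc <= timing["seat_belt"]:
        actions.append("retract_seat_belt")   # pretension the seat belt
    if ttc <= timing["brake"]:
        actions.append("automatic_brake")     # reduce the collision speed
    return actions

# Assumed example timings [s]; real values would be calibrated per vehicle.
timings = {"speaker": 3.0, "seat_belt": 2.0, "brake": 1.2}
```

With these assumed timings, a TTC of 2.5 s triggers only the alarm, while a TTC of 1.0 s triggers all three stages.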
  • when the host vehicle approaches an object from a state in which the object is recognized as a fusion target, the lower end of the object may move out of the imaging range of the image sensor 21, so that the object can no longer be recognized as a fusion target (the image target GT is lost).
  • FIG. 2 shows an explanatory diagram of the relationship between the distance from the host vehicle to the object and the loss of the image target GT. The imaging range of the image sensor 21 is shown as its angle of view θ1. While the entire rear end portion of the preceding vehicle M2 is included in the angle of view θ1 of the image sensor 21, the type of the image target GT can be specified using the whole-body dictionary, and a fusion target can be generated.
  • in the present embodiment, when the image target GT is lost while the distance to the object is a predetermined short distance, the controlled object 30 is actuated when the TTC for the object recognized only by the radar target LT reaches the operation timing of the controlled object 30. On the other hand, when the distance to the object is not the predetermined short distance, the controlled object 30 is not actuated even if the TTC for the object recognized only by the radar target LT reaches the operation timing of the controlled object 30.
  • the predetermined short distance is a distance at which the lower end of the object can no longer be seen, and may be set for each vehicle type in consideration of the mounting height, the mounting angle, and the like of the image sensor 21. In this way, even when the image target GT is lost, vehicle control can be performed on a highly reliable object that was previously detected as a fusion target.
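As an illustrative sketch of how such a per-vehicle threshold could be derived from the camera geometry (the formula, symbols, and values are assumptions, not from the text): a camera mounted at height h above the road, whose lower field-of-view edge points down at an angle θ below horizontal, sees the road surface only beyond h/tan(θ), so an object's road-level lower end leaves the image inside that distance.

```python
import math

def lower_end_cutoff_distance(mount_height: float,
                              depression_angle_deg: float) -> float:
    """Distance at which a point on the road surface leaves the bottom
    edge of the camera's field of view. `mount_height` is the camera
    height above the road [m]; `depression_angle_deg` is the angle of
    the lower field-of-view edge below horizontal [deg]. Both are
    vehicle-specific assumptions, as the text notes."""
    return mount_height / math.tan(math.radians(depression_angle_deg))
```

For instance, with an assumed 1.0 m mounting height and a 45° lower edge, the cutoff is 1.0 m; shallower lower edges push the cutoff further out.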
  • when the relative speed between the object and the host vehicle is large, the TTC becomes small, so there is a high possibility that the operation of the controlled object 30 by the vehicle control has already been started before the image target GT is lost. In other words, the situation in which the operation of the controlled object 30 has not been started when the image target GT is lost is limited to situations in which the relative speed between the object and the host vehicle is small and the TTC is large.
  • FIG. 3 shows this relationship, with the vertical axis representing the relative speed and the horizontal axis representing the TTC. As the relative speed increases, the possibility increases that the controlled object 30 is actuated by the vehicle control for the object detected as a fusion target before the image target GT is lost. As the relative speed decreases, the object can be detected as a fusion target only up to a larger value of the TTC, so there is a high possibility that the image target GT is lost before the controlled object 30 is actuated by the vehicle control.
  • therefore, in the present embodiment, vehicle control is performed on an object that has switched from the fusion target to recognition only by the radar target LT on condition that, in addition to the distance between the object and the host vehicle being the predetermined short distance, the relative speed between the object and the host vehicle is smaller than a predetermined value.
  • after the object is no longer specified by the fusion target, the reliability of the object specified only by the radar target gradually decreases.
  • in the present embodiment, vehicle control is performed on condition that the host vehicle and the object are close to each other and the relative speed is small, so there is a high possibility that the host vehicle has already stopped before a predetermined time elapses after the image target GT is lost. Therefore, vehicle control is performed if the elapsed time since the object was last specified by the fusion target is within the predetermined time, and is not performed if the elapsed time exceeds the predetermined time.
  • when the object is no longer specified by the fusion target, the collision determination is performed using the lateral position of the object obtained from the radar target LT instead of the lateral position obtained from the fusion target.
  • in the present embodiment, after switching to detection by the radar target LT alone, it is determined whether the difference between the lateral position of the object specified by the fusion target immediately before the transition and the lateral position specified by the radar target LT immediately after the transition is less than a predetermined value. If the difference is large, the reliability of the object specified by the fusion target is regarded as low, and vehicle control using the radar target is not performed.
  • vehicle control is performed on the condition that the lateral position of the object is in a predetermined approaching state with respect to the host vehicle.
  • in step S11, the ECU 10 determines whether the radar target LT is in a fusion state. For example, the ECU 10 makes an affirmative determination when the image target GT is included within a predetermined range in the coordinate system of the radar target LT, that is, when the radar target LT and the image target GT are in a fusion state.
  • if in the fusion state, the ECU 10 determines in step S12 whether the lateral position of the object specified by the fusion target is equal to or less than a predetermined first threshold Th1. Specifically, taking the center position of the host vehicle M1 in the vehicle width direction as the axis (own lane O), the ECU 10 determines whether the distance between the own lane O and the lateral position is equal to or less than the first threshold Th1.
  • if step S12 is affirmed, the ECU 10 determines in step S13 whether the TTC is equal to or less than the operation timing Th2 of the controlled object 30. If step S13 is affirmed, the ECU 10 actuates the controlled object 30 in step S14. If step S12 or S13 is denied, the ECU 10 proceeds to step S22 and does not actuate the controlled object 30.
  • if the determination in step S11 is negative, the ECU 10 determines in step S15 whether the image target GT has been lost from the fusion state (image lost FSN). If step S15 is affirmative, the ECU 10 determines in step S16 whether the distance to the object specified by the radar target is equal to or less than a third threshold Th3. If step S16 is affirmative, the ECU 10 determines in step S17 whether the relative speed between the host vehicle and the object specified by the radar target is equal to or less than a fourth threshold Th4.
  • if step S17 is affirmative, the ECU 10 determines in step S18 whether the state in which the image target GT is lost has continued for no more than a predetermined number of cycles; this is affirmed when the state in which the object is detected only by the radar target LT has repeated a predetermined number of times or fewer since switching from the fusion target.
  • if step S18 is affirmative, the ECU 10 determines in step S19 whether the lateral position of the object specified by the radar target LT is equal to or less than a fifth threshold Th5.
  • the fifth threshold Th5 is set as a determination value for the approach state between the host vehicle and the object. If step S19 is affirmative, the ECU 10 determines in step S20 whether the lateral position of the object specified by the fusion target immediately before the image target GT was lost is equal to or less than a sixth threshold Th6. Note that the difference between the fifth threshold Th5 and the sixth threshold Th6 is set to be less than a predetermined value.
  • if steps S19 and S20 are affirmed, the ECU 10 determines in step S21 whether the TTC is equal to or less than the operation timing Th2 of the controlled object 30.
  • if step S21 is affirmed, the ECU 10 proceeds to step S14 and actuates the controlled object 30. If any of steps S15 to S21 is denied, the ECU 10 proceeds to step S22 and does not actuate the controlled object 30.
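The branch structure of steps S11 to S22 can be sketched as follows. The step order and the threshold names Th1 to Th6 come from the text; the dictionary keys, the return values, and the concrete handling are illustrative assumptions.

```python
def decide_actuation(s: dict) -> str:
    """Sketch of the FIG. 4 decision flow (steps S11-S22).
    `s` maps the quantities named in the text to current values."""
    if s["fusion_state"]:                                    # S11: fusion state?
        if (s["fusion_lateral_position"] <= s["Th1"]         # S12: lateral position
                and s["ttc"] <= s["Th2"]):                   # S13: TTC vs timing
            return "actuate"                                 # S14
        return "no_actuation"                                # S22
    if not s["image_lost_from_fusion"]:                      # S15: image lost FSN?
        return "no_actuation"
    if s["radar_distance"] > s["Th3"]:                       # S16: short distance?
        return "no_actuation"
    if s["relative_speed"] > s["Th4"]:                       # S17: small relative speed?
        return "no_actuation"
    if s["lost_cycles"] > s["max_lost_cycles"]:              # S18: lost-cycle count
        return "no_actuation"
    if s["radar_lateral_position"] > s["Th5"]:               # S19: radar lateral position
        return "no_actuation"
    if s["last_fusion_lateral_position"] > s["Th6"]:         # S20: last fusion lateral position
        return "no_actuation"
    if s["ttc"] <= s["Th2"]:                                 # S21: TTC vs timing
        return "actuate"                                     # S14
    return "no_actuation"                                    # S22

# Example state with assumed values satisfying the image-lost branch.
example = {
    "fusion_state": False, "image_lost_from_fusion": True,
    "radar_distance": 4.0, "Th3": 6.0,
    "relative_speed": 2.0, "Th4": 5.0,
    "lost_cycles": 2, "max_lost_cycles": 5,
    "radar_lateral_position": 0.4, "Th5": 1.0,
    "last_fusion_lateral_position": 0.5, "Th6": 1.2,
    "ttc": 0.8, "Th2": 1.0,
    "fusion_lateral_position": 0.0, "Th1": 1.0,
}
```

Failing any single check (for example, a radar distance above Th3) routes the flow to step S22, matching the "any of steps S15 to S21 is denied" clause.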
  • FIG. 1B shows functional blocks representing the functions of the ECU 10.
  • the ECU 10 includes a state determination unit 101, a distance determination unit 102, a lateral position acquisition unit 103, a lateral position determination unit 104, a relative speed determination unit 105, and a vehicle control unit 106.
  • the state determination unit 101 is a functional block that executes step S15 of the flowchart of FIG. 4; it determines whether the object has transitioned from a state in which it is detected by the fusion target to a state in which it is detected only by the first target information.
  • the distance determination unit 102 is a functional block that executes step S16; it determines whether the distance to the object, at the time the state determination unit 101 determines that the transition to detection by only the first target information has occurred, is a predetermined short distance.
  • the lateral position acquisition unit 103 is a functional block that acquires the lateral position of the object in the vehicle width direction; when it is determined that the object has transitioned from being detected as a fusion target to being detected only by the first target information, it acquires the lateral position of the object in the vehicle width direction using the first target information. In addition, at that transition, the lateral position acquisition unit 103 acquires a first lateral position of the object using the fusion target immediately before the transition, and acquires a second lateral position of the object using the first target information immediately after the transition.
  • the lateral position determination unit 104 is a functional block that executes steps S19 and S20. When the distance determination unit 102 determines that the distance to the object is the predetermined short distance, the lateral position determination unit 104 determines whether the lateral position of the object in the vehicle width direction acquired by the lateral position acquisition unit 103 is in a predetermined approach state with respect to the host vehicle (that is, whether the lateral position of the object specified by the radar target LT is equal to or less than the fifth threshold Th5), and, if it is, further determines whether the difference between the first lateral position and the second lateral position is equal to or greater than a predetermined value.
  • the relative speed determination unit 105 is a functional block that executes Step S17, and determines whether the relative speed between the object and the host vehicle is smaller than a predetermined value (Th4).
  • the vehicle control unit 106 is a functional block that executes steps S14 and S22, and performs vehicle control on the object according to the flowchart of FIG.
  • by detecting an object as a fusion target, the detection accuracy of the object can be improved. However, when the host vehicle approaches the object, a part of the object (for example, its lower end) moves out of the imaging range, the fusion target is no longer detected, and vehicle control for the object using the fusion target may no longer be performed.
  • in the above configuration, when the fusion target is no longer detected because the host vehicle has approached the object, vehicle control on the object is performed on condition that the distance to the object is a predetermined short distance. Therefore, even after the object is no longer detected as a fusion target, it is possible to perform vehicle control on an object detected with high reliability.
  • in addition, vehicle control for the object is executed on condition that the lateral position of the object is in a predetermined approach state with respect to the host vehicle. In this case, vehicle control is not performed when the possibility of collision between the object and the host vehicle is low, so after the object is no longer detected as a fusion target, vehicle control can be performed on a highly reliable object while suppressing unnecessary vehicle control.
  • furthermore, since the lateral position of the object in the vehicle width direction is acquired using the radar target LT, the lateral position of the object at that time can be acquired with high accuracy.
  • the second embodiment will be described focusing on the differences from the first embodiment.
  • in the first embodiment, the speaker, the seat belt, the brake, and the like are operated as vehicle control; in the second embodiment, the brake is operated to avoid a collision with the object.
  • the ECU 10a of the second embodiment (see also FIG. 1A) further includes a pedestrian determination unit 107 that determines whether the object is a pedestrian; the state determination unit 101, the distance determination unit 102, the lateral position acquisition unit 103, the lateral position determination unit 104, and the relative speed determination unit 105 are common to the first embodiment.
  • in addition to the functions of the vehicle control unit 106 of the first embodiment, the vehicle control unit 106a of the present embodiment performs primary brake control based on a first margin, which is a collision margin with respect to the object, and performs secondary brake control based on a second margin whose collision margin is smaller than the first margin.
  • that is, brake control in two stages, pre-braking (FPB) and intervention braking (PB), is employed.
  • the FPB is activated when the lateral position of the object specified by the fusion target is equal to or less than the predetermined first threshold Th1 and the TTC is equal to or less than the FPB operation timing Th7; when this operation condition is satisfied, the FPB flag is set to "1".
  • however, the FPB is set so as not to operate when the TTC is equal to or less than the operation timing Th8 of the PB; in other words, the PB is operated preferentially in that situation.
  • the PB operation condition is set so that the PB operates when the lateral position of the object is equal to or less than the predetermined fifth threshold Th5 and the TTC is equal to or less than the PB operation timing Th8; when this condition is satisfied, the PB flag is set to "1".
  • the TTC thresholds are set such that the FPB operation timing Th7 is larger than the PB operation timing Th8. That is, the thresholds are set so that the FPB is performed in a situation where the collision margin is large, in other words, where the possibility of collision is still low. Note that after the FPB is activated, the FPB operation is not stopped by a driver operation (for example, a steering or brake operation).
  • the FPB corresponds to the "primary brake control", the PB corresponds to the "secondary brake control", the operation timing Th7 corresponds to the "first margin", and the operation timing Th8 corresponds to the "second margin".
  • in the present embodiment, when it is determined that the image lost FSN has occurred at a short distance, the PB is activated on condition that the FPB has been activated. That is, at the time the PB operation condition is satisfied, if the FPB has been activated in advance, the PB is activated on the assumption that there is a possibility of collision; if the FPB has not been activated in advance, the possibility of collision is regarded as low and the PB is not activated. In other words, while the FPB is operating, operation of the PB is permitted, and while the FPB is not operating, operation of the PB is prohibited.
  • the FPB operating condition may temporarily cease to be satisfied, ending the FPB, due to a transient change in the behavior of the object (for example, the lateral position of the object temporarily exceeding the threshold). In that case, the PB operating condition may become satisfied when the lateral position of the object subsequently falls to or below the threshold again.
  • in such a case, if this occurs within a predetermined time T from the end of the FPB, the reliability of the detected object is considered to be high, and the execution of the PB is allowed.
  • that is, the PB is performed on the condition that there is a history indicating that the FPB ended within the predetermined time T immediately beforehand.
  • the brake experience flag is used as the history. The brake experience flag is set to “1” when the operation of the FPB is finished, and is reset to “0” after a predetermined time T has elapsed.
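The lifetime of the brake experience flag can be sketched as a small timer (illustrative Python; the class and method names are ours, not from the patent):

```python
class BrakeExperienceFlag:
    """Set to 1 when the FPB operation ends; resets to 0 once the
    predetermined time T has elapsed since that end."""

    def __init__(self, hold_time_t: float):
        self.hold_time_t = hold_time_t   # the predetermined time T [s]
        self._fpb_end_time = None        # timestamp of the last FPB end

    def on_fpb_end(self, now: float) -> None:
        # Called when the operation of the FPB is finished.
        self._fpb_end_time = now

    def value(self, now: float) -> int:
        # 1 within T of the FPB ending, 0 otherwise.
        if self._fpb_end_time is None:
            return 0
        return 1 if (now - self._fpb_end_time) <= self.hold_time_t else 0
```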
  • FIG. 5A shows a logic circuit 40 for determining permission and prohibition of PB operation in the present embodiment.
  • the logic circuit 40 includes a NAND circuit C1 that receives the FPB flag and the PB flag, an AND circuit C2 that receives the output signal of the NAND circuit C1 and the inverted signal of the brake experience flag, and an AND circuit C3 that receives the output signal of the AND circuit C2, the control signal of the image lost FSN, and the short distance determination signal. If the signal output from the AND circuit C3 is “1”, the operation of the PB is prohibited, and if the signal is “0”, the operation of the PB is permitted.
  • the NAND circuit C1 outputs “0” when both the FPB flag and the PB flag are “1”, and outputs “1” when at least one of the FPB flag and the PB flag is “0”.
  • the AND circuit C2 outputs “1” only when the output signal of the NAND circuit C1 is “1” and the brake experience flag is “0”.
  • as the control signal of the image lost FSN, “1” is input when the image target GT is lost from the fusion state (image lost FSN), and “0” is input otherwise.
  • as the short distance determination signal, “1” is input when the distance of the object specified by the radar target LT is equal to or smaller than the third threshold Th3, and “0” is input when the distance is larger than the third threshold Th3.
  • the AND circuit C3 outputs “1” only when the output signal of the AND circuit C2 is “1”, the image is in the lost FSN state, and the distance from the object is a short distance. That is, when the image lost state occurs at a short distance, if the FPB is not operating and there is no braking experience, the operation of the PB is prohibited.
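The gate structure of the logic circuit 40 maps directly onto boolean expressions. A minimal sketch (illustrative Python; the patent describes hardware-style gates, and the flag names here are our shorthand):

```python
def pb_prohibited(fpb_flag: bool, pb_flag: bool, brake_experience: bool,
                  image_lost_fsn: bool, short_distance: bool) -> bool:
    """Returns True when the operation of the PB is prohibited."""
    c1 = not (fpb_flag and pb_flag)                 # NAND circuit C1
    c2 = c1 and not brake_experience                # AND circuit C2 (inverted brake experience flag)
    c3 = c2 and image_lost_fsn and short_distance   # AND circuit C3
    return c3
```

With the FPB operating when the PB flag is set (both flags “1”), C1 outputs “0” and the PB is permitted; with neither an operating FPB nor a brake-experience history, an image-lost state at a short distance prohibits the PB.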
  • FIGS. 6, 8, and 10 are plan views in which the TTC (collision margin time) is the vertical axis and the lateral position relative to the host vehicle in the vehicle width direction is the horizontal axis.
  • on the vertical axis, the FPB operation timing Th7 and the PB operation timing Th8 are provided. When the TTC for the object falls below the respective threshold of the FPB or the PB, it is determined that the corresponding operation timing has been reached.
  • the horizontal axis is provided with an operating width W, which is the operating range of the FPB and the PB in the vehicle width direction.
  • the operating width W is set by adding a predetermined length to the width of the host vehicle. In FIGS. 6, 8, and 10, when a pedestrian as the object is located within the operating width W, the condition on the predetermined lateral position (Th1, Th5) is satisfied. FIGS. 6, 8, and 10 assume a scene in which a pedestrian approaches the vehicle along the broken arrow while the vehicle travels in its traveling direction. FIGS. 7, 9, and 11 show timing charts corresponding to FIGS. 6, 8, and 10, respectively.
  • FIGS. 6 and 7 show a scene in which the PB is activated while the FPB is still operating, after the FPB has been activated.
  • the FPB flag is set to “1” and the FPB is operated.
  • the pedestrian and the host vehicle approach each other further, and at timing t12, the state transitions to the image lost FSN state (“0” → “1”).
  • the short distance determination signal is “1”
  • the output signal of the AND circuit C3 is “1” (the same applies to timing t22 in FIG. 9 and timing t32 in FIG. 11).
  • the PB flag is set to “1”, the FPB is ended, and the PB is operated.
  • the output signal of the NAND circuit C1 becomes “0” when the PB flag is set to “1” while the FPB flag is “1”. Accordingly, the output signals of the AND circuit C2 and the AND circuit C3 become “0”, so that the operation of the PB is permitted. That is, in the image lost FSN state, when the distance to the object is determined to be a short distance, the PB is executed on the condition that the FPB is being executed, that is, that the FPB flag is “1”.
  • FIGS. 8 and 9 show scenes in which the FPB is temporarily ended after the FPB is activated, and then the PB is activated.
  • the FPB is activated at timing t21, and the state transitions to the image lost FSN state at timing t22 (“0” → “1”).
  • the FPB flag is reset to “0” and the FPB ends.
  • the brake experience flag is set to “1”.
  • the object is detected again within the operating width W.
  • the brake experience flag is maintained at “1”, whereby the PB is activated.
  • in the logic circuit 40, when the PB flag is set to “1”, the brake experience flag is already “1”. Based on this, the output signals of the AND circuit C2 and the AND circuit C3 become “0”, so that the operation of the PB is permitted. That is, in the image lost FSN state, when the distance to the object is determined to be a short distance, the PB is performed on the condition that there is a history that the FPB ended within the predetermined time T immediately beforehand, that is, that the brake experience flag is “1”.
  • the TTC is equal to or less than the operation timing Th7 at the timing t31, but since the pedestrian is not detected within the operation width W, the FPB flag is maintained in the “0” state.
  • the state transitions to the image lost FSN state (“0” → “1”).
  • the TTC becomes equal to or less than the operation timing Th8.
  • the PB flag is maintained at “0”.
  • the PB flag is set to “1”.
  • FIG. 12 shows a logic circuit 50 obtained by further adding a pedestrian determination signal to the logic circuit 40.
  • a pedestrian determination signal is input to the AND circuit C3.
  • as the pedestrian determination signal, “1” is input when it is determined that the object is a pedestrian, and “0” is input when it is determined that the object is not a pedestrian.
  • by acting so as to prohibit the operation of the PB under these conditions, unnecessary operation of the PB can be suppressed.
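Following the description of FIG. 12, the pedestrian determination signal is simply a fourth input to the AND circuit C3. A sketch (illustrative Python, extending the same boolean expression as for logic circuit 40; the flag names are our shorthand):

```python
def pb_prohibited_with_pedestrian(fpb_flag: bool, pb_flag: bool,
                                  brake_experience: bool,
                                  image_lost_fsn: bool,
                                  short_distance: bool,
                                  pedestrian: bool) -> bool:
    """Logic circuit 50: returns True when the operation of the PB is prohibited."""
    c1 = not (fpb_flag and pb_flag)    # NAND circuit C1
    c2 = c1 and not brake_experience   # AND circuit C2
    # AND circuit C3, now also gated by the pedestrian determination signal
    return c2 and image_lost_fsn and short_distance and pedestrian
```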
  • the operating width W can be made variable according to the type of object, the lateral speed of the object, and the like.
  • the operating width W and the lateral speed of the object have the relationship shown in FIG. 13: the operating width W increases as the lateral speed increases while the lateral speed of the object is equal to or lower than a predetermined value.
  • when the lateral speed exceeds the predetermined value, the operating width W becomes constant at the upper limit value.
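The saturating relationship of FIG. 13 can be sketched as follows (illustrative Python; the gain and upper limit are assumed parameters, not values from the patent):

```python
def operating_width(vehicle_width: float, lateral_speed: float,
                    gain: float, max_extra: float) -> float:
    """Operating width W: grows with the object's lateral speed and
    saturates at an upper limit (vehicle width plus max_extra)."""
    extra = min(gain * abs(lateral_speed), max_extra)
    return vehicle_width + extra
```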
  • the present embodiment can achieve the following effects.
  • even when, at a short distance, the object transitions from being detected as the fusion target to being detected only by the radar target, the vehicle control of the own vehicle is continued.
  • as a result, the occurrence of inconvenience, such as the vehicle control being suddenly stopped and the operation of the controlled object 30 abruptly ending, can be suppressed.
  • the ECU 10 may perform the vehicle control even after the image target GT is lost; for example, the vehicle control may be performed after the loss when the distance between the own lane O and the lateral position at the time the image target GT was lost is within a predetermined value.
  • the orientation of the imaging center axis of the image sensor 21 can change according to a change in the loaded weight of the host vehicle, and the short distance position at which the image target GT is lost therefore changes. Accordingly, in the flowchart of FIG. 4, the third threshold Th3 of the distance used in the determination of S16 may be set variably according to a change in the orientation of the imaging center axis of the image sensor 21. The change in the imaging center axis may be obtained based on the detection value of a weight sensor provided in the vehicle. In this case, as the loading weight of the vehicle increases, the rear side of the vehicle sinks relative to the front side and the imaging center axis points upward, so the third threshold Th3 is reduced.
  • the inter-vehicle distance at which the image target GT is lost can be determined more accurately.
  • the inter-vehicle distance at which the image target GT is lost may be obtained in consideration that the mounting height of the image sensor 21 changes in accordance with the change in the loaded weight in the host vehicle.
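One way to realize the load-dependent threshold described above (a sketch; the linear model and its parameters are our assumption — the patent only states that Th3 is reduced as the loaded weight increases):

```python
def adjusted_th3(base_th3: float, load_weight: float,
                 sensitivity: float, min_th3: float) -> float:
    """Reduce the third threshold Th3 as the detected loading weight
    increases (rear sinks, imaging center axis tilts upward),
    clamped to a lower bound."""
    return max(min_th3, base_th3 - sensitivity * load_weight)
```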
  • TTC is used as the operation timing of PB and FPB.
  • the collision margin is not limited to the TTC; any value representing a collision margin may be used. For example, a configuration may be adopted in which a distance to the object set based on the TTC is used.
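For reference, TTC is commonly computed as the remaining distance divided by the closing speed (a generic sketch, not a formula taken from the patent):

```python
def time_to_collision(distance_m: float, closing_speed_mps: float) -> float:
    """TTC in seconds; infinite when the object is not closing in."""
    if closing_speed_mps <= 0.0:
        return float("inf")
    return distance_m / closing_speed_mps
```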
  • in the above configuration, the object targeted by the vehicle control is a pedestrian.
  • however, the object is not limited to a pedestrian, and may be another vehicle, an obstacle on the road, or the like.

Abstract

A vehicle control device (10, 10a) generates a fusion target by fusing a radar target of an object ahead of a host vehicle, acquired from the reflected wave of a carrier wave, with an image target of the object acquired by image processing of a captured image of the area ahead of the host vehicle, and performs vehicle control on the object detected as the fusion target. The device determines whether the object has transitioned from being detected by means of the fusion target to being detected by means of the radar target alone and, when it determines that the object has transitioned to radar-only detection, determines whether the distance to the object is a prescribed short distance. When the distance to the object is determined to be the prescribed short distance, the vehicle control with respect to the object is carried out.
PCT/JP2016/067896 2015-06-16 2016-06-16 Vehicle control device and vehicle control method WO2016204213A1 (fr)

Priority Applications (3)

Application Number Priority Date Filing Date Title
US15/736,661 US10573180B2 (en) 2015-06-16 2016-06-16 Vehicle control device and vehicle control method
DE112016002750.8T DE112016002750B4 (de) 2015-06-16 2016-06-16 Vehicle control device and vehicle control method
CN201680035181.9A CN107848530B (zh) 2015-06-16 2016-06-16 Vehicle control device and vehicle control method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2015-121397 2015-06-16
JP2015121397 2015-06-16
JP2016-112096 2016-06-03
JP2016112096A JP6539228B2 (ja) 2015-06-16 2016-06-03 Vehicle control device and vehicle control method

Publications (1)

Publication Number Publication Date
WO2016204213A1 true WO2016204213A1 (fr) 2016-12-22

Family

ID=57545689

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/067896 WO2016204213A1 (fr) 2015-06-16 2016-06-16 Vehicle control device and vehicle control method

Country Status (1)

Country Link
WO (1) WO2016204213A1 (fr)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN108154084A (zh) * 2017-03-10 2018-06-12 南京沃杨机械科技有限公司 Farmland environment perception method using multi-sensor fusion for driverless agricultural machinery
JP2018106334A (ja) * 2016-12-26 2018-07-05 トヨタ自動車株式会社 Vehicle alert device
CN110576853A (zh) * 2018-06-07 2019-12-17 本田技研工业株式会社 Vehicle control system
CN111591287A (zh) * 2019-02-04 2020-08-28 丰田自动车株式会社 Pre-collision control device
CN111845610A (zh) * 2019-04-26 2020-10-30 丰田自动车株式会社 Vehicle control device
CN113264041A (zh) * 2020-02-17 2021-08-17 丰田自动车株式会社 Collision avoidance assistance device
CN114537310A (zh) * 2020-11-19 2022-05-27 丰田自动车株式会社 Door control device

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007226680A (ja) * 2006-02-24 2007-09-06 Toyota Motor Corp Object detection device
JP2008132867A (ja) * 2006-11-28 2008-06-12 Hitachi Ltd Collision avoidance support device and vehicle equipped with the same
JP2012048643A (ja) * 2010-08-30 2012-03-08 Denso Corp Object detection device
JP2014067169A (ja) * 2012-09-25 2014-04-17 Toyota Motor Corp Collision prediction device
JP2014117995A (ja) * 2012-12-14 2014-06-30 Daihatsu Motor Co Ltd Driving support device

Similar Documents

Publication Publication Date Title
CN107848530B (zh) Vehicle control device and vehicle control method
WO2016204213A1 (fr) Vehicle control device and vehicle control method
WO2017104773A1 (fr) Mobile body control device and mobile body control method
CN108156822B (zh) Vehicle control device and vehicle control method
WO2017111147A1 (fr) Travel assistance device and travel assistance method
JP6855776B2 (ja) Object detection device and object detection method
JP6361592B2 (ja) Vehicle control device
JP2018097582A (ja) Driving assistance device and driving assistance method
US10839232B2 (en) Vehicle control method and apparatus
JP6300181B2 (ja) Vehicle control device
JP2018097687A (ja) Vehicle control device and vehicle control method
WO2017171082A1 (fr) Vehicle control device and vehicle control method
JP6380232B2 (ja) Object detection device and object detection method
WO2017110871A1 (fr) Vehicle control device and vehicle control method
CN108137007B (zh) Vehicle control device and vehicle control method
WO2017043358A1 (fr) Object detection device and object detection method
US20210394754A1 (en) Driving assistance device
WO2017183668A1 (fr) Vehicle control device and vehicle control method
US20180372860A1 (en) Object detection device and object detection method
JP2018063605A (ja) Vehicle control device
WO2018110196A1 (fr) Vehicle control device and vehicle control method
WO2018070380A1 (fr) Vehicle control apparatus
JP7561098B2 (ja) Vehicle control device and program
US20220366702A1 (en) Object detection device
JP7413548B2 (ja) Travel assistance device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 16811687

Country of ref document: EP

Kind code of ref document: A1

WWE Wipo information: entry into national phase

Ref document number: 15736661

Country of ref document: US

WWE Wipo information: entry into national phase

Ref document number: 112016002750

Country of ref document: DE

122 Ep: pct application non-entry in european phase

Ref document number: 16811687

Country of ref document: EP

Kind code of ref document: A1