WO2016204213A1 - Vehicle control device and vehicle control method - Google Patents

Vehicle control device and vehicle control method

Info

Publication number
WO2016204213A1
WO2016204213A1 (PCT/JP2016/067896)
Authority
WO
WIPO (PCT)
Prior art keywords
vehicle control
vehicle
target
distance
detected
Prior art date
Application number
PCT/JP2016/067896
Other languages
French (fr)
Japanese (ja)
Inventor
洋介 伊東
明憲 峯村
昇悟 松永
淳 土田
政行 清水
渉 池
Original Assignee
株式会社デンソー
トヨタ自動車株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2016112096A external-priority patent/JP6539228B2/en
Application filed by 株式会社デンソー, トヨタ自動車株式会社
Priority to DE112016002750.8T priority Critical patent/DE112016002750B4/en
Priority to CN201680035181.9A priority patent/CN107848530B/en
Priority to US15/736,661 priority patent/US10573180B2/en
Publication of WO2016204213A1 publication Critical patent/WO2016204213A1/en

Classifications

    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/09Taking automatic action to avoid collision, e.g. braking and steering
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60WCONJOINT CONTROL OF VEHICLE SUB-UNITS OF DIFFERENT TYPE OR DIFFERENT FUNCTION; CONTROL SYSTEMS SPECIALLY ADAPTED FOR HYBRID VEHICLES; ROAD VEHICLE DRIVE CONTROL SYSTEMS FOR PURPOSES NOT RELATED TO THE CONTROL OF A PARTICULAR SUB-UNIT
    • B60W30/00Purposes of road vehicle drive control systems not related to the control of a particular sub-unit, e.g. of systems using conjoint control of vehicle sub-units
    • B60W30/08Active safety systems predicting or avoiding probable or impending collision or attempting to minimise its consequences
    • B60W30/095Predicting travel path or likelihood of collision
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60RVEHICLES, VEHICLE FITTINGS, OR VEHICLE PARTS, NOT OTHERWISE PROVIDED FOR
    • B60R21/00Arrangements or fittings on vehicles for protecting or preventing injuries to occupants or pedestrians in case of accidents or other traffic risks
    • B60R21/01Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents
    • B60R21/013Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over
    • B60R21/0134Electrical circuits for triggering passive safety arrangements, e.g. airbags, safety belt tighteners, in case of vehicle accidents or impending vehicle accidents including means for detecting collisions, impending collisions or roll-over responsive to imminent contact with an obstacle, e.g. using radar systems
    • BPERFORMING OPERATIONS; TRANSPORTING
    • B60VEHICLES IN GENERAL
    • B60TVEHICLE BRAKE CONTROL SYSTEMS OR PARTS THEREOF; BRAKE CONTROL SYSTEMS OR PARTS THEREOF, IN GENERAL; ARRANGEMENT OF BRAKING ELEMENTS ON VEHICLES IN GENERAL; PORTABLE DEVICES FOR PREVENTING UNWANTED MOVEMENT OF VEHICLES; VEHICLE MODIFICATIONS TO FACILITATE COOLING OF BRAKES
    • B60T7/00Brake-action initiating means
    • B60T7/12Brake-action initiating means for automatic initiation; for initiation not subject to will of driver or passenger
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/86Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
    • GPHYSICS
    • G01MEASURING; TESTING
    • G01SRADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
    • G01S13/00Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
    • G01S13/88Radar or analogous systems specially adapted for specific applications
    • G01S13/93Radar or analogous systems specially adapted for specific applications for anti-collision purposes
    • GPHYSICS
    • G08SIGNALLING
    • G08GTRAFFIC CONTROL SYSTEMS
    • G08G1/00Traffic control systems for road vehicles
    • G08G1/16Anti-collision systems

Definitions

  • the present invention relates to a vehicle control device and a vehicle control method for performing vehicle control on an object ahead of the host vehicle.
  • a technique is known in which the radar target acquired by the radar sensor and the image target acquired by the image sensor are collated, and when the radar target and the image target are determined to be from the same object, a new target (fusion target) is generated by fusing the radar target with the image target.
  • the recognition accuracy of an object such as a preceding vehicle ahead of the host vehicle can be improved.
  • vehicle control of the host vehicle with respect to an object can be appropriately performed by using positional information on the object specified by a fusion target (see Patent Document 1).
  • the present invention has been made in view of the above, and has as its main object to provide a vehicle control device and a vehicle control method capable of appropriately performing vehicle control on an object ahead of the host vehicle.
  • the vehicle control apparatus fuses first target information of an object ahead of the host vehicle, acquired as a reflected wave of a carrier wave, with second target information of the object, acquired by image processing of a captured image of the area ahead of the host vehicle, to generate a fusion target, and performs vehicle control of the host vehicle with respect to the object detected as the fusion target. The apparatus includes: a state determination unit that determines whether the object has transitioned from a state in which it is detected by the fusion target to a state in which it is detected only by the first target information; a distance determination unit that determines whether the distance to the object, at the time the state determination unit determines that the transition has occurred, is a predetermined short distance; and a vehicle control unit that performs vehicle control for the object when the distance determination unit determines that the distance to the object is the predetermined short distance.
  • when the second target information can no longer be acquired from the state in which the object is detected as a fusion target, the object transitions to a state in which it is detected only by the first target information. In that case, vehicle control for the object is performed on condition that the distance to the object is a predetermined short distance. Therefore, even after the object is no longer detected as a fusion target, vehicle control can be performed on an object detected with high reliability.
  • FIG. 1A is a schematic block diagram of the vehicle control apparatus in the first and second embodiments; FIG. 1B is a functional block diagram of the ECU in the first embodiment; FIG. 2 is an explanatory drawing of the relationship between the inter-vehicle distance and loss of the image target; FIG. 3 shows the relationship between relative speed and collision margin time; FIG. 4 is a flowchart of the vehicle control; FIG. 5A is a block diagram of the logic circuit for determining permission and prohibition of PB; further drawings show a functional block diagram of the ECU in the second embodiment and a top view of the host vehicle approaching an object.
  • a vehicle control system 100 is mounted on a vehicle, detects an object existing ahead of the vehicle, and performs various controls to avoid or mitigate a collision with the object.
  • a vehicle equipped with the vehicle control system 100 is referred to as a host vehicle.
  • the vehicle control system 100 includes an ECU 10, various sensors 20, and a controlled object 30.
  • the various sensors 20 include an image sensor 21, a radar sensor 22, a vehicle speed sensor 23, and the like.
  • the image sensor 21 is a CCD camera, a monocular camera, a stereo camera or the like, and is installed near the upper end of the windshield of the host vehicle.
  • the image sensor 21 obtains a captured image by imaging a region extending over a predetermined range ahead of the host vehicle at predetermined time intervals. By subjecting the captured image to image processing, an object ahead of the host vehicle is acquired as target information (image target GT) and output to the ECU 10.
  • the image target GT includes, in addition to the distance and relative speed to the object in the traveling direction of the host vehicle, information such as the lateral position indicating the position of the object in the vehicle width direction of the host vehicle and the lateral width of the object. Therefore, the ECU 10 recognizes the image target GT as information having a predetermined width.
  • the radar sensor 22 detects an object ahead of the host vehicle as target information (radar target LT) using a directional electromagnetic wave such as a millimeter wave or a laser, and is attached to the front part of the host vehicle with its optical axis facing forward.
  • the radar sensor 22 scans, with a radar signal, a region extending over a predetermined range ahead of the vehicle at predetermined time intervals, and receives the electromagnetic wave reflected from the surface of an object outside the vehicle, thereby acquiring the distance to the object, the relative speed, and the like as target information and outputting them to the ECU 10.
  • the radar target LT includes information such as the distance to the object in the traveling direction of the host vehicle, the relative speed, and the lateral position indicating the position of the object in the vehicle width direction of the host vehicle.
  • the radar target LT corresponds to the first target information
  • the image target GT corresponds to the second target information.
  • the vehicle speed sensor 23 is provided on a rotating shaft that transmits power to the wheels of the host vehicle, and obtains the host vehicle speed that is the speed of the host vehicle based on the rotational speed of the rotating shaft.
  • the ECU 10 is an electronic control unit that controls the entire vehicle control system 100.
  • the ECU 10 is mainly composed of a CPU and includes a ROM, a RAM, and the like.
  • the ECU 10 fuses the image target GT and the radar target LT to detect an object (another vehicle, a pedestrian, a road obstacle, or the like) ahead of the host vehicle.
  • the position of the fusion target in the traveling direction of the host vehicle is specified based on the distance and relative speed of the radar target LT, and the position and width of the fusion target in the vehicle width direction of the host vehicle are specified based on the lateral width and lateral position of the image target GT.
  • because the fusion target is generated using the radar target LT and the image target GT, and the position of the object is specified by the fusion target, the position of the object is specified using whichever of the information acquired by the radar sensor 22 and the image sensor 21 has the higher accuracy for each component, and the recognition accuracy of the position of the object can be improved.
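  • as an illustrative sketch (the class and field names are our own, not from the patent), the fusion step described above could look like the following, taking the longitudinal quantities from the radar target and the lateral quantities from the image target, with a simple gating check that the two targets plausibly come from the same object:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class RadarTarget:            # first target information (radar target LT)
    distance: float           # distance in the traveling direction [m]
    relative_speed: float     # relative speed [m/s]
    lateral_position: float   # lateral position in the vehicle width direction [m]

@dataclass
class ImageTarget:            # second target information (image target GT)
    lateral_position: float   # lateral position in the vehicle width direction [m]
    width: float              # lateral width of the object [m]

@dataclass
class FusionTarget:
    distance: float           # from the radar target (more accurate in range)
    relative_speed: float     # from the radar target
    lateral_position: float   # from the image target (more accurate laterally)
    width: float              # from the image target

def fuse(lt: RadarTarget, gt: Optional[ImageTarget],
         lateral_gate: float = 1.0) -> Optional[FusionTarget]:
    """Generate a fusion target when the image target lies within a
    predetermined range of the radar target; return None otherwise
    (for example, when the image target is lost)."""
    if gt is None:
        return None
    if abs(gt.lateral_position - lt.lateral_position) > lateral_gate:
        return None  # not judged to be from the same object
    return FusionTarget(lt.distance, lt.relative_speed,
                        gt.lateral_position, gt.width)
```

  • the `lateral_gate` value is a placeholder for the "predetermined range" the text mentions; the key point is only that each component of the fusion target comes from the sensor that measures it more accurately.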
  • the ECU 10 performs well-known image processing such as template matching on the captured image acquired from the image sensor 21, thereby specifying the type of the object detected as the image target GT (another vehicle, a pedestrian, a road obstacle, or the like).
  • a plurality of dictionaries, which are image patterns indicating the features of each type of object, are stored in the ROM as templates for specifying the type of each object.
  • as the dictionaries, both a whole-body dictionary in which the features of the entire object are patterned and a half-body dictionary in which partial features of the object are patterned are stored.
  • Information on the type of object recognized by the image sensor 21 is also input to the ECU 10.
  • object recognition accuracy is improved by generating the fusion target on condition that the type of the object detected as the image target GT is specified using the whole-body dictionary. When the type is specified using only the half-body dictionary, the image target GT is not used for generation of the fusion target.
  • the ECU 10 determines whether there is a possibility of collision between the object recognized as the fusion target and the host vehicle. Specifically, among the lateral position of the fusion target and the lateral position of the image target GT, the lateral position closest to the host vehicle is selected as the lateral position to be controlled. Then, based on the approach state between the selected lateral position of the object and the host vehicle, it is determined whether there is a possibility of collision between the host vehicle and the object.
  • a collision margin time TTC (Time To Collision) with respect to the object is calculated by a method such as dividing the distance in the traveling direction between the object and the host vehicle by the relative speed with respect to the object.
  • the relative speed is obtained by subtracting the vehicle speed of the host vehicle from the vehicle speed of the preceding vehicle.
  • the TTC is an evaluation value indicating how many seconds later the host vehicle would collide with the object if it continued traveling at the current vehicle speed. The smaller the TTC, the higher the risk of collision; the larger the TTC, the lower the risk of collision. Note that the TTC may be calculated in consideration of the relative acceleration.
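  • a minimal sketch of the TTC computation described above (the function name and the acceleration extension are our own; the patent only states that relative acceleration may be taken into account):

```python
def time_to_collision(distance, relative_speed, relative_accel=0.0):
    """Collision margin time TTC [s]: how many seconds later the host
    vehicle would reach the object if the current closing speed (and,
    optionally, a constant relative acceleration) were maintained.
    `relative_speed` is the closing speed, positive when approaching."""
    if relative_accel == 0.0:
        if relative_speed <= 0.0:
            return float("inf")   # not closing on the object
        return distance / relative_speed
    # With constant relative acceleration a: d = v*t + a*t^2/2
    disc = relative_speed ** 2 + 2.0 * relative_accel * distance
    if disc < 0.0:
        return float("inf")       # closing speed reaches zero before contact
    t = (-relative_speed + disc ** 0.5) / relative_accel
    return t if t >= 0.0 else float("inf")
```

  • for example, 20 m of headway closed at 10 m/s gives a TTC of 2 s; a negative closing speed (the gap is opening) yields an infinite TTC, i.e. no predicted collision.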
  • the ECU 10 compares the TTC with the operation timing of each controlled object 30, and operates the corresponding controlled object 30 when the TTC is equal to or less than the operation timing.
  • if the TTC falls to or below the operation timing of the speaker, an alarm is issued to the driver by operating the speaker. If the TTC is equal to or less than the operation timing of the seat belt, the seat belt is wound up. If the TTC is equal to or less than the operation timing of the brake, the automatic brake is operated to reduce the collision speed. In this way, a collision between the host vehicle and the object is avoided or mitigated.
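  • the tiered actuation described above can be sketched as follows; the threshold values are illustrative placeholders, not figures from the patent:

```python
# Operation timings in seconds of TTC (illustrative values only).
SPEAKER_TIMING = 3.0    # alarm to the driver
SEATBELT_TIMING = 2.0   # wind up the seat belt
BRAKE_TIMING = 1.2      # automatic brake to reduce collision speed

def actuate_controlled_objects(ttc):
    """Return the controlled objects whose operation timing the TTC
    has reached (TTC equal to or less than the timing)."""
    actions = []
    if ttc <= SPEAKER_TIMING:
        actions.append("speaker")
    if ttc <= SEATBELT_TIMING:
        actions.append("seatbelt")
    if ttc <= BRAKE_TIMING:
        actions.append("brake")
    return actions
```

  • since the speaker timing is the largest, the warning fires first as the TTC shrinks, followed by the seat belt and finally the automatic brake.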
  • when the host vehicle approaches the object from a state in which the object is recognized as a fusion target, the object may cease to be recognized as a fusion target because the lower end of the object moves out of the imaging range of the image sensor 21 (the image target GT is lost).
  • FIG. 2 shows an explanatory diagram of the relationship between the distance between the host vehicle and the object and the lost image target GT.
  • in FIG. 2, the imaging range of the image sensor 21 is shown as an imaging angle of view θ1. While the entire rear end portion of the preceding vehicle M2 is included in the imaging angle of view θ1 of the image sensor 21, the type of the image target GT is specified using the whole-body dictionary, and a fusion target can be generated.
  • when it is determined that the image target GT has been lost at a predetermined short distance, the controlled object 30 is actuated if the TTC for the object recognized only by the radar target LT reaches the operation timing of the controlled object 30. Otherwise, the controlled object 30 is not actuated even if the TTC for the object recognized only by the radar target LT reaches the operation timing of the controlled object 30.
  • the predetermined short distance is a distance at which the lower end of the object can no longer be seen, and may be set for each vehicle type in consideration of the mounting height, mounting angle, and the like of the image sensor 21. In this way, even when the image target GT is lost, vehicle control can be performed on an object that was detected as a fusion target and is therefore highly reliable.
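  • as a geometric illustration of why such a threshold exists (the function and its parameters are our own construction; the patent only says the threshold depends on the mounting height and angle of the image sensor): with the camera mounted at height h and its lower field-of-view boundary depressed by an angle α below horizontal, a point at height y on the object leaves the view at distances closer than d = (h − y) / tan α.

```python
import math

def short_distance_threshold(mount_height, lower_edge_height, lower_fov_angle_deg):
    """Distance below which a point at `lower_edge_height` on the object
    falls under the camera's lower field-of-view boundary and can no
    longer be seen. All parameter names and values are illustrative."""
    drop = mount_height - lower_edge_height  # height the boundary ray must fall
    return drop / math.tan(math.radians(lower_fov_angle_deg))
```

  • a camera mounted 1.3 m high with a 45° lower boundary loses sight of a 0.3 m-high lower edge at about 1 m, consistent with the text's point that the threshold is a vehicle-type-specific geometric quantity.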
  • when the relative speed between the object and the host vehicle is large, the TTC becomes small, so there is a high possibility that operation of the controlled object 30 by the vehicle control has already started before the image target GT is lost. In other words, the situation in which operation of the controlled object 30 has not yet started when the image target GT is lost is limited to situations in which the relative speed between the object and the host vehicle is small and the TTC is large.
  • in FIG. 3, the vertical axis represents the relative speed and the horizontal axis represents the TTC. When the relative speed is large, there is a high possibility that the controlled object 30 is actuated by the vehicle control for the object detected as the fusion target before the image target GT is lost. As the relative speed decreases, the object can be detected as a fusion target only up to a larger value of TTC, so there is a high possibility that the image target GT is lost before the controlled object 30 is actuated by the vehicle control.
  • therefore, the vehicle control is performed on an object that has switched from the fusion target to recognition by the radar target LT alone, on condition that, in addition to the distance between the object and the host vehicle being the predetermined short distance, the relative speed between the object and the host vehicle is smaller than a predetermined value.
  • after the object is no longer specified by the fusion target, the reliability of the object specified by the radar target alone gradually decreases.
  • since the vehicle control is performed on condition that the host vehicle and the object are close to each other and the relative speed is small, there is a high possibility that the host vehicle has already stopped before a predetermined time elapses after the image target is lost. Therefore, the vehicle control is performed if the elapsed time after the object is no longer specified by the fusion target is within the predetermined time, and is not performed if the elapsed time exceeds the predetermined time.
  • after the transition, the collision determination is performed using the lateral position of the object obtained by the radar target LT instead of the lateral position of the object obtained by the fusion target.
  • after switching to detection by the radar target LT alone, it is determined whether the difference between the lateral position of the object specified by the fusion target immediately before the transition and the lateral position of the object specified by the radar target LT immediately after the transition is less than a predetermined value. If the difference between these lateral positions is large, the reliability of the object specified by the fusion target is regarded as low, and in this case vehicle control using the radar target is not performed.
  • vehicle control is performed on the condition that the lateral position of the object is in a predetermined approaching state with respect to the host vehicle.
  • the ECU 10 determines in step S11 whether the radar target LT is in a fusion state. For example, the ECU 10 makes an affirmative determination when the radar target LT and the image target GT are in a fusion state, that is, when the image target GT is included within a predetermined range in the coordinate system of the radar target LT.
  • if in the fusion state, the ECU 10 determines in step S12 whether the lateral position of the object specified by the fusion target is equal to or less than a predetermined first threshold Th1. Specifically, taking the center position in the vehicle width direction of the host vehicle M1 as the axis (own lane O), the ECU 10 determines whether the distance between the own lane O and the lateral position is equal to or less than the first threshold Th1.
  • if step S12 is affirmative, the ECU 10 determines in step S13 whether the TTC is equal to or less than the operation timing Th2 of the controlled object 30. If step S13 is affirmative, the ECU 10 operates the controlled object 30 in step S14. If step S12 or S13 is negative, the ECU 10 proceeds to step S22 and does not operate the controlled object 30.
  • if the determination in step S11 is negative, the ECU 10 determines in step S15 whether the image target GT has been lost from the fusion state (image-lost FSN). If the determination in step S15 is affirmative, the ECU 10 determines in step S16 whether the distance of the object specified by the radar target is equal to or less than a third threshold Th3. If the determination in step S16 is affirmative, the ECU 10 determines in step S17 whether the relative speed between the host vehicle and the object specified by the radar target is equal to or less than a fourth threshold Th4.
  • if step S17 is affirmative, the ECU 10 determines in step S18 whether the state in which the image target GT is lost has continued for a predetermined number of times (or cycles) or less. This determination is affirmative when, after switching from the fusion target to detection by the radar target LT, the state in which the object is detected only by the radar target LT has been repeated the predetermined number of times or less.
  • if step S18 is affirmative, the ECU 10 determines in step S19 whether the lateral position of the object specified by the radar target LT is equal to or less than a fifth threshold Th5.
  • the fifth threshold Th5 is set as a determination value for the approach state between the host vehicle and the object. If the determination in step S19 is affirmative, the ECU 10 determines in step S20 whether the lateral position of the object specified by the fusion target immediately before the image target GT was lost is equal to or less than a sixth threshold Th6. Note that the difference between the fifth threshold Th5 and the sixth threshold Th6 is set to be less than a predetermined value.
  • if both steps S19 and S20 are affirmative, the ECU 10 determines in step S21 whether the TTC is equal to or less than the operation timing Th2 of the controlled object 30.
  • if step S21 is affirmative, the ECU 10 proceeds to step S14 and operates the controlled object 30. If any of steps S15 to S21 is negative, the ECU 10 proceeds to step S22 and does not operate the controlled object 30.
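  • the branch structure of steps S11 to S22 above can be sketched as a single predicate; the threshold values here are placeholders and the state field names are our own:

```python
def should_operate(state,
                   Th1=1.0,      # S12: lateral position gate in the fusion state [m]
                   Th2=1.5,      # S13/S21: operation timing of the controlled object [s]
                   Th3=6.0,      # S16: short-distance threshold [m]
                   Th4=2.0,      # S17: relative-speed threshold [m/s]
                   Th5=1.0,      # S19: radar lateral position gate [m]
                   Th6=1.0,      # S20: gate on the last fusion-target lateral position [m]
                   max_lost=5):  # S18: allowed consecutive image-lost cycles
    """Return True when the controlled object 30 should be operated (S14),
    False otherwise (S22). `state` is a dict of the quantities named in
    the text; every threshold value is illustrative."""
    if state["fusion"]:                                     # S11: fusion state?
        return (state["fusion_lateral"] <= Th1              # S12
                and state["ttc"] <= Th2)                    # S13
    if not state["image_lost_from_fusion"]:                 # S15: image-lost FSN?
        return False
    return (state["radar_distance"] <= Th3                  # S16
            and state["relative_speed"] <= Th4              # S17
            and state["lost_count"] <= max_lost             # S18
            and state["radar_lateral"] <= Th5               # S19
            and state["last_fusion_lateral"] <= Th6         # S20
            and state["ttc"] <= Th2)                        # S21
```

  • note that the same operation timing Th2 gates both branches (S13 and S21); what changes after the image target is lost is only the extra set of reliability conditions S16 to S20.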
  • FIG. 1B shows functional blocks representing the functions of the ECU 10.
  • the ECU 10 includes a state determination unit 101, a distance determination unit 102, a lateral position acquisition unit 103, a lateral position determination unit 104, a relative speed determination unit 105, and a vehicle control unit 106.
  • the state determination unit 101 is a functional block that executes step S15 of the flowchart of FIG. 4, and determines whether the object has transitioned from a state in which it is detected by the fusion target to a state in which it is detected only by the first target information.
  • the distance determination unit 102 is a functional block that executes step S16, and determines whether the distance to the object, at the time the state determination unit 101 determines that the state has transitioned to detection by the first target information alone, is a predetermined short distance.
  • the lateral position acquisition unit 103 is a functional block that acquires the lateral position of the object in the vehicle width direction; when it is determined that the state has transitioned from detection of the object as a fusion target to detection by the first target information alone, it acquires the lateral position of the object in the vehicle width direction using the first target information. In addition, when the state transitions from detection of the object as a fusion target to detection by the first target information alone, the lateral position acquisition unit 103 acquires a first lateral position of the object using the fusion target immediately before the transition, and acquires a second lateral position of the object using the first target information immediately after the transition.
  • the lateral position determination unit 104 is a functional block that executes steps S19 and S20. When the distance determination unit 102 determines that the distance to the object is the predetermined short distance, the lateral position determination unit 104 determines whether the lateral position of the object in the vehicle width direction acquired by the lateral position acquisition unit 103 is in a predetermined approach state with respect to the host vehicle (that is, whether the lateral position of the object specified by the radar target LT is equal to or less than the fifth threshold Th5). If it is in the predetermined approach state with respect to the host vehicle, the unit further determines whether the difference between the first lateral position and the second lateral position is equal to or greater than a predetermined value.
  • the relative speed determination unit 105 is a functional block that executes Step S17, and determines whether the relative speed between the object and the host vehicle is smaller than a predetermined value (Th4).
  • the vehicle control unit 106 is a functional block that executes steps S14 and S22, and performs vehicle control on the object according to the flowchart of FIG.
  • by generating a fusion target from the first target information and the second target information, the detection accuracy of the object can be improved. However, when a part of the object (for example, the lower end) moves out of the imaging range as the host vehicle approaches the object, the fusion target is no longer detected, and vehicle control for the object using the fusion target may not be performed.
  • in that respect, when the state transitions from detection of the object as a fusion target to detection by the first target information alone, vehicle control for the object is performed on condition that the distance to the object is a predetermined short distance. Therefore, even after the object is no longer detected as a fusion target, vehicle control can be performed on an object detected with high reliability.
  • vehicle control for the object is executed on condition that the lateral position of the object is in a predetermined approach state with respect to the host vehicle. In this case, vehicle control is not performed if the possibility of collision between the object and the host vehicle is low; therefore, after the object is no longer detected as a fusion target, vehicle control can be performed on a highly reliable object while suppressing unnecessary vehicle control.
  • when the state transitions to detection by the first target information alone, the lateral position of the object in the vehicle width direction is acquired using the radar target LT, so the lateral position of the object at that time can be acquired with high accuracy.
  • the second embodiment will be described focusing on the differences from the first embodiment.
  • in the first embodiment, the speaker, the seat belt, the brake, and the like are operated as vehicle control, whereas in the second embodiment the brake is operated to avoid a collision with the object.
  • the ECU 10a (see also FIG. 1A) of the second embodiment further includes a pedestrian determination unit 107 that determines whether the object is a pedestrian; the state determination unit 101, the distance determination unit 102, the lateral position acquisition unit 103, the lateral position determination unit 104, and the relative speed determination unit 105 are common to the first embodiment.
  • in addition to the functions of the vehicle control unit 106 of the first embodiment, the vehicle control unit 106a of the present embodiment performs primary brake control based on a first margin, which is a collision margin with respect to the object, and performs secondary brake control based on a second margin, which is a collision margin smaller than the first margin.
  • brake control in two stages of pre-braking (FPB) and intervention braking (PB) has been proposed.
  • the FPB is activated when the lateral position of the object specified by the fusion target is equal to or less than the predetermined first threshold Th1 and the TTC is equal to or less than the FPB operation timing Th7. When the FPB operating condition is satisfied, the FPB flag is set to “1”.
  • the FPB is set so as not to operate when the TTC is equal to or less than the operation timing Th8 of the PB; in other words, the PB is operated preferentially in that situation.
  • the PB operating condition is set such that the PB operates when the lateral position of the object is equal to or less than the predetermined fifth threshold Th5 and the TTC is equal to or less than the PB operation timing Th8. When the PB operating condition is satisfied, the PB flag is set to “1”.
  • the TTC thresholds are set such that the FPB operation timing Th7 is larger than the PB operation timing Th8. That is, the TTC thresholds are set so that the FPB is performed in a situation where the collision margin is large, in other words, where the possibility of collision is low. Note that after the FPB is activated, its operation is stopped by a driver operation (for example, a steering or brake operation).
  • the FPB corresponds to the “primary brake control”, the PB corresponds to the “secondary brake control”, the operation timing Th7 corresponds to the “first margin”, and the operation timing Th8 corresponds to the “second margin”.
  • when it is determined that the image-lost FSN has occurred at a short distance, the PB is activated on condition that the FPB has been activated. That is, if the FPB has been activated in advance at the time the PB operating condition is satisfied, the PB is activated on the assumption that there is a possibility of collision; if the FPB has not been activated in advance, the possibility of collision is regarded as low and the PB is not activated. In other words, operation of the PB is permitted while the FPB is operating, and prohibited while the FPB is not operating.
  • there are cases where the FPB operating condition ceases to be satisfied and the FPB ends due to a temporary change in the behavior of the object (for example, the lateral position of the object temporarily exceeds the threshold). In this case, the PB operating condition may be satisfied when the lateral position of the object thereafter becomes equal to or less than the threshold again. In such a case, if it is within a predetermined time T from the end of the FPB, the reliability of the detected object is considered to be high.
  • therefore, a predetermined grace period is provided to allow execution of the PB: the PB is set to operate on the condition that there is a history that the FPB ended within the predetermined time T immediately before.
  • the brake experience flag is used as this history. The brake experience flag is set to “1” when the operation of the FPB ends, and is reset to “0” after the predetermined time T has elapsed.
  • FIG. 5A shows a logic circuit 40 for determining permission and prohibition of PB operation in the present embodiment.
  • the logic circuit 40 includes a NAND circuit C1 that receives the FPB flag and the PB flag as inputs, an AND circuit C2 that receives the output signal of the NAND circuit C1 and an inverted signal of the brake experience flag, and an AND circuit C3 that receives the output signal of the AND circuit C2, the control signal of the image lost FSN, and the short distance determination signal. If the signal output from the AND circuit C3 is “1”, the operation of the PB is prohibited, and if the signal is “0”, the operation of the PB is permitted.
  • the NAND circuit C1 outputs “0” when both the FPB flag and the PB flag are “1”, and outputs “1” when at least one of the FPB flag and the PB flag is “0”.
  • the AND circuit C2 outputs “1” only when the output signal of the NAND circuit C1 is “1” and the brake experience flag is “0”.
  • as the control signal of the image lost FSN, “1” is input when the image target GT is lost from the fusion state (image lost FSN), and “0” is input otherwise.
  • as the short distance determination signal, “1” is input when the distance of the object specified by the radar target LT is equal to or smaller than the third threshold Th3, and “0” is input when the distance is larger than the third threshold Th3.
  • the AND circuit C3 outputs “1” only when the output signal of the AND circuit C2 is “1”, the image lost FSN state is present, and the distance to the object is a short distance. That is, when the image lost state occurs at a short distance, if the FPB is not operating and there is no braking experience, the operation of the PB is prohibited.
  • FIGS. 6, 8, and 10 are plan views in which the TTC (collision margin time) is on the vertical axis and the lateral position with respect to the host vehicle in the vehicle width direction is on the horizontal axis.
  • on the vertical axis, the FPB operation timing Th7 and the PB operation timing Th8 are provided. When the TTC for the object falls below the respective threshold of the FPB or PB, it is determined that the corresponding operation timing has been reached.
  • the horizontal axis is provided with an operating width W, which is the operating range of the FPB and PB in the vehicle width direction.
  • the operating width W is set by adding a predetermined length to the width of the host vehicle. In FIGS. 6, 8, and 10, when a pedestrian as the object is located within the operating width W, the predetermined lateral position condition (Th1, Th5) is satisfied. FIGS. 6, 8, and 10 assume a scene in which a pedestrian approaches the host vehicle along the broken arrow while the host vehicle travels in the traveling direction. FIGS. 7, 9, and 11 show timing charts corresponding to FIGS. 6, 8, and 10, respectively.
  • FIGS. 6 and 7 show a scene in which the PB operates while the FPB is operating, after the FPB has been activated.
  • the FPB flag is set to “1” and the FPB is operated.
  • the distance between the pedestrian and the host vehicle further decreases, and at timing t12, the state transitions to the image lost FSN state (“0” → “1”).
  • the short distance determination signal is “1”
  • the output signal of the AND circuit C3 is “1” (the same applies to timing t22 in FIG. 9 and timing t32 in FIG. 11).
  • the PB flag is set to “1”, the FPB is ended, and the PB is operated.
  • when the PB flag is set to “1” while the FPB flag is “1”, the output signal of the NAND circuit C1 becomes “0”. Accordingly, the output signals of the AND circuit C2 and the AND circuit C3 become “0”, so the operation of the PB is permitted. That is, in the image lost FSN state, when it is determined that the distance to the object is a short distance, the PB is executed on the condition that the FPB is being executed, that is, that the FPB flag is “1”.
  • FIGS. 8 and 9 show scenes in which the FPB is temporarily ended after the FPB is activated, and then the PB is activated.
  • the FPB is activated at timing t21, and transitions to the image lost FSN state at timing t22 (“0” ⁇ “1”).
  • the FPB flag is reset to “0” and the FPB ends.
  • the brake experience flag is set to “1”.
  • the object is detected again within the operating width W.
  • the brake experience flag is maintained at “1”, whereby the PB is activated.
  • in the logic circuit 40, when the PB flag is set to “1”, the brake experience flag is already “1”; based on this, the output signals of the AND circuit C2 and the AND circuit C3 become “0”, and the operation of the PB is permitted. That is, in the image lost FSN state, when it is determined that the distance to the object is a short distance, the PB is performed on the condition that there is a history that the FPB ended within the predetermined time T immediately before, that is, that the brake experience flag is “1”.
  • the TTC becomes equal to or less than the operation timing Th7 at timing t31, but since the pedestrian is not detected within the operating width W, the FPB flag is maintained at “0”.
  • the state is changed to the image lost FSN state ("0" ⁇ "1").
  • the TTC becomes equal to or less than the operation timing Th8.
  • the PB flag is maintained at “0”.
  • the PB flag is set to “1”.
  • FIG. 12 shows a logic circuit 50 obtained by further adding a pedestrian determination signal to the logic circuit 40.
  • a pedestrian determination signal is input to the AND circuit C3.
  • as the pedestrian determination signal, “1” is input when it is determined that the object is a pedestrian, and “0” is input when it is determined that the object is not a pedestrian.
  • when the object is not a pedestrian, the circuit acts so as to prohibit the operation of the PB, so that unnecessary operation of the PB can be suppressed.
  • the operating width W can be made variable according to the type of object, the lateral speed of the object, and the like.
  • the operating width W and the lateral speed of the object have a relationship as shown in FIG. 13. As shown in FIG. 13, while the lateral speed of the object is equal to or lower than a predetermined value, the operating width W increases as the lateral speed increases.
  • when the lateral speed exceeds the predetermined value, the operating width W becomes constant at the upper limit value.
  • the present embodiment can achieve the following effects.
  • even after the image target GT is lost, the vehicle control of the host vehicle is continued. This suppresses the occurrence of inconveniences such as the vehicle control being suddenly stopped and the operation of the controlled object 30 being abruptly interrupted.
  • the ECU 10 may perform the vehicle control under an additional condition. For example, the vehicle control after the image target GT is lost may be performed when the distance between the own lane O and the lateral position of the object at the time the image target GT was lost is within a predetermined value.
  • the orientation of the imaging center axis of the image sensor 21 can change according to a change in the loaded weight of the host vehicle, and therefore the short-distance position at which the image target GT is lost also changes. Accordingly, in the flowchart of FIG. 4, the third threshold Th3 of the distance used in the determination of S16 may be variably set according to the change in the orientation of the imaging center axis of the image sensor 21. The change in the imaging center axis may be obtained based on the detection value of a weight sensor provided in the vehicle. In this case, as the loaded weight of the vehicle increases, the rear side of the vehicle sinks relative to the front side and the imaging center axis tilts upward, so the third threshold Th3 is reduced.
  • the inter-vehicle distance at which the image target GT is lost can be determined more accurately.
  • the inter-vehicle distance at which the image target GT is lost may be obtained in consideration of the fact that the mounting height of the image sensor 21 changes according to the change in the loaded weight of the host vehicle.
  • TTC is used as the operation timing of PB and FPB.
  • however, the index is not limited to the TTC as long as it represents a collision margin; for example, a configuration may be used in which a distance to the object based on the TTC is set and used.
  • the object to be controlled by the vehicle is a pedestrian.
  • the object is not limited to the pedestrian, but may be another vehicle, an obstacle on the road, or the like.
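The permit/prohibit decision of the logic circuit 40 described above (NAND circuit C1 and AND circuits C2 and C3) can be sketched as plain boolean logic. This is an illustrative sketch only; the function name and the 0/1 signal encoding are our assumptions, not identifiers from the specification.

```python
def pb_prohibited(fpb_flag, pb_flag, brake_experience_flag,
                  image_lost_fsn, short_distance):
    """Sketch of logic circuit 40 (FIG. 5A).

    C1 = NAND(FPB flag, PB flag)
    C2 = AND(C1, NOT brake experience flag)
    C3 = AND(C2, image-lost-FSN signal, short-distance signal)
    A C3 output of 1 prohibits PB operation; 0 permits it.
    """
    c1 = 0 if (fpb_flag and pb_flag) else 1               # NAND circuit C1
    c2 = 1 if (c1 and not brake_experience_flag) else 0   # AND circuit C2
    c3 = 1 if (c2 and image_lost_fsn and short_distance) else 0  # AND circuit C3
    return bool(c3)
```

With this sketch, the scene of FIGS. 6 and 7 (FPB flag “1” when the PB flag is set) and the scene of FIGS. 8 and 9 (brake experience flag “1”) both permit the PB, while the scene of FIGS. 10 and 11 (neither flag set) prohibits it.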


Abstract

A vehicle control device (10, 10a) creates a fusion target by fusing a radar target of an object ahead of a host vehicle, acquired as a reflected wave of a carrier wave, with an image target of the object, acquired by image processing of an image of the area ahead of the host vehicle, and controls the host vehicle with respect to the object detected as the fusion target. The device assesses whether the object has transitioned from being detected using the fusion target to being detected using only the radar target and, when the object is determined to have so transitioned, assesses whether the distance to the object is a prescribed near distance. When the distance to the object is determined to be the prescribed near distance, vehicle control in relation to the object is carried out.

Description

Vehicle control apparatus and vehicle control method

Cross-reference of related applications
 This application is based on Japanese Patent Application No. 2015-121397 filed on June 16, 2015 and Japanese Patent Application No. 2016-112096 filed on June 3, 2016, the contents of which are incorporated herein by reference.
 The present invention relates to a vehicle control device and a vehicle control method for performing vehicle control with respect to an object ahead of the host vehicle.
 A technique is known in which a radar target acquired by a radar sensor and an image target acquired by an image sensor are collated, and when it is determined that the radar target and the image target originate from the same object, the radar target and the image target are fused to generate a new target (fusion target). By generating this fusion target, the recognition accuracy of an object such as a preceding vehicle ahead of the host vehicle can be improved. Then, by using the position information of the object specified using the fusion target, vehicle control of the host vehicle with respect to the object can be appropriately performed (see Patent Document 1).
JP 2005-145396 A
 However, when the host vehicle approaches the object, a part (the lower end) of the object falls outside the imaging angle of view of the image sensor, so that the fusion target may no longer be generated. In this case, the inconvenience arises that vehicle control cannot be performed on the object that had been specified by the fusion target.
 The present invention has been made in view of the above, and its main object is to provide a vehicle control device and a vehicle control method capable of appropriately performing vehicle control on an object ahead of the host vehicle.
 A vehicle control device according to an aspect of the present invention fuses first target information of an object ahead of the host vehicle, acquired as a reflected wave of a carrier wave, with second target information of the object, acquired by image processing of a captured image of the area ahead of the host vehicle, to generate a fusion target, and performs vehicle control of the host vehicle with respect to the object detected as the fusion target. The device includes: a state determination unit that determines whether the object has transitioned from a state of being detected as the fusion target to a state of being detected only by the first target information; a distance determination unit that determines whether the distance to the object at the time the state determination unit determines that the transition has occurred is a predetermined short distance; and a vehicle control unit that performs vehicle control with respect to the object when the distance determination unit determines that the distance to the object is the predetermined short distance.
 According to the present invention, when the second target information can no longer be acquired and the object transitions from being detected as a fusion target to being detected only by the first target information, vehicle control with respect to the object is performed on the condition that the distance to the object is a predetermined short distance. Therefore, even after the object is no longer detected as a fusion target, vehicle control can be performed on an object with high reliability.
Brief description of the drawings:
Schematic configuration diagram of the vehicle control device in the first and second embodiments.
Functional block diagram of the ECU in the first embodiment.
Explanatory diagram of the relationship between the inter-vehicle distance and the image lost.
Diagram showing the relationship between the relative speed and the collision margin time.
Flowchart of the vehicle control.
Configuration diagram of the logic circuit for determining permission and prohibition of the PB.
Functional block diagram of the ECU in the second embodiment.
Plan view when the host vehicle approaches an object.
Timing chart showing a mode in which the operation of the PB is permitted.
Plan view when the host vehicle approaches an object.
Timing chart showing a mode in which the operation of the PB is permitted.
Plan view when the host vehicle approaches an object.
Timing chart showing a mode in which the operation of the PB is prohibited.
Configuration diagram of the logic circuit for determining permission and prohibition of the PB in another example.
Diagram showing the relationship between the operating width W and the lateral speed of the object.
 Hereinafter, embodiments of the present invention will be described in more detail with reference to the accompanying drawings. However, the present invention may be embodied in many different forms and should not be construed as limited to the embodiments set forth herein. Rather, these embodiments are provided so that this disclosure will be thorough and complete, and will fully convey the scope of the invention to those skilled in the art. Note that like reference numerals denote like components throughout the drawings.
 (First embodiment)
 A vehicle control system 100 according to the present embodiment is mounted on a vehicle, detects an object existing ahead of the vehicle, and performs various controls to avoid or mitigate a collision with the object; that is, it functions as a PCS (pre-crash safety system). In the following description, the vehicle equipped with the vehicle control system 100 is referred to as the host vehicle.
 As shown in FIG. 1A, the vehicle control system 100 includes an ECU 10, various sensors 20, and controlled objects 30.
 The various sensors 20 include, for example, an image sensor 21, a radar sensor 22, and a vehicle speed sensor 23.
 The image sensor 21 is a CCD camera, a monocular camera, a stereo camera, or the like, and is installed near the upper end of the windshield of the host vehicle. At predetermined time intervals, the image sensor 21 captures an image of a region extending over a predetermined range ahead of the host vehicle. By image processing the captured image, it acquires an object ahead of the host vehicle as target information (image target GT) and outputs it to the ECU 10.
 The image target GT includes information such as the lateral width of the object, in addition to the distance and relative speed to the object in the traveling direction of the host vehicle and the lateral position representing the position in the vehicle width direction of the host vehicle. Therefore, the ECU 10 recognizes the image target GT as information having a predetermined width.
 The radar sensor 22 detects an object ahead of the host vehicle as target information (radar target LT) using directional electromagnetic waves such as millimeter waves or laser light, and is mounted at the front of the host vehicle so that its optical axis faces forward. The radar sensor 22 scans, at predetermined intervals, a region extending over a predetermined range ahead of the vehicle with a radar signal, and receives the electromagnetic waves reflected from the surface of an object outside the vehicle, thereby acquiring the distance to the object, the relative speed with respect to the object, and the like as target information, which it outputs to the ECU 10.
 The radar target LT includes information such as the distance and relative speed to the object in the traveling direction of the host vehicle and the lateral position representing the position in the vehicle width direction of the host vehicle. Note that the radar target LT corresponds to the first target information, and the image target GT corresponds to the second target information.
 The vehicle speed sensor 23 is provided on a rotating shaft that transmits power to the wheels of the host vehicle, and obtains the host vehicle speed, that is, the speed of the host vehicle, based on the rotational speed of the rotating shaft.
 The ECU 10 is an electronic control unit that controls the entire vehicle control system 100; it is configured mainly of a CPU and includes a ROM, a RAM, and the like. The ECU 10 detects an object ahead of the host vehicle (a vehicle, a road obstacle, another vehicle, or the like) by fusing the image target GT and the radar target LT.
 Specifically, the position of the fusion target in the traveling direction of the host vehicle is specified from the distance and relative speed of the radar target LT, and the position of the fusion target in the vehicle width direction of the host vehicle is specified from the lateral width and lateral position of the image target GT. When a fusion target is generated using the radar target LT and the image target GT in this way and the position of the object is specified by the fusion target, the position of the object is specified using whichever of the information acquired by the radar sensor 22 and the image sensor 21 is more accurate, so the recognition accuracy of the position of the object can be improved.
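The fusion step described above (longitudinal position and relative speed taken from the radar target LT, lateral position and width taken from the image target GT) can be sketched as follows. The class and field names are illustrative assumptions, not identifiers from the specification.

```python
from dataclasses import dataclass

@dataclass
class RadarTarget:          # first target information (LT)
    distance: float         # longitudinal distance [m]
    relative_speed: float   # closing speed [m/s]
    lateral_position: float # lateral position [m]

@dataclass
class ImageTarget:          # second target information (GT)
    distance: float
    relative_speed: float
    lateral_position: float
    width: float            # lateral width of the object [m]

@dataclass
class FusionTarget:
    distance: float
    relative_speed: float
    lateral_position: float
    width: float

def fuse(lt: RadarTarget, gt: ImageTarget) -> FusionTarget:
    """Take the longitudinal quantities from the radar target and the
    lateral quantities from the image target, i.e. the more accurate
    source for each quantity."""
    return FusionTarget(distance=lt.distance,
                        relative_speed=lt.relative_speed,
                        lateral_position=gt.lateral_position,
                        width=gt.width)
```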
 The ECU 10 also performs known image processing such as template matching on the captured image acquired from the image sensor 21 to identify the type of the object detected as the image target GT (another vehicle, a pedestrian, a road obstacle, or the like). In the present embodiment, a plurality of dictionaries, which are image patterns indicating the features of each type of object, are stored in the ROM as templates for identifying the type of each object. As the dictionaries, both a whole-body dictionary, in which the features of the entire object are patterned, and a half-body dictionary, in which partial features of the object are patterned, are stored. Information on the type of object recognized by the image sensor 21 is also input to the ECU 10.
 In the present embodiment, the recognition accuracy of the object is enhanced by generating a fusion target on the condition that the type of the object detected as the image target GT has been identified with the whole-body dictionary. That is, when the reliability of the object is low, for example, when the type of the object detected as the image target GT has been identified only with the half-body dictionary, that image target GT is not used for generating a fusion target.
 The ECU 10 then determines whether there is a possibility that the object recognized as the fusion target and the host vehicle will collide. Specifically, of the lateral position of the fusion target and the lateral position of the image target GT, the lateral position closest to the host vehicle is selected as the lateral position subject to control. Then, based on the approach state between the selected lateral position of the object and the host vehicle, it is determined whether there is a possibility of a collision between the host vehicle and the object.
 When it is determined that there is a possibility of a collision, a time to collision TTC (collision margin time) with respect to the object is calculated by a method such as dividing the distance in the traveling direction between the object and the host vehicle by the relative speed with respect to the object. The relative speed is obtained by subtracting the vehicle speed of the host vehicle from the vehicle speed of the preceding vehicle. The TTC is an evaluation value indicating after how many seconds the host vehicle would collide with the object if it continued traveling at its current speed; the smaller the TTC, the higher the risk of collision, and the larger the TTC, the lower the risk of collision. Note that the TTC may be calculated taking relative acceleration into account.
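As a minimal sketch of the TTC calculation described above (distance in the traveling direction divided by the relative speed, with the relative speed defined as the preceding vehicle's speed minus the host vehicle's speed), assuming SI units and ignoring relative acceleration:

```python
def time_to_collision(distance_m, host_speed_mps, preceding_speed_mps):
    """Collision margin time (TTC) in seconds.

    The relative speed is negative when the gap is closing; when the gap
    is not closing (relative speed >= 0), no collision is predicted,
    represented here as infinity.
    """
    relative_speed = preceding_speed_mps - host_speed_mps
    if relative_speed >= 0:
        return float("inf")
    return distance_m / -relative_speed
```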
 The ECU 10 then compares the TTC with the operation timing of each controlled object 30, and if the TTC is equal to or less than the operation timing, operates the corresponding controlled object 30.
 As the controlled objects 30, a speaker, a seat belt, a brake, and the like are provided, and a predetermined operation timing is set for each controlled object 30. The ECU therefore compares the TTC with the operation timing of each controlled object 30 and operates the corresponding controlled object 30 when the TTC falls to or below the operation timing.
 Specifically, if the TTC falls to or below the operation timing of the speaker, a warning is issued to the driver through the speaker. If the TTC falls to or below the operation timing of the seat belt, control to wind up the seat belt is performed. If the TTC falls to or below the operation timing of the brake, the automatic brake is operated to perform control for reducing the collision speed. In this way, a collision between the host vehicle and the object is avoided or mitigated.
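The staged actuation described above can be sketched as follows; the numeric operation timings are illustrative placeholders, not values from the specification.

```python
def actuate(ttc, speaker_timing=3.0, seatbelt_timing=2.0, brake_timing=1.5):
    """Return the list of controlled objects whose operation timing the
    given TTC has reached; each stage fires once TTC falls to or below
    its own timing, so lower TTC triggers more stages."""
    actions = []
    if ttc <= speaker_timing:
        actions.append("warn driver via speaker")
    if ttc <= seatbelt_timing:
        actions.append("wind up seat belt")
    if ttc <= brake_timing:
        actions.append("apply automatic brake")
    return actions
```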
 Incidentally, when the host vehicle approaches an object from a state in which the object is recognized as a fusion target, the lower end of the object may move out of the imaging range of the image sensor 21, so that the object is no longer recognized as a fusion target (the image target GT is lost).
 FIG. 2 is an explanatory diagram of the relationship between the distance from the host vehicle to the object and the loss of the image target GT. In the figure, the imaging angle of view of the image sensor 21 is shown as θ1. First, when the distance between the host vehicle M1 and the object (preceding vehicle M2) is d1, the entire rear end of the preceding vehicle M2 is included in the imaging angle of view θ1 of the image sensor 21, so the type of the image target GT is identified with the whole-body dictionary and a fusion target can be generated. However, when the distance between the host vehicle and the preceding vehicle M2 decreases to d2 (< d1), the lower end side of the rear end of the preceding vehicle M2 falls outside the imaging angle of view θ1 of the image sensor 21, so the type of the image target GT can no longer be identified with the whole-body dictionary and a fusion target can no longer be generated.
 However, an object that has once been recognized as a fusion target has a high reliability of existing, so it is preferable that vehicle control for that object can still be performed even after it is no longer recognized as a fusion target.
 Therefore, when the image target GT is lost from the state in which the object is recognized by the fusion target, that is, when the situation transitions to one in which the object is recognized only by the radar target LT, vehicle control is performed on the object recognized only by the radar target LT on the condition that the distance between the host vehicle and the object is a predetermined short distance.
 That is, when the image target GT is lost, if the distance between the host vehicle and the object is the predetermined short distance, the controlled object 30 is operated when the TTC for the object recognized only by the radar target LT reaches the operation timing of the controlled object 30. On the other hand, if the distance between the host vehicle and the object is not the predetermined short distance, the controlled object 30 is not operated even if the TTC for the object recognized only by the radar target LT reaches the operation timing of the controlled object 30.
 The predetermined short distance is the distance at which the lower end of the object becomes invisible, and may be set for each vehicle model or the like, taking into account the mounting height, mounting angle, and so on of the image sensor 21. As described above, even if the image target GT is lost, vehicle control can be performed on the highly reliable object that had been detected as a fusion target.
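The dependence of this short distance on the mounting geometry of the image sensor 21 can be illustrated with a simple camera model. This is our own geometric assumption for illustration (a pinhole camera whose lower field-of-view boundary is a straight ray), not a formula from the specification.

```python
import math

def image_lost_distance(camera_height_m, lower_view_angle_deg,
                        target_bottom_height_m=0.0):
    """Horizontal distance below which the object's lower end leaves the
    camera's field of view: the lower edge of the imaging angle is a ray
    tilted down by lower_view_angle_deg from horizontal; closer than
    where that ray meets the object's lower end, the full body can no
    longer be imaged."""
    angle = math.radians(lower_view_angle_deg)
    return (camera_height_m - target_bottom_height_m) / math.tan(angle)
```

For example, a camera mounted 1.2 m high whose field of view reaches 45 degrees below horizontal loses the ground-level lower end of an object at about 1.2 m; a shallower downward angle increases this lost distance.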
 If the relative speed between the object and the host vehicle is large, the TTC takes a small value, so it is highly likely that operation of the controlled object 30 by vehicle control has already started before the image target GT is lost. In other words, a situation in which operation of the controlled object 30 by vehicle control has not started at the time the image target GT is lost is limited to situations in which the relative speed between the object and the host vehicle is small and the TTC takes a large value.
 Here, the relationship between the relative speed and the TTC will be described in detail with reference to FIG. 3. In the figure, the vertical axis is the relative speed and the horizontal axis is the TTC. As illustrated, the larger the relative speed, the smaller the value of TTC down to which the object can be detected as a fusion target, so it is more likely that the controlled object 30 is operated by vehicle control for the object detected as a fusion target before the image target GT is lost. On the other hand, when the relative speed is small, the object can be detected as a fusion target only down to a larger value of TTC, so there is a high possibility that the image target GT is lost before the controlled object 30 is operated by vehicle control.
 Therefore, in the present embodiment, vehicle control is performed on an object whose recognition has switched from the fusion target to the radar target LT alone, on the conditions that the object is within the predetermined short distance of the host vehicle and, in addition, that the relative speed between the object and the host vehicle is below a predetermined value.
 Furthermore, after a predetermined time has elapsed since the object could no longer be specified by the fusion target (i.e., since the image target GT was lost), the reliability of an object specified by the radar target alone gradually decreases. As described above, when the image target GT is lost, vehicle control is performed on the conditions that the host vehicle and the object are at a short distance and the relative speed is small, so it is likely that the vehicle has already come to a stop by the time the predetermined time elapses after the loss. Accordingly, vehicle control is performed if the time elapsed since the object ceased to be specified by the fusion target is within the predetermined time, and is not performed if the elapsed time exceeds it.
 Furthermore, in the present embodiment, when the image target GT is lost and a transition is made to a state in which the object is recognized only by the radar target LT, the collision determination is performed using the lateral position of the object obtained from the radar target LT instead of the lateral position obtained from the fusion target.
 At this time, it is determined whether the difference between the lateral position of the object specified by the fusion target immediately before the image target GT was lost and the lateral position of the object specified by the radar target LT immediately after the transition to radar-only detection is less than a predetermined value. If the difference between these two lateral positions is large, vehicle control is not performed: a large difference means that the reliability of the object that had been specified by the fusion target is low, so in that case vehicle control using the radar target is not performed.
 Note that if the lateral position of the object is in a predetermined approaching state with respect to the host vehicle, the possibility of a collision between the object and the host vehicle is high; if it is not, the possibility of a collision is low. Therefore, in the present embodiment, vehicle control is performed on the condition that the lateral position of the object is in the predetermined approaching state with respect to the host vehicle.
 By the above, even if the image target GT is lost, vehicle control can be appropriately performed on the highly reliable object that had been detected as a fusion target.
 Next, an execution example of the above processing by the ECU 10 is described with reference to the flowchart of FIG. 4. The following processing is repeated by the ECU 10 at a predetermined cycle for each radar target LT while radar targets LT are being acquired.
 First, in step S11, the ECU 10 determines whether the radar target LT is in the fusion state. For example, the ECU 10 makes an affirmative determination when the radar target LT and the image target GT are in the fusion state, that is, when an image target GT is contained within a predetermined range in the coordinate system of the radar target LT.
 In the fusion state, the ECU 10 determines in step S12 whether the lateral position of the object specified by the fusion target is equal to or less than a predetermined first threshold Th1. Specifically, taking the center position of the host vehicle M1 in the vehicle width direction as the axis (own lane O), the ECU 10 determines whether the distance between the own lane O and the lateral position is equal to or less than the first threshold Th1.
 If step S12 is affirmative, the ECU 10 determines in step S13 whether the TTC is equal to or less than the actuation timing Th2 of the controlled object 30. If step S13 is affirmative, the ECU 10 actuates the controlled object 30 in step S14. If step S12 or S13 is negative, the ECU 10 proceeds to step S22 and does not actuate the controlled object 30.
 On the other hand, if step S11 is negative, the ECU 10 determines in step S15 whether the image target GT has been lost from the fusion state (image-lost FSN). If step S15 is affirmative, the ECU 10 determines in step S16 whether the distance of the object specified by the radar target is equal to or less than a third threshold Th3. If step S16 is affirmative, the ECU 10 determines in step S17 whether the relative speed between the host vehicle and the object specified by the radar target is equal to or less than a fourth threshold Th4.
 If step S17 is affirmative, the ECU 10 determines in step S18 whether the state in which the image target GT has been lost has continued for no more than a predetermined number of cycles. This step is affirmed when, after detection switched from the fusion target to the radar target LT alone, the radar-only detection of the object has been repeated no more than the predetermined number of times. Using the continuation of the image-lost state as a determination condition makes it possible to distinguish this case from one in which the image target GT was lost due to disturbance or other external influences.
 If step S18 is affirmative, the ECU 10 determines in step S19 whether the lateral position of the object specified by the radar target LT is equal to or less than a fifth threshold Th5, which is set as the determination value for the approaching state between the vehicle and the object. If step S19 is affirmative, the ECU 10 determines in step S20 whether the lateral position of the object specified by the fusion target immediately before the image target GT was lost is equal to or less than a sixth threshold Th6. The difference between the fifth threshold Th5 and the sixth threshold Th6 is set to be less than a predetermined value; therefore, when both steps S19 and S20 are affirmative, the difference between the lateral position of the object specified by the radar target LT and that specified by the fusion target is less than the predetermined value. If step S20 is affirmative, the ECU 10 determines in step S21 whether the TTC is equal to or less than the actuation timing Th2 of the controlled object 30. If step S21 is affirmative, the ECU 10 proceeds to step S14 and actuates the controlled object 30. If any of steps S15 to S21 is negative, the ECU 10 proceeds to step S22 and does not actuate the controlled object 30.
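The decision sequence of steps S11 to S22 can be sketched as a single function. This is a non-authoritative Python sketch: the function name, argument names, and the `Thresholds` bundle are hypothetical, chosen only to mirror the step labels above.

```python
from dataclasses import dataclass

@dataclass
class Thresholds:
    th1: float  # S12: fusion lateral-position limit (m)
    th2: float  # S13/S21: actuation timing of controlled object 30, TTC (s)
    th3: float  # S16: short-distance limit after image loss (m)
    th4: float  # S17: relative-speed limit (m/s)
    th5: float  # S19: radar lateral-position limit (m)
    th6: float  # S20: pre-loss fusion lateral-position limit (m)
    max_lost_cycles: int  # S18: allowed cycles in the image-lost state

def should_actuate(fusion_state: bool, image_lost_fsn: bool,
                   lateral_fusion: float, lateral_radar: float,
                   lateral_fusion_pre_loss: float, distance: float,
                   rel_speed: float, ttc: float, lost_cycles: int,
                   th: Thresholds) -> bool:
    """Returns True when the controlled object is actuated (step S14),
    False when it is not (step S22)."""
    if fusion_state:                                  # S11 affirmative
        return lateral_fusion <= th.th1 and ttc <= th.th2  # S12, S13
    if not image_lost_fsn:                            # S15: lost from fusion?
        return False
    if distance > th.th3:                             # S16: short distance
        return False
    if rel_speed > th.th4:                            # S17: low relative speed
        return False
    if lost_cycles > th.max_lost_cycles:              # S18: loss not prolonged
        return False
    if lateral_radar > th.th5:                        # S19: radar lateral pos.
        return False
    if lateral_fusion_pre_loss > th.th6:              # S20: pre-loss lateral pos.
        return False
    return ttc <= th.th2                              # S21: actuation timing
```

For example, with hypothetical thresholds, a radar-only object at 4 m, closing at 2 m/s, with small lateral positions and a short loss history would satisfy every branch and return True.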
 FIG. 1B shows functional blocks representing the functions of the ECU 10. The ECU 10 includes a state determination unit 101, a distance determination unit 102, a lateral position acquisition unit 103, a lateral position determination unit 104, a relative speed determination unit 105, and a vehicle control unit 106.
 The state determination unit 101 is a functional block that executes step S15 of the flowchart of FIG. 4; it determines whether a transition has occurred from a state in which the object is detected by the fusion target to a state in which the object is detected only by the first target information.
 The distance determination unit 102 is a functional block that executes step S16; it determines whether the distance to the object at the time the state determination unit 101 determined that a transition to detection by the first target information alone had occurred is within the predetermined short distance.
 The lateral position acquisition unit 103 is a functional block that acquires the lateral position of the object in the vehicle width direction. After it is determined that a transition has occurred from the state in which the object is detected as a fusion target to the state in which it is detected only by the first target information, this unit acquires the lateral position of the object in the vehicle width direction using the first target information. In addition, when this transition occurs, the lateral position acquisition unit 103 acquires a first lateral position of the object using the fusion target immediately before the transition, and a second lateral position of the object using the first target information immediately after the transition.
 The lateral position determination unit 104 is a functional block that executes steps S19 and S20. When the distance determination unit 102 determines that the distance to the object is within the predetermined short distance, it determines whether the lateral position of the object in the vehicle width direction acquired by the lateral position acquisition unit is in the predetermined approaching state with respect to the host vehicle (that is, whether the lateral position of the object specified by the radar target LT is equal to or less than the fifth threshold Th5), and, if so, further determines whether the difference between the first lateral position and the second lateral position is equal to or greater than a predetermined value.
 The relative speed determination unit 105 is a functional block that executes step S17; it determines whether the relative speed between the object and the host vehicle is smaller than a predetermined value (Th4).
 The vehicle control unit 106 is a functional block that executes steps S14 and S22, and performs vehicle control with respect to the object in accordance with the flowchart of FIG. 4.
 The above configuration achieves the following effects.
 (1) Detecting an object using a fusion target generated by fusing the radar target LT and the image target GT improves the detection accuracy for the object. However, when the host vehicle and the object come close, part of the object (for example, its lower end) may fall outside the captured image, and the image target GT is no longer acquired. In that case, since the fusion target is no longer generated, vehicle control with respect to the object using the fusion target may become impossible.
 On the other hand, since an object detected as a fusion target before the host vehicle and the object came close has high reliability, it is preferable that vehicle control be performed on that object even if the fusion target ceases to be detected as the host vehicle approaches it.
 Therefore, when a transition occurs from a state in which the object is detected by the fusion target to a state in which it is detected only by the radar target LT because the image target GT can no longer be acquired, vehicle control with respect to the object is performed on the condition that the distance to the object is within the predetermined short distance. As a result, even after the object ceases to be detected as a fusion target, vehicle control can be performed on this highly reliable object.
 (2) When a transition has occurred from the state in which the object is detected as a fusion target to the state in which it is detected only by the radar target LT, and the distance between the host vehicle and the object has been determined to be within the predetermined short distance, the possibility of a collision between the object and the host vehicle is high if the lateral position of the object at that time is in the predetermined approaching state with respect to the host vehicle, and low if it is not.
 Therefore, when the transition to radar-only detection is determined and the distance between the host vehicle and the object is determined to be within the predetermined short distance, vehicle control with respect to the object is performed on the condition that the lateral position of the object is in the predetermined approaching state with respect to the host vehicle. In this case, vehicle control is not performed if the collision possibility between the object and the host vehicle is low, so after the object ceases to be detected as a fusion target, vehicle control can be performed on the highly reliable object while suppressing unnecessary vehicle control.
 (3) After the transition from the state in which the object is detected as a fusion target to the state in which it is detected only by the radar target LT, the lateral position of the object in the vehicle width direction is obtained using the radar target LT, so the lateral position of the object at that point can be acquired with good accuracy.
 (4) When, at the transition from the fusion-target state to radar-only detection, the difference between the lateral position of the object acquired as the fusion target and that acquired as the radar target LT is large, it is likely that the object has moved. In that case, unnecessary vehicle control is suppressed by not performing vehicle control of the host vehicle.
 (5) As the relative speed between the object and the host vehicle increases, the collision margin time TTC, calculated by dividing the distance by the relative speed, decreases, so vehicle control is more likely to be started before the fusion target ceases to be detected. In other words, a situation in which the image target GT becomes undetectable before vehicle control starts is limited to one in which the relative speed between the object and the host vehicle is small and the collision margin time TTC is large. Therefore, when the object is determined to be at the predetermined short distance, vehicle control of the host vehicle is performed on the condition that the relative speed is small; after the object ceases to be detected as a fusion target, unnecessary vehicle control can thus be suppressed and vehicle control can be performed on the highly reliable object.
 (6) After a predetermined time has elapsed since the transition to the state in which the object is detected only by the radar target LT, the reliability of the object decreases. In that case, vehicle control with respect to the object is not performed regardless of whether the distance to the object is within the predetermined short distance, so unnecessary vehicle control on an object of reduced reliability can be avoided.
 (7) When the object ceases to be detected as a fusion target for a reason other than the distance to the object being short, vehicle control is not performed, so unnecessary vehicle control can be avoided in a situation where the reliability of the object detection may have decreased.
 (Second Embodiment)
 Next, the second embodiment will be described, focusing on its differences from the first embodiment. In the first embodiment, when the distance to the object is short at the time the image target GT is lost, a speaker, seat belt, brake, and the like are actuated as vehicle control; in the second embodiment, the brake is actuated in such a case so as to avoid a collision with the object.
 As shown in FIG. 5B, the ECU 10a of the second embodiment (see also FIG. 1A) further includes a pedestrian determination unit 107 that determines whether the object is a pedestrian, while the state determination unit 101, distance determination unit 102, lateral position acquisition unit 103, lateral position determination unit 104, and relative speed determination unit 105 are common to the first embodiment. As described later in detail, the vehicle control unit 106a of the present embodiment, in addition to the functions of the vehicle control unit 106 of the first embodiment, performs primary brake control based on a first margin, which is a collision margin with respect to the object, and performs secondary brake control based on a second margin whose collision margin is smaller than the first margin.
 Here, as the brake control for collision avoidance performed by the vehicle control unit 106a, two-stage brake control consisting of preliminary braking (FPB) and intervention braking (PB) has been proposed. Specifically, when a possibility of collision with an object arises, relatively weak braking is first applied as FPB. This FPB actuation hastens the buildup of the PB braking force and alerts the driver. If the driver performs no avoidance operation despite the FPB actuation, stronger braking is applied as PB. That is, PB is performed in situations where the collision possibility is higher than for FPB, and the braking force of PB is generally set larger than that of FPB. Since PB thus operates in more limited situations, its actuation conditions are set to be stricter, that is, harder to satisfy, than those of FPB.
 In the present embodiment, the FPB actuation condition is set such that FPB is actuated when the lateral position of the object specified by the fusion target is equal to or less than the predetermined first threshold Th1 and the TTC is equal to or less than the FPB actuation timing Th7. When the condition is satisfied, the FPB flag is set to "1". However, FPB is set not to actuate once the TTC is at or below the PB actuation timing Th8; under those circumstances PB is given priority.
 On the other hand, the PB actuation condition is set such that PB is actuated when the lateral position of the object is equal to or less than the predetermined fifth threshold Th5 and the TTC is equal to or less than the PB actuation timing Th8. When the condition is satisfied, the PB flag is set to "1".
 Here, the TTC thresholds are set such that the FPB actuation timing Th7 is larger than the PB actuation timing Th8. That is, the TTC thresholds are set so that FPB is performed in situations where the collision margin is high, in other words, where the collision possibility is low. After FPB is actuated, it can be stopped, that is, canceled, by a driver operation (for example, turning the steering wheel or operating the brake), whereas once PB is actuated it is not canceled even by a driver operation.
 In the present embodiment, FPB corresponds to the "primary brake control", PB corresponds to the "secondary brake control", the actuation timing Th7 corresponds to the "first margin", and the actuation timing Th8 corresponds to the "second margin".
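The two actuation conditions above can be sketched as follows (a hypothetical Python sketch, not the specification's implementation; Th1 and Th5 are the lateral-position thresholds and Th7 > Th8 are the TTC actuation timings):

```python
def fpb_flag(lateral_fusion: float, ttc: float,
             th1: float, th7: float, th8: float) -> bool:
    """FPB (preliminary braking) actuates when the fusion lateral position is
    within Th1 and TTC <= Th7, but not once TTC <= Th8, where PB has priority."""
    return lateral_fusion <= th1 and th8 < ttc <= th7

def pb_flag(lateral: float, ttc: float, th5: float, th8: float) -> bool:
    """PB (intervention braking) actuates when the lateral position is within
    Th5 and TTC <= Th8."""
    return lateral <= th5 and ttc <= th8
```

With assumed values Th7 = 3.0 s and Th8 = 1.5 s, an object at TTC = 2.0 s falls in the FPB region, while at TTC = 1.0 s only PB applies, reflecting Th7 > Th8.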
 If FPB is already actuated at the time an image-lost FSN occurs and the PB actuation condition is satisfied, the object had been detected as a contact avoidance target before PB actuation. That is, the object has been detected continuously, and the reliability of the detected object can be considered high. On the other hand, if FPB is not actuated at the time an image-lost FSN occurs and the PB actuation condition is satisfied, the object had not been detected as a contact avoidance target before PB actuation. That is, the object was not detected in advance, and the reliability of the detected object can be considered low. Cases producing such a situation include, for example, erroneous detection by the radar sensor 22 or detection of an object abruptly entering from the side, and PB actuation based on these risks being an unnecessary actuation.
 Therefore, in the present embodiment, PB is actuated on the condition that FPB is actuated at the time it is determined that an image-lost FSN has occurred at a short distance. That is, at the time PB would be actuated, if FPB has been actuated beforehand, a collision possibility is assumed and PB is actuated; if FPB has not been actuated beforehand, the collision possibility is regarded as low and PB is not actuated. In other words, PB actuation is permitted when FPB is actuated, and prohibited when FPB is not.
 On the other hand, while FPB is actuated, a temporary change in the object's behavior (for example, the lateral position of the object temporarily exceeding the threshold) may cause the FPB actuation condition to cease being satisfied, ending FPB. In this case, the PB actuation condition may subsequently be satisfied when the lateral position of the object falls back below the threshold. In such a case, if it is within a predetermined time T from the end of FPB, the reliability of the detected object can be considered high.
 Therefore, in the present embodiment, a certain grace period is provided even after FPB actuation has stopped, during which PB is permitted. Specifically, at the time it is determined that an image-lost FSN has occurred at a short distance, PB is performed even if FPB is not being performed, on the condition that there is a history of FPB having ended within the immediately preceding predetermined time T. In the present embodiment, a brake experience flag is used as this history; it is set to "1" when FPB actuation ends and is reset to "0" after the predetermined time T has elapsed.
 FIG. 5A shows a logic circuit 40 for determining permission and prohibition of PB actuation in the present embodiment. The logic circuit 40 has a NAND circuit C1 that receives the FPB flag and PB flag signals, an AND circuit C2 that receives the output signal of the NAND circuit C1 and the inverted brake experience flag, and an AND circuit C3 that receives the output signal of the AND circuit C2, the image-lost FSN control signal, and the short-distance determination signal. If the signal output from the AND circuit C3 is "1", PB actuation is prohibited; if it is "0", PB actuation is permitted.
 In the logic circuit 40, the NAND circuit C1 outputs "0" when both the FPB flag and the PB flag are "1", and outputs "1" when at least one of them is "0". The AND circuit C2 outputs "1" only when the output signal of the NAND circuit C1 is "1" and the brake experience flag is "0".
 Here, as described above, the image-lost FSN control signal is "1" when the image target GT has been lost from the fusion state (image-lost FSN) and "0" otherwise. Likewise, the short-distance determination signal is "1" when the distance of the object specified by the radar target LT is equal to or less than the third threshold Th3, and "0" when it is greater than Th3.
 The AND circuit C3 therefore outputs "1" only when the output of the AND circuit C2 is "1", the image-lost FSN state holds, and the object is at a short distance. In other words, if the FPB is not operating and there is no brake experience at the time the image-lost state occurs at a short distance, operation of the PB is prohibited.
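As an illustration only (not part of the patent text), the permit/prohibit decision of logic circuit 40 can be sketched in Python; the function and signal names below are assumptions introduced for this sketch:

```python
def pb_inhibited(fpb_flag: bool, pb_flag: bool,
                 brake_experience: bool,
                 image_lost_fsn: bool, near_distance: bool) -> bool:
    """Return True when PB (secondary brake) operation is prohibited.

    Mirrors logic circuit 40:
      C1 = NAND(FPB flag, PB flag)
      C2 = AND(C1, NOT brake experience flag)
      C3 = AND(C2, image-lost-FSN signal, short-distance signal)
    A C3 output of "1" (True) prohibits PB; "0" (False) permits it.
    """
    c1 = not (fpb_flag and pb_flag)                # NAND circuit C1
    c2 = c1 and (not brake_experience)             # AND circuit C2
    c3 = c2 and image_lost_fsn and near_distance   # AND circuit C3
    return c3
```

With both flags set (FPB active when PB triggers), or with a recent brake experience, the function returns False and PB is permitted; only the sudden-side-entry pattern of FIGS. 10 and 11 yields True.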
 Next, the cases in which operation of the PB is permitted and in which it is prohibited are described with reference to FIGS. 6 to 11. The PB operation state is explained for each of the following scenes:
(1) a scene in which, after the FPB is activated, the PB is activated while the FPB is still operating;
(2) a scene in which, after the FPB is activated, the FPB once ends and the PB is then activated; and
(3) a scene in which operation of the PB is prohibited because of a sudden entry of a pedestrian or the like from the side.
 FIGS. 6, 8, and 10 are plan views in which the vertical axis is the TTC (time to collision) and the horizontal axis is the lateral position relative to the host vehicle in the vehicle width direction. The vertical axis shows the FPB activation timing Th7 and the PB activation timing Th8. When the TTC for the object falls below the respective threshold of the FPB or the PB, the corresponding activation timing is determined to have been reached.
 The horizontal axis shows the operating width W, which is the operating range of the FPB and the PB in the vehicle width direction. The operating width W is set by adding a predetermined length on each side to the width of the host vehicle. In FIGS. 6, 8, and 10, the predetermined lateral position conditions (Th1, Th5) are satisfied when a pedestrian, as the object, is located within the operating width W. FIGS. 6, 8, and 10 assume a scene in which the pedestrian approaches the host vehicle along the broken arrow while the host vehicle travels in its heading direction. FIGS. 7, 9, and 11 show timing charts corresponding to FIGS. 6, 8, and 10, respectively.
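The membership test against the operating width W can be sketched as below; this is an assumption-laden illustration (the host-vehicle width and per-side margin are invented values, not taken from the patent):

```python
OWN_WIDTH_M = 1.8   # hypothetical host-vehicle width [m]
MARGIN_M = 0.5      # hypothetical predetermined length added per side [m]

def within_operating_width(lateral_pos_m: float,
                           own_width_m: float = OWN_WIDTH_M,
                           margin_m: float = MARGIN_M) -> bool:
    """True when the object's lateral position (signed offset from the
    host vehicle's longitudinal axis) lies inside the operating width
    W = own_width + 2 * margin, i.e. |y| <= W / 2."""
    half_w = own_width_m / 2.0 + margin_m
    return abs(lateral_pos_m) <= half_w
```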
 FIGS. 6 and 7 show a scene in which, after the FPB is activated, the PB is activated while the FPB is still operating. With the pedestrian detected within the operating width W, when the TTC falls to or below the activation timing Th7 at timing t11, the FPB flag is set to "1" and the FPB is activated. The distance between the pedestrian and the host vehicle then decreases further, and at timing t12 the state transitions to the image-lost FSN state ("0" → "1"). At this point the short-distance determination signal is "1", so the output of the AND circuit C3 becomes "1" (the same applies to timing t22 in FIG. 9 and timing t32 in FIG. 11). Then, at timing t13, when the TTC falls to or below the activation timing Th8, the PB flag is set to "1", the FPB ends, and the PB is activated.
 At timing t13, because the PB flag is set to "1" while the FPB flag is "1", the output of the NAND circuit C1 becomes "0". The outputs of the AND circuits C2 and C3 accordingly become "0", so operation of the PB is permitted. That is, in the image-lost FSN state, the PB is executed on the condition that, at the time the object is determined to be at a short distance, the FPB is being executed, i.e. the FPB flag is "1".
 FIGS. 8 and 9 show a scene in which, after the FPB is activated, the FPB once ends and the PB is then activated. As in FIGS. 6 and 7, the FPB is activated at timing t21, and at timing t22 the state transitions to the image-lost FSN state ("0" → "1"). Thereafter, when the pedestrian moves out of the operating width W at timing t23, the FPB flag is reset to "0" and the FPB ends. At this time, the brake experience flag is set to "1". Then, at timing t24, the object is again detected within the operating width W. Since the interval between timings t23 and t24 is within the predetermined time T (for example, 0.7 msec), the brake experience flag is maintained at "1", and the PB is accordingly activated.
 In this case, in the logic circuit 40, the brake experience flag is already "1" at the time the PB flag is set to "1"; the outputs of the AND circuits C2 and C3 are therefore "0", and operation of the PB is permitted. That is, in the image-lost FSN state, the PB is executed on the condition that, at the time the object is determined to be at a short distance, there is a history of the FPB having ended within the immediately preceding predetermined time T, i.e. the brake experience flag is "1".
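The holding of the brake experience flag over the predetermined time T can be sketched as below; an illustrative sketch only, which takes the document's example value of 0.7 msec at face value and expresses it in seconds:

```python
BRAKE_EXPERIENCE_HOLD_S = 0.7e-3  # predetermined time T; the text gives
                                  # "0.7 msec" as an example value

def brake_experience_valid(fpb_end_time_s: float, now_s: float,
                           hold_s: float = BRAKE_EXPERIENCE_HOLD_S) -> bool:
    """The brake experience flag stays "1" while the time elapsed since
    the FPB ended is within the predetermined time T; once T has
    passed, the history no longer counts and the flag drops to "0"."""
    return (now_s - fpb_end_time_s) <= hold_s
```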
 FIGS. 10 and 11 show a scene in which operation of the PB is prohibited because of a sudden entry of a pedestrian from the side. In this case, the TTC falls to or below the activation timing Th7 at timing t31, but since the pedestrian is not detected within the operating width W, the FPB flag is kept at "0". At the subsequent timing t32, the state transitions to the image-lost FSN state ("0" → "1"). At timing t33 the TTC falls to or below the activation timing Th8, but again the pedestrian is not detected within the operating width W, so the PB flag is kept at "0". Then, when the pedestrian is detected within the operating width W at timing t34, the PB flag is set to "1". At this point, however, the FPB is not operating (the FPB flag is not "1") and the brake experience flag has not been set to "1", so operation of the PB is prohibited. That is, in the image-lost FSN state, the PB is not executed when, at the time the object is determined to be at a short distance, the FPB is not being executed and there is no history of the FPB having ended within the immediately preceding predetermined time T.
 In a case where an object enters suddenly from the side as shown in FIGS. 10 and 11, the object may be, for example, a pedestrian. Even if a pedestrian moves toward the host vehicle from a forward position close to it, the pedestrian can presumably stop or change direction easily. Executing the PB in such a case is therefore likely to be an unnecessary activation. For example, FIG. 12 shows a logic circuit 50 obtained by further adding a pedestrian determination signal as an input to the logic circuit 40; here, the pedestrian determination signal is input to the AND circuit C3. The pedestrian determination signal is "1" when the object is determined to be a pedestrian and "0" when it is determined not to be a pedestrian. By acting to prohibit operation of the PB when the object is a pedestrian, unnecessary activation of the PB can be suppressed.
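Logic circuit 50 can likewise be sketched as the circuit-40 decision with one extra input to the AND circuit C3; as before, this is an illustrative sketch and the names are assumptions:

```python
def pb_inhibited_50(fpb_flag: bool, pb_flag: bool,
                    brake_experience: bool,
                    image_lost_fsn: bool, near_distance: bool,
                    is_pedestrian: bool) -> bool:
    """Logic circuit 50: logic circuit 40 with the pedestrian
    determination signal added to AND circuit C3, so the prohibition
    applies only when the object is judged to be a pedestrian."""
    c1 = not (fpb_flag and pb_flag)      # NAND circuit C1, unchanged
    c2 = c1 and (not brake_experience)   # AND circuit C2, unchanged
    # AND circuit C3 now also gates on the pedestrian signal.
    return c2 and image_lost_fsn and near_distance and is_pedestrian
```

For a non-pedestrian object the extra input forces C3 to "0", so PB remains permitted even in the sudden-side-entry pattern.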
 The operating width W can also be made variable according to the type of the object, its lateral speed, and the like. For example, the operating width W and the lateral speed of the object have the relationship shown in FIG. 13: up to a predetermined lateral speed, the operating width W increases as the lateral speed increases, and beyond that predetermined value the operating width W is held constant at an upper limit.
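The FIG. 13 relationship is a linear ramp that saturates; a minimal sketch, with all numeric values invented for illustration:

```python
def operating_width(lateral_speed_mps: float,
                    w_base_m: float = 2.2,
                    gain_m_per_mps: float = 0.5,
                    v_sat_mps: float = 2.0) -> float:
    """Operating width W versus the object's lateral speed: W grows
    linearly with lateral speed up to the saturation speed, then holds
    at the upper limit (the shape of FIG. 13). w_base, gain, and v_sat
    are hypothetical parameters, not values from the patent."""
    v = min(lateral_speed_mps, v_sat_mps)  # clamp at the predetermined value
    return w_base_m + gain_m_per_mps * v
```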
 According to the above, in addition to the effects of the first embodiment, the present embodiment can achieve the following effects.
 When the object is at a short distance at the time the image target GT is lost, the PB is executed if the FPB has been executed beforehand and is not executed otherwise. When the FPB has been executed beforehand, the reliability of the detected object can be considered high, whereas when the FPB has not been executed beforehand, the reliability can be considered low. With this configuration, therefore, unnecessary PB activation can be suppressed when the object is no longer detected as a fusion target, and brake control by the PB can be performed appropriately.
 (Other embodiments)
 The present invention is not limited to the above and may be implemented as follows. In the following description, components identical to those described above are given the same reference numerals and their detailed description is omitted.
 (A1) When the relative speed between the object and the host vehicle is large, the image target GT may be lost after the operation of the controlled object 30 by the vehicle control has started. In this case, the ECU 10 may continue the vehicle control even after the image target GT is lost, thereby keeping the controlled object 30 in operation.
 In a situation where the object is detected as a fusion target and vehicle control is being performed, continuing the vehicle control of the host vehicle when it is determined that the state has transitioned to detection by the radar target LT alone prevents the inconvenience of the vehicle control being abruptly discontinued, and the operation of the controlled object 30 suddenly stopping, when the fusion target can no longer be detected.
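This continuation rule can be sketched as a per-cycle decision; an illustrative sketch only, with function and parameter names invented:

```python
def continue_control(control_active: bool,
                     fusion_detected: bool,
                     radar_detected: bool) -> bool:
    """Decide whether vehicle control should keep running this cycle.

    Once control has started, it is continued not only while the
    fusion target is held but also after falling back to radar-only
    detection, so the controlled object 30 does not stop abruptly
    when the image target GT is lost."""
    if not control_active:
        return False  # control never started; nothing to continue
    return fusion_detected or radar_detected
```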
 (A2) The ECU 10 may perform the vehicle control when, at the time the image target GT is lost, the object identified by the fusion target is in a predetermined approaching state with respect to the host vehicle. For example, the vehicle control may be performed after the image target GT is lost when the distance between the own lane axis O and the lateral position at the time of the loss is within a predetermined value.
 (A3) In the flowchart of FIG. 4 above, when the image target GT is determined to be lost, it suffices to determine whether to perform the vehicle control based on at least whether the distance to the object is less than the threshold; the determination conditions of S17 to S20 may be omitted.
 (A4) The orientation of the imaging center axis of the image sensor 21 can change with changes in the load weight of the host vehicle, so the short-distance position at which the image target GT is lost changes as well. In the flowchart of FIG. 4, therefore, the third distance threshold Th3 used in the determination of S16 may be set variably according to the change in the orientation of the imaging center axis of the image sensor 21. The change in the imaging center axis may be obtained from the detection value of a weight sensor provided in the vehicle. In this case, the greater the load weight, the more the rear of the vehicle sinks relative to the front and the more the imaging center axis tilts upward, so the third threshold Th3 is made smaller. Setting the third threshold Th3 with the change in the orientation of the imaging center axis of the image sensor 21 taken into account in this way allows the inter-vehicle distance at which the image target GT is lost to be determined more accurately. Alternatively, the inter-vehicle distance at which the image target GT is lost may be obtained by taking into account that the mounting height of the image sensor 21 changes with the load weight of the host vehicle.
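The load-dependent reduction of Th3 can be sketched as a monotone adjustment; the coefficient, base value, and lower bound below are hypothetical placeholders, since the patent specifies only the direction of the change:

```python
def third_threshold_m(base_th3_m: float, load_kg: float,
                      reduction_m_per_kg: float = 0.002,
                      th3_min_m: float = 1.0) -> float:
    """Variable setting of the third threshold Th3: the heavier the
    load, the more the vehicle rear sinks and the imaging center axis
    tilts upward, so the distance at which the image target GT is
    lost shrinks and Th3 is reduced accordingly (floored at a minimum)."""
    return max(th3_min_m, base_th3_m - reduction_m_per_kg * load_kg)
```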
 (A5) In the second embodiment above, the TTC is used for the activation timing of the PB and the FPB, but any parameter expressing the collision margin may be used; for example, a distance to the object based on the TTC may be used instead.
 (A6) In the second embodiment above, the object subject to vehicle control is a pedestrian, but the object is not limited to a pedestrian and may be another vehicle, an obstacle on the road, or the like.

Claims (12)

  1.  A vehicle control device (10, 10a) that generates a fusion target by fusing first target information on an object ahead of a host vehicle, acquired as a reflected wave of a carrier wave, with second target information on the object, acquired by image processing of a captured image ahead of the host vehicle, and that performs vehicle control of the host vehicle with respect to the object detected as the fusion target, the vehicle control device comprising:
     a state determination unit (101) that determines whether the object has transitioned from a state of being detected as the fusion target to a state of being detected by the first target information alone;
     a distance determination unit (102) that determines whether the distance to the object at the time the state determination unit determines that the transition to detection by the first target information alone has occurred is a predetermined short distance; and
     a vehicle control unit (106, 106a) that performs the vehicle control with respect to the object when the distance determination unit determines that the distance to the object is the predetermined short distance.
  2.  The vehicle control device (10, 10a) according to claim 1, further comprising:
     a lateral position acquisition unit (103) that acquires a lateral position of the object in the vehicle width direction; and
     a lateral position determination unit (104) that determines whether the lateral position of the object in the vehicle width direction acquired by the lateral position acquisition unit, at the time the distance determination unit (102) determines that the distance to the object is the predetermined short distance, represents a predetermined approaching state with respect to the host vehicle,
     wherein the vehicle control unit (106, 106a) performs the vehicle control with respect to the object when the lateral position determination unit (104) determines that the lateral position of the object in the vehicle width direction represents the predetermined approaching state with respect to the host vehicle.
  3.  The vehicle control device (10, 10a) according to claim 2, wherein, after it is determined that the transition has occurred from the state in which the object is detected as the fusion target to the state in which the object is detected by the first target information alone, the lateral position acquisition unit (103) acquires the lateral position of the object in the vehicle width direction using the first target information.
  4.  The vehicle control device (10, 10a) according to claim 3, wherein:
     when the transition occurs from the state in which the object is detected as the fusion target to the state in which the object is detected by the first target information alone, the lateral position acquisition unit (103) acquires a first lateral position of the object using the fusion target immediately before the transition and acquires a second lateral position of the object using the first target information immediately after the transition;
     the lateral position determination unit (104) determines whether the difference between the first lateral position and the second lateral position is equal to or greater than a predetermined value; and
     the vehicle control unit (106, 106a) does not perform the vehicle control with respect to the object when the lateral position determination unit (104) determines that the difference between the first lateral position and the second lateral position is equal to or greater than the predetermined value.
  5.  The vehicle control device (10, 10a) according to any one of claims 1 to 4, further comprising a relative speed determination unit (105) that determines whether the relative speed between the object and the host vehicle is smaller than a predetermined value,
     wherein the vehicle control unit (106, 106a) performs the vehicle control with respect to the object when the object is determined to be at the predetermined short distance, on the condition that the relative speed between the object and the host vehicle is smaller than the predetermined value.
  6.  The vehicle control device (10, 10a) according to any one of claims 1 to 5, wherein, in a situation where the object is detected as the fusion target and the vehicle control with respect to the object is being performed, the vehicle control unit (106, 106a) continues the vehicle control with respect to the object when it is determined that the object has transitioned to the state of being detected by the first target information alone.
  7.  The vehicle control device (10, 10a) according to any one of claims 1 to 6, wherein, after a predetermined time has elapsed since the transition from the state in which the object is detected as the fusion target to the state in which the object is detected by the first target information alone, the vehicle control unit (106, 106a) does not perform the vehicle control with respect to the object regardless of whether the distance to the object is the predetermined short distance.
  8.  The vehicle control device (10, 10a) according to any one of claims 1 to 7, wherein the vehicle control unit (106, 106a) does not perform the vehicle control with respect to the object when the distance determination unit (102) determines that the distance to the object is not the predetermined short distance.
  9.  The vehicle control device (10a) according to any one of claims 1 to 8, which, as the vehicle control, performs primary brake control based on a first margin that is a collision margin with respect to the object and performs secondary brake control based on a second margin whose collision margin is smaller than that of the first margin,
     wherein the vehicle control unit (106a) performs the secondary brake control on the condition that, at the time the distance determination unit (102) determines that the distance to the object is the predetermined short distance, the primary brake control is being performed or there is a history of the primary brake control having ended within an immediately preceding predetermined time.
  10.  The vehicle control device (10a) according to claim 9, wherein the vehicle control unit (106a) does not perform the secondary brake control when, at the time the distance determination unit (102) determines that the distance to the object is the predetermined short distance, the primary brake control is not being performed and there is no such history.
  11.  The vehicle control device (10a) according to claim 9 or 10, further comprising a pedestrian determination unit (107) that determines that the object is a pedestrian,
     wherein the vehicle control unit (106a) permits execution of the secondary brake control when the object is determined to be a pedestrian.
  12.  A vehicle control method comprising:
     a step of generating a fusion target by fusing first target information on an object ahead of a host vehicle, acquired as a reflected wave of a carrier wave, with second target information on the object, acquired by image processing of a captured image ahead of the host vehicle;
     a step of performing vehicle control of the host vehicle with respect to the object detected as the fusion target;
     a step of determining whether the object has transitioned from a state of being detected as the fusion target to a state of being detected by the first target information alone;
     a step of determining whether the distance to the object at the time it is determined that the transition to detection by the first target information alone has occurred is a predetermined short distance; and
     a step of performing the vehicle control with respect to the object when the distance to the object is determined to be the predetermined short distance.
PCT/JP2016/067896 2015-06-16 2016-06-16 Vehicle control device and vehicle control method WO2016204213A1 (en)

Priority Applications (3)

Application Number Priority Date Filing Date Title
DE112016002750.8T DE112016002750B4 (en) 2015-06-16 2016-06-16 VEHICLE CONTROL DEVICE AND VEHICLE CONTROL METHOD
CN201680035181.9A CN107848530B (en) 2015-06-16 2016-06-16 Vehicle control device and vehicle control method
US15/736,661 US10573180B2 (en) 2015-06-16 2016-06-16 Vehicle control device and vehicle control method

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2015121397 2015-06-16
JP2015-121397 2015-06-16
JP2016-112096 2016-06-03
JP2016112096A JP6539228B2 (en) 2015-06-16 2016-06-03 Vehicle control device and vehicle control method

Publications (1)

Publication Number Publication Date
WO2016204213A1 2016-12-22

Family

ID=57545689

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2016/067896 WO2016204213A1 (en) 2015-06-16 2016-06-16 Vehicle control device and vehicle control method

Country Status (1)

Country Link
WO (1) WO2016204213A1 (en)


Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2007226680A (en) * 2006-02-24 2007-09-06 Toyota Motor Corp Object detection system
JP2008132867A (en) * 2006-11-28 2008-06-12 Hitachi Ltd Collision-avoidance support device and vehicle equipped therewith
JP2012048643A (en) * 2010-08-30 2012-03-08 Denso Corp Object detector
JP2014067169A (en) * 2012-09-25 2014-04-17 Toyota Motor Corp Collision prediction device
JP2014117995A (en) * 2012-12-14 2014-06-30 Daihatsu Motor Co Ltd Drive support device


Cited By (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2018106334A (en) * 2016-12-26 2018-07-05 トヨタ自動車株式会社 Warning device for vehicle
CN108154084A (en) * 2017-03-10 2018-06-12 南京沃杨机械科技有限公司 For the farm environment cognitive method of the unpiloted Multi-sensor Fusion of agricultural machinery
CN110576853A (en) * 2018-06-07 2019-12-17 本田技研工业株式会社 Vehicle control system
CN111591287A (en) * 2019-02-04 2020-08-28 丰田自动车株式会社 Pre-collision control device
CN111591287B (en) * 2019-02-04 2023-05-09 丰田自动车株式会社 Control device before collision
CN111845610A (en) * 2019-04-26 2020-10-30 丰田自动车株式会社 Vehicle control device
CN111845610B (en) * 2019-04-26 2022-07-15 丰田自动车株式会社 Vehicle control device
CN113264041A (en) * 2020-02-17 2021-08-17 丰田自动车株式会社 Collision avoidance support device
CN113264041B (en) * 2020-02-17 2023-10-03 丰田自动车株式会社 Collision avoidance assistance device
CN114537310A (en) * 2020-11-19 2022-05-27 丰田自动车株式会社 Door control device
CN114537310B (en) * 2020-11-19 2023-11-21 丰田自动车株式会社 Door control device

Similar Documents

Publication Publication Date Title
CN107848530B (en) Vehicle control device and vehicle control method
WO2016204213A1 (en) Vehicle control device and vehicle control method
WO2017104773A1 (en) Moving body control device and moving body control method
CN108156822B (en) Vehicle control device and vehicle control method
WO2017111147A1 (en) Travel assistance device and travel assistance method
JP6855776B2 (en) Object detection device and object detection method
JP6361592B2 (en) Vehicle control device
JP2018097582A (en) Driving support device and driving support method
US10839232B2 (en) Vehicle control method and apparatus
JP6300181B2 (en) Vehicle control device
JP2018097687A (en) Vehicle control device, vehicle control method
WO2017171082A1 (en) Vehicle control device and vehicle control method
JP6380232B2 (en) Object detection apparatus and object detection method
WO2017110871A1 (en) Vehicle control device and vehicle control method
CN108137007B (en) Vehicle control device and vehicle control method
WO2017043358A1 (en) Object detecting device and object detecting method
US20210394754A1 (en) Driving assistance device
WO2017183668A1 (en) Vehicle control device and vehicle control method
JP2015210191A (en) Object detection device
US20180372860A1 (en) Object detection device and object detection method
WO2018110196A1 (en) Vehicle control device, and vehicle control method
JP2018063605A (en) Vehicle control device
WO2018070380A1 (en) Vehicle control apparatus
US20220366702A1 (en) Object detection device
JP7413548B2 (en) Driving support device

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application (Ref document number: 16811687; Country of ref document: EP; Kind code of ref document: A1)
WWE Wipo information: entry into national phase (Ref document number: 15736661; Country of ref document: US)
WWE Wipo information: entry into national phase (Ref document number: 112016002750; Country of ref document: DE)
122 Ep: pct application non-entry in european phase (Ref document number: 16811687; Country of ref document: EP; Kind code of ref document: A1)