WO2017199971A1 - Vehicle control device and vehicle control method - Google Patents

Vehicle control device and vehicle control method

Info

Publication number
WO2017199971A1
Authority
WO
WIPO (PCT)
Prior art keywords
information
vehicle
detected
region
state
Prior art date
Application number
PCT/JP2017/018409
Other languages
French (fr)
Japanese (ja)
Inventor
高木 亮
Original Assignee
株式会社デンソー
Priority date
Filing date
Publication date
Priority claimed from JP2016225193A (JP6493365B2)
Application filed by 株式会社デンソー (DENSO CORPORATION)
Priority to US16/302,496 (US11091153B2)
Publication of WO2017199971A1

Classifications

    • G PHYSICS
    • G08 SIGNALLING
    • G08G TRAFFIC CONTROL SYSTEMS
    • G08G 1/00 Traffic control systems for road vehicles
    • G08G 1/16 Anti-collision systems

Definitions

  • the present disclosure relates to a vehicle control device and a vehicle control method for detecting an object positioned in front of the vehicle.
  • a technique for synthesizing a detection result of an object based on a reflected wave corresponding to a transmission wave and a detection result of an object acquired by an image sensor and generating new information (fusion target) on the object is known.
  • the recognition accuracy of the object ahead of the vehicle can be improved by the generated fusion target.
  • by using the position information and the object width specified from this information, the collision avoidance control of the vehicle can be performed appropriately when avoiding a collision with the object.
  • Patent Document 1 discloses a vehicle control device that continues collision avoidance control based on the detection result of an object by a radar sensor when image loss occurs after a fusion target has been generated. In this vehicle control device, the detection accuracy of the object is lowered after the image is lost, so the collision avoidance control is made difficult to operate.
  • image loss may be caused by the proximity of the object to the vehicle, in addition to image loss due to the brightness around the vehicle. Specifically, when the object and the vehicle come close to each other, the object deviates from the angle of view of the image sensor, the image sensor can no longer detect the object properly, and image loss occurs. If the collision avoidance control is made difficult to operate when image loss occurs due to the proximity of the object to the vehicle, there is a high possibility that the collision avoidance control will be delayed or will not operate.
  • the present disclosure has been made in view of the above problems, and an object of the present disclosure is to provide a vehicle control device and a vehicle control method capable of suppressing the operation delay and inactivity of the collision avoidance control.
  • the present disclosure provides a vehicle control device that detects an object using first information, which is the detection result of the object based on a reflected wave corresponding to a transmission wave, and second information, which is the detection result of the object based on a captured image obtained by imaging the area ahead of the vehicle with an imaging unit.
  • the vehicle control device includes a control unit that performs collision avoidance control for avoiding a collision with the object based on at least one of the first information and the second information.
  • it also includes a position determination unit that, when the object transitions from the state of being detected by the first information and the second information to the state of being detected only by the first information, determines whether or not the object is located in a vicinity region defined in advance as a region ahead of the vehicle in which the second information cannot be acquired.
  • it further includes a maintaining unit that maintains the operating condition of the collision avoidance control from the state in which the object was detected by the first information and the second information when the position determination unit determines that the object is located in the vicinity region.
  • in the disclosure configured as described above, when the object transitions from the state of being detected by the first information and the second information to the state of being detected only by the first information, the operating condition of the collision avoidance control is maintained if the position of the object is within the vicinity region ahead of the vehicle.
  • FIG. 1 is a block diagram of PCSS.
  • FIG. 2A is a diagram illustrating a position of an object detected by an image sensor and a radar sensor
  • FIG. 2B is a diagram illustrating a position of an object detected by the image sensor and the radar sensor.
  • FIG. 3 is a diagram for explaining the PCS.
  • FIG. 4A is a diagram for explaining factors that cause image loss.
  • FIG. 4B is a diagram for explaining factors that cause image loss.
  • FIG. 5 is a flowchart for explaining the PCS.
  • FIG. 6 is a diagram illustrating the neighborhood area NA.
  • FIG. 7 is a diagram for explaining the detailed processing performed in step S19 of FIG. 5.
  • FIG. 8A is a diagram for explaining changes in the operability of the PCS.
  • FIG. 8B is a diagram for explaining changes in the operability of the PCS;
  • FIG. 8C is a diagram for explaining changes in the operability of the PCS.
  • FIG. 9 is a flowchart showing the process performed in step S19 in the second embodiment.
  • FIG. 10A is a diagram for explaining the movement of an object in the vicinity area NA.
  • FIG. 10B is a diagram for explaining the movement of the object in the vicinity area NA.
  • FIG. 11 is a diagram for explaining the position of the neighborhood region in the third embodiment.
  • FIG. 12 is a flowchart showing the process performed in step S14 of FIG. 5 in the third embodiment.
  • FIG. 13 is a diagram for explaining the center position and the object width.
  • FIG. 14 is a flowchart showing the process performed in step S18 of FIG. 5 in the third embodiment.
  • FIG. 15 is a diagram for explaining the predicted value of the center position.
  • FIG. 16 is a diagram for explaining the left and right end angles.
  • FIG. 17 is a diagram for explaining the part-off reliability.
  • the vehicle control device and the vehicle control method according to the present embodiment are mounted on a vehicle (host vehicle CS), detect an object existing in front of the host vehicle CS, and avoid or reduce a collision with the object. It is realized by PCSS (Pre-crash safety system) that performs various controls.
  • the PCSS 100 includes a driving assistance ECU 20 (hereinafter referred to as ECU 20), various sensors 30, and a controlled object 40.
  • the ECU 20 functions as a vehicle control device.
  • the various sensors 30 are connected to the ECU 20 and output detection results for the object and the host vehicle CS to the ECU 20.
  • the PCSS 100 includes an image sensor 31, a radar sensor 32, a vehicle speed sensor 33, and a turning motion detection sensor 34 as the various sensors 30.
  • the image sensor 31 is a CCD camera, a monocular camera, a stereo camera or the like, and is installed near the upper end of the windshield of the host vehicle CS.
  • the image sensor 31 captures a captured image by capturing an area that extends in a predetermined range toward the front of the host vehicle CS at predetermined time intervals. Then, by processing the captured image, the position and orientation of the object ahead of the host vehicle CS are acquired as image information and output to the ECU 20.
  • an object whose image information is detected by the image sensor 31 is also referred to as an image target IT.
  • the image sensor 31 functions as an imaging unit.
  • the image information includes the position of the image target IT on the coordinates specified by the vehicle traveling direction (Y axis) and the lateral direction (X axis) with the host vehicle CS as a reference position.
  • the image information includes left and right lateral positions Xr, Xl in the lateral direction (X axis) of the image target IT, and an azimuth angle θc indicating the azimuth from the host vehicle CS to the object Ob.
  • the ECU 20 can calculate the object width WO from the lateral positions Xr and Xl of the image target IT.
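A rough illustration of how the object width WO and the azimuth could be derived from such image information is sketched below; the container and its field names (x_left, x_right, y) are hypothetical and not part of the patent.

```python
import math
from dataclasses import dataclass

@dataclass
class ImageTarget:
    """Second information (image target IT); hypothetical container for this sketch."""
    x_left: float   # lateral position Xl [m] on the X axis
    x_right: float  # lateral position Xr [m] on the X axis
    y: float        # position in the vehicle traveling direction (Y axis) [m]

def object_width(it: ImageTarget) -> float:
    """Object width WO computed from the left and right lateral positions Xr, Xl."""
    return abs(it.x_right - it.x_left)

def azimuth_deg(it: ImageTarget) -> float:
    """Azimuth angle (theta_c) from the host vehicle CS to the target center, in degrees."""
    x_center = 0.5 * (it.x_left + it.x_right)
    return math.degrees(math.atan2(x_center, it.y))

it = ImageTarget(x_left=-0.9, x_right=0.9, y=20.0)
print(object_width(it), azimuth_deg(it))  # -> 1.8 0.0
```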
  • the radar sensor 32 acquires radar information that is a detection result of an object based on a reflected wave corresponding to the transmission wave.
  • the radar sensor 32 is attached at the front of the host vehicle CS so that its optical axis faces the front of the vehicle (Y-axis direction), and scans the front of the vehicle by transmitting a transmission wave toward the front of the vehicle.
  • the reflected wave reflected from the surface of the object with respect to the transmitted wave is received.
  • radar information indicating the distance to the object, the relative speed with the object, and the like is generated according to the reflected wave.
  • As the transmission wave a directional electromagnetic wave such as a millimeter wave can be used.
  • the radar information includes the position of the radar target RT in the vehicle traveling direction (Y axis) with respect to the host vehicle CS and the azimuth angle θr from the host vehicle CS to the radar target RT.
  • based on the position of the radar target RT in the vehicle traveling direction (Y axis), the ECU 20 can acquire the relative distance Dr, which is the distance on the Y axis from the host vehicle CS to the radar target RT, and, based on this relative distance Dr, the relative speed Vr of the radar target RT with respect to the host vehicle CS.
  • radar information functions as first information
  • image information functions as second information.
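The relative speed Vr can be pictured as the change of the relative distance Dr between measurement cycles, as in the minimal sketch below; the finite-difference form and the 50 ms cycle time are assumptions for illustration, since a real radar sensor may report the relative speed directly.

```python
def relative_speed(dr_prev: float, dr_now: float, dt: float) -> float:
    """Relative speed Vr [m/s] estimated from the change of the relative distance Dr
    between two radar measurement cycles (a negative value means the gap is closing)."""
    return (dr_now - dr_prev) / dt

# Example: Dr shrinks from 20.0 m to 19.5 m over an assumed 50 ms cycle -> Vr of about -10 m/s.
print(relative_speed(20.0, 19.5, 0.05))
```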
  • the vehicle speed sensor 33 is provided on a rotating shaft that transmits power to the wheels of the host vehicle CS, and calculates the vehicle speed that is the speed of the host vehicle CS based on the rotation speed of the rotating shaft.
  • the turning motion detection sensor 34 detects the turning angular velocity at which the host vehicle CS changes from the vehicle traveling direction.
  • the turning motion detection sensor 34 includes a yaw rate sensor that detects a turning angular velocity of the host vehicle CS and a steering angle sensor that detects a steering angle by a steering device (not shown). Based on the output from the turning motion detection sensor 34, the ECU 20 can determine whether or not the host vehicle CS is making a turning motion.
  • the ECU 20 is configured by a known microcomputer and includes a CPU, a ROM, a RAM, and the like. The ECU 20 functions as the position acquisition unit 21, the control unit 22, the position determination unit 23, and the maintenance unit 24 by executing programs stored in the ROM.
  • the position acquisition unit 21 acquires position information of an object ahead of the host vehicle from image information that is an object detection result by the image sensor 31 or radar information that is an object detection result by the radar sensor 32.
  • when the control unit 22 acquires image information and radar information for the same object, it fuses the image information and the radar information to generate fusion information, which is new position information for the object.
  • specifically, the relative distance Dr based on the radar information is used as the position of the object in the traveling direction of the host vehicle CS (on the Y axis), and the lateral position based on the image information is used as the position of the object in the lateral direction (on the X axis), to generate the fusion information.
  • as described above, when fusion information is generated for an object, the information on the object is generated using whichever of the information acquired by the radar sensor 32 and the image sensor 31 is more accurate for each direction, so the recognition accuracy of the object can be improved.
  • an object for which fusion information is generated is referred to as a fusion target FT.
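A minimal sketch of this fusion step is given below, assuming simple placeholder containers for the radar target RT and the image target IT: the longitudinal position is taken from the radar information and the lateral extent from the image information.

```python
from dataclasses import dataclass

@dataclass
class RadarTarget:      # first information (radar target RT)
    distance_y: float   # relative distance Dr on the Y axis [m]
    azimuth: float      # azimuth angle theta_r [rad]

@dataclass
class ImageTarget:      # second information (image target IT)
    x_left: float       # lateral position Xl [m]
    x_right: float      # lateral position Xr [m]

@dataclass
class FusionTarget:     # fusion target FT
    y: float            # longitudinal position, taken from the radar information
    x_left: float       # lateral positions, taken from the image information
    x_right: float
    width: float        # object width WO

def fuse(rt: RadarTarget, it: ImageTarget) -> FusionTarget:
    """Combine the more accurate quantity from each sensor:
    Y position from the radar, lateral extent from the image."""
    return FusionTarget(y=rt.distance_y,
                        x_left=it.x_left,
                        x_right=it.x_right,
                        width=abs(it.x_right - it.x_left))

ft = fuse(RadarTarget(distance_y=18.0, azimuth=0.02), ImageTarget(x_left=-0.8, x_right=1.0))
print(ft)
```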
  • the control unit 22 determines the possibility of a collision between the object whose position information has been detected and the host vehicle CS, and controls the operation of the controlled objects 40 accordingly.
  • the determination area Wcd is an area virtually set in front of the host vehicle CS.
  • the collision determination uses the collision margin time TTC (Time to Collision).
  • the TTC is an evaluation value indicating after how many seconds the vehicle would collide with the object if it continued traveling at the current vehicle speed. The smaller the TTC, the higher the risk of collision; the larger the TTC, the lower the risk of collision.
  • the values decrease in the order of TTC1, TTC2, and TTC3.
  • the control unit 22 compares the calculated current TTC with the TTC set for each controlled object 40, and activates the corresponding controlled object 40 when the corresponding controlled object 40 exists.
  • the ECU 20 compares the TTC and the operation timing of each controlled object 40, and activates the controlled object 40 when the TTC corresponds to the operation timing of each controlled object 40.
  • when the TTC reaches the operation timing of the alarm device 41, an alarm is transmitted to the driver by operating the alarm device 41.
  • when the TTC reaches the operation timing of the seat belt device 42, the seat belt device 42 is controlled so as to be wound up.
  • when the TTC reaches the operation timing of the brake device 43, the automatic brake is operated so as to reduce the collision speed.
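The staged activation against the TTC described above can be sketched as follows; the threshold values (ordered TTC1 > TTC2 > TTC3) and the returned action labels are placeholders, not values taken from the patent.

```python
def ttc(relative_distance: float, closing_speed: float) -> float:
    """Collision margin time TTC [s]; closing_speed must be positive (approaching)."""
    if closing_speed <= 0.0:
        return float("inf")   # not approaching: no collision predicted
    return relative_distance / closing_speed

# Placeholder operation timings, ordered TTC1 > TTC2 > TTC3 as in the description above.
TTC1_ALARM, TTC2_SEATBELT, TTC3_BRAKE = 2.4, 1.6, 0.8

def select_actions(t: float) -> list:
    """Return the controlled objects 40 whose operation timing the current TTC has reached."""
    actions = []
    if t <= TTC1_ALARM:
        actions.append("alarm device 41")
    if t <= TTC2_SEATBELT:
        actions.append("seat belt device 42 (wind up)")
    if t <= TTC3_BRAKE:
        actions.append("brake device 43 (automatic brake)")
    return actions

print(select_actions(ttc(relative_distance=12.0, closing_speed=10.0)))  # TTC = 1.2 s
```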
  • the control unit 22 changes the operating conditions of the PCS under certain conditions when image information is not detected and image loss occurs.
  • for an object for which fusion information has been generated, the reliability that the object exists ahead of the vehicle is high. Therefore, even after the object is no longer recognized as the fusion target FT, it is preferable to continue the PCS for the object rather than excluding it from the PCS targets. For this reason, when image loss occurs after the fusion information has been generated, the PCSS 100 continues the PCS based on the radar information and the past object width WO. On the other hand, once image loss has occurred, the image target IT cannot be acquired from the image sensor 31, and the object width WO cannot be newly acquired.
  • therefore, the past object width WO from when the fusion target FT was detected is used, and under certain conditions the operating condition is changed so that the PCS becomes difficult to operate by reducing that object width WO, which accounts for the decrease in detection accuracy.
  • the object detection area of the image sensor 31 (denoted as imaging area CA) is narrower in the vehicle traveling direction (Y-axis direction) than the object detection area of the radar sensor 32 (denoted as radar area RA). Therefore, when the object is at a position where the imaging area CA and the radar area RA overlap in front of the host vehicle CS, it can be detected as the fusion target FT. On the other hand, when the object is located farther or nearer in the Y-axis direction than the imaging area CA, image loss occurs.
  • an area on the near side of the imaging area CA in the vehicle traveling direction is also referred to as the neighborhood area NA.
  • when the object comes close, the lower end of the rear end portion of the object deviates from the angle of view θ1 of the image sensor 31, so the image sensor 31 cannot identify the type of the image target IT and image loss occurs.
  • the vicinity area NA is a position close to the front of the host vehicle CS, so if the operating conditions are changed so that the PCS becomes difficult to operate when image loss occurs in the vicinity area NA, this becomes a factor of operation delay or non-operation of the PCS. Therefore, the control unit 22 maintains the operating conditions of the PCS for the object when image loss occurs because the object has entered the vicinity area NA.
  • the position determination unit 23 determines whether or not the object is located in the vicinity area NA when the state changes from the state in which fusion information is generated for the object to the state in which the object is detected only with the radar information.
  • the neighborhood area NA is preset with ranges in the Y-axis direction and the X-axis direction based on the angle of view θ1 spreading in the vertical direction of the image sensor 31.
  • the range of the vicinity area NA is set based on the relationship between the angle of view θ1 in the vertical direction of the image sensor 31 and the position in the height direction at which the image sensor 31 is attached to the host vehicle CS.
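One simple geometric reading of that relationship is sketched below: ground points closer than roughly h / tan(θ1/2) fall below the lower edge of the vertical angle of view, so an object that close begins to leave the image. The horizontal-optical-axis assumption and the numeric values are illustrative only.

```python
import math

def near_area_depth(mount_height_m: float, vertical_fov_deg: float) -> float:
    """Rough forward extent of the neighborhood area NA: ground closer than this distance
    falls below the camera's lower field-of-view edge, so a nearby object's lower end
    leaves the angle of view theta1. Assumes the optical axis is horizontal."""
    half_fov = math.radians(vertical_fov_deg / 2.0)
    return mount_height_m / math.tan(half_fov)

# Example: camera mounted 1.3 m high with a 40 degree vertical angle of view.
print(round(near_area_depth(1.3, 40.0), 2), "m")
```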
  • the maintenance unit 24 maintains the PCS operating condition from the state in which the fusion target FT is detected. In this embodiment, the maintenance unit 24 maintains the PCS operating condition by not causing the control unit 22 to reduce the object width WO in the horizontal direction (X-axis direction).
  • the process shown in FIG. 5 is a process performed by the ECU 20 at a predetermined cycle.
  • step S11 image information is acquired based on the output from the image sensor 31.
  • step S12 radar information is acquired based on the output from the radar sensor 32.
  • step S13 it is determined whether or not the fusion target FT has been detected. If the object is detected based on the image information and the radar information and it is determined that the image target IT and the radar target RT are the same object, the process proceeds to step S14. For example, if the difference between the position of the image target IT based on the image information acquired in step S11 and the position of the radar target RT based on the radar information acquired in step S12 is equal to or less than a predetermined distance, the image It is determined that the target IT and the radar target RT are the same object (fusion target FT).
  • step S14 the image information acquired in step S11 and the radar information acquired in step S12 are combined to generate fusion information that is position information for the fusion target FT.
  • the fusion information includes the object width WO in addition to the position of the object.
  • step S15 the number of detections DN, which is the number of times the fusion target FT has been detected, is recorded.
  • the number of detections DN is information indicating the number of times that the same type of fusion target FT is continuously detected. In this embodiment, every time the same type of fusion target FT is detected in step S13, the detection count DN is increased.
  • step S21 a collision with an object is determined.
  • the fusion target FT is detected in step S13.
  • the collision determination between the object and the host vehicle CS is performed using the lap ratio RR between the object width WO and the determination area Wcd included in the fusion information calculated in step S14.
  • in step S22, it is determined whether or not to perform the PCS. If it is determined in step S21 that there is a possibility of collision with the object, the TTC is calculated by dividing the relative distance Dr to the object by the relative speed Vr, and the calculated TTC is compared with the TTC set for each controlled object 40 to determine whether or not each operation is to be performed.
  • step S23 the corresponding operation of PCS is performed.
  • if it is determined that the PCS is not to be performed (step S22: NO), the process shown in FIG. 5 is temporarily terminated. Steps S21 to S23 function as a control process.
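A hedged sketch of the lap-rate based determination in steps S21 and S22 follows; normalizing the lateral overlap by the object width and the 0.4 threshold are assumptions, since the text does not specify how the lap rate RR is normalized.

```python
def lap_rate(obj_left: float, obj_right: float, area_left: float, area_right: float) -> float:
    """Lap rate RR: lateral overlap between the object width WO and the determination
    area Wcd, normalized here by the object width (an assumption for this sketch)."""
    overlap = min(obj_right, area_right) - max(obj_left, area_left)
    width = obj_right - obj_left
    return max(0.0, overlap) / width if width > 0 else 0.0

def collision_possible(rr: float, rr_threshold: float = 0.4) -> bool:
    """Step S21 sketch: treat the object as a collision candidate when the lap rate is
    large enough (the 0.4 threshold is a placeholder)."""
    return rr >= rr_threshold

# Determination area Wcd of +/-1.0 m around the vehicle center, object spanning 0.2 m..2.0 m.
rr = lap_rate(0.2, 2.0, -1.0, 1.0)
print(round(rr, 2), collision_possible(rr))  # about 0.44, True
```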
  • in step S16, it is determined whether or not fusion target detection has been established for the same object in the past. For example, by referring to the detection count DN, it is determined whether or not the fusion target FT was detected in past processing. If the fusion target FT has not been detected in past processing (step S16: NO), the process shown in FIG. 5 is temporarily terminated.
  • step S17 it is determined whether or not the radar target RT is continuously detected. This is because the object is present in the radar area RA if the position of the object can be detected by the radar sensor 32 even when image loss occurs. If the radar target RT has not been detected (step S17: NO), the processing of FIG. 5 is once terminated, assuming that no object exists in front of the host vehicle. Therefore, the object is excluded from the PCS target. On the other hand, when the radar target RT is detected (step S17: YES), it is determined that image loss has occurred, and the process proceeds to step S18.
  • step S18 it is determined whether or not the radar target RT is located in the vicinity area NA.
  • the vicinity area NA is set as an area partitioned by the vehicle traveling direction (Y-axis direction) and the lateral direction (X-axis direction). Based on the radar information acquired in step S12, it is determined whether or not the position of the radar target RT is located in an area determined as the neighborhood area NA.
  • Step S18 functions as a position determination step.
  • a predetermined distance from the center of the host vehicle CS in the lateral direction is defined as a boundary line BD in the lateral direction of the neighborhood area NA.
  • the range of the vicinity area NA in the vehicle traveling direction is determined based on the angle of view of the image sensor 31. If the lateral component of the position Pr of the radar target RT is inside the lateral range of the neighborhood area NA defined by the boundary line BD, it is determined that the object is located in the neighborhood area NA. On the other hand, when the position Pr is laterally outside the defined neighborhood area NA, it is determined that the object is not located in the neighborhood area NA.
  • the boundary line BD is used as a fixed value determined in advance based on the imaging area CA of the image sensor 31. In addition to this, the boundary line BD may be changed in the horizontal direction according to the type of the object.
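The vicinity-area check of step S18 might look like the sketch below, where the lateral limit corresponds to the boundary line BD and the longitudinal limit to the near edge of the imaging area CA; both numeric limits are placeholders.

```python
def in_neighborhood_area(x: float, y: float,
                         bd_lateral: float = 1.5, na_depth: float = 5.0) -> bool:
    """Step S18 sketch: the radar target position Pr = (x, y) is treated as inside the
    neighborhood area NA when its lateral offset is within the boundary line BD and its
    longitudinal position is nearer than the imaging area (both limits are placeholders)."""
    return abs(x) <= bd_lateral and 0.0 <= y <= na_depth

print(in_neighborhood_area(0.4, 2.0))   # True: near the vehicle and inside the boundary line BD
print(in_neighborhood_area(3.0, 2.0))   # False: laterally outside the boundary line BD
```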
  • step S20 the operating condition of the PCS is changed by reducing the object width WO.
  • when the object is not located in the vicinity area NA, there is a high possibility that the object Ob is located farther away than the imaging area CA in the vehicle traveling direction, so the possibility that the object Ob and the host vehicle CS will collide is low. Therefore, priority is given to the low reliability of the object width WO acquired in the past, and the object width WO is reduced in the lateral direction. That is, in this embodiment, the lap rate RR associated with the object width WO is used as the PCS operating condition.
  • step S21 collision determination is performed using the reduced object width WO. As a result, the lap rate RR decreases and the PCS becomes difficult to operate.
  • when the radar target RT is located in the vicinity area NA (step S18: YES), the process proceeds to step S19.
  • step S19 it is switched whether to change or maintain the PCS operating condition by determining the possibility of collision between the object Ob and the host vehicle CS according to various conditions.
  • step S19 functions as a maintenance process.
  • in step S31, the relative speed Vr of the radar target RT with respect to the host vehicle CS is determined. Since the TTC is calculated by dividing the relative distance Dr by the relative speed Vr, for the same relative distance Dr, the TTC until the radar target RT and the host vehicle CS collide becomes larger as the relative speed Vr becomes smaller. Therefore, when the relative speed Vr is small, it is more likely that each operation of the PCS has not yet been performed even after the radar target RT has entered the vicinity area NA, compared with the case where the relative speed Vr is large.
  • step S31 when the relative speed Vr is larger than the threshold value Th1 (step S31: NO), the process proceeds to step S33, and the object width WO is reduced.
  • the reduction of the object width WO performed in step S33 can use the same technique as the reduction of the object width WO performed in step S20.
  • if the relative speed Vr is equal to or less than the threshold value Th1 (step S31: YES), the process proceeds to step S32.
  • Step S31 functions as a relative speed acquisition unit.
  • step S32 the number DN of detections of the fusion target FT before the image lost occurs is determined.
  • the number of detections DN indicates the number of times the radar target RT has been detected in the past as the fusion target FT. Therefore, if the number of detections DN is small, the reliability of the fusion target FT decreases. For example, when the detection of the fusion target FT is accidental due to noise or the like, the detection count DN is a low value. Therefore, if the number of detections DN is smaller than the threshold value Th2, the process proceeds to step S33, and the object width WO is reduced.
  • if the detection count DN is equal to or greater than the threshold value Th2 (step S32: YES), the object width WO is not reduced and the PCS operating condition is maintained.
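The decision flow of steps S31 to S33 can be summarized as in the sketch below; the thresholds Th1 and Th2 and the width-reduction factor are placeholders, and the exact reduction method is not taken from the patent.

```python
def decide_object_width(width_prev: float, relative_speed: float, detection_count: int,
                        th1_speed: float = 8.0, th2_count: int = 5,
                        shrink: float = 0.5) -> float:
    """Step S19 sketch (first embodiment): keep the past object width WO only when the
    relative speed Vr is at or below Th1 (step S31) and the fusion target was detected
    often enough, DN >= Th2 (step S32); otherwise reduce the width (step S33) so that
    the PCS becomes harder to operate. Th1, Th2 and the shrink factor are placeholders."""
    if relative_speed <= th1_speed and detection_count >= th2_count:
        return width_prev              # maintain the PCS operating condition
    return width_prev * shrink         # change the PCS operating condition

print(decide_object_width(1.8, relative_speed=5.0, detection_count=10))   # 1.8 (maintained)
print(decide_object_width(1.8, relative_speed=12.0, detection_count=10))  # 0.9 (reduced)
```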
  • FIGS. 8A and 8B show changes in the object width WO when the ECU 20 performs the process of FIG. 5
  • FIG. 8C shows changes in the object width WO when the ECU 20 does not execute the process of FIG. 5.
  • the position of the object is detected only with the radar information, and the object width WO cannot be acquired.
  • the object width WO (t12) at time t12 is maintained at the same size as the object width WO at time t11.
  • in FIG. 8C, shown as a comparison, when the object is located in the vicinity area NA, the object width WO(t12) at time t12 is reduced relative to the object width WO(t11) at time t11 shown in FIG. 8A.
  • accordingly, when the object width is maintained, the wrap rate RR indicating the ratio of overlap with the determination region Wcd is larger than the wrap rate RR(c) in the case of FIG. 8C.
  • the PCS becomes easy to operate, and the PCS operation delay or inactivity with respect to the radar target RT is suppressed.
  • when image loss occurs from the state in which the fusion target FT was detected and the object comes to be detected only with the radar information, the ECU 20 maintains the PCS operating condition if the position of the radar target RT is located in the vicinity area NA in front of the vehicle. With this configuration, even when image loss occurs because the object has entered the vicinity area NA, delay or non-operation of the PCS with respect to the object can be suppressed.
  • the ECU 20 acquires the object width WO indicating the size of the object in the horizontal direction, and based on the amount of overlap (RR) in the horizontal direction between the acquired object width WO and the determination region Wcd set in front of the vehicle. Change the operating conditions of the PCS. Then, when the object is located in the vicinity area NA, the ECU 20 maintains the object width in a state where the object is detected by the image information and the radar information. With the above configuration, it is possible to change the operating conditions of the PCS by a simpler method.
  • the ECU 20 determines that the object is not located in the vicinity area NA if the position Pr of the object acquired from the radar information is outside the preset vicinity area NA.
  • the TTC is the margin time until the object collides with the host vehicle CS.
  • the ECU 20 maintains the PCS operating condition when the object is located in the vicinity area NA on condition that the relative speed Vr of the object is equal to or less than a predetermined value.
  • FIG. 9 is a flowchart showing the process performed in step S19 of FIG. 5 in the second embodiment.
  • step S41 it is determined whether or not the host vehicle CS is traveling straight ahead. For example, based on the output from the turning motion detection sensor 34, it is determined whether the host vehicle CS is traveling straight or turning right or left. If the host vehicle CS is not traveling straight ahead (step S41: NO), the object width WO is reduced in step S43.
  • step S42 it is determined whether or not the radar target RT located in the vicinity area NA is traveling straight.
  • FIG. 10 is a diagram for explaining the movement of an object in the vicinity area NA. As illustrated in FIG. 10A, if an object detected in front of the host vehicle CS is traveling straight in the vehicle traveling direction (Y-axis direction), the possibility that the object and the host vehicle CS collide with each other increases. In such a case, it is preferable to maintain the object width WO.
  • step S42 the radar information is used to detect a change in the position of the radar target RT, and based on this change in position, it is determined whether the radar target RT is going straight or turning right or left. In addition to this, when it is detected using the radar information that the lateral position has changed after the vehicle speed of the radar target RT has decreased, it may be determined that the radar target RT has turned right or left. Therefore, step S42 functions as a movement determination unit.
  • step S42 If the radar target RT is traveling straight (step S42: YES), the processing of FIG. 9 is temporarily terminated without reducing the object width WO. Therefore, the operating conditions of the PCS are maintained. On the other hand, if the radar target RT is not traveling straight and the object is turning left or right (step S42: NO), in step S43, the operating condition is changed by reducing the object width WO. Therefore, in step S20 of FIG. 5, the collision determination with the object is performed using the maintained or reduced object width WO.
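A sketch of the second-embodiment decision (steps S41 to S43) is given below; judging straight travel from the yaw-rate magnitude and from the spread of recent lateral positions is only one way the determinations could be realized, and all thresholds are placeholders.

```python
def decide_width_second_embodiment(width_prev: float,
                                   host_yaw_rate: float,
                                   target_lateral_positions: list,
                                   yaw_threshold: float = 0.02,
                                   lateral_threshold: float = 0.3,
                                   shrink: float = 0.5) -> float:
    """Second-embodiment sketch of step S19: keep the object width WO only when both the
    host vehicle (step S41, small yaw rate) and the radar target in the vicinity area NA
    (step S42, little lateral drift over recent cycles) are judged to be going straight.
    Thresholds and the shrink factor are placeholders."""
    host_straight = abs(host_yaw_rate) < yaw_threshold                     # step S41
    drift = max(target_lateral_positions) - min(target_lateral_positions)
    target_straight = drift < lateral_threshold                            # step S42
    if host_straight and target_straight:
        return width_prev          # maintain the PCS operating condition
    return width_prev * shrink     # step S43: reduce the object width

print(decide_width_second_embodiment(1.8, 0.0, [0.10, 0.12, 0.11]))   # 1.8 (maintained)
print(decide_width_second_embodiment(1.8, 0.0, [0.10, 0.45, 0.80]))   # 0.9 (reduced)
```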
  • the area set as the neighborhood area NA is different from that in the first and second embodiments.
  • FIG. 11 is a diagram illustrating the position of the neighborhood area NA in the third embodiment.
  • the radar area RA of the radar sensor 32 is set to be wider than the imaging area CA of the image sensor 31.
  • the neighboring area NA is an area (VA to BA) that extends from the angle of view VA of the image sensor 31 by the boundary angle BA in the horizontal direction.
  • the neighborhood area NA is an area where the first information can be acquired for the object but the second information cannot be acquired. Therefore, in the present embodiment, the neighborhood area NA is set as an area outside the imaging area CA of the image sensor 31 and inside the radar area RA of the radar sensor 32.
  • FIG. 12 is a flowchart showing the process performed in step S14 of FIG. 5 in the third embodiment.
  • the process shown in FIG. 12 is a process for recording the center position in the lateral direction of the object, the object width WO, and the relative velocity in the lateral direction when detecting radar information and image information from the object.
  • the process of FIG. 12 performed last time by the ECU 20 is described as the previous process, and the process of FIG. 12 performed this time is described as the current process.
  • step S51 the distance of the object detected as radar information and the orientation of the object detected as image information are fused.
  • step S52 the center position of the object in the lateral direction is calculated based on the image information acquired in step S11 of FIG.
  • the center positions of the left and right lateral positions Xr and Xl included in the image information are calculated as the center position in the lateral direction of the object.
  • Step S52 functions as a center position calculation unit.
  • the object width WO is calculated.
  • the object width WO is calculated using the left and right lateral positions Xr, Xl of the object included in the image information.
  • the object width WO may be calculated as follows.
  • the object width may be calculated using the image width angle and the distance from the host vehicle to the object.
  • step S54 the maximum value of the object width WO is updated.
  • the object width WO calculated in the current process is compared with the object width WO held from the previous process, and the larger of the two is recorded as the object width WO. Therefore, steps S53 and S54 function as an object width calculation unit.
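The alternative width calculation from the image width angle and the distance, together with the maximum-value update of step S54, might look like the sketch below; the flat-front approximation and the example angles are assumptions.

```python
import math

def width_from_angles(theta_left: float, theta_right: float, distance_y: float) -> float:
    """Object width estimated from the azimuth angles of the object's left and right edges
    (the image width angle) and the distance Yd to the object; a flat-front approximation
    used only to illustrate the alternative mentioned above."""
    return abs(distance_y * (math.tan(theta_left) - math.tan(theta_right)))

def update_max_width(width_now: float, width_held=None) -> float:
    """Step S54 sketch: record whichever is larger, the newly calculated width or the held width."""
    return width_now if width_held is None else max(width_now, width_held)

w = width_from_angles(math.radians(3.0), math.radians(-2.0), distance_y=20.0)
print(round(w, 2), round(update_max_width(w, 1.6), 2))  # about 1.75 1.75
```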
  • step S55 a relative speed based on the own vehicle in the lateral direction of the object is calculated.
  • the relative speed in the lateral direction is calculated from the difference in lateral position between the fusion information generated in step S51 of the previous process and the fusion information generated in step S51 of the current process. Therefore, step S55 functions as a lateral speed calculation unit.
  • step S61 the predicted center position in the lateral direction of the object at the current time is calculated based on the center position calculated in the past from the current time and the lateral speed of the object.
  • specifically, the predicted center position of the object width at the current time is calculated based on the center position calculated from the image information in step S52 and the lateral speed of the object calculated in step S55.
  • a distance corresponding to the lateral relative speed of the object is added to the center position M of the object width calculated from the image information held in step S52.
  • in this way, the predicted center position Mp of the current center position of the object is calculated.
  • Step S61 functions as a position prediction unit.
  • in step S61, in addition to calculating the predicted center position Mp based on the center position recorded in the past and the lateral speed of the object according to the image information, the predicted center position Mp may be calculated based on the center position recorded in the past and the lateral speed of the object according to the radar information.
  • step S62 the left and right end angles of the left and right lateral positions of the object at the current time are calculated based on the predicted center position calculated in step S61 and the object width WO held in step S53.
  • the left and right end angles indicate the azimuth angles of the left and right lateral positions of the current object with reference to the host vehicle.
  • positions extending in the lateral direction by the object width WO updated in step S54, centered on the predicted center position Mp calculated in step S61, are set as the left and right lateral positions of the object, and the predicted lateral positions Xpr and Xpl are calculated.
  • from these, the left and right end angles indicating the azimuth angles of the left and right lateral positions of the current object are calculated.
  • the left and right end angles are positive when the angle increases to the right with reference to the imaging axis, and negative when the angle increases to the left with respect to the imaging axis.
  • Xpl is the predicted lateral position X1 on the side closer to the host vehicle
  • Xpr is the predicted lateral position X2 on the side farther from the host vehicle
  • Yd represents the distance from the host vehicle to the object. In this embodiment, the distance from the host vehicle to the object included in the fusion information is used.
  • step S62 functions as an azimuth angle calculation unit.
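Steps S61 and S62 can be pictured as in the sketch below; splitting the object width WO symmetrically about the predicted center position Mp and the rightward-positive angle convention are assumptions made for illustration.

```python
import math

def predict_center(center_prev: float, lateral_speed: float, dt: float) -> float:
    """Step S61 sketch: predicted center position Mp = last known lateral center M plus
    the lateral relative speed multiplied by the time elapsed since the last fusion."""
    return center_prev + lateral_speed * dt

def end_angles_deg(center_pred: float, width: float, distance_y: float) -> tuple:
    """Step S62 sketch: azimuth angles of the predicted lateral positions Xpl = Mp - WO/2
    and Xpr = Mp + WO/2 at distance Yd, measured from the imaging axis (rightward positive,
    leftward negative). The symmetric WO/2 split about Mp is an assumption."""
    xpl = center_pred - width / 2.0
    xpr = center_pred + width / 2.0
    angle = lambda x: math.degrees(math.atan2(x, distance_y))
    return angle(xpl), angle(xpr)

mp = predict_center(center_prev=1.2, lateral_speed=0.8, dt=0.5)   # Mp = 1.6 m
print(end_angles_deg(mp, width=1.8, distance_y=6.0))              # left and right end angles
```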
  • in steps S63 to S66, it is determined whether the object is located in the vicinity region based on the left and right end angles calculated in step S62.
  • specifically, a part-off reliability indicating the certainty that a part of the object is located in the vicinity region is calculated. Therefore, in the present embodiment, steps S63 to S66 function as a position determination unit.
  • FIG. 17 is a diagram for explaining the part-off reliability, and is a graph in which the horizontal axis is the absolute value of the left or right end angle θf and the vertical axis is the part-off reliability RV.
  • the part-off reliability RV is an evaluation value for determining whether or not the object is located in the vicinity region, based on the left or right end angle on the side laterally farther from the host vehicle.
  • the part-off reliability is defined by a value from 0 to 100, and the probability that a part of the object is located in the vicinity region increases as the part-off reliability increases.
  • the value of the part-off reliability is defined so as to increase nonlinearly as the left or right end angle θf on the side farther from the host vehicle increases.
  • the reference angle B indicates the absolute value of the angle from the imaging axis to the angle of view.
  • when the left or right end angle θf equals the reference angle B, the end on the side farther from the host vehicle, of the left and right lateral positions of the object, is located at the edge of the angle of view of the image sensor 31.
  • the part-off reliability RV is set so that the rate of increase in the center range MR, which is the range of the predetermined angles R1 to R2 with respect to the reference angle B, is larger than the rates of increase at or below the lower limit angle R1 and at or above the upper limit angle R2.
  • step S63 the part-off reliability is calculated based on the left and right end angles.
  • the part-off reliability is calculated for the left or right end angle on the side farther from the host vehicle. For example, a map defining the relationship between the left and right end angles and the part-off reliability shown in FIG. 17 is stored, and the part-off reliability corresponding to the left or right end angle calculated in step S62 is obtained by referring to this map.
  • in step S64, the part-off reliability calculated in step S63 is evaluated. In the present embodiment, it is determined whether or not a part of the object is located in the vicinity region by comparing the part-off reliability with the threshold Th11.
  • the threshold value Th11 is determined based on a reference angle B indicating the absolute value of the angle from the imaging axis to the angle of view.
  • step S65 the establishment flag indicating that a part of the object is located in the vicinity region is set to true.
  • step S66 the establishment flag indicating that the object is located in the vicinity region is set to false. When the establishment flag is false, it is determined that the object is not located in the vicinity region.
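A possible shape for the FIG. 17 map and the threshold comparison of steps S63 to S66 is sketched below; the piecewise form, the reference angle B of 30 degrees, the center-range half-width, and the threshold Th11 = 60 are all placeholders chosen only to reproduce the described behavior of a steeper rise in the center range MR.

```python
def part_off_reliability(end_angle_deg: float,
                         reference_angle_deg: float = 30.0,
                         center_halfwidth_deg: float = 5.0) -> float:
    """Sketch of the FIG. 17 map: the reliability RV (0..100) grows slowly while the
    far-side end angle is well inside the angle of view, grows steeply in the center
    range MR (R1..R2) around the reference angle B (the angle-of-view edge), and
    saturates beyond it. All numeric values are placeholders."""
    a = abs(end_angle_deg)
    r1 = reference_angle_deg - center_halfwidth_deg   # lower limit angle R1
    r2 = reference_angle_deg + center_halfwidth_deg   # upper limit angle R2
    if a <= r1:
        return 20.0 * a / r1                          # gentle rise below R1
    if a >= r2:
        return 100.0                                  # saturated above R2
    return 20.0 + 80.0 * (a - r1) / (r2 - r1)         # steep rise in the center range MR

def in_vicinity_region(end_angle_deg: float, th11: float = 60.0) -> bool:
    """Steps S63 to S66 sketch: set the establishment flag when the reliability reaches Th11."""
    return part_off_reliability(end_angle_deg) >= th11

print(part_off_reliability(10.0), in_vicinity_region(10.0))   # low reliability, flag false
print(part_off_reliability(31.0), in_vicinity_region(31.0))   # high reliability, flag true
```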
  • when the establishment flag is true, it is determined that the object is located in the vicinity region (step S18: YES).
  • the operating conditions of the PCS are maintained in step S19.
  • the object width WO is maintained at the object width WO when the fusion is established.
  • the establishment flag is false, it is determined that the object is not located in the vicinity region (step S18: NO), and the PCS operating condition is changed in step S20.
  • the object width WO is reduced more than the object width WO when the fusion is established.
  • when the object changes from the state of being detected by the radar information and the image information to the state of being detected only by the radar information, the ECU 20 calculates the predicted center position of the object at the current time based on the lateral center position of the object detected in the past and the lateral speed of the object. Further, based on the calculated predicted center position and the object width, the left and right end angles indicating the azimuth angles of the left and right lateral positions of the current object with respect to the host vehicle are calculated. Then, the ECU 20 determines whether or not the object is located in the vicinity region based on the calculated left and right end angles. In this case, even if the vicinity region is a region that extends laterally outward by a predetermined angle from the angle of view of the image sensor 31, it can be properly determined whether or not the object is located in this vicinity region.
  • the ECU 20 calculates a part-off reliability indicating the certainty that the object is located in the vicinity region, and when the calculated part-off reliability is equal to or greater than a threshold, determines that the object is located in the vicinity region.
  • the part-off reliability is set so as to increase nonlinearly as the left or right end angle of the object on the side farther from the host vehicle increases, and so that the rate of increase in the center range, a range within a predetermined angle of the angle of view, is higher than in other ranges.
  • by making the rate of increase in the center range larger than in the other ranges, the part-off reliability rises easily once the left or right end angle exceeds the lower limit of the center range. As a result, in determining whether or not the object is located in the vicinity region, the influence of lateral mounting error of the image sensor 31 with respect to the host vehicle and of calculation error in the left and right end angles can be reduced.
  • as another example, the subsequent motion of the object may be predicted using characteristics of the object acquired when the fusion target FT was established.
  • for example, when the ECU 20 detects, from the image information constituting the fusion information, blinking of a direction indicator lamp such as a turn signal as a feature indicating a right or left turn, and the fusion target FT is not detected in the next processing, the past object width WO may be reduced.
  • likewise, when it is detected in step S14 of FIG. 5, based on the captured image, that the object straddles a white line at an intersection, and the fusion target FT is not detected in the next processing (step S13: NO), the past object width WO may be reduced.
  • in step S14 of FIG. 5, the ECU 20 may also record the lateral position of the object acquired from the image information constituting the fusion information; when the fusion target FT is not detected in the next process (step S13: NO), whether or not the object is located in the neighborhood area NA is determined based on the lateral position acquired in step S14. Specifically, if the recorded lateral position of the object is inside the lateral range of the neighborhood area NA, it is determined that the current object is located in the neighborhood area NA.
  • the PCS operation condition may be changed based on the position of the object in the vicinity area NA.
  • the ECU 20 maintains the PCS operating condition if the relative distance Dr between the host vehicle CS and the object is equal to or greater than a threshold value. If the relative speed Vr is the same, the TTC increases as the relative distance Dr increases. Therefore, when the relative distance Dr after the radar target RT has entered the vicinity area NA is large, there is a high possibility that the PCS is not performed compared to the case where the relative distance Dr is small. Therefore, when the relative distance Dr is greater than or equal to a predetermined value in step S19, the ECU 20 makes the PCS easier to operate by maintaining the object width WO.
  • the PCSS 100 may include the ECU 20 and the image sensor 31 as an integrated device instead of the configuration including the ECU 20 and the image sensor 31 individually. In this case, the ECU 20 described above is provided inside the image sensor 31.
  • the PCSS 100 may include a laser sensor that uses laser light as a transmission wave instead of the radar sensor 32.
  • the PCS operating condition may be maintained.

Abstract

An ECU (20) carries out a collision avoidance control to avoid a collision with an object on the basis of first information which is a result of a detection of the object based on reflected waves which correspond to transmitted waves, and/or of second information which is a result of a detection of the object based on a captured image of the view ahead of a vehicle which is captured with an image capture means. If a state in which the object is detected by the first information and the second information has transitioned to a state in which the object is detected by only the first information, the ECU (20) assesses whether the object is located in a proximity region which is predetermined as a region ahead of the vehicle in which the second information cannot be acquired. If it is assessed that the object is located in the proximity region, the ECU (20) maintains an operation condition of the collision avoidance control from the state in which the object is detected by the first information and the second information.

Description

 Vehicle control device and vehicle control method
 Cross-reference of related applications
 This application is based on Japanese Patent Application No. 2016-100809 filed on May 19, 2016 and Japanese Patent Application No. 2016-225193 filed on November 18, 2016, the contents of which are incorporated herein by reference.
 The present disclosure relates to a vehicle control device and a vehicle control method for detecting an object positioned in front of a vehicle.
 A technique is known that synthesizes the detection result of an object based on a reflected wave corresponding to a transmission wave with the detection result of the object acquired by an image sensor, and generates new information (a fusion target) for the object. The generated fusion target can improve the recognition accuracy of the object ahead of the vehicle. Further, by using the position information and the object width of the object specified from this information, the collision avoidance control of the vehicle can be performed appropriately when avoiding a collision with the object.
 It is known that the detection result of an object acquired by the image sensor is unstable compared with the detection result of the object based on the reflected wave. For example, the image sensor may fail to detect an object existing in front of the vehicle when the surroundings of the vehicle are dark. Hereinafter, a state in which an object cannot be detected by the image sensor is referred to as image loss. Patent Document 1 discloses a vehicle control device that continues collision avoidance control based on the detection result of an object by a radar sensor when image loss occurs after a fusion target has been generated. In this vehicle control device, the detection accuracy of the object decreases after the image is lost, so the collision avoidance control is made difficult to operate.
 JP 2007-226680 A
 Incidentally, image loss may be caused by the proximity of the object to the vehicle, in addition to image loss due to the brightness around the vehicle. Specifically, when the object and the vehicle come close to each other, the object deviates from the angle of view of the image sensor, the image sensor can no longer detect the object properly, and image loss occurs. If the collision avoidance control is made difficult to operate when image loss occurs due to the proximity of the object to the vehicle, there is a high possibility that the collision avoidance control will be delayed or will not operate.
 The present disclosure has been made in view of the above problems, and an object of the present disclosure is to provide a vehicle control device and a vehicle control method capable of suppressing operation delay and non-operation of collision avoidance control.
 The present disclosure provides a vehicle control device that detects an object using first information, which is a detection result of the object based on a reflected wave corresponding to a transmission wave, and second information, which is a detection result of the object based on a captured image obtained by imaging the area ahead of a vehicle with an imaging unit, the vehicle control device including: a control unit that performs collision avoidance control for avoiding a collision with the object based on at least one of the first information and the second information; a position determination unit that, when the object transitions from a state of being detected by the first information and the second information to a state of being detected only by the first information, determines whether or not the object is located in a vicinity region defined in advance as a region ahead of the vehicle in which the second information cannot be acquired; and a maintaining unit that maintains the operating condition of the collision avoidance control from the state in which the object was detected by the first information and the second information when the position determination unit determines that the object is located in the vicinity region.
 In the disclosure configured as described above, when the object transitions from the state of being detected by the first information and the second information to the state of being detected only by the first information, the operating condition of the collision avoidance control is maintained if the position of the object is within the vicinity region ahead of the vehicle. With this configuration, even when image loss occurs because the object has entered the vicinity region, operation delay or non-operation of the collision avoidance control for the object can be suppressed.
 The above and other objects, features, and advantages of the present disclosure will become more apparent from the following detailed description with reference to the accompanying drawings. In the drawings:
 FIG. 1 is a block diagram of the PCSS;
 FIG. 2A is a diagram illustrating the position of an object detected by the image sensor and the radar sensor;
 FIG. 2B is a diagram illustrating the position of an object detected by the image sensor and the radar sensor;
 FIG. 3 is a diagram for explaining the PCS;
 FIG. 4A is a diagram for explaining factors that cause image loss;
 FIG. 4B is a diagram for explaining factors that cause image loss;
 FIG. 5 is a flowchart for explaining the PCS;
 FIG. 6 is a diagram illustrating the neighborhood area NA;
 FIG. 7 is a diagram for explaining the detailed processing performed in step S19 of FIG. 5;
 FIG. 8A is a diagram for explaining changes in the operability of the PCS;
 FIG. 8B is a diagram for explaining changes in the operability of the PCS;
 FIG. 8C is a diagram for explaining changes in the operability of the PCS;
 FIG. 9 is a flowchart showing the processing performed in step S19 in the second embodiment;
 FIG. 10A is a diagram for explaining the movement of an object in the vicinity area NA;
 FIG. 10B is a diagram for explaining the movement of an object in the vicinity area NA;
 FIG. 11 is a diagram for explaining the position of the neighborhood region in the third embodiment;
 FIG. 12 is a flowchart showing the processing performed in step S14 of FIG. 5 in the third embodiment;
 FIG. 13 is a diagram for explaining the center position and the object width;
 FIG. 14 is a flowchart showing the processing performed in step S18 of FIG. 5 in the third embodiment;
 FIG. 15 is a diagram for explaining the predicted value of the center position;
 FIG. 16 is a diagram for explaining the left and right end angles; and
 FIG. 17 is a diagram for explaining the part-off reliability.
 Hereinafter, embodiments will be described with reference to the drawings. In the following embodiments, parts that are identical or equivalent to each other are denoted by the same reference numerals in the drawings, and the description of parts having the same reference numerals applies to each of them.
(First embodiment)
The vehicle control device and the vehicle control method according to the present embodiment are mounted on a vehicle (host vehicle CS) and are realized by a PCSS (Pre-crash safety system) that performs various controls to detect an object existing in front of the host vehicle CS and to avoid or mitigate a collision with the object. In FIG. 1, the PCSS 100 includes a driving assistance ECU 20 (hereinafter referred to as the ECU 20), various sensors 30, and controlled objects 40. In FIG. 1, the ECU 20 functions as a vehicle control device.
 The various sensors 30 are connected to the ECU 20 and output detection results regarding objects and the host vehicle CS to the ECU 20. In FIG. 1, the PCSS 100 includes, as the various sensors 30, an image sensor 31, a radar sensor 32, a vehicle speed sensor 33, and a turning motion detection sensor 34.
 The image sensor 31 is a CCD camera, a monocular camera, a stereo camera, or the like, and is installed near the upper end of the windshield of the host vehicle CS. The image sensor 31 captures an image of an area that extends in a predetermined range ahead of the host vehicle CS at predetermined time intervals to acquire a captured image. By processing the captured image, the position and azimuth of an object ahead of the host vehicle CS are acquired as image information and output to the ECU 20. Hereinafter, an object whose image information is detected by the image sensor 31 is also referred to as an image target IT. In this embodiment, the image sensor 31 functions as an imaging unit.
 As shown in FIG. 2A, the image information includes the position of the image target IT on coordinates specified by the vehicle traveling direction (Y axis) and the lateral direction (X axis), with the host vehicle CS as the reference position. In FIG. 2A, the image information includes the left and right lateral positions Xr and Xl of the image target IT in the lateral direction (X axis) and the azimuth angle θc indicating the azimuth from the host vehicle CS to the object Ob. The ECU 20 can calculate the object width WO from the lateral positions Xr and Xl of the image target IT.
The radar sensor 32 acquires radar information, which is the detection result of an object based on a reflected wave corresponding to a transmission wave. The radar sensor 32 is attached to the front of the host vehicle CS so that its optical axis faces the front of the vehicle (Y-axis direction). It scans the area ahead of the vehicle by transmitting a transmission wave toward the front of the vehicle and receives the reflected wave returned from the surface of an object. From this reflected wave, it generates radar information indicating the distance to the object, the relative speed with respect to the object, and the like. A directional electromagnetic wave such as a millimeter wave can be used as the transmission wave.
As shown in FIG. 2B, the radar information includes the position of the radar target RT in the vehicle traveling direction (Y axis) relative to the host vehicle CS and the azimuth angle θr from the host vehicle CS to the radar target RT. Based on the position of the radar target RT in the vehicle traveling direction (Y axis), the ECU 20 can acquire the relative distance Dr, which is the distance along the Y axis from the host vehicle CS to the radar target RT, and, from this relative distance Dr, the relative speed Vr of the radar target RT with respect to the host vehicle CS. In this embodiment, the radar information functions as the first information, and the image information functions as the second information.
The vehicle speed sensor 33 is provided on a rotating shaft that transmits power to the wheels of the host vehicle CS, and calculates the vehicle speed, that is, the speed of the host vehicle CS, based on the rotation speed of the rotating shaft.
The turning motion detection sensor 34 detects the turning angular velocity of the host vehicle CS relative to the vehicle traveling direction. For example, the turning motion detection sensor 34 is configured by a yaw rate sensor that detects the turning angular velocity of the host vehicle CS or a steering angle sensor that detects the steering angle of a steering device (not shown). Based on the output from the turning motion detection sensor 34, the ECU 20 can determine whether or not the host vehicle CS is making a turning motion.
The ECU 20 is configured by a known microcomputer and includes a CPU, a ROM, a RAM, and the like. The ECU 20 functions as a position acquisition unit 21, a control unit 22, a position determination unit 23, and a maintenance unit 24 by executing a program stored in the ROM. First, the PCS (collision avoidance control) performed by the ECU 20 will be described.
The position acquisition unit 21 acquires position information of an object ahead of the host vehicle from image information, which is the detection result of the object by the image sensor 31, or from radar information, which is the detection result of the object by the radar sensor 32.
When the control unit 22 has acquired both image information and radar information for the same object, it fuses the image information and the radar information to generate fusion information, which is new position information for that object. For example, fusion information is generated in which the relative distance Dr based on the radar information is used as the position of the object in the traveling direction of the host vehicle CS (on the Y axis), and the lateral position based on the image information is used as the position of the object in the lateral direction (on the X axis) and as the object width WO. When fusion information is generated for an object in this way, the information for the object is produced from whichever of the data acquired by the radar sensor 32 and the image sensor 31 has the higher accuracy, so the recognition accuracy of the object can be improved. Hereinafter, an object for which fusion information is generated is referred to as a fusion target FT.
The control unit 22 then determines the possibility of a collision between the host vehicle CS and an object whose position information has been detected, and controls the operation of the PCS based on the determined collision possibility. For example, the control unit 22 determines the possibility that the host vehicle CS and the object will collide based on the lap rate RR, which is the ratio by which the object width WO overlaps the determination region Wcd in the X-axis direction, and controls the operation of the PCS based on this determination result. Here, the determination region Wcd is a region virtually set in front of the host vehicle CS.
When the lap rate RR is equal to or greater than a predetermined value, the time to collision TTC for the object is calculated, as shown in FIG. 3, by a method such as dividing the relative distance Dr by the relative speed Vr with respect to the object. The TTC is an evaluation value indicating after how many seconds the vehicle would collide with the object if it continued traveling at its current speed; the smaller the TTC, the higher the risk of collision, and the larger the TTC, the lower the risk of collision. In FIG. 3, the values decrease in the order of TTC1, TTC2, TTC3. The control unit 22 compares the calculated current TTC with the TTC set for each controlled object 40, and when a corresponding controlled object 40 exists, activates that controlled object 40.
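The collision judgment described above can be summarized by the following minimal sketch, which is not part of the patent text; all function names, variable names, and the example activation timings are illustrative assumptions. It shows the lap rate RR being compared with a threshold, the TTC being obtained from the relative distance and relative speed, and each controlled object whose activation timing has been reached being selected.

def lap_rate(obj_left, obj_right, region_left, region_right):
    # Ratio of the object width WO that overlaps the determination region Wcd (X axis).
    overlap = max(0.0, min(obj_right, region_right) - max(obj_left, region_left))
    width = obj_right - obj_left
    return overlap / width if width > 0 else 0.0

def select_actuations(relative_distance, relative_speed, rr, rr_threshold, actuation_ttc):
    # actuation_ttc: mapping of controlled object name -> activation timing [s],
    # e.g. {"alarm": 3.0, "seatbelt": 2.0, "brake": 1.0} (illustrative values only).
    if rr < rr_threshold or relative_speed <= 0.0:
        return []
    ttc = relative_distance / relative_speed  # time to collision [s]
    return [name for name, timing in actuation_ttc.items() if ttc <= timing]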
The PCSS 100 shown in FIG. 1 includes, as the controlled objects 40, an alarm device 41, a seat belt device 42, and a brake device 43, and a predetermined activation timing (TTC) is set for each controlled object 40. The ECU 20 therefore compares the TTC with the activation timing of each controlled object 40, and activates a controlled object 40 when the TTC corresponds to its activation timing.
For example, when the TTC reaches the activation timing of the alarm device 41, the alarm device 41 is activated to issue a warning to the driver. When the TTC reaches the activation timing of the seat belt device 42, control is performed to wind up the seat belt device 42. When the TTC reaches the activation timing of the brake device 43, the automatic brake is activated to reduce the collision speed. In this way, a collision between the host vehicle CS and the object is avoided or mitigated.
After generating fusion information, if the image information is no longer detected and image lost occurs, the control unit 22 changes the PCS operating conditions under certain conditions. An object that has once been recognized as a fusion target FT has a high reliability of actually existing ahead. Therefore, even after the object is no longer recognized as a fusion target FT, it is preferable that the PCS for that object be continued rather than the object being excluded from the PCS targets. Accordingly, when image lost occurs after fusion information has been generated, the PCSS 100 continues the PCS based on the radar information and the past object width WO. On the other hand, once image lost occurs, the image target IT can no longer be obtained from the image sensor 31, and a new object width WO cannot be acquired. Therefore, after image lost, the object width WO recorded while the fusion target FT was being detected is used, and this object width WO is reduced so that, under certain conditions, the PCS becomes less likely to be activated, thereby coping with the reduced detection accuracy.
As shown in FIG. 4A, the object detection area of the image sensor 31 (referred to as the imaging area CA) is narrower in the vehicle traveling direction (Y-axis direction) than the object detection area of the radar sensor 32 (referred to as the radar area RA). Therefore, when an object is located ahead of the host vehicle CS at a position where the imaging area CA and the radar area RA overlap, it can be detected as a fusion target FT. On the other hand, when the object is located farther away than, or closer than, the imaging area CA in the Y-axis direction, image lost occurs. Hereinafter, the area closer to the vehicle than the imaging area CA in the vehicle traveling direction is also referred to as the neighborhood area NA.
As shown in FIG. 4B, when an object is located in the neighborhood area NA, the lower end of the rear portion of the object falls outside the angle of view θ1 of the image sensor 31, so the image sensor 31 can no longer identify the type of the image target IT and image lost occurs. Here, the neighborhood area NA is a position close to the front of the host vehicle CS; if the operating conditions were changed so that the PCS becomes less likely to operate when image information is lost in this neighborhood area NA, this would cause delay or non-activation of the PCS. Therefore, when image lost occurs because the object has entered this neighborhood area NA, the control unit 22 maintains the PCS operating conditions for that object.
When the state transitions from one in which fusion information is generated for an object to one in which the object is detected only by the radar information, the position determination unit 23 determines whether or not the object is located in the neighborhood area NA. As shown in FIG. 4B, the extent of the neighborhood area NA in the Y-axis direction and the X-axis direction is set in advance based on the angle of view θ1 of the image sensor 31 in the vertical direction. For example, the range of the neighborhood area NA is set based on the relationship between the vertical angle of view θ1 of the image sensor 31 and the position in the height direction at which the image sensor 31 is attached to the host vehicle CS.
When the position determination unit 23 determines that the object is located in the neighborhood area NA, the maintenance unit 24 maintains the PCS operating conditions as they were while the fusion target FT was being detected. In this embodiment, the maintenance unit 24 maintains the PCS operating conditions by not causing the control unit 22 to reduce the object width WO in the lateral direction (X-axis direction).
Next, the PCS executed by the ECU 20 will be described with reference to the flowchart of FIG. 5. The process shown in FIG. 5 is performed by the ECU 20 at a predetermined cycle.
In step S11, image information is acquired based on the output from the image sensor 31. In step S12, radar information is acquired based on the output from the radar sensor 32.
In step S13, it is determined whether or not a fusion target FT is detected. When an object is detected by both the image information and the radar information and the image target IT and the radar target RT are determined to be the same object, the process proceeds to step S14. For example, if the difference between the position of the image target IT based on the image information acquired in step S11 and the position of the radar target RT based on the radar information acquired in step S12 is equal to or less than a predetermined distance, the image target IT and the radar target RT are determined to be the same object (fusion target FT). On the other hand, when image information or radar information has not been acquired, or when the difference between the position of the image target IT and the position of the radar target RT exceeds the predetermined distance, the image target IT and the radar target RT are determined to be different objects.
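As an illustration of the same-object judgment in step S13, the following minimal sketch (names and the distance metric are assumptions, not the patent's wording) treats an image target and a radar target as the same object when their positions differ by no more than a predetermined distance.

import math

def is_same_object(image_pos, radar_pos, distance_threshold):
    # image_pos / radar_pos: (x, y) positions on the vehicle coordinate system.
    dx = image_pos[0] - radar_pos[0]
    dy = image_pos[1] - radar_pos[1]
    return math.hypot(dx, dy) <= distance_threshold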
In step S14, the image information acquired in step S11 and the radar information acquired in step S12 are combined to generate fusion information, which is the position information for the fusion target FT. The fusion information includes the object width WO in addition to the position of the object.
In step S15, the detection count DN, which is the number of times the fusion target FT has been detected, is recorded. The detection count DN is information indicating the number of times the same type of fusion target FT has been detected consecutively. In this embodiment, the detection count DN is incremented each time the same type of fusion target FT is detected in step S13.
In step S21, a collision determination with respect to the object is performed. The description first assumes that the fusion target FT has been detected in step S13. The collision determination between the object and the host vehicle CS is performed using the lap rate RR between the object width WO included in the fusion information calculated in step S14 and the determination region Wcd.
In step S22, it is determined whether or not to perform the PCS. When it is determined in step S21 that there is a possibility of a collision with the object, the TTC is calculated by dividing the relative distance Dr of the object by the relative speed Vr, and the calculated TTC is compared with the TTC set for each controlled object 40 to determine whether or not to perform each operation. When the PCS is to be performed (step S22: YES), the corresponding PCS operation is performed in step S23. On the other hand, when the corresponding PCS operation is not to be performed (step S22: NO), the process shown in FIG. 5 is once terminated. Steps S21 to S23 function as the control step.
On the other hand, when the fusion target FT is not detected in step S13, it is determined in step S16 whether or not fusion target detection was previously established for the same object. For example, by referring to the detection count DN, it is determined whether or not the fusion target FT was detected in past processing. When the fusion target FT has not been detected in past processing (step S16: NO), the process shown in FIG. 5 is once terminated.
When the fusion target FT has been detected for the same object (step S16: YES), it is determined in step S17 whether or not the radar target RT is still being detected. Even if image lost has occurred, if the position of the object can still be detected by the radar sensor 32, the object exists within the radar area RA. If the radar target RT is not detected (step S17: NO), it is concluded that no object exists ahead of the host vehicle, and the process of FIG. 5 is once terminated; the object is thus excluded from the PCS targets. On the other hand, when the radar target RT is detected (step S17: YES), it is determined that image lost has occurred, and the process proceeds to step S18.
In step S18, it is determined whether or not the radar target RT is located in the neighborhood area NA. In this embodiment, the neighborhood area NA is set as an area demarcated in the vehicle traveling direction (Y-axis direction) and the lateral direction (X-axis direction). Based on the radar information acquired in step S12, it is determined whether or not the position of the radar target RT lies within the area defined as the neighborhood area NA. Step S18 functions as the position determination step.
As shown in FIG. 6, a predetermined lateral distance from the center of the host vehicle CS is defined as the lateral boundary line BD of the neighborhood area NA. The extent of the neighborhood area NA in the vehicle traveling direction is determined based on the angle of view of the image sensor 31. When the lateral position of the radar target RT position Pr lies inside the lateral range of the neighborhood area NA demarcated by the boundary line BD, the object is determined to be located in the neighborhood area NA. On the other hand, when the position Pr lies laterally outside the defined neighborhood area NA, the object is determined not to be located in the neighborhood area NA. In this embodiment, the boundary line BD is a fixed value determined in advance based on the imaging area CA of the image sensor 31. Alternatively, the boundary line BD may be shifted in the lateral direction according to the type of the object.
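A minimal sketch of the step S18 judgment follows, assuming the neighborhood area NA is bounded by a longitudinal depth derived from the camera's angle of view and by the lateral boundary BD; the names and the exact boundary handling are illustrative assumptions.

def in_neighborhood_area(x_r, y_r, na_depth_y, boundary_bd):
    # x_r, y_r: radar target position (lateral X, traveling direction Y).
    # na_depth_y: near-side extent of NA derived from the camera's angle of view.
    # boundary_bd: lateral distance from the vehicle center defining NA.
    return 0.0 <= y_r <= na_depth_y and abs(x_r) <= boundary_bd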
When the radar target RT is not located in the neighborhood area NA (step S18: NO), the PCS operating conditions are changed in step S20 by reducing the object width WO. In this case, the object Ob is highly likely to be located beyond the imaging area CA in the vehicle traveling direction, so the possibility that the object Ob and the host vehicle CS will collide is low. Priority is therefore given to the low reliability of the previously acquired object width WO, and this object width WO is reduced in the lateral direction. That is, in this embodiment, the lap rate RR derived from the object width WO serves as the PCS operating condition. Then, in step S21, the collision determination is performed using the reduced object width WO. As a result, the lap rate RR decreases and the PCS becomes less likely to be activated.
On the other hand, when the radar target RT is located in the neighborhood area NA (step S18: YES), the process proceeds to step S19. In step S19, whether to change or to maintain the PCS operating conditions is decided by evaluating the possibility of a collision between the object Ob and the host vehicle CS according to various conditions.
Next, the detailed processing performed in step S19 of FIG. 5 will be described with reference to FIG. 7. In the process shown in FIG. 7, when all the conditions of steps S31 and S32 are satisfied, it is determined that there is a high possibility that the object Ob and the host vehicle CS will collide, and the object width WO is maintained. Step S19 therefore functions as the maintenance step.
First, in step S31, the relative speed Vr of the radar target RT with respect to the host vehicle CS is evaluated. Since the TTC is calculated by dividing the relative distance Dr by the relative speed Vr, for the same relative distance Dr, the smaller the relative speed Vr, the larger the TTC until the radar target RT and the host vehicle CS collide. Therefore, when the relative speed Vr is small, it is more likely than when it is large that none of the PCS operations has yet been performed even after the radar target RT has entered the neighborhood area NA.
Accordingly, when the relative speed Vr is larger than the threshold Th1 (step S31: NO), the process proceeds to step S33 and the object width WO is reduced. The reduction of the object width WO performed in step S33 can use the same method as the reduction of the object width WO performed in step S20. On the other hand, when the relative speed Vr is equal to or less than the threshold Th1 (step S31: YES), the process proceeds to step S32. Step S31 functions as the relative speed acquisition unit.
In step S32, the detection count DN of the fusion target FT before image lost occurred is evaluated. Since the detection count DN indicates the number of times the radar target RT was previously detected as a fusion target FT, a small detection count DN means low reliability of the fusion target FT. For example, when detection of the fusion target FT was accidental due to noise or the like, the detection count DN takes a low value. Therefore, when the detection count DN is smaller than the threshold Th2, the process proceeds to step S33 and the object width WO is reduced.
On the other hand, when the detection count DN is equal to or greater than the threshold Th2 (step S32: YES), the process shown in FIG. 7 is once terminated. That is, the object width WO is maintained. Consequently, in step S21 of FIG. 5, the collision determination is performed with the maintained object width WO, so the PCS is more likely to be activated.
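The switching performed in step S19 can be illustrated by the following sketch, in which the shrink factor and all names are hypothetical: the object width is maintained only when the relative speed is at most Th1 and the detection count is at least Th2, and is otherwise reduced as in step S20.

def width_after_image_lost(object_width, relative_speed, detection_count,
                           th1_speed, th2_count, shrink_factor=0.5):
    if relative_speed <= th1_speed and detection_count >= th2_count:
        return object_width               # maintain: PCS stays easy to activate
    return object_width * shrink_factor   # reduce: PCS becomes harder to activate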
Next, changes in the operability of the PCS when the ECU 20 performs the process of FIG. 5 will be described with reference to FIGS. 8A and 8B. FIGS. 8A and 8B show changes in the object width WO when the ECU 20 performs the process of FIG. 5, and FIG. 8C shows, for comparison, changes in the object width WO when the ECU 20 does not perform the process of FIG. 5.
As shown in FIG. 8A, assume that at time t11 the fusion target FT is detected ahead of the host vehicle CS by the ECU 20 and that the relative distance Dr of the fusion target FT with respect to the host vehicle CS then decreases. Assume further that, as shown in FIG. 8B, the object enters the neighborhood area NA at time t12 and image lost occurs.
Because of the image lost, the position of the object (radar target RT) is detected only by the radar information, and the object width WO can no longer be acquired. In FIG. 8B, since the radar target RT is located in the neighborhood area NA at time t12, the object width WO(t12) at time t12 is maintained at the same size as the object width WO at time t11. In contrast, in FIG. 8C, shown for comparison, when the object is located in the neighborhood area NA, the object width WO(t12) at time t12 is reduced from the object width WO(t11) at time t11 shown in FIG. 8A.
Therefore, in FIG. 8B, because the object width WO(t12) is maintained, the lap rate RR, which indicates the overlap ratio with the determination region Wcd, becomes larger than the lap rate RR(c) in the case of FIG. 8C. As a result, the PCS becomes more likely to be activated, and delay or non-activation of the PCS with respect to the radar target RT is suppressed.
As described above, in the first embodiment, when image lost occurs while the fusion target FT is being detected and the state transitions to one in which the object is detected only by the radar information, the ECU 20 maintains the PCS operating conditions if the radar target RT is located in the neighborhood area NA in front of the vehicle. With this configuration, even when image lost occurs because the object has entered the neighborhood area NA, delay or non-activation of the PCS with respect to that object can be suppressed.
The ECU 20 acquires the object width WO, which indicates the lateral size of the object, and changes the PCS operating conditions based on the lateral overlap amount (RR) between the acquired object width WO and the determination region Wcd set in front of the vehicle. When the object is located in the neighborhood area NA, the ECU 20 maintains the object width at the value it had while the object was being detected by both the image information and the radar information. With this configuration, the PCS operating conditions can be changed by a simpler method.
Even when an object is located near the host vehicle CS in the vehicle traveling direction, the possibility of a collision between the object and the host vehicle CS is low if the object is located far from the host vehicle CS in the lateral direction. Therefore, the ECU 20 determines that the object is not located in the neighborhood area NA if the position Pr of the object acquired from the radar information lies outside the preset neighborhood area NA. With this configuration, when the possibility of a collision between the object and the host vehicle CS is low, priority is given to the reduced detection accuracy, so an appropriate PCS can be implemented.
When the relative speed Vr of an object is small, the TTC, which is the margin time until the object and the host vehicle CS collide, becomes large, and compared with an object having a large relative speed Vr, it is more likely that the PCS has not yet been performed even though the object is located in the neighborhood area NA. Therefore, the ECU 20 maintains the PCS operating conditions when the object is located in the neighborhood area NA, on the condition that the relative speed Vr of the object is equal to or less than a predetermined value. With this configuration, when the object Ob is located in the neighborhood area NA, the PCS is actively allowed to operate and non-activation is suppressed.
(Second Embodiment)
In the second embodiment, when an object located in the neighborhood area NA moves in a direction away from the host vehicle CS, the ECU 20 makes the PCS less likely to be activated.
FIG. 9 is a flowchart showing the process performed in step S19 of FIG. 5 in the second embodiment.
In step S41, it is determined whether or not the host vehicle CS is traveling straight ahead. For example, based on the output from the turning motion detection sensor 34, it is determined whether the host vehicle CS is traveling straight or turning right or left. If the host vehicle CS is not traveling straight (step S41: NO), the object width WO is reduced in step S43.
When the host vehicle CS is traveling straight (step S41: YES), it is determined in step S42 whether or not the radar target RT located in the neighborhood area NA is moving straight ahead. FIGS. 10A and 10B are diagrams for explaining the movement of an object in the neighborhood area NA. As shown in FIG. 10A, if an object detected ahead of the host vehicle CS is moving straight in the vehicle traveling direction (Y-axis direction), the possibility that this object and the host vehicle CS will collide is high. In such a case, it is preferable to maintain the object width WO.
On the other hand, as shown in FIG. 10B, when the object turns right or left in the neighborhood area NA and moves in the lateral direction (X-axis direction) relative to the vehicle traveling direction (Y-axis direction), the object moves away from the course of the host vehicle CS, so the possibility that the object and the host vehicle CS will collide is low. In such a case, if the object width WO is maintained, there is a risk of erroneously determining that a collision is possible even for an object that is actually unlikely to collide, which would cause unnecessary activation of the PCS.
Therefore, in step S42, a change in the position of the radar target RT is detected using the radar information, and based on this change in position it is determined whether the radar target RT is moving straight ahead or turning right or left. Alternatively, it may be determined that the radar target RT has turned right or left when it is detected from the radar information that the lateral position of the radar target RT has changed after its speed decreased. Step S42 therefore functions as the movement determination unit.
When the radar target RT is moving straight ahead (step S42: YES), the process of FIG. 9 is once terminated without reducing the object width WO, so the PCS operating conditions are maintained. On the other hand, when the radar target RT is not moving straight and the object is turning right or left (step S42: NO), the operating conditions are changed in step S43 by reducing the object width WO. Then, in step S21 of FIG. 5, the collision determination with the object is performed using the maintained or reduced object width WO.
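One possible reading of steps S41 to S43 is sketched below; the thresholds, the names, and the use of a yaw-rate check for the host vehicle and a lateral-drift check for the radar target are assumptions for illustration only.

def object_width_second_embodiment(object_width, yaw_rate, lateral_positions,
                                   yaw_threshold, drift_threshold,
                                   shrink_factor=0.5):
    # Host vehicle is treated as going straight when the yaw rate is small (step S41).
    host_straight = abs(yaw_rate) < yaw_threshold
    # The radar target is treated as going straight when its recent lateral
    # position history shows little drift (step S42).
    drift = abs(lateral_positions[-1] - lateral_positions[0]) if lateral_positions else 0.0
    target_straight = drift < drift_threshold
    if host_straight and target_straight:
        return object_width               # maintain the operating condition
    return object_width * shrink_factor   # step S43: reduce the object width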
As described above, when an object located in the neighborhood area NA turns right or left and thereby moves in a direction away from the host vehicle CS, the possibility of a collision is lower than when both the object and the host vehicle CS are traveling straight ahead. In such a case, the ECU 20 makes the PCS less likely to be activated. With this configuration, when the possibility of a collision between the object and the host vehicle CS is low, priority is given to the reduced detection accuracy, so an appropriate PCS can be implemented.
(Third Embodiment)
In the third embodiment, the area set as the neighborhood area NA differs from that in the first and second embodiments.
FIG. 11 is a diagram for explaining the position of the neighborhood area NA in the third embodiment. In the third embodiment, the radar area RA of the radar sensor 32 is set wider than the imaging area CA of the image sensor 31. The neighborhood area NA is an area (VA to BA) that extends horizontally outward from the angle of view VA of the image sensor 31 by the boundary angle BA. Since the neighborhood area NA is an area in which the first position of an object can be detected but the second position cannot, in the present embodiment the neighborhood area NA is set as an area outside the imaging area CA of the image sensor 31 and inside the radar area RA of the radar sensor 32.
FIG. 12 is a flowchart showing the process performed in step S14 of FIG. 5 in the third embodiment. The process shown in FIG. 12 records, when radar information and image information are detected for an object, the lateral center position of the object, the object width WO, and the lateral relative speed. In the following, the instance of the process of FIG. 12 previously performed by the ECU 20 is referred to as the previous process, and the instance performed this time is referred to as the current process.
In step S51, the distance of the object detected as radar information and the azimuth of the object detected as image information are fused.
In step S52, the lateral center position of the object is calculated based on the image information acquired in step S11 of FIG. 5. In the present embodiment, as shown in FIG. 13, the midpoint of the left and right lateral positions Xr, Xl included in the image information is calculated as the lateral center position of the object. Step S52 functions as the center position calculation unit.
 ステップS53では、物体幅WOを算出する。本実施形態では、画像情報に含まれる物体の左右の横位置Xr,Xlを使用して物体幅WOを算出する。本実施形態では、物体幅WOは、下記式(1)を用いて算出される。
WO=|Xr-Xl| … (1)。
In step S53, the object width WO is calculated. In the present embodiment, the object width WO is calculated using the left and right lateral positions Xr, Xl of the object included in the image information. In the present embodiment, the object width WO is calculated using the following formula (1).
WO = | Xr−Xl | (1).
When the image sensor 31 outputs an image width angle indicating the difference between the azimuth angles of the left and right lateral positions of the object, the object width WO may instead be calculated as follows: in step S53, the object width is calculated using the image width angle and the distance from the host vehicle to the object.
In step S54, the maximum value of the object width WO is updated. For example, the object width WO calculated in the current process is compared with the object width WO recorded in the previous process, and the larger of the two is recorded as the updated object width WO. Steps S53 and S54 therefore function as the object width calculation unit.
In step S55, the relative speed of the object in the lateral direction with respect to the host vehicle is calculated. For example, the lateral relative speed is calculated from the difference in lateral position between the fusion information generated in step S51 of the previous process and the fusion information generated in step S51 of the current process. Step S55 therefore functions as the lateral speed calculation unit. When the processing of step S55 ends, the process returns to the flowchart shown in FIG. 5.
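The bookkeeping of steps S52 to S55 while the fusion target is being detected can be sketched as follows; the record structure and names are illustrative assumptions.

def update_fusion_record(record, x_left, x_right, cycle_time):
    center = 0.5 * (x_left + x_right)                               # step S52
    width = abs(x_right - x_left)                                   # step S53, equation (1)
    record["width_max"] = max(record.get("width_max", 0.0), width)  # step S54
    if "center" in record:                                          # step S55
        record["lateral_speed"] = (center - record["center"]) / cycle_time
    record["center"] = center
    return record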
Next, the process of determining whether or not a part of the object is located in the neighborhood area will be described with reference to FIGS. 14 and 15. The process shown in FIG. 14 is the process performed in step S18 of FIG. 5.
In step S61, the predicted lateral center position of the object at the current time is calculated based on the center position calculated in the past and the lateral speed of the object. In the present embodiment, it is calculated based on the center position calculated in step S52 and the lateral speed of the object calculated in step S55. As shown in FIG. 15, the predicted center position Mp of the current center position of the object is obtained by adding, to the center position M held in step S52, a distance corresponding to the lateral relative speed of the object recorded in step S55. Step S61 functions as the position prediction unit.
In step S61, instead of calculating the predicted center position Mp based on the previously recorded center position and the lateral speed of the object obtained from the image information, the predicted center position Mp may be calculated based on the previously recorded center position and the lateral speed of the object obtained from the radar information.
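A minimal sketch of step S61, using the record kept in the previous sketch, is given below; the names are assumptions.

def predict_center(record, elapsed_time):
    # Advance the last recorded lateral center by the recorded lateral speed.
    return record["center"] + record.get("lateral_speed", 0.0) * elapsed_time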
In step S62, the left and right end angles of the current left and right lateral positions of the object are calculated based on the predicted center position calculated in step S61 and the object width WO held in step S53. Here, the left and right end angles are the azimuth angles, with respect to the host vehicle, of the current left and right lateral positions of the object.
In the present embodiment, as shown in FIG. 16, first, positions extended in the lateral direction by the object width WO updated in step S54, with the predicted center position Mp calculated in step S61 as the reference, are calculated as the predicted lateral positions Xpr, Xpl of the left and right lateral positions of the object. Next, using the calculated predicted lateral positions Xpr, Xpl, the left and right end angles indicating the azimuth angles of the current left and right lateral positions of the object are calculated. In the present embodiment, the left and right end angles are taken as positive when the angle increases to the right of the imaging axis and negative when the angle increases to the left of the imaging axis.
When θn denotes the left/right end angle on the side closer to the host vehicle in the lateral direction and θf denotes the left/right end angle on the side farther from the host vehicle, the relationship between the left/right end angles and the predicted lateral positions can be expressed by the following equations (2) and (3):

tan θn = X1 / Yd … (2)
tan θf = X2 / Yd … (3)

Here, X1 denotes the predicted lateral position closer to the host vehicle among the predicted lateral positions Xpr and Xpl, and X2 denotes the predicted lateral position farther from the host vehicle. In FIG. 16, Xpl is the predicted lateral position X1 closer to the host vehicle, and Xpr is the predicted lateral position X2 farther from the host vehicle. Yd denotes the distance from the host vehicle to the object; in the present embodiment, the distance from the host vehicle to the object included in the fusion information is used.
From the above equations (2) and (3), the ECU 20 can calculate the left/right end angles using the following equations (4) and (5), respectively:

θn = arctan(X1 / Yd) … (4)
θf = arctan(X2 / Yd) … (5)

Therefore, step S62 functions as the azimuth angle calculation unit.
In steps S63 to S66, it is determined, based on the left and right end angles calculated in step S62, whether the object is located in the neighborhood area. In the present embodiment, the part-off reliability, which indicates the certainty that a part of the object is located in the neighborhood area, is calculated based on the left/right end angle θf on the side farther from the host vehicle among the left and right end angles calculated in step S62. In the present embodiment, steps S63 to S66 therefore function as the position determination unit.
FIG. 17 is a diagram for explaining the part-off reliability; it is a graph in which the horizontal axis is the absolute value of the left/right end angle θf and the vertical axis is the part-off reliability RV. The part-off reliability RV is an evaluation value for determining whether or not the object is located in the neighborhood area, based on the left/right end angle on the side farther from the host vehicle in the lateral direction. In the present embodiment, the part-off reliability is defined by a value from 0 to 100, and as the part-off reliability increases, the certainty that a part of the object is located in the neighborhood area increases. The part-off reliability is also defined so that its value increases nonlinearly as the value of the left/right end angle θf on the side farther from the host vehicle increases.
On the horizontal axis of the left/right end angle θf, the reference angle B indicates the absolute value of the angle from the imaging axis to the edge of the angle of view. In other words, when the left/right end angle θf equals the reference angle B, the end of the object farther from the host vehicle, of its left and right lateral positions, lies exactly on the edge of the angle of view of the image sensor 31.
The part-off reliability RV is set so that its rate of increase in the center range MR, which is the range of the predetermined angles R1 to R2 around the reference angle B, is larger than its rate of increase at angles equal to or below the lower limit angle R1 and at angles equal to or above the upper limit angle R2. By making the rate of increase in the center range MR larger than in the other regions, large changes in the part-off reliability RV are suppressed when the left/right end angle is smaller than the lower limit angle R1 or larger than the upper limit angle R2. As a result, the closer the left/right end angle θf is to the angle of view, the larger the change in the part-off reliability, which reduces the influence of lateral mounting errors of the image sensor 31 on the host vehicle and of errors in the left/right end angle calculated in step S62.
In step S63, the part-off reliability is calculated from the left/right end angles; it is calculated for the left/right end angle on the side farther from the host vehicle. For example, a map defining the relationship between the left/right end angle and the part-off reliability shown in FIG. 17 is stored, and by referring to this map, the part-off reliability corresponding to the left/right end angle calculated in step S62 is obtained.
In step S64, the part-off reliability calculated in step S63 is evaluated. In the present embodiment, by comparing the part-off reliability with the threshold Th11, it is determined whether or not a part of the object is located in the neighborhood area. The threshold Th11 is determined based on the reference angle B, which indicates the absolute value of the angle from the imaging axis to the edge of the angle of view.
When the part-off reliability is equal to or greater than the threshold Th11 (step S64: YES), an establishment flag indicating that a part of the object is located in the neighborhood area is set to true in step S65. On the other hand, when the part-off reliability is less than the threshold Th11 (step S64: NO), the establishment flag indicating that the object is located in the neighborhood area is set to false in step S66. When the establishment flag is false, it is determined that the object is not located in the neighborhood area.
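Steps S63 to S66 can be illustrated by the following sketch of a piecewise reliability map and the threshold check; the breakpoints and slope values are hypothetical and only illustrate that the rate of increase is larger inside the center range MR than outside it.

def part_off_reliability(theta_f_abs, r1, r2, rv_at_r1=20.0, rv_at_r2=80.0):
    # r1 / r2: lower and upper limits of the center range MR around the reference angle B.
    if theta_f_abs <= r1:
        return rv_at_r1 * theta_f_abs / r1 if r1 > 0 else 0.0        # gentle rise below R1
    if theta_f_abs <= r2:
        frac = (theta_f_abs - r1) / (r2 - r1)
        return rv_at_r1 + (rv_at_r2 - rv_at_r1) * frac                # steep rise inside MR
    return min(100.0, rv_at_r2 + (theta_f_abs - r2))                  # gentle rise above R2

def in_neighborhood_by_reliability(theta_f_abs, r1, r2, th11):
    # Establishment flag of steps S65/S66: true when RV reaches the threshold Th11.
    return part_off_reliability(theta_f_abs, r1, r2) >= th11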
Returning to FIG. 5, when the establishment flag is true, it is determined that the object is located in the neighborhood area (step S18: YES), and the PCS operating conditions are maintained in step S19; for example, the object width WO is maintained at the object width WO at the time the fusion was established. On the other hand, when the establishment flag is false, it is determined that the object is not located in the neighborhood area (step S18: NO), and the PCS operating conditions are changed in step S20; for example, the object width WO is reduced from the object width WO at the time the fusion was established.
As described above, in the third embodiment, when an object transitions from the state of being detected by both the radar information and the image information to the state of being detected only by the radar information, the ECU 20 calculates the predicted center position of the object at the current time based on the lateral center position of the object detected in the past and the lateral speed of the object. Further, based on the calculated predicted center position and the object width, the ECU 20 calculates the left/right end angles indicating the azimuth angles, with respect to the host vehicle, of the current left and right lateral positions of the object. The ECU 20 then determines, based on the calculated left/right end angles, whether or not the object is located in the neighborhood area. In this case, even when the neighborhood area is an area that extends horizontally outward from the angle of view of the image sensor 31 by a predetermined angle, it can be properly determined whether or not the object is located in this neighborhood area.
Based on the calculated left/right end angles, the ECU 20 calculates the part-off reliability, which indicates the certainty that the object is located in the neighborhood area, and determines that the object is located in the neighborhood area when the calculated part-off reliability is equal to or greater than the threshold. The part-off reliability is set so as to increase nonlinearly as the left/right end angle of the object on the side farther from the host vehicle increases, and its rate of increase in the center range, a range within a predetermined angle around the angle of view, is higher than in the other regions. By making the rate of increase in the center range larger than in the other regions, the part-off reliability readily becomes large when the left/right end angle exceeds the lower limit of the center range. As a result, in determining whether or not the object is located in the neighborhood area, the influence of lateral mounting errors of the image sensor 31 on the host vehicle and of calculation errors in the left/right end angles can be reduced.
 (Other embodiments)
 In the third embodiment described above, the method of determining whether the object is turning right or left may instead predict the subsequent motion of the object from features of the object captured while the fusion target FT is established. For example, in step S14 of FIG. 5, the ECU 20 may detect, as fusion information, the blinking of a direction indicator lamp such as a turn signal in the image information as a feature indicating a right or left turn, and may reduce the past object width WO if the fusion target FT is not detected in the next cycle (step S13: NO). Likewise, in step S14 of FIG. 5, the ECU 20 may detect from the captured image, as fusion information, that the object has crossed a white lane marking at an intersection, and may reduce the past object width WO if the fusion target FT is not detected in the next cycle (step S13: NO).
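 A sketch of this variant is shown below, assuming hypothetical flags produced by the image processing (turn-signal blinking, lane-marking crossing); how those features are extracted from the captured image is outside the scope of this example.

def object_width_after_image_lost(turn_signal_blinking, crossed_intersection_line,
                                  width_at_fusion, shrink_factor=0.5):
    """Width WO to use when the fusion target FT is lost in the next cycle."""
    turning = turn_signal_blinking or crossed_intersection_line
    # A feature indicating a right or left turn was observed while FT held,
    # so the object is assumed to be leaving the own lane: shrink WO.
    return width_at_fusion * shrink_factor if turning else width_at_fusion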
 The lateral position of the object at the time the fusion target FT is established may be recorded, and when an image lost occurs, the PCS operating conditions may be changed based on the recorded past lateral position of the object. In this case, for example, in step S14 of FIG. 5 the ECU 20 acquires the lateral position of the object from the image information as fusion information, and if the fusion target FT is not detected in the next cycle (step S13: NO), it determines whether the object is located in the vicinity region NA based on the lateral position acquired in step S14. Specifically, if the recorded lateral position of the object is inside the lateral range of the vicinity region NA, the ECU 20 determines that the object is currently located in the vicinity region NA.
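 A minimal sketch of this check, assuming the vicinity region NA is symmetric about the vehicle's center axis with an assumed half-width; the recorded value is the lateral position saved in step S14 while the fusion target FT was established.

def in_vicinity_by_recorded_position(recorded_lateral_position_m, na_half_width_m=2.0):
    """True if the lateral position recorded at fusion time lies inside
    the lateral range of the vicinity region NA."""
    return abs(recorded_lateral_position_m) < na_half_width_m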
 Instead of uniformly maintaining the PCS operating conditions whenever the object is located in the vicinity region NA, the PCS operating conditions may be changed according to the position of the object within the vicinity region NA. In this case, in step S19 of FIG. 5, when the object is located in the vicinity region NA, the ECU 20 maintains the PCS operating conditions if the relative distance Dr between the host vehicle CS and the object is equal to or greater than a threshold. At the same relative speed Vr, the TTC increases as the relative distance Dr increases. Therefore, when the relative distance Dr at the time the radar target RT enters the vicinity region NA is large, it is more likely that the PCS has not yet been carried out than when the distance is small. Accordingly, when the relative distance Dr is equal to or greater than the predetermined value in step S19, the ECU 20 maintains the object width WO so that the PCS operates more readily.
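 The distance-dependent variant of step S19 can be sketched as follows; the distance threshold is an assumed value, and the TTC relation TTC = Dr / Vr is stated only to explain why a larger Dr at the same Vr leaves more time before the PCS would normally have acted.

def maintain_width_on_entry(relative_distance_m, distance_threshold_m=10.0):
    """Decide whether to keep WO when the radar target RT enters the
    vicinity region NA after an image lost.

    At the same relative speed Vr, TTC = Dr / Vr grows with Dr, so a large Dr
    at entry means the PCS has probably not been carried out yet and WO should
    be kept so the PCS remains easy to activate.
    """
    return relative_distance_m >= distance_threshold_m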
 Instead of providing the ECU 20 and the image sensor 31 as separate units, the PCSS 100 may provide the ECU 20 and the image sensor 31 as an integrated device; in that case, the ECU 20 described above is built into the image sensor 31. The PCSS 100 may also include, in place of the radar sensor 32, a laser sensor that uses laser light as the transmission wave.
 When the object is located in the vicinity region NA, the PCS operating conditions may be maintained on the condition that the speed of the host vehicle CS is equal to or higher than a predetermined value.
 Although the present disclosure has been described with reference to the embodiments, it should be understood that the present disclosure is not limited to those embodiments or structures. The present disclosure encompasses various modifications and variations within an equivalent range. In addition, various combinations and forms, as well as other combinations and forms including only one element, more elements, or fewer elements, also fall within the scope and spirit of the present disclosure.

Claims (9)

  1.  A vehicle control device that detects an object by using first information, which is a detection result of the object based on a reflected wave corresponding to a transmission wave, and second information, which is a detection result of the object based on a captured image obtained by imaging an area ahead of a vehicle with an imaging unit, the vehicle control device comprising:
     a control unit (22) that performs collision avoidance control for avoiding a collision with the object based on at least one of the first information and the second information;
     a position determination unit (23) that, when the object transitions from a state of being detected by the first information and the second information to a state of being detected only by the first information, determines whether the object is located in a vicinity region predetermined as a region ahead of the vehicle in which the second information cannot be acquired; and
     a maintaining unit (24) that, when it is determined that the object is located in the vicinity region, maintains an operating condition of the collision avoidance control at the condition in effect while the object was being detected by the first information and the second information.
  2.  The vehicle control device according to claim 1, wherein
     the control unit acquires an object width indicating a lateral size of the object and changes the operating condition of the collision avoidance control based on the lateral amount of overlap between the acquired object width and a determination region set ahead of the vehicle, and
     the maintaining unit, when the object is located in the vicinity region, maintains the object width at the size it had while the object was being detected by the first information and the second information.
  3.  The vehicle control device according to claim 1 or 2, wherein the position determination unit defines a predetermined distance from the center of the vehicle in the lateral direction of the vehicle as the lateral range of the vicinity region, and determines that the object is located in the vicinity region when the lateral position of the object acquired based on the first information is inside the defined range of the vicinity region.
  4.  The vehicle control device according to any one of claims 1 to 3, further comprising a movement determination unit that determines whether the object is moving in a direction away from the vehicle in the lateral direction of the vehicle, wherein
     the maintaining unit changes the operating condition so that the collision avoidance control is less likely to operate when the position determination unit determines that the object is located in the vicinity region and the object is moving laterally away from the vehicle.
  5.  The vehicle control device according to any one of claims 1 to 4, further comprising a relative speed acquisition unit that acquires a relative speed of the object with respect to the vehicle, wherein
     the maintaining unit maintains the operating condition when it is determined that the object is located in the vicinity region, on the condition that the relative speed is equal to or lower than a predetermined value.
  6.  The vehicle control device according to any one of claims 1 to 5, wherein the maintaining unit changes the operating condition so that the collision avoidance control is less likely to operate when the object transitions from the state of being detected by the first information and the second information to the state of being detected only by the first information and the position determination unit determines that the object is not located in the vicinity region.
  7.  The vehicle control device according to claim 1 or 2, wherein the vicinity region is a region extending horizontally by a predetermined angle from the angle of view of the imaging unit, the vehicle control device further comprising:
     a center position calculation unit that calculates a lateral center position of the object based on the second information while the object is being detected by the first information and the second information;
     an object width calculation unit that calculates an object width indicating a lateral size of the object based on the second information while the object is being detected by the first information and the second information;
     a lateral velocity calculation unit that calculates a lateral velocity of the object based on at least one of the first information and the second information;
     a position prediction unit that, when the object transitions from the state of being detected by the first information and the second information to the state of being detected only by the first information, calculates a predicted lateral center position of the object at the current time based on the center position calculated in the past and the lateral velocity of the object; and
     an azimuth angle calculation unit that calculates azimuth angles of the current left and right lateral positions of the object with respect to the host vehicle based on the calculated predicted center position and the object width,
     wherein the determination unit determines whether the object is located in the vicinity region based on the calculated azimuth angles of the left and right lateral positions.
  8.  The vehicle control device according to claim 7, wherein
     the determination unit calculates, based on the azimuth angles of the left and right lateral positions, a parting reliability indicating the certainty that the object is located in the vicinity region, and determines that the object is located in the vicinity region when the calculated parting reliability is equal to or greater than a threshold, and
     the parting reliability is set so that its value increases nonlinearly as the azimuth angle of the object on the side farther from the host vehicle increases, and so that its rate of increase within a range of a predetermined angle with respect to the angle of view is higher than its rate of increase in other ranges.
  9.  A vehicle control method for detecting an object by using first information, which is a detection result of the object based on a reflected wave corresponding to a transmission wave, and second information, which is a detection result of the object based on a captured image obtained by imaging an area ahead of a vehicle with an imaging unit, the vehicle control method comprising:
     a control step of performing collision avoidance control for avoiding a collision with the object based on at least one of the first information and the second information;
     a position determination step of determining, when the object transitions from a state of being detected by the first information and the second information to a state of being detected only by the first information, whether the object is located in a vicinity region predetermined as a region ahead of the vehicle in which the second information cannot be acquired; and
     a maintaining step of maintaining, when it is determined that the object is located in the vicinity region, an operating condition of the collision avoidance control at the condition in effect while the object was being detected by the first information and the second information.