WO2014033957A1 - Collision determination device and collision determination method (衝突判定装置及び衝突判定方法) - Google Patents
Collision determination device and collision determination method
- Publication number
- WO2014033957A1 (PCT/JP2012/072365)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- collision determination
- vehicle
- image
- radar
- detection
- Prior art date
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/165—Anti-collision systems for passive traffic, e.g. including static obstacles, trees
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/02—Systems using reflection of radio waves, e.g. primary radar systems; Analogous systems
- G01S13/06—Systems determining position data of a target
- G01S13/08—Systems for measuring distance only
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
- G01S13/867—Combination of radar systems with cameras
-
- G—PHYSICS
- G08—SIGNALLING
- G08G—TRAFFIC CONTROL SYSTEMS
- G08G1/00—Traffic control systems for road vehicles
- G08G1/16—Anti-collision systems
- G08G1/166—Anti-collision systems for active traffic, e.g. moving vehicles, pedestrians, bikes
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/86—Combinations of radar systems with non-radar systems, e.g. sonar, direction finder
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01S—RADIO DIRECTION-FINDING; RADIO NAVIGATION; DETERMINING DISTANCE OR VELOCITY BY USE OF RADIO WAVES; LOCATING OR PRESENCE-DETECTING BY USE OF THE REFLECTION OR RERADIATION OF RADIO WAVES; ANALOGOUS ARRANGEMENTS USING OTHER WAVES
- G01S13/00—Systems using the reflection or reradiation of radio waves, e.g. radar systems; Analogous systems using reflection or reradiation of waves whose nature or wavelength is irrelevant or unspecified
- G01S13/88—Radar or analogous systems specially adapted for specific applications
- G01S13/93—Radar or analogous systems specially adapted for specific applications for anti-collision purposes
- G01S13/931—Radar or analogous systems specially adapted for specific applications for anti-collision purposes of land vehicles
Definitions
- the present invention relates to a collision determination device and a collision determination method for determining a collision between a vehicle and an object.
- As a collision determination device and a collision determination method, an apparatus is known, for example as described in JP-A-2005-84034, that generates a composite target of an object using a detection result from a radar sensor and a detection result from an image sensor, and determines a collision between the vehicle and the object based on the generated composite target.
- In such an apparatus, however, collision determination is not performed until a composite target is generated. Since a comparatively long processing time is required until a composite target is generated, the collision determination may be delayed.
- the present invention is intended to provide a collision determination device and a collision determination method that can perform collision determination at an early timing even when a synthetic target is not generated.
- A collision determination device according to the present invention includes a radar detection unit that detects an object in front of a vehicle using a radar wave, an image detection unit that images the area in front of the vehicle and detects an object from the captured image, and
- a collision determination unit that determines a collision between the vehicle and the object based on a composite target generated using the detection result from the radar detection unit and the detection result from the image detection unit.
- When the object cannot be detected by the radar detection unit but can be detected by the image detection unit, and the collision determination unit determines that the object is stationary in the traveling direction of the vehicle, the collision determination is performed based on the detection result from the image detection unit instead of the collision determination based on the composite target.
- According to this configuration, when the collision determination unit determines that the object is stationary in the traveling direction of the vehicle, the collision determination is performed based on the detection result from the image detection unit instead of waiting for a composite target. The processing time is therefore shortened by having the collision determination unit perform the stillness determination, and the collision determination can be performed at an early timing based on the detection result from the image detection unit.
- The collision determination unit may also determine whether or not the object is stationary in the traveling direction of the vehicle based on the ratio between a first distance change amount, calculated from the vehicle speed as the change in the distance between the vehicle and the object per unit time in the traveling direction of the vehicle, and a second distance change amount, calculated from the image detection result as the same change in distance. The stationary state of the object can thereby be determined quickly from the ratio between the first and second distance change amounts.
- the collision determination unit may determine that the object is stationary in the traveling direction of the vehicle when the ratio between the first distance change amount and the second distance change amount is less than the threshold value.
- In that case, the collision determination unit may perform the collision determination based on the detection result from the image detection unit.
- When the collision determination unit determines that the object is not detected by the radar detection unit but is detected by the image detection unit, and that the object is stationary in the traveling direction of the vehicle, the collision determination unit may perform the collision determination based on the detection result from the image detection unit.
- the collision determination unit may perform the collision determination based on the detection result by the image detection unit when the synthetic target is generated and the generated synthetic target is released.
- the detection range of the radar detection unit and the detection range of the image detection unit may partially overlap, and there may be a region detected by the image detection unit without being detected by the radar detection unit.
- the radar detection unit may detect an object in front of the vehicle using millimeter waves.
- A collision determination method according to the present invention detects an object in front of a vehicle using a radar wave, images the area in front of the vehicle and detects an object from the captured image, and generates a composite target using the detection result from the radar detection and the detection result from the image detection.
- In this collision determination method, which determines the collision between the vehicle and the object based on the composite target, when the object cannot be detected by the radar detection but can be detected by the image detection, and the object is stationary in the traveling direction of the vehicle,
- the collision determination is performed based on the detection result from the image detection instead of the collision determination based on the composite target.
- According to the present invention, it is possible to provide a collision determination device and a collision determination method capable of performing the collision determination at an early timing even when a composite target is not generated.
- the collision determination device is a device that is mounted on a vehicle and determines a collision between the vehicle and an object using a radar sensor and an image sensor.
- FIG. 1 is a block diagram showing a configuration of a collision determination device according to an embodiment of the present invention.
- the collision determination apparatus includes a speed sensor 11, a radar 12, a stereo camera 13, and an ECU 20 (Electronic Control Unit).
- Speed sensor 11 detects the speed of the vehicle.
- a wheel speed sensor is used as the speed sensor 11.
- the speed sensor 11 supplies the detected vehicle speed to the ECU 20.
- the radar 12 functions as a radar detection unit (radar sensor) that detects an object ahead of the vehicle using a radar wave, transmits a radar wave (electromagnetic wave) in front of the vehicle, and receives a radar wave reflected from the object.
- the radar 12 supplies radar detection information indicating the detection result of the object to the ECU 20.
- the stereo camera 13 functions as an image detection unit (image sensor) that images the front of the vehicle and detects an object from the captured image.
- A CCD (Charge-Coupled Device) or CMOS (Complementary Metal-Oxide-Semiconductor) image sensor may be used as the imaging element.
- The stereo camera 13 is installed as a plurality of cameras on the front of the vehicle or in the cabin.
- the stereo camera 13 supplies image detection information indicating the detection result of the object to the ECU 20.
- a single camera may be used instead of the stereo camera 13.
- the ECU 20 includes a radar target generation unit 21, an image target generation unit 22, a composite target generation unit 23, and a collision determination unit 24.
- The ECU 20 is mainly composed of a CPU, ROM, RAM, and the like, and implements the functions of the radar target generation unit 21, the image target generation unit 22, the composite target generation unit 23, and the collision determination unit 24 through execution of a program by the CPU.
- the ECU 20 may be configured as a single unit or a plurality of units.
- the radar target generator 21 generates a radar target based on radar detection information from the radar 12.
- the radar target has target information related to the distance to the object and the lateral position of the object, which is obtained from coordinates based on the vehicle.
- the target information of the radar target is calculated based on the radar detection information from the radar 12.
- The distance to the object represents the distance from the vehicle (radar 12) to the object in the traveling direction of the vehicle, and is calculated from the time between transmission of the radar wave by the radar 12 and reception of its reflection from the object.
- The lateral position of the object represents the distance from the vehicle (radar 12) to the object in the direction orthogonal to the traveling direction of the vehicle, and is calculated based on the direction (angle) of the radar wave reflected from the object and received.
- the lateral position in the radar target is information on the position of the object detected by the radar 12, and does not include information on the lateral width of the object.
- the image target generator 22 generates an image target based on the image detection information from the stereo camera 13.
- The image target has target information related to the distance to the object and the lateral position of the object, obtained from coordinates based on the vehicle. The image target generation unit 22 also determines whether or not the object is in a stationary state by tracking the object based on the image detection information, and supplies the tracking result and the stationary-state determination result to the collision determination unit 24.
- The target information of the image target is calculated based on the principle of triangulation from the parallax between the image detection information of the left and right cameras constituting the stereo camera 13, or from the detected size and position of, for example, the license plate of a preceding vehicle.
- the distance to the object represents the distance from the vehicle (stereo camera 13) to the object in the traveling direction of the vehicle.
- the lateral position of the object represents the distance from the vehicle (stereo camera 13) to the object in a direction orthogonal to the traveling direction of the vehicle.
- the lateral position in the image target also includes information on the lateral range of the object detected from the image, that is, the lateral width of the object.
- the composite target generation unit 23 generates a composite target of the object using the target information of the radar target and the image target, that is, the detection result by the radar 12 and the stereo camera 13.
- the composite target is generated by collating both targets based on the target information of the radar target and the image target. Both targets are collated based on the similarity of the target information in both targets, that is, the similarity of the distance to the object and the lateral position of the object.
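The collation step described above can be sketched as follows. This is a minimal Python illustration assuming simple fixed tolerances on distance and lateral position; the function name, dictionary keys, and tolerance values are illustrative and not from the patent.

```python
# Hypothetical sketch of radar/image target collation. The tolerance
# values, function name, and dictionary keys are illustrative only.

def collate_targets(radar_targets, image_targets,
                    max_distance_gap=2.0, max_lateral_gap=1.0):
    """Pair radar and image targets whose distances to the object and
    lateral positions are similar, yielding composite targets."""
    composites = []
    for r in radar_targets:
        for i in image_targets:
            if (abs(r["distance"] - i["distance"]) <= max_distance_gap
                    and abs(r["lateral"] - i["lateral"]) <= max_lateral_gap):
                # Keep the radar range and the image lateral position,
                # which also carries the object's lateral width.
                composites.append({"distance": r["distance"],
                                   "lateral": i["lateral"],
                                   "width": i.get("width")})
    return composites
```

For example, a radar target at 20.0 m and an image target at 19.2 m with similar lateral positions would be collated into one composite target, while targets with very different distances would not.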
- the composite target has target information regarding the distance to the object and the horizontal position (including the horizontal width) of the object.
- the target information of the composite target is based on the target information of the radar target and the image target, and has higher accuracy than the target information of the radar target or the image target alone.
- FIG. 2 is a diagram showing detection ranges A1 and A2 of the radar 12 and the stereo camera 13.
- the detection range A1 of the radar 12 is narrower than the detection range A2 of the stereo camera 13. Therefore, an area that can be detected only by the stereo camera 13 outside the detection range A1 of the radar 12 exists diagonally forward of the vehicle C.
- A composite target is generated while the object exists in the detection ranges A1 and A2 of both sensors 12 and 13, but no composite target is generated when the object deviates from the detection range A1 of the radar 12.
- the collision determination unit 24 calculates a collision determination parameter for each of the radar target, the image target, and the composite target. As parameters, for example, target distance, collision probability, existence probability, and collision lateral position are calculated.
- the target distance means the distance to the target in the direction of travel of the vehicle
- the collision probability means the probability that the vehicle will collide with the object corresponding to the target
- the existence probability means the probability that an object corresponding to the target actually exists.
- the collision lateral position means a lateral position (position in the width direction of the vehicle) where a collision with an object corresponding to the target is expected.
- the target distance, the collision probability, the existence probability, and the collision lateral position are obtained based on the movement status of each target.
- the parameters of each target are stored in a memory such as a RAM for a predetermined period together with the target information of each target, and are read out as necessary.
- the collision determination unit 24 performs a collision determination based on the composite target.
- The collision determination unit 24 determines the possibility of a collision with the object based on whether or not the collision time is less than a threshold, provided that the parameters of the composite target satisfy predetermined threshold conditions.
- the collision time is calculated by dividing the distance to the object by the relative speed of the object (the amount of change per unit time of the distance to the object) using the target information of the composite target.
- the determination result of the possibility of collision is used for collision avoidance support by, for example, notification to the driver, control intervention for braking or steering of the vehicle, and the like.
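The collision-time test described above can be illustrated with a small sketch; the 2.0 s threshold and the function names are placeholders for illustration, not values from the patent.

```python
# Minimal sketch of the collision-time (time-to-collision) test.
# The threshold value is an illustrative placeholder.

def time_to_collision(distance_m, relative_speed_mps):
    """Collision time: distance to the object divided by the relative
    speed (the change in that distance per unit time)."""
    if relative_speed_mps <= 0.0:
        return float("inf")  # not closing in, so no collision expected
    return distance_m / relative_speed_mps

def collision_possible(distance_m, relative_speed_mps, ttc_threshold_s=2.0):
    """Collision is judged possible when the collision time is below
    the threshold."""
    return time_to_collision(distance_m, relative_speed_mps) < ttc_threshold_s
```

With this sketch, an object 10 m ahead closing at 10 m/s gives a collision time of 1.0 s, below the assumed threshold, while the same object 50 m ahead does not.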
- the collision determination unit 24 performs the collision determination based on the image target in a situation where only the image target is generated without generating the radar target.
- the collision determination unit 24 determines the possibility of collision with an object based on whether or not the parameter of the image target satisfies a predetermined threshold and the collision time is less than the threshold.
- the collision time is calculated by dividing the distance to the object by the relative speed of the object using the target information of the image target.
- When the collision determination unit 24 determines that the object cannot be detected by the radar 12 but can be detected by the stereo camera 13, and that the object is stationary in the traveling direction of the vehicle, it performs the collision determination based on the detection result from the stereo camera 13 instead of the collision determination based on the composite target.
- The stationary state of the object is determined based on the ratio between the first distance change amount, calculated from the speed of the vehicle as the change in the distance between the vehicle and the object per unit time in the traveling direction of the vehicle, and the second distance change amount, calculated from the image detection result as the same change in distance.
- FIG. 3 is a diagram illustrating a situation of a conventional collision determination process.
- FIG. 3 shows time-series changes in the positions of the targets generated by the sensors 12 and 13 together with the detection ranges A1 and A2 of the radar 12 and the stereo camera 13.
- As the target, for example, a pedestrian P crossing in front of the traveling vehicle C is assumed.
- In the conventional process, the image target generation unit 22 recognizes the pedestrian P as an object and determines its stationary state; only after the image target generation unit 22 confirms the stationary state does the collision determination unit 24 perform the collision determination.
- The stillness determination by the image target generation unit 22 requires a relatively long processing time in relation to the image update cycle. Therefore, as shown in FIG. 3A, for example, the collision avoidance support operation may be delayed or may not be performed appropriately.
- the image target generation unit 22 recognizes the pedestrian P as an object and determines the stationary state.
- The image target generation unit 22 determines the stationary state in consideration of the speed of the object in a direction that intersects the traveling direction of the vehicle C, particularly the direction orthogonal to it. Therefore, unless the image target generation unit 22 confirms the stationary state, the collision determination is not performed, as illustrated in FIG.
- FIG. 4 is a flowchart showing the operation of the collision determination device.
- FIG. 5 is a diagram showing the situation of the collision determination process shown in FIG.
- the collision determination device repeatedly executes the process shown in FIG. 4 for each processing cycle.
- the radar target generator 21 generates a radar target when an object is present within the detection range of the radar 12 (step S11).
- the image target generation unit 22 generates an image target when an object exists within the detection range of the stereo camera 13 (S12).
- The composite target generation unit 23 generates a composite target when the radar target and the image target are successfully collated (S13).
- the collision determination unit 24 calculates a collision determination parameter for each of the radar target, the image target, and the composite target (S14).
- The collision determination unit 24 determines whether or not the object is undetectable by the radar 12 but detectable by the stereo camera 13, that is, whether or not a composite target is not generated (S15). Such a situation occurs when the object is outside the detection range of the radar 12 but within the detection range of the stereo camera 13, as shown in FIG. 5. It also occurs when radar detection fails while image detection succeeds, that is, when the object is detected by the stereo camera 13 without being detected by the radar 12.
- the collision determination unit 24 determines whether or not the object is stationary in the traveling direction of the vehicle (S16).
- that the object is stationary in the traveling direction of the vehicle means that the speed of the object in the direction is 0 or substantially 0.
- Specifically, it is determined whether or not the ratio (distance ratio) between the first distance difference (distance change amount) calculated from the vehicle speed and the second distance difference (distance change amount) calculated from the image detection result is less than a threshold value. Both the first and second distance differences represent the change in the distance between the vehicle and the object per unit time in the traveling direction of the vehicle.
- When it is determined that the object is stationary, the collision determination unit 24 performs the collision determination based on the image target, that is, based on the detection result from the stereo camera 13 (S17). Even when no composite target is generated, the collision determination is thus performed based on the image target, for example for a pedestrian crossing in front of the traveling vehicle.
- Otherwise, the collision determination unit 24 performs the collision determination based on the composite target (S18).
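The branching of steps S15 to S18 can be summarized in a small sketch. The function name and return labels are illustrative, and the no-determination branch for a moving object is inferred from the surrounding description of FIG. 4.

```python
# Sketch of the branching in steps S15-S18 of FIG. 4. The function name
# and return labels are illustrative; the "none" branch is inferred.

def select_collision_determination(radar_detected, image_detected,
                                   stationary_in_travel_direction):
    if not radar_detected and image_detected:   # S15: no composite target
        if stationary_in_travel_direction:      # S16: stillness confirmed
            return "image_target"               # S17: image-based determination
        return "none"                           # moving object: no determination
    return "composite_target"                   # S18: composite-based determination
```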
- FIG. 5 shows time-series changes in the position of the target in the collision determination process shown in FIG. 4 in comparison with FIG.
- In the process shown in FIG. 4, after the image target generation unit 22 recognizes the pedestrian as an object, the collision determination unit 24, instead of the image target generation unit 22, determines the stationary state of the object. The collision determination unit 24 then performs the collision determination once the stationary state of the object is confirmed.
- FIGS. 6 and 7 are flowcharts showing the operation of FIG. 4 in detail. Although not shown in FIG. 6, radar target generation, image target generation, composite target generation, and collision determination parameter calculation are performed as described for S11, S12, S13, and S14 in FIG. 4.
- the collision determination unit 24 determines whether the image single determination flag is on and the new flag is on (S21).
- the image single determination flag is a flag for permitting the collision determination based on the image target instead of the composite target. This flag is turned on when the movement vector of the object in the traveling direction of the vehicle is less than the threshold value.
- The new flag is a flag indicating that a target has been newly generated. When the conditions of both flags are not satisfied, the collision determination based on the image target is not performed, and the process ends.
- the collision determination unit 24 determines whether the buffer flag is on and the brake is not in the pre-braking state or the intervention braking state (S22).
- the buffer flag is a flag for permitting buffering of the target information of the image target.
- the pre-braking state or the intervention braking state means braking intervention performed as part of the collision avoidance support.
- the collision determination unit 24 buffers the distance ratio in the immediately preceding processing cycle and sets the dead zone speed used for calculating the distance ratio (S23). The distance ratio and dead zone speed will be described later.
- the collision determination unit 24 determines whether the vehicle speed exceeds the speed threshold and the distance to the object is shorter than the immediately preceding processing cycle (S24).
- the speed threshold is set to 0 or substantially 0 to determine the stationary state of the vehicle.
- the detection result of the speed sensor 11 is used as the vehicle speed, and the target information of the image target is used as the distance to the object.
- When it is determined that the conditions on the vehicle speed and the distance to the object are satisfied, the collision determination unit 24 calculates first and second distance differences (distance change amounts) indicating the change in the distance between the vehicle and the object per unit time in the traveling direction of the vehicle (S25).
- the distance difference is a value indicating how much the distance to the object has changed between the immediately preceding processing cycle and the current processing cycle, and becomes a larger value as the distance to the object is shortened.
- the first distance difference is a value obtained by multiplying the vehicle speed by the image update period.
- the detection result of the speed sensor 11 is used as the vehicle speed, and the characteristic value of the stereo camera 13 is used as the image update period. That is, the first distance difference represents the distance (absolute distance) that the vehicle has moved during the processing cycle.
- the second distance difference is a value obtained by subtracting the distance to the object in the current processing cycle from the distance to the object in the immediately preceding processing cycle.
- the target information of the image target is used.
- the second distance difference represents the distance (relative distance) that the vehicle has moved relative to the object during the processing cycle. Therefore, when the speed of the object in the traveling direction of the vehicle is 0, assuming that the detection error of the stereo camera 13 is ideally 0, the second distance difference becomes equal to the first distance difference.
- The collision determination unit 24 determines whether or not the second distance difference exceeds the first distance difference (S26). When it determines that the second distance difference exceeds the first distance difference, the collision determination unit 24 calculates a first distance ratio (S27). In this case, the object is likely moving toward the vehicle in the traveling direction of the vehicle.
- Otherwise, the collision determination unit 24 calculates a second distance ratio (S28). In this case, the object is likely moving away from the vehicle in the traveling direction of the vehicle.
- the first and second distance ratios (R1, R2) are expressed as follows using the first and second distance differences (D1, D2), the image update period (Cu), and the dead zone velocity (Vd).
- the dead zone speed is a speed that is set in order to determine an object that is slightly moving in the traveling direction of the vehicle as a stationary object while taking the detection error of the stereo camera 13 into account. In consideration of the assumed speed of the object and the detection error of the stereo camera 13, the value is set to substantially zero instead of zero.
- R1 = max(0, (D2 - Vd × Cu) / D1 - 1) ... (1)
- R2 = min(0, (D2 + Vd × Cu) / D1 - 1) ... (2)
- The first and second distance ratios are each the ratio between a corrected value of the second distance difference and the first distance difference, and are calculated as smaller values as the speed of the object in the traveling direction of the vehicle approaches zero.
- the second distance difference is corrected in consideration of the detection error of the stereo camera 13 and the speed of the object in the traveling direction of the vehicle. That is, the first and second distance ratios are used as an index for determining whether or not the object is stationary in the traveling direction of the vehicle.
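The distance differences of S25 and the ratios of equations (1) and (2) can be sketched as follows. The symbols follow the text (D1, D2, Cu, Vd); the function name, argument names, and the example values in the usage note are illustrative only.

```python
# Sketch of the distance differences (S25) and the distance ratios of
# equations (1) and (2). Names are illustrative; D1, D2, Cu, Vd follow
# the symbols in the text.

def distance_ratio(vehicle_speed, prev_distance, curr_distance,
                   image_update_period, dead_zone_speed):
    d1 = vehicle_speed * image_update_period  # D1: absolute distance moved by vehicle
    d2 = prev_distance - curr_distance        # D2: relative distance closed to object
    if d2 > d1:
        # S26/S27, object likely approaching: R1 = max(0, (D2 - Vd*Cu)/D1 - 1)
        return max(0.0, (d2 - dead_zone_speed * image_update_period) / d1 - 1.0)
    # S28, object likely receding: R2 = min(0, (D2 + Vd*Cu)/D1 - 1)
    return min(0.0, (d2 + dead_zone_speed * image_update_period) / d1 - 1.0)
```

A stationary object yields a ratio near zero: with a vehicle speed of 10 m/s, an update period of 0.1 s, and the measured distance shrinking from 20.0 m to 19.0 m, D1 and D2 are both about 1.0 m and the ratio is 0.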
- the collision determination unit 24 calculates a third distance ratio (S29).
- The third distance ratio is calculated as a maximum value (∞); as a result, the collision determination with respect to a "stationary object" such as a crossing pedestrian is not performed.
- the collision determination unit 24 determines whether or not the stationary object tracking is being performed (S30).
- The stationary object tracking is performed when the movement vector of the object in the traveling direction of the vehicle is less than a threshold value. Whether or not stationary object tracking is in progress is determined based on a flag supplied from the image target generation unit 22.
- When stationary object tracking is in progress, the collision determination unit 24 determines that the object is a stationary object and sets the image speed used for the tracking as the relative speed of the object used for the collision determination (S31).
- the image speed is calculated as the amount of change per unit time of the distance to the object using the target information of the image target. This is because an object that has already been tracked as a stationary object is likely to be a stationary object.
- the collision determination unit 24 determines whether the distance ratio is less than the determination threshold (S32).
- the determination threshold is a threshold for determining whether or not the object is a stationary object that hardly moves in the traveling direction of the vehicle.
- When the distance ratio is less than the determination threshold, the collision determination unit 24 determines that the object is a stationary object and sets the vehicle speed detected by the speed sensor 11 as the relative speed of the object used for the collision determination (S33). This is because an object whose distance ratio is less than the determination threshold is likely to be a stationary object, since the speed of the object in the traveling direction of the vehicle is close to zero.
- Otherwise, the collision determination unit 24 determines that the object is a moving object and sets the maximum value (∞) as the relative speed of the object used for the collision determination (S34). This is because an object whose distance ratio is equal to or greater than the determination threshold is likely to be a moving object, since the speed of the object in the traveling direction of the vehicle is high. In this case, as a result, the collision determination with respect to a "stationary object" such as a crossing pedestrian is not performed.
- the collision determination unit 24 performs a collision determination based on the image target (S35).
- the collision determination unit 24 calculates the collision time by dividing the distance to the object by the set relative speed of the object, and determines the possibility of a collision with the object based on whether the collision time is less than a threshold value. When the collision possibility is determined to be high, the collision determination unit 24 decides to activate the collision avoidance support; when it is determined to be low, the collision determination unit 24 decides not to activate the collision avoidance support.
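The collision-time check described above can be sketched as follows. This is an illustrative reconstruction rather than the patented implementation; the function name and the concrete threshold value are assumptions (the patent does not specify a number):

```python
def decide_collision_avoidance(distance_m: float,
                               relative_speed_mps: float,
                               ttc_threshold_s: float = 2.0) -> bool:
    """Return True when collision avoidance support should be activated.

    distance_m: distance from the vehicle to the object.
    relative_speed_mps: relative speed selected in S31/S33/S34 (the image
        speed, the vehicle speed, or a maximum value, respectively).
    ttc_threshold_s: assumed threshold; the patent gives no concrete value.
    """
    if relative_speed_mps <= 0.0:
        # The object is not closing on the vehicle, so no collision is expected.
        return False
    # Collision time (time-to-collision) = distance / relative speed.
    time_to_collision = distance_m / relative_speed_mps
    # The collision possibility is judged high when the collision time is
    # below the threshold, and collision avoidance support is then activated.
    return time_to_collision < ttc_threshold_s

print(decide_collision_avoidance(10.0, 10.0))  # TTC = 1.0 s -> True
print(decide_collision_avoidance(50.0, 10.0))  # TTC = 5.0 s -> False
```

Setting the relative speed to a maximum value (S34) drives the computed collision time toward zero, which is one way this selection changes the outcome of the check.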
- when the object cannot be detected by the radar 12 but can be detected by the stereo camera 13, and the collision determination unit 24 determines that the object is stationary in the traveling direction of the vehicle, the collision determination is performed based on the detection result of the stereo camera 13 instead of the collision determination based on the composite target. The stillness determination by the collision determination unit 24 therefore shortens the processing time, so the collision determination based on the detection result of the stereo camera 13 can be performed at an early timing.
- whether or not the object is stationary in the traveling direction of the vehicle may be determined based on the ratio between a first distance change amount, calculated from the speed of the vehicle as the change per unit time in the distance between the vehicle and the object in the traveling direction of the vehicle, and a second distance change amount, calculated from the image detection result as the change per unit time in that same distance. In this way, the stationary state of the object can be determined quickly based on the ratio between the first distance change amount and the second distance change amount.
- when the ratio between the first distance change amount and the second distance change amount is less than the threshold value, it may be determined that the object is stationary in the traveling direction of the vehicle.
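One plausible reading of this ratio test can be sketched in Python: the first distance change amount is predicted from the vehicle speed, the second is measured from the image, and the object is judged stationary when the two agree to within a tolerance (consistent with the clipped ratios in formulas (1) and (2) of the description). The function name and threshold value are assumptions:

```python
def is_stationary_in_travel_direction(vehicle_speed_mps: float,
                                      image_distance_change_mps: float,
                                      ratio_threshold: float = 0.2) -> bool:
    """Judge whether an object is stationary in the vehicle's traveling direction.

    vehicle_speed_mps: first distance change amount, i.e. the change per unit
        time in the vehicle-object distance predicted from the vehicle speed
        alone (what a perfectly stationary object would produce).
    image_distance_change_mps: second distance change amount, measured from
        the image (stereo camera) detection result.
    ratio_threshold: assumed tolerance; the patent does not give a value.
    """
    if vehicle_speed_mps == 0.0:
        # With the vehicle at rest, the object is stationary in the traveling
        # direction only if the image also shows no distance change.
        return image_distance_change_mps == 0.0
    # The object is stationary when the observed change agrees with the
    # prediction, i.e. their ratio deviates from 1 by less than the threshold.
    deviation = abs(image_distance_change_mps / vehicle_speed_mps - 1.0)
    return deviation < ratio_threshold
```

For example, a wall ahead of a vehicle moving at 10 m/s produces an image-measured distance change close to 10 m/s, so the ratio is close to 1 and the object is judged stationary; a lead vehicle pulling away produces a much smaller change and fails the test.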
- in these cases, the collision determination may be performed based on the detection result of the stereo camera 13.
- the detection range of the radar 12 and the detection range of the stereo camera 13 may partially overlap, and there may be a region that is detected by the stereo camera 13 but not detected by the radar 12.
- the radar 12 may detect an object in front of the vehicle using millimeter waves.
- the above-described embodiment is the best mode of the collision determination device and the collision determination method according to the present invention; however, the collision determination device and the collision determination method according to the present invention are not limited to this embodiment.
- the collision determination device and the collision determination method according to the present invention may be modifications of the collision determination device and the collision determination method according to the present embodiment, or applications of them to other uses, without departing from the gist of the invention described in each claim.
- the function of the radar target generation unit 21 may be realized by a single ECU, for example, a radar sensor ECU.
- the function of the image target generation unit 22 may be realized by a single ECU, for example, an image sensor ECU.
- in the embodiment, the detection ranges A1 and A2 of the radar 12 and the stereo camera 13 are bilaterally symmetric with respect to the traveling direction of the vehicle and overlap symmetrically. However, it suffices that the detection ranges A1 and A2 of the two sensors 12 and 13 partially overlap and that there is a region that is not detected by the radar 12 but is detected by the stereo camera 13; the detection ranges need not be symmetric with respect to the traveling direction of the vehicle and need not overlap symmetrically.
Abstract
Description
R1 = max(0, (D2 - Vd·Cu)/D1 - 1) …(1)
R2 = min(0, (D2 + Vd·Cu)/D1 - 1) …(2)
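Read literally, formulas (1) and (2) can be evaluated as below. The meanings of D1, D2, Vd, and Cu are not defined in this excerpt, so the interpretation in the comments (D1 as the vehicle-speed-predicted distance change, D2 as the image-measured distance change, Vd·Cu as an error margin) is an assumption for illustration only:

```python
def distance_ratio_bounds(d1: float, d2: float, vd: float, cu: float):
    """Evaluate formulas (1) and (2) from the description.

    R1 is clipped below at 0 and R2 is clipped above at 0, so R1 >= 0 >= R2;
    both are zero when D2/D1 lies within the margin Vd*Cu/D1 around 1, i.e.
    when the observed distance change matches the predicted one.
    """
    r1 = max(0.0, (d2 - vd * cu) / d1 - 1.0)  # formula (1)
    r2 = min(0.0, (d2 + vd * cu) / d1 - 1.0)  # formula (2)
    return r1, r2

# Example: observed change D2 close to predicted change D1 -> both bounds 0.
print(distance_ratio_bounds(d1=10.0, d2=10.5, vd=10.0, cu=0.1))  # (0.0, 0.0)
```

Under this reading, (R1, R2) = (0, 0) would correspond to the "distance ratio less than the determination threshold" case, i.e. an object judged stationary in the traveling direction.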
Claims (9)
- A collision determination device comprising:
a radar detection unit that detects an object ahead of a vehicle by radar waves;
an image detection unit that captures an image of the area ahead of the vehicle and detects the object from the captured image; and
a collision determination unit that determines a collision between the vehicle and the object based on a composite target generated using the detection result of the radar detection unit and the detection result of the image detection unit,
wherein, when the collision determination unit determines that the object cannot be detected by the radar detection unit but can be detected by the image detection unit, and that the object is stationary in the traveling direction of the vehicle, the collision determination unit performs the collision determination based on the detection result of the image detection unit instead of the collision determination based on the composite target.
- The collision determination device according to claim 1, wherein the collision determination unit determines whether the object is stationary in the traveling direction of the vehicle based on a ratio between a first distance change amount, calculated from the speed of the vehicle as the change per unit time in the distance between the vehicle and the object in the traveling direction of the vehicle, and a second distance change amount, calculated from the image detection result as the change per unit time in the distance between the vehicle and the object in the traveling direction of the vehicle.
- The collision determination device according to claim 2, wherein the collision determination unit determines that the object is stationary in the traveling direction of the vehicle when the ratio between the first distance change amount and the second distance change amount is less than a threshold value.
- The collision determination device according to any one of claims 1 to 3, wherein, when the collision determination unit determines that the object is outside the detection range of the radar detection unit and within the detection range of the image detection unit, and that the object is stationary in the traveling direction of the vehicle, the collision determination unit performs the collision determination based on the detection result of the image detection unit.
- The collision determination device according to any one of claims 1 to 3, wherein, when the collision determination unit determines that the object is not detected by the radar detection unit but is detected by the image detection unit, and that the object is stationary in the traveling direction of the vehicle, the collision determination unit performs the collision determination based on the detection result of the image detection unit.
- The collision determination device according to any one of claims 1 to 3, wherein, when the composite target has been generated and the generated composite target is then canceled, the collision determination unit performs the collision determination based on the detection result of the image detection unit.
- The collision determination device according to any one of claims 1 to 6, wherein the detection range of the radar detection unit and the detection range of the image detection unit partially overlap, and there is a region that is not detected by the radar detection unit but is detected by the image detection unit.
- The collision determination device according to any one of claims 1 to 7, wherein the radar detection unit detects the object ahead of the vehicle by millimeter waves.
- A collision determination method comprising:
detecting an object ahead of a vehicle by radar waves;
capturing an image of the area ahead of the vehicle and detecting the object from the captured image; and
determining a collision between the vehicle and the object based on a composite target generated using the detection result of the radar detection and the detection result of the image detection,
the method including, when it is determined that the object cannot be detected by the radar detection but can be detected by the image detection, and that the object is stationary in the traveling direction of the vehicle, performing the collision determination based on the detection result of the image detection instead of the collision determination based on the composite target.
Priority Applications (5)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
EP12883649.1A EP2894617B1 (en) | 2012-09-03 | 2012-09-03 | Collision determination device and collision determination method |
PCT/JP2012/072365 WO2014033957A1 (ja) | 2012-09-03 | 2012-09-03 | 衝突判定装置及び衝突判定方法 |
US14/425,240 US9405006B2 (en) | 2012-09-03 | 2012-09-03 | Collision determination device and collision determination method |
JP2014532717A JP5862785B2 (ja) | 2012-09-03 | 2012-09-03 | 衝突判定装置及び衝突判定方法 |
CN201280075571.0A CN104603855B (zh) | 2012-09-03 | 2012-09-03 | 碰撞判定装置和碰撞判定方法 |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2012/072365 WO2014033957A1 (ja) | 2012-09-03 | 2012-09-03 | 衝突判定装置及び衝突判定方法 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014033957A1 true WO2014033957A1 (ja) | 2014-03-06 |
Family
ID=50182801
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2012/072365 WO2014033957A1 (ja) | 2012-09-03 | 2012-09-03 | 衝突判定装置及び衝突判定方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US9405006B2 (ja) |
EP (1) | EP2894617B1 (ja) |
JP (1) | JP5862785B2 (ja) |
CN (1) | CN104603855B (ja) |
WO (1) | WO2014033957A1 (ja) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016072327A1 (ja) * | 2014-11-04 | 2016-05-12 | 株式会社ソニー・インタラクティブエンタテインメント | ヘッドマウントディスプレイおよび情報処理方法 |
JP2019175372A (ja) * | 2018-03-29 | 2019-10-10 | 株式会社パル技研 | 危険予測装置、危険予測方法、及びプログラム |
Families Citing this family (26)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP5991332B2 (ja) * | 2014-02-05 | 2016-09-14 | トヨタ自動車株式会社 | 衝突回避制御装置 |
US10890648B2 (en) * | 2014-10-24 | 2021-01-12 | Texas Instruments Incorporated | Method and apparatus for generating alignment matrix for camera-radar system |
JP6428482B2 (ja) * | 2015-05-18 | 2018-11-28 | トヨタ自動車株式会社 | 車両用制御装置 |
JP2016224615A (ja) * | 2015-05-28 | 2016-12-28 | 株式会社デンソー | 警報制御装置 |
US11543513B2 (en) * | 2015-09-30 | 2023-01-03 | Sony Corporation | Information processing apparatus, information processing method and program |
EP3358368A4 (en) * | 2015-09-30 | 2019-03-13 | Sony Corporation | SIGNAL PROCESSING APPARATUS, SIGNAL PROCESSING METHOD, AND PROGRAM |
US9870708B2 (en) | 2016-03-12 | 2018-01-16 | Wipro Limited | Methods for enabling safe tailgating by a vehicle and devices thereof |
US10145951B2 (en) * | 2016-03-30 | 2018-12-04 | Aptiv Technologies Limited | Object detection using radar and vision defined image detection zone |
CN106379319B (zh) * | 2016-10-13 | 2019-11-19 | 上汽大众汽车有限公司 | 一种汽车辅助驾驶系统及控制方法 |
US10338208B2 (en) * | 2016-10-27 | 2019-07-02 | GM Global Technology Operations LLC | Object detection in multiple radars |
JP6454368B2 (ja) * | 2017-03-15 | 2019-01-16 | 株式会社Subaru | 車両の表示システム及び車両の表示システムの制御方法 |
CN111052201B (zh) * | 2017-09-01 | 2022-02-01 | 株式会社村上开明堂 | 碰撞预测装置、碰撞预测方法以及存储介质 |
JP7195098B2 (ja) * | 2018-09-27 | 2022-12-23 | 株式会社Subaru | 車両用通信装置、並びにこれを用いる車両制御システムおよび交通システム |
JP2020091672A (ja) * | 2018-12-06 | 2020-06-11 | ロベルト・ボッシュ・ゲゼルシャフト・ミト・ベシュレンクテル・ハフツングRobert Bosch Gmbh | 鞍乗型車両のライダー支援システムのための処理装置及び処理方法、鞍乗型車両のライダー支援システム、及び、鞍乗型車両 |
JP2022028989A (ja) * | 2018-12-18 | 2022-02-17 | ソニーセミコンダクタソリューションズ株式会社 | 情報処理装置、および情報処理方法、並びにプログラム |
US10776673B2 (en) * | 2019-01-31 | 2020-09-15 | StradVision, Inc. | Learning method and learning device for sensor fusion to integrate information acquired by radar capable of distance estimation and information acquired by camera to thereby improve neural network for supporting autonomous driving, and testing method and testing device using the same |
JP2020179808A (ja) * | 2019-04-26 | 2020-11-05 | トヨタ自動車株式会社 | 車両制御装置 |
JP7268490B2 (ja) * | 2019-06-13 | 2023-05-08 | 株式会社デンソー | 物体認識装置、運転支援システム |
CN110395111A (zh) * | 2019-08-05 | 2019-11-01 | 深圳市威尔电器有限公司 | 汽车安全驾驶监控方法、监控中心、辅助装置 |
JP7131508B2 (ja) * | 2019-08-21 | 2022-09-06 | トヨタ自動車株式会社 | レーダ装置 |
US11180148B2 (en) * | 2019-09-03 | 2021-11-23 | Ford Global Technologies, Llc | Detection and response to confined trailer in system-assisted hitch operation |
JP7328863B2 (ja) * | 2019-10-11 | 2023-08-17 | 株式会社デンソー | 制御装置 |
CN111289967A (zh) * | 2020-03-31 | 2020-06-16 | 四川长虹电器股份有限公司 | 基于毫米波雷达的人员检测跟踪与计数算法 |
US11328601B1 (en) | 2021-02-22 | 2022-05-10 | Volvo Car Corporation | Prevention of low-speed sideswipe collisions with non-moving objects |
JP7468409B2 (ja) * | 2021-03-01 | 2024-04-16 | トヨタ自動車株式会社 | 車両衝突回避支援装置 |
CN114333318B (zh) * | 2021-12-31 | 2023-04-28 | 成都路行通信息技术有限公司 | 一种基于传感器空间角摩托车碰撞检测方法 |
Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005084034A (ja) | 2003-09-11 | 2005-03-31 | Toyota Motor Corp | 物体検出装置 |
JP2007091102A (ja) * | 2005-09-29 | 2007-04-12 | Toyota Motor Corp | 障害物検出装置 |
JP2008276689A (ja) * | 2007-05-07 | 2008-11-13 | Mitsubishi Electric Corp | 車両用障害物認識装置 |
JP2009186260A (ja) * | 2008-02-05 | 2009-08-20 | Nissan Motor Co Ltd | 物体検出装置及び測距方法 |
JP2011221630A (ja) * | 2010-04-06 | 2011-11-04 | Honda Motor Co Ltd | 車両の周辺監視装置 |
Family Cites Families (24)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JPH0717347A (ja) * | 1993-07-07 | 1995-01-20 | Mazda Motor Corp | 自動車の障害物検知装置 |
JPH09142236A (ja) * | 1995-11-17 | 1997-06-03 | Mitsubishi Electric Corp | 車両の周辺監視方法と周辺監視装置及び周辺監視装置の故障判定方法と周辺監視装置の故障判定装置 |
JP3658519B2 (ja) * | 1999-06-28 | 2005-06-08 | 株式会社日立製作所 | 自動車の制御システムおよび自動車の制御装置 |
JP4308381B2 (ja) * | 1999-09-29 | 2009-08-05 | 富士通テン株式会社 | 周辺監視センサ |
JP2001134769A (ja) * | 1999-11-04 | 2001-05-18 | Honda Motor Co Ltd | 対象物認識装置 |
US20010031068A1 (en) * | 2000-04-14 | 2001-10-18 | Akihiro Ohta | Target detection system using radar and image processing |
JP2002189075A (ja) * | 2000-12-20 | 2002-07-05 | Fujitsu Ten Ltd | 道路上方静止物検知方法 |
DE10133945A1 (de) * | 2001-07-17 | 2003-02-06 | Bosch Gmbh Robert | Verfahren und Vorrichtung zum Austausch und zur Verarbeitung von Daten |
US6771208B2 (en) * | 2002-04-24 | 2004-08-03 | Medius, Inc. | Multi-sensor system |
JP4305191B2 (ja) * | 2004-01-21 | 2009-07-29 | トヨタ自動車株式会社 | 車両用衝突推定装置 |
JP4763250B2 (ja) * | 2004-04-09 | 2011-08-31 | 株式会社デンソー | 物体検出装置 |
JP4684954B2 (ja) * | 2005-08-31 | 2011-05-18 | 本田技研工業株式会社 | 車両の走行安全装置 |
JP2007155469A (ja) * | 2005-12-05 | 2007-06-21 | Alpine Electronics Inc | 車間距離検出装置および車間距離検出方法 |
JP4595833B2 (ja) * | 2006-02-24 | 2010-12-08 | トヨタ自動車株式会社 | 物体検出装置 |
JP4211809B2 (ja) * | 2006-06-30 | 2009-01-21 | トヨタ自動車株式会社 | 物体検出装置 |
EP2122599B1 (en) * | 2007-01-25 | 2019-11-13 | Magna Electronics Inc. | Radar sensing system for vehicle |
JP4453775B2 (ja) * | 2008-06-27 | 2010-04-21 | トヨタ自動車株式会社 | 物体検出装置 |
JP2010018080A (ja) | 2008-07-09 | 2010-01-28 | Mazda Motor Corp | 車両の運転支援装置 |
DE102009021229A1 (de) * | 2009-05-19 | 2010-01-21 | Daimler Ag | Kamerabasierte Objektverarbeitung für die Abstandsregelung eines Fahrzeugs zur kraftstoffsparenden Einreglung des Sollabstandes |
JP2011164989A (ja) * | 2010-02-10 | 2011-08-25 | Toyota Motor Corp | ふらつき判定装置 |
JP5515969B2 (ja) * | 2010-03-30 | 2014-06-11 | ソニー株式会社 | 送信装置及び方法、並びにプログラム |
WO2012033173A1 (ja) * | 2010-09-08 | 2012-03-15 | 株式会社豊田中央研究所 | 移動物予測装置、仮想可動物予測装置、プログラム、移動物予測方法、及び仮想可動物予測方法 |
JP2012103858A (ja) * | 2010-11-09 | 2012-05-31 | Toyota Motor Corp | 障害物認識装置 |
CN102542843A (zh) * | 2010-12-07 | 2012-07-04 | 比亚迪股份有限公司 | 防止车辆碰撞的预警方法及装置 |
- 2012-09-03 WO PCT/JP2012/072365 patent/WO2014033957A1/ja active Application Filing
- 2012-09-03 CN CN201280075571.0A patent/CN104603855B/zh active Active
- 2012-09-03 US US14/425,240 patent/US9405006B2/en active Active
- 2012-09-03 JP JP2014532717A patent/JP5862785B2/ja active Active
- 2012-09-03 EP EP12883649.1A patent/EP2894617B1/en active Active
Patent Citations (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2005084034A (ja) | 2003-09-11 | 2005-03-31 | Toyota Motor Corp | 物体検出装置 |
JP2007091102A (ja) * | 2005-09-29 | 2007-04-12 | Toyota Motor Corp | 障害物検出装置 |
JP2008276689A (ja) * | 2007-05-07 | 2008-11-13 | Mitsubishi Electric Corp | 車両用障害物認識装置 |
JP2009186260A (ja) * | 2008-02-05 | 2009-08-20 | Nissan Motor Co Ltd | 物体検出装置及び測距方法 |
JP2011221630A (ja) * | 2010-04-06 | 2011-11-04 | Honda Motor Co Ltd | 車両の周辺監視装置 |
Non-Patent Citations (1)
Title |
---|
See also references of EP2894617A4 |
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2016072327A1 (ja) * | 2014-11-04 | 2016-05-12 | 株式会社ソニー・インタラクティブエンタテインメント | ヘッドマウントディスプレイおよび情報処理方法 |
JP2016090772A (ja) * | 2014-11-04 | 2016-05-23 | 株式会社ソニー・コンピュータエンタテインメント | ヘッドマウントディスプレイおよび情報処理方法 |
US10185146B2 (en) | 2014-11-04 | 2019-01-22 | Sony Interactive Entertainment Inc. | Head mounted display and information processing method |
US10914947B2 (en) | 2014-11-04 | 2021-02-09 | Sony Interactive Entertainment Inc. | Head mounted display and information processing method |
JP2019175372A (ja) * | 2018-03-29 | 2019-10-10 | 株式会社パル技研 | 危険予測装置、危険予測方法、及びプログラム |
JP7033308B2 (ja) | 2018-03-29 | 2022-03-10 | 株式会社パル技研 | 危険予測装置、危険予測方法、及びプログラム |
Also Published As
Publication number | Publication date |
---|---|
EP2894617B1 (en) | 2016-11-30 |
US9405006B2 (en) | 2016-08-02 |
US20150234044A1 (en) | 2015-08-20 |
EP2894617A1 (en) | 2015-07-15 |
JP5862785B2 (ja) | 2016-02-16 |
CN104603855B (zh) | 2016-06-22 |
CN104603855A (zh) | 2015-05-06 |
JPWO2014033957A1 (ja) | 2016-08-08 |
EP2894617A4 (en) | 2015-11-11 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP5862785B2 (ja) | 衝突判定装置及び衝突判定方法 | |
US9470790B2 (en) | Collision determination device and collision determination method | |
JP6011625B2 (ja) | 速度算出装置及び速度算出方法並びに衝突判定装置 | |
US20140333467A1 (en) | Object detection device | |
JP5979232B2 (ja) | 衝突判定装置及び衝突判定方法 | |
US10471961B2 (en) | Cruise control device and cruise control method for vehicles | |
US20140340518A1 (en) | External sensing device for vehicle, method of correcting axial deviation and recording medium | |
JP6787157B2 (ja) | 車両制御装置 | |
US10527719B2 (en) | Object detection apparatus and object detection method | |
JP5785578B2 (ja) | 車両周辺監視装置 | |
JP6504078B2 (ja) | 衝突予測装置 | |
JP2018097765A (ja) | 物体検出装置、及び物体検出方法 | |
JP2019052920A (ja) | 物体検出装置、物体検出方法及び車両制御システム | |
JP2009019914A (ja) | 物体検出装置 | |
WO2017138329A1 (ja) | 衝突予測装置 | |
WO2017163736A1 (ja) | 移動軌跡検出装置、移動物体検出装置、移動軌跡検出方法 | |
WO2014033958A1 (ja) | 衝突判定装置及び衝突判定方法 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 12883649 Country of ref document: EP Kind code of ref document: A1 |
ENP | Entry into the national phase |
Ref document number: 2014532717 Country of ref document: JP Kind code of ref document: A |
NENP | Non-entry into the national phase |
Ref country code: DE |
WWE | Wipo information: entry into national phase |
Ref document number: 14425240 Country of ref document: US |
REEP | Request for entry into the european phase |
Ref document number: 2012883649 Country of ref document: EP |
WWE | Wipo information: entry into national phase |
Ref document number: 2012883649 Country of ref document: EP |