JP4757147B2 - Object detection device - Google Patents


Info

Publication number
JP4757147B2
JP4757147B2 (application JP2006227321A)
Authority
JP
Japan
Prior art keywords
object
detection
relative
means
position
Prior art date
Legal status
Active
Application number
JP2006227321A
Other languages
Japanese (ja)
Other versions
JP2008051615A (en)
Inventor
Hayato Kikuchi (菊池 隼人)
Original Assignee
Honda Motor Co., Ltd. (本田技研工業株式会社)
Priority date
Filing date
Publication date
Application filed by Honda Motor Co., Ltd. (本田技研工業株式会社)
Priority to JP2006227321A
Publication of JP2008051615A
Application granted
Publication of JP4757147B2
Status: Active


Description

  The present invention relates to an object detection apparatus that distinguishes an object detected in a predetermined detection area in the traveling direction of the host vehicle from ghost data caused by noise or dust, and determines as early as possible whether the object is a control target object.

  A device that determines whether an object detected at regular time intervals by a radar device is a control target object for vehicle control, according to the number of times the object is detected within a predetermined time, is known from Patent Document 1 below.

In addition, in order to determine the continuity (identity) of an object detected at regular time intervals by a radar device, a technique is known from Patent Document 2 below in which the next position is predicted from the current detection data of the object, and the previously detected object and the currently detected object are determined to be the same object when the current detection position of the object falls within an allowable range set around the predicted position.
Patent Document 1: JP-A-8-216726
Patent Document 2: JP 2004-132734 A

  The device described in Patent Document 1 determines whether an object is a control target object according to the number of times the object is detected within a predetermined time. Even for a genuine object, the determination result cannot be obtained until the predetermined time has elapsed, so a delay may occur in the control that uses the determination result.

  In Patent Document 2, the allowable range set around the predicted position to determine whether the previously detected object and the currently detected object are the same has a fixed width. The identity can therefore only be judged YES or NO, and a finer, more accurate determination cannot be performed.

  The present invention has been made in view of the above-described circumstances, and an object thereof is to accurately determine a control target object existing in a predetermined detection region in the traveling direction of the host vehicle with a minimum time delay.

To achieve the above object, the invention of claim 1 proposes an object detection apparatus comprising: object detection means for detecting, at predetermined time intervals, an object existing in a predetermined detection area in the traveling direction of the host vehicle; relative relationship calculating means for calculating, from the detection result of the object detection means, the relative relationship between the host vehicle and the object, consisting of their relative position and relative speed; current position prediction means for predicting the relative position at the current detection time from the relative position and relative speed at the previous detection time calculated by the relative relationship calculating means; range setting means for setting a predetermined range centered on the relative position predicted by the current position prediction means; identity determination means for determining that the previous and current objects are the same object when the relative position at the current detection time calculated by the relative relationship calculating means is within the predetermined range; and control target object determination means for determining that an object is a control target object when the number of times it has been determined to be the same object by the identity determination means has reached a determination count; wherein the range setting means sets, within the predetermined range and centered on the predicted relative position, a plurality of areas whose outer edges narrow progressively, and the control target object determination means reduces the determination count as the outer edge of the area, among the plurality of areas set by the range setting means, in which the relative position at the current detection time lies becomes narrower.

The invention of claim 2 proposes, in addition to the configuration of claim 1, an object detection apparatus characterized in that the control target object determination means determines that the current object is a control target object when the relative position at the current detection time lies in the narrowest first area among the plurality of areas set by the range setting means.

The invention of claim 3 proposes, in addition to the configuration of claim 2, an object detection apparatus characterized in that the control target object determination means determines that the current object is a control target object when the relative position at the current detection time lies in the second area, whose outer edge is the next narrowest after the first area among the plurality of areas set by the range setting means, and the relative position at the previous detection time was also in the second area.

  The invention of claim 4 proposes, in addition to the configuration of any one of claims 1 to 3, an object detection apparatus further comprising object position estimating means for estimating the next relative position from the previous and current relative positions and relative speeds calculated by the relative relationship calculating means, wherein the control target object determination means does not determine the object to be a control target object when the next position of the object estimated by the object position estimating means is outside the detection area.

  The invention of claim 5 proposes, in addition to the configuration of any one of claims 1 to 4, an object detection apparatus further comprising relative speed estimating means for estimating the next relative speed from the previous and current relative positions and relative speeds calculated by the relative relationship calculating means, wherein the control target object determination means does not determine the object to be a control target object when the next relative speed of the object estimated by the relative speed estimating means is in a direction away from the host vehicle.

According to the configuration of claim 1, the object detection means detects, at predetermined time intervals, an object existing in a predetermined detection area in the traveling direction of the host vehicle, and the relative relationship calculating means calculates, from the detection result, the relative relationship between the host vehicle and the object, consisting of relative position and relative speed. The current position prediction means predicts the relative position at the current detection time from the relative position and relative speed at the previous detection time. The range setting means sets a predetermined range centered on the predicted relative position, and the identity determination means determines that the previous and current objects are the same object when the relative position at the current detection time lies within that range. The control target object determination means then determines that an object repeatedly judged to be the same object is a control target object. Since the range setting means sets, within the predetermined range and centered on the predicted relative position, a plurality of areas whose outer edges narrow progressively, and the control target object determination means reduces the required determination count as the area containing the current relative position becomes narrower, the determination count varies with how accurately the object is detected at its predicted position. The control target object can therefore be determined quickly from a small number of detections without degrading the determination accuracy.

According to the configuration of claim 2, the current object is determined to be a control target object when the relative position at the current detection time lies in the narrowest first area among the plurality of areas set by the range setting means, so the control target object can be determined even more quickly, with the minimum number of detections, without degrading the determination accuracy.

According to the configuration of claim 3, the current object is determined to be a control target object when the relative position at the current detection time lies in the second area, the next narrowest after the first area among the plurality of areas set by the range setting means, and the relative position at the previous detection time was also in the second area, so the control target object can be determined even more quickly, with a minimum number of detections, without degrading the determination accuracy.

  According to the configuration of claim 4, the object position estimating means estimates the next relative position from the previous and current relative positions and relative speeds calculated by the relative relationship calculating means, and the control target object determination means does not determine the object to be a control target object when that next position is outside the detection area. An object that leaves the detection area and cannot be detected continuously is thus prevented from being determined to be a control target object.

  According to the configuration of claim 5, the relative speed estimating means estimates the next relative speed from the previous and current relative positions and relative speeds calculated by the relative relationship calculating means, and the control target object determination means does not determine the object to be a control target object when that next relative speed is in a direction away from the host vehicle. An object that is moving away from the host vehicle, and therefore poses no risk of collision, is thus prevented from being determined to be a control target object.

  Hereinafter, embodiments of the present invention will be described with reference to the accompanying drawings.

  FIGS. 1 to 16 show an embodiment of the present invention. FIG. 1 is a block diagram of the control system of the object detection apparatus; FIG. 2 is a diagram of the current predicted position, the takeover area, the second output candidate determination area, and the third output candidate determination area; FIG. 3 is the main flowchart explaining the operation of the object detection apparatus; FIG. 4 is the second-target output determination subflowchart; FIG. 5 is the third-target output determination subflowchart; FIG. 6 is the third-detection predicted position determination subflowchart; FIG. 7 is the third-detection predicted relative speed determination subflowchart; FIG. 8 is an explanatory diagram of the operation of the takeover area; FIG. 9 is an explanatory diagram of the operation at the second detection; FIGS. 10 to 12 are explanatory diagrams of the operation at the third detection; FIG. 13 is an explanatory diagram of the operation at the fourth detection (present invention); FIG. 14 is an explanatory diagram of the operation at the fourth detection (conventional example); FIG. 15 is an explanatory diagram of the operation when another vehicle cuts in from outside the detection area; and FIG. 16 is an explanatory diagram of the operation when another vehicle cuts in from behind the preceding vehicle.

  As shown in FIG. 1, the object detection apparatus of the present embodiment includes a radar device R (a millimeter-wave radar or laser radar comprising a transmission unit that emits electromagnetic waves and a reception unit that receives the waves reflected by an object). When the host vehicle follows a preceding vehicle detected by the radar device R, the apparatus outputs the data of the control target object (the preceding vehicle) to a rear-end collision prevention system, which automatically brakes the host vehicle when the distance to the preceding vehicle falls below a predetermined value and the risk of a rear-end collision increases, and to an alarm system that prompts the driver to brake voluntarily.

  The electronic control unit U of the vehicle control device includes a relative relationship calculating means M1, a current position prediction means M2, a range setting means M3, an identity determination means M4, a control target object determination means M5, an object position estimating means M6, and a relative speed estimating means M7.

The radar device R detects an object such as a preceding vehicle at predetermined time intervals (for example, 0.1 second) by transmitting electromagnetic waves ahead of the vehicle and receiving the reflected waves. Based on the reflected waves, the relative relationship calculating means M1 calculates the relative relationship between the object and the host vehicle, that is, the relative position and relative speed, at each time interval. The current position prediction means M2 predicts the relative position of the object at the current detection time from the previously detected relative position and relative speed. The range setting means M3 sets a predetermined range around the current relative position predicted by the current position prediction means M2. The identity determination means M4 determines that the object is the same as the previously detected object when the actual (not predicted) relative position calculated by the relative relationship calculating means M1 is within that predetermined range. The control target object determination means M5 determines that an object for which the number of same-object determinations by the identity determination means M4 has reached a predetermined determination count is a control target for automatic braking or inter-vehicle distance control.
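For illustration only, the prediction performed by the current position prediction means M2 can be sketched as follows. The patent does not specify an implementation; the constant-relative-velocity model over one radar cycle and all names below are assumptions of this sketch.

```python
# Minimal sketch of the current-position prediction (means M2), assuming a
# constant-relative-velocity model over one radar cycle. Names are
# illustrative, not from the patent.

CYCLE_S = 0.1  # detection interval of the radar device (0.1 s in the text)

def predict_position(prev_pos, prev_vel, dt=CYCLE_S):
    """Predict the relative (x, y) position at the current detection time
    from the previous relative position and relative velocity."""
    px, py = prev_pos
    vx, vy = prev_vel
    return (px + vx * dt, py + vy * dt)
```

For example, a target 10 m ahead closing at 5 m/s would be predicted 9.5 m ahead one cycle later.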

  The object position estimating means M6 estimates the next relative position from the previous and current relative positions and relative speeds calculated by the relative relationship calculating means M1; when the next relative position is outside the detection area, the object cannot be detected continuously, so it is excluded from the control targets. The relative speed estimating means M7 likewise estimates the next relative speed from the previous and current relative positions and relative speeds; when the next relative speed is in a direction away from the host vehicle, there is no risk of collision, so the object is excluded from the control targets.

  Here, the current predicted position, the takeover area, the second output candidate determination area, and the third output candidate determination area will be described with reference to FIG.

  When the predicted position of the current detection target is obtained from the previous detection target, three areas are set around that predicted position: a takeover area of 8 m x 8 m (4 m each to the front and rear and 4 m each to the left and right of the predicted position), a third output candidate determination area of 4 m x 4 m (2 m each front-rear and left-right), and a second output candidate determination area of 2 m x 2 m (1 m each front-rear and left-right).
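The three nested areas above can be sketched as a simple membership test (an illustration only; the function and constant names are not from the patent):

```python
# Sketch of the three nested determination areas, each centred on the
# current predicted position: takeover area (+/-4 m), third output candidate
# determination area (+/-2 m), second output candidate determination area
# (+/-1 m). Names are illustrative.

AREAS = {            # half-widths (front-rear and left-right) in metres
    "second": 1.0,
    "third": 2.0,
    "takeover": 4.0,
}

def area_of(predicted, detected):
    """Return the narrowest area containing the detected position,
    or None if it lies outside even the takeover area."""
    dx = abs(detected[0] - predicted[0])
    dy = abs(detected[1] - predicted[1])
    for name in ("second", "third", "takeover"):  # narrowest first
        if dx <= AREAS[name] and dy <= AREAS[name]:
            return name
    return None
```

A detection 0.5 m from the predicted position thus falls in the second area, one 1.5 m away in the third area, and one 5 m away in no area at all.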

  In step S1 of the flowchart of FIG. 3, the reflected wave from a target is detected; in step S2 the left-right position and relative speed of the target are calculated; in step S3 the output determination flag is cleared; and in step S4 takeover from the previously detected target is confirmed. If the current detection target has taken over the previous detection target in step S5 and the current detection is the second detection in step S6 (that is, the previous detection was new), the second-target output determination subflow is executed in step S7. If the current detection is not the second detection in step S6 but is the third detection in step S8, the third-target output determination subflow is executed in step S9. If the current detection is not the third detection in step S8, it is the fourth or later detection, so the output determination flag is set in step S10.

  Steps S2 to S10 are repeated until all targets have been processed in step S11. When all targets have been processed, the data of the targets whose output determination flag is set is output in step S12 to the control ECU of the automatic braking device or the alarm device.
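The branching of steps S6 to S10 can be sketched as follows. This is an illustration only: the two subflow results are passed in as ready-made booleans, and the function name and argument names are not from the patent.

```python
# Sketch of the output decision for one taken-over target (FIG. 3,
# steps S6-S10). `detections` is the consecutive detection count;
# `second_ok` / `third_ok` stand in for the results of the second- and
# third-target output determination subflows. Names are illustrative.

def decide_output(detections, second_ok, third_ok):
    """Return True if the target's output determination flag is set."""
    if detections >= 4:
        return True        # S8 NO -> S10: fourth or later detection
    if detections == 3:
        return third_ok    # S8 YES -> S9: third-target subflow
    if detections == 2:
        return second_ok   # S6 YES -> S7: second-target subflow
    return False           # a newly detected target is never output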

  Next, the contents of the second-target output determination subflow of step S7 will be described with reference to FIG. 4. This subflow determines whether a target detected for the second time is likely to be a real target.

  The second target is a target detected this time that has been confirmed by takeover as the same target as the previous detection target. The takeover confirmation between the previous and current detection targets (confirmation of whether they are the same target) obtains the predicted position of the current detection target from the previous detection target, and selects, among the current detection targets within the 8 m x 8 m takeover area set around that predicted position, the one closest to the predicted position whose relative speed differs from the predicted relative speed by no more than, for example, ±10 km/h.

  In the example of FIG. 8, of the current detection targets 1 and 2, target 1 is closer to the current predicted position than target 2 and has the smaller relative-speed difference, so target 1 is taken over (judged to be the same target). When no current target exists within the takeover area of 8 m front-rear and 8 m left-right, it is determined that there is no takeover target (no identical target exists).
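The takeover rule just described amounts to a nearest-neighbour association gated by area and relative speed. A minimal sketch, with illustrative names and the ±10 km/h tolerance quoted above:

```python
# Sketch of the takeover confirmation: among current detections inside the
# 8 m x 8 m takeover area around the predicted position, pick the nearest
# one whose relative speed is within 10 km/h of the predicted relative
# speed. Names are illustrative.

def take_over(predicted_pos, predicted_speed, detections,
              half_width=4.0, speed_tol=10.0):
    """detections: list of ((x, y), relative_speed) tuples.
    Returns the associated detection, or None if there is no takeover."""
    best, best_d2 = None, None
    for pos, speed in detections:
        dx = pos[0] - predicted_pos[0]
        dy = pos[1] - predicted_pos[1]
        if abs(dx) > half_width or abs(dy) > half_width:
            continue                      # outside the takeover area
        if abs(speed - predicted_speed) > speed_tol:
            continue                      # relative speed differs too much
        d2 = dx * dx + dy * dy
        if best_d2 is None or d2 < best_d2:
            best, best_d2 = (pos, speed), d2
    return best
```

With two in-area candidates, the closer one wins, mirroring the choice of target 1 over target 2 in FIG. 8.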

  First, in step S21, it is determined whether the reception levels of the reflected waves of the previous and current detection targets are both equal to or greater than the detection threshold +5 dB. A target with a low reception level may not be detected stably because of noise or the like, and is therefore excluded from the output candidates.

  If the answer in step S21 is YES, it is determined in step S22 whether the difference between the reception levels of the reflected waves of the previous and current detection targets is within 5 dB. For the same target the difference in reception level should be small, so a target with a large difference is excluded from the output candidates.

  If the answer in step S22 is YES, it is determined in step S23 whether the actual current detection position is within, for example, 1 m in the front-rear direction and 1 m in the left-right direction of the predicted position calculated from the previous detection, and in step S24 whether the actual relative speed of the current detection target is within, for example, ±2 km/h of the predicted relative speed calculated from the previous detection.

  If all the answers in steps S21 to S24 are YES, the current detection target is an output candidate, and in step S25 the next predicted position and predicted relative speed are calculated from the current detection data. If the predicted position is within the detection area of the radar device R in step S26 and the predicted relative speed is in a direction approaching the host vehicle in step S27, the target is determined to be an output target in step S28 and the output determination flag is set. The predicted position must be within the detection area because otherwise the target cannot be detected continuously; the predicted relative speed must be in an approaching direction because a target moving away from the host vehicle cannot be a control target of the automatic braking device or the alarm device.
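Steps S21 to S27 can be sketched as a chain of gates, using the thresholds quoted in the text. The function name, argument layout, and booleans standing in for the S26/S27 checks are illustrative assumptions, not the patent's implementation:

```python
# Sketch of the second-target output determination (FIG. 4, steps S21-S28).
# pos_err: (|dx|, |dy|) between actual and predicted position, in metres.
# speed_err: |actual - predicted| relative speed, in km/h.
# next_in_area / approaching: results of the S26 / S27 checks.

def second_target_output(prev_level, cur_level, threshold,
                         pos_err, speed_err, next_in_area, approaching):
    if prev_level < threshold + 5 or cur_level < threshold + 5:
        return False                     # S21: reception level too low
    if abs(prev_level - cur_level) > 5:
        return False                     # S22: level difference too large
    if pos_err[0] > 1.0 or pos_err[1] > 1.0:
        return False                     # S23: outside second area
    if speed_err > 2.0:
        return False                     # S24: speed mismatch
    return next_in_area and approaching  # S26 and S27
```

Any single failed gate excludes the target; only a target passing all of them gets the output determination flag at the second detection.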

  The example of FIG. 9A shows a case where the current detection position is within the second output candidate determination area; in this case the current detection target is output.

  The example of FIG. 9B shows a case where the current detection position is not within the second output candidate determination area; in this case the current detection target is not output.

  Next, the contents of the third-target output determination subflow of step S9 will be described with reference to FIG. 5. This subflow determines whether a target detected for the third time is likely to be a real target.

  First, in step S31, it is determined whether the reception levels of the reflected waves of the detection target before last, the previous detection target, and the current detection target are all equal to or greater than the detection threshold +5 dB. A target with a low reception level may not be detected stably because of noise or the like, and is therefore excluded from the output candidates.

  If the answer in step S31 is YES, it is determined in step S32 whether the differences among the reception levels of the reflected waves of the detection target before last, the previous detection target, and the current detection target are all within 10 dB. For the same target the difference in reception level should be small, so a target with a large difference is excluded from the output candidates.

  If the answer to step S32 is YES, a third detection predicted position determination subflow is executed in step S33.

  In step S41 of the flowchart of FIG. 6, it is determined whether the current detection target position is within the second output candidate determination area of 1 m front-rear and 1 m left-right around the current predicted position; if so, the output candidate flag is set in step S42. The determination condition of step S41 is the same as the criterion of the second-target output determination already described: even in the third-target output determination, if the second-target criterion is satisfied, the output candidate flag is set by this criterion alone.

  If the answer in step S41 is NO, it is determined in step S43 whether the current detection target position is within the third output candidate determination area of 2 m front-rear and 2 m left-right around the current predicted position; if so, it is determined in step S44 whether the previous detection target position was within the third output candidate determination area around the previous predicted position, and if so, the output candidate flag is set in step S42. The conditions of steps S43 and S44 are looser than that of step S41, but the output candidate flag is set when the looser condition is satisfied twice.

  If either of the answers in steps S43 and S44 is NO, it is determined in step S45 that the current detection target is unlikely to be a real target, and the output candidate flag is cleared.
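The "tight test once, or loose test twice" logic of FIG. 6 can be sketched as follows (illustrative names; the ±1 m and ±2 m bounds are those quoted in the text):

```python
# Sketch of the third-detection predicted-position determination
# (FIG. 6, steps S41-S45): pass if the current position error satisfies
# the tight second-area test (+/-1 m), or if both the current and previous
# position errors satisfy the looser third-area test (+/-2 m).

def third_position_ok(cur_err, prev_err):
    """cur_err/prev_err: (|dx|, |dy|) of this/last detection relative to
    its predicted position, in metres."""
    if cur_err[0] <= 1.0 and cur_err[1] <= 1.0:
        return True                                        # S41 tight test
    if cur_err[0] <= 2.0 and cur_err[1] <= 2.0:
        return prev_err[0] <= 2.0 and prev_err[1] <= 2.0   # S43 and S44
    return False                                           # S45: cleared
```

One precise detection suffices; two merely adequate ones also suffice, but a single adequate detection does not.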

  Returning to the flowchart of FIG. 5, if the output candidate flag is set in step S34, the third detection predicted relative speed determination flow is executed in step S35.

  In step S51 of the flowchart of FIG. 7, it is determined whether the relative speed of the current detection target is within ±2 km/h of the current predicted relative speed; if so, the output candidate flag is set in step S52. The determination condition of step S51 is the same as the criterion of the second-target output determination already described: even in the third-target output determination, if the second-target criterion is satisfied, the output candidate flag is set by this criterion alone.

  If the answer in step S51 is NO, it is determined in step S53 whether the relative speed of the current detection target is within ±5 km/h of the current predicted relative speed; if so, it is determined in step S54 whether the relative speed of the previous detection target was within ±5 km/h of the previous predicted relative speed, and if so, the output candidate flag is set in step S52. The conditions of steps S53 and S54 are looser than that of step S51, but the output candidate flag is set when the looser condition is satisfied twice.

  If either of the answers in steps S53 and S54 is NO, it is determined in step S55 that the current detection target is unlikely to be a real target, and the output candidate flag is cleared.
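The relative-speed determination of FIG. 7 follows the same pattern as the position determination, with ±2 km/h as the tight bound and ±5 km/h as the loose one. A sketch with illustrative names:

```python
# Sketch of the third-detection predicted-relative-speed determination
# (FIG. 7, steps S51-S55): +/-2 km/h once, or +/-5 km/h in both the
# current and previous cycles.

def third_speed_ok(cur_err, prev_err):
    """cur_err/prev_err: |actual - predicted| relative speed of the
    current/previous detection, in km/h."""
    if cur_err <= 2.0:
        return True               # S51: tight test once
    if cur_err <= 5.0:
        return prev_err <= 5.0    # S53 and S54: loose test twice
    return False                  # S55: cleared
```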

  Returning to the flowchart of FIG. 5, if the output candidate flag is set in step S36, the next predicted position and predicted relative speed are calculated in step S37 from the current detection data. If the predicted position is within the detection area of the radar device R in step S38 and the predicted relative speed is in a direction approaching the host vehicle in step S39, the target is determined to be an output target in step S40 and the output determination flag is set. As before, the predicted position must be within the detection area because otherwise the target cannot be detected continuously, and the predicted relative speed must be in an approaching direction because a target moving away from the host vehicle cannot be a control target of the automatic braking device or the alarm device.

  The example of FIG. 10A shows a case where the previous detection position is within the second output candidate determination area and the current detection position is within the second output candidate determination area; in this case both the previous and current detection targets are output.

  The example of FIG. 10B shows a case where the previous detection position is within the second output candidate determination area and the current detection position is within the third output candidate determination area; in this case both the previous and current detection targets are output.

  The example of FIG. 10C shows a case where the previous detection position is within the second output candidate determination area and the current detection position is outside the third output candidate determination area; in this case the previous detection target is output, but the current detection target is not.

  The example of FIG. 11D shows a case where the previous detection position is within the third output candidate determination area and the current detection position is within the second output candidate determination area; in this case the previous detection target is not output, but the current detection target is.

  The example of FIG. 11E shows a case where the previous detection position is within the third output candidate determination area and the current detection position is within the third output candidate determination area; in this case the previous detection target is not output, but the current detection target is.

  The example of FIG. 11F shows a case where the previous detection position is within the third output candidate determination area and the current detection position is outside the third output candidate determination area; in this case neither the previous nor the current detection target is output.

  The example of FIG. 12G shows a case where the previous detection position is outside the third output candidate determination area and the current detection position is within the second output candidate determination area; in this case the previous detection target is not output, but the current detection target is.

  The example of FIG. 12H shows a case where the previous detection position is outside the third output candidate determination area and the current detection position is within the third output candidate determination area; in this case neither the previous nor the current detection target is output.

  The example of FIG. 12I shows a case where the previous detection position is outside the third output candidate determination area and the current detection position is outside the third output candidate determination area; in this case neither the previous nor the current detection target is output.

  If the answer in step S8 of the flowchart of FIG. 3 is NO, the target is a takeover target and the current detection is neither the second nor the third but the fourth or later; in this case the output determination flag is set in step S10 so that the target is output. That is, every target that has entered the takeover area three or more times in succession is an output target.

  In the example of FIG. 13, the target at the second detection is not output because it is outside the second output candidate determination area (it is within the third output candidate determination area), and the previous (third) target is output because it is within the third output candidate determination area. At the current (fourth) detection, the target is always output as long as it remains within the takeover area, because it has been detected four times in succession.

  FIG. 14 shows a conventional example corresponding to the example of FIG. 13. In the conventional example, only a target detected four or more times in succession is output. Therefore, regardless of which areas the target two detections ago and the previous target are in, neither is output, and the current target, detected for the fourth time, is the first to be output. The present embodiment thus enables output one cycle earlier than the conventional example (that is, output at the time of the previous detection).

  FIG. 15 illustrates the operation when another vehicle cuts in ahead of the host vehicle from the right front. FIG. 15A is the conventional example: the other vehicle becomes subject to automatic braking and inter-vehicle distance control only at the fourth detection after entering the detection area, so the deceleration of the host vehicle is delayed by that amount and the inter-vehicle distance may be shortened.

  On the other hand, according to the embodiment shown in FIG. 15B, the other vehicle can be made subject to automatic braking control at the second detection after entering the detection area, so the host vehicle starts to decelerate earlier by that amount and the inter-vehicle distance is prevented from being shortened.

  FIG. 16 is a diagram for explaining the operation when another vehicle cuts in ahead of the host vehicle from behind the preceding vehicle on the right front of the host vehicle. FIG. 16(A) is the conventional example: after the other vehicle appears from behind the preceding vehicle, it becomes subject to automatic braking control only at the fourth detection, so the deceleration of the host vehicle is delayed and the inter-vehicle distance may be shortened.

  On the other hand, according to the embodiment shown in FIG. 16(B), after the other vehicle appears from behind the preceding vehicle, it can be made subject to automatic braking control at the second detection, so the host vehicle starts to decelerate earlier and the inter-vehicle distance is prevented from being shortened.

As described above, according to the present embodiment, three areas whose outer edges narrow stepwise are set around the predicted relative position of the target: a takeover area with a large area, a third output candidate determination area with a medium area, and a second output candidate determination area with a small area. If a target detected continuously in the takeover area lies in the second output candidate determination area, which has the narrowest outer edge, at the second detection, the target is output at that time; if a target detected continuously in the takeover area lies in the third output candidate determination area, which has the next narrowest outer edge, at both the second and third detections, the target is output at that time; and when a target is detected four times in succession in the takeover area, the target is output at that time.

In other words, a plurality of areas whose outer edges narrow stepwise are set around the predicted relative position of the target, and the number of determinations as to whether or not to output the target is varied between two and four depending on which area the relative position of the target lies in at the time of detection, that is, according to how accurately the detected position matches the predicted position. The control target object can therefore be determined quickly with a small number of detections, without degrading the determination accuracy.
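The staged output decision summarized above can be illustrated with a short sketch. This is not part of the patent: the rectangular nested-area shapes, the size parameters, and the function names (`area_of`, `should_output`) are assumptions made purely for illustration.

```python
# Illustrative sketch only: the rectangular nested areas and all names are
# assumptions, not the patent's actual implementation.

def area_of(detected, predicted, half_widths):
    """Return the narrowest nested area containing the detected position:
    'second' (smallest), 'third' (medium), 'takeover' (largest), or None."""
    dx = abs(detected[0] - predicted[0])
    dy = abs(detected[1] - predicted[1])
    for name in ("second", "third", "takeover"):  # narrowest first
        wx, wy = half_widths[name]
        if dx <= wx and dy <= wy:
            return name
    return None

def should_output(count, areas):
    """Decide whether to output a target detected `count` times in
    succession in the takeover area; areas[i] is the area name at the
    (i+1)-th detection."""
    if count >= 4:                        # 4th detection: always output
        return True
    if count == 2 and areas[1] == "second":
        return True                       # 2nd detection in the smallest area
    if count == 3 and areas[1] in ("second", "third") \
            and areas[2] in ("second", "third"):
        return True                       # 2nd and 3rd both within the medium area
    return False
```

Under this sketch, a target that tracks the prediction closely is confirmed after two detections, a moderately accurate track after three, and any track that merely stays in the takeover area after four, mirroring the two-to-four range described above.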

  The embodiments of the present invention have been described above, but various design changes can be made without departing from the scope of the present invention.

  For example, the areas of the takeover area, the third output candidate determination area, and the second output candidate determination area of the embodiment, as well as the number of determinations as to whether or not to output a target, can be changed as appropriate.

Block diagram of the control system of the object detection device
Explanatory drawing of the current predicted position, takeover area, second output candidate determination area, and third output candidate determination area
Main flowchart explaining the operation of the object detection device
Second target output determination sub-flowchart
Third target output determination sub-flowchart
Third detection predicted position determination sub-flowchart
Third detection predicted relative speed determination sub-flowchart
Action explanatory diagram of the takeover area
Explanatory diagram of the second detection
Explanatory diagram of the third detection
Explanatory diagram of the third detection
Explanatory diagram of the third detection
Explanatory diagram of the fourth detection (present invention)
Explanatory diagram of the fourth detection (conventional example)
Action diagram when another vehicle cuts in from outside the detection area
Action diagram when another vehicle cuts in from behind the preceding vehicle

R Radar device (object detection means)
M1 Relative relationship calculation means
M2 Current position prediction means
M3 Range setting means
M4 Identity determination means
M5 Control target object determination means
M6 Object position estimation means
M7 Relative speed estimation means

Claims (5)

  1. An object detection means (R) for detecting an object existing in a predetermined detection area in the traveling direction of the own vehicle every predetermined time;
    A relative relationship calculating means (M1) for calculating a relative relationship consisting of a relative position and a relative speed between the vehicle and the object based on a detection result of the object detecting means (R);
    Current position prediction means (M2) for predicting the relative position at the current detection time based on the relative position and relative speed at the previous detection time of the object calculated by the relative relationship calculation means (M1);
    Range setting means (M3) for setting a predetermined range centered on the relative position predicted by the current position prediction means (M2);
    Identity determining means (M4) for determining that the previous and current objects are the same object when the relative position at the current detection time calculated by the relative relationship calculating means (M1) is within the predetermined range;
    Control target object determination means (M5) for determining that an object determined to be the same object by the identity determination means (M4) is a control target object;
    In an object detection device comprising:
    wherein the range setting means (M3) sets, around the predicted relative position, a plurality of areas whose outer edges within the predetermined range narrow stepwise, and
    the control target object determination means (M5) reduces the number of determinations as the outer edge of the area, among the plurality of areas, in which the relative position at the current detection time exists becomes narrower.
  2. The object detection device according to claim 1, wherein the control target object determination means (M5) determines the current object to be the control target object when the relative position at the current detection time exists in a first area whose outer edge is the narrowest among the plurality of areas.
  3. The object detection device according to claim 2, wherein the control target object determination means (M5) determines the current object to be the control target object when the relative position at the current detection time exists in a second area whose outer edge is the next narrowest after the first area among the plurality of areas and the relative position at the previous detection time also existed in the second area.
  4. The object detection device according to any one of claims 1 to 3, further comprising object position estimation means (M6) for estimating the next relative position based on the previous and current relative positions and relative speeds calculated by the relative relationship calculation means (M1), wherein the control target object determination means (M5) does not determine the object to be a control target object when the next position of the object estimated by the object position estimation means (M6) is outside the detection area.
  5. The object detection device according to any one of claims 1 to 4, further comprising relative speed estimation means (M7) for estimating the next relative speed based on the previous and current relative positions and relative speeds calculated by the relative relationship calculation means (M1), wherein the control target object determination means (M5) does not determine the object to be a control target object when the next relative speed of the object estimated by the relative speed estimation means (M7) is in a direction away from the host vehicle.
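The position prediction of claim 1 (M2) and the exclusion conditions of claims 4 and 5 (M6, M7) can be sketched in code. This is an illustration only, not the claimed implementation: the constant-velocity model, the rectangular detection area, the coordinate convention (y = forward distance to the object, positive relative speed = moving away), and all names are assumptions.

```python
# Illustrative sketch only: constant-velocity model, rectangular detection
# area, and coordinate convention are assumptions, not the claimed design.

def predict_position(prev_pos, prev_vel, dt):
    """M2: predict the current relative position from the previous relative
    position and relative velocity (constant-velocity assumption)."""
    return (prev_pos[0] + prev_vel[0] * dt,
            prev_pos[1] + prev_vel[1] * dt)

def leaves_detection_area(next_pos, x_max, y_max):
    """Claim 4 (M6): exclude the object if its estimated next relative
    position falls outside a (here rectangular) detection area."""
    return abs(next_pos[0]) > x_max or next_pos[1] > y_max or next_pos[1] < 0

def moving_away(next_rel_vel):
    """Claim 5 (M7): exclude the object if its estimated next relative
    speed is in a direction away from the host vehicle."""
    return next_rel_vel[1] > 0
```

An object predicted to leave the detection area, or predicted to recede from the host vehicle, is thus never promoted to a control target even if its detection count would otherwise qualify it.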
JP2006227321A 2006-08-24 2006-08-24 Object detection device Active JP4757147B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2006227321A JP4757147B2 (en) 2006-08-24 2006-08-24 Object detection device


Publications (2)

Publication Number Publication Date
JP2008051615A JP2008051615A (en) 2008-03-06
JP4757147B2 true JP4757147B2 (en) 2011-08-24

Family

ID=39235816

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2006227321A Active JP4757147B2 (en) 2006-08-24 2006-08-24 Object detection device

Country Status (1)

Country Link
JP (1) JP4757147B2 (en)

Families Citing this family (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP5078727B2 (en) * 2008-04-23 2012-11-21 本田技研工業株式会社 Object detection device
JP2010175256A (en) * 2009-01-27 2010-08-12 Honda Motor Co Ltd Object detection device
JP4890577B2 (en) * 2009-03-05 2012-03-07 本田技研工業株式会社 Vehicle object detection device
JP5174880B2 (en) * 2010-10-22 2013-04-03 三菱電機株式会社 Automotive radar equipment
JP6430777B2 (en) * 2014-10-22 2018-11-28 株式会社デンソー Object detection device

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003240843A (en) * 2002-02-19 2003-08-27 Denso Corp Fmcw radar unit and program
JP2004132734A (en) * 2002-10-08 2004-04-30 Fujitsu Ten Ltd Vehicle-mounted radar installation
JP2005173806A (en) * 2003-12-09 2005-06-30 Nissan Motor Co Ltd Preceding vehicle detecting device, local vehicle controller, and preceding vehicle detecting method
JP2006010410A (en) * 2004-06-23 2006-01-12 Toyota Motor Corp Target detection device

Family Cites Families (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09178849A (en) * 1995-12-25 1997-07-11 Toyota Motor Corp On-vehicle radar device
JP3582217B2 (en) * 1996-03-29 2004-10-27 マツダ株式会社 Obstacle detection device
JP3367372B2 (en) * 1997-03-21 2003-01-14 三菱電機株式会社 Radar signal processing equipment


Also Published As

Publication number Publication date
JP2008051615A (en) 2008-03-06

Similar Documents

Publication Publication Date Title
US7825849B2 (en) Object detecting apparatus and method for detecting an object
JP4957747B2 (en) Vehicle environment estimation device
EP2009464B1 (en) Object detection device
WO2010140215A1 (en) Vehicular peripheral surveillance device
EP1065520B1 (en) Vehicle control method and vehicle warning method
US6812882B2 (en) Stationary on-road object detection method for use with radar
US7266453B2 (en) Vehicular object detection system, tracking control system, and vehicle control system
US20100010699A1 (en) Cruise control plan evaluation device and method
JPWO2006051603A1 (en) Axis deviation angle estimation method and apparatus
EP1357394B1 (en) Still object detecting method of scanning radar
JP4045043B2 (en) Radar equipment
JP4905512B2 (en) Target information estimation device
US6300865B1 (en) Process for detecting the road conditions ahead for motor vehicles
US9390624B2 (en) Vehicle-installation intersection judgment apparatus and program
DE112009005424T5 (en) Object detection device and object detection method
JP4453217B2 (en) Inter-vehicle distance control device
DE102007036175B4 (en) Vehicle control system
GB2544674A (en) Intersection collision avoidance with adaptable vehicle dimensions
JPWO2010086895A1 (en) Object recognition apparatus and object recognition method
US5923282A (en) Radar system
EP2599074A1 (en) Vehicle control system
EP1326087B1 (en) Apparatus and method for radar data processing
JP2002096702A (en) Vehicle-to-vehicle distance estimation device
JPH06255391A (en) Traveling controller for vehicle
US5986601A (en) Object detecting system for vehicle

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20081127

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110202

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110223

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110425

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20110525

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20110531

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140610

Year of fee payment: 3