JP2010211504A - Object detection device - Google Patents

Object detection device

Info

Publication number
JP2010211504A
Authority
JP
Japan
Prior art keywords
object
vehicle
sensor
roadside
position
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2009056734A
Other languages
Japanese (ja)
Inventor
Koji Takeuchi
宏次 竹内
Original Assignee
Toyota Motor Corp
トヨタ自動車株式会社
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyota Motor Corp (トヨタ自動車株式会社)
Priority to JP2009056734A priority Critical patent/JP2010211504A/en
Publication of JP2010211504A publication Critical patent/JP2010211504A/en
Pending legal-status Critical Current

Abstract

An object of the present invention is to accurately detect the position of a vehicle traveling in the vicinity of a roadside object.
An object detection device detects an object present around a vehicle via second sensors 21 to 23, which are mounted on the vehicle and differ from a first sensor 24 consisting of a radar sensor. The device includes a roadside object detection unit 11 that performs this detection, a determination unit 12 that determines whether a roadside object has been detected by the roadside object detection unit 11, a position acquisition unit 13 that acquires position information of the roadside object via the second sensors 21 to 23 when the determination unit 12 determines that a roadside object has been detected, and a vehicle detection unit 14 that detects, via the first sensor 24 and based on the position information of the roadside object acquired by the position acquisition unit 13, another vehicle traveling in the vicinity of the roadside object.
Selected drawing: FIG. 2

Description

  The present invention relates to an object detection device that is mounted on a vehicle and detects an object existing around the vehicle via a first sensor that is a radar sensor, for example.

  Conventionally, objects such as vehicles have been detected using sensors such as millimeter wave radar. However, when an object is detected based on the detection result of a distance measuring sensor such as a millimeter wave radar, it may be difficult to determine whether the detected object is a vehicle, and a roadside object such as a guardrail may be mistakenly detected as a vehicle.

  In order to solve this problem, various methods and apparatuses have been disclosed (see, for example, Patent Document 1). The obstacle recognition device described in Patent Document 1 performs the following processing. First, a travel lane recognition unit recognizes the lane in which the host vehicle travels from the measurement result of a navigation system and determines whether another vehicle could exist on either side of the host vehicle. When the travel lane recognition unit determines that no other vehicle can exist on a given side, an object on that side is extracted as an obstacle only when the reflectance measured by the front millimeter wave radar and the short-range millimeter wave radar is equal to or higher than a second threshold value that is higher than the normal first threshold value. According to this obstacle recognition device, the possibility of detecting roadside objects such as guardrails and side walls as vehicles can be reduced.

JP 2008-40819 A

  However, the obstacle recognition device described in Patent Document 1 determines whether an object is a vehicle by exploiting the fact that the reflectance of a roadside object is generally lower than that of a vehicle. For some roadside objects, the reflectance may be comparable to that of a vehicle; in such a case, a roadside object such as a metal guardrail may still be detected as a vehicle. In addition, when a capture point of a roadside object is mistaken for a capture point of a vehicle, it may be grouped with the capture points of a vehicle traveling in the vicinity of the roadside object, so that the vehicle position cannot be detected accurately. That is, by grouping the capture points of a roadside object present in the vicinity of the vehicle together with the capture points of the vehicle, the position of the vehicle may be detected as shifted toward the roadside object.

  The present invention has been made in view of the above circumstances, and an object thereof is to provide an object detection device capable of accurately detecting the position of a vehicle traveling in the vicinity of a roadside object.

  In order to achieve the above object, the present invention has the following features. A first aspect is an object detection device that is mounted on a vehicle and detects an object existing around the vehicle via a first sensor that is a radar sensor. The object detection device comprises: roadside object detection means for detecting an object existing around the vehicle via a second sensor that is mounted on the vehicle and is different from the first sensor; determination means for determining whether or not a roadside object is detected by the roadside object detection means; position acquisition means for acquiring position information of the roadside object via the second sensor when the determination means determines that a roadside object is detected; and vehicle detection means for detecting, via the first sensor and based on the position information of the roadside object acquired by the position acquisition means, another vehicle traveling in the vicinity of the roadside object.

  In a second aspect based on the first aspect, the vehicle detection means changes, in the vicinity of the roadside object, the condition for determining whether capture points acquired via the first sensor are capture points of one object.

  In a third aspect based on the second aspect, the vehicle detection means narrows the grouping area, which is the area used to determine whether capture points acquired via the first sensor are capture points of one object.

  In a fourth aspect based on the third aspect, the vehicle detection means narrows the grouping area to a predetermined ratio set in advance.

  In a fifth aspect based on the first aspect, the vehicle detection means determines the representative position, which is the position of the other vehicle used in collision determination with the other vehicle, based on the position information of the roadside object acquired by the position acquisition means.

  In a sixth aspect based on the fifth aspect, the vehicle detection means determines, as the representative position, the position of a capture point that is separated from the roadside object by at least a predetermined distance set in advance, among the capture points corresponding to the other vehicle acquired from the first sensor.

  In a seventh aspect based on the fifth aspect, the vehicle detection means determines, as the representative position of the other vehicle, the position of the capture point farthest from the roadside object among the capture points corresponding to the other vehicle acquired from the first sensor.

  In an eighth aspect based on the first aspect, the second sensor is a sensor capable of detecting an object existing ahead of the detection area of the first sensor.

  In a ninth aspect based on the first aspect, the second sensor is a radar sensor that detects an object existing in at least one of the front and front-side directions of the vehicle, and the first sensor detects an object existing in at least one of the front, front-side, rear, and rear-side directions of the vehicle.

  In a tenth aspect based on the first aspect, the second sensor is a camera arranged to detect a white line drawn on the road ahead of the vehicle, and the first sensor detects an object existing in at least one of the front, front-side, rear, and rear-side directions of the vehicle.

  In an eleventh aspect based on the first aspect, the roadside object includes a guardrail.

  In a twelfth aspect based on the first aspect, the determination means determines, via at least one of a yaw rate sensor and a steering sensor, whether the object detected by the roadside object detection means is a roadside object.

  In a thirteenth aspect based on the first aspect, the determination means determines whether the object detected by the roadside object detection means is a roadside object based on whether the object is outside the lane in which the vehicle is traveling.

  In a fourteenth aspect based on the first aspect, the determination means determines whether the object detected by the roadside object detection means is a roadside object based on whether the object is a stationary object.

  According to the first to seventh aspects, since the influence of capture points corresponding to a roadside object on the detection position of another vehicle can be eliminated, the position of a vehicle traveling near a roadside object detected via the second sensor can be accurately detected.

  According to the eighth to tenth aspects, since a roadside object existing ahead of the other vehicle can be detected via the second sensor, the influence of capture points corresponding to the roadside object on the detection position of the other vehicle can be eliminated by the time the other vehicle reaches the vicinity of the roadside object.

  According to the tenth aspect of the invention, there is no need to newly provide a sensor as the second sensor, so the roadside object can be detected with a simple configuration.

  According to the eleventh aspect, the position of the vehicle traveling in the vicinity of the guardrail detected via the second sensor can be accurately detected.

  According to the twelfth aspect to the fourteenth aspect, it can be more accurately determined whether or not the object is a roadside object.

FIG. 1 is a block diagram showing an example of the configuration of the object detection device according to the present invention.
FIG. 2 is a block diagram showing an example of the functional configuration of the object detection ECU.
FIG. 3 is a plan view showing an example of a detection result by a conventional object detection device.
FIG. 4 is a plan view showing an example of a detection result by the vehicle detection unit.
FIG. 5 is a flowchart showing an example of the operation of the object detection ECU (first half).
FIG. 6 is a flowchart showing an example of the operation of the object detection ECU (second half).
FIG. 7 is a detailed flowchart showing an example of the roadside object determination processing.
FIG. 8 is a plan view showing an example of another detection method by the vehicle detection unit.
FIG. 9 is a block diagram showing a modification of the configuration of the object detection device according to the present invention.

  Hereinafter, embodiments of an object detection device according to the present invention will be described with reference to the drawings. The object detection apparatus according to the present invention is an apparatus that is mounted on a vehicle and detects an object existing around the vehicle via a first sensor that is a radar sensor. First, an example of the configuration of an object detection device mounted on a vehicle will be described with reference to FIGS. 1 and 2.

  FIG. 1 is a block diagram showing an example of the configuration of the object detection apparatus according to the present invention. As shown in FIG. 1, an object detection ECU (Electronic Control Unit) 1 (corresponding to the object detection device) according to the present invention is communicably connected to an input device 2 and a collision determination ECU 3 as peripheral devices. In the present embodiment, a case where the first sensor detects an object existing behind the host vehicle will be described; that is, a case where the object detection ECU 1 detects another vehicle traveling in the vicinity of a roadside object behind the host vehicle (see FIGS. 3 to 5).

  First, the input device 2 and the collision determination ECU 3 connected to the object detection ECU 1 will be described with reference to FIG. 1. The input device 2 includes a front radar sensor 21, a left front side radar sensor 22, a right front side radar sensor 23, a rear radar sensor 24, a yaw rate sensor 25, and a steering sensor 26.

  The front radar sensor 21 (corresponding to a part of the second sensor) is a radar sensor that is mounted on the front surface of the host vehicle and detects an object (here, a roadside object) existing in front of the host vehicle, and outputs its detection result to the object detection ECU 1 (here, the roadside object detection unit 11 shown in FIG. 2).

  The left front side radar sensor 22 (corresponding to a part of the second sensor) is a radar sensor that is mounted on the front surface of the host vehicle and detects an object (here, a roadside object) existing on the left front side of the host vehicle, and outputs its detection result to the object detection ECU 1 (here, the roadside object detection unit 11 shown in FIG. 2).

  The right front side radar sensor 23 (corresponding to a part of the second sensor) is a radar sensor that is mounted on the front surface of the host vehicle and detects an object (here, a roadside object) existing on the right front side of the host vehicle, and outputs its detection result to the object detection ECU 1 (here, the roadside object detection unit 11 shown in FIG. 2).

  The rear radar sensor 24 (corresponding to the first sensor) is a radar sensor that is mounted on the rear surface of the host vehicle and detects an object (here, another vehicle) existing behind the host vehicle, and outputs its detection result to the object detection ECU 1 (here, the vehicle detection unit 14 shown in FIG. 2).

  The yaw rate sensor 25 is a rate gyro or the like mounted at an appropriate position of the host vehicle. It detects the yaw rate, which indicates the speed at which the yaw angle changes (the rotational angular speed around the vertical axis passing through the center of gravity of the vehicle), and outputs a signal indicating the yaw rate to the object detection ECU 1 (here, the determination unit 12 shown in FIG. 2).

  The steering sensor 26 is a sensor that is mounted at an appropriate position of the host vehicle and detects the steering angle, and outputs a signal indicating the steering angle to the object detection ECU 1 (here, the determination unit 12 shown in FIG. 2).

  In the present embodiment, a case will be described in which the second sensor includes the front radar sensor 21, the left front side radar sensor 22, and the right front side radar sensor 23. However, the second sensor may be any configuration including at least one of the front radar sensor 21, the left front side radar sensor 22, and the right front side radar sensor 23. Further, as will be described later with reference to FIG. 9, a white line detection camera 27 may be used as the second sensor instead of (or in addition to) the front radar sensor 21, the left front side radar sensor 22, and the right front side radar sensor 23.

  Further, in the present embodiment, a case where the first sensor consists of the rear radar sensor 24 will be described. However, the first sensor may be any radar sensor that detects an object existing in at least one of the front, front-side, rear, and rear-side directions of the host vehicle. In any case, it is preferable that the second sensor be a sensor configured to detect an object existing ahead of the detection area of the first sensor.

  That is, in this case, since the second sensor (here, the front radar sensor 21, the left front side radar sensor 22, and the right front side radar sensor 23) is configured to detect an object existing ahead of the detection area of the first sensor (here, the rear radar sensor 24), a roadside object existing ahead of the other vehicle can be detected via the second sensor. Therefore, by the time the other vehicle reaches the vicinity of the roadside object, the influence of the capture points corresponding to the roadside object on the detection position of the other vehicle can be eliminated, and the position of the vehicle traveling near the roadside object can be accurately detected via the first sensor.

  The collision determination ECU 3 is an ECU that receives relative position information of another vehicle traveling behind the host vehicle from the object detection ECU 1 (here, the vehicle detection unit 14 shown in FIG. 2) and determines whether there is a possibility that the other vehicle will collide with the host vehicle.

  FIG. 2 is a block diagram illustrating an example of a functional configuration of the object detection ECU 1. As shown in FIG. 2, the object detection ECU 1 functionally includes a roadside object detection unit 11, a determination unit 12, a position acquisition unit 13, and a vehicle detection unit 14.

  Note that the object detection ECU 1 causes a microcomputer disposed at an appropriate position of the object detection ECU 1 to execute a control program stored in advance in a ROM (Read Only Memory) also disposed in the object detection ECU 1, thereby causing the microcomputer to function as the functional units such as the roadside object detection unit 11, the determination unit 12, the position acquisition unit 13, and the vehicle detection unit 14.

  The roadside object detection unit 11 (corresponding to the roadside object detection means) is a functional unit that detects an object existing around the host vehicle via the front radar sensor 21, the left front side radar sensor 22, and the right front side radar sensor 23.

  The determination unit 12 (corresponding to the determination means) is a functional unit that determines whether or not a roadside object is detected by the roadside object detection unit 11. Specifically, the determination unit 12 estimates the lane in which the host vehicle is traveling via the yaw rate sensor 25 and the steering sensor 26, and determines whether the object detected by the roadside object detection unit 11 is a roadside object based on whether it exists outside the estimated lane. The determination unit 12 also determines whether the object detected by the roadside object detection unit 11 is a roadside object based on whether the object is a stationary object. That is, the determination unit 12 determines that the object detected by the roadside object detection unit 11 is a roadside object when the object is a stationary object existing outside the lane in which the host vehicle is traveling.

  In this manner, the position of the lane in which the host vehicle is traveling can be estimated based on the detection results of the yaw rate sensor 25 and the steering sensor 26. Since it can then be determined whether the detected object is outside the lane in which the host vehicle is traveling, it can be accurately determined whether the object is a roadside object. In addition, since it is also determined whether the detected object is a stationary object, whether the object is a roadside object can be determined even more accurately.

  In the present embodiment, a case will be described in which the determination unit 12 estimates the position of the lane in which the host vehicle is traveling based on the detection results of the yaw rate sensor 25 and the steering sensor 26. However, it is only necessary to determine whether the object detected by the roadside object detection unit 11 is a roadside object based on the detection result of at least one of the yaw rate sensor 25 and the steering sensor 26.

  Moreover, although the present embodiment describes the case where the determination unit 12 estimates the position of the lane in which the host vehicle is traveling based on the detection results of the yaw rate sensor 25 and the steering sensor 26, the determination unit 12 may estimate the position of the lane by other methods. For example, the determination unit 12 may estimate the position of the lane based on the detection result of a white line detection camera that detects a white line drawn in front of the host vehicle. In this case, the position of the lane can be estimated more accurately.

  Furthermore, although the present embodiment describes the case where the determination unit 12 determines that an object detected by the roadside object detection unit 11 is a roadside object when the object is a stationary object existing outside the lane in which the host vehicle is traveling, the determination unit 12 may determine whether an object is a roadside object by other methods. For example, the determination unit 12 may determine that the object detected by the roadside object detection unit 11 is a roadside object simply when the object is a stationary object. In this case, the processing is simplified.

  The position acquisition unit 13 (corresponding to the position acquisition means) is a functional unit that, when the determination unit 12 determines that a roadside object is detected, acquires position information of the roadside object via the front radar sensor 21, the left front side radar sensor 22, and the right front side radar sensor 23. For example, when an object is detected by the roadside object detection unit 11 via the front radar sensor 21 and the determination unit 12 determines that the object is a roadside object, the position acquisition unit 13 acquires the position information of the roadside object via the front radar sensor 21. In other words, when the determination unit 12 determines that an object is a roadside object, the position acquisition unit 13 acquires the position information of the roadside object via whichever of the front radar sensor 21, the left front side radar sensor 22, and the right front side radar sensor 23 detected it.

  The vehicle detection unit 14 (corresponding to the vehicle detection means) is a functional unit that detects, via the rear radar sensor 24 and based on the position information of the roadside object acquired by the position acquisition unit 13, another vehicle traveling in the vicinity of the roadside object. In addition, the vehicle detection unit 14 outputs the detected position information of the other vehicle to the collision determination ECU 3. Specifically, in the vicinity of the roadside object, the vehicle detection unit 14 changes the condition for determining whether capture points acquired via the rear radar sensor 24 are capture points of one object. More specifically, in the vicinity of the roadside object, the vehicle detection unit 14 narrows the grouping area, which is the area for determining whether capture points acquired via the rear radar sensor 24 are capture points of one object, to a predetermined ratio set in advance. Here, a rectangular area having a preset size, referenced to a capture point acquired via the rear radar sensor 24, is set as the grouping area (see FIGS. 3, 4, and 8).
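
  As a rough illustration of this grouping logic, the following Python sketch builds a rectangular grouping region around a capture point and narrows its width near a roadside object. The rectangle sizes, the helper names, and the sample coordinates are assumptions for illustration; the patent itself specifies only "a preset size" and "a predetermined ratio set in advance" (2/3 in the example of FIG. 4).

    from dataclasses import dataclass

    # Assumed values: the patent speaks only of "a preset size" and
    # "a predetermined ratio" (2/3 in the width direction in the example).
    GROUP_WIDTH_M = 3.0        # lateral extent of the normal grouping rectangle
    GROUP_LENGTH_M = 6.0       # longitudinal extent of the normal grouping rectangle
    NARROW_RATIO = 2.0 / 3.0   # width-direction shrink factor near a roadside object

    @dataclass
    class CapturePoint:
        x: float  # lateral position [m] in the host-vehicle frame
        y: float  # longitudinal position [m] in the host-vehicle frame

    def in_grouping_region(anchor: CapturePoint, other: CapturePoint,
                           near_roadside: bool) -> bool:
        """Return True if `other` lies in the rectangular grouping region
        centered on `anchor`; the width is narrowed near a roadside object."""
        half_w = GROUP_WIDTH_M / 2.0 * (NARROW_RATIO if near_roadside else 1.0)
        half_l = GROUP_LENGTH_M / 2.0
        return abs(other.x - anchor.x) <= half_w and abs(other.y - anchor.y) <= half_l

    # A guardrail echo 1.2 m to the side of the vehicle's capture point is
    # grouped under the normal region but excluded under the narrowed one:
    t2, t2g = CapturePoint(0.0, 0.0), CapturePoint(1.2, 0.5)
    assert in_grouping_region(t2, t2g, near_roadside=False)     # grouped (cf. FIG. 3)
    assert not in_grouping_region(t2, t2g, near_roadside=True)  # excluded (cf. FIG. 4)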

  Below, the effect of the processing by the vehicle detection unit 14 is described with reference to FIGS. 3 and 4. FIG. 3 is a plan view showing an example of a detection result obtained by a conventional object detection apparatus. As shown in FIG. 3, the host vehicle VC0 is traveling toward the upper side of the figure, and the guard rail GL is arranged along the lane in which the host vehicle VC0 is traveling, on the side of the host vehicle VC0 (here, the right side). The rear vehicle VC1 approaches the host vehicle VC0 from behind, passing through the positions P1, P2, and P3. Here, for convenience, the description assumes the host vehicle VC0 to be stationary.

  When the rear vehicle VC1 is at the position P1, a capture point T1, indicated by a white circle (○), is detected at the right front end of the rear vehicle VC1 via the rear radar sensor, and a grouping region G1A is set. Since the grouping region G1A includes only the capture point T1, the position P1A, indicated by a black circle (●), is determined as the representative position of the rear vehicle VC1 and used for collision determination and the like.

  When the rear vehicle VC1 reaches the position P2, a capture point T2, indicated by a white circle (○), at the right front end of the rear vehicle VC1 and a capture point T2G, indicated by a white circle (○), on the guard rail GL are detected via the rear radar sensor, and a grouping region G2A is set. Since the grouping region G2A includes two capture points (capture point T2 and capture point T2G), the position P2A, indicated by a black circle (●) at the intermediate position between the capture point T2 and the capture point T2G, is determined as the representative position of the rear vehicle VC1 and used for collision determination and the like.

  Further, when the rear vehicle VC1 reaches the position P3, a capture point T3, indicated by a white circle (○), at the right front end of the rear vehicle VC1 and a capture point T3G, indicated by a white circle (○), on the guard rail GL are detected via the rear radar sensor, and a grouping region G3A is set. Since the grouping region G3A includes two capture points (capture point T3 and capture point T3G), the position P3A, indicated by a black circle (●) at the intermediate position between the capture point T3 and the capture point T3G, is determined as the representative position of the rear vehicle VC1 and used for collision determination and the like.

  That is, in the conventional object detection device, the representative position of the rear vehicle VC1 is determined to pass through the positions P1A, P2A, and P3A, so the rear vehicle VC1 is judged to travel along the locus LA indicated by the thick broken line in FIG. 3. In reality, the rear vehicle VC1 travels on a route passing the left side of the host vehicle VC0 (the route indicated by the thin broken line in the figure), but under the influence of the capture points T2G and T3G corresponding to the guardrail GL, the rear vehicle VC1 is detected as heading toward the host vehicle VC0 along the locus LA.
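
  To make the skew concrete, the following sketch computes the conventional representative position as the average of the grouped capture points; the coordinates are invented for illustration and are not taken from the figures.

    # Vehicle capture point T2 and guardrail echo T2G (illustrative coordinates
    # in the host-vehicle frame: x lateral, y longitudinal, both in meters).
    t2 = (2.5, -10.0)    # right front end of the rear vehicle VC1
    t2g = (0.5, -10.5)   # capture point on the guard rail GL

    # Conventional device: both points fall inside the wide grouping region,
    # so the representative position P2A is their average position.
    p2a = ((t2[0] + t2g[0]) / 2.0, (t2[1] + t2g[1]) / 2.0)
    print(p2a)  # (1.5, -10.25): pulled about 1 m from T2 toward the guardrail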

  FIG. 4 is a plan view showing an example of a detection result by the object detection ECU 1 (vehicle detection unit 14) according to the present invention. As described above with reference to FIG. 2, in the vicinity of the roadside object (here, the guard rail GL), the vehicle detection unit 14 narrows the grouping regions G1B, G2B, and G3B, which are the regions for determining whether capture points acquired via the rear radar sensor 24 are capture points of one object, to a predetermined ratio (for example, 2/3 in the width direction) compared with the normal grouping regions G1A, G2A, and G3A (see FIG. 3).

  As a result, the capture point T2G, indicated by a white circle on the guard rail GL and detected via the rear radar sensor 24 when the rear vehicle VC1 has reached the position P2, is not included in the grouping region G2B. Therefore, the position P2B, indicated by a black circle (●) at the position of the capture point T2, is determined as the representative position of the rear vehicle VC1 and used for collision determination and the like.

  Similarly, the capture point T3G, indicated by a white circle on the guard rail GL and detected via the rear radar sensor 24 when the rear vehicle VC1 has reached the position P3, is not included in the grouping region G3B. Therefore, the position P3B, indicated by a black circle (●) at the position of the capture point T3, is determined as the representative position of the rear vehicle VC1 and used for collision determination and the like.

  As a result, the representative position of the rear vehicle VC1 is determined to pass through the positions P1B, P2B, and P3B, and the rear vehicle VC1 is judged to travel along the locus LB indicated by the thick broken line in FIG. 4. The rear vehicle VC1 is thus detected as traveling along the locus LB passing the left side of the host vehicle VC0, in agreement with its actual travel route (the route indicated by the thin broken line in the figure). Since the influence of the capture points T2G and T3G corresponding to the guard rail GL is thereby eliminated, the position of the rear vehicle VC1 can be accurately detected.

  Thus, by changing the condition for determining whether capture points are capture points of one object so as to distinguish the capture points T2G and T3G corresponding to the roadside object (here, the guard rail GL) from the capture points T2 and T3 of the other vehicle (here, the rear vehicle VC1), the position of a vehicle traveling near a roadside object can be accurately detected. Further, by narrowing the grouping regions G1B, G2B, and G3B to an appropriate range, the capture points T2G and T3G corresponding to the roadside object can be distinguished from the capture points T2 and T3 of the other vehicle, and the position of the vehicle traveling in the vicinity of the roadside object can be detected more accurately.

  In the present embodiment, the case where the vehicle detection unit 14 narrows the grouping regions G1B, G2B, and G3B near the roadside object is described. However, the vehicle detection unit 14 may take any form that changes, near the roadside object, the condition for determining whether capture points acquired via the rear radar sensor 24 are capture points of one object. For example, the vehicle detection unit 14 may use a threshold on the difference in relative speed between capture points as a condition for determining whether they are capture points of one object. That is, one of the conditions for determining that a plurality of capture points are capture points of one object is that the difference in relative speed between the capture points is equal to or less than a predetermined threshold set in advance (hereinafter, the "speed difference threshold"). The vehicle detection unit 14 then reduces this speed difference threshold in the vicinity of a roadside object. In this case as well, a capture point of a roadside object, which is a stationary object, can be prevented from being grouped as a capture point of the other vehicle (here, the rear vehicle VC1), which is a moving body.
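
  A minimal sketch of this alternative follows; the threshold values are assumptions (the patent gives no numbers), and the relative speeds are signed so that a stationary roadside object recedes at the host vehicle's own speed.

    # Assumed thresholds; the patent says only that the speed difference
    # threshold is made small in the vicinity of a roadside object.
    SPEED_DIFF_THRESH_MPS = 3.0        # normal speed difference threshold
    SPEED_DIFF_THRESH_NEAR_MPS = 0.5   # tightened threshold near a roadside object

    def same_object_by_speed(rel_speed_a: float, rel_speed_b: float,
                             near_roadside: bool) -> bool:
        """One grouping condition: two capture points may belong to one object
        only if their relative speeds roughly agree."""
        thresh = SPEED_DIFF_THRESH_NEAR_MPS if near_roadside else SPEED_DIFF_THRESH_MPS
        return abs(rel_speed_a - rel_speed_b) <= thresh

    # A rear vehicle 2 m/s slower than the host and a stationary guardrail
    # echo differ by 2 m/s in relative speed: the normal threshold would
    # still group them, while the tightened one separates them.
    assert same_object_by_speed(-18.0, -20.0, near_roadside=False)
    assert not same_object_by_speed(-18.0, -20.0, near_roadside=True)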

  Furthermore, since the grouping areas G1B, G2B, and G3B are narrowed to a predetermined ratio (here, 2/3), the grouping area can be changed easily. In addition, by setting the predetermined ratio to an appropriate value (here, 2/3), the capture points T2G and T3G corresponding to the roadside object can be distinguished from the capture points T2 and T3 of the other vehicle, and the position of the vehicle traveling in the vicinity of the roadside object can be detected more accurately.

  In the present embodiment, a case is described in which the vehicle detection unit 14 narrows the grouping regions G1B, G2B, and G3B to a predetermined ratio in the vicinity of the roadside object, but any form that narrows the grouping regions G1B, G2B, and G3B near the roadside object may be used. For example, the vehicle detection unit 14 may set the grouping regions G2B and G3B according to the distance from the roadside object; for instance, it may set the grouping regions G1B, G2B, and G3B narrower as the capture point is closer to the roadside object. In this case, the position of the other vehicle (here, the rear vehicle VC1) can be detected more accurately.
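
  One possible realization of this distance-dependent narrowing is sketched below; the linear model and its constants are assumptions, not taken from the patent.

    def width_scale(dist_to_roadside_m: float,
                    vicinity_m: float = 5.0,        # assumed extent of the vicinity
                    min_scale: float = 2.0 / 3.0) -> float:
        """Scale factor for the grouping-region width: 1.0 outside the
        vicinity of the roadside object, shrinking linearly to `min_scale`
        as the capture point approaches the roadside object."""
        if dist_to_roadside_m >= vicinity_m:
            return 1.0
        return min_scale + (1.0 - min_scale) * (dist_to_roadside_m / vicinity_m)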

FIGS. 5 and 6 are flowcharts showing an example of the operation of the object detection ECU 1. First, as shown in FIG. 5, the roadside object detection unit 11 detects an object present around the vehicle via the front radar sensor 21 (S101). Next, the roadside object detection unit 11 detects an object present around the vehicle via the left front side radar sensor 22 (S103). Next, the roadside object detection unit 11 detects an object present around the vehicle via the right front side radar sensor 23 (S105). Then, the determination unit 12 determines whether an object has been detected in any of steps S101, S103, and S105 (S107). If it is determined that no object has been detected (NO in S107), the process returns to step S101, and the processes from step S101 onward are repeated.

  If it is determined that an object has been detected (YES in S107), the determination unit 12 performs a roadside object determination process to determine whether the object detected in any of steps S101, S103, and S105 is a roadside object (S109). Then, the determination unit 12 checks whether the object was determined to be a roadside object in step S109 (S111). If it is determined that the object is not a roadside object (NO in S111), the process returns to step S101, and the processes from step S101 onward are repeated. If it is determined that the object is a roadside object (YES in S111), the position acquisition unit 13 acquires the position information of the roadside object detected in any of steps S101, S103, and S105 (S113).

  Then, as shown in FIG. 6, the vehicle detection unit 14 detects an object via the rear radar sensor 24 (S115). Next, the vehicle detection unit 14 determines whether an object was detected in step S115 (S117). If it is determined that no object has been detected (NO in S117), the process returns to step S115, and the processes from step S115 onward are repeated. If it is determined that an object has been detected (YES in S117), the vehicle detection unit 14 determines, based on the position information of the roadside object acquired in step S113 shown in FIG. 5, whether the object detected in step S115 is in the vicinity of the roadside object detected in steps S101, S103, and S105 shown in FIG. 5 (for example, whether the distance from the roadside object is within a predetermined distance, such as 5 m, set in advance) (S119).

  If it is determined that the object is not near the roadside object (NO in S119), the process proceeds to step S123. If it is determined that the object is near the roadside object (YES in S119), the vehicle detection unit 14 narrows the grouping area to the predetermined ratio set in advance (S121). When the process of step S121 is completed, or in the case of NO in step S119, the vehicle detection unit 14 determines whether two or more capture points are detected in the grouping area (S123). If it is determined that only one capture point has been detected (NO in S123), the process proceeds to step S127. If it is determined that two or more capture points are detected (YES in S123), the vehicle detection unit 14 calculates the average of the capture point positions as the representative position (S125). In the case of NO in step S123, or when the process of step S125 is completed, the vehicle detection unit 14 outputs the representative position information to the collision determination ECU 3 (S127), and the process ends.
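
  The overall S101-S127 flow can be condensed into the following Python sketch. The sensor inputs, the roadside predicate, and the collision-ECU callback are placeholders standing in for the hardware interfaces; the 5 m vicinity threshold is the example value from the text, while the grouping-rectangle sizes are assumed.

    import math

    NEAR_ROADSIDE_M = 5.0  # S119: example vicinity distance from the text

    def distance(p, q):
        return math.hypot(p[0] - q[0], p[1] - q[1])

    def group_capture_points(anchor, points, narrow):
        """S121/S123: collect the capture points inside the anchor's grouping
        rectangle, narrowed in the width direction near a roadside object."""
        half_w = 1.5 * (2.0 / 3.0 if narrow else 1.0)   # assumed sizes
        half_l = 3.0
        return [p for p in points
                if abs(p[0] - anchor[0]) <= half_w and abs(p[1] - anchor[1]) <= half_l]

    def detection_cycle(detected_objects, rear_points, is_roadside, send_to_collision_ecu):
        """One pass of the FIG. 5 / FIG. 6 flow; positions are (x, y) tuples."""
        if not detected_objects:                      # S107: nothing detected
            return
        roadside = [o for o in detected_objects if is_roadside(o)]   # S109/S111
        if not roadside:
            return
        # S113: the surviving positions are the acquired roadside position info
        if not rear_points:                           # S115/S117
            return
        anchor = rear_points[0]                       # capture point of interest
        near = any(distance(anchor, r) <= NEAR_ROADSIDE_M for r in roadside)  # S119
        grouped = group_capture_points(anchor, rear_points, narrow=near)      # S121/S123
        if len(grouped) >= 2:                         # S125: average position
            rep = (sum(p[0] for p in grouped) / len(grouped),
                   sum(p[1] for p in grouped) / len(grouped))
        else:
            rep = grouped[0]
        send_to_collision_ecu(rep)                    # S127: output representative position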

  FIG. 7 is a detailed flowchart showing an example of the roadside object determination process executed in step S109 shown in FIG. 5. Note that all of the following processing is performed by the determination unit 12. First, it is determined whether the object detected in any of steps S101, S103, and S105 in FIG. 5 is a stationary object (S201). If it is determined that the object is not a stationary object (NO in S201), it is determined that the object is not a roadside object (S215), and the process returns to step S111 in FIG. 5. If it is determined that the object is a stationary object (YES in S201), yaw rate information and steering angle information are acquired from the yaw rate sensor 25 and the steering sensor 26, respectively (S203).

  Then, based on the yaw rate information and the steering angle information acquired in step S203, the traveling track of the host vehicle is estimated (S207). Next, the traveling lane of the host vehicle is estimated based on the traveling track obtained in step S207 (S209). Next, it is determined whether the object detected in any of steps S101, S103, and S105 in FIG. 5 is outside the traveling lane of the host vehicle estimated in step S209 (S211). If it is determined that the object is outside the traveling lane (YES in S211), it is determined to be a roadside object (S213), and the process returns to step S111 in FIG. 5. If it is determined that the object is inside the traveling lane (NO in S211), it is determined not to be a roadside object (S215), and the process returns to step S111 in FIG. 5.
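
  A sketch of this S201-S215 determination follows. The constant-curvature track model, the bicycle-model fallback at low speed, the wheelbase, the lane half-width, and the stationarity tolerance are all assumptions; the patent states only that the traveling track is estimated from the yaw rate and steering angle and that the lane is derived from that track.

    import math

    WHEELBASE_M = 2.7          # assumed wheelbase for the low-speed fallback
    LANE_HALF_WIDTH_M = 1.75   # assumed half-width of the traveling lane
    STATIONARY_EPS_MPS = 0.5   # assumed tolerance for "stationary object"

    def is_roadside_object(obj_x, obj_y, obj_ground_speed,
                           yaw_rate, steer_angle, ego_speed):
        """FIG. 7: a roadside object is a stationary object outside the lane.

        obj_x / obj_y: lateral / longitudinal object position [m] in the
        host-vehicle frame; speeds in m/s, angles in rad."""
        if abs(obj_ground_speed) > STATIONARY_EPS_MPS:   # S201: not stationary
            return False                                  # S215
        # S203-S207: estimate the traveling track as a constant-curvature arc,
        # from the yaw rate, or from the steering angle at very low speed.
        if ego_speed > 1.0:
            curvature = yaw_rate / ego_speed
        else:
            curvature = math.tan(steer_angle) / WHEELBASE_M   # bicycle model
        # S209: lateral offset of the lane center at distance obj_y (arc approx.)
        lane_center_x = 0.5 * curvature * obj_y ** 2
        # S211: roadside object iff outside the estimated traveling lane
        return abs(obj_x - lane_center_x) > LANE_HALF_WIDTH_M   # S213 / S215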

  Here, a modification of the vehicle detection unit 14 shown in FIG. 2 is described. FIG. 8 is a plan view showing an example of another detection method by the vehicle detection unit 14. Here, the vehicle detection unit 14 determines the representative position, which is the position of the other vehicle (here, the rear vehicle VC1) used in collision determination with the other vehicle, based on the position information of the roadside object (here, the guardrail GL) acquired by the position acquisition unit 13. Specifically, the vehicle detection unit 14 determines, as the representative position of the other vehicle, the position of the capture point farthest from the roadside object among the capture points corresponding to the other vehicle acquired from the rear radar sensor 24.

  The processing of the vehicle detection unit 14 is described concretely using FIG. 8. When the rear vehicle VC1 reaches the position P2, a capture point T2, indicated by a white circle (○), at the right front end of the rear vehicle VC1 and a capture point T2G, indicated by a white circle (○), on the guard rail GL are detected via the rear radar sensor, and a grouping region G2C is set. Since the grouping region G2C includes two capture points (capture point T2 and capture point T2G), the vehicle detection unit 14 determines the position of the capture point T2, which is farthest from the guardrail GL (the position P2C, indicated by a black circle (●)), as the representative position of the rear vehicle VC1.

  Further, when the rear vehicle VC1 reaches the position P3, a capture point T3, indicated by a white circle (○), at the right front end of the rear vehicle VC1 and a capture point T3G, indicated by a white circle (○), on the guard rail GL are detected via the rear radar sensor, and a grouping region G3C is set. Since the grouping region G3C includes two capture points (capture point T3 and capture point T3G), the vehicle detection unit 14 determines the position of the capture point T3, which is farthest from the guardrail GL (the position P3C, indicated by a black circle (●)), as the representative position of the rear vehicle VC1.

  As a result, the representative position of the rear vehicle VC1 is determined to pass through the positions P1C, P2C, and P3C, and the rear vehicle VC1 is judged to travel along the locus LC indicated by the thick broken line in FIG. 8. The rear vehicle VC1 is thus detected as traveling along the locus LC passing the left side of the host vehicle VC0, in agreement with its actual travel route (the route indicated by the thin broken line in the figure). Since the influence of the capture points T2G and T3G corresponding to the guard rail GL is thereby eliminated, the position of the rear vehicle VC1 can be accurately detected.

  Thus, since the representative position of the other vehicle (here, the rear vehicle VC1) is determined based on the position information of the roadside object (here, the guard rail GL), the influence of the capture points T2G and T3G corresponding to the roadside object on the representative position of the other vehicle can be eliminated, and an appropriate representative position can be determined. Of the capture points T2 and T2G (or T3 and T3G) obtained from the rear radar sensor 24, at least the capture point farthest from the roadside object (T2 or T3) can be presumed not to be a capture point corresponding to the roadside object. Therefore, the influence of the capture points T2G and T3G corresponding to the roadside object on the representative position of the other vehicle can be eliminated with a simple configuration, and an appropriate representative position can be determined.

  In the embodiment shown in FIG. 8, the case where the vehicle detection unit 14 determines the position of the capture point farthest from the roadside object (here, the guard rail GL) as the representative position of the other vehicle (here, the rear vehicle VC1) is described, but the vehicle detection unit 14 may take any form that determines the representative position of the other vehicle based on the position information of the roadside object.

  For example, the vehicle detection unit 14 may determine, as the representative position, the position of a capture point that is separated from the roadside object by at least a predetermined distance (for example, 1 m). That is, capture points T2 and T3 separated from the roadside object by the predetermined distance (for example, 1 m) or more can be presumed not to be the capture points T2G and T3G corresponding to the roadside object. Therefore, the influence of the capture points T2G and T3G corresponding to the roadside object on the representative position of the other vehicle can be eliminated with a simple configuration, and an appropriate representative position can be determined.
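
  Both variants of this rule (the farthest capture point, and any capture point at least a predetermined distance away) can be sketched as follows. The 1 m default mirrors the example distance in the text; choosing the farthest qualifying point as the tie-break is an assumption, since the patent leaves that open.

    import math

    def representative_position(capture_points, roadside_pos,
                                min_dist_m=1.0, use_farthest=True):
        """Pick the other vehicle's representative position from the grouped
        capture points using the roadside-object position (FIG. 8 variant).

        use_farthest=True  -> the capture point farthest from the roadside object
        use_farthest=False -> a capture point at least min_dist_m away from it"""
        def dist(p):
            return math.hypot(p[0] - roadside_pos[0], p[1] - roadside_pos[1])
        if use_farthest:
            return max(capture_points, key=dist)
        candidates = [p for p in capture_points if dist(p) >= min_dist_m]
        return max(candidates or capture_points, key=dist)

    # E.g., with T3 on the rear vehicle and T3G on the guardrail (illustrative
    # coordinates), T3 is selected as the representative position P3C:
    t3, t3g = (2.5, -8.0), (0.2, -8.5)
    assert representative_position([t3, t3g], roadside_pos=(0.0, -8.5)) == t3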

  Next, a modification of the object detection device shown in FIGS. 1 and 2 will be described. FIG. 9 is a block diagram showing a modification of the configuration of the object detection apparatus according to the present invention. As shown in FIG. 9, an input device 2A is provided instead of the input device 2 shown in FIG. 1, and an object detection ECU 1A is provided instead of the object detection ECU 1 shown in FIGS. 1 and 2. Here, only the differences from the object detection apparatus shown in FIGS. 1 and 2 will be described; description of common components is omitted.

  The input device 2A includes a white line detection camera 27 instead of the front radar sensor 21, the left front side radar sensor 22, the right front side radar sensor 23, the yaw rate sensor 25, and the steering sensor 26 of the input device 2. The white line detection camera 27 (corresponding to the second sensor) is a camera that includes a CCD (Charge Coupled Device) camera or the like and detects the positions of the white lines drawn on both sides of the lane in which the host vehicle is traveling. The white line detection camera 27 also detects a roadside object, such as a guard rail, disposed outside the white lines drawn on both sides of the lane in which the host vehicle is traveling.

  In the form shown in FIG. 9, the object detection ECU 1A differs from the object detection ECU 1 shown in FIG. 2 in the following points. The roadside object detection unit 11 detects an object existing around the host vehicle via the white line detection camera 27 instead of the front radar sensor 21, the left front side radar sensor 22, and the right front side radar sensor 23. Further, the determination unit 12 estimates the lane in which the host vehicle is traveling via the white line detection camera 27 instead of the yaw rate sensor 25 and the steering sensor 26.

  In this way, the lane in which the host vehicle is traveling is estimated via the white line detection camera 27, and objects existing around the host vehicle are detected via the same camera, so the position of a vehicle traveling near a roadside object can be accurately detected without newly providing a radar sensor as the second sensor.

  In the embodiment shown in FIG. 9, the case where the first sensor is the rear radar sensor 24 has been described. However, the first sensor may be any radar sensor that detects an object existing in at least one of the front, front-side, rear, and rear-side directions of the vehicle. For example, the first sensor may be a radar sensor that detects an object present on the front side of the vehicle. In any case, it is preferable that the second sensor (here, the white line detection camera 27) be a sensor configured to detect an object existing ahead of the detection area of the first sensor (here, the rear radar sensor 24).

  In other words, in this case, since the second sensor (here, the white line detection camera 27) is configured to detect an object existing ahead of the detection area of the first sensor (here, the rear radar sensor 24), a roadside object present ahead of the other vehicle can be detected via the second sensor. Therefore, by the time the other vehicle reaches the vicinity of the roadside object, the influence of the capture points corresponding to the roadside object on the detection position of the other vehicle can be eliminated, and the position of the vehicle traveling near the roadside object can be accurately detected.

In addition, the object detection device according to the present invention is not limited to the above embodiment and may take the following forms.
(A) Although the present embodiment describes the case where the object detection ECU 1 functionally includes the roadside object detection unit 11, the determination unit 12, the position acquisition unit 13, the vehicle detection unit 14, and the like, at least one of these functional units may be realized by hardware such as an electric circuit.

  (B) Although the case where the roadside object is the guard rail GL has been described in the present embodiment, the roadside object may be another type of roadside object (for example, a side wall, a streetlight, or a building).

  The present invention can be applied to, for example, an object detection device that is mounted on a vehicle and detects an object existing around the vehicle via a first sensor including a radar sensor.

1, 1A Object detection ECU (object detection device)
11 Roadside object detection unit (roadside object detection means)
12 Determination unit (determination means)
13 Position acquisition unit (position acquisition means)
14 Vehicle detection unit (vehicle detection means)
2 Input device
21 Front radar sensor (part of second sensor)
22 Left front side radar sensor (part of second sensor)
23 Right front side radar sensor (part of second sensor)
24 Rear radar sensor (first sensor)
25 Yaw rate sensor
26 Steering sensor
27 White line detection camera (second sensor)
3 Collision determination ECU

Claims (14)

  1. An object detection device that is mounted on a vehicle and detects an object existing around the vehicle via a first sensor including a radar sensor, comprising:
    roadside object detection means for detecting an object present around the vehicle via a second sensor that is mounted on the vehicle and is different from the first sensor;
    determination means for determining whether a roadside object is detected by the roadside object detection means;
    position acquisition means for acquiring position information of the roadside object via the second sensor when the determination means determines that the roadside object is detected; and
    vehicle detection means for detecting another vehicle traveling in the vicinity of the roadside object via the first sensor based on the position information of the roadside object acquired by the position acquisition means.
  2.   The object detection device according to claim 1, wherein the vehicle detection means changes, in the vicinity of the roadside object, a condition for determining whether capture points acquired via the first sensor are capture points of one object.
  3.   The object detection device according to claim 2, wherein the vehicle detection means narrows a grouping area, which is an area for determining whether capture points acquired via the first sensor are capture points of one object.
  4.   The object detection device according to claim 3, wherein the vehicle detection means narrows the grouping area to a predetermined ratio set in advance.
  5.   The object detection device according to claim 1, wherein the vehicle detection means determines a representative position, which is a position of the other vehicle used in collision determination with the other vehicle, based on the position information of the roadside object acquired by the position acquisition means.
  6.   The object detection device according to claim 5, wherein the vehicle detection means determines, as the representative position, a position of a capture point that is separated from the roadside object by at least a predetermined distance, among the capture points corresponding to the other vehicle acquired from the first sensor.
  7.   The object detection device according to claim 5, wherein the vehicle detection means determines, as the representative position of the other vehicle, a position of the capture point farthest from the roadside object among the capture points corresponding to the other vehicle acquired from the first sensor.
  8.   The object detection device according to claim 1, wherein the second sensor is a sensor configured to be able to detect an object existing ahead of a detection region of the first sensor.
  9.   The object detection device according to claim 1, wherein the second sensor is a radar sensor that detects an object existing in at least one of a front direction and a front-side direction of the vehicle, and the first sensor detects an object existing in at least one of a front, front-side, rear, and rear-side direction of the vehicle.
  10.  The object detection device according to claim 1, wherein the second sensor is a camera arranged to detect a white line drawn on a road ahead of the vehicle, and the first sensor detects an object existing in at least one of a front, front-side, rear, and rear-side direction of the vehicle.
  11.   The object detection device according to claim 1, wherein the roadside object includes a guardrail.
  12.  The object detection device according to claim 1, wherein the determination means determines, via at least one of a yaw rate sensor and a steering sensor, whether the object detected by the roadside object detection means is a roadside object.
  13.  The object detection device according to claim 1, wherein the determination means determines whether the object detected by the roadside object detection means is a roadside object based on whether the object exists outside a lane in which the vehicle is traveling.
  14.  The object detection device according to claim 1, wherein the determination means determines whether the object detected by the roadside object detection means is a roadside object based on whether the object is a stationary object.
JP2009056734A 2009-03-10 2009-03-10 Object detection device Pending JP2010211504A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2009056734A JP2010211504A (en) 2009-03-10 2009-03-10 Object detection device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2009056734A JP2010211504A (en) 2009-03-10 2009-03-10 Object detection device

Publications (1)

Publication Number Publication Date
JP2010211504A true JP2010211504A (en) 2010-09-24

Family

ID=42971592

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2009056734A Pending JP2010211504A (en) 2009-03-10 2009-03-10 Object detection device

Country Status (1)

Country Link
JP (1) JP2010211504A (en)


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2015181611A2 (en) 2014-05-29 2015-12-03 Toyota Jidosha Kabushiki Kaisha Driving support apparatus
US10106154B2 (en) 2014-05-29 2018-10-23 Toyota Jidosha Kabushiki Kaisha Driving support apparatus
US10696297B2 (en) 2014-05-29 2020-06-30 Toyota Jidosha Kabushiki Kaisha Driving support apparatus
WO2016063532A1 (en) * 2014-10-22 2016-04-28 株式会社デンソー In-vehicle object determining apparatus
WO2016063533A1 (en) * 2014-10-22 2016-04-28 株式会社デンソー In-vehicle object determining apparatus
JP6289767B1 (en) * 2017-03-07 2018-03-07 三菱電機株式会社 Failure detection apparatus, failure detection method, and failure detection program
WO2018163277A1 (en) * 2017-03-07 2018-09-13 三菱電機株式会社 Failure detection device, failure detection method, and failure detection program


Legal Events

Date Code Title Description
RD02 Notification of acceptance of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7422

Effective date: 20110901