JP2009157668A - External recognition device - Google Patents

External recognition device

Info

Publication number
JP2009157668A
Authority
JP
Japan
Prior art keywords
vehicle
vehicle information
information acquisition
collision
acquisition means
Prior art date
Legal status
Granted
Application number
JP2007335502A
Other languages
Japanese (ja)
Other versions
JP4982353B2 (en)
Inventor
Hiroshi Sakamoto (博史 坂本)
Taketo Ogata (健人 緒方)
Kazutoshi Tsuchiya (和利 土屋)
Masato Imai (正人 今井)
Takeshi Inoue (健士 井上)
Current Assignee
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to JP2007335502A
Publication of JP2009157668A
Application granted
Publication of JP4982353B2
Legal status: Active

Landscapes

  • Image Processing (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: When an abnormality occurs in the own-vehicle information used for collision determination, the function of the system is severely degraded.

SOLUTION: An external environment recognition device includes: own-vehicle information acquisition means for acquiring information on the own vehicle; object information acquisition means for acquiring information on an object detected by an external recognition sensor; predicted course setting means for setting the predicted course of the own vehicle based on the own-vehicle information acquired by the own-vehicle information acquisition means; first collision determination means for determining the possibility of a collision between the detected object and the own vehicle based on the object information acquired by the object information acquisition means and the predicted course set by the predicted course setting means; second collision determination means for determining the possibility of a collision between the detected object and the own vehicle based on the object information acquired by the object information acquisition means; own-vehicle information determination means for determining whether the own-vehicle information acquired by the own-vehicle information acquisition means is abnormal; and collision determination selection means for selecting the second collision determination means when the own-vehicle information determination means determines that the own-vehicle information is abnormal.

COPYRIGHT: (C)2009, JPO&INPIT

Description

The present invention relates to an external environment recognition device that determines the possibility of a collision between an object detected by an external recognition sensor and the host vehicle.

The number of fatalities from traffic accidents has been declining thanks to the introduction of post-collision occupant protection known as collision safety (airbags, crash-safe body structures), but the number of accidents has been rising in step with the number of vehicles on the road. To reduce these accidents, developing preventive safety systems that keep accidents from happening in the first place is important. A preventive safety system operates before an accident occurs: for example, pre-crash safety systems have been put into practical use that alert the driver with a warning when a collision with an object ahead of the vehicle becomes possible, and that mitigate occupant injury with automatic braking when the collision becomes unavoidable.

There is a known external environment recognition device that determines the possibility of a collision between the host vehicle and an object ahead of it detected by a radar or camera (see Patent Document 1). Patent Document 1 describes a first collision determination method that predicts the course of the host vehicle as a turning radius from the vehicle speed, steering angle, and yaw rate, and determines the collision possibility from this predicted course and the relative position of the object detected by the external recognition sensor; and a second collision determination method that calculates a relative velocity vector from the change in the object's relative position and determines the collision possibility from that vector. With this approach, the first determination result is emphasized for objects with a small longitudinal relative velocity and the second for objects with a large longitudinal relative velocity, which makes it possible to determine the collision possibility accurately in a scene where, after steering around a stopped vehicle ahead, the host vehicle closes in on a slow vehicle traveling in front.

Patent Document 1: JP 2005-25458 A

However, the first collision determination method uses sensor information such as the vehicle speed, steering angle, and yaw rate. When an abnormality occurs in this information (for example, a failure of the vehicle speed sensor, steering angle sensor, or yaw rate sensor, or a communication fault when this information is received from another control unit), the accuracy of the first collision determination may drop sharply, and as a result the entire system would have to be disabled.

Accordingly, an object of the present invention is to provide an external environment recognition device that minimizes the performance degradation of the whole system even when an abnormality occurs in the own-vehicle information used to determine the collision possibility from the predicted course of the host vehicle and the relative position of the object detected by the external recognition sensor.

To solve the above problem, one preferred aspect of the present invention is as follows.

The external environment recognition device includes: own-vehicle information acquisition means for acquiring information on the host vehicle; object information acquisition means for acquiring information on an object detected by an external recognition sensor; predicted course setting means for setting the predicted course of the host vehicle based on the own-vehicle information acquired by the own-vehicle information acquisition means; first collision determination means for determining the possibility of a collision between the detected object and the host vehicle based on the object information acquired by the object information acquisition means and the predicted course set by the predicted course setting means; second collision determination means for determining the possibility of a collision between the detected object and the host vehicle based on the object information acquired by the object information acquisition means; own-vehicle information determination means for determining whether the own-vehicle information acquired by the own-vehicle information acquisition means is abnormal; and collision determination selection means for selecting the second collision determination means when the own-vehicle information determination means determines that the own-vehicle information is abnormal.

According to the present invention, an external environment recognition device can be provided that minimizes the performance degradation of the whole system even when an abnormality occurs in the own-vehicle information used to determine the collision possibility from the predicted course of the host vehicle and the relative position of the object detected by the external recognition sensor.

Embodiments will now be described in detail with reference to FIGS. 1 to 10.

FIG. 1 is a block diagram of the external environment recognition device 100.

The control described below is programmed into the external environment recognition device 100 and is executed repeatedly at a predetermined cycle.

The own-vehicle information acquisition means 1 acquires own-vehicle information such as the vehicle speed Vsp, steering angle α, and yaw rate γ from the detection signals of the vehicle speed sensor, steering angle sensor, and yaw rate sensor. This information may be acquired by feeding the sensor signals directly into the external environment recognition device 100, or, when the sensor signals are input to other control units, by communicating with those units over a LAN (Local Area Network).

The object information acquisition means 2 acquires the relative distance PY[i], lateral position PX[i], and width WD[i] of objects around the host vehicle from the detection signals of an external recognition sensor such as a radar or camera, where i is the object ID number when multiple objects are detected. The object information may be acquired by feeding the sensor signal directly into the external environment recognition device 100, or by communicating with the external recognition sensor over a LAN (Local Area Network).

The predicted course setting means 3 calculates the predicted course of the host vehicle from the own-vehicle information (vehicle speed Vsp, steering angle α, yaw rate γ) acquired by the own-vehicle information acquisition means 1. Here, the turning radius R is calculated as the predicted course of the host vehicle; the method of calculating R is described later.

The first collision determination means 4 calculates the first risk level DREC1[i] from the predicted course calculated by the predicted course setting means 3 and the object information (relative distance PY[i], lateral position PX[i], width WD[i]) acquired by the object information acquisition means 2, where i is the object ID number when multiple objects are detected. The method of calculating DREC1[i] is described later.

The second collision determination means 5 calculates the second risk level DREC2[i] from the object information (relative distance PY[i], lateral position PX[i], width WD[i]) acquired by the object information acquisition means 2, where i is the object ID number when multiple objects are detected. The method of calculating DREC2[i] is described later.

The own-vehicle information determination means 6 judges whether each item of the own-vehicle information (vehicle speed Vsp, steering angle α, yaw rate γ) acquired by the own-vehicle information acquisition means 1 is normal, and computes the diagnostic NG flag fDGNVCAN. The method of computing fDGNVCAN is described later.

The collision determination selection means 7 decides, according to the diagnostic result of the own-vehicle information determination means 6, whether to emphasize the first risk level DREC1[i] produced by the first collision determination means 4 or the second risk level DREC2[i] produced by the second collision determination means 5, and computes the integrated risk DRECI[i]. The selection method and the method of computing DRECI[i] are described later.

Next, taking the pre-crash safety system described above as an example, the operation of a system that outputs a warning or automatically controls the brakes according to the integrated risk DRECI[i] will be described with reference to FIG. 2. FIG. 2 is a flowchart of the operation of the pre-crash safety system.

First, in step 201, the object information is read. Next, in step 202, the predicted collision time TTC[i] of each object detected by the external recognition sensor is calculated using equation (1), where the relative velocity VY[i] is obtained by pseudo-differentiating the object's relative distance PY[i].

TTC[i] = PY[i] ÷ VY[i]   (1)

Then, in step 203, the integrated risk DRECI[i] calculated by the external environment recognition device 100 is read. Steps 201 to 203 are executed in a loop over the number of detected objects.
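The pseudo-differentiation and the TTC computation of equation (1) could be sketched as follows (a minimal sketch: the sample period dt, the simple backward difference, and the sign convention for VY are assumptions, since the patent does not specify the filter):

```python
def pseudo_diff(prev_py, curr_py, dt):
    """Approximate the relative velocity VY[i] by differencing the
    relative distance PY[i] over one control cycle (simple backward
    difference; the patent's exact pseudo-differentiation is not given)."""
    return (curr_py - prev_py) / dt

def ttc(py, vy):
    """Equation (1): predicted collision time TTC = PY / VY.
    With PY decreasing as the object closes in, VY is negative, so the
    time until PY reaches zero is PY / -VY; a non-closing object is
    assigned an infinite TTC."""
    if vy >= 0.0:
        return float("inf")
    return py / -vy

# Object 40 m ahead closing at 10 m/s: TTC is 4 s.
vy = pseudo_diff(prev_py=40.5, curr_py=40.0, dt=0.05)
```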

In step 204, the objects satisfying the condition of equation (2) on the integrated risk DRECI[i] read in step 203 are selected, and among them the object k with the smallest predicted collision time TTC[i] is chosen.

DRECI[i] ≧ cDRECI#   (2)

Here, the predetermined value cDRECI# is the threshold for judging whether an object will collide with the host vehicle. Next, in step 205, it is judged from the predicted collision time TTC[k] of the selected object k whether the range for automatic brake control has been reached. If equation (3) holds, the process proceeds to step 206, brake control is executed, and the process ends; if equation (3) does not hold, the process proceeds to step 207.

TTC[k]≦cTTCBRK# (3)
ステップ207において、選択された物体kの衝突予測時間TTC[k]に応じて警報を出力する範囲であるか否かの判定を行う。式(4)が成立している場合にはステップ208に進み、警報を出力して処理を終了する。また、式(4)が非成立の場合には、ブレーキ制御,警報ともに実行せずに処理を終了する。
TTC [k] ≦ cTTCBRK # (3)
In step 207, it is determined whether or not the alarm is output in accordance with the predicted collision time TTC [k] of the selected object k. If the expression (4) is established, the process proceeds to step 208, an alarm is output, and the process is terminated. In addition, when Expression (4) is not established, neither the brake control nor the alarm is executed, and the process is terminated.

TTC[k] ≦ cTTCALM#   (4)

As described above, the integrated risk DRECI[i], which represents the result of the collision determination, is an important parameter that governs the operation of the system (automatic braking and alarm).

Next, the predicted course setting means 3 will be described with reference to FIG. 3. FIG. 3 is a schematic diagram of the processing of the predicted course setting means 3.

Taking the host vehicle position as the origin O, the predicted course can be approximated by an arc of turning radius R passing through the origin O. The turning radius R is expressed by equation (5) using the steering angle α, speed Vsp, stability factor A, wheelbase L, and steering gear ratio Gs of the host vehicle.

R = (1 + A·Vsp²) × (L·Gs/α)   (5)

The stability factor A is an important value whose sign governs the steer characteristics of the vehicle and which serves as an index of how strongly the vehicle's steady-state circular turning depends on speed. As equation (5) shows, the turning radius R changes in proportion to the square of the host vehicle speed Vsp, with the stability factor A as the coefficient. The turning radius R can also be expressed by equation (6) using the vehicle speed Vsp and the yaw rate γ.

R = Vsp/γ   (6)

As described above, by using the own-vehicle information (vehicle speed, steering angle, and yaw rate), the predicted course of the host vehicle can be approximated by an arc of turning radius R.
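Equations (5) and (6) could be sketched numerically as follows (the parameter values in the test are illustrative assumptions, not values from the patent):

```python
def radius_from_steer(vsp, alpha, A, L, Gs):
    """Equation (5): R = (1 + A*Vsp^2) * (L*Gs / alpha).
    vsp: vehicle speed, alpha: steering angle, A: stability factor,
    L: wheelbase, Gs: steering gear ratio."""
    return (1.0 + A * vsp ** 2) * (L * Gs / alpha)

def radius_from_yawrate(vsp, gamma):
    """Equation (6): R = Vsp / gamma, with gamma the yaw rate."""
    return vsp / gamma
```

Note that the two expressions give the device a natural redundancy: if either the steering angle or the yaw rate is lost, the other formula can still supply a turning radius, which the embodiment exploits later.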

Next, the processing of the first collision determination means 4 will be described, also with reference to FIG. 3.

With the relative distance of an object detected by an external recognition sensor such as a radar or camera denoted PY[i] and its lateral position PX[i], the distance r from the center of the arc of turning radius R to the object satisfies equation (7).

(PX[i] − R)² + PY[i]² = r²   (7)

The difference d between the turning radius R and the distance r is then given by equation (8).

d = |R − r|   (8)

As FIG. 3 shows, the smaller the radius difference d of equation (8), the closer the detected object is to the predicted course of the host vehicle, so the risk can be judged to be higher. For example, for an object whose difference d is no more than about half the vehicle width, the first risk level DREC1[i] is computed to be at least the predetermined value cDRECI#, so that the pre-crash safety system described with FIG. 2 operates.
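A minimal sketch of the first determination using equations (7) and (8); the step mapping from the difference d to the risk value is an assumption for illustration (the patent only requires that d at or below about half the vehicle width yield a risk of at least cDRECI#):

```python
import math

def risk1(px, py, R, vehicle_width, cDRECI):
    """First collision determination: the turning center lies at (R, 0),
    so the distance r to the object satisfies equation (7),
    (PX - R)^2 + PY^2 = r^2, and d = |R - r| is equation (8).
    Objects within about half the vehicle width of the predicted arc
    are given a risk of at least cDRECI (assumed mapping)."""
    r = math.hypot(px - R, py)
    d = abs(R - r)
    return cDRECI if d <= vehicle_width / 2.0 else 0.0
```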

Next, the processing of the second collision determination means 5 will be described with reference to FIG. 4. FIG. 4 is a schematic diagram of the processing of the second collision determination means 5.

With the relative distance of a detected object m denoted PY[m] and its lateral position PX[m], the relative velocity vector (VX[m], VY[m]) obtained by pseudo-differentiating these quantities points toward the host vehicle and is expected to cross the X axis within the vehicle width; object m can therefore be judged to have a high collision possibility. In contrast, with the relative distance of a detected object n denoted PY[n] and its lateral position PX[n], the relative velocity vector (VX[n], VY[n]) obtained in the same way is expected neither to cross the X axis within the vehicle width nor to cross the side of the host vehicle; object n can therefore be judged to have a low collision possibility.

As described above, the collision possibility can be determined by computing the relative velocity vector from the object information detected by the external recognition sensor and finding the intersection of that vector with the host vehicle. For example, for an object whose relative velocity vector points between points a and b in the figure, the second risk level DREC2[i] is computed to be at least the predetermined value cDRECI#, so that the pre-crash safety system described with FIG. 2 operates.
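The second determination could be sketched as follows (a sketch under assumptions: the crossing test at PY = 0 stands in for the a-to-b interval of FIG. 4, and the step risk mapping is illustrative):

```python
def risk2(px, py, vx, vy, half_width, cDRECI):
    """Second collision determination sketch: extrapolate the relative
    velocity vector (VX, VY) of a detected object and find where its
    path crosses the X axis (the host vehicle's front, at PY = 0).
    A crossing inside the vehicle width corresponds to the vector
    pointing between points a and b of FIG. 4; the risk mapping
    itself is an assumption."""
    if vy >= 0.0:              # not closing in the longitudinal direction
        return 0.0
    t = -py / vy               # time until PY reaches 0
    x_cross = px + vx * t      # lateral position at that moment
    return cDRECI if abs(x_cross) <= half_width else 0.0
```

Note that this path depends only on the object information, which is why it remains usable when the own-vehicle sensors are faulted.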

Next, the processing of the own-vehicle information determination means 6 will be described with reference to FIG. 5. FIG. 5 is a flowchart of the processing of the own-vehicle information determination means 6.

The bits #xDGNVSP, #xDGNSTR, and #xDGNYAW of the diagnostic NG flag fDGNVCAN indicate whether the own-vehicle information for the vehicle speed, steering angle, and yaw rate, respectively, is normal or abnormal; the corresponding bit is set when the signal is diagnosed as abnormal.

First, in step 501, the own-vehicle information (vehicle speed Vsp, steering angle α, yaw rate γ) is read, and in step 502 the previous values of this information (the values from the previous computation cycle) are stored as the previous vehicle speed Vspz, previous steering angle αz, and previous yaw rate γz. Next, in step 503, the difference between the previous vehicle speed Vspz and the vehicle speed Vsp is computed, and it is judged whether this difference has remained larger than a predetermined value (a threshold indicating an abnormality) for a predetermined time. If the condition does not hold and the vehicle speed Vsp is diagnosed as normal, the process proceeds to step 505. If the difference has remained larger than the predetermined value for the predetermined time and the vehicle speed Vsp is diagnosed as abnormal, the process proceeds to step 504, the bit #xDGNVSP of the diagnostic NG flag fDGNVCAN is set, and the process proceeds to step 505.

In step 505, the difference between the previous steering angle αz and the steering angle α is computed, and it is judged whether this difference has remained larger than a predetermined value (a threshold indicating an abnormality) for a predetermined time. If the condition does not hold and the steering angle α is diagnosed as normal, the process proceeds to step 507. If the difference has remained larger than the predetermined value for the predetermined time and the steering angle α is diagnosed as abnormal, the process proceeds to step 506, the bit #xDGNSTR of the diagnostic NG flag fDGNVCAN is set, and the process proceeds to step 507.

In step 507, the difference between the previous yaw rate γz and the yaw rate γ is computed, and it is judged whether this difference has remained larger than a predetermined value (a threshold indicating an abnormality) for a predetermined time. If the condition does not hold and the yaw rate γ is diagnosed as normal, the process ends. If the difference has remained larger than the predetermined value for the predetermined time and the yaw rate γ is diagnosed as abnormal, the process proceeds to step 508, the bit #xDGNYAW of the diagnostic NG flag fDGNVCAN is set, and the process ends.

In this way, the own-vehicle information determination means 6 can judge whether the vehicle speed, steering angle, and yaw rate that constitute the own-vehicle information are abnormal.
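The per-signal check of steps 503 to 508 could be sketched as follows (persistence is counted in control cycles here; the threshold, the cycle count, and the choice of a consecutive-cycle counter are assumptions, since the patent states only "larger than a predetermined value for a predetermined time"):

```python
class InfoDiagnostic:
    """Abnormality check of FIG. 5 for one own-vehicle signal: the NG
    bit is latched when the difference from the previous cycle's value
    stays above a threshold for a given number of consecutive cycles."""

    def __init__(self, diff_limit, ng_cycles):
        self.diff_limit = diff_limit  # threshold indicating an abnormality
        self.ng_cycles = ng_cycles    # "predetermined time" in cycles
        self.prev = None
        self.count = 0
        self.ng = False               # corresponds to one bit of fDGNVCAN

    def update(self, value):
        if self.prev is not None and abs(value - self.prev) > self.diff_limit:
            self.count += 1
        else:
            self.count = 0
        if self.count >= self.ng_cycles:
            self.ng = True            # set the bit; diagnosed as abnormal
        self.prev = value
        return self.ng
```

One such object per signal (Vsp, α, γ) reproduces the three bits #xDGNVSP, #xDGNSTR, and #xDGNYAW.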

Next, the processing of the collision determination selection means 7 will be described with reference to FIG. 6. FIG. 6 is a flowchart of the processing of the collision determination selection means 7.

Steps 602 to 607 are executed in a loop over the number of detected objects.

First, in step 601, the diagnostic NG flag fDGNVCAN is read, and in step 602 the object information is read. Next, in step 603, the relative velocity VY[i] is computed by pseudo-differentiating the object's relative distance PY[i], and the process proceeds to step 604. In step 604, the weight coefficient KW2[i] for the second risk level is computed from the relative velocity VY[i] according to equation (9).

KW2[i] = tKW2(VY[i])   (9)

Next, in step 605, it is judged whether the first collision determination, which uses the predicted course set from the own-vehicle information (vehicle speed Vsp, steering angle α, yaw rate γ), can be applied. Specifically, each bit of the diagnostic NG flag fDGNVCAN is checked; if all bits are clear (conditions C1) to C3) shown in step 605 of the figure), the process proceeds to step 606 and the integrated risk DRECI[i] is computed from the weight coefficient KW2[i] according to equation (10).

DRECI[i]=(1−KW2[i])×DREC1[i]
+KW2[i]×DREC2[i] (10)
また、ステップ605において、診断NG判定フラグfDGNVCANの少なくとも1つのビットがセットされている場合には、ステップ607に進み、統合化危険度DRECI[i]に危険度2DREC2[i]を代入する。
DRECI [i] = (1-KW2 [i]) × DREC1 [i]
+ KW2 [i] × DREC2 [i] (10)
If at least one bit of the diagnostic NG determination flag fDGNVCAN is set in step 605, the process proceeds to step 607, and the risk level 2DREC2 [i] is assigned to the integrated risk level DRECI [i].

As described above, when the own-vehicle information (vehicle speed, steering angle, yaw rate, etc.) is normal, the collision determination selection means 7 computes the integrated risk DRECI[i] by appropriately weighting the first risk level DREC1[i] and the second risk level DREC2[i] according to the longitudinal relative velocity of the object detected by the external recognition sensor. When the own-vehicle information is abnormal, using the second risk level DREC2[i], which does not depend on that information, keeps the performance degradation of the whole system to a minimum. The method of setting the predicted course described with FIG. 3 may also be changed by referring to the individual bits of the diagnostic NG flag fDGNVCAN: for example, when only the steering angle is judged abnormal, the turning radius R can be computed from the vehicle speed and yaw rate using equation (6), and when only the yaw rate is judged abnormal, from the vehicle speed and steering angle using equation (5).
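The selection of FIG. 6, equations (9) and (10), could be sketched as follows (the shape of the weighting map tKW2 is shown in the patent's figure but not reproduced here, so the constant map used in the test is a placeholder assumption):

```python
def integrated_risk(drec1, drec2, vy, fDGNVCAN, tKW2):
    """Collision determination selection (FIG. 6): if any bit of the
    diagnostic NG flag fDGNVCAN is set, fall back to the second risk
    level DREC2 alone (step 607); otherwise blend DREC1 and DREC2 with
    the relative-velocity-dependent weight of equations (9) and (10)."""
    if fDGNVCAN != 0:                 # some own-vehicle signal is abnormal
        return drec2
    kw2 = tKW2(vy)                    # equation (9): KW2 = tKW2(VY)
    return (1.0 - kw2) * drec1 + kw2 * drec2   # equation (10)
```

In the normal case the result varies smoothly between the two determinations; in the faulted case the own-vehicle-information-dependent term drops out entirely.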

Next, another embodiment of the external recognition device will be described with reference to FIGS. 7 and 8.

FIG. 7 is a processing configuration diagram of a sensor fusion system that combines information from a radar (the first external recognition sensor) and a camera (the second external recognition sensor) to improve the accuracy and reliability of object detection.

The external recognition device 100a receives host vehicle information such as the vehicle speed, steering angle, and yaw rate, together with radar information such as the relative distance, lateral position, and width of each object detected by the radar. By executing the same processing as in FIG. 1 on the radar information, the radar integrated risk RDRECI[i] is calculated. In the external recognition device 100a, a first object information acquisition means (not shown) acquires the relative distance, lateral position, and width of each radar-detected object as PYR[i], PXR[i], and WDR[i], respectively.

The image processing means 710 selects the high-risk objects among the radar detections according to the radar object information (PYR[i], PXR[i], WDR[i]) and the radar integrated risk RDRECI[i], performs image processing on the selected objects using the image captured by the camera, and outputs camera information such as the relative distance, lateral position, and width of each object.

The processing of the image processing means 710 will now be described with reference to FIG. 8, which is a flowchart of that processing.

First, in step 801, the radar integrated risk RDRECI[i] is read, and in step 802 it is determined whether RDRECI[i] is equal to or greater than a predetermined value cRDRECI#. If the condition of step 802 is satisfied, i.e., the object detected by the radar is judged likely to collide with the host vehicle, the process proceeds to step 803. If the condition is not satisfied, i.e., a collision is judged unlikely, the process ends.

Next, in step 803, the radar object information (PYR[i], PXR[i], WDR[i]) is read, and in step 804 a processing region on the image is set based on this radar object information and a camera geometric model (the relationship between positions on the image and actual positions). After the image processing region has been set, the process proceeds to step 805, where image processing such as pattern matching is executed while scanning this region to detect the object. The process then proceeds to step 806; if an object has been detected, its object information such as relative distance, lateral position, and width is output, and the process ends.
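As an illustration of step 804, a radar detection can be projected to an image region using a simple pinhole camera model. This is a sketch only: the focal length, camera height, image size, and assumed object height are invented parameters, and the patent does not specify its camera geometric model at this level of detail.

```python
def image_roi_from_radar(py_m, px_m, wd_m,
                         focal_px=800.0, cam_height_m=1.2,
                         img_w=640, img_h=480, obj_height_m=1.8):
    """Project one radar detection (longitudinal distance py_m, lateral
    position px_m, width wd_m, all in metres) to a rectangular image
    processing region (left, top, right, bottom) in pixels, assuming a
    pinhole camera looking along the road axis with its optical centre
    at the image centre."""
    u_center = img_w / 2 + focal_px * px_m / py_m           # object centre column
    v_bottom = img_h / 2 + focal_px * cam_height_m / py_m   # road-contact row
    half_w = focal_px * (wd_m / 2) / py_m
    height = focal_px * obj_height_m / py_m
    left = max(0, int(u_center - half_w))
    right = min(img_w, int(u_center + half_w))
    top = max(0, int(v_bottom - height))
    bottom = min(img_h, int(v_bottom))
    return left, top, right, bottom
```

Restricting the pattern-matching search to such a region, rather than scanning the whole frame, is what makes the fusion scheme cheaper and more reliable than camera-only detection.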

Returning to FIG. 7, the external recognition device 100b receives host vehicle information such as the vehicle speed, steering angle, and yaw rate, together with camera information such as the relative distance, lateral position, and width of each object detected by the camera. By executing the same processing as in FIG. 1 on the camera information, the integrated risk DRECI[i] is calculated. In the external recognition device 100b, a second object information acquisition means (not shown) acquires the relative distance, lateral position, and width of each camera-detected object as PY[i], PX[i], and WD[i], respectively.

The control means 720 calculates, according to the flowchart shown in FIG. 2, a command value for outputting a warning or automatically executing brake control based on the camera object information (PY[i], PX[i], WD[i]) and the integrated risk DRECI[i].

As described above, collision determination is performed according to host vehicle information such as the vehicle speed, steering angle, and yaw rate together with the information of objects detected by the radar (the first external recognition sensor); objects likely to collide with the host vehicle are then selected, and object detection by image processing is performed using the camera (the second external recognition sensor), which improves detection accuracy and reliability. By applying the external recognition device 100a to the process of selecting, from the radar information, the objects likely to collide with the host vehicle, the performance degradation of the whole system can be minimized even when an abnormality occurs in the host vehicle information used for the first collision determination.

Next, the processing of the host vehicle information determination means when host vehicle information is acquired through a predetermined communication means will be described with reference to FIG. 9, which is a flowchart of the host vehicle information determination means that diagnoses whether the communication means has failed.

As an example, the case where the external recognition device receives vehicle speed data will be described.

First, in step 901, it is determined whether vehicle speed data has been received by referring to the reception buffer for the vehicle speed data. When vehicle speed data has been received, the process proceeds to step 902, where the vehicle speed data reception flag fVCANRCV(#xVSPRCV) is set, and then to step 904, where the value of the reception buffer is assigned to the vehicle speed Vsp, before proceeding to step 905. When no vehicle speed data has been received, the process proceeds from step 901 to step 903, where the flag fVCANRCV(#xVSPRCV) is cleared, and then to step 905.

Next, in step 905, it is determined whether the vehicle speed data reception flag fVCANRCV(#xVSPRCV) has remained cleared for a predetermined time. If it has, the communication means is judged to have failed, the process proceeds to step 906, the diagnostic NG determination flag fDGNVCAN(#xDGNVSP) is set, and the process ends. If the flag has not remained cleared for the predetermined time, the process simply ends.

As described above, the external recognition device measures the interval at which it receives host vehicle information such as the vehicle speed, steering angle, and yaw rate, and can diagnose an abnormality of the communication means by determining whether this reception interval exceeds a predetermined time (approximately the configured communication cycle).
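The reception-timeout diagnosis of FIG. 9 can be sketched as follows. The class name and the 500 ms timeout are illustrative assumptions; the patent only specifies "a predetermined time" on the order of the configured communication cycle.

```python
class RxTimeoutDiagnosis:
    """Flag one received signal (e.g. the vehicle speed on CAN) as failed
    when no frame has arrived for longer than TIMEOUT_MS."""
    TIMEOUT_MS = 500  # assumed value, several communication cycles

    def __init__(self):
        self.last_rx_ms = None  # time of the most recent reception
        self.diag_ng = False    # corresponds to fDGNVCAN(#xDGNVSP)

    def update(self, now_ms, frame_received):
        """Call once per processing cycle; returns the diagnostic NG state."""
        if frame_received:
            # Steps 902/904: note the reception (the buffer value would be
            # copied to Vsp here).
            self.last_rx_ms = now_ms
            return self.diag_ng
        # Steps 903/905/906: no frame; fail if silent for too long.
        if self.last_rx_ms is not None and now_ms - self.last_rx_ms > self.TIMEOUT_MS:
            self.diag_ng = True
        return self.diag_ng
```

Once set, the NG flag stays latched, matching the flowchart, which never clears fDGNVCAN.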

Next, a method of performing lane recognition by image processing and setting the predicted course from the lane recognition result when the host vehicle information determination finds the host vehicle information abnormal will be described with reference to FIG. 10. FIG. 10 is a schematic diagram showing the lane recognition result obtained by image processing (the position of the host vehicle within the lane). It uses a relative coordinate system with the host vehicle at the origin O, and the coordinates of the road center line obtained from the lane recognition result are defined as (X, Y).

Assuming that the road center line passing through the origin O is the predicted course of the host vehicle (a circle of turning radius R), this course is expressed by equation (11) using the turning radius R.

(X − R)² + Y² = R²   (11)

Therefore, the turning radius R is expressed by equation (12) using the coordinates (X, Y) of the road center line.

R = (X² + Y²) / (2X)   (12)

As described above, the predicted course of the host vehicle (turning radius R) can be obtained from the coordinates (X, Y) of the road center line determined by lane recognition. When an abnormality occurs in host vehicle information such as the vehicle speed, steering angle, or yaw rate, a significant loss of system function can therefore be avoided by substituting the road center line obtained from the lane recognition result.

FIG. 11 is a diagram comparing the present invention with a conventional system.

When host vehicle information such as the vehicle speed, steering angle, and yaw rate is normal, the method described in Patent Document 1 can accurately judge the possibility of a collision between the host vehicle and an object detected by the external recognition sensor, so a safety system such as a pre-crash safety system can be operated over the entire driving range without performance degradation. According to the present invention, even when an abnormality occurs in the host vehicle information, the system can still be operated over the entire range, with only a slight loss of performance, by selecting the second collision determination, which uses only object information, or by keeping the first collision determination usable with the road center line obtained from lane recognition by a camera or the like set as the predicted course.

FIG. 1: Block diagram of the external recognition device.
FIG. 2: Flowchart showing the operation of the pre-crash safety system.
FIG. 3: Schematic diagram showing the processing of the predicted course setting means and the first collision determination means.
FIG. 4: Schematic diagram showing the processing of the second collision determination means.
FIG. 5: Flowchart showing the processing of the host vehicle information determination means.
FIG. 6: Flowchart showing the processing of the collision determination selection means.
FIG. 7: Processing configuration diagram of a sensor fusion system using a radar and a camera according to another embodiment.
FIG. 8: Flowchart showing the processing of the image processing means.
FIG. 9: Flowchart showing the processing of the host vehicle information determination means that diagnoses whether the communication means has failed.
FIG. 10: Schematic diagram showing a lane recognition result by image processing (the position of the host vehicle within the lane).
FIG. 11: Diagram comparing the present invention with a conventional system.

Explanation of symbols

1 Host vehicle information acquisition means
2 Object information acquisition means
3 Predicted course setting means
4 First collision determination means
5 Second collision determination means
6 Host vehicle information determination means
7 Collision determination selection means
100 External recognition device
100a External recognition device a
100b External recognition device b
710 Image processing means
720 Control means

Claims (6)

1. An external recognition device comprising:
a host vehicle information acquisition means for acquiring information on the host vehicle;
an object information acquisition means for acquiring information on an object detected by an external recognition sensor;
a predicted course setting means for setting a predicted course of the host vehicle based on the host vehicle information acquired by the host vehicle information acquisition means;
a first collision determination means for determining, based on the object information acquired by the object information acquisition means and the predicted course set by the predicted course setting means, the possibility of a collision between the object detected by the external recognition sensor and the host vehicle;
a second collision determination means for determining, based on the object information acquired by the object information acquisition means, the possibility of a collision between the object detected by the external recognition sensor and the host vehicle;
a host vehicle information determination means for determining an abnormality of the host vehicle information acquired by the host vehicle information acquisition means; and
a collision determination selection means for selecting the second collision determination means when the host vehicle information determination means determines that the host vehicle information acquired by the host vehicle information acquisition means is abnormal.
2. An external recognition device comprising:
a host vehicle information acquisition means for acquiring information on the host vehicle;
a first object information acquisition means for acquiring information on an object detected by a first external recognition sensor;
a predicted course setting means for setting a predicted course of the host vehicle based on the host vehicle information acquired by the host vehicle information acquisition means;
a first collision determination means for determining, based on the object information acquired by the first object information acquisition means and the predicted course set by the predicted course setting means, the possibility of a collision between the object detected by the first external recognition sensor and the host vehicle;
a second collision determination means for determining, based on the object information acquired by the first object information acquisition means, the possibility of a collision between the object detected by the first external recognition sensor and the host vehicle;
a second object information acquisition means for acquiring, using a second external recognition sensor, information on an object that the first collision determination means has determined to have a high possibility of collision;
a host vehicle information determination means for determining an abnormality of the host vehicle information acquired by the host vehicle information acquisition means; and
a collision determination selection means for selecting the second collision determination means when the host vehicle information determination means determines that the host vehicle information acquired by the host vehicle information acquisition means is abnormal.
3. The external recognition device according to claim 1 or 2, wherein the host vehicle information includes at least one of a vehicle speed, a steering angle, and a yaw rate, and the object information indicates a position relative to the host vehicle.
4. The external recognition device according to claim 1 or 2, wherein the host vehicle information acquisition means acquires the host vehicle information through a predetermined communication means, and the host vehicle information determination means determines whether the communication means has failed.
5. An external recognition device comprising:
a host vehicle information acquisition means for acquiring information on the host vehicle;
an external recognition sensor that recognizes a lane;
an object information acquisition means for acquiring information on an object detected by the external recognition sensor;
a predicted course setting means for setting a predicted course of the host vehicle according to the host vehicle information acquired by the host vehicle information acquisition means;
a first collision determination means for determining, based on the object information acquired by the object information acquisition means and the predicted course set by the predicted course setting means, the possibility of a collision between the object detected by the external recognition sensor and the host vehicle; and
a host vehicle information determination means for determining an abnormality of the host vehicle information acquired by the host vehicle information acquisition means,
wherein, when the host vehicle information determination means determines that the host vehicle information acquired by the host vehicle information acquisition means is abnormal, the predicted course setting means sets the predicted course of the host vehicle based on a road center line obtained from the lane recognized by the external recognition sensor.
6. An external recognition device comprising:
a host vehicle information acquisition means for acquiring information on the host vehicle;
first and second external recognition sensors that recognize a lane;
a first object information acquisition means for acquiring information on an object detected by the first external recognition sensor;
a predicted course setting means for setting a predicted course of the host vehicle according to the host vehicle information acquired by the host vehicle information acquisition means;
a first collision determination means for determining, based on the object information acquired by the first object information acquisition means and the predicted course set by the predicted course setting means, the possibility of a collision between the object detected by the first external recognition sensor and the host vehicle;
a second object information acquisition means for acquiring, using the second external recognition sensor, information on an object determined by the first collision determination means to have a high possibility of collision; and
a host vehicle information determination means for determining an abnormality of the host vehicle information acquired by the host vehicle information acquisition means,
wherein, when the host vehicle information determination means determines that the host vehicle information acquired by the host vehicle information acquisition means is abnormal, the predicted course setting means sets the predicted course of the host vehicle according to a road center line obtained from the lane recognized by the second external recognition sensor.
JP2007335502A 2007-12-27 2007-12-27 External recognition device Active JP4982353B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2007335502A JP4982353B2 (en) 2007-12-27 2007-12-27 External recognition device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2007335502A JP4982353B2 (en) 2007-12-27 2007-12-27 External recognition device

Publications (2)

Publication Number Publication Date
JP2009157668A true JP2009157668A (en) 2009-07-16
JP4982353B2 JP4982353B2 (en) 2012-07-25

Family

ID=40961622

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2007335502A Active JP4982353B2 (en) 2007-12-27 2007-12-27 External recognition device

Country Status (1)

Country Link
JP (1) JP4982353B2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012030673A (en) * 2010-07-30 2012-02-16 Hitachi Automotive Systems Ltd External environment recognition device for vehicle, and light distribution control system using the same
JP2013097391A (en) * 2011-10-27 2013-05-20 Toshiba Alpine Automotive Technology Corp Collision determination method and collision determination program
JP2016080646A (en) * 2014-10-22 2016-05-16 株式会社デンソー Object detector
JP2020042643A (en) * 2018-09-12 2020-03-19 アイシン精機株式会社 Vehicle controller
US20220324463A1 (en) * 2021-04-13 2022-10-13 Toyota Jidosha Kabushiki Kaisha Sensor abnormality estimation device

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6299208B2 (en) * 2013-12-26 2018-03-28 トヨタ自動車株式会社 Vehicle surrounding situation estimation device
JP5898746B1 (en) * 2014-09-29 2016-04-06 富士重工業株式会社 Vehicle travel control device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003294838A (en) * 2002-04-01 2003-10-15 Hitachi Ltd Onboard radar apparatus and onboard processing apparatus
JP2004231010A (en) * 2003-01-29 2004-08-19 Toyota Motor Corp Vehicle control system
JP2005022522A (en) * 2003-07-02 2005-01-27 Toyota Motor Corp Control device for vehicle
JP2005182198A (en) * 2003-12-16 2005-07-07 Fujitsu Ten Ltd Rear-end collision prevention device
JP2006024106A (en) * 2004-07-09 2006-01-26 Honda Motor Co Ltd Contact avoidance controller of vehicle
JP2006123663A (en) * 2004-10-28 2006-05-18 Favess Co Ltd Steering control device


Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012030673A (en) * 2010-07-30 2012-02-16 Hitachi Automotive Systems Ltd External environment recognition device for vehicle, and light distribution control system using the same
US9268740B2 (en) 2010-07-30 2016-02-23 Hitachi Automotive Systems, Ltd. External environment recognizing device for vehicle and light distribution control system using the same
JP2013097391A (en) * 2011-10-27 2013-05-20 Toshiba Alpine Automotive Technology Corp Collision determination method and collision determination program
JP2016080646A (en) * 2014-10-22 2016-05-16 株式会社デンソー Object detector
JP2020042643A (en) * 2018-09-12 2020-03-19 アイシン精機株式会社 Vehicle controller
JP7243095B2 (en) 2018-09-12 2023-03-22 株式会社アイシン vehicle controller
US20220324463A1 (en) * 2021-04-13 2022-10-13 Toyota Jidosha Kabushiki Kaisha Sensor abnormality estimation device

Also Published As

Publication number Publication date
JP4982353B2 (en) 2012-07-25

Similar Documents

Publication Publication Date Title
US10725474B2 (en) Action planning device having a trajectory generation and determination unit that prevents entry into a failure occurrence range
CN101327796B (en) Method and apparatus for rear cross traffic collision avoidance
CN105799617B (en) Method for the misalignment for determining object sensor
US11673545B2 (en) Method for automated prevention of a collision
JP6803657B2 (en) Vehicle control device and vehicle control system
CN109910879B (en) Vehicle safety anti-collision control method combining safe distance and collision time
JP4982353B2 (en) External recognition device
CN104554258B (en) Using the path planning of the avoidance steering operation of virtual potential field technology
JP5283967B2 (en) In-vehicle object detection device
JP5278776B2 (en) Object detection apparatus and object detection method
US9196163B2 (en) Driving support apparatus and driving support method
EP2302412B1 (en) System and method for evaluation of an automotive vehicle forward collision threat
US9594166B2 (en) Object detecting apparatus
US20150274161A1 (en) Method for operating a driver assistance system of a vehicle
JP4892518B2 (en) Vehicle external recognition device and vehicle system
JP2017521745A (en) In-vehicle device that informs vehicle navigation module of presence of object
CN105702088A (en) warning device
JP2008024108A (en) Collision controller for vehicle
JP2006240453A (en) Sensor failure detector and detection method of sensor failure
KR20190102827A (en) Autonomous emergency braking system and method for vehicle at intersection
JP2014078107A (en) Collision prediction device
CN114132311B (en) Dangerous target screening method and module for automatic emergency braking of vehicle
JP2007038911A (en) Alarm device for vehicle
US20230034560A1 (en) Method for tracking a remote target vehicle in an area surrounding a motor vehicle by means of a collision detection device
JP2014112348A (en) Action analyzing apparatus, action analyzing system, and action analyzing method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20091224

A711 Notification of change in applicant

Free format text: JAPANESE INTERMEDIATE CODE: A712

Effective date: 20091225

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20110712

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20110713

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20110912

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20120327


A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20120423

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20150427

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Ref document number: 4982353

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150


S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350