JPH07296291A - Traveling lane detector for vehicle - Google Patents

Traveling lane detector for vehicle

Info

Publication number
JPH07296291A
JPH07296291A JP6084805A JP8480594A
Authority
JP
Japan
Prior art keywords
vehicle
traveling
detected object
locus
detected
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP6084805A
Other languages
Japanese (ja)
Other versions
JP3440956B2 (en)
Inventor
Katsuyuki Imanishi
勝之 今西
Mare Kitagawa
希 北川
Tetsuo Kikuchi
哲郎 菊地
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Soken Inc
Original Assignee
Nippon Soken Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Soken Inc filed Critical Nippon Soken Inc
Priority to JP08480594A (granted as JP3440956B2)
Publication of JPH07296291A
Application granted
Publication of JP3440956B2
Anticipated expiration
Expired - Fee Related

Landscapes

  • Image Processing (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

PURPOSE: To detect the traveling lane of one's own vehicle by using a vehicle traveling ahead. CONSTITUTION: A traveling lane detection device for a vehicle, which detects the traveling lane of the host vehicle (A), is provided with an object detection means 110 that detects the relative distance, relative speed, and azimuth between the host vehicle (A) and an object ahead of it, and a sensor 111 that detects the speed signal of the vehicle (A) and the change in its yaw angle. A travel locus calculation means 112 calculates the travel locus of the vehicle (A), that is, the change of its position over time. A travel locus calculation means 113 for the detected object adds, to the position of the vehicle (A), the position of the detected object calculated from its relative distance and azimuth angle, and so calculates the travel locus of the detected object. A traveling lane decision means 114 decides that the travel locus of the detected object is the traveling lane of the vehicle (A) when the travel locus of the vehicle (A) overlaps the travel locus of the detected object. The traveling lane of the vehicle (A) can likewise be decided when its locus runs parallel to the travel locus of the detected object.

Description

Detailed Description of the Invention

[0001]
[Field of Industrial Application] The present invention relates to automatic steering and obstacle detection for vehicles, and more particularly to a traveling road detection device that, while the host vehicle is following a vehicle ahead, uses that preceding vehicle to detect the host vehicle's traveling path.

[0002]
[Prior Art] In recent years, research and development in the field of automated driving, aimed at reducing the driver's workload, improving safety, and increasing road traffic efficiency, has become active. Against this background, development has begun on automated driving systems intended for automatic travel in dedicated highway lanes, which is considered the application closest to practical realization. In this development, the traveling path of the host vehicle must be detected in order to control the steering angle when the vehicle is steered automatically. As a conventional technique for this purpose, a method has been considered in which the white lines marking the edges of the traveling lane are detected by image processing and the steering angle is controlled so as to follow those white lines.

[0003] Various image processing techniques and laser devices have also been considered for detecting obstacles in the direction of travel. As one such device, a system is known that captures a stereo image pair and measures the distance distribution of objects ahead by stereoscopic vision based on the principle of triangulation. In this case, to judge from the obtained distance distribution whether an object is an obstacle, the two white lines marking the edges of the traveling lane are first detected in the captured image, and the region between them is taken as the traveling path. A detected object inside this traveling path is treated as an obstacle candidate, and when the distance to it falls below a fixed value it is judged to be an obstacle and a warning is issued.

[0004]
[Problems to Be Solved by the Invention] However, such a method requires that the white lines marking the road edges appear in the captured image, whereas in the real environment some roads have no white lines at all. In rainy weather, spray thrown up by a preceding vehicle can hide the white lines or make them hard to see, so that their detection becomes very difficult and in some cases impossible. Furthermore, at long range the white lines appear thin in the image, so when the road curves and changes sharply, the distant white lines may not be detected accurately.

[0005] In view of the above problems, it is therefore an object of the present invention to provide a traveling road detection device for a vehicle that can detect the traveling path and obstacles stably, without the detection being limited by road conditions or weather.

[0006]
[Means for Solving the Problems] To solve the above problems, the present invention provides a vehicle traveling road detection device having the following configuration. A device that detects the traveling path of a host vehicle is provided with object detection means for detecting the relative distance, relative speed, and azimuth angle between the host vehicle and an object ahead of it, and with a vehicle-speed and yaw-angle detection sensor that detects the host vehicle's speed signal and the change in its yaw angle. Travel locus calculation means for the host vehicle receives the vehicle speed and the yaw-angle change and calculates the host vehicle's travel locus, that is, the change of its position over time. Travel locus calculation means for the detected object adds, to the host vehicle's position, the position of the detected object computed from its relative distance and azimuth angle, and thereby calculates the travel locus of the detected object. Traveling path determination means decides that the travel locus of the detected object is the traveling path of the host vehicle when the host vehicle's travel locus overlaps and follows the detected object's travel locus.

[0007] The traveling path determination means may also determine the host vehicle's traveling path from the forward portion of a detected object's travel locus when the host vehicle's travel locus runs parallel to it. The host vehicle's travel locus calculation means may change the origin of the travel locus according to set conditions.

[0008] Furthermore, there may additionally be provided obstacle candidate judgment means that judges a detected object on the determined traveling path ahead of the host vehicle to be an obstacle candidate, and obstacle judgment means that judges a detected object declared an obstacle candidate to be an obstacle when the relative distance to it falls below a fixed distance or when it approaches at or above a fixed relative speed.

[0009]
[Operation] According to the vehicle traveling road detection device of the present invention, the relative distance, relative speed, and azimuth angle between the host vehicle and an object ahead of it are detected, and the host vehicle's speed signal and yaw-angle change are detected. From the vehicle speed and the yaw-angle change, the host vehicle's travel locus, the change of its position over time, is calculated. The position of the detected object, computed from its relative distance and azimuth angle, is added to the host vehicle's position to calculate the detected object's travel locus. When the host vehicle's travel locus overlaps and follows the detected object's travel locus, the detected object's travel locus is determined to be the host vehicle's traveling path. The host vehicle's traveling path can thus be obtained without using white lines marking the road edges.

[0010] Further, when the host vehicle's travel locus runs parallel to that of a detected object, determining the host vehicle's traveling path from the forward portion of that object's travel locus increases the variety of situations in which the traveling path can be determined. In addition, changing the origin of the travel locus according to set conditions prevents the locus errors that would otherwise accumulate over time from degrading the detection accuracy of the traveling path.

[0011] Furthermore, the traveling path of the host vehicle determined by the device can be used to judge a detected object on the path ahead to be an obstacle candidate, and a detected object judged to be an obstacle candidate can then be judged to be an obstacle when the relative distance to it falls below a fixed distance or when it approaches at or above a fixed relative speed.

[0012]
[Embodiments] Embodiments of the present invention will be described below with reference to the drawings. FIG. 1 shows the overall configuration of a vehicle traveling road detection device according to an embodiment of the present invention. The device shown detects both the traveling path of the host vehicle and obstacles. First, it comprises object detection means 110, which detects the position of an object ahead relative to the host vehicle and computes position data for the detected object. The object detection means 110 comprises two cameras 210 and 211 (cameras A and B) and a detection unit 212 that, from the stereo images of the scene ahead captured by cameras 210 and 211, detects the position of a forward object relative to the host vehicle by stereoscopic vision based on the principle of triangulation.

[0013] The device further includes a vehicle-speed and yaw-angle detection unit 111 comprising a yaw-angle detection sensor 213, which detects the yaw angle in order to obtain the change Δθ in the host vehicle's yaw angle, and a speed detection sensor 214, which detects the host vehicle's speed V. Instead of the signal of the yaw-angle detection sensor 213, the signal of a steering-angle sensor or of an angular-acceleration sensor may be used.

[0014] The host vehicle's travel locus calculation means 112, provided after the vehicle-speed and yaw-angle detection unit 111, receives the host vehicle's speed V and yaw-angle change Δθ and calculates the host vehicle's travel locus data from them. FIG. 2 shows the travel locus map obtained by the travel locus calculation means 112 of FIG. 1. The travel locus shown is obtained using equations (1) and (2) below.

[0015]
  ZXL(t1) = ∫[t0→t1] V(t)·sin(θs(t)) dt   …(1)
  ZYL(t1) = ∫[t0→t1] V(t)·cos(θs(t)) dt   …(2)
Here, ZXL(t1) is the X-direction position of the host vehicle on the travel locus map at time t1, ZYL(t1) is the Y-direction position of the host vehicle on the travel locus map at time t1, Δθ(t) is the change in the host vehicle's yaw angle at time t, and θs(t) is the host vehicle's heading at time t, given by
  θs(t) = Σ[τ = t0→t] Δθ(τ),
where V(t) is the host vehicle's speed at time t and t0 is the time at which the device started operating.
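As a concrete illustration, equations (1) and (2) amount to dead-reckoning integration of the speed and yaw-rate samples. The following is a minimal sketch, not part of the patent; the discrete time step, the sample format, and the function name are assumptions:

```python
import math

def host_travel_locus(samples, dt):
    """Integrate speed V and yaw-angle change d_theta into a travel locus.

    samples: list of (V, d_theta) pairs, one per time step of length dt.
    Returns the list of (ZXL, ZYL) map positions, starting at the origin.
    X is the lateral direction and Y the initial forward direction.
    """
    theta_s = 0.0            # heading theta_s(t), accumulated yaw changes
    zxl, zyl = 0.0, 0.0
    locus = [(zxl, zyl)]
    for v, d_theta in samples:
        theta_s += d_theta                   # theta_s(t) = sum of d_theta
        zxl += v * math.sin(theta_s) * dt    # discrete form of equation (1)
        zyl += v * math.cos(theta_s) * dt    # discrete form of equation (2)
        locus.append((zxl, zyl))
    return locus

# Straight travel at 20 m/s for 1 s in 0.1 s steps moves the vehicle
# 20 m in the Y (forward) direction while X stays zero.
straight = host_travel_locus([(20.0, 0.0)] * 10, dt=0.1)
```

With a nonzero Δθ in the samples, the same loop traces the curved loci of FIG. 2.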

[0016] Next, the detected object's travel locus calculation means 113, provided after the object detection means 110 and the host vehicle's travel locus calculation means 112, receives the host vehicle's travel locus data and the detected object's position data and calculates the travel locus data of detected object A from them. In the calculation means 113, the position of the detected object at each time is added to the host vehicle's travel locus, and the detected object's travel locus data are calculated by equations (3) and (4) below.

[0017]
  KXL(t1) = ZXL(t1) + d(t1)·sin(θs(t1) + Δθk(t1))   …(3)
  KYL(t1) = ZYL(t1) + d(t1)·cos(θs(t1) + Δθk(t1))   …(4)
Here, KXL(t1) is the X-direction position of the detected object on the travel locus map at time t1, KYL(t1) is the Y-direction position of the detected object on the travel locus map at time t1, d(t1) is the relative distance between the host vehicle and the detected object at time t1, and Δθk(t1) is the azimuth of the detected object relative to the host vehicle at time t1.
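In other words, equations (3) and (4) add the polar offset (d, Δθk) of the detected object, rotated into the map frame by the host heading θs, to the host position. A hedged sketch; the function and argument names are illustrative, not from the patent:

```python
import math

def detected_object_position(zxl, zyl, theta_s, d, d_theta_k):
    """Map position of the detected object at one time step.

    (zxl, zyl): host-vehicle map position from equations (1) and (2).
    theta_s:    host heading at that time.
    d:          relative distance to the detected object.
    d_theta_k:  azimuth of the object relative to the host heading.
    """
    kxl = zxl + d * math.sin(theta_s + d_theta_k)  # equation (3)
    kyl = zyl + d * math.cos(theta_s + d_theta_k)  # equation (4)
    return kxl, kyl

# An object 30 m straight ahead of a host at the origin heading along +Y
# lands at map position (0, 30).
ahead = detected_object_position(0.0, 0.0, 0.0, 30.0, 0.0)
```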

[0018] The calculation of d(t1) and Δθk(t1) is described later. FIG. 3 shows an example of obtaining the travel loci as detected object A moves, using the host vehicle's position. As shown, the host vehicle's displacement and direction of movement at each time are known, so integrating the displacement with equations (1) and (2) yields the host vehicle's travel locus. The circle marks in the figure are the calculated travel loci, and likewise in the figures below. Further, since the object detection means 110 gives the relative distance and azimuth angle between the host vehicle and detected object A, the displacement and direction of movement of detected object A relative to the host vehicle at each time are known; adding this displacement to the host vehicle's travel locus with equations (3) and (4) yields the travel locus of detected object A.

[0019] Further, the traveling path determination means 114, provided after the travel locus calculation means 112 for the host vehicle and 113 for the detected object, receives the travel locus data of the host vehicle and of the detected object. When the host vehicle's travel locus data overlap and follow the detected object's travel locus data, it determines the travel locus of the detected object ahead to be the host vehicle's traveling path, as follows.

[0020] FIG. 4 shows an example, before the travel loci of the host vehicle and detected object A overlap, in which the traveling path determination means 114 of FIG. 1 has not yet determined a traveling path. As shown, the travel locus of detected object A has been calculated; at a certain time t1 detected object A is at position P1, and Δt1 seconds later the host vehicle's locus has not yet reached P1, so the two travel loci do not overlap. The traveling path determination means 114 detects this non-overlapping state from the positions of the host vehicle and detected object A, and as a result does not yet determine the host vehicle's traveling path.

[0021] FIG. 5 shows the processing of the traveling path determination means 114 of FIG. 1 when the travel loci of the host vehicle and detected object A have just joined, another example in which the traveling path is not yet determined. As shown, at time t2 the host vehicle's travel locus may reach the travel locus of detected object A so that the two loci connect. As before, the traveling path determination means 114 detects this connected state from the positions of the host vehicle and detected object A, and as a result still does not determine the host vehicle's traveling path.

[0022] FIG. 6 shows an example in which the travel loci of the host vehicle and detected object A have overlapped for a fixed time, so that the traveling path determination means 114 of FIG. 1 determines the traveling path. As shown, the loci may continue to overlap for a fixed time after time t2. As before, the traveling path determination means 114 detects, from the positions of the host vehicle and detected object A, the state in which the loci have overlapped for, e.g., Δt2 seconds. As a result, it determines that the host vehicle's traveling path is the travel locus of detected object A. The determined traveling path is displayed and output by traveling path output means 115.
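The decision logic of FIGS. 4 to 6 can be pictured as checking, at each step, whether the host position lies on the stored object locus, and latching the decision once the overlap has persisted for Δt2 seconds. A simplified sketch; the distance tolerance and timing scheme here are assumptions, not values from the patent:

```python
def loci_overlap(host_pos, object_locus, tol=0.5):
    """True if the host position lies within tol metres of any stored
    point of the detected object's travel locus."""
    hx, hy = host_pos
    return any((hx - ox) ** 2 + (hy - oy) ** 2 <= tol ** 2
               for ox, oy in object_locus)

def decide_travel_path(host_positions, object_locus, dt, delta_t2):
    """Return True once the host locus has overlapped the object locus
    continuously for delta_t2 seconds (the FIG. 6 case); False while
    the loci are separate or have only just joined (FIGS. 4 and 5)."""
    overlap_time = 0.0
    for pos in host_positions:
        if loci_overlap(pos, object_locus):
            overlap_time += dt
            if overlap_time >= delta_t2:
                return True      # object locus becomes the travel path
        else:
            overlap_time = 0.0   # loci separated: reset the timer
    return False
```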

[0023] FIG. 7 shows an example in which the host vehicle travels parallel to the travel locus of detected object B for a fixed time, and the traveling path is determined. As shown, when the host vehicle has traveled parallel to the travel locus of detected object B for Δt3 seconds, detected object B is regarded as traveling in the adjacent lane. That the travel loci of detected object B and the host vehicle are parallel is confirmed by computing the distance from the host vehicle's current position to the detected object's travel locus and checking that its minimum value stays constant. The traveling path determination means 114 then determines, as the host vehicle's traveling path, the curve that is parallel to the forward travel locus of detected object B and starts from directly in front of the host vehicle.
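The parallel-lane test of FIG. 7 tracks the minimum distance from the host's current position to the stored locus of object B; if that minimum stays nearly constant (and nonzero) over the sampled positions, the loci are treated as parallel. A sketch under assumed tolerances; none of the numeric values come from the patent:

```python
import math

def min_distance_to_locus(pos, locus):
    """Minimum distance from a point to the stored locus samples."""
    px, py = pos
    return min(math.hypot(px - x, py - y) for x, y in locus)

def loci_parallel(host_positions, object_locus, tol=0.3):
    """True when the minimum host-to-locus distance stays constant
    (within tol metres) over the sampled host positions and is
    nonzero, i.e. a laterally offset, parallel course as in FIG. 7."""
    dists = [min_distance_to_locus(p, object_locus) for p in host_positions]
    return max(dists) - min(dists) <= tol and min(dists) > tol
```

A constant offset of one lane width would satisfy this test, while an overlapping locus (offset near zero) falls under the FIG. 6 case instead.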

[0024] Therefore, according to this embodiment, the host vehicle's traveling path can be obtained without using white lines marking the road edges. However, when the travel locus calculation means 112 computes the travel locus with the origin of the travel locus map's coordinate system fixed at the host vehicle's position at some initial time, the errors in the travel locus accumulate over time and the detection accuracy of the traveling path deteriorates, as follows.

[0025] FIG. 8 shows an example in which the travel loci of the host vehicle and detected object A are calculated with the host vehicle's position at time t0 as the coordinate origin of the travel locus map. As shown, errors in the measured values used to compute the travel loci, that is, the vehicle speed, the yaw-angle change, and the position of the detected object relative to the host vehicle, are summed up over time in the course of the calculation. Relative to the correct travel loci Ls1 of the host vehicle and Ls2 of detected object A, the error-accumulated loci Lg1 of the host vehicle and Lg2 of detected object A therefore come to deviate greatly from time t0+tk onward, causing false detections in traveling path detection and in the obstacle judgment described later.

[0026] For this reason, the device is provided with origin change instruction means 116 for the travel locus calculation. In the origin change instruction means 116 a condition is set, and when this condition is satisfied the origin of the travel locus map's coordinate system is changed, canceling the accumulated error of the travel locus as follows. FIG. 9 illustrates an example in which the origin change instruction means 116 moves the origin of the travel locus map's coordinate system to the current time at fixed time intervals. When the origin is moved to the host vehicle's position at the current time at fixed intervals tk, the loci after time t0+tk of FIG. 8 become as shown in FIG. 9. The coordinate transformation by the origin change instruction means 116 merely changes the starting point of the integration range in equations (1) to (4) from t0 to t0+tk when computing the travel loci. As a result, the error between the correct travel loci (solid lines) and the calculated loci in the coordinate system of FIG. 9 is reduced compared with the coordinate system of FIG. 8. When locus coincidence is judged in the coordinate system of FIG. 9, however, no decision can be made concerning the region of the detected object's travel locus Lb of FIG. 8; the remedy for this is explained below.
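Restarting the integration at t0+tk can be mimicked by discarding the samples, and with them the accumulated error, recorded before that instant. A self-contained sketch; the sample format and step size are assumptions carried over from the discrete reading of equations (1) and (2):

```python
import math

def locus_from(samples, dt):
    """Dead-reckon (V, d_theta) samples into a locus from a fresh origin."""
    theta, x, y = 0.0, 0.0, 0.0
    pts = [(x, y)]
    for v, d_theta in samples:
        theta += d_theta
        x += v * math.sin(theta) * dt
        y += v * math.cos(theta) * dt
        pts.append((x, y))
    return pts

def reorigined_locus(samples, dt, keep_steps):
    """Move the map origin to the host position keep_steps samples ago:
    equivalent to changing the integration start from t0 to t0 + tk,
    so errors accumulated before that instant no longer contribute."""
    return locus_from(samples[-keep_steps:], dt)
```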

[0027] FIG. 10 illustrates an example of transforming the travel locus of detected object A over a past fixed time into the coordinate system of FIG. 9. As shown, the travel locus of detected object A over the past fixed time tm, corresponding to travel locus Lb of FIG. 8, is converted into the coordinates of FIG. 9 in advance. This makes decisions concerning the region of the travel locus Lb of FIG. 8 possible. Next, an obstacle candidate judgment unit 117 is provided after the traveling path determination means 114. When a determined traveling path is obtained by the traveling path determination means 114, the obstacle candidate judgment unit 117 judges the detected object A whose travel locus the host vehicle's travel locus overlaps to be an obstacle candidate. Thereafter, while the host vehicle's travel locus keeps overlapping and following the travel locus of detected object A, the travel locus of detected object A ahead remains the host vehicle's traveling path, and detected object A is judged to be an obstacle candidate. This is because a detected object A on the host vehicle's traveling path may collide with the host vehicle.

[0028] FIG. 11 shows an example in which another detected object C is judged to be an obstacle candidate by the obstacle candidate judgment unit 117 of FIG. 1. As shown, when detected object C moves onto the host vehicle's previously detected traveling path and, as in FIG. 6, overlaps it for Δt2 seconds, the obstacle candidate judgment unit 117 judges detected object C to be an obstacle candidate. Next, the obstacle judgment unit 118 provided after the obstacle candidate judgment unit 117 receives the relative distance and relative speed data from the object detection means 110, and judges the detected objects A and B to be obstacles when the distance between the host vehicle and these detected objects falls below a fixed value, or when the relative speed at which they approach exceeds a fixed value, so that the probability of a collision becomes extremely high.
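The two-stage test of units 117 and 118 reduces to threshold checks on the relative distance and relative closing speed of objects already flagged as candidates. A sketch; the threshold values here are placeholders, not figures from the patent:

```python
def judge_obstacle(is_candidate, rel_distance, closing_speed,
                   dist_limit=30.0, speed_limit=10.0):
    """Unit 118 in miniature: an obstacle candidate becomes an obstacle
    when the relative distance drops below dist_limit (m) or the
    closing speed reaches speed_limit (m/s). Objects not on the host
    vehicle's traveling path (non-candidates) are never obstacles."""
    if not is_candidate:
        return False
    return rel_distance <= dist_limit or closing_speed >= speed_limit
```

On a True result the alarm output means 119, described next, would warn the driver.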

【0029】The alarm output means 119, provided after the obstacle judging section 118, issues an alarm to alert the driver when an object is judged to be dangerous. Next, the object detecting means 110 is described in detail. FIG. 12 is a diagram explaining an example in which the cameras 210 and 211 of the object detecting means 110 of FIG. 1 calculate the distance to a three-dimensional object by stereoscopy. The cameras 210 and 211 shown in part (a) of the figure are provided with two lenses 12 and 13 that form two viewpoints facing the object to be detected. Behind the two lenses 12 and 13, image pickup devices 14 and 15 are provided on their respective optical axes 16 and 17. The image pickup devices 14 and 15 are composed of, for example, CCDs (Charge Coupled Devices). Here, "d" in the figure is the distance between the lenses 12 and 13 and the three-dimensional object, "f" is the focal length of the lenses 12 and 13, "a" and "b" are the distances from the optical axes to the positions at which the same point of the three-dimensional object 11 is projected onto the image pickup devices 14 and 15, respectively, and "m" is the distance between the optical axes (the base-line length). Part (b) of the figure shows the state in which the right-hand image pickup unit of part (a) has been moved by the base-line length "m" and superimposed on the left-hand image pickup unit. As shown in part (b), triangle ABC and triangle ADE are similar to each other, so the ratio of (a+b) to m equals the ratio of f to d. Expressed as an equation, (a+b):m = f:d, which can be rearranged into the following equation.

【0030】 d = f·m/(a+b) …(5) If m and f are measured in advance and the distance (a+b) can be measured, the distance d can be derived from equation (5). This method is called stereoscopy. Further, the relative distance d(t) can be obtained from the time variation of this distance. The method of obtaining (a+b) in FIG. 12 is to compare the brightness values of the three-dimensional object in the left and right images while shifting them little by little, and to find the shift amount that gives the best match. This shift amount corresponds to (a+b), and this shift amount (a+b) is called the parallax p. The detecting section 212, which receives the signals of the cameras 210 and 211, calculates this parallax p as follows.
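Equation (5) is a one-line computation once f, m, and the parallax are known; a minimal sketch (the function name and sample values are illustrative, not taken from the patent):

```python
def distance_from_parallax(f_mm, baseline_mm, parallax_mm):
    """Equation (5): d = f*m/(a+b), where the parallax p = a + b
    is measured in the same units as the focal length f."""
    return f_mm * baseline_mm / parallax_mm

# f = 8 mm, base-line length m = 120 mm, parallax 0.2 mm -> d = 4800 mm
d = distance_from_parallax(8.0, 120.0, 0.2)
```

Calling this at successive times gives the relative distance d(t) mentioned in the text.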

【0031】As the method of finding the best-matching shift amount, a correlation calculation is used to evaluate the degree of coincidence. The standard correlation method, introduced in books on image recognition and the like, is described below; other methods are extensions of this standard method. To simplify the description, a one-dimensional sequence of brightness values is considered first. FIG. 13 is a diagram showing an example of the images formed on the image pickup devices 14 and 15 of FIG. 12. Part (a) of the figure shows the left image formed by the image pickup device 14 (referred to as the "left-eye image"), and part (b) shows the right image formed by the image pickup device 15 (referred to as the "right-eye image").

【0032】FIG. 14 is a graph showing an example in which the brightness values of the single rows indicated by reference numerals 22 and 23 in FIG. 13 are extracted. Reference numerals 34 and 35 in the figure are the graphs of the brightness values of the left-eye image and the right-eye image, respectively. The parallax is the shift amount by which the graph of the left-eye image must be shifted until it coincides with the graph of the right-eye image. FIG. 15 is a graph showing an example in which the portion corresponding to the three-dimensional object 21 of FIG. 13 is extracted from FIG. 14. The sequence of brightness values of the left-eye image indicated by reference numeral 41 in the figure is treated as a numerical sequence bn, the brightness values of the right-eye image indicated by reference numeral 42 are likewise treated as a sequence an, and the correlation function of the following equation is calculated.

【0033】 V(i) = Σ|b(n+i) − a(n)| (the sum being taken over n = 1 to w) …(2) Here, n is the pixel number of the image and i is the shift amount of the pixel number; V(i) is calculated while the shift amount is increased by 1 from 0 to the integer S. S depends on the parallax corresponding to the shortest distance set in advance as the detection range of the measuring device. w is the width over which the correlation is calculated, and its value is set according to the purpose. For example, as shown in FIG. 15, the three-dimensional object 21 at reference numeral 42 corresponds to pixel numbers 1 to 12, so setting w to 10 gives the best parallax. In practice, however, the position and size of the three-dimensional object in the image are often unknown, so a value of w that is empirically found to work best is generally adopted.
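The row-wise correlation search described above can be sketched as follows, taking the sum of absolute differences as the correlation function V(i) (the function name and the test rows are illustrative; the patent's exact correlation function may differ):

```python
def best_shift(left_row, right_row, w, S):
    """Return the shift i in [0, S] that minimizes
    V(i) = sum over w pixels of |left[n + i] - right[n]|,
    i.e. the shift that best aligns the left row onto the right row."""
    best_i, best_v = 0, float("inf")
    for i in range(S + 1):
        v = sum(abs(left_row[n + i] - right_row[n]) for n in range(w))
        if v < best_v:
            best_i, best_v = i, v
    return best_i

# The right-row pattern appears in the left row shifted by 3 pixels.
left = [0, 0, 0, 10, 20, 30, 20, 10, 0, 0, 0, 0]
right = [10, 20, 30, 20, 10, 0, 0, 0, 0, 0, 0, 0]
shift = best_shift(left, right, w=5, S=4)
```

The returned shift corresponds to the parallax p of the text; V is smallest where the two brightness profiles coincide.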

【0034】FIG. 16 is a diagram showing an example of the result of V(i). As shown in the figure, the correlation value is smallest at reference numeral 51, and the value of i at that point is the parallax. When higher accuracy is required, a point calculated by interpolation, as at reference numeral 52, is often used as the parallax (there are many interpolation methods). The above description covers the case of a one-dimensional image pickup device, or of performing the correlation calculation row by row; when a two-dimensional image pickup device is used, the correlation calculation is generally performed not row by row but over a region that groups several rows together.
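One of the many interpolation methods the text alludes to is a three-point parabolic fit through the integer minimum and its two neighbours; a sketch under that assumption (the function name is hypothetical):

```python
def subpixel_parallax(V, i_min):
    """Refine the integer minimum i_min of the correlation values V
    by fitting a parabola through V[i_min-1], V[i_min], V[i_min+1]
    and returning the parabola's vertex position."""
    a, b, c = V[i_min - 1], V[i_min], V[i_min + 1]
    denom = a - 2 * b + c
    if denom == 0:          # flat neighbourhood: keep the integer minimum
        return float(i_min)
    return i_min + 0.5 * (a - c) / denom

# Symmetric neighbours leave the minimum at the integer position.
p = subpixel_parallax([9.0, 4.0, 1.0, 4.0, 9.0], 2)
```

With asymmetric neighbours the vertex shifts toward the smaller side, giving the sub-pixel parallax of reference numeral 52.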

【0035】FIG. 17 is a diagram explaining correlation calculation methods for a two-dimensional image pickup device. As shown in part (a) of the figure, the first method (the projection method) takes the sum of each vertical column (a projection), reduces the image to the equivalent of one row, and then performs the calculation described above. As shown in part (b), the second method (the area method) applies equation (2) directly in two dimensions. For the projection method, the one-dimensional calculation described above can be used as it is; for the area method, V(i) is calculated for each horizontal row, and the values for the same i are summed and evaluated. Because the projection method compresses the information in the horizontal direction, it is robust against vertical misalignment between the left and right images, and its calculation time is short.

【0036】FIG. 18 is a diagram explaining the calculation of the azimuth angle θk of a detected object. The camera shown in part (a) of the figure has an angle of view θ, and the resulting image is represented with a horizontal width of Wx dots, as shown in part (b). When the object detecting means 110 detects an object, the distance of the object and its horizontal position Px within the image width are obtained, as shown in part (b), and the azimuth angle θk of the detected object is then obtained as Px × θ/Wx. Further, the azimuth angle θk(t) can be obtained from the time variation of this azimuth angle.
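The azimuth computation θk = Px × θ/Wx is a single scaling of the pixel offset by the angular resolution; a sketch with illustrative values (the patent gives no concrete camera parameters):

```python
def azimuth_deg(px, fov_deg, width_px):
    """theta_k = Px * theta / Wx: horizontal pixel position scaled by
    the camera's angle of view over the image width in dots."""
    return px * fov_deg / width_px

# 64 dots from the reference edge, 40-degree field of view, 640-dot width.
theta_k = azimuth_deg(px=64, fov_deg=40.0, width_px=640)
```

Evaluating this at successive frames yields the azimuth history θk(t) used by the traveling locus calculation.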

【0037】Note that stereoscopy is only one example; the object detecting means 110 may instead be composed of a laser radar.

【0038】

[Effects of the Invention] As described above, according to the present invention, the relative distance, relative speed, and azimuth angle between the own vehicle and an object in front of it are detected; the speed signal of the own vehicle and the change in its yaw angle are detected; the traveling locus, that is, the temporal change in the position of the own vehicle, is calculated from the own-vehicle speed and the change in yaw angle; and the traveling locus of the detected object, that is, the temporal change in its position, is calculated by adding to the position of the own vehicle the position of the detected object obtained from its relative distance and azimuth angle. When the traveling locus of the own vehicle overlaps and follows the traveling locus of the detected object, the traveling locus of the detected object is determined to be the travel path of the own vehicle, so that the travel path of the own vehicle can be obtained without using the white lines that mark the edges of the road. The travel path can also be determined when the own vehicle runs parallel to the traveling locus of the detected object; by changing the origin of the traveling locus under set conditions, the accumulated error of the traveling locus over time can be suppressed; the determined travel path of the own vehicle can be used to judge a detected object as an obstacle candidate; and a detected object judged to be an obstacle candidate can be judged to be an obstacle when the relative distance to it falls below a certain distance or when it approaches at a relative speed above a certain value.
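The overall scheme summarized here can be sketched as dead reckoning of the own vehicle plus addition of the detected object's relative position (variable names and the discrete update model are assumptions for illustration, not the patent's notation):

```python
import math

def own_trajectory(speeds, yaw_changes, dt):
    """Dead-reckon the own vehicle's traveling locus from the speed
    signal and the per-step change in yaw angle."""
    x, y, heading = 0.0, 0.0, 0.0
    points = [(x, y)]
    for v, dyaw in zip(speeds, yaw_changes):
        heading += dyaw
        x += v * dt * math.cos(heading)
        y += v * dt * math.sin(heading)
        points.append((x, y))
    return points

def object_position(own_xy, own_heading, rel_dist, azimuth):
    """Add the detected object's relative distance and azimuth angle
    to the own-vehicle position to obtain the object's position."""
    ox, oy = own_xy
    ang = own_heading + azimuth
    return (ox + rel_dist * math.cos(ang), oy + rel_dist * math.sin(ang))

# Straight travel at 10 m/s for three 1-second steps.
traj = own_trajectory([10.0, 10.0, 10.0], [0.0, 0.0, 0.0], dt=1.0)
obj = object_position(traj[-1], 0.0, 5.0, 0.0)  # object 5 m dead ahead
```

Repeating `object_position` at each time step yields the detected object's traveling locus, whose overlap with the own-vehicle locus is then tested by the travel path determining means.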

[Brief description of drawings]

FIG. 1 is a diagram showing the overall configuration of a vehicle travel path detection device according to an embodiment of the present invention.

FIG. 2 is a diagram showing the traveling locus map of the own vehicle obtained by the traveling locus calculation means 112 of FIG. 1.

FIG. 3 is a diagram showing an example of obtaining each traveling locus for the movement of the detected object A by using the position of the own vehicle.

FIG. 4 is a diagram showing an example, in the travel path determining means 114 of FIG. 1, before the traveling loci of the own vehicle and the detected object overlap, in which the travel path has not yet been determined.

FIG. 5 is a diagram showing an example, in the travel path determining means 114 of FIG. 1, in which the traveling loci of the own vehicle and the detected object become connected but the travel path has not yet been determined.

FIG. 6 is a diagram showing an example, in the travel path determining means 114 of FIG. 1, in which the traveling loci of the own vehicle and the detected object A overlap for a certain time and the travel path is determined.

FIG. 7 is a diagram showing an example, in the travel path determining means 114 of FIG. 1, in which the own vehicle travels parallel to the traveling locus of the detected object B for a certain time and the travel path is determined.

FIG. 8 is a diagram showing an example of calculating the traveling loci of the own vehicle and the detected object A by using the position of the own vehicle at time t0 as the coordinate origin of the traveling locus map.

FIG. 9 is a diagram explaining an example in which the origin of the coordinate system of the traveling locus map is changed to the current time at fixed time intervals by the traveling-locus calculation origin change instructing means 116.

FIG. 10 is a diagram explaining an example of converting the traveling locus of the detected object A over the past fixed time into the coordinate system of FIG. 9.

FIG. 11 is a diagram showing an example in which another detected object C is judged to be an obstacle candidate by the obstacle candidate judging section 117 of FIG. 1.

FIG. 12 is a diagram explaining an example in which the cameras 210 and 211 of the object detecting means 110 of FIG. 1 calculate the distance to a three-dimensional object by stereoscopy.

FIG. 13 is a diagram showing an example of the images formed on the image pickup devices 14 and 15 of FIG. 12.

FIG. 14 is a graph showing an example in which the brightness values of the single rows indicated by reference numerals 22 and 23 in FIG. 13 are extracted.

FIG. 15 is a graph showing an example in which the portion corresponding to the detected object 21 of FIG. 13 is extracted from FIG. 14.

FIG. 16 is a diagram showing an example of the result of V(i).

FIG. 17 is a diagram explaining correlation calculation methods for a two-dimensional image pickup device.

FIG. 18 is a diagram explaining the calculation of the azimuth angle θk of a detected object.

[Explanation of symbols]

110 … Object detecting means
111 … Own-vehicle speed and yaw angle detection sensor
112 … Traveling locus calculation means of the own vehicle
113 … Traveling locus calculation means of the detected object
114 … Travel path determining means
116 … Traveling-locus calculation origin change instructing means
117 … Obstacle candidate judging section
118 … Obstacle judging means

Claims (4)

[Claims]
[Claim 1] A vehicle travel path detection device for detecting the travel path of an own vehicle, comprising: object detecting means for detecting the relative distance, relative speed, and azimuth angle between the own vehicle and an object in front of it; an own-vehicle speed and yaw angle detection sensor for detecting the speed signal of the own vehicle and the change in its yaw angle; traveling locus calculation means of the own vehicle for calculating, from the own-vehicle speed and the change in yaw angle, the traveling locus that is the temporal change in the position of the own vehicle; traveling locus calculation means of the detected object for calculating the traveling locus that is the temporal change in the position of the detected object, by adding to the position of the own vehicle the position of the detected object calculated from its relative distance and azimuth angle; and travel path determining means for determining the traveling locus of the detected object to be the travel path of the own vehicle when the traveling locus of the own vehicle overlaps and follows the traveling locus of the detected object.
[Claim 2] The vehicle travel path detection device according to claim 1, wherein, when the traveling locus of the own vehicle runs parallel along the traveling locus of the detected object, the travel path determining means determines the travel path of the own vehicle from the traveling locus ahead of that detected object.
[Claim 3] The vehicle travel path detection device according to claim 1, wherein the traveling locus calculation means of the own vehicle changes the origin of the traveling locus according to set conditions.
[Claim 4] The vehicle travel path detection device according to claim 1, further comprising: obstacle candidate judging means for judging a detected object traveling ahead on the determined travel path of the own vehicle to be an obstacle candidate; and obstacle judging means for judging the detected object judged to be an obstacle candidate to be an obstacle when the relative distance to it falls below a certain distance or when it approaches at a relative speed above a certain value.
JP08480594A 1994-04-22 1994-04-22 Roadway detection device for vehicles Expired - Fee Related JP3440956B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP08480594A JP3440956B2 (en) 1994-04-22 1994-04-22 Roadway detection device for vehicles

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP08480594A JP3440956B2 (en) 1994-04-22 1994-04-22 Roadway detection device for vehicles

Publications (2)

Publication Number Publication Date
JPH07296291A true JPH07296291A (en) 1995-11-10
JP3440956B2 JP3440956B2 (en) 2003-08-25

Family

ID=13840935

Family Applications (1)

Application Number Title Priority Date Filing Date
JP08480594A Expired - Fee Related JP3440956B2 (en) 1994-04-22 1994-04-22 Roadway detection device for vehicles

Country Status (1)

Country Link
JP (1) JP3440956B2 (en)

Cited By (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH10307998A (en) * 1997-05-07 1998-11-17 Nippon Soken Inc Automatic steering device for vehicles
JP2000318485A (en) * 1999-05-07 2000-11-21 Honda Motor Co Ltd Steering control device of automatic followup running vehicle
JP2000322697A (en) * 1999-05-10 2000-11-24 Honda Motor Co Ltd Steering controller for vehicle automatically following in traveling
JP2002131432A (en) * 2000-10-24 2002-05-09 Honda Motor Co Ltd Advancing locus estimating device for vehicle
JP2004026143A (en) * 2002-06-03 2004-01-29 Visteon Global Technologies Inc Method and device for identifying target vehicle in automatic speed control collision avoidance system
JP2005310010A (en) * 2004-04-26 2005-11-04 Mitsubishi Electric Corp Circumference monitoring device
JP2005332192A (en) * 2004-05-19 2005-12-02 Toyota Motor Corp Steering support system
JP2006344009A (en) * 2005-06-09 2006-12-21 Xanavi Informatics Corp Method and system for monitoring periphery of vehicle
JP2008210036A (en) * 2007-02-23 2008-09-11 Toyota Motor Corp Obstacle detection apparatus and obstacle detection method
JP2018169950A (en) * 2017-03-30 2018-11-01 株式会社デンソーテン Rear vehicle detection device and rear vehicle detection method
JP2020073383A (en) * 2020-02-03 2020-05-14 日産自動車株式会社 Runway estimation method and runway estimation device
JP2020147088A (en) * 2019-03-12 2020-09-17 トヨタ自動車株式会社 Drive support apparatus
CN113646820A (en) * 2019-03-27 2021-11-12 五十铃自动车株式会社 Detection device and detection method


Also Published As

Publication number Publication date
JP3440956B2 (en) 2003-08-25

Similar Documents

Publication Publication Date Title
US6744380B2 (en) Apparatus for monitoring area adjacent to vehicle
US7030775B2 (en) Vehicle surroundings monitoring apparatus and traveling control system incorporating the apparatus
JP3671825B2 (en) Inter-vehicle distance estimation device
JP3860061B2 (en) Outside-of-vehicle monitoring device and travel control device equipped with this out-of-vehicle monitoring device
JP3904988B2 (en) Image processing apparatus and method
Kato et al. An obstacle detection method by fusion of radar and motion stereo
US7266454B2 (en) Obstacle detection apparatus and method for automotive vehicle
US11526173B2 (en) Traveling trajectory correction method, traveling control method, and traveling trajectory correction device
JP2017121912A (en) Traveling control system of vehicle
JP2018039284A (en) Travelling control system of vehicle
JPH10341458A (en) Method for correcting on-vehicle stereo camera and onvehicle stereo camera applied with the method
JP2019067345A (en) Vehicle control device, vehicle control method, and program
JP2014086071A (en) Lane recognition method and system
JP3440956B2 (en) Roadway detection device for vehicles
JP6822815B2 (en) Road marking recognition device
JP7255345B2 (en) Driving lane recognition device, driving lane recognition method and program
JP4265931B2 (en) Leading vehicle detection device
JPH08156723A (en) Vehicle obstruction detecting device
JP6699728B2 (en) Inter-vehicle distance estimation method and inter-vehicle distance estimation device
JP2006004188A (en) Obstacle recognition method and obstacle recognition device
JPH06225308A (en) Running course detector
JP7216695B2 (en) Surrounding vehicle monitoring device and surrounding vehicle monitoring method
KR101473426B1 (en) Method for recognizing travelling route of vehicle and apparatus thereof
CN113479204A (en) Vehicle control device, vehicle control method, and storage medium
JP4113628B2 (en) Vehicle display device

Legal Events

Date Code Title Description
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20030506

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20090620

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20100620

Year of fee payment: 7


FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110620

Year of fee payment: 8

LAPS Cancellation because of no payment of annual fees