JP2002092796A - Lane recognizing device - Google Patents

Lane recognizing device

Info

Publication number
JP2002092796A
Authority
JP
Japan
Prior art keywords
recognition
lane
recognizing
vehicle
white line
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2000282094A
Other languages
Japanese (ja)
Inventor
Makoto Nishida
誠 西田
Toshiaki Kakinami
俊明 柿並
Sachiko Kojima
祥子 小島
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toyota Motor Corp
Toyota Central R&D Labs Inc
Aisin Corp
Original Assignee
Aisin Seiki Co Ltd
Toyota Motor Corp
Toyota Central R&D Labs Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Aisin Seiki Co Ltd, Toyota Motor Corp, Toyota Central R&D Labs Inc filed Critical Aisin Seiki Co Ltd
Priority to JP2000282094A priority Critical patent/JP2002092796A/en
Publication of JP2002092796A publication Critical patent/JP2002092796A/en
Pending legal-status Critical Current

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 20/00 Scenes; Scene-specific elements
    • G06V 20/50 Context or environment of the image
    • G06V 20/56 Context or environment of the image exterior to a vehicle by using sensors mounted on the vehicle
    • G06V 20/588 Recognition of the road, e.g. of lane markings; Recognition of the vehicle driving pattern in relation to the road
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/25 Fusion techniques
    • G06F 18/254 Fusion techniques of classification results, e.g. of results related to same input data
    • G06F 18/256 Fusion techniques of classification results, e.g. of results related to same input data of results relating to different input data, e.g. multimodal recognition
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/44 Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/70 Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/77 Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V 10/80 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V 10/809 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data
    • G06V 10/811 Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data the classifiers operating on different input data, e.g. multi-modal recognition

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Multimedia (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Databases & Information Systems (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Data Mining & Analysis (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To provide a lane recognition device capable of determining that misrecognition has occurred and of recognizing a lane reliably. SOLUTION: The lane recognition device recognizes the lane 6 in which the own vehicle 5 is traveling on the basis of image information. It comprises a plurality of cameras 11, 12 mounted on the vehicle 5 for photographing different portions (area A, area B) of the lane markings 61, 62 on the left and right of the lane 6; a plurality of lane marking recognition units, provided for the respective cameras 11, 12, for recognizing the lane markings 61A, 61B or 62A, 62B in the images; and a determination unit for judging the success or failure of the recognition by collating the recognition results of the lane marking recognition units (extended positions 61A2 and 61B2, 62A2 and 62B2).

Description

DETAILED DESCRIPTION OF THE INVENTION

[0001]

BACKGROUND OF THE INVENTION

1. Field of the Invention

The present invention relates to a lane recognition device used in an automatic driving system or a lane departure warning system.

[0002]

2. Description of the Related Art

In an automatic driving system or a lane departure warning system, the lane in which the vehicle is traveling (the traveling lane) must be recognized. Such a lane is usually demarcated by lane markings, such as white or yellow lines, painted on the road surface. A lane can therefore be recognized by recognizing these lane markings, and various apparatuses and methods that recognize such lane markings from image information acquired by a camera have been put to practical use.

[0003]

The technique disclosed in Japanese Patent Application Laid-Open No. 9-91594 is one example of such techniques: it estimates the position information of the white line behind the vehicle from the result of recognizing the white line ahead and from information on the behavior of the own vehicle, without installing a dedicated camera for recognizing the rear white line.

[0004]

PROBLEMS TO BE SOLVED BY THE INVENTION

Recognition of a white line from an image cannot completely eliminate the possibility of misrecognition. Causes of misrecognition include backlight, reflections from the road surface, and reflected light from other vehicles' headlamps. In the above technique, when such misrecognition occurs, not only the front white line recognition result but also the estimated rear white line position information is shifted, and it is moreover difficult to detect that the misrecognition has occurred.

[0005]

It is therefore an object of the present invention to provide a lane recognition device that can determine that misrecognition has occurred and recognize the lane reliably.

[0006]

MEANS FOR SOLVING THE PROBLEMS

In order to solve the above problems, a lane recognition device according to the present invention is a lane recognition device that recognizes, from image information, the lane in which the own vehicle is traveling, and is characterized by comprising: (1) a plurality of cameras installed on the vehicle that photograph different portions of the lane markings provided on the left and right of the lane; (2) a plurality of lane marking recognition units, provided for the respective cameras, that recognize the lane markings in the images; and (3) a determination unit that judges the success or failure of the recognition by collating the recognition results of the lane marking recognition units.

[0007]

According to the present invention, different portions of the lane markings are photographed with a plurality of cameras, and position information of the lane markings is obtained for each portion by image recognition. If each image recognition has succeeded, no inconsistency should arise between the obtained pieces of position information; if there is an inconsistency, it can be determined that one of the recognition results is in error. For example, when three or more cameras are installed for the same lane marking and the recognition results of a majority of those cameras are consistent, that result can be used as the correct recognition result.

[0008]

The determination unit may judge the success or failure of the recognition by extending the lane markings recognized by the respective lane marking recognition units to the vicinity of the vehicle and comparing the extended positions, because if the recognition results are inconsistent, a mismatch will also appear in the extended positions themselves.

[0009]

Preferably, one of the plurality of cameras is a rear imaging means for parking assistance. Using the parking-assist rear imaging means for lane recognition during normal driving makes effective use of that rear imaging means.

[0010]

DESCRIPTION OF THE PREFERRED EMBODIMENTS

Preferred embodiments of the present invention will now be described in detail with reference to the accompanying drawings. To make the description easier to follow, the same constituent elements are given the same reference numerals in the drawings wherever possible, and redundant descriptions are omitted.

[0011]

FIG. 1 is a block diagram showing the configuration of the main part of a lane recognition device according to the present invention. The device comprises two cameras 11 and 12, arranged at the front and the rear of the vehicle respectively, white line recognition ECUs 21 and 22 dedicated to the cameras 11 and 12, and a misrecognition determination ECU 3 connected to these ECUs.

[0012]

FIG. 2 illustrates the concept of misrecognition determination in this lane recognition device. Consider, as shown in FIG. 2, a vehicle 5 traveling in a lane 6 demarcated by left and right lane markings 61 and 62. The front camera 11, installed at the front of the vehicle, for example at the center of the front of the cabin, acquires an image of the hatched area A. The front camera 11 is a camera for recognizing the road situation in the traveling direction during forward driving, so area A is set to cover positions relatively far from the own vehicle. The rear camera 12, installed at the rear of the vehicle, for example at the center of the rear of the cabin, acquires an image of the hatched area B. The rear camera 12 functions as a camera for checking the area behind the vehicle during reverse parking and the like, so area B is set to cover, at a wide angle, positions closer to the vehicle than area A.

[0013]

The front white line recognition ECU 21 can recognize the left and right lane marking portions 61A and 62A within area A, while the rear white line recognition ECU 22 can recognize the left and right lane marking portions 61B and 62B within area B. The misrecognition determination ECU 3 extends the white lines recognized by the white line recognition ECUs 21 and 22 to obtain the positions directly below the vehicle (61A2, 61B2, 62A2, and 62B2 in FIG. 2), and determines from the deviations between these positions whether misrecognition has occurred.

[0014]

FIG. 3 is a flowchart of this processing. The processing is repeated from the time the ignition key is turned on until it is turned off.
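
To make the control flow of FIG. 3 easier to follow, here is a minimal structural sketch in Python (not part of the patent); the camera and ECU classes are dummy placeholders standing in for the actual hardware and recognition software.

```python
# Structural sketch only: the Dummy* classes stand in for the cameras,
# the white line recognition ECUs 21/22 and the misrecognition
# determination ECU 3; they are not an actual implementation.

class DummyCamera:
    def __init__(self, frame):
        self.frame = frame

    def capture(self):                       # S1: acquire a frame image
        return self.frame


class DummyWhiteLineECU:
    def recognize(self, frame):              # S2: image recognition (stubbed)
        return frame.get("markings")         # None models a failed recognition


class DummyJudgeECU:
    def judge_and_correct(self, front, rear):    # S3 to S7 (stubbed)
        if front is None or rear is None:
            return None                      # S4: recognition unsuccessful
        return {"front": front, "rear": rear}    # S7: corrected output


def run_once(front_cam, rear_cam, front_ecu, rear_ecu, judge_ecu):
    frame_a = front_cam.capture()            # image of area A
    frame_b = rear_cam.capture()             # image of area B
    front = front_ecu.recognize(frame_a)
    rear = rear_ecu.recognize(frame_b)
    return judge_ecu.judge_and_correct(front, rear)


if __name__ == "__main__":
    # One cycle; in the device this repeats while the ignition is on.
    front_cam = DummyCamera({"markings": ("61A", "62A")})
    rear_cam = DummyCamera({"markings": ("61B", "62B")})
    print(run_once(front_cam, rear_cam,
                   DummyWhiteLineECU(), DummyWhiteLineECU(), DummyJudgeECU()))
```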

[0015]

In step S1, the frame images acquired by the front camera 11 and the rear camera 12 (corresponding to the images of area A and area B shown in FIG. 2) are sent to the front white line recognition ECU 21 and the rear white line recognition ECU 22, respectively. In step S2, each white line recognition ECU 21, 22 obtains position information for the white lines 61 and 62 from its frame image by image recognition processing such as feature extraction. Specifically, the front white line recognition ECU 21 recognizes 61A and 62A, the white line portions within area A, and the rear white line recognition ECU 22 recognizes 61B and 62B, the white line portions within area B, and each notifies the misrecognition determination ECU 3 of the corresponding position information.
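
The "feature extraction" performed in step S2 is not detailed in the patent. As one plausible illustration, the sketch below assumes a calibrated top-down (bird's-eye) grayscale image, picks bright marking pixels, and fits a straight line through them with NumPy; the function names, the brightness threshold, and the calibration mapping are all assumptions.

```python
import numpy as np


def extract_marking(birdseye, x_of_row, y_of_col, brightness_threshold=200):
    """Illustrative white line extraction from a calibrated top-down image.

    birdseye  : 2-D uint8 array; bright pixels are marking candidates.
    x_of_row  : maps a row index to a longitudinal position (m) ahead of the vehicle.
    y_of_col  : maps a column index to a lateral position (m).
    Returns (slope, intercept) of a fitted line y = slope * x + intercept,
    or None when too few marking pixels are found (recognition failure).
    """
    rows, cols = np.nonzero(birdseye >= brightness_threshold)
    if rows.size < 10:                        # too little evidence: give up
        return None
    xs = np.array([x_of_row(r) for r in rows])
    ys = np.array([y_of_col(c) for c in cols])
    slope, intercept = np.polyfit(xs, ys, deg=1)   # least-squares line fit
    return float(slope), float(intercept)


if __name__ == "__main__":
    # Synthetic 100 x 60 image with a bright, slightly tilted stripe as the white line.
    img = np.zeros((100, 60), dtype=np.uint8)
    for r in range(100):
        c = 30 + r // 20                      # the stripe drifts one column every 20 rows
        img[r, c - 1:c + 2] = 255

    def x_of_row(r):                          # hypothetical calibration:
        return 25.0 - 0.2 * r                 # row 0 is 25 m ahead, 0.2 m per row

    def y_of_col(c):
        return 0.05 * (c - 30)                # 5 cm per column, column 30 on the axis

    print(extract_marking(img, x_of_row, y_of_col))
```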

[0016]

In step S3, it is first determined whether the recognition processing in each of the white line recognition ECUs 21 and 22 in step S2 was successful. If one of the white line portions could not be recognized, for example because the white line was faint or interrupted, and its position information was therefore not notified to the misrecognition determination ECU 3, it is determined that the processing in one of the ECUs 21 and 22 did not succeed; the processing moves to step S4, the recognition is treated as unsuccessful, and the processing ends without outputting the remaining recognition results.

[0017]

If both recognition processes were successful in step S3, the processing moves to step S5, where each recognized white line is extended and the position directly below the vehicle at the extension is obtained. Taking the white line 61A as an example, the recognized white line 61A is extended from its point 61A1 closest to the vehicle, and the extended position 61A2 to the side of the center of gravity W of the vehicle is obtained. In the same way, the extended position 62A2 of the white line 62A, the extended position 61B2 of the white line 61B, and the extended position 62B2 of the white line 62B are obtained.
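
The extension in step S5 amounts to a linear extrapolation in vehicle coordinates. The following sketch (hypothetical coordinates, with the center of gravity W taken as the origin) shows how an extended position such as 61A2 could be computed from two points of a recognized segment; it is an illustration, not the ECU's actual algorithm.

```python
def extend_to_vehicle(nearest_point, far_point, x_w=0.0):
    """Extend a recognized white line segment, given as two points (x, y) in
    vehicle coordinates (x measured forward from the center of gravity W,
    y to the left), down to the longitudinal position x_w of W, and return
    the lateral offset there. For the white line 61A this corresponds to
    extrapolating from the nearest recognized point 61A1 to the extended
    position 61A2 to the side of W."""
    x1, y1 = nearest_point
    x2, y2 = far_point
    slope = (y2 - y1) / (x2 - x1)
    return y1 + slope * (x_w - x1)


if __name__ == "__main__":
    # Hypothetical numbers: 61A is recognized from 5 m to 25 m ahead of W,
    # 61B from 2 m to 6 m behind W (rear camera), both for the left marking.
    pos_61a2 = extend_to_vehicle(nearest_point=(5.0, -1.70), far_point=(25.0, -1.60))
    pos_61b2 = extend_to_vehicle(nearest_point=(-2.0, -1.74), far_point=(-6.0, -1.78))
    print(round(pos_61a2, 3), round(pos_61b2, 3))   # about -1.725 and -1.72
```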

[0018]

In step S6, it is determined whether the positions directly below the vehicle obtained in this way, that is, 61A2 and 61B2, and 62A2 and 62B2, respectively agree with each other. If the lane is straight and each recognition has succeeded, these positions should coincide with the actual lane marking positions 61C and 62C directly below the vehicle. However, because the lane 6 itself may be curved, as shown in FIG. 2, and because of the limited accuracy of the white line image recognition, perfect agreement cannot be expected; the positions are therefore regarded as matching if they lie within a predetermined threshold (for example, 20 cm) of each other, and the processing then moves to step S7. In step S7, the recognized white line positions are further corrected by connecting the white lines 61A and 62A recognized by the front camera 11 with the white lines 61B and 62B recognized by the rear camera 12, bringing them closer to the actual positions 61C and 62C directly below the vehicle, and the corrected white line positions are output.
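
Steps S6 and S7 can be illustrated as follows: the extended positions are treated as matching when they differ by no more than the 20 cm example threshold, and the "connection" of the front and rear white lines is approximated here by fitting a single least-squares line through sample points of both segments. The point lists, the NumPy usage, and the correction strategy are illustrative assumptions.

```python
import numpy as np

THRESHOLD_M = 0.20   # the 20 cm example threshold mentioned above


def positions_agree(pos_front, pos_rear):
    """Step S6: extended positions such as 61A2 and 61B2 are regarded as
    matching when they differ by no more than the threshold."""
    return abs(pos_front - pos_rear) <= THRESHOLD_M


def connect_markings(front_points, rear_points):
    """Step S7 (illustrative): correct the recognized marking by fitting one
    straight line through points of both the front segment (e.g. 61A) and the
    rear segment (e.g. 61B), and return its lateral offset at x = 0, i.e. an
    estimate of the position 61C directly below the vehicle."""
    pts = np.asarray(list(front_points) + list(rear_points), dtype=float)
    slope, intercept = np.polyfit(pts[:, 0], pts[:, 1], deg=1)
    return float(intercept)


if __name__ == "__main__":
    front_61a = [(5.0, -1.70), (15.0, -1.65), (25.0, -1.60)]
    rear_61b = [(-6.0, -1.78), (-4.0, -1.76), (-2.0, -1.74)]
    pos_61a2, pos_61b2 = -1.725, -1.72        # extended positions from step S5
    if positions_agree(pos_61a2, pos_61b2):
        print("corrected position:", round(connect_markings(front_61a, rear_61b), 3))
    else:
        print("recognition unsuccessful")     # step S4
```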

[0019]

On the other hand, if the positions are separated by more than the threshold, it is considered that one of the white line recognitions has failed; the processing moves to step S4, the recognition is treated as unsuccessful, and the processing ends without outputting the remaining recognition results.

[0020]

In this way, the position directly below the vehicle is estimated both from the white line position information that the front white line recognition ECU 21 recognizes in the image acquired by the front camera 11 and from the white line position information that the rear white line recognition ECU 22 recognizes in the image acquired by the rear camera 12, and the recognition results are verified by collating these estimates, so that misrecognition can be suppressed.

[0021]

The example described here corrects the recognition result when the recognition is judged to have succeeded. Alternatively, when the recognition is judged successful, only the recognition result of the front camera 11 may be output, since the front camera can acquire images over a wider range and is therefore expected to allow more accurate recognition. When only one side of the lane agrees, that side alone may be output as normal. When misrecognition is determined, instead of stopping the output, the position information of both sides, or of one side, may be output together with information indicating that misrecognition may have occurred.

[0022]

In the embodiment described above, the lane markings recognized by the front camera and the rear camera are each extended to the vicinity of the vehicle and the extended positions are compared. Alternatively, based on the recognition result of the front camera and the behavior of the own vehicle (such as vehicle speed, traveling direction, or traveled trajectory), the position at which the lane marking recognized by the front camera should be recognized by the rear camera as the vehicle moves forward may be estimated and compared with the position actually recognized by the rear camera. In this case there is no need to extend the lane marking recognized by the rear camera, so the recognition results of the cameras can be verified accurately even when the lane marking recognized by the rear camera is very short.
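
This alternative can be pictured as simple dead reckoning: a marking point recognized by the front camera is carried into the current vehicle frame using the distance travelled and the change in heading, and then compared with the rear camera's result. The rigid-body motion model and the numbers below are assumptions made for illustration, not the patent's formulation.

```python
import math


def predict_in_current_frame(point_xy, distance_m, yaw_change_rad):
    """Carry a point of a marking recognized earlier by the front camera
    (expressed in the vehicle frame of that moment) into the current vehicle
    frame, assuming the vehicle has since moved distance_m forward and turned
    by yaw_change_rad. A simple rigid-body transform; a real system would
    integrate vehicle speed and yaw rate, or use the travelled trajectory."""
    x, y = point_xy
    x_shifted, y_shifted = x - distance_m, y          # account for forward travel
    cos_a, sin_a = math.cos(-yaw_change_rad), math.sin(-yaw_change_rad)
    return (cos_a * x_shifted - sin_a * y_shifted,    # rotate by the heading change
            sin_a * x_shifted + cos_a * y_shifted)


if __name__ == "__main__":
    # A marking point seen 8 m ahead by the front camera; the vehicle then
    # drives 12 m almost straight, so the point should now lie about 4 m behind.
    predicted = predict_in_current_frame((8.0, -1.70), distance_m=12.0,
                                         yaw_change_rad=0.01)
    observed_by_rear = (-4.0, -1.72)                  # hypothetical rear-camera result
    error = math.dist(predicted, observed_by_rear)
    print("prediction error (m):", round(error, 3))
    print("consistent" if error <= 0.2 else "possible misrecognition")
```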

[0023]

In the above description, the case of using two cameras, a front camera and a rear camera, has been described as an example, but a side camera may be used in place of either of them. When a side camera is used, it detects only the white line on one side, so at least one side camera must be installed on each of the left and right sides. As the rear camera, it is preferable to also use a camera that provides parking assistance when reversing.

[0024]

The white lines on the left and right may also each be recognized by acquiring images of three or more different areas (which may partially overlap) with three or more cameras. When recognition is performed with three or more cameras, the extended positions, directly below the vehicle, of the white lines recognized in the images of the individual cameras need not all agree; if a majority of them agree even though some do not, the agreeing results may be used as the recognition result.
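
A minimal sketch of such a majority decision, assuming each camera contributes one extended position directly below the vehicle for the same marking: positions are grouped within a tolerance, and a group containing more than half of the cameras is accepted. The tolerance and the use of the group mean are illustrative choices, not specified in the patent.

```python
def majority_position(extended_positions, tol_m=0.2):
    """Return a representative extended position agreed on by a strict
    majority of cameras, or None when no majority agrees (misrecognition).

    extended_positions: one lateral offset (m) per camera, all referring to
    the same marking extended to directly below the vehicle."""
    n = len(extended_positions)
    for candidate in extended_positions:
        group = [p for p in extended_positions if abs(p - candidate) <= tol_m]
        if 2 * len(group) > n:                    # more than half agree
            return sum(group) / len(group)        # e.g. use their mean
    return None


if __name__ == "__main__":
    # Three cameras observe the same left marking; one result is an outlier.
    print(majority_position([-1.72, -1.70, -2.45]))   # about -1.71
    # Every result disagrees with the others: no majority, so None.
    print(majority_position([-1.20, -1.72, -2.45]))
```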

[0025]

The white line position information recognized in this way can be used to advantage in a lane departure warning system or an automatic vehicle driving system.

[0026]

EFFECTS OF THE INVENTION

As described above, according to the present invention, a plurality of portions (which may partially overlap) of the lane markings (white lines) demarcating the lane in which the vehicle is traveling are imaged by different cameras, white line position information is detected by image recognition, and the detected position information is collated to verify the individual recognition results. The success or failure of the recognition can thereby be judged and misrecognition suppressed, so that reliable lane recognition can be performed.

BRIEF DESCRIPTION OF THE DRAWINGS

FIG. 1 is a block diagram schematically showing a lane recognition device according to the present invention.

FIG. 2 is a diagram illustrating the principle of the present invention.

FIG. 3 is a flowchart illustrating the operation of the apparatus of FIG. 1.

EXPLANATION OF SYMBOLS

3: misrecognition determination ECU, 5: vehicle, 6: lane, 11: front camera (camera), 12: rear camera (camera), 21: front white line recognition ECU (white line recognition ECU), 22: rear white line recognition ECU (white line recognition ECU), 61, 62: white lines (lane markings).


Claims (3)

[Claims]

1. A lane recognition device for recognizing, from image information, the lane in which the own vehicle is traveling, comprising: a plurality of cameras installed on the vehicle and photographing different portions of the lane markings provided on the left and right of the lane; a plurality of lane marking recognition units, provided for the respective cameras, for recognizing the lane markings in the images; and a determination unit for judging the success or failure of the recognition by collating the recognition results of the lane marking recognition units.

2. The lane recognition device according to claim 1, wherein the determination unit judges the success or failure of the recognition by extending the lane markings recognized by the respective lane marking recognition units to the vicinity of the vehicle and comparing the extended positions.

3. The lane recognition device according to claim 1 or 2, wherein one of the plurality of cameras is a rear imaging means for parking assistance.
JP2000282094A 2000-09-18 2000-09-18 Lane recognizing device Pending JP2002092796A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2000282094A JP2002092796A (en) 2000-09-18 2000-09-18 Lane recognizing device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2000282094A JP2002092796A (en) 2000-09-18 2000-09-18 Lane recognizing device

Publications (1)

Publication Number Publication Date
JP2002092796A true JP2002092796A (en) 2002-03-29

Family

ID=18766658

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2000282094A Pending JP2002092796A (en) 2000-09-18 2000-09-18 Lane recognizing device

Country Status (1)

Country Link
JP (1) JP2002092796A (en)

Cited By (15)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2005111966A1 (en) * 2004-05-19 2005-11-24 Honda Motor Co., Ltd. Lane boundary recognition apparatus for vehicle
WO2005111965A1 (en) * 2004-05-19 2005-11-24 Honda Motor Co., Ltd. Lane boundary recognition apparatus for vehicle
WO2005111964A1 (en) * 2004-05-19 2005-11-24 Honda Motor Co., Ltd. Lane boundary recognition apparatus for vehicle
JP2006349580A (en) * 2005-06-17 2006-12-28 Denso Corp Traveling condition determination device and on-vehicle navigation device
JP2007147564A (en) * 2005-11-30 2007-06-14 Aisin Aw Co Ltd Image recognition device and method thereof, and own vehicle position recognition device and method
US7415134B2 (en) 2005-05-17 2008-08-19 Honda Motor Co., Ltd. Traffic lane marking line recognition system for vehicle
US7421095B2 (en) 2004-05-19 2008-09-02 Honda Motor Co., Ltd. Traffic lane marking line recognition system for vehicle
WO2008107944A1 (en) * 2007-03-01 2008-09-12 Pioneer Corporation Lane deviation prevention device, lane deviation prevention method, lane deviation prevention program, and recording medium
JP2010165142A (en) * 2009-01-15 2010-07-29 Nissan Motor Co Ltd Device and method for recognizing road sign for vehicle
JP2014106738A (en) * 2012-11-27 2014-06-09 Clarion Co Ltd In-vehicle image processing device
US9880554B2 (en) 2015-05-08 2018-01-30 Toyota Jidosha Kabushiki Kaisha Misrecognition determination device
WO2018109999A1 (en) 2016-12-16 2018-06-21 クラリオン株式会社 Lane marking recognition device
US20210323550A1 (en) * 2020-04-21 2021-10-21 Mando Corporation Driver assistance apparatus
US11335085B2 (en) * 2019-07-05 2022-05-17 Hyundai Motor Company Advanced driver assistance system, vehicle having the same and method for controlling the vehicle
EP4148619A1 (en) * 2021-09-10 2023-03-15 Bayerische Motoren Werke Aktiengesellschaft Method and device for lane property detection and plausibilisation using a camera system installed at an automated vehicle and comprising at least three cameras

Cited By (19)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7421095B2 (en) 2004-05-19 2008-09-02 Honda Motor Co., Ltd. Traffic lane marking line recognition system for vehicle
WO2005111965A1 (en) * 2004-05-19 2005-11-24 Honda Motor Co., Ltd. Lane boundary recognition apparatus for vehicle
WO2005111964A1 (en) * 2004-05-19 2005-11-24 Honda Motor Co., Ltd. Lane boundary recognition apparatus for vehicle
WO2005111966A1 (en) * 2004-05-19 2005-11-24 Honda Motor Co., Ltd. Lane boundary recognition apparatus for vehicle
US7415133B2 (en) 2004-05-19 2008-08-19 Honda Motor Co., Ltd. Traffic lane marking line recognition system for vehicle
US7421094B2 (en) 2004-05-19 2008-09-02 Honda Motor Co., Ltd. Traffic lane marking line recognition system for vehicle
US7415134B2 (en) 2005-05-17 2008-08-19 Honda Motor Co., Ltd. Traffic lane marking line recognition system for vehicle
JP2006349580A (en) * 2005-06-17 2006-12-28 Denso Corp Traveling condition determination device and on-vehicle navigation device
JP2007147564A (en) * 2005-11-30 2007-06-14 Aisin Aw Co Ltd Image recognition device and method thereof, and own vehicle position recognition device and method
WO2008107944A1 (en) * 2007-03-01 2008-09-12 Pioneer Corporation Lane deviation prevention device, lane deviation prevention method, lane deviation prevention program, and recording medium
JP4744632B2 (en) * 2007-03-01 2011-08-10 パイオニア株式会社 Lane departure prevention apparatus, lane departure prevention method, lane departure prevention program, and storage medium
JP2010165142A (en) * 2009-01-15 2010-07-29 Nissan Motor Co Ltd Device and method for recognizing road sign for vehicle
JP2014106738A (en) * 2012-11-27 2014-06-09 Clarion Co Ltd In-vehicle image processing device
US9880554B2 (en) 2015-05-08 2018-01-30 Toyota Jidosha Kabushiki Kaisha Misrecognition determination device
WO2018109999A1 (en) 2016-12-16 2018-06-21 クラリオン株式会社 Lane marking recognition device
US11373417B2 (en) 2016-12-16 2022-06-28 Clarion Co., Ltd. Section line recognition device
US11335085B2 (en) * 2019-07-05 2022-05-17 Hyundai Motor Company Advanced driver assistance system, vehicle having the same and method for controlling the vehicle
US20210323550A1 (en) * 2020-04-21 2021-10-21 Mando Corporation Driver assistance apparatus
EP4148619A1 (en) * 2021-09-10 2023-03-15 Bayerische Motoren Werke Aktiengesellschaft Method and device for lane property detection and plausibilisation using a camera system installed at an automated vehicle and comprising at least three cameras

Similar Documents

Publication Publication Date Title
US11461595B2 (en) Image processing apparatus and external environment recognition apparatus
JP2002092796A (en) Lane recognizing device
US9129162B2 (en) Vehicular parking control system and vehicular parking control method using the same
JP6458384B2 (en) Lane detection device and lane detection method
JP2005346395A (en) Lane deviation alarm device
KR101481134B1 (en) System and method for estimating the curvature radius of autonomous vehicles using sensor fusion
JP2010078387A (en) Lane determining apparatus
JP6207952B2 (en) Leading vehicle recognition device
CN111480182B (en) Road map generation system and road map generation method
JP2015064795A (en) Determination device and determination method for parking place
US20200344448A1 (en) Image processing device and image processing method
JP6206036B2 (en) Parking lot judgment device, parking lot judgment method
US9988059B2 (en) Vehicle behavior detection device
JP4066869B2 (en) Image processing apparatus for vehicle
US20190362164A1 (en) Image recognition device, image recognition method, and parking assist system
JP3626733B2 (en) Lane recognition image processing apparatus and program for executing the processing
JP3716905B2 (en) Lane departure warning device
US20220366704A1 (en) Method for detecting an exit lane for a motor vehicle
US20220332328A1 (en) Device for determining a length of a vehicle combination
JP2015064797A (en) Determination device and determination method for parking place
US11816903B2 (en) Method for determining a type of parking space
JP3405819B2 (en) Vehicle running state determination device
JP2005332268A (en) Running road recognition device
JP2009255666A (en) Lane deviation alarm device of vehicle
JP6570302B2 (en) Driving assistance device