JP2005156199A - Vehicle detection method and vehicle detector - Google Patents


Info

Publication number
JP2005156199A
JP2005156199A (application JP2003391589A)
Authority
JP
Japan
Prior art keywords
vehicle
voting
peak
image
detection method
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2003391589A
Other languages
Japanese (ja)
Other versions
JP4123138B2 (en)
Inventor
Yasushi Otsuka (裕史 大塚)
Hiroshi Takenaga (寛 武長)
Tatsuhiko Moji (竜彦 門司)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Hitachi Ltd
Original Assignee
Hitachi Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Hitachi Ltd filed Critical Hitachi Ltd
Priority to JP2003391589A
Publication of JP2005156199A
Application granted
Publication of JP4123138B2
Anticipated expiration
Status: Expired - Fee Related

Landscapes

  • Length Measuring Devices By Optical Means (AREA)
  • Image Processing (AREA)
  • Traffic Control Systems (AREA)
  • Image Analysis (AREA)

Abstract

PROBLEM TO BE SOLVED: To specify, with high accuracy, the position and direction of vehicles ranging from nearby to distant, using a monocular camera alone or a monocular camera combined with a radar.
SOLUTION: The method/detector uses a voting scheme comprising: imaging means for acquiring an image of the scene ahead of the host vehicle; edge extraction means for processing the acquired image to extract brightness-change portions; voting means for casting votes into a voting space of a two-dimensional coordinate system based on edge-point information; and peak detection means for detecting peaks of the vote values in the voting space. Because vehicles are detected with a single monocular camera, the system can be realized inexpensively, and because it uses the edges at the two ends of a vehicle rather than a rear-face pattern, it can detect vehicles of many types.
COPYRIGHT: (C)2005,JPO&NCIPI

Description

The present invention relates to a vehicle detection method and a vehicle detection device using image processing.

There are many methods for detecting the position and direction of a vehicle from images captured by an in-vehicle camera, and they fall broadly into two groups: methods that detect and analyze edges, and methods that perform pattern matching. Many edge-based methods have been proposed, because they can follow changes in the apparent size of a vehicle and can locate its left and right ends. Stereo camera systems, in particular, detect parallax by analyzing edges. For example, one vehicle detection method finds pairs of vertical edges with two cameras and concludes that a pair marks the two ends of a vehicle when the parallaxes of the two edges are nearly equal; since the two cameras yield a distance, separating the vehicle from the background is comparatively easy and the result is reliable. However, this approach requires two cameras, which increases weight and cost, and calibrating the two cameras further raises production cost.

Consequently, many vehicle detection methods for a monocular camera have also been proposed. One proposal treats a region dense with horizontal edges in the host lane as a vehicle candidate, while in adjacent lanes it analyzes the motion of regions dense with vertical edges and promotes a region to a vehicle candidate when horizontal edges are detected (see, for example, Patent Document 1). However, dense horizontal edges can be detected only when the preceding vehicle is close; when the vehicle is distant and small, resolution limits mean the horizontal edges do not necessarily appear densely. Moreover, in backlit conditions the rear of the vehicle is largely in shadow and appears black, so horizontal edges are scarcely observed.

Other proposals include a method that sets an initial vehicle edge-contour model and thereafter matches against that model, and a method that first finds a horizontal edge in the shadow under a preceding vehicle in the host lane together with a pair of vertical edges marking the vehicle's two ends, and thereafter tracks the vehicle continuously from the position of the horizontal edge and the width between the vertical edges. Both work well when the initial detection is accurate, but a failed initial detection leads to subsequent misrecognition. Another proposed method derives vehicle candidates from the projection distributions of edges onto the vertical and horizontal axes, exploiting the fact that peaks appear at the top, bottom, left, and right ends of a vehicle. With this method, how the peaks appear depends on how the analysis window is set: if the window includes noise regions outside the vehicle, peaks appear elsewhere and the analysis becomes complicated, so the window must exclude non-vehicle regions as far as possible. Setting an optimal window, however, is difficult when the vehicle position is not yet known.

Patent Document 1: JP 2002-334330 A

An object of the present invention is to provide a system that can locate vehicles, from nearby to distant, with high accuracy using a monocular camera alone or a monocular camera combined with a radar.

The above problem is solved by a vehicle detection method characterized in that other vehicles traveling ahead of the host vehicle are detected by a voting scheme that casts votes into a voting space of a two-dimensional coordinate system based on an image captured of the scene ahead of the host vehicle.

Because this vehicle detection method performs detection with a monocular camera, it can be implemented inexpensively, and because it detects a vehicle from the edges at its two ends, it can detect vehicles of many types.

FIG. 1 shows the processing flow of the vehicle detection method of the present invention. In this embodiment, a vehicle is detected through: a vehicle-front image acquisition step (S1) that inputs an image from a camera mounted on the vehicle; an edge detection step (S2) that treats brightness-change portions of the input image as edges and detects the position, strength, and direction component of each edge; a vertical-edge-pair search step (S3) that judges from the direction component whether an edge point is a vertical edge and, if so, searches horizontally for a partner vertical edge; a vertical-edge-pair presence judgment step (S4) that branches the processing according to whether a pair was found; an xw-space voting step (S5) that, when a pair is found, computes the center position x and width w from the x coordinates of the two edges and casts a vote into xw space; a peak coordinate detection step (S6) that detects peak coordinates in xw space; and a vehicle information calculation step (S7) that computes the distance and direction to the vehicle from the xw information of the peak coordinates.
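The S1-S7 flow above can be sketched as a short driver routine. This is only an illustrative skeleton: the function names, signatures, and the injected step callables are my own and are not part of the patent.

```python
# Minimal sketch of the S1-S7 pipeline. All step functions are injected so the
# driver stays independent of any particular edge detector or peak finder.
def detect_vehicles(image, detect_edges, find_pairs, find_peaks, to_range_bearing):
    edges = detect_edges(image)                   # S2: position/strength/direction
    space = {}                                    # S5 prep: xw voting space, zeroed
    for (x1, _), (x2, _) in find_pairs(edges):    # S3/S4: vertical edge pairs
        xp, wp = (x1 + x2) // 2, x2 - x1          # center and width of the pair
        space[(xp, wp)] = space.get((xp, wp), 0) + 1  # S5: cast one vote
    return [to_range_bearing(xp, wp)              # S7: convert each peak
            for xp, wp in find_peaks(space)]      # S6: peaks in vote space
```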

FIG. 3 shows a configuration example of the inter-vehicle distance control system (hereinafter, ACC) of this embodiment. In the vehicle detection unit 301, the vehicle-front image obtained from a camera 304 directed forward in the traveling direction is transferred to image processing means 305. The image processing means 305 recognizes the position of the preceding vehicle in the image by the vehicle detection method of the present invention, calculates the positional relationship between the host vehicle and the preceding vehicle, and transfers it to an ACC control unit 302. From this positional relationship, the ACC control unit 302 signals an ACC execution unit 303 to perform accelerator control via accelerator control means 306 and brake control via brake control means 307 as needed so as to keep the inter-vehicle distance constant, thereby controlling the vehicle speed. The vehicle detection method is described in detail below with reference to FIG. 1.

Details of each step of the vehicle detection method are described with reference to FIG. 2.

First, in the vehicle-front image acquisition step S1, the vehicle-front image captured by the camera 304, installed so as to image the scene ahead of the vehicle, is transferred to the image processing means 305. The vehicle-front image is a grayscale image as in FIG. 2(a), which shows an example containing the tail of a preceding vehicle 201 and the background. The processing from the edge detection step S2 through the vehicle information calculation step S7 is performed in the image processing means 305.

In the edge detection step S2, brightness-change portions of the input image stored in an image memory (not shown) of the image processing means 305 are extracted as edges, and each is output as data with direction information attached. The data contain, for each edge point, position information x, y and direction information θ. The position is a value on the screen in pixels, and θ represents the direction of the edge on the screen, taking values from 0 to 360 degrees. In this embodiment, the edge direction is defined as the direction of increasing brightness rotated 90 degrees to the right, as indicated by the arrows in FIG. 2(b). The 90-degree rotation aligns the edge direction with the contour line; rotating to the left would also work, as long as one convention is used consistently.

The vertical-edge-pair search step S3 through the xw-space voting step S5 obtain candidates for the width and center position of a vehicle using the edge information detected in step S2. Because a brightness difference against the background, that is, a vertical edge, tends to appear at the left and right sides of a preceding vehicle, attention is paid to vertical edges. The vertical edges at the two ends of a preceding vehicle usually appear as a left-right pair, so they are treated as one set of left-right information.

In the vertical-edge-pair search step S3, only vertical edges are considered, and pairs of vertical edges aligned horizontally are searched for. In the example of FIG. 2, the two arrows enclosed by the detection region 202 in FIG. 2(c) are the edges at the two ends of the preceding vehicle. FIG. 2(d) is an enlargement of the detection region 202 of FIG. 2(c); the arrows indicate edge directions. For the edges at the two ends of a preceding vehicle, the directions are very likely to be exactly opposite, as in FIG. 2(d). Therefore, by ignoring left-right pairs of vertical edges with the same direction and searching only for pairs with opposite directions, combinations that are not true vehicle-end pairs can largely be avoided.
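A minimal sketch of this opposite-direction pair search, under assumptions of my own: edges are given as (x, y, θ) tuples, "vertical" means θ near 90° or 270° after the 90-degree rotation described earlier, and which end of the vehicle produces which direction is an illustrative choice, not stated in the patent.

```python
# Hedged sketch of step S3: pair horizontally aligned vertical edges whose
# directions are opposite. The tuple format (x, y, theta_deg) is assumed.
def find_vertical_edge_pairs(edges, angle_tol=20.0):
    ups = [(x, y) for x, y, th in edges if abs(th - 90.0) <= angle_tol]
    downs = [(x, y) for x, y, th in edges if abs(th - 270.0) <= angle_tol]
    pairs = []
    for xl, yl in ups:              # assume the left end yields the "up" edge
        for xr, yr in downs:
            if yl == yr and xr > xl:    # same row, left-to-right order
                pairs.append(((xl, yl), (xr, yr)))
    return pairs
```

Same-direction pairs are simply never matched, which implements the "ignore same-direction pairs" rule in the text.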

In the vertical-edge-pair presence judgment step S4, it is judged whether unprocessed vertical edge pairs remain in the image; if the search is not yet finished, processing proceeds to the xw-space voting step S5, and once the search is complete, it switches to the peak coordinate detection step S6.

In the xw-space voting step S5, the center position xp and width wp are computed, as in FIG. 2(d), from the x coordinates of the vertical edge pair found in step S3, and a vote is cast at the coordinates (xp, wp) 203 in the xw space shown in FIG. 2(e). Casting a vote at the coordinates (xp, wp) 203 means adding some value (for example, 1) to the value stored at those coordinates. The xw space is allocated in memory before step S3, with all coordinates cleared to zero. In other words, a two-dimensional frequency distribution of (x, w) computed from vertical edge pairs is accumulated. With the left edge of the pair at (x1, y1) and the right edge at (x2, y1) as in FIG. 2(d), xp and wp are computed by the following equations.

xp = (x1 + x2) / 2   (1)

wp = x2 - x1   (2)

Here, x2 > x1. Since the preceding vehicle is imaged from behind and is roughly rectangular, the vertical edge pairs at its two ends converge to values near (xp, wp) even as the y coordinate varies. Repeated voting by the edge pairs at the two ends of the preceding vehicle therefore increases the value at the coordinates (xp, wp) 203 and their neighborhood, so a peak of the vote values appears near (xp, wp) 203.
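The accumulation described above, using equations (1) and (2), can be sketched as follows; the sparse-dictionary representation of the voting space is my own choice (the patent only requires a zero-initialized 2-D array).

```python
from collections import defaultdict

# Sketch of step S5: accumulate votes in the (x, w) space of eqs. (1)-(2).
def vote_xw(pairs, vote_value=1):
    space = defaultdict(int)        # every coordinate implicitly starts at zero
    for (x1, _), (x2, _) in pairs:
        xp = (x1 + x2) // 2         # eq. (1), on an integer pixel grid
        wp = x2 - x1                # eq. (2); assumes x2 > x1
        space[(xp, wp)] += vote_value
    return space
```

Pairs from different image rows that belong to the same vehicle land on (nearly) the same (xp, wp) cell, which is exactly how the peak builds up.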

In this embodiment, the horizontal axis of the voting space is defined as the vehicle center position x and the vertical axis as the vehicle width w, but any definition is acceptable as long as the center position and width can ultimately be obtained. For example, the voting space could use the x coordinate of the left edge as the horizontal axis and the x coordinate of the right edge as the vertical axis, with the center position and width computed after the peak position is found. Any axes from which the vehicle center position and width can finally be recovered may be used.

In the peak coordinate detection step S6, the peak coordinates in xw space are searched for to obtain the combination (xp, wp) of vehicle center position and vehicle width. Vertical edge pairs arise not only from the two ends of the preceding vehicle but also from noise sources such as its license plate and tail lamps, guardrails, and utility poles, and votes are cast for those as well. Moreover, several preceding vehicles may be present in adjacent lanes and elsewhere, so in practice multiple peaks appear, and it must be determined which peak belongs to the preceding vehicle.
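A simple way to realize step S6 is to keep every cell whose vote count clears a threshold, ordered strongest first; the threshold value and the sorting are my own illustrative choices (a fuller implementation might also merge adjacent cells into one peak).

```python
# Sketch of step S6: return vote-space coordinates with enough votes,
# strongest first, as candidate (center, width) peaks.
def find_peaks(space, min_votes=2):
    peaks = [(coord, v) for coord, v in space.items() if v >= min_votes]
    peaks.sort(key=lambda p: p[1], reverse=True)
    return [coord for coord, _ in peaks]
```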

One way to make this judgment is image processing. Since the left and right ends of the corresponding region can be computed back from a detected peak, that region is examined to see whether it looks like a preceding vehicle. For example, because a shadow forms at the lower edge of a vehicle, the region can be judged to be a vehicle by using that information to search the region for a horizontal line along which horizontal edges are aligned, or for a dark horizontal line whose brightness is well below the average.
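The dark-horizontal-line test above can be sketched as follows. The input format (rows of grayscale values from the back-projected region) and the 0.5 darkness factor are my own assumptions for illustration.

```python
# Hedged sketch of the shadow test: does any row of the candidate region have
# a mean brightness well below the region's overall mean?
def has_dark_row(region_rows, factor=0.5):
    means = [sum(row) / len(row) for row in region_rows]
    overall = sum(means) / len(means)
    return any(m < overall * factor for m in means)
```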

Another way to judge whether a peak is a vehicle is sensor fusion using radar information. That is, a radar 401 for detecting the distance and direction to the preceding vehicle is incorporated into the configuration as in FIG. 4. Any radar that can detect distance and direction will do, whether millimeter-wave, laser, or otherwise. When radar information is used, the distance and direction obtained by the radar 401 can restrict the range of xw space in which peaks are searched for, so noise peaks can be excluded efficiently.

A method of restricting the peak-search range in xw space from the distance and direction obtained by the radar 401 is described below. The vehicle center position x and vehicle width w are values on the image, but if the actual vehicle width (assumed to be, say, 1.7 m) and the camera parameters (CCD size, focal length, and so on) are known, the actual distance and direction to the vehicle are uniquely determined from x and w. In other words, the xw information and the actual distance-direction information correspond one to one and are mutually convertible. Since the radar information thus predicts where in xw space a peak should appear, converting the radar's distance and direction into xw and searching for peaks only in that neighborhood excludes noise peaks from the outset.
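Under a pinhole camera model, the xw-to-(distance, direction) conversion described above can be sketched as below. The 1.7 m width is the text's example; the focal length in pixels (f_px) and the image center (cx) are assumed camera parameters, and the pinhole model itself is my reading of the text, which only states that the conversion is unique given the camera parameters.

```python
import math

# Sketch of xw -> (range, bearing) under a pinhole model: w pixels subtend the
# assumed real width, and x offsets from the image center give the bearing.
def xw_to_range_bearing(x, w, f_px=1000.0, cx=320.0, real_width_m=1.7):
    distance = f_px * real_width_m / w      # similar triangles: Z = f * W / w
    bearing = math.atan2(x - cx, f_px)      # radians; 0 means straight ahead
    return distance, bearing
```

Inverting this mapping (distance, bearing -> xw) is what lets a radar return define a small search window in the voting space.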

The merits of sensor fusion seen from the radar 401 side are as follows. One weakness of the radar 401 is poor lateral accuracy: because the preceding vehicle has width, it cannot be determined from which part of the vehicle a reflected wave returned. The present vehicle detection method, by contrast, detects both ends of the vehicle in the image and is therefore laterally accurate, so fusion brings a large benefit. The radar 401 can also falsely detect reflections from guardrails and the like. If such false distance and direction information is converted into xw and a peak is searched for only in that neighborhood, a peak is unlikely to exist because the object is not a vehicle, so non-vehicle objects can be identified as false detections.

In the vehicle information calculation step S7, the coordinate values (x, w) of the peak detected in step S6 are converted into actual distance and direction information, and processing ends. As described above, if the actual vehicle width (assumed to be, say, 1.7 m) and the camera parameters (CCD size, focal length, and so on) are known, the actual distance and direction to the vehicle are uniquely determined from x and w.

Next, the edge detection method is described. There are various ways to detect only the position of an edge, but methods that also compute the direction component include the Sobel filter and the two-dimensional zero-crossing method. In this embodiment, a method using a 3×3 Sobel filter is described.

A 3×3 Sobel filter multiplies the nine pixel values above, below, left, and right of a pixel of interest (and the pixel itself) by the coefficients shown in FIG. 5 and sums the results to obtain the edge strength. This is done with a vertical coefficient matrix 402 and a horizontal coefficient matrix 403.

With Gx the sum for the vertical direction and Gy the sum for the horizontal direction, the value G at the pixel of interest is given by Equation (3).

G = √(Gx² + Gy²)   (3)

This value G represents the edge strength; the larger it is, the greater the brightness difference near that point. If G exceeds a preset edge extraction threshold, the pixel is extracted as an edge point. To prevent missed extractions the threshold must be set low enough, but setting it too low makes the number of edges enormous, so it must be chosen with processing time and misrecognition in mind.

The edge direction θ is obtained from the direction components Gx and Gy of the edge strength by Equation (4) below.

θ = tan⁻¹(Gy / Gx)   (4)

As stated above, the direction θ represents the direction of the edge on the screen and takes values from 0 degrees up to, but not including, 360 degrees. The edge direction actually used is this direction shifted 90 degrees to the right relative to the direction of increasing brightness.
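The Sobel step S2 described above, combining equations (3) and (4) with the threshold and the 90-degree rotation, can be sketched as follows. Kernel orientation conventions vary between texts, so treat the particular Gx/Gy assignment here as an assumption; a quadrant-aware atan2 is used so that θ covers the full 0-360 degree range.

```python
import math

# Sketch of step S2 with a 3x3 Sobel filter. KX responds to horizontal
# brightness change (vertical edges), KY to vertical change (horizontal edges).
KX = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
KY = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_edges(img, threshold=100.0):
    h, w = len(img), len(img[0])
    edges = []
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = sum(KX[j][i] * img[y - 1 + j][x - 1 + i]
                     for j in range(3) for i in range(3))
            gy = sum(KY[j][i] * img[y - 1 + j][x - 1 + i]
                     for j in range(3) for i in range(3))
            g = math.hypot(gx, gy)                            # eq. (3)
            if g > threshold:                                 # extraction test
                theta = math.degrees(math.atan2(gy, gx)) % 360.0  # eq. (4)
                theta = (theta + 90.0) % 360.0   # 90-degree rotation in the text
                edges.append((x, y, g, theta))
    return edges
```

Run on a synthetic dark-to-bright vertical step, this yields edge points along the step with θ = 90 degrees, i.e. a vertical edge in the convention used by step S3.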

FIG. 1 is the processing flow of a vehicle detection method according to an embodiment of the present invention.
FIG. 2 is a flow diagram of the image processing of the vehicle detection method according to an embodiment of the present invention.
FIG. 3 is a system configuration diagram of an ACC according to an embodiment of the present invention.
FIG. 4 is a system configuration diagram of an ACC according to another embodiment of the present invention.
FIG. 5 shows the coefficient matrices of the 3×3 Sobel filter.

Explanation of symbols

201 … preceding vehicle; 202 … detection region; 203 … coordinates (xp, wp); 301 … vehicle detection unit; 302 … ACC control unit; 303 … ACC execution unit; 304 … camera; 305 … image processing means; 306 … accelerator control means; 307 … brake control means; 401 … radar.

Claims (5)

1. A vehicle detection method comprising: capturing an image of the area ahead of a host vehicle by imaging means; extracting brightness-change portions of the captured image as edge points by edge point extraction means; casting votes into a voting space of a two-dimensional coordinate system based on the edge point information by voting means; detecting a peak of the vote values in the voting space by peak detection means; and detecting the coordinates of the peak as a vehicle.

2. The vehicle detection method according to claim 1, wherein the information voted into the voting space is information obtained from the horizontal positions of two vertical edges located along a horizontal direction in the image.

3. The vehicle detection method according to claim 1 or 2, further comprising ranging means for detecting the distance and direction to a preceding vehicle, wherein the peak detection means specifies the range in which it detects the peak of the vote values in the voting space based on the distance and direction data measured by the ranging means.

4. A vehicle detection method that detects another vehicle traveling ahead of a host vehicle by a voting scheme that casts votes into a voting space of a two-dimensional coordinate system based on an image captured of the area ahead of the host vehicle.

5. A vehicle detection device comprising: imaging means for acquiring an image of the area ahead of a host vehicle; edge point extraction means for processing the image acquired by the imaging means and extracting brightness-change portions; voting means for casting votes into a voting space of a two-dimensional coordinate system based on the edge point information; and peak detection means for detecting a peak of the vote values in the voting space.

JP2003391589A 2003-11-21 2003-11-21 Vehicle detection method and vehicle detection device Expired - Fee Related JP4123138B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2003391589A JP4123138B2 (en) 2003-11-21 2003-11-21 Vehicle detection method and vehicle detection device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2003391589A JP4123138B2 (en) 2003-11-21 2003-11-21 Vehicle detection method and vehicle detection device

Publications (2)

Publication Number Publication Date
JP2005156199A true JP2005156199A (en) 2005-06-16
JP4123138B2 JP4123138B2 (en) 2008-07-23

Family

ID=34718555

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2003391589A Expired - Fee Related JP4123138B2 (en) 2003-11-21 2003-11-21 Vehicle detection method and vehicle detection device

Country Status (1)

Country Link
JP (1) JP4123138B2 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0641550U (en) * 1992-11-20 1994-06-03 株式会社イナックス Kitchen cabinets
JP2019160251A (en) 2018-03-16 2019-09-19 株式会社リコー Image processing device, object recognition device, instrument control system, moving body, image processing method and program

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH09246954A (en) * 1995-06-07 1997-09-19 Internatl Business Mach Corp <Ibm> Programmable gate array for performing improved memory assignment
JP2001134772A (en) * 1999-11-04 2001-05-18 Honda Motor Co Ltd Object recognizing device
JP2003187220A (en) * 2001-12-14 2003-07-04 Toshiba Corp Object detector and its detecting method

Cited By (14)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7710310B2 (en) * 2003-03-25 2010-05-04 Nederlandse Organisatie Voor Toegepast-Natuurwetenschappelijk Onderzoek Tno Detection system, method for detecting objects and computer program therefor
JP2008215938A (en) * 2007-03-01 2008-09-18 Railway Technical Res Inst Apparatus for recognizing ahead-train railway signal/sign
EP1975903A2 (en) 2007-03-26 2008-10-01 Hitachi, Ltd. Vehicle collision avoidance equipment and method
EP2045623A1 (en) 2007-09-28 2009-04-08 Hitachi, Ltd. Vehicle detection apparatus
JP2010151621A (en) * 2008-12-25 2010-07-08 Fujitsu Ten Ltd Signal processor and radar system
US9097801B2 (en) 2011-06-15 2015-08-04 Honda Elesys Co., Ltd. Obstacle detection apparatus and obstacle detection program
EP2570963A2 (en) 2011-09-15 2013-03-20 Clarion Co., Ltd. Systems, devices, and methods for recognizing objects as perceived from a vehicle
JP2015185045A (en) * 2014-03-25 2015-10-22 本田技研工業株式会社 Object detection processing apparatus
CN108922188A (en) * 2018-07-24 2018-11-30 河北德冠隆电子科技有限公司 The four-dimensional outdoor scene traffic of radar tracking positioning perceives early warning monitoring management system
CN108922188B (en) * 2018-07-24 2020-12-29 河北德冠隆电子科技有限公司 Radar tracking and positioning four-dimensional live-action traffic road condition perception early warning monitoring management system
WO2021172535A1 (en) * 2020-02-27 2021-09-02 株式会社デンソー Object detecting device
JP7459560B2 (en) 2020-02-27 2024-04-02 株式会社デンソー object detection device
CN113655486A (en) * 2021-08-16 2021-11-16 安徽江淮汽车集团股份有限公司 Automatic parking method based on single rearview camera and multiple radars
CN113655486B (en) * 2021-08-16 2023-08-25 安徽江淮汽车集团股份有限公司 Automatic parking method based on single rearview camera and multiple radars

Also Published As

Publication number Publication date
JP4123138B2 (en) 2008-07-23

Similar Documents

Publication Publication Date Title
CN107272021B (en) Object detection using radar and visually defined image detection areas
US8867790B2 (en) Object detection device, object detection method, and program
US8976999B2 (en) Vehicle detection apparatus
WO2017158958A1 (en) Image processing apparatus, object recognition apparatus, device control system, image processing method, and program
JP7206583B2 (en) Information processing device, imaging device, device control system, moving object, information processing method and program
WO2017138245A1 (en) Image processing device, object recognition device, device control system, and image processing method and program
JP2004112144A (en) Front car tracking system and method for tracking front car
JP2002352225A (en) Obstacle detector and its method
WO2020154990A1 (en) Target object motion state detection method and device, and storage medium
CN108645375B (en) Rapid vehicle distance measurement optimization method for vehicle-mounted binocular system
JP4123138B2 (en) Vehicle detection method and vehicle detection device
CN105825495A (en) Object detection apparatus and object detection method
US10546383B2 (en) Image processing device, object recognizing device, device control system, image processing method, and computer-readable medium
US10984263B2 (en) Detection and validation of objects from sequential images of a camera by using homographies
JP3729025B2 (en) Pedestrian detection device
WO2017094300A1 (en) Image processing device, object recognition device, device conrol system, image processing method, and program
JP4052291B2 (en) Image processing apparatus for vehicle
JP6516012B2 (en) Image processing apparatus, object recognition apparatus, device control system, image processing method and program
CN108629225B (en) Vehicle detection method based on multiple sub-images and image significance analysis
JP2018073275A (en) Image recognition device
JP2010224936A (en) Object detection device
JP2005141517A (en) Vehicle detecting method and device
JP2001082954A (en) Image processing device and image processing distance- measuring method
KR101501531B1 (en) Stereo Vision-based Pedestrian Detection System and the method of
JP4788399B2 (en) Pedestrian detection method, apparatus, and program

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20060314

RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20060424

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20071220

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20080108

A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20080215

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20080408

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20080421

R151 Written notification of patent or utility model registration

Ref document number: 4123138

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20110516

Year of fee payment: 3

S111 Request for change of ownership or part of ownership

Free format text: JAPANESE INTERMEDIATE CODE: R313111

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20120516

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20130516

Year of fee payment: 5

S533 Written request for registration of change of name

Free format text: JAPANESE INTERMEDIATE CODE: R313533

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

LAPS Cancellation because of no payment of annual fees