JPH09152334A - Object detection apparatus - Google Patents

Object detection apparatus

Info

Publication number
JPH09152334A
JPH09152334A (application JP31267795A)
Authority
JP
Japan
Prior art keywords
distance
image
rectangular area
plane
measured
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP31267795A
Other languages
Japanese (ja)
Inventor
Yasuyuki Domoto
泰之 道本
Katsumasa Onda
勝政 恩田
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Panasonic Holdings Corp
Original Assignee
Matsushita Electric Industrial Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Matsushita Electric Industrial Co Ltd
Priority to JP31267795A (JPH09152334A)
Priority to US08/704,895 (US5825915A)
Priority to CA002184561A (CA2184561C)
Priority to EP96114009A (EP0762326B1)
Priority to DE69625423T (DE69625423T2)
Publication of JPH09152334A
Legal status: Pending

Abstract

PROBLEM TO BE SOLVED: To provide an object detection apparatus that detects the presence of an object precisely and stably, and that reduces the amount of computation in the matching processing of a stereo image, by detecting the change over time in the distance to the object in stereo image processing used as an input method for three-dimensional information. SOLUTION: The object detection apparatus detects the presence of an object from the change in distance over time. A plane estimation unit 17 estimates the position of a plane such as a road, floor, or wall, and the distance of any rectangular region of the plane in which the distance could not be measured is computed and interpolated from the distances to the plane in the rectangular regions in which the distance could be measured. An object can thereby be detected stably and reliably, and the amount of matching computation in the matching processing unit 7 can be reduced by limiting the parallax search range to the interval between the imaging devices and the plane.

Description

Detailed Description of the Invention

[0001]

[Technical Field of the Invention] The present invention relates to an object detection apparatus that performs object monitoring, obstacle detection, and the like by the stereo image ranging method.

[0002]

[Prior Art] FIG. 3 is a block diagram of a conventional object detection apparatus that detects an object based on the theory described in Keiji Saneyoshi et al., "Driving Support System Using 3-D Image Recognition Technology" (Society of Automotive Engineers of Japan, academic lecture preprint collection 924, 1992-10). The operation of this object detection apparatus is described below.

[0003] The image captured by the left imaging device 1 is stored in the left image memory 3 as the reference left image 2 (see FIG. 4). The image captured by the right imaging device 4, which is arranged so that its optical axis is parallel to that of the imaging device 1, is stored in the right image memory 6 as the right image 5 (see FIG. 5).

[0004] The matching processing unit 7 first divides the left image 2, as shown in FIG. 4, into (M × N) rectangular regions 8, M in the horizontal direction (the X-axis direction in FIG. 4) and N in the vertical direction (the Y-axis direction in FIG. 4), further divides each rectangular region 8 into (m × n) pixels 9, m horizontally and n vertically, and then detects, for each rectangular region 8, the pixel 9 having the i-th brightness Li among the (m × n) pixels 9.

[0005] Also, within the search range 10 of the right image 5 in which an image similar to that of the rectangular region 8 of the left image 2 may exist, each time a search rectangular region of (m × n) pixels, m horizontal and n vertical, is moved one pixel in the horizontal direction, the pixel having the i-th brightness Ri among the pixels of the search rectangular region is detected.

[0006] Then the brightness of each pixel 9 of the rectangular region 8 of the left image 2 is compared, pixel by pixel, with that of the pixel at the corresponding position in the search rectangular region of the right image 5, and the similarity evaluation value C between the image of the rectangular region 8 of the left image 2 and the image of the search rectangular region of the right image 5 is obtained by (Equation 1).

[0007]

(Equation 1)

[0008] As a result, the position of the search rectangular region in the right image 5 at which the similarity evaluation value C of the two images is minimized is judged to be the region corresponding to the rectangular region 8 of the left image 2 (hereinafter the "corresponding region"), and the parallax d is obtained by (Equation 2) from the shift between the coordinate X of the rectangular region 8 of the left image 2 and the coordinate XR of the corresponding region of the right image 5, and is output.

[0009]

(Equation 2) d = X − XR

Note that the operation of obtaining the similarity evaluation value C between the image of the rectangular region 8 of the left image 2 and the image of the search rectangular region of the right image 5, and then obtaining the parallax d from the shift between the coordinate X of the rectangular region of the left image 2 and the coordinate XR of the corresponding region of the right image 5, is performed sequentially for the rectangular regions 8 at all coordinates of the left image 2.
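The matching step of paragraphs [0004] to [0009] can be sketched as follows. The patent's Equation 1 is not reproduced in this text, so the sum of absolute differences (SAD) used below is an illustrative assumption, as are the function names and the representation of a search range as a list of candidate blocks indexed by their XR coordinate; the disparity d = X − XR at the minimum of C follows Equation 2.

```python
# Hypothetical sketch of the block-matching step. Blocks are (m x n) lists
# of pixel brightness values; lower C means the blocks are more similar.

def region_similarity(left_block, right_block):
    """Similarity evaluation value C between two equal-sized blocks (SAD, an assumption)."""
    return sum(
        abs(l - r)
        for row_l, row_r in zip(left_block, right_block)
        for l, r in zip(row_l, row_r)
    )

def find_disparity(left_block, right_row_blocks, x):
    """Slide over the candidate blocks of the search range one position at a
    time, keep the position XR minimizing C, and return d = X - XR."""
    best_xr, best_c = None, None
    for xr, right_block in enumerate(right_row_blocks):
        c = region_similarity(left_block, right_block)
        if best_c is None or c < best_c:
            best_c, best_xr = c, xr
    return x - best_xr
```

In a full implementation this loop would run for every rectangular region 8 of the left image, as paragraph [0009] describes.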

[0010] Then, based on the parallax d obtained for each rectangular region 8 of the left image 2 by the matching processing unit 7, the distance measurement unit 11 obtains the distance K(X, Y) [where 0 < X ≤ M, 0 < Y ≤ N] from the imaging device 1 or the imaging device 4 to the object, measured for each rectangular region 8, by the generally known (Equation 3), converts it into an image showing the distance for each rectangular region 8 (hereinafter the "distance image"), and outputs it.

[0011]

(Equation 3)

[0012] Here, 2a is the spacing between the optical axis of the imaging device 1 and that of the imaging device 4, and f is the focal length of the lenses of the imaging devices 1 and 4. The initial distance image memory 12 stores, as the initial distance image, the distance image output from the distance measurement unit 11 at the measurement reference time t0, and the distance image memory 13 sequentially stores the distance images output from the distance measurement unit 11 at measurement times t (t > t0) after the measurement reference time t0.
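Equation 3 itself is not reproduced in this text. For parallel-axis stereo with baseline 2a and focal length f, the standard relation K = 2a·f / d is consistent with the quantities named in paragraph [0012]; the sketch below assumes that formula, with disparity and focal length expressed in consistent units (e.g. both in pixels).

```python
# Hedged sketch of the per-region distance computation of paragraph [0010],
# assuming the standard stereo formula K = 2a * f / d (not confirmed by the
# reproduced text, since the equation image is missing).

def distance_from_disparity(d, baseline_2a, focal_f):
    """Distance K to the object for one rectangular region; None when the
    disparity is zero or matching failed, i.e. no distance is measured."""
    if d is None or d <= 0:
        return None
    return baseline_2a * focal_f / d
```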

[0013] The distance difference detection unit 14 compares, in coordinate order, the rectangular regions 8 at the same coordinates in the initial distance image stored in the initial distance image memory 12 and in the distance image stored in the distance image memory 13, identifies the rectangular regions 8 at coordinates where the distance difference between the initial distance image and the distance image is equal to or greater than a threshold, judges that a new object has appeared, or an existing object has disappeared, in the rectangular regions 8 at those coordinates, and outputs an image 15 of the detected object.
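The comparison in paragraph [0013] amounts to a per-region thresholded difference between two distance images. In this sketch a distance image is a grid of per-region distances, with None marking regions where no distance was measured; that sentinel and the grid representation are assumptions for illustration.

```python
# Sketch of the distance-difference detection unit 14: flag every rectangular
# region whose distance changed by at least the threshold th between the
# initial distance image and the current one (object appeared or disappeared).

def detect_changes(initial, current, th):
    """Return coordinates (x, y) of rectangular regions whose distance
    differs from the initial distance image by th or more."""
    changed = []
    for y, (row0, row1) in enumerate(zip(initial, current)):
        for x, (k0, k1) in enumerate(zip(row0, row1)):
            if k0 is None or k1 is None:
                continue  # distance not measured in one of the images
            if abs(k1 - k0) >= th:
                changed.append((x, y))
    return changed
```

As the following paragraphs note, the scheme presupposes that both distance images are available in most regions; the plane interpolation of the invention addresses exactly the regions where they are not.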

[0014]

[Problems to Be Solved by the Invention] As described above, the conventional object detection apparatus detects an object by determining whether there is a rectangular region 8 in which a distance difference arises between the initial distance image at the measurement reference time t0 and the distance image at time t. This, however, presupposes that the initial distance image and the distance image are obtained in most of the rectangular regions 8.

[0015] However, when the distance to an object with no distinctive shape and little variation in lightness or luminance, such as a road, floor, or wall, is to be obtained, the number of rectangular regions 8 in which the minimum of the similarity evaluation value C is ambiguous increases, and the initial distance image or the distance image may not be obtained. Distance measurement then becomes difficult and the object can no longer be detected; in particular, if the initial distance image that serves as the reference for distance measurement cannot be obtained, the false detection rate increases and object detection is impossible from the outset, which was a serious problem.

[0016] Further, the matching processing unit 7 takes as its parallax search range the interval between the maximum value dmax and the minimum value dmin of the parallax d corresponding to the distances at which the object to be measured exists (see FIGS. 6 and 7), and this parallax search range is set uniformly over the whole of the left image 2 and the right image 5.

[0017] However, in most cases the object to be detected exists in front of a plane 16 such as a road, floor, or wall, that is, on the side of the imaging devices 1 and 4. When the imaging devices 1 and 4 face the plane 16 at a right angle as in FIG. 6, the minimum value dmin of the parallax d can be treated as equivalent to the parallax d at the position of the plane 16; but when the imaging devices 1 and 4 are inclined by an angle θ with respect to the plane 16 as in FIG. 7, the parallax search must be performed uniformly even for positions beyond the plane 16, where no object can exist. The amount of computation in the matching processing unit 7, which already requires the most computation in an object detection apparatus using the stereo image ranging method, therefore increases further, making object distance measurement inefficient and increasing the false detection rate.

[0018]

[Means for Solving the Problems] The present invention was made to solve these problems. Noting that a plane such as a road, floor, or wall can be assumed to occupy most of the left and right images, the distance K(X, Y) in a rectangular region of the plane at coordinates where the distance could not be measured is estimated and interpolated, by the least-squares method or the like, from the distances K(X, Y) at the plural coordinates partially obtained from the plane. The position of the plane can thus be estimated, fewer objects are missed, and object detection becomes stable.

[0019] Further, by setting the parallax search range according to the estimated position of the plane, so that a parallax search range with little waste is set for the rectangular region at each coordinate, the amount of matching processing, which requires the most computation in stereo image processing, can be greatly reduced.

[0020] First, according to the present invention, after the position of a common plane is estimated from the distances to the object measured for each of the rectangular regions into which the image is divided horizontally and vertically, the distance of any rectangular region of the plane in which the distance could not be measured is computed and interpolated from the distances to the plane in the rectangular regions in which the distance could be measured. Alternatively, the distances to the object measured for each rectangular region are converted into a distance image indicating the distances, the position of the common plane is estimated from the distance image, and the distance image of any rectangular region of the plane for which no distance image was obtained is computed and interpolated from the distance images, up to the plane, of the rectangular regions for which a distance image was obtained. The number of rectangular regions in which the distance could not be measured thereby decreases, object detection becomes accurate and stable, and the processing amount is reduced at the same time.

[0021] Second, according to the present invention, after the position of the common plane is estimated from the distances to the object initially measured for each rectangular region, subsequent measurements of the distance to the object are limited to the parallax search range between the imaging devices and the plane. Alternatively, after the position of the common plane is estimated from the initially measured distances, the shift of the object image is detected within a parallax search range limited to the interval between the imaging devices and the plane, and the distance to the object is measured. The plane thus needs to be estimated only at the start of object detection, and matching need not be performed for spaces in which no object can exist, such as behind a plane like a road, so the amount of matching computation in the matching processing unit can be reduced.

[0022]

[Embodiments of the Invention] Embodiments of the present invention will be described below with reference to the drawings. Reference numerals identical to those in FIGS. 3 to 7 denote identical parts.

[0023] (Embodiment 1) FIG. 1 is a block diagram of an object detection apparatus according to the first embodiment of the present invention; the operation of this object detection apparatus is described below.

[0024] The image captured by the left imaging device 1 is stored in the left image memory 3 as the reference left image 2 (see FIG. 4). The image captured by the right imaging device 4, which is arranged so that its optical axis is parallel to that of the imaging device 1, is stored in the right image memory 6 as the right image 5 (see FIG. 5).

[0025] The matching processing unit 7 first divides the left image 2, as shown in FIG. 4, into (M × N) rectangular regions 8, divides each rectangular region 8 into (m × n) pixels 9, and then detects, for each rectangular region 8, the pixel 9 having the i-th brightness Li among the (m × n) pixels 9.

[0026] Also, within the search range 10 of the right image 5 in which an image similar to that of the rectangular region 8 of the left image 2 may exist, each time a search rectangular region of (m × n) pixels is moved one pixel in the horizontal direction, the pixel having the i-th brightness Ri among the pixels of the search rectangular region is detected.

[0027] Then the brightness of each pixel 9 of the rectangular region 8 of the left image 2 is compared, pixel by pixel, with that of the pixel at the corresponding position in the search rectangular region of the right image 5, the similarity evaluation value C between the image of the rectangular region 8 of the left image 2 and the image of the search rectangular region of the right image 5 is obtained by the formula of (Equation 1), the position of the search rectangular region in the right image 5 at which the similarity evaluation value C of the two images is minimized is judged to be the region corresponding to the rectangular region 8 of the left image 2 (hereinafter the "corresponding region"), and the parallax d is obtained by the formula of (Equation 2) from the shift between the coordinate X of the rectangular region 8 of the left image 2 and the coordinate XR of the corresponding region of the right image 5, and is output.

[0028] Note that the operation of obtaining the similarity evaluation value C between the image of the rectangular region 8 of the left image 2 and the image of the search rectangular region of the right image 5, and then obtaining the parallax d from the shift between the coordinate X of the rectangular region of the left image 2 and the coordinate XR of the corresponding region of the right image 5, is performed sequentially for the rectangular regions 8 at all coordinates of the left image 2.

[0029] Then, based on the parallax d obtained for each rectangular region 8 of the left image 2 by the matching processing unit 7, the distance measurement unit 11 obtains the distance K(X, Y) [where 0 < X ≤ M, 0 < Y ≤ N] from the imaging device 1 or the imaging device 4 to the object, measured for each rectangular region 8, by the generally known formula of (Equation 3), converts it into an image showing the distance for each rectangular region 8 (hereinafter the "distance image"), and outputs it.

[0030] The plane estimation unit 17 estimates the position of the plane 16 in real space by a method such as the least-squares method, using the distance images corresponding to the plane 16 among the distance images of the rectangular regions 8 output from the distance measurement unit 11 at the measurement reference time t0, and thereby interpolates and outputs the distance images of those rectangular regions 8 of the plane that were not output from the distance measurement unit 11 at the measurement reference time t0.
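The interpolation performed by the plane estimation unit 17 can be sketched as a least-squares plane fit over the region grid. The patent only says "the least-squares method or the like"; the linear model K = a·X + b·Y + c in grid coordinates, the None sentinel for unmeasured regions, and the function names below are illustrative assumptions.

```python
# Hedged sketch of plane estimation: fit K = a*x + b*y + c by least squares
# to the regions where a distance was measured, then fill the regions where
# matching failed (None) with the fitted plane value.

def fit_and_fill_plane(dist):
    pts = [(x, y, k) for y, row in enumerate(dist)
                     for x, k in enumerate(row) if k is not None]
    # Build the 3x3 normal equations for the coefficients [a, b, c].
    A = [[0.0] * 3 for _ in range(3)]
    rhs = [0.0] * 3
    for x, y, k in pts:
        v = (x, y, 1.0)
        for i in range(3):
            rhs[i] += v[i] * k
            for j in range(3):
                A[i][j] += v[i] * v[j]
    a, b, c = _solve3(A, rhs)
    return [[k if k is not None else a * x + b * y + c
             for x, k in enumerate(row)] for y, row in enumerate(dist)]

def _solve3(A, rhs):
    # Gaussian elimination with partial pivoting on the 3x3 system.
    n = 3
    M = [row[:] + [r] for row, r in zip(A, rhs)]
    for col in range(n):
        piv = max(range(col, n), key=lambda r: abs(M[r][col]))
        M[col], M[piv] = M[piv], M[col]
        for r in range(col + 1, n):
            f = M[r][col] / M[col][col]
            for c in range(col, n + 1):
                M[r][c] -= f * M[col][c]
    x = [0.0] * n
    for r in range(n - 1, -1, -1):
        x[r] = (M[r][n] - sum(M[r][c] * x[c] for c in range(r + 1, n))) / M[r][r]
    return x
```

In practice the fit would use only the regions believed to lie on the plane 16, as paragraph [0030] specifies, not every measured region.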

[0031] The initial distance image memory 12 stores the distance image output from the plane estimation unit 17 as the initial distance image, and the distance image memory 13 sequentially stores the distance images output from the distance measurement unit 11 at measurement times t (t > t0) after the measurement reference time t0.

[0032] The distance difference detection unit 14 then compares, in coordinate order, the rectangular regions 8 at the same coordinates in the initial distance image stored in the initial distance image memory 12 and in the distance image stored in the distance image memory 13, identifies the rectangular regions 8 at coordinates where the distance difference between the initial distance image and the distance image is equal to or greater than a threshold TH, judges that a new object has appeared, or an existing object has disappeared, in the rectangular regions 8 at those coordinates, and outputs an image 15 of the detected object.

[0033] In this way, even for a plane with few features, such as a road, floor, or wall, for which matching of the rectangular regions 8 has been considered difficult in the stereo image ranging method, the position of the plane in real space is estimated by a method such as the least-squares method using the distance images corresponding to the plane among the distance images of the rectangular regions 8, and the distance images of the rectangular regions 8 not output from the distance measurement unit 11 at the measurement reference time t0 are interpolated, so that objects can be reliably detected from the change in distance over time.

[0034] (Embodiment 2) FIG. 2 is a block diagram of an object detection apparatus according to the second embodiment of the present invention; the operation of this object detection apparatus is described below. Reference numerals identical to those in FIG. 1 denote identical parts.

[0035] The matching processing unit 18 first, like the matching processing unit 7 described above, divides the left image 2, as shown in FIG. 4, into (M × N) rectangular regions 8, divides each rectangular region 8 into (m × n) pixels 9, and then detects, for each rectangular region 8, the pixel 9 having the i-th brightness Li among the (m × n) pixels 9.

[0036] Also, within the search range 10 of the right image 5 in which an image similar to that of the rectangular region 8 of the left image 2 may exist, each time a search rectangular region of (m × n) pixels is moved one pixel in the horizontal direction, the pixel having the i-th brightness Ri among the pixels of the search rectangular region is detected.

[0037] Then the brightness of each pixel 9 of the rectangular region 8 of the left image 2 is compared, pixel by pixel, with that of the pixel at the corresponding position in the search rectangular region of the right image 5, the similarity evaluation value C between the image of the rectangular region 8 of the left image 2 and the image of the search rectangular region of the right image 5 is obtained by the formula of (Equation 1), the position of the search rectangular region in the right image 5 at which the similarity evaluation value C of the two images is minimized is judged to be the region corresponding to the rectangular region 8 of the left image 2 (hereinafter the "corresponding region"), and the parallax d at the measurement reference time t0 is obtained by the formula of (Equation 2) from the shift between the coordinate X of the rectangular region 8 of the left image 2 and the coordinate XR of the corresponding region of the right image 5, and is output.

[0038] Note that the operation of obtaining the similarity evaluation value C between the image of the rectangular region 8 of the left image 2 and the image of the search rectangular region of the right image 5, and then obtaining the parallax d from the shift between the coordinate X of the rectangular region of the left image 2 and the coordinate XR of the corresponding region of the right image 5, is performed sequentially for the rectangular regions 8 at all coordinates of the left image 2.

[0039] Then, based on the parallax d obtained for each rectangular region 8 of the left image 2 by the matching processing unit 18, the distance measurement unit 11 obtains the distance K(X, Y) [where 0 < X ≤ M, 0 < Y ≤ N] from the imaging device 1 or the imaging device 4 to the object, measured for each rectangular region 8, by the generally known formula of (Equation 3), converts it into an image showing the distance for each rectangular region 8 (hereinafter the "distance image"), and outputs it.

[0040] The plane estimation unit 19 then estimates the position of the plane 16 in real space by a method such as the least-squares method, using the distance images corresponding to the plane 16 among the distance images of the rectangular regions 8 output from the distance measurement unit 11 at the measurement reference time t0, thereby interpolates the distance images of those rectangular regions 8 of the plane that were not output from the distance measurement unit 11 at the measurement reference time t0, outputs the result, and stores it in the initial distance image memory 20.

[0041] At measurement times t (t > t0) after the measurement reference time t0, the matching processing unit 18, with the parallax search range limited to the position of the plane 16 in real space stored in the initial distance image memory 20, obtains, as described above, the similarity evaluation value C between the image of the rectangular region 8 of the left image 2 and the image of the search rectangular region of the right image 5, obtains the parallax d at the measurement time t from the shift between the coordinate X of the rectangular region of the left image 2 and the coordinate XR of the corresponding region of the right image 5, and outputs it.
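The limited search of paragraph [0041] can be sketched per rectangular region: since the plane 16 is the farthest surface on which an object can appear, its parallax is the smallest disparity worth trying, and candidates beyond it are skipped. Deriving the plane's parallax as baseline × focal length / K mirrors the distance formula assumed earlier; the function name and the integer-disparity representation are illustrative assumptions.

```python
import math

# Hedged sketch of the restricted parallax search range for one rectangular
# region: search only from the estimated plane's parallax (farthest allowed
# position) up to d_max (nearest distance of interest).

def candidate_disparities(plane_distance_k, baseline_2a, focal_f, d_max):
    """Integer disparities to try for one region; positions beyond the
    plane 16 (smaller disparities) are excluded from the search."""
    d_plane = baseline_2a * focal_f / plane_distance_k  # parallax at the plane
    return list(range(int(math.floor(d_plane)), d_max + 1))
```

Compared with searching the full uniform range [dmin, dmax] of paragraph [0016], each region's candidate list shrinks by the disparities that would correspond to positions behind the plane, which is where the computational saving of this embodiment comes from.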

[0042] Based on the parallax d obtained for each rectangular region 8 of the left image 2 by the matching processing unit 18, the distance measurement unit 11 obtains the distance K(X, Y) [where 0 < X ≤ M, 0 < Y ≤ N] from the imaging device 1 or the imaging device 4 to the object, measured for each rectangular region 8, by the generally known formula of (Equation 3), converts it into an image showing the distance for each rectangular region 8 (hereinafter the "distance image"), outputs it, and stores it in the distance image memory 13.

[0043] The distance difference detection unit 14 then compares, in coordinate order, the rectangular regions 8 at the same coordinates in the initial distance image stored in the initial distance image memory 20 and in the distance image stored in the distance image memory 13, identifies the rectangular regions 8 at coordinates where the distance difference between the initial distance image and the distance image is equal to or greater than the threshold TH, judges that a new object has appeared, or an existing object has disappeared, in the rectangular regions 8 at those coordinates, and outputs an image 15 of the detected object.

【0044】As described above, according to the present embodiment, distance measurement is not performed for objects located farther away than the estimated plane 16, so the amount of computation in the association processing unit 18 can be greatly reduced.

【0045】The above examples estimate the position of the plane 16 in the plane estimation units 17 and 19 after the parallax d output from the association processing unit 7 or 18 has been converted into a distance image by the distance measuring unit 11; however, the plane estimation units 17 and 19 may instead estimate the position of the plane 16 directly from the parallax d output from the association processing unit 7 or 18, without conversion into a distance image by the distance measuring unit 11.

【0046】

[Effects of the Invention] As described above, the present invention provides the following effects. By estimating the position of a plane such as a road, floor, or wall, and complementing the distances on the plane where distance measurement by matching is difficult, distance images are obtained for all rectangular areas of the plane, so that objects can be detected stably and reliably.
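The complementing step — estimating the plane from the blocks where a distance was measured and filling in the blocks where it was not — can be sketched as follows. The patent family estimates the plane position with a Hough transform; an ordinary least-squares plane fit is used below as a simpler illustrative stand-in, and all names are assumptions.

```python
import numpy as np

def complement_plane(dist, measured):
    """Fit a plane K = a*X + b*Y + c by least squares to the rectangular
    areas where a distance was measured (measured[Y, X] is True), then
    fill the unmeasured areas with the distance to the fitted plane."""
    ys, xs = np.nonzero(measured)
    A = np.column_stack([xs, ys, np.ones(len(xs))])
    coef, *_ = np.linalg.lstsq(A, dist[ys, xs], rcond=None)
    out = dist.copy()
    my, mx = np.nonzero(~measured)              # blocks with no measurement
    out[my, mx] = coef[0] * mx + coef[1] * my + coef[2]
    return out
```

Measured blocks are left untouched; only the holes in the distance image are filled from the fitted plane.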

【0047】Further, by estimating the position of a plane such as a road, floor, or wall, and limiting the parallax search range to positions in front of that plane, the amount of computation in the association processing unit, which requires the most computation in an object detection apparatus using the stereo image processing method, can be greatly reduced.

[Brief Description of the Drawings]

FIG. 1 is a configuration diagram of an object detection apparatus according to the first embodiment of the present invention.

FIG. 2 is a configuration diagram of an object detection apparatus according to the second embodiment of the present invention.

FIG. 3 is a configuration diagram of a conventional object detection apparatus.

FIG. 4 is a diagram showing the relationship between rectangular areas and pixels in the left image.

FIG. 5 is an explanatory diagram showing the search range in the right image.

FIG. 6 is a diagram illustrating the parallax search range when the plane is perpendicular to the optical axis of the imaging device.

FIG. 7 is a diagram illustrating the parallax search range when the plane is inclined with respect to the optical axis of the imaging device.

[Explanation of Symbols]

1, 4 ... imaging device; 2 ... left image; 3 ... left image memory; 5 ... right image; 6 ... right image memory; 7, 18 ... association processing unit; 8 ... rectangular area; 9 ... pixel; 10 ... search range; 11 ... distance measuring unit; 12, 20 ... initial distance image memory; 13 ... distance image memory; 14 ... distance difference detection unit; 15 ... object detection image; 16 ... plane; 17, 19 ... plane estimation unit.

Claims (4)

[Claims]
1. An object detection apparatus using a stereo image distance measuring method that measures the distance to an object by the principle of triangulation from the deviation of object images in a plurality of images respectively output from a plurality of imaging devices arranged at a predetermined interval, wherein, after the position of a common plane is estimated from the distances to the object measured for each of the rectangular areas formed by dividing the images horizontally and vertically, the distance of any rectangular area of the plane in which the distance could not be measured is computed and complemented based on the distances to the plane of the rectangular areas in which the distance could be measured.
2. The object detection apparatus according to claim 1, wherein the distances to the object measured for each of the rectangular areas formed by dividing the images horizontally and vertically are converted into a distance image indicating the distances, the position of the common plane is estimated from the distance image, and the distance image of any rectangular area of the plane for which a distance image could not be obtained is computed and complemented based on the distance images, up to the plane, of the rectangular areas for which a distance image could be obtained.
3. The object detection apparatus according to claim 1, wherein, after the position of the common plane is estimated from the distances to the object initially measured for each of the rectangular areas, subsequent measurements of the distance to the object are performed only within a parallax search range between the imaging devices and the plane.
4. The object detection apparatus according to claim 1, wherein, after the position of the common plane is estimated from the distances to the object initially measured for each of the rectangular areas, the deviation of the object images is detected within a parallax search range limited to between the imaging devices and the plane, and the distance to the object is measured.
JP31267795A 1995-09-12 1995-11-30 Object detection apparatus Pending JPH09152334A (en)

Priority Applications (5)

Application Number Priority Date Filing Date Title
JP31267795A JPH09152334A (en) 1995-11-30 1995-11-30 Object detection apparatus
US08/704,895 US5825915A (en) 1995-09-12 1996-08-30 Object detecting apparatus in which the position of a planar object is estimated by using hough transform
CA002184561A CA2184561C (en) 1995-09-12 1996-08-30 Object detecting apparatus in which the position of a planar object is estimated by using hough transform
EP96114009A EP0762326B1 (en) 1995-09-12 1996-09-02 Object detecting apparatus in which the position of a planar object is estimated by using Hough transform
DE69625423T DE69625423T2 (en) 1995-09-12 1996-09-02 Device for recognizing an object in which the position of a flat object is estimated using a Hough transformation

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP31267795A JPH09152334A (en) 1995-11-30 1995-11-30 Object detection apparatus

Publications (1)

Publication Number Publication Date
JPH09152334A true JPH09152334A (en) 1997-06-10

Family

ID=18032103

Family Applications (1)

Application Number Title Priority Date Filing Date
JP31267795A Pending JPH09152334A (en) 1995-09-12 1995-11-30 Object detection apparatus

Country Status (1)

Country Link
JP (1) JPH09152334A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003132349A (en) * 2001-10-24 2003-05-09 Matsushita Electric Ind Co Ltd Drawing device
KR20110084028A (en) * 2010-01-15 2011-07-21 삼성전자주식회사 Apparatus and method for measuring distance using image data
JP2014142832A (en) * 2013-01-24 2014-08-07 Canon Inc Image processing apparatus, control method of image processing apparatus, and program


Similar Documents

Publication Publication Date Title
US20210190497A1 (en) Simultaneous location and mapping (slam) using dual event cameras
US9886649B2 (en) Object detection device and vehicle using same
US7769227B2 (en) Object detector
JP3895238B2 (en) Obstacle detection apparatus and method
US5825915A (en) Object detecting apparatus in which the position of a planar object is estimated by using hough transform
US7471809B2 (en) Method, apparatus, and program for processing stereo image
EP0686942A2 (en) Stereo matching method and disparity measuring method
JPH11252587A (en) Object tracking device
JPH09226490A (en) Detector for crossing object
JP3710548B2 (en) Vehicle detection device
JP2015206798A (en) distance calculation device
KR20140056790A (en) Apparatus for image recognition and method thereof
JP3961584B2 (en) Lane marking detector
JP2000207693A (en) Obstacle detector on vehicle
US20170228602A1 (en) Method for detecting height
JP2001351193A (en) Device for detecting passenger
JP7445501B2 (en) Image processing device and image processing method
JP6188860B1 (en) Object detection device
JPH1144533A (en) Preceding vehicle detector
US20130142388A1 (en) Arrival time estimation device, arrival time estimation method, arrival time estimation program, and information providing apparatus
JPH09152334A (en) Object detection apparatus
JPH1096607A (en) Object detector and plane estimation method
JPH10187974A (en) Physical distribution measuring instrument
JP2000331169A (en) Method and device for measuring motion vector of image
JP2000259997A (en) Height of preceding vehicle and inter-vehicle distance measuring device