JPH02216413A - Range finding method - Google Patents

Range finding method

Info

Publication number
JPH02216413A
Authority
JP
Japan
Prior art keywords
feature point
parallax
image
images
distance
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP3809589A
Other languages
Japanese (ja)
Inventor
Toru Kaneko
透 金子
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Nippon Telegraph and Telephone Corp
Original Assignee
Nippon Telegraph and Telephone Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nippon Telegraph and Telephone Corp filed Critical Nippon Telegraph and Telephone Corp
Priority to JP3809589A priority Critical patent/JPH02216413A/en
Publication of JPH02216413A publication Critical patent/JPH02216413A/en
Pending legal-status Critical Current


Abstract

PURPOSE: To measure the distance from a photographing position to the surface of an object with high accuracy by first determining the parts to be matched in pixel-sized position units through feature point detection, and then obtaining the parallax of those parts with precision finer than a pixel by computing local correlation values.

CONSTITUTION: Using the principle of stereoscopic vision measurement, the left and right images that form the stereo pair are fed to input units 1, 1' and stored in storage units 2, 2' respectively. Feature point detection units 3, 3' each detect feature points and obtain their attributes and position coordinates on the image. A local correlation computation unit 5 derives the local correlation values of the left and right images from the position coordinates matched through the feature points, and a parallax computation unit 6 calculates the parallax of the original image pair by interpolation based on the obtained correlation values. A distance computation unit 7 calculates the distance to the photographed object using this parallax information.

Description

DETAILED DESCRIPTION OF THE INVENTION

[Field of Industrial Application]

The present invention relates to a distance measuring method, and in particular to a distance measuring method that applies the principle of stereoscopic vision to measure the distance to objects in indoor and outdoor scenes and the like. For machine systems, it is desirable to be able to input three-dimensional shape data of objects efficiently and automatically.

[Prior Art]

Three-dimensional distance information is obtained from a pair of stereo images of a target object photographed from left and right viewpoints by first finding the corresponding parts between the left and right images and computing the difference of their positions on the images (the parallax), and then substituting this parallax into a triangulation formula.

FIG. 1 shows the principle of stereoscopic measurement in top view. L1 and L2 are the lenses of imaging cameras that have the same focal length and whose optical axes are arranged parallel to each other; E1 and E2 are the imaging surfaces of lenses L1 and L2; C1 and C2 are the points where the optical axis of each lens intersects imaging surfaces E1 and E2 respectively.

P is the point to be measured, and Q1 and Q2 are the projections of point P onto imaging surfaces E1 and E2. z is the depth distance of point P as seen from the lenses, f is the distance between each lens and its imaging surface, d is the distance between the centers of lenses L1 and L2, and u is the displacement of point P measured from the midpoint M between the two lens centers, along the line C1C2 in the direction from C1 to C2. x1 and x2 are the coordinates of points Q1 and Q2 on x-axes set along the line C1C2 in the direction from C2 to C1, with origins at C1 and C2 respectively. From the figure,

x1 / f = (u + d/2) / z   (1)
x2 / f = (u - d/2) / z   (2)

and rearranging these gives

z = d * f / (x1 - x2)   (3)

Therefore, if a pair of coordinates (x1, x2) representing corresponding points is detected in the left and right images, the depth distance of that point can be obtained. The difference Δx = x1 - x2 between the coordinates x1 and x2 is called the parallax.
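To make equation (3) concrete, here is a minimal Python sketch of the triangulation step; the function name, parameter names, and the numerical values in the usage lines are illustrative assumptions, not taken from the patent.

```python
def depth_from_disparity(x1, x2, f, d):
    """Depth z from equation (3): z = d * f / (x1 - x2).

    x1, x2 : x-coordinates of the corresponding points in the left and
             right images (same units as f, e.g. pixels)
    f      : lens-to-imaging-surface distance (focal length)
    d      : baseline, the distance between the two lens centers
    """
    parallax = x1 - x2  # the disparity Δx
    if parallax <= 0:
        raise ValueError("corresponding points must yield positive parallax")
    return d * f / parallax

# Hypothetical values: f = 800 px, baseline d = 0.12 m, disparity 4 px
z = depth_from_disparity(x1=412.0, x2=408.0, f=800.0, d=0.12)
print(f"depth = {z:.2f} m")  # 0.12 * 800 / 4 = 24.00 m
```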

[Problems to be Solved by the Invention]

The problem, then, is how to obtain this parallax information with high accuracy. Conventional methods of this kind include (1) methods that place a moving computation window over the stereo image pair and compute its local correlation to find corresponding pixels, and (2) methods that extract feature points (edges and the like) from the stereo image pair and find the correspondence between them. The former method takes in information from a somewhat spread-out region (the extent of the window), so near the edge of an object that occludes the background, for example, the portion lying inside the object side of the edge and the portion protruding onto the background side are assigned the same parallax. In the latter method, the positions of feature points are given in the two-dimensional quantization unit of the image (the pixel), so parallax detection with precision finer than this unit is impossible.

[Means for Solving the Problems]

The present invention aims to solve the respective problems of the above two methods while exploiting the strengths of both. It is a method characterized in that the parts to be matched are first determined in pixel-sized position units by feature point detection, and the parallax of those parts is then obtained with precision finer than a pixel by computing local correlation values. The invention is described below with reference to the drawings.

FIG. 2 shows one embodiment of the present invention, in which 1 and 1' are image input units, 2 and 2' are image storage units, 3 and 3' are feature point detection units, 4 is a corresponding feature point detection unit, 5 is a local correlation computation unit, 6 is a parallax computation unit, and 7 is a distance computation unit. The operation is as follows. First, the left and right images that are to form the stereo pair are input through the image input units 1 and 1', each consisting of an image input device such as a television camera, and the input images are stored in the image storage units 2 and 2', each consisting of a frame memory or the like.

The feature point detection units 3 and 3' perform feature point detection on each of the stored images and obtain the attributes of the feature points and their position coordinates on the image. A concrete example of a feature point is an edge, that is, a discontinuity in the gray level or color of the image; its attributes include its inclination angle in the image plane and its gray-level gradient. Edges can be found with known techniques such as those given in Section 12.2 of the "Image Processing Handbook" (edited by Morio Onoe et al., published by Shokodo Co., Ltd.). The detected feature points are then detected as mutually corresponding pairs in the left and right images by the corresponding feature point detection unit 4. As a concrete method of detecting corresponding pairs, a known technique such as that shown in Seki, "Stereo Matching Using Segment Shape Information" (IEICE Technical Report PRU-84, 1988) can be used. The local correlation computation unit 5 fixes the correlation window for the left image at one of the matched position coordinates, for example at the feature point position in the left image, and computes cross-correlation values while moving the correlation window for the right image, in the vicinity of the corresponding feature point position in the right image, in the direction parallel to the line C1C2 in FIG. 1. The parallax computation unit 6 computes the parallax from the obtained correlation values using interpolation. The distance computation unit 7 applies this parallax information to equation (3) to compute the distance to the measurement target point.
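As an illustration of the search performed by the local correlation computation unit 5, the sketch below fixes a window at the left-image feature point and slides an equal-sized window horizontally (parallel to the line C1C2) around the corresponding right-image position. The patent does not prescribe a particular correlation measure; normalized cross-correlation is assumed here, and all function and parameter names are hypothetical.

```python
import numpy as np

def ncc(a, b):
    """Normalized cross-correlation of two equal-sized patches."""
    a = a - a.mean()
    b = b - b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    return float((a * b).sum() / denom) if denom > 0 else 0.0

def correlation_profile(left, right, x_left, y, x_right, half=3, search=2):
    """Fix a (2*half+1)-square window at the left feature point (x_left, y),
    then slide an equal-sized window over x_right - search .. x_right + search
    in the right image along the scanline (parallel to C1C2 in FIG. 1).
    Returns the candidate x positions and their correlation values."""
    win_l = left[y - half:y + half + 1, x_left - half:x_left + half + 1]
    xs, scores = [], []
    for x in range(x_right - search, x_right + search + 1):
        win_r = right[y - half:y + half + 1, x - half:x + half + 1]
        xs.append(x)
        scores.append(ncc(win_l, win_r))
    return xs, scores
```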

FIG. 3 shows an example of the computation principle used in the parallax computation unit 6 of FIG. 2. For a point with x-coordinate x1 in the left image, neighboring x-coordinates p, q, r are found around the x-coordinate x2 of the corresponding point in the right image, as follows:

Sq = max(Sp, Sq, Sr)
Sp = S(p, x1)
Sq = S(q, x1)
Sr = S(r, x1)
p = q - 1
r = q + 1

Here S(m, n) denotes the cross-correlation value obtained when correlation windows are placed at x-coordinate m in the right image and x-coordinate n in the left image, and q is the integer position whose correlation value is largest among the three. Ideally q = x2, but because of problems such as positional accuracy in edge detection, q does not necessarily coincide with x2. Then, in a plane whose vertical axis is the correlation value and whose horizontal axis is the right-image x-coordinate, the parabola passing through the three points (p, Sp), (q, Sq), (r, Sr) is found, and with x2' denoting the x-coordinate at which the correlation value is maximum on that parabola, Δx = x1 - x2' is taken as the parallax to be found. In this way the parallax is obtained with precision finer than one pixel. Here x2' can be expressed in terms of p, q, r, Sp, Sq, Sr as

x2' = q + (Sp - Sr) / (2 (Sp - 2 Sq + Sr)).
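The parabola construction above reduces to the standard three-point vertex formula. A minimal sketch, assuming p = q - 1 and r = q + 1 as in the text (function name and numerical values are illustrative):

```python
def subpixel_peak(q, sp, sq, sr):
    """Vertex of the parabola through (q-1, sp), (q, sq), (q+1, sr).

    sq should be the largest of the three correlation values; the vertex
    x2' = q + (sp - sr) / (2 * (sp - 2*sq + sr)) then lies within one
    pixel of q."""
    denom = sp - 2.0 * sq + sr
    if denom == 0.0:  # flat correlation profile: no refinement possible
        return float(q)
    return q + (sp - sr) / (2.0 * denom)

# Sub-pixel parallax Δx = x1 - x2', with hypothetical correlation values
x2_refined = subpixel_peak(q=408, sp=0.82, sq=0.95, sr=0.90)  # 408.222...
parallax = 412.0 - x2_refined
```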

In the description of the embodiment of FIG. 2 above, two stereo images are used to find the correspondence of feature points such as edges. To increase the reliability of corresponding feature point detection, it is also possible to first find the corresponding feature point positions in pixel units by a multi-view stereo method using three or more stereo images (for example, Kaneko, "A Study on Three-Dimensional Shape Input Using Multi-Viewpoint Images", ITE Technical Report, ICS88-30, 1988) and then perform the parallax computation shown in FIG. 3 (in this case, two suitable images, for example the leftmost and rightmost images, are selected from the multiple images and the local correlation is taken between them). Also, while the above description measures distance only for parts that appear as feature points on the object, the distance of parts that do not appear as feature points (flat parts with no change in gray level or color) can be obtained by interpolation using the information obtained at the feature point parts.
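The patent leaves the interpolation scheme for featureless regions unspecified; as one hedged possibility, the sketch below linearly interpolates depth along a scanline between the values recovered at feature points (names and data are hypothetical).

```python
import numpy as np

def fill_scanline(depths):
    """Linearly interpolate missing depths (NaN) along one scanline,
    using the values measured at feature points as anchors."""
    depths = np.asarray(depths, dtype=float)
    known = ~np.isnan(depths)
    idx = np.arange(depths.size)
    return np.interp(idx, idx[known], depths[known])

row = [np.nan, 24.0, np.nan, np.nan, 30.0, np.nan]
print(fill_scanline(row))  # [24. 24. 26. 28. 30. 30.]
```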

[Effects of the Invention]

As described above, according to the present invention, the distance from the photographing position to the surface of an object can be measured with high accuracy and high reliability.

[Brief Description of the Drawings]

FIG. 1 is a diagram explaining the principle of distance measurement by stereoscopic vision, FIG. 2 shows the configuration of one embodiment of the present invention, and FIG. 3 is an explanatory diagram illustrating the operation of the parallax computation unit of the present invention. In the figures, 1 and 1' are image input units, 2 and 2' are image storage units, 3 and 3' are feature point detection units, 4 is a corresponding feature point detection unit, 5 is a local correlation computation unit, 6 is a parallax computation unit, and 7 is a distance computation unit.

Patent applicant: Nippon Telegraph and Telephone Corporation

Claims (1)

[Claims] A distance measuring method comprising: a step of detecting feature points in a pair of images forming a stereoscopic image; a step of detecting, from the sets of feature points detected in each image, corresponding feature points that form pairs between the images; a step of taking the local correlation of the original image pair at the corresponding feature point positions; a step of obtaining parallax information of the original image pair by applying interpolation to the local correlation values; and a step of calculating the distance to the photographed object using the parallax information, wherein the distance to the photographed object is calculated with high precision using parallax information having precision finer than a pixel, which is the two-dimensional quantization unit of an image.
JP3809589A 1989-02-17 1989-02-17 Range finding method Pending JPH02216413A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP3809589A JPH02216413A (en) 1989-02-17 1989-02-17 Range finding method

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP3809589A JPH02216413A (en) 1989-02-17 1989-02-17 Range finding method

Publications (1)

Publication Number Publication Date
JPH02216413A true JPH02216413A (en) 1990-08-29

Family

ID=12515916

Family Applications (1)

Application Number Title Priority Date Filing Date
JP3809589A Pending JPH02216413A (en) 1989-02-17 1989-02-17 Range finding method

Country Status (1)

Country Link
JP (1) JPH02216413A (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2020112881A (en) * 2019-01-08 2020-07-27 キヤノン株式会社 Distance calculation device, distance calculation method, program, and storage medium
CN114570495A (en) * 2020-11-17 2022-06-03 株式会社希力卡·装备Nq Analysis device
CN114570495B (en) * 2020-11-17 2023-08-29 株式会社新科集团 Analysis device

Similar Documents

Publication Publication Date Title
JP4198054B2 (en) 3D video conferencing system
CN111192235B (en) Image measurement method based on monocular vision model and perspective transformation
CN110838164B (en) Monocular image three-dimensional reconstruction method, system and device based on object point depth
CN102997891A (en) Device and method for measuring scene depth
CN106225676A (en) Method for three-dimensional measurement, Apparatus and system
CN109974659A (en) A kind of embedded range-measurement system based on binocular machine vision
JPH05303629A (en) Method for synthesizing shape
JP4193342B2 (en) 3D data generator
KR102129206B1 (en) 3 Dimensional Coordinates Calculating Apparatus and 3 Dimensional Coordinates Calculating Method Using Photo Images
JP2005322128A (en) Calibration method for stereo three-dimensional measurement and three-dimensional position calculating method
JP3842988B2 (en) Image processing apparatus for measuring three-dimensional information of an object by binocular stereoscopic vision, and a method for recording the same, or a recording medium recording the measurement program
JP3328478B2 (en) Camera system
KR101995344B1 (en) A dual depth camera module without blind spot
WO2023112971A1 (en) Three-dimensional model generation device, three-dimensional model generation method, and three-dimensional model generation program
JPH02216413A (en) Range finding method
JP2807137B2 (en) 3D shape detection method
JPH11125522A (en) Image processor and method
JPH0875454A (en) Range finding device
JP3912638B2 (en) 3D image processing device
KR20150047604A (en) Method for description of object points of the object space and connection for its implementation
JPH07220113A (en) Image recording/reproducing device
JPH09179998A (en) Three-dimensional image display system
CN112629440A (en) Data fusion method combining luminosity and structured light 3D
JP3525712B2 (en) Three-dimensional image capturing method and three-dimensional image capturing device
CN105551068B (en) A kind of synthetic method of 3 D laser scanning and optical photograph