JPH01214982A - Binocular parallax detecting system - Google Patents

Binocular parallax detecting system

Info

Publication number
JPH01214982A
JPH01214982A (application number JP63039483A)
Authority
JP
Japan
Prior art keywords
left pictures
absolute value
binocular parallax
power spectrum
right images
Prior art date
Legal status
Granted
Application number
JP63039483A
Other languages
Japanese (ja)
Other versions
JP2726814B2 (en)
Inventor
Masahide Nomura
野村 正英
Current Assignee
National Institute of Advanced Industrial Science and Technology AIST
Original Assignee
Agency of Industrial Science and Technology
Priority date
Filing date
Publication date
Application filed by Agency of Industrial Science and Technology
Priority to JP63039483A
Publication of JPH01214982A
Application granted
Publication of JP2726814B2
Anticipated expiration
Expired - Lifetime

Landscapes

  • Testing, Inspecting, Measuring Of Stereoscopic Televisions And Televisions (AREA)
  • Image Analysis (AREA)
  • Image Processing (AREA)

Abstract

PURPOSE: To determine binocular parallax accurately by performing a local frequency analysis with a Gaussian-function window at each point of the left and right images and comparing the absolute value or power spectrum of the frequency components between the two images. CONSTITUTION: To match the individual points of the left and right images, the extracted feature must be invariant with respect to position. The absolute value or power spectrum obtained by a local frequency analysis of the image is used as such a feature. That is, a local frequency analysis with a Gaussian-function window is performed at each point of the two images, and the absolute value or power spectrum is compared between them. The binocular parallax between the left and right images can thus be determined accurately.

Description

DETAILED DESCRIPTION OF THE INVENTION

(Field of Industrial Application) The present invention relates to a binocular stereoscopic vision system for obtaining three-dimensional depth information from the input images of two cameras.

(Problem to Be Solved by the Invention) To recover the three-dimensional depth of the objects appearing in the images obtained from left and right cameras, using those images alone, a correspondence must be established between the individual points of the two images. This first requires extracting suitable features at each point of the left and right images. The binocular disparity detection method most widely known at present detects, in each image, a feature corresponding to edges, namely the points where applying the ΔG (Laplacian-of-Gaussian) operation to the image yields zero (zero crossings), and uses these points to establish the correspondence between the left and right images.

However, with such a method, points at which both the left and the right image have a zero crossing can be matched, but every other point must be handled by interpolation.

(Means for Solving the Problem) In a method for determining the binocular disparity at each point of the left and right images input from two cameras, a local frequency analysis with a Gaussian-function window is performed at each point of the two images, and its absolute value or power spectrum is compared between the left and right images to determine the binocular disparity between them.

(Operation) To match the individual points of the left and right images, the extracted features must be invariant with respect to position. The absolute value and the power spectrum obtained by a local frequency analysis of the image are such features, so using them to match the left and right images is effective.
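The position invariance appealed to here is, at bottom, the Fourier shift property: translating a signal multiplies each spectral coefficient by a unit-magnitude phase factor, leaving the amplitude and power spectra unchanged (for the patent's windowed local analysis this holds approximately, for shifts small relative to the window). A minimal numpy illustration, not taken from the patent:

```python
import numpy as np

# A 1-D signal and a circularly shifted copy of it.
rng = np.random.default_rng(0)
signal = rng.standard_normal(64)
shifted = np.roll(signal, 5)

# A shift multiplies each Fourier coefficient by a unit-magnitude
# phase factor, so the amplitude (and power) spectrum is unchanged.
amp_original = np.abs(np.fft.fft(signal))
amp_shifted = np.abs(np.fft.fft(shifted))

print(np.allclose(amp_original, amp_shifted))  # True
```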

To determine the binocular disparity accurately, the positions at which the features agree must be localized accurately. For this purpose, the narrower the spatial width δx of the window function used in the local frequency analysis, the better.

However, the narrower the spatial width of the window function, the larger the spread δk, in spatial-frequency space, of the windowed image becomes. In the matching of the left and right images this increases the number of false correspondences. For determining binocular disparity, therefore, the window function must be chosen so that δx and δk are small at the same time. When the window function of the local frequency analysis is a Gaussian function, the product δx·δk is minimal, so a Gaussian window makes δx and δk simultaneously as small as possible.
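The claim that a Gaussian minimizes δx·δk is the uncertainty relation for window functions: with RMS widths and angular spatial frequency, δx·δk ≥ 1/2, with equality exactly for the Gaussian. The following numerical check is an illustration added here, not part of the patent; the grid sizes and window widths are arbitrary choices:

```python
import numpy as np

def spread(weights, coords):
    """RMS width of a distribution given as nonnegative weights over coords."""
    p = weights / weights.sum()
    mean = (p * coords).sum()
    return np.sqrt((p * (coords - mean) ** 2).sum())

n = 4096
x = np.linspace(-50.0, 50.0, n)
dx_step = x[1] - x[0]

# Angular spatial frequencies corresponding to the FFT bins.
k = 2 * np.pi * np.fft.fftshift(np.fft.fftfreq(n, d=dx_step))

def uncertainty_product(window):
    """delta_x * delta_k for a window, measured from |w|^2 and |W|^2."""
    W = np.fft.fftshift(np.fft.fft(window))
    return spread(np.abs(window) ** 2, x) * spread(np.abs(W) ** 2, k)

gauss = np.exp(-x ** 2 / 2.0)              # Gaussian window, sigma = 1
rect = (np.abs(x) <= 1.0).astype(float)    # rectangular window, half-width 1

print(uncertainty_product(gauss))  # ~0.5, the theoretical minimum
print(uncertainty_product(rect))   # noticeably larger
```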

(Embodiment) Fig. 1 shows an embodiment realizing the principle of the present invention described above. In Fig. 1, 101, 102, 103, 104, and 105 are frame memories that store image data and the results of processing it, 106 is a memory for the masks used in the convolutions, and 111, 112, 113, and 114 are arithmetic units.

Next, the operation of the embodiment is described. The left and right multi-valued image data aL(x, y) and aR(x, y) stored in frame memory 101 are subjected to a local frequency analysis by arithmetic unit 111 using the masks Mc(x, y, θ, k) and Ms(x, y, θ, k) stored in mask memory 106, and the results fL(x, y, θ, k) and fR(x, y, θ, k) are written to frame memory 102.
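In modern terminology the mask pair used by arithmetic unit 111 amounts to an even/odd Gabor filter pair: a Gaussian window multiplying the cosine and sine of a plane wave with orientation θ and spatial frequency k. The patent text does not reproduce the mask formulas, so the sketch below is a plausible reconstruction, not the patent's definition; the function names, mask size, and window width σ are assumptions:

```python
import numpy as np

def gabor_masks(size, theta, k, sigma):
    """Even/odd Gabor masks: a Gaussian window times the cosine and sine
    of a plane wave with orientation theta and angular frequency k."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1].astype(float)
    window = np.exp(-(x ** 2 + y ** 2) / (2 * sigma ** 2))
    phase = k * (x * np.cos(theta) + y * np.sin(theta))
    return window * np.cos(phase), window * np.sin(phase)

def local_spectrum_amplitude(image, theta, k, sigma=2.0, size=9):
    """Magnitude of the local frequency component at (theta, k), per pixel."""
    mc, ms = gabor_masks(size, theta, k, sigma)
    # FFT-based circular convolution; adequate for a sketch.
    F = np.fft.fft2(image)
    pad = lambda m: np.fft.fft2(m, s=image.shape)
    even = np.real(np.fft.ifft2(F * pad(mc)))
    odd = np.real(np.fft.ifft2(F * pad(ms)))
    return np.hypot(even, odd)  # |f(x, y, theta, k)|
```

On a horizontal sinusoidal grating, the amplitude at the matching (θ, k) is much larger than at the orthogonal orientation, which is the channel selectivity the matching stage relies on.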

Arithmetic unit 112 then computes the absolute values FL(x, y, θ, k) and FR(x, y, θ, k) of fL and fR and writes them to frame memory 103, from which arithmetic unit 113 computes the correlation P(x, y, dx, dy, θ, k) as follows.

P(x, y, dx, dy, θ, k) = [FL(x + dx/2, y + dy/2, θ, k) − FR(x − dx/2, y − dy/2, θ, k)]²

This P is written to frame memory 104. The local disparity D(x, y) is then obtained from P by arithmetic unit 114 as the vector (dx, dy) that minimizes the quantity expressed below, and is written to frame memory 105.

Fig. 2 shows a variant of the embodiment of Fig. 1 that achieves higher speed by computing in parallel on multiple arithmetic units. Each arithmetic unit operates simultaneously on its own region of the two-dimensional image data stored in the frame memories and of the intermediate processing results.

(Effects of the Invention) According to the present invention, binocular disparity can be determined with high accuracy.

[Brief Description of the Drawings]

Fig. 1 is a block diagram showing an embodiment of the binocular disparity detection method of the present invention, and Fig. 2 is a block diagram showing an embodiment of the method that uses many arithmetic units. In the figures, 101, 102, 103, 104, 105, 201, 202, 203, 204, and 205 are frame memories; 111, 112, 113, 114, 211, 212, 213, and 214 are arithmetic units; and 106 and 206 are mask memories.

Kozo Iizuka, Director-General of the Agency of Industrial Science and Technology

Fig. 1   Fig. 2

Claims (1)

[Claims] A binocular disparity detection method characterized in that, for each point of the left and right images input from two cameras, a local frequency analysis with a Gaussian-function window is performed, and the absolute value or power spectrum of the frequency components is compared between the left and right images to determine the binocular disparity between the left and right images.
JP63039483A 1988-02-24 1988-02-24 Binocular parallax detection method Expired - Lifetime JP2726814B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP63039483A JP2726814B2 (en) 1988-02-24 1988-02-24 Binocular parallax detection method


Publications (2)

Publication Number Publication Date
JPH01214982A (en) 1989-08-29
JP2726814B2 JP2726814B2 (en) 1998-03-11

Family

ID=12554306

Family Applications (1)

Application Number Title Priority Date Filing Date
JP63039483A Expired - Lifetime JP2726814B2 (en) 1988-02-24 1988-02-24 Binocular parallax detection method

Country Status (1)

Country Link
JP (1) JP2726814B2 (en)

Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS5325744A (en) * 1976-08-23 1978-03-09 Ngk Spark Plug Co Ltd Low flame extinguishing spark plug
JPS58181179A (en) * 1982-04-15 1983-10-22 Toshiba Corp Extracting device of picture feature




Legal Events

Date Code Title Description
EXPY Cancellation because of completion of term