JPH095050A - Three-dimensional image measuring apparatus - Google Patents

Three-dimensional image measuring apparatus

Info

Publication number
JPH095050A
JPH095050A JP7153361A JP15336195A
Authority
JP
Japan
Prior art keywords
image
measurement
distance
measuring device
dimensional
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP7153361A
Other languages
Japanese (ja)
Other versions
JP3614935B2 (en)
Inventor
Hiroshi Matsuzaki
弘 松崎
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Olympus Corp
Original Assignee
Olympus Optical Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Olympus Optical Co Ltd filed Critical Olympus Optical Co Ltd
Priority to JP15336195A priority Critical patent/JP3614935B2/en
Publication of JPH095050A publication Critical patent/JPH095050A/en
Application granted granted Critical
Publication of JP3614935B2 publication Critical patent/JP3614935B2/en
Anticipated expiration legal-status Critical
Expired - Fee Related legal-status Critical Current


Landscapes

  • Length Measuring Devices With Unspecified Measuring Means (AREA)
  • Measurement Of Optical Distance (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

PURPOSE: To carry out three-dimensional image measurement with high precision and high resolution over a wide range of distances, and to shorten processing time, by combining an active measurement method that emits measurement light with a passive measurement method based on multi-view images.

CONSTITUTION: A laser range finder 1, which performs the active measurement, divides the horizontal and vertical directions into a prescribed number of steps, emits intensity-modulated laser light, detects the phase difference of the light reflected by the object 5 to be measured, and converts that phase difference into distance. Together with the phase data, the intensity of the light entering the light-receiving element is measured simultaneously. In the passive measurement, image pickup apparatuses 2 and 3 for stereo image measurement capture image data of a prescribed number of pixels in the vertical and horizontal directions. A data processing apparatus 4 associates, pixel by pixel, the distance image obtained actively by the range finder 1 with the images obtained passively by the image pickup apparatuses 2 and 3, and obtains a three-dimensional image by complementarily fusing the two distance images.

Description

Detailed Description of the Invention

[0001]

[Industrial Field of Application] The present invention relates to a three-dimensional image measuring apparatus, and more particularly to a three-dimensional image measuring apparatus that obtains three-dimensional information by processing images acquired by a plurality of means, namely an active means and a passive means.

[0002]

[Prior Art] Conventionally, two classes of methods exist for measuring a three-dimensional image: active methods, which irradiate the measurement object with a light beam and measure the reflected light to determine the three-dimensional shape, and passive methods, which capture images and reconstruct the three-dimensional shape from them.

[0003] One active method is the so-called laser range finder. This is a distance measuring device that emits an intensity-modulated laser beam and obtains distance from the phase difference between the light reflected by the target object and the modulation signal; the measuring beam is scanned two-dimensionally and the distance is measured at each point, yielding the three-dimensional shape. Such a device is described in SPIE (Vol. 852, 1987, p. 34) and elsewhere, and is shown in FIG. 5. The intensity-modulated laser light emitted from a laser diode is scanned two-dimensionally by a polygon mirror and an oscillating mirror; the light reflected back from the target object is detected by an APD receiver via the same oscillating mirror and polygon mirror, and the distance to each scanned point on the object is determined from the phase difference between the reflected light and the modulation signal, giving the three-dimensional shape.
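The phase-difference ranging principle described above can be illustrated with a short numeric sketch. This is an illustrative calculation, not taken from the patent; the modulation frequency and phase values are assumed for the example.

```python
import math

C = 299_792_458.0  # speed of light in m/s

def phase_to_distance(delta_phi: float, f_mod: float) -> float:
    """Convert the measured phase lag delta_phi (radians) between the
    emitted and received intensity modulation into a distance.

    The light travels to the object and back, so one full 2*pi of phase
    corresponds to one modulation wavelength of ROUND-TRIP path:
    d = c * delta_phi / (4 * pi * f_mod).
    """
    return C * delta_phi / (4.0 * math.pi * f_mod)

# Example: 10 MHz modulation and a 90-degree phase lag.
print(round(phase_to_distance(math.pi / 2.0, 10e6), 3))  # 3.747 (metres)
```

Note that such a measurement is ambiguous beyond half a modulation wavelength (about 15 m at 10 MHz), which is one reason for combining this active method with passive stereo as the patent goes on to do.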

[0004] There is also a method that projects slit light, observes the slit image formed on the object to be measured, and determines the three-dimensional shape from the amount of distortion of that image. This is described in the Transactions of the Institute of Electronics and Communication Engineers (Vol. J68-D, 1985, p. 1141) and elsewhere.

[0005] A passive method is stereo image measurement, in which a plurality of imaging devices such as CCD cameras are used to determine a three-dimensional shape from multi-view images taken from different viewpoints. As described, for example, in Three-Dimensional Image Measurement (Shokodo, p. 14) and SPIE (Vol. 848, 1987, p. 411), from the intensity values IL(x,y) and IR(x,y) of the images captured by the left and right cameras, the disparity (δx,δy) that minimizes

E(δx,δy) = Σ{IL(x,y) − IR(x−δx, y−δy)}²   (1)

is found for every (x,y). In ordinary measurement the cameras are arranged horizontally side by side, so if the disparity in the y-axis direction is neglected, δy = 0 and the disparity reduces to δx. From that value, the distance d(x,y) at the point (x,y) is obtained from

d(x,y) = (f·L)/(δx·μ)   (2)

where f is the focal length of the imaging lens, L is the baseline length, i.e., the distance between the optical axes of the two imaging lenses, and μ is the pixel pitch of the image sensor such as a CCD.
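The disparity search of equation (1) and the distance conversion of equation (2) can be sketched as follows. This is a minimal window-based sum-of-squared-differences search; the window size, search range, and camera parameters are illustrative assumptions, not values from the patent.

```python
import numpy as np

def ssd_disparity(i_left: np.ndarray, i_right: np.ndarray,
                  x: int, y: int, win: int = 3, max_dx: int = 16) -> int:
    """Find the horizontal disparity dx minimising the sum of squared
    differences between a (2*win+1)-square window of the left image at
    (x, y) and the window of the right image shifted left by dx
    (the delta_y = 0 case described in the text)."""
    ref = i_left[y - win:y + win + 1, x - win:x + win + 1].astype(float)
    best_dx, best_err = 0, float("inf")
    for dx in range(max_dx + 1):
        if x - dx - win < 0:
            break  # window would fall outside the right image
        cand = i_right[y - win:y + win + 1,
                       x - dx - win:x - dx + win + 1].astype(float)
        err = float(((ref - cand) ** 2).sum())
        if err < best_err:
            best_err, best_dx = err, dx
    return best_dx

def disparity_to_distance(dx: int, f: float, L: float, mu: float) -> float:
    """Equation (2): d = (f * L) / (dx * mu), with f the focal length,
    L the baseline and mu the pixel pitch (all in metres)."""
    return (f * L) / (dx * mu)
```

For example, with an assumed 8 mm lens, 100 mm baseline, and 10 µm pixel pitch, a 5-pixel disparity corresponds to a distance of 16 m.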

[0006] There is also an example of attempting to improve reliability by fusing measurements from different kinds of visual sensors, described in the Journal of the Robotics Society of Japan (Vol. 12, 1994, p. 700). That work uses multi-view images and a color image; intermediate results from the corresponding-point search in the multi-view images are used to segment the color image into regions, shortening the overall processing time.

[0007] Furthermore, 3D Image Conference '93, p. 173, reports an example of three-dimensional measurement by fusing a distance image obtained by slit-light projection with stereo images.

[0008]

[Problems to Be Solved by the Invention] However, in the methods described above that actively emit a measuring beam, the intensity of the reflected light decreases as the distance increases, so measurement cannot be performed at distant positions. Furthermore, the measurement characteristics change with the surface condition of the target object: on a surface with poor reflectivity the reflected light is weak and measurement cannot be performed.

[0009] The stereo image measurement technique, being a passive measurement, has the problem that searching for corresponding points between the left and right images is difficult and time-consuming. Moreover, the closer the object, the larger the disparity, and the harder the corresponding-point search becomes.

[0010] The present invention has been made in view of the above problems of the prior art. Its object is to combine a method that actively emits a measuring beam with a passive measurement method based on multi-view images, thereby enabling three-dimensional image measurement with high accuracy and high resolution over a wide range of distances while also shortening the processing time.

[0011]

[Means for Solving the Problems] The three-dimensional image measuring apparatus of the present invention that achieves the above object comprises: associating means for associating, pixel by pixel, a distance image obtained by an active distance-image measuring device, which can obtain a distance image by actively emitting a light beam and receiving its reflected light, with images obtained passively by imaging devices; and fusion processing means for complementarily fusing a distance image derived from the images obtained passively by the imaging devices with the distance image obtained by the active distance-image measuring device, thereby obtaining a three-dimensional image.

[0012] In this case, the associating means may use a reflection-intensity image obtained by the active distance-image measuring device for the association.

[0013] Further, in the fusion processing means, at pixel positions where the intensity of the reflection-intensity image obtained by the active distance-image measuring device is below a predetermined threshold, it is desirable to select the distance derived from the passively obtained images as the value of the three-dimensional image.

[0014] Also, in the fusion processing means, at pixel positions where the intensity of the reflection-intensity image obtained by the active distance-image measuring device is at or above the predetermined threshold, it is desirable, when deriving a distance image from the passively obtained images, to use the distance value measured by the active distance-image measuring device as the initial disparity estimate in the corresponding-point search.

[0015]

[Operation] The reasons for adopting the above configuration, and its operation, are explained below. In a method that actively emits a light beam to measure distance, the measured value is generally expressed as d(φ,θ), as indicated by reference numeral 24 in FIG. 4, where d is the distance and (φ,θ) is the measurement direction, φ being the horizontal angle and θ the vertical angle.

[0016] The passively obtained images, indicated by reference numerals 26 and 27 in FIG. 4, are expressed as IL(x,y) and IR(x,y), where IL and IR are the received light intensities and x and y satisfy the relations x = D((x^2+y^2)^(1/2))·tan φ and y = D((x^2+y^2)^(1/2))·tan θ, D(r) being the distortion function of the imaging lens. Since the active and the passive system generally use different coordinate systems, the two coordinate systems must be brought into agreement when the data are to be processed in a fused manner.
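Under the relation above, a direction (φ, θ) measured by the range finder maps to image coordinates through the lens-distortion function D(r). A minimal sketch, assuming a simple pinhole projection with an optional radial rescaling by D; the function name and the distortion-free default are illustrative, not from the patent.

```python
import math

def direction_to_pixel(phi: float, theta: float, f: float, mu: float,
                       D=lambda r: r):
    """Map a range-finder direction (phi, theta) in radians to image
    coordinates (x, y) in pixels.

    f is the focal length and mu the pixel pitch (same length units),
    so f/mu converts the ideal pinhole projection tan(angle) to pixels.
    D(r) is the lens-distortion function of the text; the identity
    default models a distortion-free lens.
    """
    # Ideal (distortion-free) projection in pixels.
    x0 = (f / mu) * math.tan(phi)
    y0 = (f / mu) * math.tan(theta)
    r = math.hypot(x0, y0)
    if r == 0.0:
        return 0.0, 0.0
    s = D(r) / r  # radial rescaling by the distortion function
    return x0 * s, y0 * s
```

With an 8 mm lens and 10 µm pixels, f/mu = 800, so φ = 0.1 rad maps to x ≈ 80.3 pixels under the distortion-free default.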

[0017] To this end, the directions of the centers of the two fields of view must first be made to coincide. One method is to align the fields of view of the active and passive measuring means mechanically in advance. For the coordinate transformation itself, the aberration characteristics of the imaging optics used in the passive measurement can be measured or simulated beforehand, and the result used to convert (φ,θ) into (x,y).

[0018] Alternatively, (φ,θ) can be converted into (x,y) by correlating the actively and passively obtained images using the image data themselves. In that case, so that a correlation can be taken with the brightness of the multi-view images, the reflection-intensity image I(φ,θ), indicated by reference numeral 25 in FIG. 4, is used as the actively measured image. The correlation can be taken either from the intensity data of the images or from features such as edges.

[0019] Concretely, the data of I(φ,θ) and IR(x,y) are matched: for all (x,y), (x,y) is associated with (φ,θ) so that IR(x,y) corresponds to I(φ,θ). (When the fields of view have been aligned mechanically in advance, (x,y) can simply be substituted for (φ,θ).) Then

E(φ,θ) = Σ{I(φ,θ) − IR(φ−δφ0, θ−δθ0)}²   (3)

is computed, δφ0 and δθ0 are determined so that E(φ,θ) is minimized, and the coordinates are transformed as x = φ−δφ0, y = θ−δθ0. This is carried out for all φ and θ. By this procedure the coordinate systems with coinciding fields of view are unified into (x,y), so that the coordinate transformation is performed at the same time as the centers of the fields of view are brought into agreement. The coordinate-transformed distance image is denoted d′(x,y), the coordinate-transformed reflection-intensity image I′(x,y), and the multi-view images IL(x,y) and IR(x,y).
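The offset search of equation (3) can be sketched as a brute-force minimisation, assuming here, for illustration, that the reflection-intensity image and the camera image have already been resampled onto a common pixel grid so that the offset is an integer pixel shift:

```python
import numpy as np

def find_registration_offset(i_active: np.ndarray, i_camera: np.ndarray,
                             max_shift: int = 4):
    """Brute-force search for the shift (d_phi0, d_theta0) minimising the
    summed squared difference between the range finder's reflection-
    intensity image and one camera image, evaluated over a central
    region that stays inside both images for every candidate shift."""
    h, w = i_active.shape
    m = max_shift
    ref = i_active[m:h - m, m:w - m].astype(float)
    best, best_err = (0, 0), float("inf")
    for dy in range(-m, m + 1):
        for dx in range(-m, m + 1):
            cand = i_camera[m + dy:h - m + dy, m + dx:w - m + dx].astype(float)
            err = float(((ref - cand) ** 2).sum())
            if err < best_err:
                best_err, best = err, (dx, dy)
    return best
```

In practice a hierarchical or feature-based (edge) search, as the text mentions, would replace this exhaustive scan.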

[0020] Since the data d′(x,y) measured by the active means are direct distance measurements, d′(x,y) is reliable within the measurement accuracy. The corresponding-point search in the multi-view images is performed on the basis of this value.

[0021] Further, when the reflection-intensity value I′(x,y) satisfies I′(x,y) < Ith for a suitably chosen value Ith, the distance measured by the active means is no longer reliable, so only the data measured from the multi-view images are used. This condition most likely arises when the target object is far away; in that case the disparity is small and the corresponding points in the multi-view images are easy to find, so the loss of the active measurement data causes no particular difficulty.

[0022] In the region where the active means can measure, i.e., where I′(x,y) ≥ Ith, two values are obtained: the distance image d′(x,y) measured by the laser range finder and the distance image d″(x,y) obtained from the multi-view images. Ideally these values coincide, but in practice they are likely to differ because of errors and the like. In such a case the difference ε(x,y),

ε(x,y) = d″(x,y) − d′(x,y),

is evaluated from the two values d′(x,y) and d″(x,y), and the more plausible distance is determined as the measured value.

[0023] That is, a threshold εth is set. If ε ≤ εth, both d′(x,y) and d″(x,y) are regarded as plausible; in this case a more accurate value is obtained, for example, by taking the average of d′(x,y) and d″(x,y). If ε > εth, one of the two is expected to be unreliable, so each is compared with the values of its neighboring pixels and the more appropriate value is adopted as the final one, using conditions such as continuity (for example, a value whose difference from its neighbors is below a predetermined threshold is regarded as continuous and therefore as the more reliable measurement).
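The decision rule of this paragraph can be written out as a small function. The continuity test used here (deviation from the mean of neighbouring pixel values) is one plausible concrete reading of the text's parenthetical condition, and the argument names and thresholds are illustrative.

```python
def fuse_depths(d_active: float, d_passive: float, eps_th: float,
                nbrs_active: list, nbrs_passive: list) -> float:
    """Fuse the active (laser range finder) and passive (stereo) depth
    estimates for one pixel, following paragraph [0023]:
    - if they agree within eps_th, average them;
    - otherwise keep whichever is more continuous with its neighbours,
      i.e. deviates less from the mean of the neighbouring values."""
    eps = abs(d_passive - d_active)
    if eps <= eps_th:
        return (d_active + d_passive) / 2.0
    dev_a = abs(d_active - sum(nbrs_active) / len(nbrs_active))
    dev_p = abs(d_passive - sum(nbrs_passive) / len(nbrs_passive))
    return d_active if dev_a <= dev_p else d_passive
```

For instance, estimates of 2.0 m and 2.1 m with εth = 0.2 m fuse to 2.05 m, while a stray passive value of 5.0 m surrounded by consistent 2 m neighbours is rejected in favour of the active value.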

[0024] As described above, by measuring three-dimensional information using the actively measured data together with the multi-view images, three-dimensional image information of higher accuracy and higher resolution is obtained.

[0025]

[Embodiments] The three-dimensional image measuring apparatus of the present invention is described below on the basis of embodiments.

(Embodiment 1) FIG. 1 shows the arrangement of the apparatus of Embodiment 1 of the present invention. In the figure, 1 is a laser range finder that performs the active measurement, 2 and 3 are imaging devices for stereo image measurement, 4 is a data processing device, and 5 is the object to be measured.

[0026] The laser range finder 1 shown here measures a field of about 80° horizontally, divided into 256 steps, and about 40° vertically, divided into 64 steps. It emits intensity-modulated laser light, detects the phase shift of the light reflected from the target object 5, and converts it into distance. It is also equipped with a mechanism that measures, together with the phase data, the intensity of the light incident on its light-receiving element.

[0027] In the passive measurement using the imaging devices 2 and 3, image data of 512 × 512 pixels are measured. The distance image data measured by the laser range finder 1 are denoted d(φ,θ) and the reflection-intensity data I(φ,θ); the received-light intensity data of the left and right multi-view images are denoted IL(x,y) and IR(x,y).

[0028] FIG. 3 shows a flowchart for obtaining the optimum distance image from the data measured with these two kinds of devices. In this figure, step 1 inputs the laser range finder images d(φ,θ) and I(φ,θ) and the multi-view images IL(x,y) and IR(x,y) into the data processing device 4. Step 2 is the processing step that aligns the fields of view, using either of the two methods described above. Step 3 is the coordinate transformation that registers each pixel of the laser range finder measurement with one of the multi-view images, performed in the same way as equation (3) above. Step 4 decides whether the distance measured by the laser range finder 1 can be used, by testing whether the reflection-intensity image I(φ,θ) is at or above the threshold Ith. Step 5 is the corresponding-point search in the multi-view images for the case I(φ,θ) ≥ Ith; because the search based on equations (1) and (2) is initialized with the laser range finder measurement d′(x,y), the amount of data processing is greatly reduced. Step 6 compares the distance values d′(x,y) and d″(x,y) measured by the active and passive means, and step 7 determines the appropriate value. Step 8 is the corresponding-point search performed in the ordinary way, based on equations (1) and (2), when the received light level in the active measurement is small, i.e., when I(φ,θ) < Ith; in step 9 the distance obtained in step 8 is taken as the optimum value. After these steps, the final distance image is determined for all (x,y) in step 10.
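The per-pixel decision flow of FIG. 3 (steps 4 through 9) can be condensed into a sketch. Here `stereo_search` stands in for the correspondence search of equations (1) and (2) and is a hypothetical callable, as are the thresholds; the fallback to the active value in the disagreement branch simplifies the continuity test of paragraph [0023].

```python
def pixel_depth(i_refl: float, d_active: float, stereo_search,
                i_th: float, eps_th: float) -> float:
    """One pixel of FIG. 3's flow:
    step 4:    is the reflection intensity high enough to trust the
               active (laser range finder) measurement?
    steps 8/9: if not, fall back to an unseeded stereo search;
    step 5:    if so, run the stereo search seeded with d_active
               (greatly narrowing the disparity range);
    steps 6/7: reconcile the two estimates."""
    if i_refl < i_th:
        return stereo_search(seed=None)       # passive only
    d_passive = stereo_search(seed=d_active)  # seeded search
    if abs(d_passive - d_active) <= eps_th:
        return (d_passive + d_active) / 2.0
    return d_active                           # simplified disagreement branch
```

Looping this function over all pixels, with a real correspondence search substituted for the stub, corresponds to step 10's final distance image.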

[0029] (Embodiment 2) FIG. 2 shows the arrangement of the apparatus of another embodiment. In this embodiment a half mirror 20 is placed between the imaging lenses 16, 17 and the image sensors 18, 19 of the passive imaging devices, so that one of the multi-view images shares the same field of view as the laser range finder 21. 22 is a data processing device and 23 is the object to be measured. With this configuration there is no need to align the centers of the fields of view by image processing, and the processing time is shortened accordingly.

[0030] The three-dimensional image measuring apparatus of the present invention has been described above on the basis of its principle and embodiments, but the present invention is not limited to these, and various modifications are possible.

[0031]

[Effects of the Invention] As is clear from the above description, according to the three-dimensional image measuring apparatus of the present invention, combining a method that actively emits a measuring beam with a passive measurement method based on multi-view images makes it possible to perform three-dimensional image measurement with high accuracy and high resolution over a wide range of distances, and also shortens the processing time.

[Brief Description of the Drawings]

FIG. 1 is a layout diagram of the three-dimensional image measuring apparatus of Embodiment 1 of the present invention.

FIG. 2 is a layout diagram of the three-dimensional image measuring apparatus of Embodiment 2 of the present invention.

FIG. 3 is a flowchart showing the procedure for processing the measured values in Embodiment 1.

FIG. 4 is a diagram showing the relationships between pixels in the images measured by the respective sensors.

FIG. 5 is a diagram showing the configuration of a conventional laser range finder.

[Explanation of Symbols]

1 … laser range finder
2, 3 … imaging devices for stereo image measurement
4 … data processing device
5 … object to be measured
16, 17 … imaging lenses
18, 19 … image sensors
20 … half mirror
21 … laser range finder
22 … data processing device
23 … object to be measured
24 … actively obtained distance image
25 … actively obtained reflection-intensity image
26 … passively obtained left-eye image
27 … passively obtained right-eye image

Claims (4)

[Claims]

1. A three-dimensional image measuring apparatus comprising: associating means for associating, pixel by pixel, a distance image obtained by an active distance-image measuring device, which can obtain a distance image by actively emitting a light beam and receiving its reflected light, with images obtained passively by imaging devices; and fusion processing means for complementarily fusing a distance image derived from the images obtained passively by said imaging devices with the distance image obtained by said active distance-image measuring device to obtain a three-dimensional image.

2. The three-dimensional image measuring apparatus according to claim 1, wherein said associating means uses a reflection-intensity image obtained by said active distance-image measuring device for said association.

3. The three-dimensional image measuring apparatus according to claim 2, wherein, at pixel positions where the intensity of the reflection-intensity image obtained by said active distance-image measuring device is below a predetermined threshold, said fusion processing means selects the distance derived from the images obtained passively by said imaging devices as the value of the three-dimensional image.

4. The three-dimensional image measuring apparatus according to claim 2 or 3, wherein, at pixel positions where the intensity of the reflection-intensity image obtained by said active distance-image measuring device is at or above a predetermined threshold, said fusion processing means uses the distance value measured by said active distance-image measuring device as the initial disparity estimate in the corresponding-point search when deriving a distance image from the images obtained passively by said imaging devices.
JP15336195A 1995-06-20 1995-06-20 3D image measuring device Expired - Fee Related JP3614935B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP15336195A JP3614935B2 (en) 1995-06-20 1995-06-20 3D image measuring device

Publications (2)

Publication Number Publication Date
JPH095050A true JPH095050A (en) 1997-01-10
JP3614935B2 JP3614935B2 (en) 2005-01-26

Family

ID=15560780



Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
KR102090502B1 (en) * 2014-01-02 2020-03-18 LG Electronics Inc. Distance measuring device and method thereof

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6060616A (en) * 1983-09-13 1985-04-08 Canon Inc Automatic focus adjusting device
JPS62194413A (en) * 1986-02-21 1987-08-26 Yamaha Motor Co Ltd Three-dimensional coordinate measuring instrument
JPH04169805A (en) * 1990-11-01 1992-06-17 Matsushita Electric Ind Co Ltd Measuring apparatus of three-dimensional image
JPH06501774A (en) * 1990-10-15 1994-02-24 Schulz Waldean A Three-dimensional non-contact shape detection method and device
JPH06258048A (en) * 1993-03-05 1994-09-16 Toshiba Corp Object input device

Cited By (31)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6251609B1 (en) 2000-07-27 2001-06-26 Becton, Dickinson And Company Amplification and detection of Legionella pneumophila targeting the mip gene
JP2007524085A (en) * 2003-12-11 2007-08-23 Strider Labs, Inc. Technique for predicting the surface of an occluded portion by computing symmetry
US7961934B2 (en) 2003-12-11 2011-06-14 Strider Labs, Inc. Probable reconstruction of surfaces in occluded regions by computed symmetry
JP2006317418A (en) * 2005-05-16 2006-11-24 Nikon Corp Image measuring device, image measurement method, measurement processing program, and recording medium
JP2008008687A (en) * 2006-06-27 2008-01-17 Toyota Motor Corp Distance measuring system and distance measuring method
JP2008116308A (en) * 2006-11-02 2008-05-22 Fujifilm Corp Method and apparatus for generating range image
JP2010060451A (en) * 2008-09-04 2010-03-18 Toyota Motor Corp Robotic apparatus and method for estimating position and attitude of object
KR100892232B1 (en) * 2008-12-23 2009-04-09 한승원 3 dimensional image information collection device
CN101853528A (en) * 2010-05-10 2010-10-06 沈阳雅克科技有限公司 Hand-held three-dimensional surface information extraction method and extractor thereof
JP2012149987A (en) * 2011-01-19 2012-08-09 Seiko Epson Corp Position detecting system, display system, and information processing system
JP2012220471A (en) * 2011-04-14 2012-11-12 Mitsubishi Electric Corp Development view generation device, development view generation method and development view display method
JP2013104784A (en) * 2011-11-14 2013-05-30 Mitsubishi Electric Corp Optical three-dimensional camera
US8896689B2 (en) 2011-12-06 2014-11-25 Honda Motor Co., Ltd. Environment recognition apparatus
JP2013120083A (en) * 2011-12-06 2013-06-17 Honda Motor Co Ltd Environment recognition apparatus
US9197862B2 (en) 2012-03-30 2015-11-24 Honda Motor Co., Ltd. Contact state estimating apparatus
JP2013210339A (en) * 2012-03-30 2013-10-10 Honda Motor Co Ltd Contact state estimation device
JP2018506700A (en) * 2014-11-05 2018-03-08 Sierra Nevada Corporation System and method for generating an improved environmental display for a moving body
JP2022003592A (en) * 2014-11-05 2022-01-11 Sierra Nevada Corporation System and method for generating improved environmental display for vehicle
US11682314B2 (en) 2014-11-05 2023-06-20 Sierra Nevada Corporation Systems and methods for generating improved environmental displays for vehicles
US10410531B2 (en) 2014-11-05 2019-09-10 Sierra Nevada Corporation Systems and methods for generating improved environmental displays for vehicles
US11056012B2 (en) 2014-11-05 2021-07-06 Sierra Nevada Corporation Systems and methods for generating improved environmental displays for vehicles
JP2015038777A (en) * 2014-11-12 2015-02-26 Seiko Epson Corporation Position detecting system, display system, and information processing system
WO2018042801A1 (en) * 2016-09-01 2018-03-08 Sony Semiconductor Solutions Corporation Imaging device
CN108027238A (en) * 2016-09-01 2018-05-11 索尼半导体解决方案公司 Imaging device
CN108027238B (en) * 2016-09-01 2022-06-14 索尼半导体解决方案公司 Image forming apparatus with a plurality of image forming units
JPWO2018042801A1 (en) * 2016-09-01 2019-06-24 Sony Semiconductor Solutions Corporation Imaging device
US10866321B2 (en) 2016-09-01 2020-12-15 Sony Semiconductor Solutions Corporation Imaging device
RU173576U1 * 2017-02-03 2017-08-31 Federal State Unitary Enterprise "All-Russian Research Institute of Automatics named after N.L. Dukhov" (FSUE "VNIIA") Device for obtaining stereoscopic images of small objects
JP2020031406A (en) * 2018-08-24 2020-02-27 独立行政法人日本スポーツ振興センター Determination system and determination method
RU187528U1 * 2018-11-15 2019-03-12 Federal State Unitary Enterprise "All-Russian Research Institute of Automatics named after N.L. Dukhov" (FSUE "VNIIA") Device for obtaining stereoscopic images of fast processes
CN109579731A (en) * 2018-11-28 2019-04-05 华中科技大学 A method of executing 3 d surface topography measurement based on image co-registration

Also Published As

Publication number Publication date
JP3614935B2 (en) 2005-01-26

Similar Documents

Publication Publication Date Title
JP3614935B2 (en) 3D image measuring device
US8718326B2 (en) System and method for extracting three-dimensional coordinates
JP6480441B2 (en) Time-of-flight camera system
US7075661B2 (en) Apparatus and method for obtaining three-dimensional positional data from a two-dimensional captured image
US7656508B2 (en) Distance measuring apparatus, distance measuring method, and computer program product
US20140168424A1 (en) Imaging device for motion detection of objects in a scene, and method for motion detection of objects in a scene
US10909395B2 (en) Object detection apparatus
JP6172978B2 (en) IMAGING DEVICE, IMAGING SYSTEM, SIGNAL PROCESSING DEVICE, PROGRAM, AND STORAGE MEDIUM
JP2004530144A (en) How to provide image information
JP2003130621A (en) Method and system for measuring three-dimensional shape
US11619481B2 (en) Coordinate measuring device
JP2006322853A (en) Distance measuring device, distance measuring method and distance measuring program
US20180276844A1 (en) Position or orientation estimation apparatus, position or orientation estimation method, and driving assist device
EP3832601A1 (en) Image processing device and three-dimensional measuring system
US20230232130A1 (en) Image sensors and sensing methods to obtain time-of-flight and phase detection information
CN112230244B (en) Fused depth measurement method and measurement device
US20090262369A1 (en) Apparatus and method for measuring distances
US11997247B2 (en) Three-dimensional space camera and photographing method therefor
Mure-Dubois et al. Fusion of time of flight camera point clouds
JP2020051903A (en) Stereo camera system and distance measurement method
CN114063111A (en) Radar detection system and method of image fusion laser
JP3525712B2 (en) Three-dimensional image capturing method and three-dimensional image capturing device
JPH11223516A (en) Three dimensional image pickup device
WO2023095375A1 (en) Three-dimensional model generation method and three-dimensional model generation device
CN112750098B (en) Depth map optimization method, device, system, electronic device and storage medium

Legal Events

Date Code Title Description
A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20040130

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20040204

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20040309

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20040401

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20041020

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20041028

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20071112

Year of fee payment: 3

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20081112

Year of fee payment: 4

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20091112

Year of fee payment: 5

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20101112

Year of fee payment: 6

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20111112

Year of fee payment: 7

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20121112

Year of fee payment: 8

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20131112

Year of fee payment: 9

LAPS Cancellation because of no payment of annual fees