JPH0953914A - Three-dimensional coordinate measuring instrument - Google Patents

Three-dimensional coordinate measuring instrument

Info

Publication number
JPH0953914A
JPH0953914A JP7206932A JP20693295A
Authority
JP
Japan
Prior art keywords
dimensional
dimensional coordinate
camera
equation
coordinates
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP7206932A
Other languages
Japanese (ja)
Other versions
JP2970835B2 (en)
Inventor
Atsushi Marukame
敦 丸亀
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
NEC Corp
Original Assignee
NEC Corp
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by NEC Corp filed Critical NEC Corp
Priority to JP7206932A priority Critical patent/JP2970835B2/en
Publication of JPH0953914A publication Critical patent/JPH0953914A/en
Application granted granted Critical
Publication of JP2970835B2 publication Critical patent/JP2970835B2/en
Anticipated expiration legal-status Critical
Expired - Lifetime legal-status Critical Current


Abstract

PROBLEM TO BE SOLVED: To obtain the three-dimensional coordinates of a target point from image data easily and at high speed, without using the optical characteristics of the light source and the camera or the positional information of the camera. SOLUTION: The three-dimensional coordinates of a measurement point Q are obtained by observing, with an imaging section 1, the positional relationship between the point Q and six or more reference points P1-Pm that do not lie on the same plane, capturing that relationship with a two-dimensional coordinate acquisition section 2 composed of a plurality of two-dimensional coordinate acquisition means G1-Gn, and then combining the results.

Description

Detailed Description of the Invention

[0001]

[Technical Field of the Invention] The present invention relates to a three-dimensional coordinate measuring device that measures the three-dimensional coordinates of an object.

[0002]

[Description of the Related Art] Various methods have conventionally been proposed for measuring the three-dimensional coordinates of an object: for example, the conventional binocular vision method based on photogrammetry, methods that irradiate the object with slit light and receive the reflected light, and the unit subspace division method described in "Method and Apparatus for Measuring a Three-Dimensional Coordinate Position" (Japanese Patent Laid-Open No. 5-79819) by Koichi Egawa et al.

[0003] The conventional binocular vision method based on photogrammetry measures the three-dimensional coordinates of an object by the principle of triangulation, using the positional information of a plurality of imaging devices. The method of irradiating the object with slit light and receiving the reflected light likewise measures the three-dimensional coordinates of the object by triangulation, using the positional information of the light source and the camera.

[0004] In the unit subspace division method, the three-dimensional coordinates of an object are measured as follows. First, the measurement target space is divided into hexahedral unit subspaces (voxels), and a correspondence table is created that associates the three-dimensional coordinates of the voxel vertices with their two-dimensional coordinates on the imaging screen. Using this table, the voxel vertex closest to the measurement point on the imaging screen is calculated, and the voxel containing the measurement point is extracted.

[0005] Next, considering the projection onto the imaging screen of each face of that voxel, or of planes extending those faces, a plurality of faces whose projections contain the projection of the measurement point are found. Using the correspondence table, the three-dimensional coordinates of the projection of the measurement point onto each of these faces are then obtained, and a straight line passing through the projection points obtained on the plural faces is calculated. Finally, the intersection of the straight lines calculated from the image data of the plural imaging devices is computed, and the coordinates of the intersection are taken as the three-dimensional coordinates of the object.

[0006]

[Problems to be Solved by the Invention] The conventional techniques, however, have the following problems. First, the conventional binocular vision method based on photogrammetry requires the optical characteristics and positional information of the cameras: the focal length of each camera, the positional relationship between the two cameras, and the angular difference between their optical axes must be measured in advance. Measuring the optical characteristics of a camera and the position of its optical axis precisely takes considerable effort. In the method that irradiates the object with slit light and receives the reflected light with a camera, the lighting conditions are restricted; for example, because light is projected onto the object, the influence of external light must be taken into account.

[0007] The unit subspace division method, on the other hand, enables three-dimensional coordinate measurement that requires neither the optical characteristics of the light source and the camera nor the positional information of the camera. However, because the measurement target space is divided into many voxels, the three-dimensional coordinates of many points in that space must be measured in advance, which may be impossible in practice. In addition, the searches for the voxel vertex nearest to the measurement point, for the voxel containing the measurement point, and for the faces used to define the straight line make the computation time enormous.

[0008] The present invention has been made to solve the above problems, and its object is to obtain the three-dimensional coordinates of a target point from image data easily and at high speed, without using the optical characteristics of the light source or the camera, or the positional information of the camera.

[0009]

[Means for Solving the Problems] A three-dimensional coordinate measuring device according to the present invention comprises: a plurality of imaging means; first two-dimensional coordinate acquiring means for acquiring, on the captured images obtained by the imaging means, the two-dimensional coordinates of reference points consisting of six or more points that do not lie on the same plane and whose relative positions are known; coordinate calculation parameter computing means for computing coordinate calculation parameters from the two-dimensional coordinates of the reference points and their relative positional relationship; second two-dimensional coordinate acquiring means for acquiring the two-dimensional coordinates of a measurement point on each of the captured images; and coordinate computing means for computing the three-dimensional coordinates of the measurement point from the coordinate calculation parameters and the two-dimensional coordinates of the measurement point. In this way, the three-dimensional coordinates of the measurement point are obtained without positional information about the measuring means. Further, the vertices of a cube are used as the reference points, so that the obtained three-dimensional coordinates are normalized.

[0010]

[Embodiments of the Invention] An embodiment of the present invention will now be described with reference to the drawings. First, the operating principle of the invention is explained. For simplicity, the operating principle of the three-dimensional coordinate measuring device is described using the binocular geometric model of FIG. 1. P1, P2, P3, P4, and P5 in FIG. 1 are reference points whose relative positional relationship is known; any six or more points Pi (i = 0, 1, ...) whose relative positions are known may be used. Coordinate systems (a, b, c) and (a', b', c') are defined for camera 11 and camera 12, whose focal lengths are f and f'. Their origins O and O' are placed at the lens centers 11a and 12a of cameras 11 and 12, respectively, and the z and z' axes are made to coincide with the respective optical axes, taking the shooting direction of each camera as positive. These coordinate systems are referred to as the camera 11 system and the camera 12 system.

[0011] Let the position vectors of the reference points Pi (i = 0, 1, ..., 5) in the camera 11 system and the camera 12 system be pi and p'i (i = 0, 1, ..., 5), respectively, and let the position vectors of the measurement point Q in the camera 11 system and the camera 12 system be q and q'. Let the coordinates of the projection of the reference point Pi onto the imaging surfaces 11b and 12b of the two cameras 11 and 12 be (ui, vi, f) in the camera 11 system and (u'i, v'i, f') in the camera 12 system (i = 0, 1, ..., 5).

[0012] A coordinate system (x, y, z) is also defined from four reference points P0, P1, P2, and P3 that do not lie on the same plane, with P0 as the origin and P0-P1, P0-P2, and P0-P3 as the x, y, and z axes; this is called the measurement system. Let the measurement-system coordinates of the reference point Pi be (xi, yi, zi) (i = 0, 1, ..., 5), and let the measurement-system coordinates of the measurement point Q be (α, β, γ).

[0013] The determination of the coordinate calculation parameters is described first. Suppose that the coordinates (ui, vi, f) (i = 0, 1, ..., 5) of the projection of the reference point Pi in the camera 11 system are related, through the linear transformation T of Equation 1 below, to a point [Xi, Yi, Zi, Wi]^t in a four-dimensional projective space given by the vector pi of Equation 2 below (where λi is an unknown constant).

[0014]

[Equation 1]

[0015]

[Equation 2]

[0016] The coordinates in the four-dimensional projective space considered here are chosen so that P0, P1, P2, and P3 correspond to the vectors e0 = [1, 0, 0, 0]^t, e1 = [0, 1, 0, 0]^t, e2 = [0, 0, 1, 0]^t, and e3 = [0, 0, 0, 1]^t, respectively. With this choice, the transformation T of Equation 1 takes the form of Equation 3 below.
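Equations 1 to 3 appear only as placeholders in this text, so the explicit correspondence between the measurement system of paragraph [0012] and this projective basis is not reproduced. The relation below is not stated in the original and is inferred from the coefficients (1 - xj - yj - zj, xj, yj, zj) used later in paragraph [0053]; it should be read as an assumption.

```latex
% Inferred correspondence (assumption, not quoted from the original):
% taking the distances P0-P1, P0-P2, P0-P3 as the unit of each measurement-system axis,
% a point with measurement-system coordinates (x, y, z) has projective coordinates, up to scale,
[X, Y, Z, W]^{t} \propto [\, 1 - x - y - z,\ x,\ y,\ z \,]^{t},
% so that P0 = (0,0,0), P1 = (1,0,0), P2 = (0,1,0), P3 = (0,0,1) map to e0, e1, e2, e3.
```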

[0017]

[Equation 3]

[0018] Equation 2 contains the unknown constants λi, leaving one degree of freedom, so t3 = 1 is assumed below. Equation 2 then becomes Equation 4 below.

[0019]

[Equation 4]

[0020] Similarly, Equation 5 below is obtained from the camera 12 system.

[0021]

[Equation 5]

[0022] In the following, ti, t'i (i = 0, 1, 2) and uj, vj, u'j, v'j (j = 0, 1, 2, 3) are called the coordinate calculation parameters. Consider first the determination of ti and t'i (i = 0, 1, 2). Since the positional relationship of the reference points Pi (i = 0, 1, ...) is known, their coordinates (xi, yi, zi) (i = 4, 5, ...) in the measurement system (x, y, z) are also known. The vector pi (i = 4, 5, ...) can therefore be expressed by Equation 6 below.

[0023]

[Equation 6]

[0024] Expressing Equation 6 in terms of T and the four-dimensional coordinate system [e0, e1, e2, e3] gives Equation 7 below.

[0025]

[Equation 7]

[0026] Comparing Equation 7 with Equation 4 gives [Xi, Yi, Zi, Wi] (i = 4, 5, ...). Eliminating λi from Equation 4 then yields, for each Pi (i = 4, 5, ...), two linear equations in the variables t0, t1, t2, shown as Equations 8 and 9 below.

[0027]

[Equation 8]

[0028]

[Equation 9]

[0029] Therefore, if Equations 8 and 9 are obtained for two or more additional reference points Pi (i = 4, 5, ...), four or more linear equations are available for the three variables t0, t1, t2, so t0, t1, and t2 can be determined by the method of least squares. The parameters t'0, t'1, and t'2 are obtained in the same way.
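Equations 8 and 9 are reproduced only as placeholders. A plausible reconstruction, obtained by eliminating λi from λi(ui, vi, f)^t = T[Xi, Yi, Zi, Wi]^t with t3 = 1, and consistent with the explicit coefficients of paragraph [0053], is shown below; it should be read as an inference rather than the patent's literal equations.

```latex
% Reconstructed (inferred) form of Equations 8 and 9, for each extra reference point P_i (i >= 4):
t_0\, X_i (u_i - u_0) + t_1\, Y_i (u_i - u_1) + t_2\, Z_i (u_i - u_2) = W_i (u_3 - u_i)   % (Eq. 8)
\\
t_0\, X_i (v_i - v_0) + t_1\, Y_i (v_i - v_1) + t_2\, Z_i (v_i - v_2) = W_i (v_3 - v_i)   % (Eq. 9)
```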

[0030] Next, as a method of determining the three-dimensional coordinates of a measurement point, consider obtaining the measurement-system coordinates (α, β, γ) of the measurement point Q. Let the coordinates of the projection of Q onto the imaging surfaces of the two cameras be (u, v, f) in the camera 11 system and (u', v', f') in the camera 12 system. The position vector q of the measurement point Q in the camera 11 system is then given by Equation 10 below.

[0031]

[Equation 10]

[0032] Expressing Equation 10 in terms of T and the four-dimensional coordinate system [e0, e1, e2, e3] gives Equations 11 and 12 below.

[0033]

[Equation 11]

[0034]

[Equation 12]

[0035] That is, Equation 13 below is obtained (where λ is an unknown constant).

[0036]

[Equation 13]

[0037] Eliminating λ from Equation 13 therefore yields two linear equations in the variables α, β, γ, shown as Equations 14 and 15 below.

[0038]

[Equation 14]

[0039]

[Equation 15]

[0040] As they stand, however, these equations cannot be solved, because there are fewer equations than variables. Since the camera 11 system and the camera 12 system are both orthonormal systems, the representation in the camera 11 system and the representation in the camera 12 system are congruent, and this fact is used. That is, with S a rotation matrix and h a constant vector representing a translation, q = Sq' + h and pi = Sp'i + h (i = 0, 1, 2, 3) hold; substituting these into Equation 11 gives Equation 16 below.

[0041]

[Equation 16]

[0042] Multiplying both sides of Equation 16 from the left by S^t gives Equation 17 below.

[0043]

[Equation 17]

[0044] Proceeding as above then gives two further linear equations in α, β, γ, shown as Equations 18 and 19 below.

[0045]

[Equation 18]

[0046]

[Equation 19]

[0047] Thus four linear equations in the variables α, β, γ are obtained (Equations 14, 15, 18, and 19), and the three-dimensional coordinates (α, β, γ) of the measurement point Q in the measurement system can be determined by the method of least squares.
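Equations 14, 15, 18, and 19 are likewise only placeholders. Under the same inferred correspondence (projective coordinates of Q proportional to [1 - α - β - γ, α, β, γ]), eliminating λ from Equation 13 would give the pair below for the camera 11 system; Equations 18 and 19 would take the same form with the primed quantities of the camera 12 system. This reconstruction is an assumption, not a quotation of the original.

```latex
% Reconstructed (inferred) form of Equations 14 and 15:
t_0 (1-\alpha-\beta-\gamma)(u - u_0) + t_1 \alpha (u - u_1) + t_2 \beta (u - u_2) + \gamma (u - u_3) = 0   % (Eq. 14)
\\
t_0 (1-\alpha-\beta-\gamma)(v - v_0) + t_1 \alpha (v - v_1) + t_2 \beta (v - v_2) + \gamma (v - v_3) = 0   % (Eq. 15)
```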

[0048] An embodiment of the present invention is described below. FIG. 2 is a block diagram showing the configuration of a three-dimensional coordinate measuring device according to one embodiment of the invention. It is broadly composed of reference points Pi (i = 1, ..., m) (m ≥ 6), an imaging unit 1, a two-dimensional coordinate acquisition unit 2, a coordinate calculation parameter computing unit 3, and a three-dimensional coordinate computing unit 4. The imaging unit 1 is composed of a plurality of cameras Ci (i = 1, 2, ..., n). The positional relationship between the cameras Ci and their optical characteristics need not be determined in advance. The two-dimensional coordinate acquisition unit 2 is composed of a plurality of two-dimensional coordinate acquisition means Gi (i = 1, 2, ..., n).

[0049] This embodiment of the three-dimensional coordinate measuring device is described below with reference to FIG. 2. The reference points Pi are six or more points that do not lie on the same plane and whose positional relationships with one another are known, such as the vertices P0, ..., Pm of the solid 11 shown in FIG. 3. If the six vertices R0, ..., Rm of the cube 12 shown in FIG. 4 are used, the coordinates of the measurement point in an orthonormal coordinate system are obtained easily.
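The text does not specify which six of the eight cube vertices are used. Purely as an illustration, one consistent choice for a unit cube (an assumption, not taken from the original) is:

```latex
% Illustrative (assumed) choice of six cube vertices for a unit cube:
R_0 = (0,0,0),\; R_1 = (1,0,0),\; R_2 = (0,1,0),\; R_3 = (0,0,1),\; R_4 = (1,1,0),\; R_5 = (1,0,1)
% With R_0, R_1, R_2, R_3 defining the measurement system, its axes are mutually orthogonal and
% of equal (unit) length, so the computed (alpha, beta, gamma) lie in an orthonormal frame.
```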

[0050] First, the cameras Ci (i = 1, 2, ..., n) of the imaging unit 1 capture the six or more reference points P0, ..., Pm. The n images obtained by the cameras Ci (i = 1, 2, ..., n) are called images i (i = 1, 2, ..., n). Each image i (i = 1, 2, ..., n) is sent through an image passing line 10i (i = 1, 2, ..., n) to the corresponding two-dimensional coordinate acquisition means Gi (i = 1, 2, ..., n) of the two-dimensional coordinate acquisition unit 2.

[0051] The two-dimensional coordinate acquisition unit 2 is composed of a plurality of two-dimensional coordinate acquisition means Gi (i = 1, 2, ..., n). Each two-dimensional coordinate acquisition means Gi has an image memory (not shown) and has the function of detecting the reference points and outputting their two-dimensional coordinates. The points may be designated by an operator looking at the display unit 13 shown in FIG. 5, or they may be detected automatically using pattern recognition techniques.

[0052] The two-dimensional coordinates of the reference points on each image i are acquired by the two-dimensional coordinate acquisition means Gi (i = 1, 2, ..., n) and then sent through a numerical value passing line 20i (i = 1, 2, ..., n) to the coordinate calculation parameter computing unit 3. Let the two-dimensional coordinates of the reference point Pj (j = 0, 1, ..., m) on image i be (u^i_j, v^i_j), and let the data signal 300, which is the prestored positional-relationship information of the reference points Pj (j = 4, 5, ..., m), be (xj, yj, zj). The coordinate calculation parameter computing unit 3 then computes the following quantities.

[0053] xu^i_j = (1 - xj - yj - zj)(u^i_j - u^i_0), yu^i_j = xj(u^i_j - u^i_1), zu^i_j = yj(u^i_j - u^i_2), wu^i_j = zj(u^i_3 - u^i_j), xv^i_j = (1 - xj - yj - zj)(v^i_j - v^i_0), yv^i_j = xj(v^i_j - v^i_1), zv^i_j = yj(v^i_j - v^i_2), wv^i_j = zj(v^i_3 - v^i_j), where the superscript i labels the image and the subscript j labels the reference point. The coordinate calculation parameter computing unit 3 then applies the method of least squares to the equations shown in Equation 20 below (where m is 5 or more) and obtains t^i_0, t^i_1, and t^i_2 for each image i (i = 1, 2, ..., n).

[0054]

[Equation 20]
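Equation 20 is reproduced only as a placeholder. From the coefficient definitions in paragraph [0053], the stacked system for image i is presumably xu^i_j t^i_0 + yu^i_j t^i_1 + zu^i_j t^i_2 = wu^i_j and xv^i_j t^i_0 + yv^i_j t^i_1 + zv^i_j t^i_2 = wv^i_j for j = 4, ..., m. The sketch below solves that assumed system with NumPy; the function name, the array layout, and the assumed stacked form are illustrative and are not the patent's literal Equation 20.

```python
import numpy as np

def estimate_parameters(uv, xyz):
    """Least-squares estimate of (t0, t1, t2) for a single camera image.

    uv:  array of shape (m+1, 2) with the image coordinates (u_j, v_j) of reference points P0..Pm.
    xyz: array of shape (m+1, 3) with their measurement-system coordinates (x_j, y_j, z_j),
         where P0..P3 are (0,0,0), (1,0,0), (0,1,0), (0,0,1).
    """
    u, v = uv[:, 0], uv[:, 1]
    rows, rhs = [], []
    for j in range(4, len(uv)):                      # extra reference points P4..Pm
        x, y, z = xyz[j]
        w0 = 1.0 - x - y - z                         # weight of P0 in the projective coordinates
        # u-equation:  xu*t0 + yu*t1 + zu*t2 = wu   (assumed form of Equation 20)
        rows.append([w0 * (u[j] - u[0]), x * (u[j] - u[1]), y * (u[j] - u[2])])
        rhs.append(z * (u[3] - u[j]))
        # v-equation:  xv*t0 + yv*t1 + zv*t2 = wv
        rows.append([w0 * (v[j] - v[0]), x * (v[j] - v[1]), y * (v[j] - v[2])])
        rhs.append(z * (v[3] - v[j]))
    t, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return t                                         # (t0, t1, t2) for this image
```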

[0055] These t^i_0, t^i_1, t^i_2 and u^i_j, v^i_j (j = 0, 1, 2, 3) are called the coordinate conversion parameters. The coordinate conversion parameters are sent through the numerical value passing line 500 to the three-dimensional coordinate computing unit 4. Next, the cameras Ci (i = 1, 2, ..., n) of the imaging unit 1 capture the measurement point Q. The n images obtained by the cameras Ci (i = 1, 2, ..., n) are called images i' (i = 1, 2, ..., n). Each image i' is sent through the image passing line 10i (i = 1, 2, ..., n) to the two-dimensional coordinate acquisition means Gi (i = 1, 2, ..., n) of the two-dimensional coordinate acquisition unit 2.

[0056] The two-dimensional coordinates (ui, vi) (i = 1, 2, ..., n) of the measurement point on each image i' are acquired by the two-dimensional coordinate acquisition means Gi in the same way as for the reference points, and are sent through a numerical value passing line 40i (i = 1, 2, ..., n) to the three-dimensional coordinate computing unit 4. Using the coordinate conversion parameters t^i_0, t^i_1, t^i_2, u^i_j, v^i_j (i = 1, 2, ..., n) (j = 0, 1, 2, 3) and the two-dimensional coordinates (ui, vi) (i = 1, 2, ..., n) of the measurement point, the three-dimensional coordinate computing unit 4 computes the following quantities.

[0057] au_i = t^i_0(u^i_0 - u_i), bu_i = t^i_1(u_i - u^i_1) + au_i, cu_i = t^i_2(u_i - u^i_2) + au_i, du_i = t^i_2(u_i - u^i_3) + au_i, av_i = t^i_0(v^i_0 - v_i), bv_i = t^i_1(v_i - v^i_1) + av_i, cv_i = t^i_2(v_i - v^i_2) + av_i, dv_i = t^i_2(v_i - v^i_3) + av_i. The three-dimensional coordinate computing unit 4 then applies the method of least squares to the equations shown in Equation 21 below (where n is 2 or more) and obtains the data signal 600 (α, β, γ), the three-dimensional coordinates of the measurement point Q.

[0058]

[Equation 21]
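Equation 21 is likewise only a placeholder. Each image i presumably contributes the two equations bu_i α + cu_i β + du_i γ = au_i and bv_i α + cv_i β + dv_i γ = av_i, which follow from eliminating λ with t3 = 1; under that reading, the factor t^i_2 printed in du_i and dv_i in paragraph [0057] looks like a stray artifact, so the sketch below uses du_i = (u_i - u^i_3) + au_i instead. The assumed equation form, that correction, and the function name are all illustrative assumptions.

```python
import numpy as np

def triangulate_point(params, uv_ref, uv_meas):
    """Least-squares estimate of the measurement-system coordinates (alpha, beta, gamma).

    params:  n parameter vectors (t0, t1, t2), one per camera (e.g. from estimate_parameters).
    uv_ref:  n arrays of shape (m+1, 2) with reference-point image coordinates, one per camera.
    uv_meas: n pairs (u, v), the measurement point observed in each camera image.
    """
    rows, rhs = [], []
    for (t0, t1, t2), ref, (u, v) in zip(params, uv_ref, uv_meas):
        for ref_c, meas in ((ref[:, 0], u), (ref[:, 1], v)):   # u-equation, then v-equation
            a = t0 * (ref_c[0] - meas)                         # au_i (or av_i)
            rows.append([t1 * (meas - ref_c[1]) + a,           # bu_i: coefficient of alpha
                         t2 * (meas - ref_c[2]) + a,           # cu_i: coefficient of beta
                         (meas - ref_c[3]) + a])               # du_i with t3 = 1 (assumed)
            rhs.append(a)
    sol, *_ = np.linalg.lstsq(np.asarray(rows), np.asarray(rhs), rcond=None)
    return sol                                                 # (alpha, beta, gamma)
```

A caller would run estimate_parameters (from the sketch after Equation 20) once per camera on the reference-point image and pass the resulting parameter vectors here together with the measurement-point image coordinates.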

[0059] If the six vertices of a cube as in FIG. 4 are used as the reference points, three-dimensional coordinates in an orthonormal coordinate system are obtained easily.

[0060]

[Effects of the Invention] As described above, according to the present invention, the positional relationship of a measurement point with respect to reference points, which consist of six or more points that do not lie on the same plane and whose relative positions are known, is observed from a plurality of positions and captured as two-dimensional coordinates, and these are combined to obtain the three-dimensional coordinates of the measurement point. That is, instead of the optical characteristics and positional information of the cameras, which are difficult to measure precisely, the three-dimensional position of the measurement point is measured using the positional relationship of six points not on the same plane, which can be measured easily.

[0061] As a result, the three-dimensional coordinates of the target point can be obtained from image data easily and at high speed, without using the optical characteristics of the light source or the camera, or the positional information of the camera. Furthermore, by using the vertices of a cube as the reference points, the three-dimensional position in an orthonormal coordinate system can be measured easily, again without using the optical characteristics of the camera or the positional information of the camera.

[Brief Description of the Drawings]

[FIG. 1] A perspective view showing the binocular geometric model used to explain the operating principle of the three-dimensional coordinate measuring device of the present invention.

[FIG. 2] A block diagram showing the configuration of a three-dimensional coordinate measuring device according to one embodiment of the present invention.

[FIG. 3] An explanatory diagram showing the relationship of the reference points in the present invention.

[FIG. 4] An explanatory diagram showing the relationship of the reference points in the present invention.

[FIG. 5] A screen view showing an output example of the three-dimensional coordinate measuring device of FIG. 2.

[Explanation of Symbols]

1: imaging unit; 2: two-dimensional coordinate acquisition unit; 3: coordinate calculation parameter computing unit; 4: three-dimensional coordinate computing unit; 101-10n: image passing lines; 201-20n, 401-40n, 500: numerical value passing lines; 300, 600: data signals; C1-Cn: cameras; G1-Gn: two-dimensional coordinate acquisition means; P0-P5, Pm: reference points; Q: measurement point.

Claims (2)

[Claims]

[Claim 1] A three-dimensional coordinate measuring device comprising: a plurality of imaging means; first two-dimensional coordinate acquiring means for acquiring, on captured images obtained by the imaging means, the two-dimensional coordinates of reference points consisting of six or more points that do not lie on the same plane and whose relative positions are known; coordinate calculation parameter computing means for computing coordinate calculation parameters using the two-dimensional coordinates of the reference points and their relative positional relationship; second two-dimensional coordinate acquiring means for acquiring the two-dimensional coordinates of a measurement point on each of the captured images captured by the imaging means; and coordinate computing means for computing the three-dimensional coordinates of the measurement point using the coordinate calculation parameters and the two-dimensional coordinates of the measurement point.

[Claim 2] The three-dimensional coordinate measuring device according to claim 1, wherein the reference points are vertices of a cube.
JP7206932A 1995-08-14 1995-08-14 3D coordinate measuring device Expired - Lifetime JP2970835B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP7206932A JP2970835B2 (en) 1995-08-14 1995-08-14 3D coordinate measuring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP7206932A JP2970835B2 (en) 1995-08-14 1995-08-14 3D coordinate measuring device

Publications (2)

Publication Number Publication Date
JPH0953914A true JPH0953914A (en) 1997-02-25
JP2970835B2 JP2970835B2 (en) 1999-11-02

Family

ID=16531439

Family Applications (1)

Application Number Title Priority Date Filing Date
JP7206932A Expired - Lifetime JP2970835B2 (en) 1995-08-14 1995-08-14 3D coordinate measuring device

Country Status (1)

Country Link
JP (1) JP2970835B2 (en)


Patent Citations (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH05164517A (en) * 1991-12-18 1993-06-29 Ono Sokki Co Ltd Measuring method for three-dimensional coordinates

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003042726A (en) * 2001-08-03 2003-02-13 Topcon Corp Object for calibration
WO2003014664A1 (en) * 2001-08-03 2003-02-20 Topcon Corporation Calibration object
JP2006301991A (en) * 2005-04-21 2006-11-02 Pulstec Industrial Co Ltd Correction method of coordinate transformation function
JP2007064836A (en) * 2005-08-31 2007-03-15 Kyushu Institute Of Technology Algorithm for automating camera calibration
JP2010528318A (en) * 2007-05-29 2010-08-19 コグネックス・テクノロジー・アンド・インベストメント・コーポレーション 3D assembly inspection with 2D images
WO2013035554A1 (en) 2011-09-07 2013-03-14 日東電工株式会社 Method for detecting motion of input body and input device using same

Also Published As

Publication number Publication date
JP2970835B2 (en) 1999-11-02

Similar Documents

Publication Publication Date Title
US10825198B2 (en) 3 dimensional coordinates calculating apparatus, 3 dimensional coordinates calculating method, 3 dimensional distance measuring apparatus and 3 dimensional distance measuring method using images
US5757674A (en) Three-dimensional position detecting apparatus
CN108965690B (en) Image processing system, image processing apparatus, and computer-readable storage medium
JPH02264808A (en) Three-dimensional curved surface configuration measuring instrument
JPH10221072A (en) System and method for photogrammetry
US20210374978A1 (en) Capturing environmental scans using anchor objects for registration
JP2559939B2 (en) Three-dimensional information input device
JP3696336B2 (en) How to calibrate the camera
JP2970835B2 (en) 3D coordinate measuring device
JP4006296B2 (en) Displacement measuring method and displacement measuring apparatus by photogrammetry
JP3221384B2 (en) 3D coordinate measuring device
JP2003006618A (en) Method and device for generating three-dimensional model and computer program
JPH05196432A (en) Measuring equipment for three-dimensional coordinates
JPH10320558A (en) Calibration method, corresponding point search method and device therefor, focus distance detection method and device therefor, three-dimensional position information detection method and device therefor, and recording medium
JPH0875454A (en) Range finding device
JP2004117186A (en) Three-dimensional shape measuring device
JP2006317418A (en) Image measuring device, image measurement method, measurement processing program, and recording medium
JP3452188B2 (en) Tracking method of feature points in 2D video
JP3655065B2 (en) Position / attitude detection device, position / attitude detection method, three-dimensional shape restoration device, and three-dimensional shape restoration method
JP3340599B2 (en) Plane estimation method
JPH0252204A (en) Measuring instrument for three-dimensional coordinate
KR100395773B1 (en) Apparatus for measuring coordinate based on optical triangulation using the images
Uyanik et al. A method for determining 3D surface points of objects by a single camera and rotary stage
JPH0367566B2 (en)
JP3099780B2 (en) Three-dimensional shape generation method and apparatus