JPH05164517A - Measuring method for three-dimensional coordinates - Google Patents
Measuring method for three-dimensional coordinates
- Publication number
- JPH05164517A (application JP33493491A)
- Authority
- JP
- Japan
- Legal status
- Granted
Landscapes
- Length Measuring Devices By Optical Means (AREA)
- Measurement Of Optical Distance (AREA)
Abstract
Description
[0001]
[Field of Industrial Application] The present invention relates to three-dimensional coordinate measurement by the stereo method, which uses the parallax between a plurality of cameras.
[0002]
[Prior Art] The stereo method has conventionally been used as one method of measuring the three-dimensional coordinates of an object. FIG. 1 schematically shows an example of three-dimensional coordinate measurement by the stereo method.
[0003] A subject 6 is photographed by two CCD cameras 2 and 4 separated from each other by a predetermined distance d. If transformation formulas can be obtained that relate the two-dimensional coordinates of the image points of a point P on the subject 6, formed on the respective imaging planes of the two CCD cameras 2 and 4, to the actual three-dimensional coordinates of the point P (referred to as its world coordinates), then the world coordinates of the point P can be computed from the two-dimensional coordinates of those image points.
[0004] FIG. 2 illustrates how these transformation formulas are obtained. A point P in three-dimensional space (the target point) is determined by the intersection of the two straight lines 3 and 5 (see FIG. 1), and its coordinates can be described in an arbitrarily chosen world coordinate system. Accordingly, the lines of sight of the two cameras 2 and 4 are expressed as equations, and solving these equations simultaneously yields their intersection, that is, the coordinates of the target point P. Here the imaging optics formed by the camera lens are modeled, and this model is called the perspective transformation model. The parameters of a camera include its position, orientation, and angle of view.
[0005] FIG. 2 shows the perspective transformation model based on a pinhole camera. In this idealized model the camera is a pinhole opened at the center of the lens plane, and a line of sight is defined as a single straight line passing through this point. An ordinary imaging system using a glass lens can also be represented by this simple model when its distortion is small enough to be ignored. An actual camera is arranged in the order object, lens, image plane; since the image is then inverted and hard to interpret, the image plane is here placed virtually in front of the lens, in the order object, image plane, lens. Because the image plane is taken as the reference of the coordinate system fixed to the camera, the center of the image plane I is taken as the origin of that coordinate system.
[0006] The point P'(Xc, Yc, Zc) obtained by projecting a point P(X, Y, Z) in space onto the image plane I, that is, the intersection of the line of sight toward the measurement point P with the image plane, is given by
[0007]
[Equation 1]
(Xc, Yc, Zc) = (αX, αY, 0)   ...(1)
[0008] where α ≡ f/(f+Z). Equivalently, this can be written
Xc = fX/(f+Z),  Yc = fY/(f+Z),  Zc = 0   ...(2)
The perspective transformation is thus a nonlinear transformation, but it can be linearized by adding one variable that mediates the three-dimensional coordinates and using a representation one dimension higher. This representation is called a homogeneous coordinate system. Representing a three-dimensional point (X, Y, Z) by a four-dimensional point (Xh, Yh, Zh, Wh), with Wh as the mediating variable, as follows, gives the homogeneous representation.
[0009]
X = Xh/Wh,  Y = Yh/Wh,  Z = Zh/Wh   ...(3)
With this homogeneous coordinate system, the perspective transformation becomes
[0010]
[Equation 2]
[Xch]   [ f  0  0  0 ] [Xh]
[Ych] = [ 0  f  0  0 ] [Yh]   ...(4)
[Zch]   [ 0  0  0  0 ] [Zh]
[Wch]   [ 0  0  1  f ] [Wh]
[0011] that is, it can be described by a 4x4 matrix operation. The perspective transformation above applies when the points P and P' are both expressed in the coordinate system fixed to the camera. In general, however, the point P to be measured is expressed in the world coordinate system, while the point P' is expressed in the camera coordinate system, that is, the coordinate system viewed from the camera with the camera center as its origin. The transformation T that relates these two coordinate systems includes, in the homogeneous representation, a rotation and a translation:
[0012]
[Equation 3]
    [ r11  r12  r13  tx ]
T = [ r21  r22  r23  ty ]   ...(5)
    [ r31  r32  r33  tz ]
    [  0    0    0    1 ]
[0013] The transformation from the point P in the world coordinate system to the point P' in the camera coordinate system can then be written
[0014]
[Equation 4]
(Xch, Ych, Zch, Wch)ᵀ = M · T · (Xh, Yh, Zh, Wh)ᵀ   ...(6)
where M denotes the 4x4 perspective transformation matrix of Equation (4).
[0015] Since the image plane in the camera coordinate system satisfies Zch = 0, the above expression can be simplified in terms of the two-dimensional coordinates (Xc, Yc) on the image plane, that is, the pixel position in the input image:
[0016]
[Equation 5]
[hXc]   [C11 C12 C13 C14] [X]
[hYc] = [C21 C22 C23 C24] [Y]   ...(7)
[ h ]   [C31 C32 C33 C34] [Z]
                          [1]
[0017] This 3x4 matrix C is the camera parameter matrix (h is the homogeneous scale factor). A method of calibrating these parameters is described next. FIG. 3 shows a model of the calibration body used for calibrating the camera parameters.
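As a numerical illustration of the projection of Equation (7), the following minimal Python sketch applies a 3x4 camera matrix C to a world point in homogeneous form; the entries of C and the test point are made-up values, not taken from the patent.

```python
import numpy as np

# Illustrative 3x4 camera parameter matrix (C34 = 1); values are assumptions.
C = np.array([[2.0, 0.0, 0.0, 0.1],
              [0.0, 2.0, 0.0, 0.2],
              [0.0, 0.0, 0.5, 1.0]])

P_world = np.array([0.4, -0.6, 0.8, 1.0])   # world point (X, Y, Z, 1)
h = C @ P_world                             # gives (h*Xc, h*Yc, h)
Xc, Yc = h[0] / h[2], h[1] / h[2]           # divide out the scale factor h
print(Xc, Yc)                               # Xc = 0.9/1.4, Yc = -1.0/1.4
```

Dividing the first two components by the third implements the homogeneous scale division, so a single matrix product covers both the rigid transformation and the perspective division of the earlier equations.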
[0018] To obtain the camera parameters, a calibration body of known three-dimensional shape, such as that shown in FIG. 3, is normally used; that is, an object providing reference points whose world coordinates are known, such as points 1 to 8 in FIG. 3, is measured three-dimensionally and the parameters are calibrated from the result. Expanding and rearranging Equation (7) yields the two equations
C11X + C12Y + C13Z + C14 - C31XXc - C32YXc - C33ZXc - C34Xc = 0
C21X + C22Y + C23Z + C24 - C31XYc - C32YYc - C33ZYc - C34Yc = 0   ...(7-2)
[0019] Accordingly, to determine the twelve unknown camera parameters C11 through C34, it suffices to have six or more pairs of a reference point (X, Y, Z) in the world coordinate system and the corresponding position (Xc, Yc) in the camera coordinate system. In practice, to improve the accuracy of the calibration, many more than six reference points are used and the parameters are identified by the method of least squares. If the world coordinates (Xi, Yi, Zi) of n reference points and the corresponding camera coordinates (Xci, Yci) are known, then setting C34 = 1 gives
[0020]
[Equation 6]
A · C = R, where the 2n x 11 matrix A stacks, for each reference point i = 1, ..., n, the two rows
[ Xi  Yi  Zi  1  0   0   0   0  -XiXci  -YiXci  -ZiXci ]
[ 0   0   0   0  Xi  Yi  Zi  1  -XiYci  -YiYci  -ZiYci ]
the unknown vector is C = (C11, C12, C13, C14, C21, C22, C23, C24, C31, C32, C33)ᵀ, and the right-hand side R stacks the pairs (Xci, Yci).   ...(8)
[0021] Writing this as
A · C = R   ...(9)
the method of least squares gives
C = (AᵀA)⁻¹AᵀR   ...(10)
and the camera parameters are thereby calibrated.
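The least-squares calibration of Equations (9) and (10) can be sketched as follows. The "true" camera matrix and the reference-point coordinates below are synthetic values chosen for illustration, not data from the patent; the sketch projects known world points, rebuilds the stacked linear system with C34 = 1, and recovers the parameters.

```python
import numpy as np

# Illustrative "true" camera parameters, normalized so that C34 = 1.
C_true = np.array([[2.0, 0.3, 0.1, 0.1],
                   [0.2, 2.0, 0.1, -0.2],
                   [0.1, 0.0, 0.5, 1.0]])

# World coordinates of n = 10 (> 6) reference points on the calibration body.
rng = np.random.default_rng(0)
world = rng.uniform(-1.0, 1.0, size=(10, 3))

def project(C, X, Y, Z):
    """Equation (7): image coordinates (Xc, Yc) of a world point."""
    h = C @ np.array([X, Y, Z, 1.0])
    return h[0] / h[2], h[1] / h[2]

rows, rhs = [], []
for X, Y, Z in world:
    Xc, Yc = project(C_true, X, Y, Z)
    # Two linear equations per reference point, with C34 = 1.
    rows.append([X, Y, Z, 1, 0, 0, 0, 0, -X * Xc, -Y * Xc, -Z * Xc])
    rhs.append(Xc)
    rows.append([0, 0, 0, 0, X, Y, Z, 1, -X * Yc, -Y * Yc, -Z * Yc])
    rhs.append(Yc)

A, R = np.array(rows), np.array(rhs)
# Equation (10): C = (A^T A)^-1 A^T R, the least-squares solution of A.C = R.
C_vec = np.linalg.solve(A.T @ A, A.T @ R)
C_est = np.append(C_vec, 1.0).reshape(3, 4)
print(np.allclose(C_est, C_true))  # the 11 free parameters are recovered
```

With noiseless synthetic projections the normal equations reproduce the parameters exactly; with real measurements the same solve yields the least-squares estimate.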
[0022] After the camera parameters have been obtained as described above, consider now the case where an object to be measured (a target) whose three-dimensional coordinates are unknown is placed in the field of view of the cameras in place of the calibration body, and the three-dimensional coordinates of the target (X, Y, Z) (that is, the target coordinates in the world coordinate system) are to be calculated from the target coordinates (Xc, Yc) in the camera coordinate system obtained by imaging.
[0023] Expanding and rearranging Equation (7) gives
(C11 - C31Xc)X + (C12 - C32Xc)Y + (C13 - C33Xc)Z = C34Xc - C14,
(C21 - C31Yc)X + (C22 - C32Yc)Y + (C23 - C33Yc)Z = C34Yc - C24   ...(7-3)
Since these are two equations in the three unknowns X, Y, and Z, they cannot be solved uniquely; the solution is a straight line, and all that can be concluded is that the target lies somewhere on this line.
[0024] Therefore, the measurement result of another camera, placed at a different position in space, is used at the same time. The camera parameters B11 through B34 of this camera are also calibrated in advance. If the target coordinates obtained in this camera's coordinate system are (Xb, Yb), the following equations hold:
(B11 - B31Xb)X + (B12 - B32Xb)Y + (B13 - B33Xb)Z = B34Xb - B14,
(B21 - B31Yb)X + (B22 - B32Yb)Y + (B23 - B33Yb)Z = B34Yb - B24   ...(11)
Collecting Equations (7-3) and (11) in matrix form gives
[0025]
[Equation 7]
[C34Xc - C14]   [C11-C31Xc  C12-C32Xc  C13-C33Xc] [X]
[C34Yc - C24] = [C21-C31Yc  C22-C32Yc  C23-C33Yc] [Y]   ...(12)
[B34Xb - B14]   [B11-B31Xb  B12-B32Xb  B13-B33Xb] [Z]
[B34Yb - B24]   [B21-B31Yb  B22-B32Yb  B23-B33Yb]
[0026] and, writing
[0027]
[Equation 8]
F for the four-component vector on the left-hand side of (12), Q for the 4 x 3 matrix on the right-hand side, and V = (X, Y, Z)ᵀ   ...(13)
[0028] Equation (12) can be expressed in the matrix form
F = Q · V   ...(14)
Therefore, if an inverse of Q exists (Q being 4 x 3, this is to be understood as a generalized, least-squares inverse), then V = Q⁻¹F, and the target coordinates in the world coordinate system are obtained.
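The two-camera triangulation of Equations (12) through (14) can be sketched as below. Since Q is 4 x 3, the "inverse" is computed as a least-squares solution; the camera matrices C and B and the target point are illustrative stand-ins, not values from the patent.

```python
import numpy as np

# Illustrative calibrated camera matrices for the two cameras (assumed values).
C = np.array([[2.0, 0.0, 0.0, 0.1],
              [0.0, 2.0, 0.0, 0.2],
              [0.0, 0.0, 0.5, 1.0]])
B = np.array([[1.8, 0.0, 0.4, 0.0],
              [0.0, 1.9, 0.1, 0.3],
              [0.1, 0.0, 0.6, 1.0]])

def image_point(P, p):
    """Project a world point with a 3x4 camera matrix (Equation (7))."""
    h = P @ np.append(p, 1.0)
    return h[0] / h[2], h[1] / h[2]

target = np.array([0.3, -0.2, 0.5])     # unknown world point to recover
Xc, Yc = image_point(C, target)         # observation in camera 1
Xb, Yb = image_point(B, target)         # observation in camera 2

# Stack Equations (7-3) and (11) into the system Q.V = F of Equation (14).
Q = np.array([
    [C[0, 0] - C[2, 0] * Xc, C[0, 1] - C[2, 1] * Xc, C[0, 2] - C[2, 2] * Xc],
    [C[1, 0] - C[2, 0] * Yc, C[1, 1] - C[2, 1] * Yc, C[1, 2] - C[2, 2] * Yc],
    [B[0, 0] - B[2, 0] * Xb, B[0, 1] - B[2, 1] * Xb, B[0, 2] - B[2, 2] * Xb],
    [B[1, 0] - B[2, 0] * Yb, B[1, 1] - B[2, 1] * Yb, B[1, 2] - B[2, 2] * Yb],
])
F = np.array([C[2, 3] * Xc - C[0, 3], C[2, 3] * Yc - C[1, 3],
              B[2, 3] * Xb - B[0, 3], B[2, 3] * Yb - B[1, 3]])

# Least-squares solve stands in for V = Q^-1 F, since Q is not square.
V, *_ = np.linalg.lstsq(Q, F, rcond=None)
print(V)  # world coordinates of the target
```

With exact observations the four equations are consistent and the solve returns the target point; with pixel noise the same call returns the least-squares intersection of the two lines of sight.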
[0029]
[Problem to Be Solved by the Invention] With the above technique, using two cameras, the three-dimensional coordinates of an object to be measured (a target) can be obtained. A problem remains, however, for an object that moves relative to a reference body which itself moves with respect to the world coordinate system: how is the movement of the object relative to that reference body to be measured, for example the movement of the lower jaw relative to the upper jaw when the subject is a human head?
[0030] In view of the above circumstances, the object of the present invention is to provide a method that uses the stereo method to obtain the coordinates, relative to a reference body, of an object to be measured that moves relative to that reference body, the reference body in turn moving with respect to a world coordinate system defined by means of a calibration body.
[0031]
[Means for Solving the Problem] To achieve the above object, the three-dimensional coordinate measuring method of the present invention is a method for obtaining the three-dimensional coordinates of an object to be measured relative to a reference body, characterized in that: at least six points on one and the same calibration body are observed from mutually different angles with a plurality of cameras to obtain transformation formulas between the two-dimensional coordinates on the imaging plane of each camera and the three-dimensional coordinates of the calibration body; the reference body and the object to be measured are placed within the common field of view of the plurality of cameras; a reference coordinate system is established on the reference body by obtaining the three-dimensional coordinates of at least three points on the reference body; and the position, in the reference coordinate system, of a predetermined point on the object to be measured is obtained by obtaining the three-dimensional coordinates of that predetermined point.
[0032] Here, "on the calibration body", "on the reference body", and "on the object to be measured" do not necessarily mean the surface of the calibration body (or of the reference body or the object to be measured); they mean any point, including surface points, that does not move relative to the calibration body (or the reference body, or the object to be measured).
[0033]
[Operation] As described above, once the transformation formulas have been obtained using the calibration body, obtaining the three-dimensional coordinates of at least three points on the reference body makes it possible to define a reference coordinate system on the reference body. Once this reference coordinate system is defined, the camera coordinates of a predetermined point on the object to be measured, which moves relative to the reference body, can be converted into the reference coordinates via the world coordinates, and the position of the predetermined point in the reference coordinate system is thereby obtained.
[0034]
[Embodiments] Embodiments of the present invention are described below. FIG. 4 schematically shows three light emitters 10, 12, and 14 fixed to the upper jaw of a human body and one light emitter 16 fixed to the lower jaw. As shown in FIG. 1, the two cameras 2 and 4 are arranged so as to view the same field from mutually different directions, and a calibration body such as that shown in FIG. 3 is first placed as the subject 6 shown in FIG. 1 to calibrate the camera parameters.
[0035] Next, in place of the calibration body, a human head is placed within the fields of view of the CCD cameras 2 and 4. The three light emitters 10, 12, and 14 shown in FIG. 4 are fixed to the upper jaw of this head, and the light emitter 16 is fixed to the lower jaw. The three-dimensional coordinates of each of the three light emitters 10, 12, and 14 fixed to the upper jaw are then obtained as described above.
[0036] Once these three-dimensional coordinates have been obtained, a coordinate system is defined as follows: the coordinate point of light emitter 10 is taken as the origin O; the z-axis is the axis passing through the midpoint of the straight line connecting the two light emitters 12 and 14; the x-axis lies in the plane formed by the three light emitters 10, 12, and 14, in the direction orthogonal to the z-axis; and the y-axis is taken in the direction orthogonal to both the z-axis and the x-axis. A reference coordinate system based on the upper jaw (the upper jaw coordinate system) is thereby defined.
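The construction of this reference frame from three measured points, and the conversion of a fourth measured point into it, can be sketched as follows; the emitter coordinates are made-up sample values, not measurements from the patent.

```python
import numpy as np

p10 = np.array([0.0, 0.0, 0.0])   # emitter 10: origin of the reference frame
p12 = np.array([3.0, 1.0, 4.0])   # emitter 12
p14 = np.array([1.0, 3.0, 4.0])   # emitter 14

def jaw_frame(p10, p12, p14):
    """Return the origin and unit x, y, z axes of the reference system."""
    z = (p12 + p14) / 2.0 - p10        # toward the midpoint of segment 12-14
    z /= np.linalg.norm(z)
    in_plane = p12 - p10               # a second direction in the emitter plane
    x = in_plane - (in_plane @ z) * z  # component in the plane, orthogonal to z
    x /= np.linalg.norm(x)
    y = np.cross(z, x)                 # orthogonal to both z and x
    return p10, x, y, z

def to_reference(p, frame):
    """Express a world-coordinate point in the reference coordinate system."""
    origin, x, y, z = frame
    d = p - origin
    return np.array([d @ x, d @ y, d @ z])

frame = jaw_frame(p10, p12, p14)
p16 = np.array([2.0, 2.0, 1.0])        # emitter 16 on the lower jaw
print(to_reference(p16, frame))        # emitter 16 in the upper jaw system
```

Because the frame is rebuilt from the three upper-jaw emitters at every measurement, any motion of the head cancels out, and only the motion of emitter 16 relative to the upper jaw remains in the converted coordinates.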
[0037] Next, the three-dimensional coordinates of the light emitter 16 fixed to the lower jaw are measured, and the coordinates of this point in the upper jaw coordinate system obtained as described above are computed. By adopting such a measuring method, the position and movement of the lower jaw relative to the upper jaw can be detected even when, for example, the entire head (and hence the upper jaw) moves. Although only the single light emitter 16 is fixed to the lower jaw here, a plurality of light emitters may be fixed to it and the position coordinates of each obtained in turn.
[0038] It is sometimes required to know the movement, relative to the upper jaw, of the condylar point, the joint between the upper and lower jaws located near the ear. In this case the present invention can be developed further: a coordinate system is also defined for the lower jaw, the position of the condylar point in the lower jaw coordinate system is determined, and the position of the origin of the lower jaw coordinate system with respect to the upper jaw coordinate system is determined. The position of the condylar point in the upper jaw coordinate system (the relative movement of the condylar point with respect to the upper jaw) can thereby be known.
[0039] FIG. 5 shows another embodiment of the present invention. A fuel pipe 32 is connected to the engine 30 of an automobile. When the engine 30 is run and generates heat, the fuel pipe 32 expands and deforms, and it is required to know the degree of this expansion and deformation. When the engine runs, however, the engine 30 vibrates, so the expansion and deformation of the fuel pipe 32 relative to the engine 30, with this vibration subtracted out, must be measured.
[0040] The present invention applies to this case as well: by observing, with the two CCD cameras 2 and 4 from mutually different directions, the three light emitters 20, 22, and 24 fixed to the engine 30 and the light emitter 26 fixed to the fuel pipe 32, the movement (expansion, deformation, and so on) of the fuel pipe 32 relative to the engine 30 can be obtained. In each of the above embodiments the points observed by the cameras 2 and 4 were described as having light emitters fixed to them, but light emitters are not essential; it suffices that the cameras 2 and 4 can be confirmed to be observing one and the same point.
[0041]
[Effects of the Invention] As described above, in the three-dimensional coordinate measuring method of the present invention, a reference coordinate system is established on the reference body by obtaining the three-dimensional coordinates of at least three points on the reference body, and the position of a predetermined point on the object to be measured in that reference coordinate system is obtained by obtaining the three-dimensional coordinates of the predetermined point. Consequently, the position and movement of the object to be measured relative to the reference body are obtained even when the reference body itself moves.
[FIG. 1] A diagram schematically showing an example of three-dimensional coordinate measurement by the stereo method.
[FIG. 2] A diagram showing the perspective transformation model based on a pinhole camera.
[FIG. 3] A diagram showing a model of the calibration body used for calibrating the camera parameters.
[FIG. 4] A diagram schematically showing the three light emitters fixed to the upper jaw of a human body and the one light emitter fixed to the lower jaw.
[FIG. 5] A diagram showing another embodiment of the present invention.
[Reference Numerals]
2, 4: CCD cameras
6: subject
10, 12, 14, 16: light emitters
20, 22, 24, 26: light emitters
30: engine
32: fuel pipe
Claims (1)
1. A three-dimensional coordinate measuring method for obtaining the relative three-dimensional coordinates of an object to be measured with respect to a reference body, comprising: observing at least six points on one and the same calibration body from mutually different angles with a plurality of cameras to obtain transformation formulas between the two-dimensional coordinates on the imaging plane of each camera and the three-dimensional coordinates of the calibration body; placing the reference body and the object to be measured within the common field of view of the plurality of cameras; obtaining a reference coordinate system on the reference body by obtaining the three-dimensional coordinates of at least three points on the reference body; and obtaining the position, in the reference coordinate system, of a predetermined point on the object to be measured by obtaining the three-dimensional coordinates of the predetermined point.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP03334934A JP3118047B2 (en) | 1991-12-18 | 1991-12-18 | 3D coordinate measurement method |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP03334934A JP3118047B2 (en) | 1991-12-18 | 1991-12-18 | 3D coordinate measurement method |
Publications (2)
Publication Number | Publication Date |
---|---|
JPH05164517A true JPH05164517A (en) | 1993-06-29 |
JP3118047B2 JP3118047B2 (en) | 2000-12-18 |
Family
ID=18282869
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
JP03334934A Expired - Fee Related JP3118047B2 (en) | 1991-12-18 | 1991-12-18 | 3D coordinate measurement method |
Country Status (1)
Country | Link |
---|---|
JP (1) | JP3118047B2 (en) |
- 1991-12-18: JP application JP03334934A filed; patented as JP3118047B2 (status: not active, Expired - Fee Related)
Cited By (9)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US5805287A (en) * | 1993-05-24 | 1998-09-08 | Metronor As | Method and system for geometry measurements |
JPH0953914A (en) * | 1995-08-14 | 1997-02-25 | Nec Corp | Three-dimensional coordinate measuring instrument |
JP2003042726A (en) * | 2001-08-03 | 2003-02-13 | Topcon Corp | Object for calibration |
WO2003014664A1 (en) * | 2001-08-03 | 2003-02-20 | Topcon Corporation | Calibration object |
JP2007064836A (en) * | 2005-08-31 | 2007-03-15 | Kyushu Institute Of Technology | Algorithm for automating camera calibration |
JP2007219765A (en) * | 2006-02-15 | 2007-08-30 | Toyota Motor Corp | Image processor, method therefor, and imaging processing program |
JP2009243984A (en) * | 2008-03-29 | 2009-10-22 | Tokyo Electric Power Co Inc:The | Calibration data generation method, calibration data generation device, and computer program |
JP2017169627A (en) * | 2016-03-18 | 2017-09-28 | 株式会社東芝 | X-ray imaging apparatus alignment adjustment support device, method, and program |
CN106247975A (en) * | 2016-07-11 | 2016-12-21 | 大连理工大学 | A kind of lift vision measurement car |
Also Published As
Publication number | Publication date |
---|---|
JP3118047B2 (en) | 2000-12-18 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| A01 | Written decision to grant a patent or to grant a registration (utility model) | JAPANESE INTERMEDIATE CODE: A01; Effective date: 20000926 |
| R250 | Receipt of annual fees | JAPANESE INTERMEDIATE CODE: R250 |
| R250 | Receipt of annual fees | JAPANESE INTERMEDIATE CODE: R250 |
| R250 | Receipt of annual fees | JAPANESE INTERMEDIATE CODE: R250 |
| FPAY | Renewal fee payment (event date is renewal date of database) | PAYMENT UNTIL: 20071006; Year of fee payment: 7 |
| FPAY | Renewal fee payment (event date is renewal date of database) | PAYMENT UNTIL: 20111006; Year of fee payment: 11 |
| LAPS | Cancellation because of no payment of annual fees | |