JP2004191051A - Three-dimensional shape measuring method and its apparatus - Google Patents


Info

Publication number
JP2004191051A
Authority
JP
Japan
Prior art keywords
coordinate system
measured
shape
target
partial
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2002355403A
Other languages
Japanese (ja)
Other versions
JP4220768B2 (en)
Inventor
Teruaki Yogo
照明 與語
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Individual
Original Assignee
Individual
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Individual filed Critical Individual
Priority to JP2002355403A
Publication of JP2004191051A
Application granted
Publication of JP4220768B2
Anticipated expiration
Legal status: Expired - Fee Related

Abstract

PROBLEM TO BE SOLVED: To provide a three-dimensional shape measuring method that can easily measure even a large object in three dimensions.

SOLUTION: A partial shape of the object to be measured 1 is optically measured in three dimensions in a partial coordinate system by a shape measuring device 2, and at least three reference points MP1-MP3 whose three-dimensional positional relationship to the partial coordinate system is known are provided on the shape measuring device 2. The position of each reference point MP1-MP3 is optically measured in three dimensions in a whole coordinate system by a position measuring device 14. Based on the reference points MP1-MP3 measured by the position measuring device 14, measurement values in the partial coordinate system are converted into measurement values in the whole coordinate system. The shape measuring device 2 is then moved, and the measurement of partial shapes of the object 1 at other locations, together with the measurement of the reference points MP1-MP3 at each location, is repeated to measure the shape of the object to be measured 1.

COPYRIGHT: (C) 2004, JPO&NCIPI

Description

【0001】
【発明の属する技術分野】
本発明は、被測定物の外形形状を3次元で測定する3次元形状測定装置に関する。
【0002】
【従来の技術】
従来より、モアレトポグラフィに代表されるような縞を利用し、被測定物上に縞状パターンを投影し、被測定物の形状により変形した縞模様と基準の縞模様を重ね合わせ、その差周波数として生じる等高線を示すモアレ縞を解析することにより、被測定物の3次元形状を測定するものとして、例えば、特開昭53−68267号公報や特開昭61−260107号公報にあるような装置が知られている。
【0003】
このような装置では、被測定物上にできた縞模様をCCDカメラにより撮像して、解析している。
【0004】
【発明が解決しようとする課題】
しかしながら、こうした従来のものでは、被測定物が小さい場合は、被測定物全体をCCDカメラで撮像して解析しても、十分な測定精度が得られるが、被測定物が大きい場合、一度で被測定物全体を撮像したのでは、十分な解像度が得られない場合がある。
【0005】
そのような場合には、被測定物の一部を撮像して3次元形状を測定し、次に、場所を変えて、被測定物の一部を撮像して3次元形状を測定し、これを繰り返して全体の3次元形状を測定しているが、分けて測定した測定値を一つの基準座標系の測定値に合成する作業が繁雑であるという問題があった。
【0006】
本発明の課題は、大きな被測定物であっても容易に3次元測定できる3次元形状測定方法及びその装置を提供することにある。
【0007】
【課題を解決するための手段】
かかる課題を達成すべく、本発明は課題を解決するため次の手段を取った。即ち、
被測定物の部分外形形状を形状測定手段により部分座標系で光学的に3次元測定すると共に、前記部分座標系との3次元位置関係が既知のターゲットを前記形状測定手段に設け、前記ターゲットを位置測定手段により全体座標系で光学的に3次元測定し、前記ターゲットの前記部分座標系での3次元位置と前記ターゲットの前記全体座標系での測定値との関係に基づいて前記被測定物の前記部分座標系での測定値を前記全体座標系の測定値に変換し、前記形状測定手段を移動させて、前記被測定物の他箇所の部分外形形状の測定と、その位置での前記ターゲットの測定とを繰り返し、前記被測定物の外形形状を測定することを特徴とする3次元形状測定方法がそれである。
【0008】
前記ターゲットは少なくとも3つの基準点からなってもよい。また、前記基準点は発光体であってもよい。前記位置測定手段は、前記基準点の位置を三角測量で測定してもよい。また、前記位置測定手段は、一定間隔で固定された2台のカメラを備え、前記カメラにより前記基準点を撮像し、前記基準点の位置を三角測量で測定してもよい。前記ターゲットは3次元形状を有するものでもよい。あるいは、前記ターゲットは少なくとも3個の球体からなってもよい。前記位置測定手段は、前記ターゲットをモアレ縞を用いて測定してもよい。
【0009】
また、被測定物の部分外形形状を部分座標系で光学的に3次元測定する形状測定手段に、前記部分座標系との3次元位置関係が既知のターゲットを設け、
かつ、前記ターゲットを全体座標系で光学的に3次元測定する位置測定手段と、前記形状測定手段を移動する移動手段とを設け、
また、前記ターゲットの前記部分座標系での3次元位置と前記ターゲットの前記全体座標系での測定値との関係に基づいて前記被測定物の前記部分座標系での測定値を前記全体座標系の測定値に変換する変換手段を設け、前記移動手段により前記形状測定手段を移動させて、前記被測定物の他箇所の部分外形形状の測定と、その位置での前記ターゲットの測定とを繰り返し、前記被測定物の外形形状を測定することを特徴とする3次元形状測定装置がそれである。
【0010】
【発明の実施の形態】
以下本発明の実施の形態を図面に基づいて詳細に説明する。
図1に示すように、1は被測定物であり、例えば、乗用車の車体等の大型の3次元形状の外形を有するものである。この被測定物1の3次元形状を測定する形状測定器2を備え、この形状測定器2は、被測定物1の部分外形形状を光学的に測定するものである。形状測定器2は、図2に示すように、CCDカメラ4とフリンジプロジェクター6とを備えている。
【0011】
形状測定器2は、フリンジプロジェクター6が複数の格子を被測定物1の表面に投影し、CCDカメラ4がこの被測定物1の外形形状に応じて変形した格子を撮像する。そして、形状測定器2は、この変形格子と基準格子とに基づいて、被測定物1の3次元形状を測定するものである。
【0012】
形状測定器2は、CCDカメラ4により撮像する領域Sが広くなれば測定精度が粗くなり、撮像する領域Sが狭くなれば測定精度が精密になる。従って、必要な測定精度を得ようとすると、CCDカメラ4により撮像する領域Sの面積が限定され、被測定物1の部分外形形状を測定することになる。形状測定器2は、3次元の部分座標系に基づいた測定値を出力する。部分座標系は、形状測定器2が有する座標系であり、形状測定器2を移動させたときには、部分座標系も同時に移動する。
【0013】
また、形状測定器2には、図1に示すように、ターゲットとしての3個の第1〜第3基準点MP1〜MP3が一体的に設けられている。第1〜第3基準点MP1〜MP3は発光ダイオード等を用いた発光体であってもよく、あるいは、マークであってもよい。この基準点MP1〜MP3は、予め部分座標系における3次元位置関係が測定されて、その位置(Pc1,Pc2,Pc3)は既知となっている。例えば、第1基準点MP1を部分座標系の原点Pc1(0,0,0)に設け、第2基準点MP2を部分座標系のX軸上の点Pc2(a,0,0)に設け、第3基準点MP3を部分座標系のY軸上の点Pc3(0,b,0)に設けてもよい。
【0014】
更に、本実施形態では、第1ナビゲーションカメラ10と第2ナビゲーションカメラ12とを備えた位置測定器14が設けられている。第1ナビゲーションカメラ10と第2ナビゲーションカメラ12とにより、第1〜第3基準点MP1〜MP3を撮像することができる位置に配置されている。
【0015】
3次元の全体座標系における第1ナビゲーションカメラ10と第2ナビゲーションカメラ12との位置は既知であり、第1ナビゲーションカメラ10と第2ナビゲーションカメラ12との間隔Lは予め測定されている。第1ナビゲーションカメラ10により第1基準点MP1とのなす角度θ1が測定され、第2ナビゲーションカメラ12により第1基準点MP1とのなす角度θ2が測定される。これらの角度θ1,θ2、間隔Lから、三角測量により第1基準点MP1の全体座標系における3次元位置の測定値が得られる。
【0016】
同様にして、第2基準点MP2及び第3基準点MP3の全体座標系における3次元位置の測定値が得られる。その際、各基準点MP1〜MP3に発光体を用いた場合、順番に発光させることにより、各基準点MP1〜MP3を容易に判別できる。あるいは、各基準点MP1〜MP3の認識は、発光体を同時に発光して形状認識することにより、認識してもよい。
【0017】
尚、位置測定器14は、第1ナビゲーションカメラ10と第2ナビゲーションカメラ12との2つのカメラを用いたものに限らず、レーザ距離計を用いて、第1〜第3基準点MP1〜MP3の全体座標系における3次元位置の測定値Pc1〜Pc3を得るようにしたものでもよい。
【0018】
図1に示すように、形状測定器2、第1ナビゲーションカメラ10、第2ナビゲーションカメラ12は、変換制御装置16に接続されており、変換制御装置16に形状測定器2からの部分座標系による形状測定値が入力されると共に、位置測定器14とから第1〜第3基準点MP1〜MP3の全体座標系における位置測定値が入力される。
【0019】
前述した形状測定器2は、本実施形態では、図3に示すように、多関節型のロボットを用いた移動装置18に取り付けられている。この移動装置18により形状測定器2を移動させて、形状測定器2により被測定物1の外形形状を測定できる。移動装置18は、形状測定器2を平行移動させるものに限らず、形状測定器2により被測定物1の外形形状を測定できるように移動できればよい。
【0020】
次に、変換制御装置16で行われる測定処理について、図4のフローチャートによって説明する。
まず、測定を開始するのか否かを判断する(ステップ100)。測定をするのであれば、次に、形状測定器2により被測定物1の部分外形形状を測定する(ステップ110)。形状測定器2は、図1に示すように、被測定物1の領域S1の部分外形形状を部分座標系で3次元測定する。その測定値Ps1(s1xn,s1yn,s1zn)を変換制御装置16に出力する。添字nはn個の測定点があることを示す。
【0021】
続いて、第1ナビゲーションカメラ10と第2ナビゲーションカメラ12とにより、形状測定器2がステップ110の処理により測定したときの第1〜第3基準点MP1〜MP3の全体座標系における3次元の位置Pg11〜Pg31を測定する(ステップ120)。第1〜第3基準点MP1〜MP3の位置を測定し、その測定値(Pg11,Pg21,Pg31)を変換制御装置16に出力する。
【0022】
次に、ステップ120の測定値に基づいて、座標変換係数R1を算出する(ステップ130)。座標変換係数R1は、部分座標系での既知の第1〜第3基準点MP1〜MP3の位置座標(Pc1,Pc2,Pc3)と、ステップ120の処理により測定した第1〜第3基準点MP1〜MP3の位置座標(Pg11,Pg21,Pg31)とに基づいて下記式により算出される。
【0023】
Pg11=[R1]×Pc1
Pg21=[R1]×Pc2
Pg31=[R1]×Pc3
ここで、[R1]はマトリックスである。
【0024】
続いて、被測定物1の領域S1の全体座標系における測定値P1を算出する(ステップ140)。全体座標系における測定値P1は、前記ステップ110の処理により測定した部分座標系の測定値Ps1を、ステップ130の処理により算出した座標変換係数R1に基づいて変換することにより下記式により算出される。
【0025】
P1=Ps1×[R1]
次に、測定が終了したか否かを判断し(ステップ150)、引き続いて測定する場合には、前記ステップ100以下の処理を繰り返す。そして、本実施形態では、移動装置18により、被測定物1の次の測定箇所に形状測定器2を移動する。移動の際は、平行移動でなくてもよく、第1ナビゲーションカメラ10と第2ナビゲーションカメラ12とにより第1〜第3基準点MP1〜MP3を撮像することができるように移動すればよい。そして、前述したと同様に、形状測定器2により被測定物1の領域S2の部分外形形状を部分座標系で測定する(ステップ110)。その測定値Ps2(s2xn,s2yn,s2zn)を変換制御装置16に出力する。
【0026】
次に、移動後の第1〜第3基準点MP1〜MP3の全体座標系における3次元の位置(Pg12,Pg22,Pg32)を測定する(ステップ120)。続いて、この測定値(Pg12,Pg22,Pg32)に基づいて、座標変換係数R2を算出する。
【0027】
Pg12=[R2]×Pc1
Pg22=[R2]×Pc2
Pg32=[R2]×Pc3
続いて、被測定物1の領域S2の全体座標系における測定値P2を算出する(ステップ140)。
【0028】
P2=Ps2×[R2]
前述した測定処理を繰り返し、k回目の測定でも同様に、形状測定器2により被測定物1の領域Skの部分外形形状を部分座標系で測定する(ステップ110)。その測定値Psk(skxn,skyn,skzn)を変換制御装置16に出力する。
【0029】
次に、移動後の第1〜第3基準点MP1〜MP3の全体座標系における3次元の位置(Pg1k,Pg2k,Pg3k)を測定する(ステップ120)。続いて、この測定値(Pg1k,Pg2k,Pg3k)に基づいて、座標変換係数Rkを算出する。
【0030】
Pg1k=[Rk]×Pc1
Pg2k=[Rk]×Pc2
Pg3k=[Rk]×Pc3
続いて、被測定物1の領域Skの全体座標系における測定値Pkを算出する(ステップ140)。
【0031】
Pk=Psk×[Rk]
ステップ140の処理により、同じ全体座標系での測定値となり、ステップ100以下の処理を被測定物1の全体を測定するまで繰り返し行うことにより、被測定物1の全体の外形形状を3次元で測定することができる。
【0032】
次に前述した実施形態と異なる第2実施形態について、図5〜図7によって説明する。尚、前述した実施形態と同じ部材、処理については同一番号を付して詳細な説明を省略する。
図5に示すように、形状測定器2には、ターゲット8が一体的に設けられており、ターゲット8は、多角形状の3次元形状に形成されている。ターゲット8は、本実施形態では、立方体状に形成されており、このターゲット8の左側面8L、正面8F、右側面8Rは、それぞれ特徴のある形状に形成されている。例えば、図6に示すように、左側面8Lには、三角形状の平面8Laが突出形成されており、この三角形状の平面8Laから傾斜した3つの斜面8Lb,8Lc,8Ldが形成されている。
【0033】
正面8F及び右側面8Rについても同様に、三角形状の平面8Fa,8Raが突出形成されており、この三角形状の平面8Fa,8Raから傾斜した3つの斜面8Fb,8Rb,8Fc,8Rc,8Fd,8Rdが形成されている。但し、各左側面8L、正面8F、右側面8Rでは三角形状の平面8La,8Fa,8Raの方向がそれぞれ異なり、この形状から左側面8L、正面8F、右側面8Rの違いを判別できるように形成されている。
【0034】
また、左側面8Lでの三角形状の平面8Laの頂点である第1〜第3基準点αL,βL,γLは予め部分座標系における3次元位置関係が測定されて、部分座標系での位置座標PcαL,PcβL,PcγLが既知となっている。正面8F、右側面8Rでの三角形状の平面8Fa,8Raの各頂点αF,αR,βF,βR,γF,γRも予め部分座標系における3次元位置関係が測定されて、部分座標系での3次元位置座標PcαF,PcαR,PcβF,PcβR,PcγF,PcγRが既知となっている。また、各三角形状の平面8La,8Fa,8Raの形状も予め測定されて既知となっている。
【0035】
更に、本実施形態では、CCDカメラ20とフリンジプロジェクター22とを備えた位置測定器24が設けられている。CCDカメラ20とフリンジプロジェクター22とは、前述した形状測定器2のCCDカメラ4とフリンジプロジェクター6と同様のものである。位置測定器24は、ターゲット8を撮像することができる位置に配置され、固定されている。位置測定器24は、全体座標系でターゲット8を撮像して、その3次元測定値を出力する。
【0036】
図5に示すように、形状測定器2、位置測定器24は、変換制御装置16に接続されており、変換制御装置16に形状測定器2からの部分座標系による形状測定値が入力されると共に、位置測定器24からターゲット8の全体座標系における測定値が入力される。前述した形状測定器2は、本実施形態では、図3に示すように、多関節型のロボットを用いた移動装置18に取り付けられている。
【0037】
次に、変換制御装置16で行われる測定処理について、図7のフローチャートによって説明する。
まず、測定を開始するのか否かを判断する(ステップ100)。測定をするのであれば、次に、形状測定器2により被測定物1の部分外形形状を測定する(ステップ110)。形状測定器2は、図5に示すように、被測定物1の領域S1の部分外形形状を部分座標系で3次元測定する。その測定値Ps1(s1xn,s1yn,s1zn)を変換制御装置16に出力する。添字nはn個の測定点があることを示す。
【0038】
続いて、形状測定器2がステップ110の処理により測定したときにおけるターゲット8を位置測定器24により撮像する(ステップ120a)。そして、その撮像結果から、形状測定器2の姿勢を判定する(ステップ120b)。姿勢の判定は、撮像したターゲット8の各三角形状の平面8La,8Fa,8Raの形状に基づいて行われる。
【0039】
例えば、図5に示すように、正面8Fの三角形状の平面8Faが、位置測定器24のほぼ正面にある場合には、位置測定器24は、正面8Fの三角形状の平面8Faの形状を正確に測定することができる。また、左側面8Lの三角形状の平面8Laが、位置測定器24のほぼ正面にある場合には、位置測定器24は、左側面8Lの三角形状の平面8Laの形状を正確に測定することができる。右側面8Rの三角形状の平面8Raの場合も同様である。これにより、形状測定器2の姿勢を判定して、後述するステップの処理による基準点を抽出する三角形状の平面8La,8Fa,8Raを決定する。
【0040】
次に、ターゲット8から第1〜第3基準点α,β,γを抽出する(ステップ120c)。抽出する第1〜第3基準点α,β,γは、ステップ120bの処理により判定された形状測定器2の姿勢から決定され、位置測定器24からほぼ正面となるターゲット8の左側面8L、正面8F、右側面8Rから選ばれる。
【0041】
選ばれた左側面8L、正面8F、右側面8Rの三角形状の平面8La,8Fa,8Raの頂点である第1〜第3基準点α,β,γが抽出され、その3次元座標が算出される。例えば、正面8Fの三角形状の平面8Faの頂点である第1〜第3基準点αF,βF,γFが抽出され、全体座標系での3次元座標値PgαF,PgβF,PgγFが算出される。
【0042】
次に、ステップ120cの3次元座標値PgαF,PgβF,PgγFに基づいて、座標変換係数R1を算出する(ステップ130)。座標変換係数R1は、部分座標系での既知の正面8Fの第1〜第3基準点αF,βF,γFの位置座標PcαF,PcβF,PcγFと、ステップ120cの処理により測定した全体座標系での第1〜第3基準点αF,βF,γFの位置座標PgαF,PgβF,PgγFとに基づいて下記式により算出される。
【0043】
PgαF=[R1]×PcαF
PgβF=[R1]×PcβF
PgγF=[R1]×PcγF
ここで、[R1]はマトリックスである。
【0044】
続いて、被測定物1の領域S1の全体座標系における測定値P1を算出する(ステップ140)。全体座標系における測定値P1は、前記ステップ110の処理により測定した部分座標系の測定値Ps1を、ステップ130の処理により算出した座標変換係数R1に基づいて変換することにより下記式により算出される。
【0045】
P1=Ps1×[R1]
次に、測定が終了したか否かを判断し(ステップ150)、引き続いて測定する場合には、前記ステップ100以下の処理を繰り返す。そして、本実施形態では、移動装置18により、被測定物1の次の測定箇所に形状測定器2を移動する。移動の際は、平行移動でなくてもよく、位置測定器24によりターゲット8を撮像することができるように移動すればよい。そして、前述したと同様に、形状測定器2により被測定物1の領域S2の部分外形形状を部分座標系で測定する(ステップ110)。その測定値Ps2(s2xn,s2yn,s2zn)を変換制御装置16に出力する。
【0046】
次に、移動後のターゲット8を撮像し(ステップ120a)、その結果から形状測定器2の姿勢を判定すると共に(ステップ120b)、全体座標系における第1〜第3基準点α,β,γを抽出する(ステップ120c)。例えば、図5に示すように、位置測定器24のほぼ正面にターゲット8の左側面8Lがある場合、左側面8Lの第1〜第3基準点αL,βL,γLを抽出する。
【0047】
続いて、この3次元座標値PgαL,PgβL,PgγLに基づいて、座標変換係数R2を算出する(ステップ130)。
PgαL=[R2]×PcαL
PgβL=[R2]×PcβL
PgγL=[R2]×PcγL
続いて、被測定物1の領域S2の全体座標系における測定値P2を算出する(ステップ140)。
【0048】
P2=Ps2×[R2]
前述した測定処理を繰り返し、k回目の測定でも同様に、形状測定器2により被測定物1の領域Skの部分外形形状を部分座標系で測定する(ステップ110)。その測定値Psk(skxn,skyn,skzn)を変換制御装置16に出力する。
【0049】
次に、移動後のターゲット8を位置測定器24により撮像し、形状測定器2の姿勢を判定して、第1〜第3基準点α,β,γの3次元座標値Pgα,Pgβ,Pgγを抽出する(ステップ120a,120b,120c)。続いて、座標変換係数Rkを算出する(ステップ130)。
【0050】
Pgα(L,F,R)k=[Rk]×Pcα(L,F,R)
Pgβ(L,F,R)k=[Rk]×Pcβ(L,F,R)
Pgγ(L,F,R)k=[Rk]×Pcγ(L,F,R)
続いて、被測定物1の領域Skの全体座標系における測定値Pkを算出する(ステップ140)。
【0051】
Pk=Psk×[Rk]
ステップ140の処理により、同じ座標系での測定値となり、ステップ100以下の処理を被測定物1の全体を測定するまで繰り返し行うことにより、被測定物1の全体の外形形状を3次元で測定することができる。
【0052】
次に、前述した実施形態と異なる第3実施形態について、図8によって説明する。本第3実施形態では、前述した実施形態とはターゲットが異なり、本ターゲットは、3個の球体30,31,32を備えている。この球体30,31,32は直径が予め測定されて既知であると共に、部分座標系での球体30,31,32の中心位置座標Pcα,Pcβ,Pcγが既知である。
【0053】
図9に示すように、このターゲットとしての球体30,31,32を位置測定器24により撮像して(ステップ120a)、球体30,31,32の中心位置の全体座標系における位置を抽出することにより、3つの基準点を抽出できる(ステップ120c)。尚、図9では、図7の処理と同じステップについては同一番号を付して詳細な説明を省略する。これにより、前述した実施形態と同様に、被測定物1の全体の外形形状を3次元で測定することができる。
【0054】
以上本発明はこの様な実施形態に何等限定されるものではなく、本発明の要旨を逸脱しない範囲において種々なる態様で実施し得る。
【0055】
【発明の効果】
以上詳述したように本発明の3次元形状測定方法及びその装置によると、被測定物が大きなものであっても、容易にその3次元形状を測定できるという効果を奏する。
【図面の簡単な説明】
【図1】本発明の一実施形態としての3次元形状測定装置の概略構成を示す説明図である。
【図2】本実施形態の形状測定器の概略構成を示す説明図である。
【図3】本実施形態の形状測定装置を移動する移動装置の正面図である。
【図4】本実施形態の変換制御装置において行われる測定処理の一例を示すフローチャートである。
【図5】第2実施形態としての3次元形状測定装置の概略構成を示す説明図である。
【図6】第2実施形態のターゲットの説明図である。
【図7】第2実施形態の変換制御装置において行われる測定処理の一例を示すフローチャートである。
【図8】第3実施形態のターゲットの説明図である。
【図9】第3実施形態の変換制御装置において行われる測定処理の一例を示すフローチャートである。
【符号の説明】
1…被測定物 2…形状測定器
4,20…CCDカメラ
6,22…フリンジプロジェクター
8…ターゲット
10…第1ナビゲーションカメラ
12…第2ナビゲーションカメラ
14,24…位置測定器 16…変換制御装置
18…移動装置 30,31,32…球体
[0001]
TECHNICAL FIELD OF THE INVENTION
The present invention relates to a three-dimensional shape measuring device for measuring the external shape of a measured object in three dimensions.
[0002]
[Prior art]
Conventionally, devices are known that measure the three-dimensional shape of an object by using fringes, as typified by moiré topography: a fringe pattern is projected onto the object to be measured, the fringe pattern deformed by the object's shape is superimposed on a reference fringe pattern, and the moiré fringes arising as their difference frequency, which indicate contour lines, are analyzed. Such devices are disclosed, for example, in JP-A-53-68267 and JP-A-61-260107.
[0003]
In such an apparatus, a stripe pattern formed on an object to be measured is imaged by a CCD camera and analyzed.
[0004]
[Problems to be solved by the invention]
However, with such conventional devices, when the object to be measured is small, sufficient measurement accuracy can be obtained by imaging and analyzing the entire object with a CCD camera. When the object is large, however, imaging the entire object at once may not provide sufficient resolution.
[0005]
In such a case, a part of the object is imaged to measure its three-dimensional shape, then another part is imaged at a different location, and this is repeated to measure the entire three-dimensional shape. However, there is a problem that the work of combining the separately obtained measurement values into the measurement values of a single reference coordinate system is cumbersome.
[0006]
An object of the present invention is to provide a three-dimensional shape measuring method, and an apparatus therefor, that can easily measure even a large object in three dimensions.
[0007]
[Means for Solving the Problems]
To achieve this object, the present invention takes the following means. That is:
A three-dimensional shape measuring method in which: the partial external shape of an object to be measured is optically measured in three dimensions in a partial coordinate system by a shape measuring means; a target whose three-dimensional positional relationship to the partial coordinate system is known is provided on the shape measuring means; the target is optically measured in three dimensions in a whole coordinate system by a position measuring means; the measurement values of the object in the partial coordinate system are converted into measurement values in the whole coordinate system based on the relationship between the target's three-dimensional position in the partial coordinate system and its measured values in the whole coordinate system; and the shape measuring means is moved so that the measurement of the partial external shape of another portion of the object and the measurement of the target at that position are repeated, thereby measuring the external shape of the object.
[0008]
The target may consist of at least three reference points, and the reference points may be light emitters. The position measuring means may measure the positions of the reference points by triangulation; for example, it may include two cameras fixed at a constant interval, image the reference points with the cameras, and measure their positions by triangulation. The target may instead have a three-dimensional shape, or may consist of at least three spheres. The position measuring means may also measure the target using moiré fringes.
[0009]
Also provided is a three-dimensional shape measuring apparatus in which: a shape measuring means that optically measures the partial external shape of an object to be measured in three dimensions in a partial coordinate system is provided with a target whose three-dimensional positional relationship to the partial coordinate system is known; a position measuring means that optically measures the target in three dimensions in a whole coordinate system, and a moving means that moves the shape measuring means, are provided; and a conversion means is provided that converts the measurement values of the object in the partial coordinate system into measurement values in the whole coordinate system based on the relationship between the target's three-dimensional position in the partial coordinate system and its measured values in the whole coordinate system. The moving means moves the shape measuring means so that the measurement of the partial external shape of another portion of the object and the measurement of the target at that position are repeated, thereby measuring the external shape of the object.
[0010]
BEST MODE FOR CARRYING OUT THE INVENTION
Hereinafter, embodiments of the present invention will be described in detail with reference to the drawings.
As shown in FIG. 1, reference numeral 1 denotes an object to be measured, which has a large three-dimensional external shape such as, for example, the body of a passenger car. A shape measuring device 2 is provided for measuring the three-dimensional shape of the object 1; this shape measuring device 2 optically measures a partial external shape of the object 1. As shown in FIG. 2, the shape measuring device 2 includes a CCD camera 4 and a fringe projector 6.
[0011]
In the shape measuring device 2, the fringe projector 6 projects a plurality of grating fringes onto the surface of the object 1, and the CCD camera 4 captures an image of the grating deformed according to the external shape of the object 1. The shape measuring device 2 then measures the three-dimensional shape of the object 1 based on this deformed grating and a reference grating.
[0012]
The measurement accuracy of the shape measuring device 2 becomes coarser as the region S imaged by the CCD camera 4 becomes wider, and finer as the imaged region S becomes narrower. Therefore, to obtain the required measurement accuracy, the area of the region S imaged by the CCD camera 4 is limited, so that a partial external shape of the object 1 is measured. The shape measuring device 2 outputs measurement values in a three-dimensional partial coordinate system. The partial coordinate system belongs to the shape measuring device 2; when the shape measuring device 2 is moved, the partial coordinate system moves with it.
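The trade-off described here — a wider imaged region S coarsens the measurement — can be sketched numerically. The field-of-view and pixel-count figures below are illustrative assumptions, not values from the patent:

```python
def mm_per_pixel(fov_mm: float, pixels: int) -> float:
    """Lateral size covered by one camera pixel for a given field of view."""
    return fov_mm / pixels

# Imaging a whole 4 m car body at once vs. a 0.5 m partial region S,
# with an assumed 1024-pixel sensor width:
whole = mm_per_pixel(4000, 1024)   # ~3.9 mm per pixel
part = mm_per_pixel(500, 1024)     # ~0.49 mm per pixel
assert part < whole
```

This is why the device measures the object patch by patch rather than all at once.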
[0013]
In addition, as shown in FIG. 1, the shape measuring device 2 is integrally provided with first to third reference points MP1 to MP3 as a target. Each of the reference points MP1 to MP3 may be a light emitter using a light emitting diode or the like, or may be a mark. The three-dimensional positional relationship of the reference points MP1 to MP3 in the partial coordinate system is measured in advance, so their positions (Pc1, Pc2, Pc3) are known. For example, the first reference point MP1 may be provided at the origin Pc1(0, 0, 0) of the partial coordinate system, the second reference point MP2 at a point Pc2(a, 0, 0) on the X axis, and the third reference point MP3 at a point Pc3(0, b, 0) on the Y axis.
[0014]
Further, in the present embodiment, a position measuring device 14 including the first navigation camera 10 and the second navigation camera 12 is provided. The first navigation camera 10 and the second navigation camera 12 are arranged at positions where the first to third reference points MP1 to MP3 can be imaged.
[0015]
The positions of the first navigation camera 10 and the second navigation camera 12 in the three-dimensional whole coordinate system are known, and the distance L between them is measured in advance. The first navigation camera 10 measures the angle θ1 it forms with the first reference point MP1, and the second navigation camera 12 measures the angle θ2 it forms with the first reference point MP1. From the angles θ1 and θ2 and the distance L, a measured value of the three-dimensional position of the first reference point MP1 in the whole coordinate system is obtained by triangulation.
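The triangulation step can be illustrated with a simplified planar geometry: both navigation cameras lie on a baseline of length L, and θ1, θ2 are the angles each camera's viewing ray makes with that baseline. This is a sketch of the principle only, not the patent's exact computation:

```python
import math

def triangulate(theta1: float, theta2: float, baseline: float) -> tuple[float, float]:
    """Intersect the two viewing rays to locate the point.

    Camera 1 sits at (0, 0), camera 2 at (baseline, 0); theta1 and theta2
    (radians) are measured from the baseline toward the observed point.
    """
    t1, t2 = math.tan(theta1), math.tan(theta2)
    y = baseline * t1 * t2 / (t1 + t2)  # intersection height above the baseline
    x = y / t1                          # offset along the baseline from camera 1
    return x, y

# A reference point at (2.0, 3.0) seen over a baseline of 5.0:
x, y = triangulate(math.atan2(3.0, 2.0), math.atan2(3.0, 3.0), 5.0)
```

In the real device the rays live in three dimensions, but the ray-intersection idea is the same.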
[0016]
Similarly, measured values of the three-dimensional positions of the second reference point MP2 and the third reference point MP3 in the whole coordinate system are obtained. When light emitters are used for the reference points MP1 to MP3, the individual points can easily be distinguished by causing them to emit light in turn. Alternatively, the reference points may be recognized by lighting all the emitters simultaneously and recognizing the shape they form.
[0017]
Note that the position measuring device 14 is not limited to one using the two navigation cameras 10 and 12; it may instead use a laser rangefinder to obtain measured values of the three-dimensional positions of the first to third reference points MP1 to MP3 in the whole coordinate system.
[0018]
As shown in FIG. 1, the shape measuring device 2, the first navigation camera 10, and the second navigation camera 12 are connected to a conversion control device 16. The conversion control device 16 receives the shape measurement values in the partial coordinate system from the shape measuring device 2, and the position measurement values of the first to third reference points MP1 to MP3 in the whole coordinate system from the position measuring device 14.
[0019]
In the present embodiment, the shape measuring device 2 described above is attached to a moving device 18 using an articulated robot, as shown in FIG. 3. By moving the shape measuring device 2 with the moving device 18, the external shape of the object 1 can be measured by the shape measuring device 2. The moving device 18 is not limited to one that translates the shape measuring device 2; any device that can move the shape measuring device 2 so that it can measure the external shape of the object 1 may be used.
[0020]
Next, the measurement processing performed by the conversion control device 16 will be described with reference to the flowchart of FIG.
First, it is determined whether or not to start the measurement (step 100). If measurement is to be performed, the partial external shape of the object 1 is measured by the shape measuring device 2 (step 110). As shown in FIG. 1, the shape measuring device 2 three-dimensionally measures the partial external shape of a region S1 of the object 1 in the partial coordinate system, and outputs the measured values Ps1(s1xn, s1yn, s1zn) to the conversion control device 16. The subscript n indicates that there are n measurement points.
[0021]
Subsequently, the first navigation camera 10 and the second navigation camera 12 measure the three-dimensional positions Pg11 to Pg31 of the first to third reference points MP1 to MP3 in the whole coordinate system at the time of the measurement in step 110 (step 120), and the measured values (Pg11, Pg21, Pg31) are output to the conversion control device 16.
[0022]
Next, a coordinate conversion coefficient R1 is calculated based on the measured values of step 120 (step 130). The coordinate conversion coefficient R1 is calculated by the following equations from the known position coordinates (Pc1, Pc2, Pc3) of the first to third reference points MP1 to MP3 in the partial coordinate system and the position coordinates (Pg11, Pg21, Pg31) measured in step 120.
[0023]
Pg11 = [R1] × Pc1
Pg21 = [R1] × Pc2
Pg31 = [R1] × Pc3
Here, [R1] is a matrix.
[0024]
Subsequently, the measured values P1 of the region S1 of the object 1 in the whole coordinate system are calculated (step 140). The values P1 in the whole coordinate system are obtained by converting the partial-coordinate-system values Ps1 measured in step 110 using the coordinate conversion coefficient R1 calculated in step 130, according to the following equation.
[0025]
P1 = Ps1 × [R1]
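Since Pc1 is the partial-coordinate origin, [R1] must carry translation as well as rotation; one common way to realize the relation Pg = [R1] × Pc (and P1 = Ps1 × [R1], conventions of row vs. column vectors aside) is a 4×4 homogeneous transform. The sketch below estimates such a transform from the three reference-point pairs by building an orthonormal frame on each point triple — an assumed construction, since the patent does not specify the solver — and then maps partial measurements into the whole coordinate system:

```python
import numpy as np

def frame(p1, p2, p3):
    """Orthonormal frame spanned by three non-collinear points."""
    x = p2 - p1
    x /= np.linalg.norm(x)
    z = np.cross(x, p3 - p1)
    z /= np.linalg.norm(z)
    return np.column_stack([x, np.cross(z, x), z])

def solve_transform(pc, pg):
    """4x4 homogeneous T mapping partial-frame points pc onto whole-frame pg."""
    R = frame(*pg) @ frame(*pc).T      # rotation aligning the two frames
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = pg[0] - R @ pc[0]       # translation from the first point pair
    return T

def to_whole(T, ps):
    """Apply T to an (n, 3) array of partial-coordinate measurement points."""
    return ps @ T[:3, :3].T + T[:3, 3]

# Known reference points in the partial coordinate system, as in the text
# (MP1 at the origin, MP2 on the X axis, MP3 on the Y axis; a, b assumed):
pc = [np.array([0.0, 0.0, 0.0]),
      np.array([2.0, 0.0, 0.0]),
      np.array([0.0, 1.5, 0.0])]
```

Given the measured whole-frame positions (Pg11, Pg21, Pg31) as `pg`, `to_whole(solve_transform(pc, pg), Ps1)` yields P1.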
Next, it is determined whether or not the measurement has been completed (step 150); if measurement is to continue, the processing from step 100 onward is repeated. In the present embodiment, the moving device 18 moves the shape measuring device 2 to the next measurement location on the object 1. The movement need not be a translation; it suffices that the first navigation camera 10 and the second navigation camera 12 can still image the first to third reference points MP1 to MP3. Then, as before, the partial external shape of a region S2 of the object 1 is measured in the partial coordinate system by the shape measuring device 2 (step 110), and the measured values Ps2(s2xn, s2yn, s2zn) are output to the conversion control device 16.
[0026]
Next, the three-dimensional positions (Pg12, Pg22, Pg32) of the first to third reference points MP1 to MP3 after the movement are measured in the whole coordinate system (step 120). A coordinate conversion coefficient R2 is then calculated based on these measured values (Pg12, Pg22, Pg32).
[0027]
Pg12 = [R2] × Pc1
Pg22 = [R2] × Pc2
Pg32 = [R2] × Pc3
Subsequently, the measured values P2 of the region S2 of the object 1 in the whole coordinate system are calculated (step 140).
[0028]
P2 = Ps2 × [R2]
The measurement process described above is repeated; in the k-th measurement, the shape measuring device 2 likewise measures the partial external shape of a region Sk of the object 1 in the partial coordinate system (step 110), and outputs the measured values Psk(skxn, skyn, skzn) to the conversion control device 16.
[0029]
Next, the three-dimensional positions (Pg1k, Pg2k, Pg3k) of the first to third reference points MP1 to MP3 after the movement are measured in the whole coordinate system (step 120). A coordinate conversion coefficient Rk is then calculated based on these measured values (Pg1k, Pg2k, Pg3k).
[0030]
Pg1k = [Rk] × Pc1
Pg2k = [Rk] × Pc2
Pg3k = [Rk] × Pc3
Subsequently, the measured values Pk of the region Sk of the object 1 in the whole coordinate system are calculated (step 140).
[0031]
Pk = Psk × [Rk]
Through the processing of step 140, all measured values are expressed in the same whole coordinate system. By repeating the processing from step 100 onward until the whole of the object 1 has been measured, the entire external shape of the object 1 can be measured in three dimensions.
[0032]
Next, a second embodiment, which differs from the embodiment described above, will be described with reference to FIGS. 5 to 7. The same members and processes as in the above embodiment are denoted by the same reference numerals, and their detailed description is omitted.
As shown in FIG. 5, a target 8 is provided integrally with the shape measuring device 2; the target 8 is formed as a polygonal three-dimensional body. In the present embodiment, the target 8 is cube-shaped, and its left side face 8L, front face 8F, and right side face 8R are each formed with a distinctive shape. For example, as shown in FIG. 6, a triangular plane 8La protrudes from the left side face 8L, and three slopes 8Lb, 8Lc, and 8Ld incline away from the triangular plane 8La.
[0033]
Similarly, triangular planes 8Fa and 8Ra protrude from the front face 8F and the right side face 8R, with three slopes 8Fb, 8Fc, 8Fd and 8Rb, 8Rc, 8Rd inclining away from them. However, the orientations of the triangular planes 8La, 8Fa, and 8Ra differ among the left side face 8L, the front face 8F, and the right side face 8R, so that the three faces can be distinguished from their shapes.
[0034]
The first to third reference points αL, βL, and γL, which are the vertices of the triangular plane 8La on the left side face 8L, have their three-dimensional positional relationship in the partial coordinate system measured in advance, so their position coordinates PcαL, PcβL, and PcγL in the partial coordinate system are known. Likewise, the vertices αF, βF, γF and αR, βR, γR of the triangular planes 8Fa and 8Ra on the front face 8F and the right side face 8R are measured in advance, so their three-dimensional position coordinates PcαF, PcβF, PcγF and PcαR, PcβR, PcγR in the partial coordinate system are known. The shapes of the triangular planes 8La, 8Fa, and 8Ra are also measured in advance and known.
[0035]
Further, in the present embodiment, a position measuring device 24 having a CCD camera 20 and a fringe projector 22 is provided. The CCD camera 20 and the fringe projector 22 are similar to the CCD camera 4 and the fringe projector 6 of the shape measuring device 2 described above. The position measuring device 24 is placed and fixed at a position from which it can image the target 8; it images the target 8 in the whole coordinate system and outputs three-dimensional measurement values.
[0036]
As shown in FIG. 5, the shape measuring device 2 and the position measuring device 24 are connected to the conversion control device 16. The conversion control device 16 receives the shape measurement values in the partial coordinate system from the shape measuring device 2, and the measurement values of the target 8 in the whole coordinate system from the position measuring device 24. In the present embodiment, the shape measuring device 2 is attached to the moving device 18 using an articulated robot, as shown in FIG. 3.
[0037]
Next, the measurement processing performed by the conversion control device 16 will be described with reference to the flowchart of FIG.
First, it is determined whether or not to start the measurement (step 100). If measurement is to be performed, the partial external shape of the object 1 is measured by the shape measuring device 2 (step 110). As shown in FIG. 5, the shape measuring device 2 three-dimensionally measures the partial external shape of a region S1 of the object 1 in the partial coordinate system, and outputs the measured values Ps1(s1xn, s1yn, s1zn) to the conversion control device 16. The subscript n indicates that there are n measurement points.
[0038]
Subsequently, the target 8 as it stood at the time of the measurement in step 110 is imaged by the position measuring device 24 (step 120a). The posture of the shape measuring device 2 is then determined from the imaging result (step 120b). The posture is determined based on the shapes of the imaged triangular planes 8La, 8Fa, and 8Ra of the target 8.
[0039]
For example, as shown in FIG. 5, when the triangular plane 8Fa of the front surface 8F faces the position measuring device 24 nearly head-on, the position measuring device 24 can accurately measure the shape of the triangular plane 8Fa. Likewise, when the triangular plane 8La of the left side surface 8L is nearly frontal to the position measuring device 24, the position measuring device 24 can accurately measure the shape of the triangular plane 8La; the same applies to the triangular plane 8Ra of the right side surface 8R. The posture of the shape measuring device 2 is thereby determined, which in turn determines the triangular plane 8La, 8Fa, or 8Ra from which the reference points are extracted in the steps described later.
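The face-selection logic of step 120b can be sketched as follows. This is only an illustrative guess at one way to implement it, not the patent's stated method: it assumes the face most nearly frontal to the position measuring device 24 is the one whose triangular plane has the largest apparent area in the captured image, and the image coordinates are hypothetical.

```python
import numpy as np

def apparent_area(p0, p1, p2):
    # Projected area of a triangle from its 2D image coordinates.
    u, v = p1 - p0, p2 - p0
    return 0.5 * abs(u[0] * v[1] - u[1] * v[0])

def select_frontal_face(faces):
    # Pick the face whose triangular plane looks largest in the image,
    # i.e. the face most nearly frontal to the position measuring device.
    return max(faces, key=lambda name: apparent_area(*faces[name]))

# Hypothetical image coordinates (pixels) of each face's triangle vertices.
faces = {
    "8La": (np.array([10.0, 40.0]), np.array([30.0, 80.0]), np.array([12.0, 85.0])),
    "8Fa": (np.array([50.0, 40.0]), np.array([95.0, 42.0]), np.array([72.0, 90.0])),
    "8Ra": (np.array([100.0, 45.0]), np.array([118.0, 80.0]), np.array([101.0, 83.0])),
}
print(select_frontal_face(faces))  # here the front face 8Fa appears largest
```

A face viewed obliquely is foreshortened, so its projected triangle shrinks; comparing apparent areas is one simple proxy for "substantially in front of" the camera.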
[0040]
Next, the first to third reference points α, β, and γ are extracted from the target 8 (step 120c). Which reference points are extracted, i.e., those of the left side surface 8L, the front surface 8F, or the right side surface 8R of the target 8, is decided from the posture of the shape measuring device 2 determined in step 120b, the chosen surface being the one nearly frontal to the position measuring device 24.
[0041]
The first to third reference points α, β, and γ, which are the vertices of the selected triangular plane 8La, 8Fa, or 8Ra of the left side surface 8L, the front surface 8F, or the right side surface 8R, are extracted, and their three-dimensional coordinates are calculated. For example, the first to third reference points αF, βF, and γF, which are the vertices of the triangular plane 8Fa of the front surface 8F, are extracted, and their three-dimensional coordinate values PgαF, PgβF, and PgγF in the whole coordinate system are calculated.
[0042]
Next, a coordinate conversion coefficient R1 is calculated from the three-dimensional coordinate values PgαF, PgβF, and PgγF obtained in step 120c (step 130). The coordinate conversion coefficient R1 is calculated by the following equations from the known position coordinates PcαF, PcβF, and PcγF of the first to third reference points αF, βF, and γF of the front surface 8F in the partial coordinate system, together with their position coordinates PgαF, PgβF, and PgγF in the whole coordinate system measured in step 120c.
[0043]
PgαF = [R1] × PcαF
PgβF = [R1] × PcβF
PgγF = [R1] × PcγF
Here, [R1] is a matrix.
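The equations above determine [R1] from the three reference-point correspondences. As a hedged sketch: if [R1] is taken to be a 4×4 homogeneous matrix combining rotation and translation (an assumption — the patent only states that [R1] is a matrix), it can be recovered from three non-collinear point pairs with a Kabsch-style least-squares fit. The function name and sample points below are illustrative.

```python
import numpy as np

def conversion_coefficient(pc, pg):
    # Estimate a 4x4 homogeneous matrix M so that, in homogeneous
    # coordinates, Pg = M @ Pc for the matched reference points.
    # pc, pg: (3, 3) arrays, one row per reference point (alpha, beta, gamma).
    cc, cg = pc.mean(axis=0), pg.mean(axis=0)
    H = (pc - cc).T @ (pg - cg)                 # cross-covariance of centered points
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))      # guard against a reflection
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T     # proper rotation, det(R) = +1
    t = cg - R @ cc                             # translation after rotation
    M = np.eye(4)
    M[:3, :3], M[:3, 3] = R, t
    return M

# Illustrative check: three known points, rotated 30 deg about z and shifted.
th = np.pi / 6
R0 = np.array([[np.cos(th), -np.sin(th), 0.0],
               [np.sin(th),  np.cos(th), 0.0],
               [0.0,         0.0,        1.0]])
t0 = np.array([1.0, 2.0, 3.0])
pc = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0], [0.0, 1.0, 0.0]])
pg = pc @ R0.T + t0
M = conversion_coefficient(pc, pg)
```

With exactly three points the fit is determined (up to the reflection that the determinant check removes), which is why the target carries at least three reference points.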
[0044]
Subsequently, the measured values P1 of region S1 of the object 1 in the whole coordinate system are calculated (step 140). The measured values P1 in the whole coordinate system are obtained by converting, with the following equation, the measured values Ps1 in the partial coordinate system obtained in step 110, based on the coordinate conversion coefficient R1 calculated in step 130.
[0045]
P1 = Ps1 × [R1]
Next, it is determined whether or not the measurement has been completed (step 150). If the measurement is to be continued, the processing from step 100 onward is repeated. In the present embodiment, the shape measuring device 2 is moved to the next measuring point on the object 1 by the moving device 18. The movement need not be a pure translation; the shape measuring device 2 may also be reoriented, provided the target 8 can still be imaged by the position measuring device 24. Then, in the same manner as described above, the partial external shape of region S2 of the object 1 is measured by the shape measuring device 2 in the partial coordinate system (step 110), and the measured values Ps2 (s2xn, s2yn, s2zn) are output to the conversion control device 16.
[0046]
Next, an image of the target 8 after the movement is captured (step 120a), the posture of the shape measuring device 2 is determined from the result (step 120b), and the first to third reference points α, β, and γ are extracted (step 120c). For example, as shown in FIG. 5, when the left side surface 8L of the target 8 faces the position measuring device 24 nearly head-on, the first to third reference points αL, βL, and γL of the left side surface 8L are extracted.
[0047]
Subsequently, a coordinate conversion coefficient R2 is calculated from the three-dimensional coordinate values PgαL, PgβL, and PgγL (step 130).
PgαL = [R2] × PcαL
PgβL = [R2] × PcβL
PgγL = [R2] × PcγL
Subsequently, the measured values P2 of region S2 of the object 1 in the whole coordinate system are calculated (step 140).
[0048]
P2 = Ps2 × [R2]
The above measurement process is repeated. In the k-th measurement, the partial external shape of region Sk of the object 1 is likewise measured in the partial coordinate system by the shape measuring device 2 (step 110), and the measured values Psk (skxn, skyn, skzn) are output to the conversion control device 16.
[0049]
Next, the target 8 after the movement is imaged by the position measuring device 24, the posture of the shape measuring device 2 is determined, and the three-dimensional coordinate values Pgα, Pgβ, and Pgγ of the first to third reference points α, β, and γ are extracted (steps 120a, 120b, 120c). Subsequently, a coordinate conversion coefficient Rk is calculated (step 130).
[0050]
Pgα (L, F, R) k = [Rk] × Pcα (L, F, R)
Pgβ (L, F, R) k = [Rk] × Pcβ (L, F, R)
Pgγ (L, F, R) k = [Rk] × Pcγ (L, F, R)
Subsequently, the measured values Pk of region Sk of the object 1 in the whole coordinate system are calculated (step 140).
[0051]
Pk = Psk × [Rk]
The processing of step 140 yields measured values in a single common coordinate system, so by repeating the processing from step 100 onward until the entire object has been covered, the entire external shape of the object 1 to be measured can be measured three-dimensionally.
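The repeated conversions Pk = Psk × [Rk] can be collected into a single point cloud. A minimal sketch, assuming each [Rk] is a 4×4 homogeneous matrix applied to homogeneous coordinates (the function and variable names are illustrative):

```python
import numpy as np

def merge_regions(partial_scans, transforms):
    # Convert each region's points Psk from its partial coordinate system
    # into the whole coordinate system with [Rk], then stack them into
    # one point cloud covering the whole object.
    merged = []
    for Ps, M in zip(partial_scans, transforms):
        homog = np.hstack([Ps, np.ones((len(Ps), 1))])  # to homogeneous coords
        merged.append((homog @ M.T)[:, :3])             # Pk = [Rk] applied to Psk
    return np.vstack(merged)

# Illustrative use: region S1 already in the whole frame, region S2 shifted.
Ps1 = np.array([[0.0, 0.0, 0.0], [1.0, 0.0, 0.0]])
Ps2 = np.array([[0.0, 0.0, 0.0]])
M1 = np.eye(4)
M2 = np.eye(4)
M2[:3, 3] = [5.0, 0.0, 0.0]                             # pure translation
cloud = merge_regions([Ps1, Ps2], [M1, M2])
```

Because every region is expressed in the same whole coordinate system before stacking, the merged cloud represents the object's overall external shape without any further registration step.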
[0052]
Next, a third embodiment, which differs from the embodiments described above, will be described with reference to FIGS. 8 and 9. In the third embodiment the target differs from that of the preceding embodiments: it consists of three spheres 30, 31, and 32. The diameters of the spheres 30, 31, and 32 are measured in advance and are therefore known, as are the center position coordinates Pcα, Pcβ, and Pcγ of the spheres 30, 31, and 32 in the partial coordinate system.
[0053]
As shown in FIG. 9, the spheres 30, 31, and 32 serving as the target are imaged by the position measuring device 24 (step 120a), and the center positions of the spheres 30, 31, and 32 in the whole coordinate system are extracted; three reference points are thereby obtained (step 120c). In FIG. 9, steps identical to those in FIG. 7 are denoted by the same reference numerals, and their detailed description is omitted. In this way, as in the embodiments described above, the entire external shape of the object 1 to be measured can be measured three-dimensionally.
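Extracting a sphere's center from measured surface points, as step 120c of the third embodiment requires, can be done with a linear least-squares fit. This is a sketch of one standard approach, not necessarily the method the device uses; the names and sample data are illustrative.

```python
import numpy as np

def sphere_center(points):
    # Least-squares sphere center from surface points, using the
    # linearization |p|^2 = 2 c.p + (r^2 - |c|^2): solve for the center c
    # and the scalar d = r^2 - |c|^2 in one linear system.
    A = np.hstack([2.0 * points, np.ones((len(points), 1))])
    b = (points ** 2).sum(axis=1)
    sol, *_ = np.linalg.lstsq(A, b, rcond=None)
    return sol[:3]

# Illustrative data: six points on a sphere of radius 2 centered at (1, 2, 3).
c, r = np.array([1.0, 2.0, 3.0]), 2.0
dirs = np.array([[1, 0, 0], [-1, 0, 0], [0, 1, 0],
                 [0, -1, 0], [0, 0, 1], [0, 0, -1]], dtype=float)
pts = c + r * dirs
center = sphere_center(pts)
```

Because the sphere diameters are known in advance, the fitted centers can also be sanity-checked against the expected radius before being used as reference points.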
[0054]
The present invention is not limited to these embodiments in any way and can be implemented in various forms without departing from its gist.
[0055]
【The invention's effect】
As described above in detail, according to the three-dimensional shape measuring method and apparatus of the present invention, the three-dimensional shape of even a large object to be measured can be measured easily.
[Brief description of the drawings]
FIG. 1 is an explanatory diagram showing a schematic configuration of a three-dimensional shape measuring apparatus as one embodiment of the present invention.
FIG. 2 is an explanatory diagram showing a schematic configuration of the shape measuring device according to the embodiment.
FIG. 3 is a front view of a moving device that moves the shape measuring device according to the present embodiment.
FIG. 4 is a flowchart illustrating an example of a measurement process performed in the conversion control device according to the embodiment.
FIG. 5 is an explanatory diagram showing a schematic configuration of a three-dimensional shape measuring apparatus as a second embodiment.
FIG. 6 is an explanatory diagram of a target according to a second embodiment.
FIG. 7 is a flowchart illustrating an example of a measurement process performed in the conversion control device according to the second embodiment.
FIG. 8 is an explanatory diagram of a target according to a third embodiment.
FIG. 9 is a flowchart illustrating an example of a measurement process performed in the conversion control device according to the third embodiment.
[Explanation of symbols]
DESCRIPTION OF SYMBOLS: 1 ... object to be measured, 2 ... shape measuring device, 4, 20 ... CCD camera, 6, 22 ... fringe projector, 8 ... target, 10 ... first navigation camera, 12 ... second navigation camera, 14, 24 ... position measuring device, 16 ... conversion control device, 18 ... moving device, 30, 31, 32 ... sphere

Claims (9)

1. A three-dimensional shape measuring method comprising: optically three-dimensionally measuring a partial external shape of an object to be measured in a partial coordinate system by shape measuring means; providing the shape measuring means with a target whose three-dimensional positional relationship with the partial coordinate system is known; optically three-dimensionally measuring the target in a whole coordinate system by position measuring means; converting the measured values of the object in the partial coordinate system into measured values in the whole coordinate system on the basis of the relationship between the three-dimensional position of the target in the partial coordinate system and the measured values of the target in the whole coordinate system; and moving the shape measuring means and repeating the measurement of the partial external shape at another location on the object and the measurement of the target at that position, thereby measuring the external shape of the object to be measured.

2. The three-dimensional shape measuring method according to claim 1, wherein the target comprises at least three reference points.

3. The three-dimensional shape measuring method according to claim 2, wherein the reference points are light emitters.

4. The three-dimensional shape measuring method according to claim 2 or 3, wherein the position measuring means measures the positions of the reference points by triangulation.

5. The three-dimensional shape measuring method according to claim 2 or 3, wherein the position measuring means comprises two cameras fixed at a fixed interval, images the reference points with the cameras, and measures the positions of the reference points by triangulation.

6. The three-dimensional shape measuring method according to claim 1, wherein the target has a three-dimensional shape.

7. The three-dimensional shape measuring method according to claim 1, wherein the target comprises at least three spheres.

8. The three-dimensional shape measuring method according to claim 6 or 7, wherein the position measuring means measures the target using moiré fringes.

9. A three-dimensional shape measuring apparatus comprising: shape measuring means for optically three-dimensionally measuring a partial external shape of an object to be measured in a partial coordinate system, the shape measuring means being provided with a target whose three-dimensional positional relationship with the partial coordinate system is known; position measuring means for optically three-dimensionally measuring the target in a whole coordinate system; moving means for moving the shape measuring means; and conversion means for converting the measured values of the object in the partial coordinate system into measured values in the whole coordinate system on the basis of the relationship between the three-dimensional position of the target in the partial coordinate system and the measured values of the target in the whole coordinate system, wherein the shape measuring means is moved by the moving means and the measurement of the partial external shape at another location on the object and the measurement of the target at that position are repeated, thereby measuring the external shape of the object to be measured.
JP2002355403A 2002-12-06 2002-12-06 Three-dimensional shape measuring method and apparatus Expired - Fee Related JP4220768B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2002355403A JP4220768B2 (en) 2002-12-06 2002-12-06 Three-dimensional shape measuring method and apparatus

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2002355403A JP4220768B2 (en) 2002-12-06 2002-12-06 Three-dimensional shape measuring method and apparatus

Publications (2)

Publication Number Publication Date
JP2004191051A true JP2004191051A (en) 2004-07-08
JP4220768B2 JP4220768B2 (en) 2009-02-04

Family

ID=32756115

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2002355403A Expired - Fee Related JP4220768B2 (en) 2002-12-06 2002-12-06 Three-dimensional shape measuring method and apparatus

Country Status (1)

Country Link
JP (1) JP4220768B2 (en)

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006194792A (en) * 2005-01-14 2006-07-27 Hiroshima Univ Method for predicting strength degradation of corrosion structure
JP4595077B2 (en) * 2005-01-14 2010-12-08 国立大学法人広島大学 Prediction method for strength deterioration of corroded structures
JP2009058503A (en) * 2007-08-10 2009-03-19 Leica Geosystems Ag Method and system for noncontact coordinate measurement on object surface
US9020240B2 (en) 2007-08-10 2015-04-28 Leica Geosystems Ag Method and surveying system for noncontact coordinate measurement on an object surface
JP2015049379A (en) * 2013-09-02 2015-03-16 ホヤ レンズ タイランド リミテッドHOYA Lens Thailand Ltd Independent type measurement assist device and noncontact type measurement method

Also Published As

Publication number Publication date
JP4220768B2 (en) 2009-02-04

Similar Documents

Publication Publication Date Title
CN109115126B (en) Method for calibrating a triangulation sensor, control and processing unit and storage medium
CA2656163C (en) Auto-referenced system and apparatus for three-dimensional scanning
US10812694B2 (en) Real-time inspection guidance of triangulation scanner
US20180238681A1 (en) Two-camera triangulation scanner with detachable coupling mechanism
US8082120B2 (en) Hand-held self-referenced apparatus for three-dimensional scanning
US20140268178A1 (en) System and method of acquiring three dimensional coordinates using multiple coordinate measurement devices
KR20100087083A (en) System and method for three-dimensional measurment of the shape of material object
CN108759669A (en) A kind of self-positioning 3-D scanning method and system in interior
JP2007139776A (en) Optical edge break gage
JP4760358B2 (en) Road surface shape measuring method and measuring system
US9441959B2 (en) Calibration method and shape measuring apparatus
US20150085108A1 (en) Lasergrammetry system and methods
JP4255865B2 (en) Non-contact three-dimensional shape measuring method and apparatus
JP3690581B2 (en) POSITION DETECTION DEVICE AND METHOD THEREFOR, PLAIN POSITION DETECTION DEVICE AND METHOD THEREOF
JP2007093412A (en) Three-dimensional shape measuring device
JP2020180914A (en) Device, method, and program for detecting position attitude of object
CA2956319A1 (en) Calibration for 3d imaging with a single-pixel camera
JP5669195B2 (en) Surface shape measuring device and surface shape measuring method
CN112415010A (en) Imaging detection method and system
Barone et al. Catadioptric stereo-vision system using a spherical mirror
CA3126592A1 (en) Methode et systeme de profilometrie par illumination haute vitesse a bande limitee avec deux objectifs
JP2004191051A (en) Three-dimensional shape measuring method and its apparatus
JP7180783B2 (en) CALIBRATION METHOD FOR COMPUTER VISION SYSTEM AND 3D REFERENCE OBJECT USED FOR CALIBRATION METHOD
JP2011047876A (en) Three-dimensional shape measurement method
JP4956960B2 (en) 3D shape measuring apparatus and 3D shape measuring method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20051201

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20070829

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20070904

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20071102

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20081021

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20081114

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20111121

Year of fee payment: 3

R150 Certificate of patent or registration of utility model

Free format text: JAPANESE INTERMEDIATE CODE: R150

S531 Written request for registration of change of domicile

Free format text: JAPANESE INTERMEDIATE CODE: R313531

R350 Written notification of registration of transfer

Free format text: JAPANESE INTERMEDIATE CODE: R350

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20121121

Year of fee payment: 4

LAPS Cancellation because of no payment of annual fees