JP2006177781A - Three-dimensional shape measurement method, three-dimensional measurement apparatus, and three-dimensional measurement program - Google Patents


Publication number: JP2006177781A (granted as JP4670341B2)
Application number: JP2004371461A
Authority: JP (Japan)
Language: Japanese (ja)
Inventor: Ryosuke Mitaka (三高 良介)
Original assignee: Matsushita Electric Works Ltd
Current assignee: Panasonic Electric Works Co Ltd
Legal status: Granted; Active
Prior art keywords: pattern, fringe, light, image, dimensional shape
Classification: Length Measuring Devices By Optical Means
Abstract

PROBLEM TO BE SOLVED: To measure the three-dimensional shape of an object having steps and holes on its surface while suppressing the influence of disturbance light and reducing the number of images captured.

SOLUTION: An auxiliary pattern PRi for determining the fringe order is projected simultaneously with a fringe pattern PBi, using light of a wavelength different from that of the light projecting the fringe pattern. The captured image, on which the fringe pattern PBi and the auxiliary pattern PRi are projected simultaneously, is separated on the basis of light wavelength into an image on which the fringe pattern PBi is projected and an image on which the auxiliary pattern PRi is projected, and the fringe order n is determined from the plural auxiliary-pattern images. The sinusoidal fringe pattern PBi and the auxiliary pattern PRi for determining its fringe order can thus be projected onto the measurement object and captured simultaneously. Furthermore, because the light projecting the fringe pattern PBi is monochromatic light of a specific frequency, the influence of disturbance light is suppressed compared with white light.

COPYRIGHT: (C)2006,JPO&NCIPI

Description

The present invention relates to a three-dimensional shape measurement method, a three-dimensional shape measurement apparatus, and a three-dimensional shape measurement program for measuring the three-dimensional shape of a measurement object by the phase shift method.

Conventionally, the phase shift method has been widely used to measure the three-dimensional shape of an object. The phase shift method is a kind of light-section method: as shown in FIG. 8, a projection device 11 projects a specific pattern onto a measurement object 13, an imaging device 12 captures the projected image diffusely reflected by the surface of the measurement object 13 from a direction different from the projection optical axis of the projection device 11, and the three-dimensional shape of the measurement object 13 is measured by processing the image captured by the imaging device 12.

The principle of the phase shift method is described in more detail below.

As shown in the flowchart of FIG. 9, a fringe pattern whose brightness varies sinusoidally is first projected from the projection device 11 onto the measurement object 13, and the procedure of shifting the phase of the fringe pattern by, for example, π/2 and capturing an image with the imaging device 12 (steps 1 to 6) is repeated several times (at least 3, usually 4 or more) until the phase of the fringe pattern has moved through one full period, yielding four images M1 to M4 as shown in FIG. 10(a). For the brightness values a0 to a3 at the same position p(x, y) in the four captured images M1 to M4, even though the absolute brightness varies with the surface properties and color at that position, the relative brightness differences always reflect exactly the phase shifts of the projected pattern, as shown in FIG. 10(b). Therefore, by fitting a sine-wave equation using Equation 1, the relative phase value θp of the projected pattern at position p(x, y) is obtained (step 7 in FIG. 9). FIG. 11 shows the phase-restored image generated by this process.
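Equation 1 itself is not reproduced in this text; for shifts of π/2 the standard four-step form is θp = atan2(a3 − a1, a0 − a2). A minimal sketch under that assumption:

```python
import numpy as np

def relative_phase(m1, m2, m3, m4):
    """Four-step phase shift: images captured with the fringe pattern shifted
    by 0, pi/2, pi, 3*pi/2. Returns the wrapped phase in (-pi, pi].
    (Equation 1 of the text is not reproduced; this is the standard form.)"""
    a0, a1, a2, a3 = (np.asarray(m, dtype=float) for m in (m1, m2, m3, m4))
    return np.arctan2(a3 - a1, a0 - a2)

# Synthetic check: brightness a_k = A + B*cos(theta + k*pi/2) at one pixel
theta = 1.0
imgs = [100 + 50 * np.cos(theta + k * np.pi / 2) for k in range(4)]
print(relative_phase(*imgs))  # ~1.0
```

The absolute brightness offset A and modulation B cancel in the two differences, which is why the surface color at each point does not affect the recovered phase.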

Since Equation 1 involves an arctangent, each pixel value (relative phase value) θp of the phase-restored image shown in FIG. 11 is a value within a single period of the projected fringe pattern, i.e., between −π and π. Therefore, to obtain the absolute phase value θa of the fringes projected over multiple periods (the absolute phase expressed as −π to 0 to π to 2π to 3π ... with the top fringe in FIG. 10(a) as reference) (step 9 in FIG. 9), it is necessary to estimate where the fringe of fringe order n (the value indicating the n-th fringe counting downward from the top) lies in each of the captured images M1 to M4 (step 8 in FIG. 9). FIG. 12 shows an image in which the fringe orders n (= 1, 2, ...) obtained by this process are applied to the phase-restored image.

Then, at each point p(x, y) of the four captured images M1 to M4, the absolute phase value θa (= θp + 2nπ) is obtained from the relative phase value θp and the fringe order n, as shown in FIG. 13. This process (step 9 in FIG. 9) is hereinafter called the "phase connection process". A line obtained by connecting points with equal absolute phase value θa (an equiphase line) represents, like a section line in the light-section method, the cross-sectional shape of the measurement object 13 cut by a certain plane, so the three-dimensional shape of the measurement object 13 (height information at each image point) can be measured from this absolute phase value θa by the principle of triangulation. FIG. 14 shows the phase-restored image of the absolute phase values θa generated by the phase connection process.
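Once a fringe-order map is available, the phase connection step is just the elementwise combination θa = θp + 2nπ; a minimal sketch (the fringe-order values here are illustrative, not the output of the estimation procedure):

```python
import numpy as np

def connect_phase(theta_p, n):
    """Phase connection: combine the wrapped phase theta_p in (-pi, pi]
    with an integer fringe-order map n into the absolute phase theta_a."""
    return np.asarray(theta_p, dtype=float) + 2 * np.pi * np.asarray(n)

wrapped = np.array([3.0, -3.0, 0.5])   # wrapped phases near a fringe boundary
orders  = np.array([0, 1, 1])          # fringe order increments at the boundary
print(connect_phase(wrapped, orders))  # -3.0 and 0.5 gain 2*pi where n == 1
```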

Thus, as shown in FIG. 15, the absolute phase value of the sinusoidal fringe identifies both the position δ on the physical grating A used to project the fringe pattern and the corresponding imaging position (image coordinates) p(x, y) on the image sensor B of the imaging device 12. Based on the optical arrangement of the projection device 11 and the imaging device 12, Equation 2, which expresses the principle of triangulation, then gives the absolute coordinates (X, Y, Z) in three-dimensional space of the projection point P(X, Y, Z) on the measurement object 13 corresponding to the image point p(x, y), from which the three-dimensional shape of the measurement object 13 as shown in FIG. 5 can be obtained (step 10 in FIG. 9). Since the triangulation principle shown in FIG. 15 is well known, its detailed description is omitted here.
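Equation 2 is not reproduced in this text; the following sketch instead uses the textbook parallel-axis special case of structured-light triangulation (projector and camera with parallel optical axes, baseline b, focal length f, all values illustrative), purely to show how an absolute phase turns into a depth:

```python
import numpy as np

def depth_from_phase(theta_a, pixels_per_period, x_cam, f=1000.0, b=100.0):
    """Parallel-axis triangulation sketch (NOT the patent's Equation 2).
    theta_a: absolute phase at a camera pixel; pixels_per_period: fringe
    period on the projector grating in pixels; x_cam: camera coordinate of
    the same point; f, b: assumed focal length and baseline."""
    x_proj = theta_a / (2 * np.pi) * pixels_per_period  # position on grating A
    disparity = x_cam - x_proj                          # projector/camera offset
    return f * b / disparity                            # Z = f*b / disparity

print(depth_from_phase(theta_a=4 * np.pi, pixels_per_period=10.0, x_cam=120.0))  # 1000.0
```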

As described above, the phase shift method obtains the three-dimensional coordinates corresponding to each point of the captured images by simple computation, and high-density three-dimensional shape data are obtained from as few as three captures; evaluated in measurement time per point, it therefore allows measurement several times to tens of thousands of times faster than other measurement methods.

However, when the surface of the measurement object 13 has steps or holes and the boundaries between fringes (phase boundaries) cannot be identified, the fringe order n cannot be determined correctly, making the phase connection process difficult and the correct distance unobtainable (the phase jump problem). Moreover, the phase shift method achieves high accuracy by projecting as many fringes as possible and capturing images in which the contrast of the projected fringes is as high as possible; but the more fringes there are, the more easily the phase jump problem occurs, making it difficult to obtain the absolute phase values needed for height measurement.

The phase jump problem is explained here with a concrete example. If the surface of the measurement object were an ideal diffuse reflector with no steps or holes at all, the position where the relative phase value θp changes from π to −π, as at fringe order n = 1 in FIG. 13, could be taken as a phase boundary, everything up to the next phase boundary on the side of increasing phase could be regarded as one fringe, and the fringe order n could be determined accordingly. In practice, however, noise in the imaging device 12 often produces data in which θp flips between π and −π several times at a phase boundary (the boundary between fringe orders n = 2 and n = 3 in FIG. 13), so this simple method frequently fails to connect the phase. Furthermore, when there is a large step as in the examples of FIGS. 11 and 12, the fringe of order n = 4 in FIG. 12, for instance, is split into three parts by shadow; since the method above cannot recognize these as fringes of the same order, the correct fringe order cannot be determined.
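The naive boundary-scanning approach described above can be sketched along a single image column as follows (a minimal illustration of why it works only for the ideal case; the wrap-detection threshold is an assumption):

```python
import numpy as np

def naive_fringe_orders(theta_p):
    """Naive fringe-order assignment along one column for an ideal diffuse
    surface: each jump of the wrapped phase from near +pi down to near -pi
    marks a phase boundary, and the order increments there. As the text
    notes, sensor noise and shadow-split fringes break this simple scan."""
    n = np.zeros(len(theta_p), dtype=int)
    order = 0
    for k in range(1, len(theta_p)):
        if theta_p[k] - theta_p[k - 1] < -np.pi:  # wrap from +pi to -pi
            order += 1
        n[k] = order
    return n

# Wrapped version of a clean 0..6*pi phase ramp: three wrap points
column = np.angle(np.exp(1j * np.linspace(0, 6 * np.pi, 100)))
print(naive_fringe_orders(column)[-1])  # 3
```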

Various methods have therefore been proposed to solve the phase jump problem (see Patent Documents 1 to 4). In the method of Patent Document 1, slit light or a black-and-white striped auxiliary pattern that produces image features at the phase boundary positions is projected onto the measurement object so that those positions can be obtained explicitly. This exploits the advantages of both the multi-slit light-section method, which is easy to apply to measurement objects with steps and holes, and the phase shift method, which is weak against steps but provides high-density height information, so that high-density data can be acquired by the phase shift method even when steps and holes are present. Patent Document 2 describes a method of superimposing and projecting a reference sine wave and sine waves whose wavelengths are integer multiples of it. By using sine waves of different wavelengths and combining the relative phase information obtained at each wavelength in this way, the phase boundary positions can be obtained uniquely.

Patent Document 1: JP 2001-124534 A
Patent Document 2: Japanese Patent No. 3199041
Patent Document 3: JP 2004-108950 A
Patent Document 4: JP 2003-279329 A

However, the methods described in Patent Documents 1 and 2 have the problem that the number of times the measurement object must be imaged increases. In the phase shift method, completing the imaging in a short time prevents errors caused by the measurement object moving during capture, which matters especially when the object, such as a human body, is difficult to hold completely still; it is therefore desirable to measure the three-dimensional shape with as few captures as possible.

As a way of addressing this problem, Patent Document 3 describes a method that makes the fringe order easy to identify by projecting three auxiliary patterns in the three RGB colors, with mutually different initial positions and stepwise-varying brightness, together with a reference white-light pattern. However, this method must capture the reference white pattern in order to locate the boundary points where the brightness changes stepwise, so the number of captures increases by at least one. Patent Document 4 discloses a method that halves the number of projections by projecting fringe images composed of two components of different wavelengths. However, because this method uses as many image sensors as there are wavelengths in the projected sinusoidal patterns, correct phase values cannot be obtained without a process that measures and corrects the sensitivity variation between the sensors, unlike the case where the fringe images are captured by a single sensor. Furthermore, since most indoor lighting, like sunlight, is white, measuring with white light lowers the contrast under the influence of disturbance light and increases the error (white noise) in the measurement results.

The present invention was made in view of the above circumstances, and its object is to provide a three-dimensional shape measurement method, a three-dimensional shape measurement apparatus, and a three-dimensional shape measurement program capable of measuring the three-dimensional shape of an object having steps or holes on its surface with a small number of captures while suppressing the influence of disturbance light.

To achieve the above object, the invention of claim 1 is a three-dimensional shape measurement method in which a fringe pattern whose light brightness varies sinusoidally is projected onto a measurement object, the process of obtaining a captured image of the measurement object from a direction different from the projection direction of the fringe pattern is executed a plurality of times while shifting the phase of the fringe pattern at regular intervals, the phase value of the fringe pattern is computed from the brightness change at the same point in the plurality of captured images obtained in those processes, and the three-dimensional shape of the measurement object is measured by obtaining height information at each point of the captured images from that phase value. The method is characterized in that an auxiliary pattern for determining the fringe order in the captured images is projected simultaneously with the fringe pattern, using light of a wavelength different from that of the light projecting the fringe pattern; each captured image on which the fringe pattern and the auxiliary pattern are projected simultaneously is separated, on the basis of light wavelength, into an image on which the fringe pattern is projected and an image on which the auxiliary pattern is projected; and the fringe order is determined from the plurality of images on which the auxiliary pattern is projected.

The invention of claim 2 is characterized in that, in the invention of claim 1, light of a wavelength different from those of the fringe pattern and the auxiliary pattern is projected with substantially uniform brightness simultaneously with the fringe pattern and the auxiliary pattern and captured; an image containing only the wavelength component of this substantially uniform light is separated from each captured image; the movement of the measurement object is detected from the plurality of such images obtained over the plurality of processes; and the height information of the measurement object is corrected on the basis of the detected movement.

The invention of claim 3 is characterized in that, in the invention of claim 1 or 2, the wavelength of the light projecting the fringe pattern is in the range of 380 to 590 nanometers.

The invention of claim 4 is characterized in that, in the invention of any one of claims 1 to 3, the auxiliary pattern is a pattern whose light and dark reverse with a period obtained by multiplying the period of the fringe pattern by an integer that differs for each projection, each time the fringe pattern is projected with its phase shifted.

The invention of claim 5 is characterized in that, in the invention of any one of claims 1 to 4, the wavelengths of the light of the fringe pattern and of the auxiliary pattern are set so that combining the wavelengths of all the light projected onto the measurement object yields white light.

To achieve the above object, the invention of claim 6 is a three-dimensional shape measurement apparatus that executes the three-dimensional shape measurement method of any one of claims 1 to 5, comprising: projection means for simultaneously projecting the fringe pattern and the auxiliary pattern onto the measurement object; imaging means for capturing images of the measurement object on which the fringe pattern and the auxiliary pattern are projected; and image processing means for determining the fringe order by processing the images captured by the imaging means, computing the phase value of the fringe pattern on the basis of the determined fringe order, and obtaining height information at each point of the captured images from that phase value.

To achieve the above object, the invention of claim 7 is a three-dimensional shape measurement program that causes a computer to execute the three-dimensional shape measurement method of any one of claims 1 to 5, the program causing the computer to execute: a procedure of causing a projection device to project the fringe pattern and the auxiliary pattern simultaneously onto the measurement object; a procedure of causing imaging means to capture images of the measurement object on which the fringe pattern and the auxiliary pattern are projected; and a procedure of determining the fringe order by processing the images captured by the imaging means, computing the phase value of the fringe pattern on the basis of the determined fringe order, and obtaining height information at each point of the captured images from that phase value.

According to the present invention, the sinusoidal fringe pattern essential to three-dimensional shape measurement by the phase shift method and the auxiliary pattern for determining its fringe order can be projected onto the measurement object and captured simultaneously; moreover, because the fringe pattern is projected with monochromatic light of a specific frequency, the influence of disturbance light is suppressed compared with white light. As a result, the invention provides a three-dimensional shape measurement method, a three-dimensional shape measurement apparatus, and a three-dimensional shape measurement program capable of measuring the three-dimensional shape of an object having steps or holes on its surface with a small number of captures while suppressing the influence of disturbance light.

Embodiments of the present invention are described in detail below with reference to the drawings.

FIG. 2 is a schematic configuration diagram of the three-dimensional shape measurement apparatus of this embodiment. The apparatus comprises a projection device 1 that projects the fringe pattern and the auxiliary pattern onto the measurement object with light of mutually different wavelengths, an imaging device 2 that captures images of the measurement object on which the fringe pattern and the auxiliary pattern are projected, and a control device 3 that controls the projection device 1 and the imaging device 2 and performs image processing on the images captured by the imaging device 2.

The projection device 1 is a so-called liquid crystal projector, composed of a white light source, three liquid crystal panels that split the light of the white light source into R, G, and B monochromatic light, an optical system that aligns the optical paths of the R, G, and B monochromatic light separated by the three panels, and so on. The imaging device 2 is a so-called single-chip CCD camera, composed of a CCD image sensor employing primary-color filters, an optical system that forms the image of the subject on the imaging surface of the CCD sensor, a signal processing circuit that processes the output of the CCD sensor to obtain image data for each of the R, G, and B monochromatic components, and so on. The control device 3 consists of a general-purpose computer (hardware) equipped with a CPU, memory, a display, a storage device such as a hard disk, and various input/output interfaces, together with a three-dimensional shape measurement program (software) that causes the computer to execute the three-dimensional shape measurement method according to the present invention. This program is recorded on a recording medium such as a magnetic disk or optical disc, or provided to the computer through a telecommunication line such as the Internet. Between the projection device 1 and the control device 3, pattern images are transmitted via a general-purpose display interface such as DVI (Digital Visual Interface), and the control device 3 controls the operation of the projection device 1 (turning the light source on and off, adjusting the light quantity, and so on) via a general-purpose communication interface such as RS232C or IEEE488. The imaging device 2 has the function of quantizing the pixel value (brightness) of each pixel and outputting it for each of the R, G, and B colors; the control device 3 takes in the R, G, and B digital signals output by the imaging device 2 via a digital-signal input interface and controls the operation of the imaging device 2 (imaging timing and so on) via a general-purpose communication interface such as RS232C or IEEE488.

The fringe patterns and auxiliary patterns projected onto the measurement object by the projection device 1 are created in advance using general-purpose application software (for example, retouching software for image editing) and stored in the storage device of the control device 3. The fringe pattern PB is a pattern in which the brightness of blue monochromatic light varies sinusoidally in the vertical direction, as shown in FIG. 3(a); the four fringe patterns PB1 to PB4 have their fringe phases shifted successively by π/4. The auxiliary patterns PR1 to PR4, as shown in FIG. 3(b), are patterns in which the brightness of red monochromatic light reverses between light and dark with a period 2^i times (i = 1, 2, 3, 4) the period of the fringe pattern PBi. FIG. 3(c) shows the motion detection patterns PG1 to PG4, of uniform green monochromatic brightness, which are projected onto the measurement object simultaneously with the fringe pattern PBi and the auxiliary pattern PRi in order to detect the movement of the measurement object, as described later. Such patterns can be created with general-purpose retouching software that can separate an image into the three RGB primary-color channels, vary the brightness of each channel periodically (a gradation adjustment function), and recombine the three edited channels into a single image.
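The composite pattern images described above can be sketched as follows. Image size, fringe period, and 8-bit quantization are illustrative assumptions, and the auxiliary-pattern period is read as 2^i times the stripe period per the embodiment:

```python
import numpy as np

def make_pattern(i, height=256, width=256, period=32):
    """Compose the i-th projection image (i = 1..4) as an RGB array.
    B channel: sinusoidal fringe pattern PBi, phase shifted by (i-1)*pi/4.
    R channel: auxiliary pattern PRi, light/dark reversing with period
               2**i times the fringe period.
    G channel: uniform motion-detection pattern PGi.
    Size, period, and 8-bit levels are illustrative assumptions."""
    y = np.arange(height, dtype=float)[:, None] * np.ones((1, width))
    phase = 2 * np.pi * y / period + (i - 1) * np.pi / 4
    blue = (127.5 * (1 + np.sin(phase))).astype(np.uint8)             # PBi
    red = (((y // (2 ** i * period // 2)) % 2) * 255).astype(np.uint8)  # PRi
    green = np.full((height, width), 128, dtype=np.uint8)             # PGi
    return np.dstack([red, green, blue])

pattern = make_pattern(2)
print(pattern.shape)  # (256, 256, 3)
```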

When a pattern image combining the fringe pattern PBi, the auxiliary pattern PRi, and the motion detection pattern PGi (i = 1, 2, 3, 4) is transmitted from the control device 3 to the projection device 1 via the interface, the projection device 1 projects the fringe pattern PBi, the auxiliary pattern PRi, and the motion detection pattern PGi onto the measurement object simultaneously, with light of blue, red, and green wavelengths respectively.
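With auxiliary patterns whose light/dark reversal periods are successive powers of two of the stripe period (reading the embodiment's "2i" as 2^i), the binarized red-channel images form a per-pixel binary code from which a stripe index can be read off; a minimal sketch of such decoding (the threshold and bit convention are assumptions, not the patent's exact procedure):

```python
import numpy as np

def fringe_order(red_images, threshold=128):
    """red_images: the 4 red-channel images (PR1..PR4), where PRi reverses
    light/dark with period 2**i stripe periods. Reading the binarized value
    of PRi as bit (i-1) yields a stripe index in 0..15 at each pixel.
    Threshold and bright=1 convention are illustrative assumptions."""
    n = np.zeros(np.asarray(red_images[0]).shape, dtype=int)
    for i, img in enumerate(red_images, start=1):
        bit = (np.asarray(img) >= threshold).astype(int)  # 1 where bright
        n += bit << (i - 1)
    return n

# Synthetic 1-D demo along one column: 4 stripes of 8 pixels each
period = 8
y = np.arange(32)
reds = [((y // (2 ** i * period // 2)) % 2) * 255 for i in range(1, 5)]
print(fringe_order(reds))  # equals the stripe index y // period per pixel
```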

Next, the operation of the three-dimensional shape measurement apparatus of this embodiment, that is, the three-dimensional shape measurement method according to the present invention, is described.

First, when the CPU of the control device 3 executes the three-dimensional shape measurement program and the apparatus begins operating, the first pattern image stored in the storage device (an image combining the fringe pattern PB1, the auxiliary pattern PR1, and the motion detection pattern PG1) is read by the CPU into the work area of the memory and transmitted to the projection apparatus 1 via the display interface; the projection apparatus 1 then projects the fringe pattern PB1, the auxiliary pattern PR1, and the motion detection pattern PG1 simultaneously onto the measurement object, using light of blue, red, and green wavelengths respectively. After the pattern image has been transmitted to the projection apparatus 1, the CPU sends a command to the imaging apparatus 2 via the communication interface to capture an image of the measurement object. On receiving this command from the CPU of the control device 3, the imaging apparatus 2 captures an image of the measurement object and transmits image data for each of the R, G, and B monochromatic channels to the control device 3. In the control device 3, after the CPU stores the received R, G, and B image data in the work area of the memory, the second pattern image stored in the storage device (an image combining the fringe pattern PB2, the auxiliary pattern PR2, and the motion detection pattern PG2) is read into the work area and transmitted to the projection apparatus 1 via the display interface. Then, with the fringe pattern PB2 (whose phase is shifted by π/4 from the first fringe pattern PB1), the auxiliary pattern PR2 (whose brightness inverts with a period four times that of the fringe pattern PB2), and the motion detection pattern PG2 projected simultaneously from the projection apparatus 1 onto the measurement object in blue, red, and green light respectively, the imaging apparatus 2 receives a command from the CPU of the control device 3, captures an image of the measurement object, and transmits image data for each of the R, G, and B monochromatic channels to the control device 3, where the CPU stores them in the work area of the memory. In the same way, the third pattern image (combining fringe pattern PB3, auxiliary pattern PR3, and motion detection pattern PG3) and the fourth pattern image (combining fringe pattern PB4, auxiliary pattern PR4, and motion detection pattern PG4) are projected from the projection apparatus 1 while the imaging apparatus 2 captures images of the measurement object; once the resulting R, G, and B image data have been stored by the CPU in the work area of the memory, the image acquisition process ends. This image acquisition process corresponds to steps 1 to 6 in the flowchart of FIG. 9.
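The acquisition sequence above amounts to a simple project-then-capture loop. In the sketch below, the Projector and Camera classes are hypothetical stand-ins for the display interface and communication interface of the actual apparatus; only the control flow is intended to match the description.

```python
# Sketch of the four-shot acquisition loop (steps 1-6 of FIG. 9).
# Projector and Camera are hypothetical stand-ins for the real interfaces.

class Projector:
    def __init__(self):
        self.shown = []

    def show(self, pattern_image):
        # Transmit the composite pattern (PBi + PRi + PGi) via the display interface.
        self.shown.append(pattern_image)

class Camera:
    def capture_rgb(self, i):
        # Return one frame per monochromatic channel for the i-th shot.
        return {"R": f"DR{i}", "G": f"DG{i}", "B": f"DB{i}"}

def acquire_images(pattern_images, projector, camera):
    """Project each composite pattern and store the R, G, B image data."""
    frames = []
    for i, pattern in enumerate(pattern_images, start=1):
        projector.show(pattern)          # blue fringe, red auxiliary, green flat light
        frames.append(camera.capture_rgb(i))
    return frames

frames = acquire_images(["P1", "P2", "P3", "P4"], Projector(), Camera())
```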

FIGS. 4(a) to 4(d) show the first through fourth sets of R, G, and B image data DBi, DRi, DGi (i = 1, 2, 3, 4) stored in the work area of the memory of the control device 3, assuming that the measurement object 13 is a sphere. The CPU of the control device 3 performs arithmetic processing to obtain the relative phase value θp at an arbitrary position p(x, y) from the four image data DB1 to DB4 stored in the work area (images of the measurement object 13 captured with the fringe patterns PB1 to PB4 projected onto it), and stores the obtained relative phase value θp in the work area of the memory. Since the arithmetic processing for obtaining the relative phase value θp is the same as that described for the prior art, its description is omitted here.
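The patent defers the computation of θp to the prior-art Expression 1, which is not reproduced in this text. As a hedged illustration, the sketch below uses a generic least-squares phase retrieval that works for any set of known phase shifts δi, modeling each per-pixel brightness sample as I = A + B·cos(θp + δi); it is one standard way to recover a relative phase, not necessarily the exact formula of the patent.

```python
import math
import numpy as np

def relative_phase(intensities, shifts):
    """Recover the relative phase theta_p at one pixel from brightness samples
    I_i = A + B*cos(theta_p + delta_i) taken under known phase shifts delta_i.
    Works for arbitrary shift sequences, e.g. the pi/4 steps of the embodiment."""
    d = np.asarray(shifts, dtype=float)
    # I_i = A + C*cos(delta_i) + S*sin(delta_i), with C = B*cos(theta), S = -B*sin(theta)
    M = np.column_stack([np.ones_like(d), np.cos(d), np.sin(d)])
    A, C, S = np.linalg.lstsq(M, np.asarray(intensities, dtype=float), rcond=None)[0]
    return math.atan2(-S, C) % (2 * math.pi)

# Synthetic check: samples generated from a known phase are recovered.
true_theta = 1.2
shifts = [i * math.pi / 4 for i in range(4)]
samples = [100 + 50 * math.cos(true_theta + d) for d in shifts]
theta = relative_phase(samples, shifts)
```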

Next, the CPU of the control device 3 performs arithmetic processing to determine the fringe order n of each fringe in the image data DBi using a so-called spatial coding technique. FIG. 1(a) shows the brightness variation of the image data DBi captured with the fringe pattern PBi projected, and FIGS. 1(b) to 1(e) show the brightness variation of the image data DRi captured with the auxiliary patterns PRi (i = 1, 2, 3, 4) projected. For an arbitrary position in the image data DBi, the CPU assigns a brightness code of 1 to the i-th bit from the least significant end if the brightness at the same position in the i-th image data DRi is bright, and 0 if it is dark, thereby building a fringe order code (see FIG. 1(f)). For example, in the leftmost fringe in FIG. 1 the brightness codes of the first through fourth image data DR1 to DR4 are all 0, so the fringe order code is the binary value 0000; in the adjacent fringe (second from the left) only the brightness code of the first image data DR1 is 1 while those of DR2 to DR4 are all 0, so the fringe order code is 0001. In this way the lower four bits of the fringe order code take the 16 values from 0000 to 1111, and converting these four bits to decimal uniquely determines the fringe order n (= 0 to 15) of the 16 fringes (see FIG. 1(g)). Furthermore, if an additional pattern, whose brightness at the i-th capture inverts with a period 2^(i+4) times (i = 1, 2, 3, 4) that of the fringe pattern PBi, is simultaneously projected and captured using light of a wavelength other than blue and red, for example green, the fringe order of 2^8 = 256 phases can be uniquely determined, enabling measurement free of phase jumps even when a dense fringe pattern is projected with a high-resolution optical system.
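A minimal sketch of the bit-assembly step described above: at a given pixel, each auxiliary-pattern image DR1..DR4 contributes one bit (bright = 1, dark = 0), with the i-th image setting bit i−1, and the four bits read as a binary number give the fringe order n. The brightness threshold of 128 is an assumption for illustration.

```python
def fringe_order(brightness_values, threshold=128):
    """brightness_values: pixel brightness in DR1..DR4; the first image
    contributes the least significant bit of the fringe order code."""
    n = 0
    for i, v in enumerate(brightness_values):  # image i+1 -> bit i
        if v >= threshold:
            n |= 1 << i
    return n

# Leftmost fringe of FIG. 1: dark in all four images -> code 0000 -> order 0.
# Second fringe: bright only in DR1 -> code 0001 -> order 1.
assert fringe_order([0, 0, 0, 0]) == 0
assert fringe_order([255, 0, 0, 0]) == 1
```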

After determining the fringe order as described above, the CPU performs arithmetic processing to obtain the absolute phase value θa from the fringe order and the relative phase value θp, then uses the absolute phase value θa to obtain height information at each point of the image (the coordinate values of each point in three-dimensional space) by the principle of triangulation, and stores those coordinate values in the storage device as the three-dimensional shape data of the measurement object 13. The arithmetic processing for obtaining the absolute phase value θa from the fringe order and the relative phase value θp, and for obtaining the three-dimensional coordinates of each image point from θa, is well known and was described for the prior art, so its description is omitted. As in the conventional example, by obtaining the absolute coordinate values in three-dimensional space of the projection point P(X, Y) on the measurement object 13 corresponding to the point p(x, y) on the image, the three-dimensional shape of the measurement object 13 shown in FIG. 5 can be obtained.
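The patent leaves the θa computation to the prior art. The conventional phase-unwrapping relation simply adds the integer number of whole fringes to the wrapped phase, θa = 2πn + θp; the subsequent triangulation from θa to (X, Y, Z) depends on the calibrated projector-camera geometry and is not shown here.

```python
import math

def absolute_phase(n, theta_p):
    """Combine the fringe order n with the relative (wrapped) phase theta_p
    in [0, 2*pi) to obtain the absolute phase theta_a, removing the
    2*pi ambiguity of the phase shift method."""
    return 2 * math.pi * n + theta_p

theta_a = absolute_phase(3, math.pi / 2)
```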

Thus, according to this embodiment, the sinusoidal fringe pattern that is essential for three-dimensional shape measurement by the phase shift method and the auxiliary pattern for determining its fringe order can be projected onto the measurement object and imaged simultaneously, so imaging can be completed in a short time; even when measuring an object that is difficult to hold completely still, such as a human body, measurement errors caused by positional shifts between images due to movement of the object can be suppressed. However, since the influence of positional shifts caused by object movement grows as the fringe order n of the fringe pattern PBi increases, in this embodiment the motion detection pattern PGi is projected onto the measurement object together with the fringe pattern PBi and the auxiliary pattern PRi, so that positional shifts between images can be actively detected and corrected.

That is, since the image data DGi obtained by projecting the motion detection pattern PGi are simple grayscale images, as shown in FIG. 4, the CPU of the control device 3 detects the movement of the measurement object (direction and amount of movement) by computing the optical flow between the image data DGi. Conventionally well-known techniques such as the gradient method can be used to compute the optical flow, so a detailed description is omitted. To detect the movement of the measurement object more precisely, however, it is preferable to use, instead of optical flow, the so-called normalized correlation pattern matching method: the first image data DG1 is used as a template, the second through fourth image data DG2 to DG4 are overlaid on it while being shifted little by little, and the cross-correlation of the overlapping pixel values is computed to find the position giving the highest correlation value. When, relative to the first image data DG1, the measurement object is detected to have moved by dθi (in phase units) in the i-th image data (i = 2 to 4), the CPU of the control device 3 can correct the positional shift due to the movement of the object by using Expression 3 below in place of Expression 1 when computing the relative phase value θp at an arbitrary position p(x, y) in the image data DBi.
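The normalized correlation pattern matching described above can be sketched as follows for horizontal shifts only. A real implementation would search two-dimensional shifts and convert the winning pixel shift into the phase offset dθi using the known fringe pitch; that conversion is not shown, and this sketch is illustrative rather than the patent's exact procedure.

```python
import numpy as np

def detect_shift(template, image, max_shift):
    """Slide `image` over `template` along the horizontal axis and return the
    shift (in pixels) that maximizes the normalized cross-correlation of the
    overlapping region (normalized correlation pattern matching)."""
    w = template.shape[1]
    best_r, best_dx = -2.0, 0
    for dx in range(-max_shift, max_shift + 1):
        if dx >= 0:
            a, b = template[:, :w - dx], image[:, dx:]
        else:
            a, b = template[:, -dx:], image[:, :w + dx]
        a0, b0 = a - a.mean(), b - b.mean()
        denom = np.sqrt((a0 ** 2).sum() * (b0 ** 2).sum())
        if denom == 0:
            continue  # flat region: correlation undefined
        r = (a0 * b0).sum() / denom
        if r > best_r:
            best_r, best_dx = r, dx
    return best_dx

# Synthetic check: a copy of the template shifted right by 3 pixels.
rng = np.random.default_rng(0)
tpl = rng.random((8, 32))
moved = np.roll(tpl, 3, axis=1)
dx_found = detect_shift(tpl, moved, 5)
```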

In this embodiment, blue light with a wavelength of about 450 nm is used to project the fringe pattern PBi, but this wavelength can be chosen freely within the range of 380 nm to 590 nm; projecting the fringe pattern PBi with light in this range has the following advantages.

As shown in FIG. 6, human skin consists, from the surface inward, of the stratum corneum 121, the carotene layer 122, the melanin layer 123, and the dermis 124. The layers from the stratum corneum 121 to the melanin layer 123 are extremely thin, contain no capillaries, and are translucent, while the underlying dermis 124 is thicker than the stratum corneum 121 and the other upper layers and is permeated by capillaries. Ultraviolet and infrared light 125 pass through the dermis 124 and penetrate deep into the body; infrared light in particular is accompanied by radiation due to body temperature. Even within the visible range, red light 126 tends to pass through the translucent stratum corneum 121, carotene layer 122, and melanin layer 123 and be scattered in the dermis 124. Consequently, when a fringe pattern is projected with ultraviolet, infrared, or red light, the pattern light is scattered deep within the skin, and the captured image is prone to waveform distortion. Because the phase shift method performs measurement based on the brightness of projection light diffusely reflected at the object surface, it is undesirable to project the fringe pattern with light of a wavelength that readily penetrates the measurement object. In contrast, short-wavelength visible light 127, 128, from yellow through violet, is mostly reflected at the skin surface, and the small transmitted component is largely scattered at shallow depths between the stratum corneum 121 and the melanin layer 123, so the waveform distortion described above hardly occurs and high-accuracy measurement becomes possible.

Furthermore, if the wavelength used to project the fringe pattern is chosen so as to minimize the influence of disturbance light, the contrast of the captured image can be improved and the accuracy increased still further. For example, the main applications of the three-dimensional shape measurement apparatus of this embodiment are measuring the human body and objects from several tens of centimeters to several meters in size, and it is basically assumed to be used indoors, where the light of the fluorescent lamps commonly used for indoor lighting is likely to be present as disturbance light. Measuring the spectral distribution of white fluorescent lamps currently on the market shows roughly three to four strong peak wavelengths, with particularly strong peaks at 435 nm (violet-tinged blue) and 545 nm (green); by projecting the pattern with light at wavelengths that avoid these peaks, an image with contrast nearly as high as one captured in a darkroom can be obtained. Incandescent lamps, by contrast, have a spectrum distributed over a wider wavelength range than fluorescent lamps, but the intensity is greatest in the orange-to-red band and weak in short-wavelength bands such as blue, so projecting the pattern with blue light removes the influence of disturbance light to some extent even under incandescent illumination.

The three-dimensional shape measurement apparatus of this embodiment is also intended for applications such as measuring the human body or industrial products for use in computer graphics (CG). In such applications it is desirable that, in addition to the three-dimensional shape data, a color image of the measurement object for texture mapping be acquired at the same time, with the correspondence between that color image and the three-dimensional shape data already established.

Therefore, in this embodiment, by using light of the three primary colors B, R, and G to project the fringe pattern PBi, the auxiliary pattern PRi, and the motion detection pattern PGi, images with the patterns removed are synthesized from the image data DBi, DRi, and DGi obtained for each pattern; by then combining the pattern-free image data of the three wavelengths, the same color image is obtained as if the measurement object had been illuminated with flat (patternless) white light and photographed with a color camera.

Such a color image is obtained by having the CPU of the control device 3 perform the following arithmetic processing.

First, in the B-channel image data DBi, which are the captured images of the fringe pattern PBi, the phase of the sinusoidal fringes changes by π/4 between successive image data; therefore, by using Expression 4 below to generate a composite B-channel image in which the brightness of the four image data DB1 to DB4 is averaged at each pixel, the sinusoidal fringes cancel out and the same image is obtained as if flat blue light had been projected.

Next, the R-channel image data DRi, which are captured with the binary auxiliary patterns PRi projected, are handled with Expression 5 below: by generating a composite R-channel image whose value at each pixel is the maximum brightness among the four image data DR1 to DR4, the rectangular-wave pattern disappears and the same image is obtained as if flat red light had been projected.

The G-channel image data DGi, which are the captured images of the motion detection pattern PGi and are taken under flat light, can be handled either by using any single one of the four image data DG1 to DG4, or by using Expression 6 below to take the pixel-wise average of the four image data DG1 to DG4 as the composite G-channel image.

Then, by combining the pixel values R(x, y), G(x, y), and B(x, y) of the three primary-color images obtained from Expressions 4 to 6 through additive color mixing, the same color image can be acquired as if the measurement object had been illuminated with white light and photographed with a color camera. Because this color image is synthesized from the very images used for the measurement, R, G, and B color information is available for each point P(X, Y, Z) of the three-dimensional shape data corresponding to a point p(x, y) on the captured images; there is thus no need to register the three-dimensional shape data with separate texture mapping image data, and a three-dimensional CG image with a color texture can be generated easily.

When synthesizing the color image as described above, applying different weight coefficients Wr, Wg, Wb and base brightnesses Br, Bg, Bb to the pixel values of the R, G, and B channels, as shown in Expressions 4 to 6, makes it possible to adjust the white balance, overall brightness, and contrast, and to generate a color image with natural hue and contrast. For this method to work, however, the auxiliary pattern PRi must clearly be projected such that every position receives illumination at maximum brightness in at least one of the four projections. For example, when performing spatial coding with rectangular waves of different periods as described above, no region may arise whose spatial code is the binary value 0000 (dark in every projection).
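Expressions 4 to 6 are not reproduced in this text, so the sketch below is a hedged reconstruction of the compositing just described: per-pixel mean for the B channel, per-pixel max for the R channel, per-pixel mean for the G channel, with the per-channel weights and base brightnesses applied linearly. The exact form of the weighting in the patent may differ, and the cancellation of the fringes in the B-channel mean assumes the four phase shifts are spread evenly over one full period.

```python
import numpy as np

def composite_color(DB, DR, DG, w=(1.0, 1.0, 1.0), base=(0.0, 0.0, 0.0)):
    """Rebuild a flat-light color image from the four measurement shots:
    B = per-pixel mean of DB1..DB4 (sinusoidal fringes average out),
    R = per-pixel max  of DR1..DR4 (every pixel lit in at least one pattern),
    G = per-pixel mean of DG1..DG4 (flat light).
    w and base stand in for the weights Wr,Wg,Wb and base brightnesses
    Br,Bg,Bb; their linear application here is illustrative."""
    B = np.mean(DB, axis=0)
    R = np.max(DR, axis=0)
    G = np.mean(DG, axis=0)
    Wr, Wg, Wb = w
    Br, Bg, Bb = base
    return np.stack([Wr * R + Br, Wg * G + Bg, Wb * B + Bb], axis=-1)

# Synthetic check with shifts evenly covering one period (steps of 2*pi/4):
x = np.arange(16)
shifts = [0.0, np.pi / 2, np.pi, 3 * np.pi / 2]
DB = np.stack([100 + 50 * np.cos(0.5 * x + d) for d in shifts])
DR = np.zeros((4, 16)); DR[0] = 255.0   # every pixel at full brightness once
DG = np.full((4, 16), 7.0)
img = composite_color(DB, DR, DG)
```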

In this embodiment the fringe order is determined by the spatial coding method, using rectangular-wave auxiliary patterns PRi whose period differs from projection to projection. For a measurement object that is relatively flat, whose steps and holes are not very large, and in which phase jumps are unlikely to occur, however, it is also acceptable to project as the auxiliary pattern, as described in Patent Document 1, a pattern whose brightness simply inverts with each phase of the sinusoidal fringes, or a pattern in which slit-shaped light is projected at the phase boundary positions.

Although a liquid crystal projector is used as the projection apparatus 1 in this embodiment, as shown in FIG. 7 it is also possible to use spectroscopic devices 11A, 11B such as dichroic mirrors, which reflect the wavelength to be projected and transmit light of other wavelengths, to merge the light of multiple projection apparatuses 10A, 10B, 10C, one per projected wavelength, into a single beam for projection. Such a configuration is particularly effective when (1) a high-resolution pattern is to be projected by using a dedicated pattern display device for each pattern; (2) instead of a variable display device such as a liquid crystal panel, a physical grating (for example, a transparent plate with the pattern printed on it) is moved by mechanical means; (3) a monochromatic light source such as a light-emitting diode is used as the projection source; or (4) projection is to be performed with specific monochromatic light other than RGB, for example to exclude the influence of disturbance light or to match the color of the object surface.

When the fringe pattern and the auxiliary pattern are projected with light of different wavelengths, it suffices that the wavelength difference be at least large enough for the imaging apparatus 2 to separate the two patterns; if the spectral resolving power of the projection apparatus 1 is lower than that of the imaging apparatus 2, the wavelength difference need only be at least the minimum difference that the projection apparatus 1 (or the spectroscopic device 11) can resolve.

In this embodiment a single control device 3 controls both the projection apparatus 1 and the imaging apparatus 2, but this is not a limitation; the projection apparatus 1 and the imaging apparatus 2 may each be controlled by a separate control device. With such a configuration, while the imaging apparatus 2 is capturing an image and transferring the image data under the control of one control device, the other control device can load the next projection pattern to be output to the projection apparatus 1, which shortens the interval between projection and imaging and thus reduces the measurement time. It is also possible to provide one control device for acquiring images from the imaging apparatus 2 and a separate control device for image processing. With this configuration, when three-dimensional shape measurement is performed repeatedly in time series, one control device can perform image processing while the other is acquiring images, shortening the time needed to obtain the three-dimensional shape data. Furthermore, if image processing is parallelized across multiple control devices, then since the computation of the local phase value θp from the fringe pattern (step 7) and the estimation of the fringe order n by analyzing the auxiliary pattern (step 8) are performed sequentially as shown in FIG. 9, steps 7 and 8 can instead be executed in parallel on separate control devices, reducing the image processing time. Likewise, the processing of steps 9 and 10 can be performed independently for each pixel, so the image can be partitioned and processed by multiple control devices, again shortening the processing time. Because image processing for three-dimensional shape data takes a long time, shortening it by adopting the configurations described above is particularly effective when the resulting data are to be observed in real time.
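As a hedged illustration of the parallelization just described, steps 7 and 8 can be dispatched concurrently; the two worker functions below are placeholders for the actual processing, and a single-machine thread pool merely stands in for the patent's multiple control devices.

```python
# Illustrative sketch (not from the patent text) of running step 7 (local
# phase computation) and step 8 (fringe-order estimation) in parallel.
from concurrent.futures import ThreadPoolExecutor

def compute_local_phase(fringe_images):       # placeholder for step 7
    return ["theta_p for %s" % f for f in fringe_images]

def estimate_fringe_order(aux_images):        # placeholder for step 8
    return ["n for %s" % a for a in aux_images]

with ThreadPoolExecutor(max_workers=2) as pool:
    f_phase = pool.submit(compute_local_phase, ["DB1", "DB2", "DB3", "DB4"])
    f_order = pool.submit(estimate_fringe_order, ["DR1", "DR2", "DR3", "DR4"])
    phases, orders = f_phase.result(), f_order.result()
```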

If the control device 3 is installed at a location separate from the projection apparatus 1 and the imaging apparatus 2 and image data are transferred to it over a communication line or the like, the measurement object can be monitored remotely. Alternatively, when the measurement object is located in an environment that could destroy the storage device of the control device 3 (for example, a place with strong magnetism, electromagnetic waves, or radiation), the image data can be transferred to a control device 3 in a safe location for processing, which is particularly effective. Furthermore, by installing multiple imaging apparatuses 2 and projection apparatuses 1 at different locations and integrating the three-dimensional shape data obtained from these imaging apparatuses 2 in the control device 3, the measurement object can be measured three-dimensionally over a wide angle (for example, 180 degrees or more) and displayed as an image.

FIG. 1 is a waveform diagram for explaining the operation of this embodiment.
FIG. 2 is a configuration diagram of the embodiment.
FIGS. 3(a) to 3(c) are explanatory diagrams of the fringe pattern, auxiliary pattern, and motion detection pattern used in the embodiment.
FIGS. 4(a) to 4(d) are explanatory diagrams of the first through fourth image data captured in the embodiment with the fringe pattern, auxiliary pattern, and motion detection pattern projected.
FIG. 5 shows the three-dimensional shape measurement result for the measurement object in the embodiment.
FIG. 6 is a cross-sectional view for explaining the structure of human skin.
FIG. 7 is a configuration diagram of another projection apparatus in the embodiment.
FIG. 8 is a configuration diagram of a conventional three-dimensional shape measurement apparatus.
FIG. 9 is a flowchart for explaining the operation of the conventional apparatus.
FIG. 10 is an explanatory diagram for explaining the principle of the phase shift method.
FIG. 11 shows a phase-restored image obtained by the phase shift method.
FIG. 12 is an explanatory diagram for explaining the phase jump problem in the phase shift method.
FIG. 13 is an explanatory diagram for explaining the phase jump problem in the phase shift method.
FIG. 14 is a restored image of absolute phase values obtained by the phase shift method.
FIG. 15 is an explanatory diagram for obtaining, by the principle of triangulation based on the absolute phase values, the absolute coordinate values in three-dimensional space of a projection point on the measurement object.

Explanation of Symbols

1 Projection apparatus
2 Imaging apparatus
3 Control apparatus

Claims (7)

光の明度が正弦波状に変化する縞パターンを計測対象物に投影し、縞パターンの投影方向と異なる方向から計測対象物を撮像した撮像画像を得る過程を、縞パターンの位相を一定間隔でずらしながら複数回実行し、前記複数回の過程で得られた複数枚の撮像画像における同一点での明度変化に基づいて縞パターンの位相値を演算するとともに該位相値から撮像画像上の各点における高さ情報を得ることで計測対象物の3次元形状を計測する3次元形状計測方法において、撮像画像における縞次数を確定するための補助パターンを、縞パターンを投影する光の波長と異なる波長の光で縞パターンと同時に投影し、縞パターンと補助パターンが同時に投影された撮像画像を、光の波長に基づいて縞パターンが投影された画像と補助パターンが投影された画像に分離し、補助パターンが投影された複数枚の画像から縞次数を確定することを特徴とする3次元形状計測方法。   The process of projecting a fringe pattern whose light intensity changes in a sinusoidal pattern onto a measurement object and obtaining a captured image obtained by imaging the measurement object from a direction different from the projection direction of the fringe pattern is shifted at regular intervals. The phase value of the fringe pattern is calculated based on the brightness change at the same point in the plurality of captured images obtained in the plurality of processes, and at each point on the captured image from the phase value. In a three-dimensional shape measurement method for measuring a three-dimensional shape of a measurement object by obtaining height information, an auxiliary pattern for determining a fringe order in a captured image has a wavelength different from the wavelength of light for projecting the fringe pattern. A captured image that is projected simultaneously with the fringe pattern with light, and the fringe pattern and the auxiliary pattern are projected at the same time, and an image with the fringe pattern projected based on the light wavelength and the auxiliary pattern are projected Three-dimensional shape measurement method characterized by being separated into image, the auxiliary pattern is determined a fringe order from a plurality of images projected. 
2. The three-dimensional shape measurement method according to claim 1, wherein light of a wavelength different from that of the fringe-pattern light and the auxiliary-pattern light is projected at substantially uniform intensity simultaneously with the fringe pattern and the auxiliary pattern and imaged; an image containing only the wavelength component of that substantially uniform light is separated from the captured image; the motion of the measurement object is detected from the plurality of such images obtained over the repeated process; and the height information of the measurement object is corrected based on the detected motion.

3. The three-dimensional shape measurement method according to claim 1 or 2, wherein the wavelength of the light projecting the fringe pattern is in the range of 380 to 590 nanometers.

4. The three-dimensional shape measurement method according to any one of claims 1 to 3, wherein the auxiliary pattern is, for each projection in which the phase of the fringe pattern is shifted, a pattern whose light and dark regions invert at a period obtained by multiplying the period of the fringe pattern by an integer value that differs for each projection.

5. The three-dimensional shape measurement method according to any one of claims 1 to 4, wherein the wavelengths of the fringe-pattern light and the auxiliary-pattern light are set such that the wavelengths of all the light projected onto the measurement object combine to form white light.
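The auxiliary patterns of claim 4, binary patterns whose periods are distinct integer multiples of the fringe period, let each pixel's fringe order be read off as a binary code across the captured auxiliary images. A minimal sketch of that decoding (plain binary weighting; a practical system might prefer Gray coding, and all names here are hypothetical):

```python
import numpy as np

def fringe_order(aux_bits):
    """Combine thresholded auxiliary-pattern images (coarsest period first)
    into a per-pixel fringe order n by plain binary weighting."""
    n = np.zeros_like(aux_bits[0], dtype=int)
    for bit in aux_bits:
        # Each finer pattern contributes the next lower-order bit
        n = (n << 1) | bit.astype(int)
    return n

# Three binary patterns distinguish 2**3 = 8 fringe periods per pixel
bits = [np.array([[1]]), np.array([[0]]), np.array([[1]])]
print(fringe_order(bits))  # [[5]]
```

Because each auxiliary image rides along with one phase-shift exposure at a different wavelength, no extra captures are needed beyond the phase-shift sequence itself, which is the advantage the abstract claims.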
6. A three-dimensional shape measurement apparatus for executing the three-dimensional shape measurement method according to any one of claims 1 to 5, comprising: projection means for projecting the fringe pattern and the auxiliary pattern simultaneously onto the measurement object; imaging means for capturing an image of the measurement object onto which the fringe pattern and the auxiliary pattern are projected; and image processing means for determining the fringe order by image-processing the captured image, calculating the phase value of the fringe pattern based on the determined fringe order, and obtaining height information at each point on the captured image from the phase value.

7. A three-dimensional shape measurement program for causing a computer to execute the three-dimensional shape measurement method according to any one of claims 1 to 5, the program causing the computer to execute: a procedure for causing a projection device to project the fringe pattern and the auxiliary pattern simultaneously onto the measurement object; a procedure for causing imaging means to capture an image of the measurement object onto which the fringe pattern and the auxiliary pattern are projected; and a procedure for determining the fringe order by image-processing the captured image, calculating the phase value of the fringe pattern based on the determined fringe order, and obtaining height information at each point on the captured image from the phase value.
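Once the fringe order n and the wrapped phase are both known, the image-processing step in the apparatus and program claims combines them into an absolute phase φ + 2πn, from which height is obtained by calibration. A sketch assuming a simple linear phase-to-height mapping, which the patent itself does not specify; the coefficient k is a hypothetical calibration constant:

```python
import numpy as np

def height_map(phi_wrapped, order, k=0.1):
    """Absolute phase = wrapped phase + 2*pi*order; height is modeled
    here as a linear function of absolute phase (k is a hypothetical
    calibration constant, not a value from the patent)."""
    return k * (phi_wrapped + 2 * np.pi * order)

# One pixel: wrapped phase 0.5 rad inside the 4th fringe (order 3)
h = height_map(np.array([[0.5]]), np.array([[3]]))
print(h)  # k * (0.5 + 6*pi) ≈ [[1.935]]
```

A calibrated system would replace the linear mapping with the triangulation geometry of the projector-camera pair.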
JP2004371461A 2004-12-22 2004-12-22 Three-dimensional shape measurement method, three-dimensional shape measurement device, and three-dimensional shape measurement program Active JP4670341B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2004371461A JP4670341B2 (en) 2004-12-22 2004-12-22 Three-dimensional shape measurement method, three-dimensional shape measurement device, and three-dimensional shape measurement program


Publications (2)

Publication Number Publication Date
JP2006177781A true JP2006177781A (en) 2006-07-06
JP4670341B2 JP4670341B2 (en) 2011-04-13

Family

ID=36732030

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2004371461A Active JP4670341B2 (en) 2004-12-22 2004-12-22 Three-dimensional shape measurement method, three-dimensional shape measurement device, and three-dimensional shape measurement program

Country Status (1)

Country Link
JP (1) JP4670341B2 (en)

Cited By (25)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008076107A (en) * 2006-09-19 2008-04-03 Denso Corp Apparatus and method for visual inspection, height measuring method, and circuit board manufacturing method
JP2008145139A (en) * 2006-12-06 2008-06-26 Mitsubishi Electric Corp Shape measuring device
JP2008185370A (en) * 2007-01-26 2008-08-14 Matsushita Electric Works Ltd Three-dimensional shape measuring device and method
WO2009019966A1 (en) * 2007-08-08 2009-02-12 Ckd Corporation Three-dimensional measurement device and board inspecting machine
JP2009075104A (en) * 2007-09-18 2009-04-09 Ncb Networks Co Ltd Apparatus and method for phase transition projection three-dimensional shape measurement with moire fringe generator
JP2009079934A (en) * 2007-09-25 2009-04-16 Shibuya Kogyo Co Ltd Three-dimensional measuring method
JP2009115612A (en) * 2007-11-06 2009-05-28 Panasonic Electric Works Co Ltd Three-dimensional shape measuring device and three-dimensional shape measurement method
JP2010507079A * 2006-10-16 2010-03-04 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus and method for non-contact detection of 3D contours
JP2010169572A (en) * 2009-01-23 2010-08-05 Nikon Corp Arithmetic apparatus, arithmetic program, surface shape measurement apparatus, and surface shape measurement method
JP2010169433A (en) * 2009-01-20 2010-08-05 Ckd Corp Three-dimensional measuring device
JP2010181299A (en) * 2009-02-06 2010-08-19 Ckd Corp Three-dimensional measuring instrument
JP2010271580A (en) * 2009-05-22 2010-12-02 Panasonic Electric Works Co Ltd Illumination system, space-rendering system, projection image generating method
JP2011133327A (en) * 2009-12-24 2011-07-07 Roland Dg Corp Method and apparatus for measurement of three-dimensional shape
JP2013104858A (en) * 2011-11-17 2013-05-30 Ckd Corp Three-dimensional measuring device
WO2013175827A1 * 2012-05-24 2013-11-28 Mitsubishi Electric Engineering Company, Limited Image capture device and image capture method
JP2014059239A (en) * 2012-09-18 2014-04-03 Fujitsu Ltd Shape measurement apparatus and shape measurement method
WO2015133053A1 * 2014-03-06 2015-09-11 Panasonic Intellectual Property Corporation of America Measurement system, measurement method, and vision chip
JP2017110975A (en) * 2015-12-15 2017-06-22 キヤノン株式会社 Measuring device, system, measurement method, determination method, and program
JP2019086483A * 2017-11-10 2019-06-06 Saki Corporation Method for acquiring height information on inspection device and inspection device
JP2019139069A * 2018-02-09 2019-08-22 Panasonic Intellectual Property Management Co., Ltd. Projection system
JP2020003342A * 2018-06-28 2020-01-09 SCREEN Holdings Co., Ltd. Shape measuring device and shape measuring method
CN112384751A (en) * 2018-07-10 2021-02-19 希罗纳牙科系统有限公司 Optical measuring method and optical measuring device
CN112562064A (en) * 2020-12-08 2021-03-26 四川大学 Precision lossless real-time calculation method and system for three-dimensional point cloud
CN113048912A (en) * 2021-02-26 2021-06-29 山东师范大学 Calibration system and method for projector
CN114424020A (en) * 2019-10-28 2022-04-29 电装波动株式会社 Three-dimensional measuring device

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0921620A (en) * 1995-07-05 1997-01-21 Fuji Facom Corp Method for measuring three-dimensional shape of object
JPH10300443A (en) * 1997-04-25 1998-11-13 Suzuki Motor Corp Three-dimensional shape measuring device
JPH11148810A (en) * 1997-09-09 1999-06-02 Ckd Corp Shape measuring instrument
JP2001159510A (en) * 1999-12-01 2001-06-12 Matsushita Electric Ind Co Ltd Three-dimensional shape measuring method and its device
JP2003057017A (en) * 2001-08-10 2003-02-26 Kao Corp Three-dimensional matter measuring instrument
JP2004325096A (en) * 2003-04-22 2004-11-18 Fujitsu Ltd Image processing method in lattice pattern projective method, image processor and measurement device


Cited By (46)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2008076107A (en) * 2006-09-19 2008-04-03 Denso Corp Apparatus and method for visual inspection, height measuring method, and circuit board manufacturing method
JP2010507079A * 2006-10-16 2010-03-04 Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. Apparatus and method for non-contact detection of 3D contours
JP2008145139A (en) * 2006-12-06 2008-06-26 Mitsubishi Electric Corp Shape measuring device
JP2008185370A (en) * 2007-01-26 2008-08-14 Matsushita Electric Works Ltd Three-dimensional shape measuring device and method
JP2009042015A (en) * 2007-08-08 2009-02-26 Ckd Corp Three-dimensional measuring device and board inspection machine
WO2009019966A1 (en) * 2007-08-08 2009-02-12 Ckd Corporation Three-dimensional measurement device and board inspecting machine
US8436890B2 (en) 2007-08-08 2013-05-07 Ckd Corporation Three-dimensional measuring device and board inspection device
TWI384194B (en) * 2007-08-08 2013-02-01 Ckd Corp Three-dimensional measuring device and substrate inspection machine
CN101889190B (en) * 2007-08-08 2012-10-17 Ckd株式会社 Three-dimensional measurement device and board inspecting machine
JP2009075104A (en) * 2007-09-18 2009-04-09 Ncb Networks Co Ltd Apparatus and method for phase transition projection three-dimensional shape measurement with moire fringe generator
JP2009079934A (en) * 2007-09-25 2009-04-16 Shibuya Kogyo Co Ltd Three-dimensional measuring method
JP2009115612A (en) * 2007-11-06 2009-05-28 Panasonic Electric Works Co Ltd Three-dimensional shape measuring device and three-dimensional shape measurement method
JP4744610B2 * 2009-01-20 2011-08-10 CKD Corporation 3D measuring device
JP2010169433A (en) * 2009-01-20 2010-08-05 Ckd Corp Three-dimensional measuring device
JP2010169572A (en) * 2009-01-23 2010-08-05 Nikon Corp Arithmetic apparatus, arithmetic program, surface shape measurement apparatus, and surface shape measurement method
JP2010181299A (en) * 2009-02-06 2010-08-19 Ckd Corp Three-dimensional measuring instrument
JP2010271580A (en) * 2009-05-22 2010-12-02 Panasonic Electric Works Co Ltd Illumination system, space-rendering system, projection image generating method
JP2011133327A (en) * 2009-12-24 2011-07-07 Roland Dg Corp Method and apparatus for measurement of three-dimensional shape
JP2013104858A (en) * 2011-11-17 2013-05-30 Ckd Corp Three-dimensional measuring device
CN104364606A (en) * 2012-05-24 2015-02-18 三菱电机工程技术株式会社 Image capture device and image capture method
EP2857793A4 (en) * 2012-05-24 2016-01-06 Mitsubishi Electric Eng Image capture device and image capture method
WO2013175827A1 * 2012-05-24 2013-11-28 Mitsubishi Electric Engineering Company, Limited Image capture device and image capture method
JP2013245980A (en) * 2012-05-24 2013-12-09 Mitsubishi Electric Engineering Co Ltd Imaging apparatus and imaging method
US9746318B2 (en) 2012-05-24 2017-08-29 Mitsubishi Electric Engineering Company, Limited Imaging apparatus and imaging method
JP2014059239A (en) * 2012-09-18 2014-04-03 Fujitsu Ltd Shape measurement apparatus and shape measurement method
CN106062507A (en) * 2014-03-06 2016-10-26 松下电器(美国)知识产权公司 Measurement system, measurement method, and vision chip
CN106062507B (en) * 2014-03-06 2020-03-17 松下电器(美国)知识产权公司 Measurement system, measurement method, and vision chip
KR20160130938A (en) * 2014-03-06 2016-11-15 파나소닉 인텔렉츄얼 프로퍼티 코포레이션 오브 아메리카 Measurement system, measurement method and vision chip
EP3115743A4 (en) * 2014-03-06 2017-04-12 Panasonic Intellectual Property Corporation of America Measurement system, measurement method, and vision chip
JP2015180857A * 2014-03-06 2015-10-15 Panasonic Intellectual Property Corporation of America Measurement system, measurement method and vision chip
US10240915B2 (en) 2014-03-06 2019-03-26 Panasonic Intellectual Property Corporation Of America Measurement system, measurement method, and vision chip
WO2015133053A1 * 2014-03-06 2015-09-11 Panasonic Intellectual Property Corporation of America Measurement system, measurement method, and vision chip
KR102217934B1 (en) * 2014-03-06 2021-02-19 파나소닉 인텔렉츄얼 프로퍼티 코포레이션 오브 아메리카 Measurement system, measurement method and vision chip
JP2017110975A (en) * 2015-12-15 2017-06-22 キヤノン株式会社 Measuring device, system, measurement method, determination method, and program
JP2019086483A * 2017-11-10 2019-06-06 Saki Corporation Method for acquiring height information on inspection device and inspection device
JP2019139069A * 2018-02-09 2019-08-22 Panasonic Intellectual Property Management Co., Ltd. Projection system
JP7236680B2 (en) 2018-02-09 2023-03-10 パナソニックIpマネジメント株式会社 projection system
JP2020003342A * 2018-06-28 2020-01-09 SCREEN Holdings Co., Ltd. Shape measuring device and shape measuring method
JP7022661B2 (en) 2018-06-28 2022-02-18 株式会社Screenホールディングス Shape measuring device and shape measuring method
CN112384751A (en) * 2018-07-10 2021-02-19 希罗纳牙科系统有限公司 Optical measuring method and optical measuring device
CN112384751B (en) * 2018-07-10 2023-03-07 希罗纳牙科系统有限公司 Optical measuring method and optical measuring device
CN114424020A (en) * 2019-10-28 2022-04-29 电装波动株式会社 Three-dimensional measuring device
CN112562064A (en) * 2020-12-08 2021-03-26 四川大学 Precision lossless real-time calculation method and system for three-dimensional point cloud
CN112562064B (en) * 2020-12-08 2023-03-14 四川大学 Precision lossless real-time calculation method and system for three-dimensional point cloud
CN113048912A (en) * 2021-02-26 2021-06-29 山东师范大学 Calibration system and method for projector
CN113048912B (en) * 2021-02-26 2022-07-19 山东师范大学 Calibration system and method of projector

Also Published As

Publication number Publication date
JP4670341B2 (en) 2011-04-13

Similar Documents

Publication Publication Date Title
JP4670341B2 (en) Three-dimensional shape measurement method, three-dimensional shape measurement device, and three-dimensional shape measurement program
US10347031B2 (en) Apparatus and method of texture mapping for dental 3D scanner
JP4830871B2 (en) 3D shape measuring apparatus and 3D shape measuring method
US8199335B2 (en) Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, three-dimensional shape measuring program, and recording medium
US7969583B2 (en) System and method to determine an object distance from a reference point to a point on the object surface
EP2428913B1 (en) Object classification for measured three-dimensional object scenes
JP4883517B2 (en) Three-dimensional measuring apparatus, three-dimensional measuring method, and three-dimensional measuring program
EP2775914B1 (en) 3d intraoral measurements using optical multiline method
EP3577416B1 (en) System and method for 3d scanning
JP5757950B2 (en) Non-contact object measurement
JP6364777B2 (en) Image data acquisition system and image data acquisition method
US20110316978A1 (en) Intensity and color display for a three-dimensional metrology system
US9074879B2 (en) Information processing apparatus and information processing method
JP2009115612A (en) Three-dimensional shape measuring device and three-dimensional shape measurement method
JP2011504230A (en) Optical measurement method of objects using trigonometry
EP3069100B1 (en) 3d mapping device
JP2010071782A (en) Three-dimensional measurement apparatus and method thereof
US11212508B2 (en) Imaging unit and system for obtaining a three-dimensional image
JP6104662B2 (en) Measuring device, method and program
MX2014005207A (en) Apparatus and method for simultaneous three-dimensional measuring of surfaces with multiple wavelengths.
JP2005520142A (en) Method and apparatus for measuring absolute coordinates of object
JP6237032B2 (en) Color and three-dimensional shape measuring method and apparatus
KR20170045232A (en) 3-d intraoral measurements using optical multiline method
JP2001330417A (en) Three-dimensional shape measuring method and apparatus using color pattern light projection
JP2012237613A (en) Shape measuring device and shape measuring method

Legal Events

Date Code Title Description
A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20070910

A977 Report on retrieval

Free format text: JAPANESE INTERMEDIATE CODE: A971007

Effective date: 20100311

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100316

RD04 Notification of resignation of power of attorney

Free format text: JAPANESE INTERMEDIATE CODE: A7424

Effective date: 20100623

A131 Notification of reasons for refusal

Free format text: JAPANESE INTERMEDIATE CODE: A131

Effective date: 20100713

A521 Written amendment

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20100913

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20101221

A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20110103

R151 Written notification of patent or utility model registration

Ref document number: 4670341

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R151

FPAY Renewal fee payment (event date is renewal date of database)

Free format text: PAYMENT UNTIL: 20140128

Year of fee payment: 3