JPWO2020039575A1 - 3D measuring device, 3D measuring method


Info

Publication number
JPWO2020039575A1
Authority
JP
Japan
Prior art keywords: pixel, sensitivity, pixels, low, pixel value
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2020537988A
Other languages
Japanese (ja)
Other versions
JP7051260B2 (en)
Inventor
伸章 田端
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Yamaha Motor Co Ltd
Original Assignee
Yamaha Motor Co Ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Application filed by Yamaha Motor Co Ltd
Publication of JPWO2020039575A1
Application granted
Publication of JP7051260B2
Legal status: Active


Classifications

    • G01B11/2527 — Measuring contours or curvatures by projecting a pattern on the object; projection by scanning of the object with phase change by in-plane movement of the pattern
    • G01B11/002 — Measuring arrangements characterised by the use of optical techniques for measuring two or more coordinates
    • G01B11/25 — Measuring contours or curvatures by projecting a pattern, e.g. one or more lines or moiré fringes, on the object
    • G01B11/2509 — Colour coding
    • G01B11/2513 — Several lines projected in more than one direction, e.g. grids, patterns
    • H05K3/34 — Assembling printed circuits with electric components; electrically connecting electric components or wires to printed circuits by soldering
    • G01B2210/56 — Measuring geometric parameters of semiconductor structures, e.g. profile, critical dimensions or trench depth

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Manufacturing & Machinery (AREA)
  • Microelectronics & Electronic Packaging (AREA)
  • Length Measuring Devices By Optical Means (AREA)
  • Solid State Image Pick-Up Elements (AREA)
  • Studio Devices (AREA)

Abstract

The imaging camera 31 includes high-sensitivity pixels Ph, whose spectral sensitivity characteristic SP(G) gives high sensitivity to the light of wavelength λg irradiating the solder B (the object), and low-sensitivity pixels Pl, whose spectral sensitivity characteristics SP(R) and SP(B) give low sensitivity to that light. Accordingly, the pattern light L(S) reflected by a high-reflectance region Ah of the surface of the solder B can be converted into an appropriate pixel value V by the low-sensitivity pixels Pl, while the pattern light L(S) reflected by a low-reflectance region Al can be converted into an appropriate pixel value V by the high-sensitivity pixels Ph. In other words, the pattern light L(S) reflected by the high-reflectance region Ah and that reflected by the low-reflectance region Al can both be converted into appropriate pixel values V. Thus, even when high-reflectance regions Ah and low-reflectance regions Al coexist on the solder B, accurate pixel values V can be acquired for both regions Ah and Al.

Description

The present invention relates to a technique for measuring the three-dimensional shape of an object.

Patent Document 1 describes a three-dimensional measuring device that measures the three-dimensional shape of an object such as solder printed on an electrode of a printed circuit board. This device irradiates the object with white light and measures the object's three-dimensional shape from an image of the light reflected by the object. In such a three-dimensional measuring device, an imaging camera having a plurality of pixels, each of which outputs a pixel value corresponding to the intensity of the incident light, can be used to capture the light reflected by the object.

Patent Document 1: Japanese Patent No. 4256059

Each pixel of an imaging camera has a dynamic range, and it cannot output an accurate pixel value for light darker or brighter than that range. One conceivable approach is therefore to increase the intensity of the light irradiating the object when the object's reflectance is low, and to decrease it when the reflectance is high. However, when regions of low reflectance and regions of high reflectance coexist on the object, raising the light intensity prevents accurate pixel values from being obtained for the high-reflectance regions, while lowering it prevents accurate pixel values from being obtained for the low-reflectance regions. Such an approach is therefore not always effective.
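The dilemma above can be illustrated with a minimal numerical sketch; all reflectance, gain and threshold values here are hypothetical and not taken from the patent:

```python
def to_pixel_value(intensity, gain, full_scale=255):
    """Convert incident light intensity to a pixel value, clipping at the
    sensor's dynamic range (hypothetical linear sensor model)."""
    return max(0, min(full_scale, round(intensity * gain)))

# Hypothetical incident intensities from two regions of the same object.
bright = 1000.0   # high-reflectance region
dark = 10.0       # low-reflectance region

# A gain (light intensity) tuned for the dark region saturates the bright one...
high_gain = 20.0
assert to_pixel_value(dark, high_gain) == 200     # usable
assert to_pixel_value(bright, high_gain) == 255   # clipped: inaccurate

# ...while a gain tuned for the bright region starves the dark one.
low_gain = 0.2
assert to_pixel_value(bright, low_gain) == 200    # usable
assert to_pixel_value(dark, low_gain) == 2        # near the noise floor
```

Whichever single illumination level is chosen, one of the two regions falls outside the usable range, which is the situation the invention addresses by mixing pixels of two different sensitivities.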

The present invention has been made in view of the above problem, and its object is to provide a technique that makes it possible to acquire accurate pixel values for both regions even when regions of low reflectance and regions of high reflectance coexist on an object.

A three-dimensional measuring device according to the present invention comprises: a projector that irradiates an object with light of a predetermined wavelength; an imaging camera having a plurality of pixels on which the light reflected by the object is incident, each of the plurality of pixels outputting a pixel value corresponding to the intensity of the incident light; and a control unit that calculates the three-dimensional shape of the object based on the pixel values. The plurality of pixels includes a plurality of high-sensitivity pixels and a plurality of low-sensitivity pixels whose spectral sensitivity characteristic has a lower ratio of output to input at the predetermined wavelength than the spectral sensitivity characteristic of the high-sensitivity pixels.

A three-dimensional measurement method according to the present invention comprises: a step of irradiating an object with light of a predetermined wavelength; a step of causing the light reflected by the object to be incident on a plurality of pixels, the plurality of pixels outputting pixel values corresponding to the intensity of the incident light; and a step of calculating the three-dimensional shape of the object based on the pixel values. The plurality of pixels includes a plurality of high-sensitivity pixels and a plurality of low-sensitivity pixels whose spectral sensitivity characteristic has a lower ratio of output to input at the predetermined wavelength than the spectral sensitivity characteristic of the high-sensitivity pixels.

In the present invention thus configured (three-dimensional measuring device and three-dimensional measurement method), the imaging camera has high-sensitivity pixels and low-sensitivity pixels whose spectral sensitivity characteristic has a lower ratio of output to input at the predetermined wavelength than that of the high-sensitivity pixels. In other words, the camera is provided with high-sensitivity pixels whose spectral sensitivity characteristic gives high sensitivity to the light of the predetermined wavelength irradiating the object, and low-sensitivity pixels whose spectral sensitivity characteristic gives low sensitivity compared with the high-sensitivity pixels. Accordingly, light reflected by high-reflectance regions of the object can be converted into appropriate pixel values by the low-sensitivity pixels, and light reflected by low-reflectance regions can be converted into appropriate pixel values by the high-sensitivity pixels. That is, both the light reflected by high-reflectance regions and the light reflected by low-reflectance regions can be converted into appropriate pixel values. Thus, even when regions of low reflectance and regions of high reflectance coexist on the object, accurate pixel values can be acquired for both regions.

The three-dimensional measuring device may also be configured so that the high-sensitivity pixels and the low-sensitivity pixels are arranged alternately. In this configuration the high-sensitivity and low-sensitivity pixels are arranged uniformly while adjoining one another, so light reflected at low reflectance can be captured accurately by the high-sensitivity pixels while light reflected at high reflectance is captured accurately by the low-sensitivity pixels. As a result, accurate pixel values can be acquired for both regions even when regions of low reflectance and regions of high reflectance coexist on the object.

The three-dimensional measuring device may also be configured so that the plurality of pixels includes high-sensitivity pixels and low-sensitivity pixels in equal proportions. In this configuration, the light reflected by high-reflectance regions and the light reflected by low-reflectance regions can both be converted into appropriate pixel values without bias toward either. As a result, accurate pixel values can be acquired for both regions even when regions of low reflectance and regions of high reflectance coexist on the object.

The three-dimensional measuring device may also be configured so that the predetermined wavelength is a green wavelength, the plurality of pixels is arranged in a Bayer array of red, green and blue in a predetermined pattern, each of the plurality of high-sensitivity pixels is a green pixel of the Bayer array, and the plurality of low-sensitivity pixels includes equal numbers of blue pixels and red pixels of the Bayer array. In this configuration, the green high-sensitivity pixels and the red or blue low-sensitivity pixels are arranged uniformly while adjoining one another, so that light reflected at low reflectance is captured accurately by the high-sensitivity pixels while light reflected at high reflectance is captured accurately by the low-sensitivity pixels. As a result, accurate pixel values can be acquired for both regions even when regions of low reflectance and regions of high reflectance coexist on the object.
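A short sketch of how a Bayer array satisfies both the alternation and the equal-ratio conditions; the RGGB phase chosen here is one common convention, and the actual phase used in the patent's figures may differ:

```python
def bayer_color(row, col):
    """Color of the pixel at (row, col) in an RGGB Bayer array."""
    if row % 2 == 0:
        return 'R' if col % 2 == 0 else 'G'
    return 'G' if col % 2 == 0 else 'B'

def is_high_sensitivity(row, col):
    """For green measurement light, green pixels act as the high-sensitivity
    pixels; red and blue pixels act as the low-sensitivity pixels."""
    return bayer_color(row, col) == 'G'

# Every 2x2 tile holds two green (high-sensitivity) pixels plus one red and
# one blue (low-sensitivity) pixel, so the two groups occur in the same
# ratio and every pixel adjoins pixels of the other group.
counts = {'R': 0, 'G': 0, 'B': 0}
for r in range(4):
    for c in range(4):
        counts[bayer_color(r, c)] += 1
assert counts == {'R': 4, 'G': 8, 'B': 4}
```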

Pixel values obtained by converting light that was reflected by a high-reflectance region and incident on a high-sensitivity pixel, or light that was reflected by a low-reflectance region and incident on a low-sensitivity pixel, are likely to be inappropriate. The control unit may therefore be configured to execute, for each of the plurality of pixels, a judgment process that determines from the pixel value whether the pixel value output from that pixel is appropriate, and to calculate the shape based on the result of the judgment process. In this configuration, the shape calculation can be executed using appropriate pixel values while suppressing the influence of inappropriate ones, so the three-dimensional shape of the object can be calculated accurately even when regions of low reflectance and regions of high reflectance coexist on the object.
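One plausible form of such a judgment process, sketched with hypothetical thresholds (the patent text here does not fix concrete criteria): a pixel's readings across the pattern images are rejected if any sample is saturated or if the fringe modulation is too weak to carry phase information.

```python
def pixel_value_ok(values, full_scale=255, min_modulation=8):
    """Judge whether one pixel's values across the pattern images are
    appropriate: no sample clipped at the top of the dynamic range, and
    enough fringe modulation above the noise floor.  Both thresholds are
    illustrative assumptions."""
    if max(values) >= full_scale:                   # saturated -> inaccurate
        return False
    if max(values) - min(values) < min_modulation:  # buried in noise
        return False
    return True

assert pixel_value_ok([10, 60, 110, 60])         # healthy modulation
assert not pixel_value_ok([100, 200, 255, 200])  # contains a clipped sample
assert not pixel_value_ok([5, 6, 5, 6])          # almost no modulation
```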

The control unit may also be configured so that, for a high-sensitivity pixel whose pixel value is judged inappropriate in the judgment process, it performs interpolation using the pixel values of low-sensitivity pixels located within a predetermined range of that high-sensitivity pixel, or, for a low-sensitivity pixel whose pixel value is judged inappropriate, it performs interpolation using the pixel values of high-sensitivity pixels located within a predetermined range of that low-sensitivity pixel, and calculates the shape based on the result. In this configuration, the pixel value of a pixel judged inappropriate is interpolated from the pixel values of pixels located within a predetermined range of it, and the shape calculation is executed on that basis. As a result, the three-dimensional shape of the object can be calculated accurately even when regions of low reflectance and regions of high reflectance coexist on the object.
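A minimal sketch of this interpolation, assuming a simple mean over the valid neighbours within a fixed window; the patent only requires that the substitute values come from pixels of the other sensitivity group within a predetermined range, so the window size and averaging rule here are illustrative:

```python
def interpolate_invalid(values, valid, r, c, radius=1):
    """Replace the value at (r, c), judged inappropriate, with the mean of
    the valid neighbours within `radius`.  In the patent's scheme those
    neighbours would be pixels of the opposite sensitivity group."""
    rows, cols = len(values), len(values[0])
    neighbours = []
    for dr in range(-radius, radius + 1):
        for dc in range(-radius, radius + 1):
            rr, cc = r + dr, c + dc
            if (dr, dc) != (0, 0) and 0 <= rr < rows and 0 <= cc < cols \
                    and valid[rr][cc]:
                neighbours.append(values[rr][cc])
    # Fall back to the original value when no valid neighbour exists.
    return sum(neighbours) / len(neighbours) if neighbours else values[r][c]

grid = [[10, 20], [30, 255]]          # 255: saturated, judged invalid
mask = [[True, True], [True, False]]
assert interpolate_invalid(grid, mask, 1, 1) == 20.0  # mean of 10, 20, 30
```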

The three-dimensional measuring device may also be configured so that the projector irradiates the object with light of a plurality of fringe patterns having the predetermined wavelength and mutually different phases, and the control unit calculates the shape by the phase shift method. In this configuration, the three-dimensional shape of the object can be calculated appropriately by the phase shift method even when regions of low reflectance and regions of high reflectance coexist on the object.

According to the present invention, accurate pixel values can thus be acquired for both regions even when regions of low reflectance and regions of high reflectance coexist on the object.

FIG. 1: Block diagram schematically illustrating a visual inspection apparatus according to the present invention.
FIG. 2: Diagram schematically showing the configuration of the imaging unit.
FIG. 3: Diagram schematically showing the spectral sensitivity characteristics of the pixels of the imaging unit.
FIG. 4: Diagram schematically showing the relationship between pixel values and the light reflected by the high-reflectance and low-reflectance regions.
FIG. 5: Flowchart showing an example of the three-dimensional measurement executed by the visual inspection apparatus.
FIG. 6: Flowchart showing an example of the interpolation of inappropriate pixels executed in the three-dimensional measurement of FIG. 5.
FIG. 7: Diagram explaining the computation executed in the three-dimensional measurement of FIG. 5.
FIG. 8: Diagram showing an example of the interpolation of inappropriate pixels.

FIG. 1 is a block diagram schematically illustrating a visual inspection apparatus according to the present invention. In this and the following figures, XYZ orthogonal coordinates, composed of the Z direction parallel to the vertical and the X and Y directions parallel to the horizontal, are shown as appropriate. The visual inspection apparatus 1 of FIG. 1 inspects the quality of the solder B that joins components (electronic components) to a board 10 (printed circuit board), with a control device 100 controlling a transport conveyor 2, an inspection head 3 and a drive mechanism 4.

The transport conveyor 2 transports the board 10 along a predetermined transport path. Specifically, it carries a board 10 awaiting inspection into the inspection position inside the visual inspection apparatus 1 and holds it horizontally at that position. When the inspection of the board 10 at the inspection position is finished, the transport conveyor 2 carries the inspected board 10 out of the visual inspection apparatus 1.

The inspection head 3 has an imaging camera 31 that images its imaging field of view V31 from above; the solder B on the board 10 carried into the inspection position is brought into the field of view V31 and imaged by the camera 31. The imaging camera 31 has a flat-plate-shaped imaging unit 311 that captures the light reflected from the solder B; its details are described later with reference to FIG. 2. The inspection head 3 further has a projector 32 that projects striped pattern light L(S), whose light intensity distribution varies sinusoidally, onto the field of view V31. The projector 32 has a light source such as an LED (Light Emitting Diode) and a digital micromirror device that reflects the light from the source toward the field of view V31. By adjusting the angle of each micromirror of the digital micromirror device, the projector 32 can project several kinds of pattern light L(S) with mutually different phases onto the field of view V31. The inspection head 3 can therefore measure the three-dimensional shape Bs of the solder B within the field of view V31 by the phase shift method, imaging with the camera 31 while changing the phase of the pattern light L(S) projected from the projector 32.

Incidentally, the inspection head 3 has eight projectors 32 (in FIG. 1, two projectors 32 are shown as representatives to simplify the illustration). The eight projectors 32 are arranged so as to surround the imaging camera 31, lined up circumferentially at an equal pitch about the vertical direction Z. Each projector 32 projects the pattern light L(S) onto the imaging field of view V31 of the camera 31 from diagonally above. The pattern light L(S) can therefore be projected onto the field of view V31 from whichever of the projectors 32 has an appropriate positional relationship with the solder B.

The drive mechanism 4 supports the inspection head 3 and drives it horizontally and vertically with motors. Driven by this mechanism, the inspection head 3 moves above the solder B, captures the solder B within the imaging field of view V31, and can measure the three-dimensional shape Bs of the solder B within the field of view V31.

The control device 100 has a main control unit 110, a processor composed of a CPU (Central Processing Unit) and memory; the inspection is executed under the overall control that the main control unit 110 exercises over each part of the apparatus. The control device 100 also has a user interface 200 composed of input/output devices such as a display, keyboard and mouse, through which the user can input commands to the control device 100 and check the inspection results it produces. The control device 100 further has a projection control unit 120 that controls the projectors 32, an imaging control unit 130 that controls the imaging camera 31, and a drive control unit 140 that controls the drive mechanism 4. When the transport conveyor 2 carries the board 10 into the inspection position, the main control unit 110 controls the drive mechanism 4 through the drive control unit 140 to move the inspection head 3 above the solder B on the board 10, so that the solder B falls within the imaging field of view V31 of the camera 31.

Subsequently, the main control unit 110 projects the pattern light L(S) from a projector 32 onto the imaging field of view V31 containing the solder B, while imaging the projected pattern light L(S) with the camera 31 (pattern imaging operation). Specifically, the main control unit 110 has a storage unit 150 composed of non-volatile memory and reads out a projection pattern T(S) stored there. The main control unit 110 then controls the projection control unit 120 based on the projection pattern T(S) read from the storage unit 150, adjusting the angle of each micromirror of the projector's digital micromirror device according to T(S); the pattern light L(S) carrying the projection pattern T(S) is thereby projected onto the field of view V31. Further, the main control unit 110 controls the imaging control unit 130 so that the camera 31 images the pattern light L(S) projected onto the field of view V31 and acquires a captured image I(S), which is stored in the storage unit 150. The storage unit 150 stores four kinds of projection pattern T(S) whose phases differ from one another by 90 degrees, and the pattern imaging operation is executed four times while changing the projection pattern T(S) (S = 1, 2, 3, 4). As a result, four captured images I(S) are acquired, imaging pattern lights L(S) whose phases differ by 90 degrees.
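The four projection patterns can be sketched as a sinusoidal fringe sampled at four phase offsets 90 degrees apart. The period and normalization below are illustrative; the actual patterns T(S) stored in the storage unit 150 are not specified in this text:

```python
import math

def fringe_intensity(x, s, period=32.0):
    """Normalized intensity at position x of the s-th fringe pattern
    (s = 1..4), each pattern shifted 90 degrees from the previous one."""
    phase_shift = (s - 1) * math.pi / 2
    return 0.5 + 0.5 * math.cos(2 * math.pi * x / period + phase_shift)

# Patterns 1 and 3 are in antiphase, as are patterns 2 and 4.
for x in range(32):
    assert abs(fringe_intensity(x, 1) + fringe_intensity(x, 3) - 1.0) < 1e-9
    assert abs(fringe_intensity(x, 2) + fringe_intensity(x, 4) - 1.0) < 1e-9
```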

From the four captured images I(S) thus acquired, the main control unit 110 obtains the height of the imaging field of view V31 for each pixel of the camera 31 by the phase shift method. The height of the surface of the solder B is thereby obtained for each pixel of the imaging camera 31.
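At each pixel, the phase recovery step of this computation can be sketched with the standard 4-step phase-shift formula; the subsequent mapping from phase to height depends on the triangulation geometry and is not reproduced here, and sign conventions vary between implementations:

```python
import math

def phase_from_four(v1, v2, v3, v4):
    """Recover the fringe phase at one pixel from the four pixel values
    taken under patterns shifted by 90 degrees each.  With
    V(S) = A + B*cos(phi + (S-1)*pi/2):
        v4 - v2 = 2*B*sin(phi),   v1 - v3 = 2*B*cos(phi),
    so phi = atan2(v4 - v2, v1 - v3)."""
    return math.atan2(v4 - v2, v1 - v3)

# Round-trip check with a synthetic pixel: offset A, modulation B, phase phi.
A, B, phi = 100.0, 50.0, 0.8
samples = [A + B * math.cos(phi + s * math.pi / 2) for s in range(4)]
assert abs(phase_from_four(*samples) - phi) < 1e-9
```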

FIG. 2 schematically shows the configuration of the imaging unit. The imaging unit 311 has a solid-state image sensor 312, for example a CCD (Charge Coupled Device) image sensor, and a color filter 313 laid over the sensor 312. The solid-state image sensor 312 has a plurality of light-receiving pixels Pi arranged at a constant pitch ΔP in each of the X and Y directions; that is, the light-receiving pixels Pi of the sensor 312 are arranged two-dimensionally. Likewise, the color filter 313 has a plurality of filter pixels Pf arranged at the pitch ΔP in each of the X and Y directions; that is, the filter pixels Pf of the color filter 313 are arranged two-dimensionally.

The light-receiving pixels Pi and the filter pixels Pf are thus provided in one-to-one correspondence, with each corresponding pair facing each other. In other words, in the imaging unit 311 a pixel Px is composed of a mutually facing light-receiving pixel Pi and filter pixel Pf, and the pixels Px are arranged at the pitch ΔP in each of the X and Y directions. Each pixel Px outputs, from its light-receiving pixel Pi, a pixel value V (FIG. 3) corresponding to the intensity of the light that has passed through the filter pixel Pf and struck the light-receiving pixel Pi.

図2に示すように、カラーフィルター313では、複数のフィルター画素Pfがベイヤー配列に従って配列されており、各フィルター画素Pfは、赤(R)、緑(G)および青(B)のうち、その配列位置に応じた色の光の透過を許容し、配列位置に応じた色とは異なる色の光の透過を制限する。したがって、撮像ユニット311の各画素Pxは、そのフィルター画素Pfが透過を許容する光の色に応じた分光感度特性を有する。 As shown in FIG. 2, in the color filter 313, the plurality of filter pixels Pf are arranged according to a Bayer array. Each filter pixel Pf allows transmission of light of the color, among red (R), green (G), and blue (B), that corresponds to its position in the array, and restricts transmission of light of the other colors. Therefore, each pixel Px of the imaging unit 311 has a spectral sensitivity characteristic corresponding to the color of light that its filter pixel Pf allows to pass.

図3は撮像ユニットが有する画素の分光感度特性を模式的に示す図である。図3では、横軸に光の波長を示すとともに縦軸に画素値Vを示すグラフにおいて、赤(R)の透過を許容するフィルター画素Pfを有する画素Px(以下、「赤の画素Px」と適宜称する)の分光感度特性SP(R)が二点鎖線で示され、緑(G)の透過を許容するフィルター画素Pfを有する画素Px(以下、「緑の画素Px」と適宜称する)の分光感度特性SP(G)が破線で示され、青(B)の透過を許容するフィルター画素Pfを有する画素Px(以下、「青の画素Px」と適宜称する)の分光感度特性SP(B)が一点鎖線で示される。また、図3では、プロジェクター32から投影されるパターン光L(S)の波長分布が実線で併記されている。 FIG. 3 is a diagram schematically showing the spectral sensitivity characteristics of the pixels of the imaging unit. In FIG. 3, in a graph showing the wavelength of light on the horizontal axis and the pixel value V on the vertical axis, the spectral sensitivity characteristic SP(R) of a pixel Px having a filter pixel Pf that allows transmission of red (R) (hereinafter referred to as the "red pixel Px" where appropriate) is indicated by a two-dot chain line, the spectral sensitivity characteristic SP(G) of a pixel Px having a filter pixel Pf that allows transmission of green (G) (hereinafter the "green pixel Px") is indicated by a broken line, and the spectral sensitivity characteristic SP(B) of a pixel Px having a filter pixel Pf that allows transmission of blue (B) (hereinafter the "blue pixel Px") is indicated by a one-dot chain line. In addition, in FIG. 3, the wavelength distribution of the pattern light L(S) projected from the projector 32 is also shown by a solid line.

つまり、本実施形態では、パターン光L(S)は、緑色の波長λgにピークを有する波長分布を有する(換言すれば、緑色の発光スペクトルを有する)。これに対して、緑の画素Pxは、パターン光L(S)の波長λgに対して高い感度を持つ分光感度特性SP(G)を有する。赤の画素Pxは、波長λgよりも長い波長にピークを持つ分光感度特性SP(R)を有し、パターン光L(S)の波長λgに対して緑の画素Pxよりも低い感度を有する。青の画素Pxは、波長λgよりも短い波長にピークを持つ分光感度特性SP(B)を有し、パターン光L(S)の波長λgに対して緑の画素Pxよりも低い感度を有する。 That is, in the present embodiment, the pattern light L (S) has a wavelength distribution having a peak at the green wavelength λg (in other words, has a green emission spectrum). On the other hand, the green pixel Px has a spectral sensitivity characteristic SP (G) having high sensitivity to the wavelength λg of the pattern light L (S). The red pixel Px has a spectral sensitivity characteristic SP (R) having a peak at a wavelength longer than the wavelength λg, and has a lower sensitivity than the green pixel Px with respect to the wavelength λg of the pattern light L (S). The blue pixel Px has a spectral sensitivity characteristic SP (B) having a peak at a wavelength shorter than the wavelength λg, and has a lower sensitivity than the green pixel Px with respect to the wavelength λg of the pattern light L (S).

つまり、図2に示すように、撮像ユニット311が有する複数の画素Pxのうち、緑の画素Pxが波長λgに対して高い感度を示す高感度画素Phとして機能し、赤の画素Pxおよび青の画素Pxのそれぞれが波長λgに対して高感度画素Phより低い感度を示す低感度画素Plとして機能する。そして、高感度画素Ph(緑の画素Px)と低感度画素Pl(赤の画素Px)がY方向に交互に並ぶとともに、高感度画素Ph(緑の画素Px)と低感度画素Pl(青の画素Px)がX方向に交互に並ぶ。こうして、高感度画素Ph(緑の画素Px)に対して低感度画素Pl(赤の画素Px)がY方向の両側で隣接し、高感度画素Ph(緑の画素Px)に対して低感度画素Pl(青の画素Px)がX方向の両側で隣接する。換言すれば、高感度画素Phには4方向から低感度画素Plが隣接し、低感度画素Plには、4方向から高感度画素Phが隣接する。なお、ここで、画素Pxが隣接するとは、対象の2個の画素Pxが配列ピッチΔPで配置された状態を示すものとする。 That is, as shown in FIG. 2, among the plurality of pixels Px included in the imaging unit 311, the green pixels Px function as high-sensitivity pixels Ph exhibiting high sensitivity to the wavelength λg, while the red pixels Px and the blue pixels Px each function as low-sensitivity pixels Pl exhibiting lower sensitivity to the wavelength λg than the high-sensitivity pixels Ph. The high-sensitivity pixels Ph (green pixels Px) and the low-sensitivity pixels Pl (red pixels Px) are arranged alternately in the Y direction, and the high-sensitivity pixels Ph (green pixels Px) and the low-sensitivity pixels Pl (blue pixels Px) are arranged alternately in the X direction. Thus, low-sensitivity pixels Pl (red pixels Px) are adjacent to a high-sensitivity pixel Ph (green pixel Px) on both sides in the Y direction, and low-sensitivity pixels Pl (blue pixels Px) are adjacent to a high-sensitivity pixel Ph (green pixel Px) on both sides in the X direction. In other words, each high-sensitivity pixel Ph has low-sensitivity pixels Pl adjacent to it in four directions, and each low-sensitivity pixel Pl has high-sensitivity pixels Ph adjacent to it in four directions. Here, two pixels Px are said to be adjacent when they are arranged at the array pitch ΔP from each other.
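The adjacency described above can be illustrated with a short sketch. The GRBG tiling and the `bayer_mask` helper below are assumptions for illustration only (the embodiment specifies a Bayer array but not the tile phase); the point is that every green (high-sensitivity) pixel has only red or blue (low-sensitivity) 4-connected neighbours, and vice versa.

```python
import numpy as np

def bayer_mask(rows, cols):
    """Label each pixel of an assumed GRBG Bayer layout:
    'G' = high-sensitivity pixel Ph, 'R'/'B' = low-sensitivity pixels Pl."""
    tile = np.array([["G", "R"],
                     ["B", "G"]])
    reps = (rows // 2 + 1, cols // 2 + 1)
    return np.tile(tile, reps)[:rows, :cols]

mask = bayer_mask(4, 4)
# Interior green pixel at (1, 1): its four 4-connected neighbours
# (one array pitch away) are all R or B, matching the text.
neighbours = [mask[0, 1], mask[2, 1], mask[1, 0], mask[1, 2]]
```

The same check holds for any interior pixel: swapping the roles, a red or blue pixel has only green pixels at its four neighbours.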

かかる構成では、図4に示すように、半田Bの表面のうち、高い反射率を有する高反射領域Ahで反射された光および低い反射率を有する低反射領域Alで反射された光の両方を、画素Pxによって正確な画素値Vに変換することができる。 In such a configuration, as shown in FIG. 4, both the light reflected by the high reflection region Ah having a high reflectance and the light reflected by the low reflection region Al having a low reflectance on the surface of the solder B can be converted into accurate pixel values V by the pixels Px.

図4は高反射領域および低反射領域それぞれで反射された光と画素値との関係を模式的に示す図である。同図は、正弦波であるパターン光L(S)を高反射領域Ahおよび低反射領域Alのそれぞれに投影して、各領域Ah、Alで反射された光を赤(R)、緑(G)、青(B)の画素Pxで検知した際に、これらの画素Pxが出力する画素値Vを模式的に示す。なお、実際には、画素Pxから出力される画素値VはダイナミックレンジDから外れた値となることはできないため、画素値Vの波形がつぶれるが、ここでは波形をつぶさずに示した。 FIG. 4 is a diagram schematically showing the relationship between the pixel value and the light reflected by each of the high reflection region and the low reflection region. The figure schematically shows the pixel values V output by the red (R), green (G), and blue (B) pixels Px when the sinusoidal pattern light L(S) is projected onto each of the high reflection region Ah and the low reflection region Al and the light reflected by each of the regions Ah and Al is detected by these pixels Px. In reality, the pixel value V output from a pixel Px cannot take a value outside the dynamic range D, so the waveform of the pixel value V is clipped; here, however, the waveform is shown without clipping.

高反射領域Ahで反射されたパターン光L(S)を検知した緑(G)の画素Px(高感度画素Ph)が出力する画素値Vは、画素PxのダイナミックレンジD(換言すれば受光画素PiのダイナミックレンジD)の上限を一部で超える。したがって、緑の画素Pxは、高反射領域Ahで反射されたパターン光L(S)を正確な画素値Vに変換することができない。一方、高反射領域Ahで反射されたパターン光L(S)を検知した赤(R)および青(B)の画素Px(低感度画素Pl)が出力する画素値Vは、画素PxのダイナミックレンジDに収まる。したがって、赤(R)あるいは青(B)の画素Pxは、高反射領域Ahで反射されたパターン光L(S)を正確な画素値Vに変換することができる。 The pixel value V output by a green (G) pixel Px (high-sensitivity pixel Ph) that detects the pattern light L(S) reflected by the high reflection region Ah partially exceeds the upper limit of the dynamic range D of the pixel Px (in other words, the dynamic range D of the light receiving pixel Pi). Therefore, the green pixel Px cannot convert the pattern light L(S) reflected by the high reflection region Ah into an accurate pixel value V. On the other hand, the pixel values V output by the red (R) and blue (B) pixels Px (low-sensitivity pixels Pl) that detect the pattern light L(S) reflected by the high reflection region Ah fall within the dynamic range D of the pixels Px. Therefore, the red (R) or blue (B) pixels Px can convert the pattern light L(S) reflected by the high reflection region Ah into accurate pixel values V.

低反射領域Alで反射されたパターン光L(S)を検知した赤(R)および青(B)の画素Px(低感度画素Pl)が出力する画素値Vは、画素PxのダイナミックレンジDの下限を一部で超える。したがって、赤(R)および青(B)の画素Pxは、低反射領域Alで反射されたパターン光L(S)を正確な画素値Vに変換することができない。一方、低反射領域Alで反射されたパターン光L(S)を検知した緑(G)の画素Px(高感度画素Ph)が出力する画素値Vは、画素PxのダイナミックレンジDに収まる。したがって、緑(G)の画素Pxは、低反射領域Alで反射されたパターン光L(S)を正確な画素値Vに変換することができる。 The pixel values V output by the red (R) and blue (B) pixels Px (low-sensitivity pixels Pl) that detect the pattern light L(S) reflected by the low reflection region Al partially fall below the lower limit of the dynamic range D of the pixels Px. Therefore, the red (R) and blue (B) pixels Px cannot convert the pattern light L(S) reflected by the low reflection region Al into accurate pixel values V. On the other hand, the pixel value V output by a green (G) pixel Px (high-sensitivity pixel Ph) that detects the pattern light L(S) reflected by the low reflection region Al falls within the dynamic range D of the pixel Px. Therefore, the green (G) pixel Px can convert the pattern light L(S) reflected by the low reflection region Al into an accurate pixel value V.

つまり、高反射領域Ahで反射されたパターン光L(S)は、赤(R)および青(B)の画素Px(低感度画素Pl)によって正確な画素値Vに変換でき、低反射領域Alで反射されたパターン光L(S)は、緑(G)の画素Px(高感度画素Ph)によって正確な画素値Vに変換できる。 That is, the pattern light L(S) reflected by the high reflection region Ah can be converted into accurate pixel values V by the red (R) and blue (B) pixels Px (low-sensitivity pixels Pl), and the pattern light L(S) reflected by the low reflection region Al can be converted into accurate pixel values V by the green (G) pixels Px (high-sensitivity pixels Ph).

図5は外観検査装置が実行する三次元計測の一例を示すフローチャートであり、図6は図5の三次元計測で実行される不適画素の補間の一例を示すフローチャートであり、図7は図5の三次元計測で実行される演算の内容を説明する図である。図5および図6は主制御部110の制御によって実行される。 FIG. 5 is a flowchart showing an example of the three-dimensional measurement executed by the visual inspection apparatus, FIG. 6 is a flowchart showing an example of the interpolation of unsuitable pixels executed in the three-dimensional measurement of FIG. 5, and FIG. 7 is a diagram explaining the calculations executed in the three-dimensional measurement of FIG. 5. The flowcharts of FIGS. 5 and 6 are executed under the control of the main control unit 110.

ステップS101では、半田Bにパターン光L(S)を投影しつつ撮像カメラ31によりパターン光L(S)を撮像するパターン撮像動作を、パターン光L(S)の位相を90度ずつ変更しつつ繰り返し実行することで、90度ずつ位相が異なる4つの撮像画像I(S)が取得される(S=1、2、3、4)。 In step S101, a pattern imaging operation, in which the imaging camera 31 captures the pattern light L(S) while the pattern light L(S) is projected onto the solder B, is repeatedly executed while the phase of the pattern light L(S) is changed in steps of 90 degrees, whereby four captured images I(S) whose phases differ by 90 degrees are acquired (S = 1, 2, 3, 4).

ステップS102では、主制御部110は、位相シフト法に基づき、これら撮像画像I(S)から半田Bの三次元形状Bsを示す三次元画像を算出する。具体的には、図7の式1に基づいて、4つの撮像画像I(S)の画素値V0〜V3から角度θを求める演算を、複数の画素Pxのそれぞれについて実行することで、三次元画像が得られる。 In step S102, the main control unit 110 calculates, based on the phase shift method, a three-dimensional image showing the three-dimensional shape Bs of the solder B from these captured images I(S). Specifically, the three-dimensional image is obtained by executing, for each of the plurality of pixels Px, the calculation of Equation 1 in FIG. 7, which obtains the angle θ from the pixel values V0 to V3 of the four captured images I(S).
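Equation 1 itself appears only in FIG. 7 and is not reproduced in the text. As a hedged sketch, the standard four-bucket phase-shift formula for images whose phases differ by 90 degrees is shown below; the function name and the synthetic test values are illustrative assumptions, not the patent's exact implementation.

```python
import math

def phase_from_samples(v0, v1, v2, v3):
    """Four-bucket phase recovery for samples taken at phase offsets
    of 0, 90, 180, and 270 degrees: theta = atan2(v3 - v1, v0 - v2)."""
    return math.atan2(v3 - v1, v0 - v2)

# Synthetic check: sample a sinusoid V(s) = A*cos(theta + s*90deg) + offset
# for s = 0..3, as the four phase-shifted captures would.
A, offset, theta = 40.0, 128.0, 0.7
v = [A * math.cos(theta + s * math.pi / 2) + offset for s in range(4)]
recovered = phase_from_samples(*v)
```

Note that the constant offset and the amplitude A cancel in the differences, which is why the method tolerates varying surface brightness.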

ステップS103では、主制御部110は、複数の画素Pxそれぞれの画素値Vの信頼度を示す信頼度画像を算出する。この信頼度は、画素Pxの画素値VがダイナミックレンジDに収まっているか否かを示す。つまり、画素Pxの画素値Vが明るすぎるあるいは暗すぎる場合には、信頼度が低くなる。具体的には、図7の式2に基づいて、4つの撮像画像I(S)の画素値V0〜V3から信頼度を求める演算を、複数の画素Pxのそれぞれについて実行することで、信頼度画像が得られる。なお、これらの画素値V0〜V3のうちに、飽和した画素値(すなわち、8bitで表した場合には「255」を示す画素値)が存在する場合には、図7の式2によらずに画素値Vの信頼度を「0」にする。 In step S103, the main control unit 110 calculates a reliability image showing the reliability of the pixel value V of each of the plurality of pixels Px. This reliability indicates whether or not the pixel value V of the pixel Px falls within the dynamic range D; that is, when the pixel value V of the pixel Px is too bright or too dark, the reliability becomes low. Specifically, the reliability image is obtained by executing, for each of the plurality of pixels Px, the calculation of Equation 2 in FIG. 7, which obtains the reliability from the pixel values V0 to V3 of the four captured images I(S). If a saturated pixel value (that is, a pixel value of "255" when expressed in 8 bits) exists among the pixel values V0 to V3, the reliability of the pixel value V is set to "0" regardless of Equation 2 in FIG. 7.
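Equation 2 likewise appears only in FIG. 7. The sketch below assumes, purely for illustration, the modulation amplitude of the four samples as the reliability measure (a common choice in phase-shift profilometry), combined with the 8-bit saturation override at 255 that the text does state explicitly.

```python
import math

def reliability(v0, v1, v2, v3, saturated=255):
    """Assumed stand-in for Equation 2: modulation amplitude of the
    four samples; any saturated sample forces reliability 0, as stated."""
    if saturated in (v0, v1, v2, v3):
        return 0.0  # clipped sample: pixel value V cannot be trusted
    return 0.5 * math.hypot(v3 - v1, v0 - v2)
```

A pixel whose samples barely modulate (too dark, or uniformly near the top of the range) yields a small amplitude and hence a low reliability, matching the "too bright or too dark" criterion above.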

ステップS104では、図6に示す不適画素の補間が実行される。ステップS201で、複数の画素Pxを識別するためのカウント値Nがゼロにリセットされ、ステップS202で、カウント値Nがインクリメントされる。そして、カウント値Nの画素Pxの画素値Vの信頼度が閾値以上であるか否かが判断される(ステップS203)。信頼度が閾値以上である場合(ステップS203で「YES」の場合)には、ステップS202に戻ってカウント値Nがインクリメントされる。 In step S104, the interpolation of unsuitable pixels shown in FIG. 6 is executed. In step S201, the count value N for identifying the plurality of pixels Px is reset to zero, and in step S202, the count value N is incremented. Then, it is determined whether or not the reliability of the pixel value V of the pixel Px having the count value N is equal to or greater than the threshold value (step S203). When the reliability is equal to or higher than the threshold value (when “YES” in step S203), the process returns to step S202 and the count value N is incremented.

信頼度が閾値未満である場合(ステップS203で「NO」の場合)には、このカウント値Nの画素Px(不適画素)から配列ピッチΔP以内に位置する画素Px、すなわち不適画素に隣接する4個の画素Pxの画素値Vによって、不適画素の画素値Vを補間可能であるかが判断される(ステップS204)。具体的には、これら4個の画素Pxのうちに閾値未満の信頼度を有する画素Pxが在る場合には、補間不能と判断され、これら4個の画素Pxの画素値Vの全てが閾値以上の信頼度を有する場合には、補間可能と判断される。 When the reliability is less than the threshold value ("NO" in step S203), it is determined whether the pixel value V of the pixel Px with this count value N (the unsuitable pixel) can be interpolated from the pixel values V of the pixels Px located within the array pitch ΔP of the unsuitable pixel, that is, the four pixels Px adjacent to it (step S204). Specifically, when any of these four pixels Px has a reliability less than the threshold value, interpolation is determined to be impossible; when the pixel values V of all four pixels Px have reliabilities equal to or greater than the threshold value, interpolation is determined to be possible.

補間不能の場合(ステップS204で「NO」の場合)には、ステップS202に戻ってカウント値Nがインクリメントされる。補間可能の場合(ステップS204で「YES」の場合)には、補間演算が実行されて、不適画素の画素値Vが、当該不適画素に隣接する4個の画素Pxの画素値Vによって補間される(ステップS205)。つまり、隣接する4個の画素Pxの画素値V0によって、対象となる画素Pxの画素値V0が補間され、画素値V1〜V3についても同様に補間される。かかる補間演算は、線形補間あるいは多項式補間等、周知の補間方法を用いて実行できる。また、ステップS205では、図7の式1に基づき、補間された画素値V(V0〜V3)から角度θが算出され、ステップS102で算出された三次元画像における該当画素Px(すなわち、ステップS205の補間対象となった画素Px)として採用される。そして、カウント値Nが最大値となるまで(ステップS206で「YES」となるまで)ステップS202〜S205が実行される。 If interpolation is impossible ("NO" in step S204), the process returns to step S202 and the count value N is incremented. If interpolation is possible ("YES" in step S204), an interpolation calculation is executed, and the pixel value V of the unsuitable pixel is interpolated from the pixel values V of the four pixels Px adjacent to it (step S205). That is, the pixel value V0 of the target pixel Px is interpolated from the pixel values V0 of the four adjacent pixels Px, and the pixel values V1 to V3 are interpolated in the same manner. Such an interpolation calculation can be executed using a well-known interpolation method such as linear interpolation or polynomial interpolation. Further, in step S205, the angle θ is calculated from the interpolated pixel values V (V0 to V3) based on Equation 1 in FIG. 7 and is adopted for the corresponding pixel Px (that is, the pixel Px that was the object of interpolation in step S205) in the three-dimensional image calculated in step S102. Steps S202 to S205 are then executed until the count value N reaches the maximum value ("YES" in step S206).
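The loop in steps S201 to S206 can be sketched as follows. This is a simplified illustration under stated assumptions: the function name and the 2D-list layout are hypothetical, a single value per pixel is interpolated (the actual device repeats this for each of V0 to V3 and then recomputes θ), and border pixels are skipped for brevity.

```python
def interpolate_unsuitable(values, reliab, threshold):
    """Replace each interior pixel whose reliability is below `threshold`
    with the mean of its four 4-connected neighbours, but only when all
    four neighbours are themselves reliable (steps S203-S205)."""
    h, w = len(values), len(values[0])
    out = [row[:] for row in values]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            if reliab[y][x] >= threshold:
                continue  # trusted pixel: keep its value as-is
            neigh = [(y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)]
            if all(reliab[ny][nx] >= threshold for ny, nx in neigh):
                out[y][x] = sum(values[ny][nx] for ny, nx in neigh) / 4.0
            # otherwise: interpolation impossible, value left unchanged (S204 "NO")
    return out

# Toy example: the centre pixel is unreliable, its four neighbours are valid.
values = [[0, 10, 0],
          [20, 99, 40],
          [0, 30, 0]]
reliab = [[1, 1, 1],
          [1, 0, 1],
          [1, 1, 1]]
corrected = interpolate_unsuitable(values, reliab, threshold=0.5)
```

Because a green pixel's four neighbours are all red/blue (and vice versa), this interpolation always replaces an over- or under-exposed pixel with values from pixels of the opposite sensitivity class.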

以上に説明した実施形態では、撮像カメラ31は、半田B(対象物)に照射される波長λgの光に対して、高い感度を持つ分光感度特性SP(G)を有する高感度画素Phと、低い感度を持つ分光感度特性SP(R)、SP(B)を有する低感度画素Plとを備える。したがって、半田Bの表面のうち、高反射領域Ahで反射されたパターン光L(S)は、低感度画素Plにより適切な画素値Vに変換でき、低反射領域Alで反射されたパターン光L(S)は、高感度画素Phにより適切な画素値Vに変換できる。つまり、高反射領域Ahおよび低反射領域Alで反射されたパターン光L(S)の両方を、適切な画素値Vに変換できる。こうして、高反射領域Ahと低反射領域Alとが半田Bに混在する場合であっても、両領域Ah、Alについて正確な画素値Vを取得することが可能となっている。 In the embodiment described above, the imaging camera 31 includes high-sensitivity pixels Ph having a spectral sensitivity characteristic SP(G) with high sensitivity to the light of wavelength λg irradiated onto the solder B (object), and low-sensitivity pixels Pl having spectral sensitivity characteristics SP(R) and SP(B) with low sensitivity to that light. Therefore, of the surface of the solder B, the pattern light L(S) reflected by the high reflection region Ah can be converted into appropriate pixel values V by the low-sensitivity pixels Pl, and the pattern light L(S) reflected by the low reflection region Al can be converted into appropriate pixel values V by the high-sensitivity pixels Ph. That is, both the pattern light L(S) reflected by the high reflection region Ah and the pattern light L(S) reflected by the low reflection region Al can be converted into appropriate pixel values V. In this way, even when the high reflection region Ah and the low reflection region Al are mixed on the solder B, accurate pixel values V can be acquired for both regions Ah and Al.

また、高感度画素Phと低感度画素Plとが交互に配列されている。かかる構成では、高感度画素Phと低感度画素Plとが互いに隣接しつつ均一に配置されるため、低反射領域Alで反射されたパターン光L(S)を高感度画素Phで的確に捉えつつ、高反射領域Ahで反射されたパターン光L(S)を低感度画素Plで的確に捉えることができる。その結果、高反射領域Ahと低反射領域Alとが半田Bに混在する場合であっても、両領域Ah、Alについて正確な画素値Vを取得することが可能となる。 Further, the high-sensitivity pixels Ph and the low-sensitivity pixels Pl are arranged alternately. In this configuration, the high-sensitivity pixels Ph and the low-sensitivity pixels Pl are uniformly arranged while being adjacent to each other, so that the pattern light L(S) reflected by the low reflection region Al can be accurately captured by the high-sensitivity pixels Ph, while the pattern light L(S) reflected by the high reflection region Ah can be accurately captured by the low-sensitivity pixels Pl. As a result, even when the high reflection region Ah and the low reflection region Al are mixed on the solder B, accurate pixel values V can be acquired for both regions Ah and Al.

また、高感度画素Phと低感度画素Plとが同じ比率で含まれている。かかる構成では、高反射領域Ahで反射されたパターン光L(S)と、低反射領域Alで反射されたパターン光L(S)の一方に偏ることなく、これらを適切な画素値Vに変換することができる。その結果、高反射領域Ahと低反射領域Alとが半田Bに混在する場合であっても、両領域Ah、Alについて正確な画素値Vを取得することが可能となる。 Further, the high-sensitivity pixels Ph and the low-sensitivity pixels Pl are included in the same ratio. In this configuration, the pattern light L(S) reflected by the high reflection region Ah and the pattern light L(S) reflected by the low reflection region Al can both be converted into appropriate pixel values V without being biased toward either one. As a result, even when the high reflection region Ah and the low reflection region Al are mixed on the solder B, accurate pixel values V can be acquired for both regions Ah and Al.

また、波長λgは、緑色の波長であり、複数の画素Pxは、ベイヤー配列により配列されている。そして、複数の高感度画素Phのそれぞれは、ベイヤー配列における緑色の画素Pxであり、複数の低感度画素Plは、ベイヤー配列における青色の画素Pxと赤色の画素Pxとを同数ずつ含む。かかる構成では、緑色の高感度画素Phと赤色あるいは青色の低感度画素Plとが互いに隣接しつつ均一に配置されるため、低反射領域Alで反射されたパターン光L(S)を高感度画素Phで的確に捉えつつ、高反射領域Ahで反射されたパターン光L(S)を低感度画素Plで的確に捉えることができる。その結果、高反射領域Ahと低反射領域Alとが半田Bに混在する場合であっても、両領域Ah、Alについて正確な画素値Vを取得することが可能となる。 Further, the wavelength λg is a green wavelength, and the plurality of pixels Px are arranged in a Bayer array. Each of the plurality of high-sensitivity pixels Ph is a green pixel Px in the Bayer array, and the plurality of low-sensitivity pixels Pl include the same number of blue pixels Px and red pixels Px in the Bayer array. In this configuration, the green high-sensitivity pixels Ph and the red or blue low-sensitivity pixels Pl are uniformly arranged while being adjacent to each other, so that the pattern light L(S) reflected by the low reflection region Al can be accurately captured by the high-sensitivity pixels Ph, while the pattern light L(S) reflected by the high reflection region Ah can be accurately captured by the low-sensitivity pixels Pl. As a result, even when the high reflection region Ah and the low reflection region Al are mixed on the solder B, accurate pixel values V can be acquired for both regions Ah and Al.

なお、高反射領域Ahで反射されて高感度画素Phに入射したパターン光L(S)や、低反射領域Alで反射されて低感度画素Plに入射したパターン光L(S)を変換した画素値Vは、不適切である可能性が高い。そこで、主制御部110は、画素Pxから出力される画素値Vが適切か否かを画素値Vに基づき判定する判定処理(ステップS203)を複数の画素Pxのそれぞれについて実行する。そして、判定処理(ステップS203)での判定結果に基づき形状算出が実行される(ステップS205)。かかる構成では、不適切な画素値Vの影響を抑えつつ適切な画素値Vを用いて形状算出を実行できる(ステップS102、S205)。そのため、高反射領域Ahと低反射領域Alとが半田Bに混在する場合であっても、半田Bの三次元形状Bsを正確に算出することが可能となる。 The pixel values V obtained by converting the pattern light L(S) that is reflected by the high reflection region Ah and incident on a high-sensitivity pixel Ph, or the pattern light L(S) that is reflected by the low reflection region Al and incident on a low-sensitivity pixel Pl, are likely to be inappropriate. Therefore, the main control unit 110 executes, for each of the plurality of pixels Px, a determination process (step S203) of determining, based on the pixel value V, whether or not the pixel value V output from the pixel Px is appropriate. The shape calculation is then executed based on the determination results of the determination process (step S203) (step S205). In this configuration, the shape calculation can be executed using appropriate pixel values V while suppressing the influence of inappropriate pixel values V (steps S102, S205). Therefore, even when the high reflection region Ah and the low reflection region Al are mixed on the solder B, the three-dimensional shape Bs of the solder B can be calculated accurately.

また、主制御部110は、判定処理(ステップS203)で画素値Vが不適切と判定された高感度画素Phに対して、当該高感度画素Phから配列ピッチΔPの範囲内に位置する低感度画素Plの画素値Vによる補間を実行し、あるいは判定処理(ステップS203)で画素値Vが不適切と判定された低感度画素Plに対して、当該低感度画素Plから配列ピッチΔPの範囲内に位置する高感度画素Phの画素値Vによる補間を実行した結果に基づき形状算出(ステップS102、S205)を実行する。かかる構成では、判定処理(ステップS203)で不適切と判定された不適画素Pxの画素値Vを当該不適画素Pxから配列ピッチΔPの範囲内に位置する画素Pxの画素値Vにより補間し、その結果に基づき形状算出(ステップS102、S205)を実行できる。したがって、高反射領域Ahと低反射領域Alとが半田Bに混在する場合であっても、半田Bの三次元形状Bsを正確に算出することが可能となる。 Further, the main control unit 110 executes the shape calculation (steps S102, S205) based on the result of executing, for a high-sensitivity pixel Ph whose pixel value V is determined to be inappropriate in the determination process (step S203), interpolation using the pixel values V of the low-sensitivity pixels Pl located within the array pitch ΔP of that high-sensitivity pixel Ph, or, for a low-sensitivity pixel Pl whose pixel value V is determined to be inappropriate in the determination process (step S203), interpolation using the pixel values V of the high-sensitivity pixels Ph located within the array pitch ΔP of that low-sensitivity pixel Pl. In this configuration, the pixel value V of an unsuitable pixel Px determined to be inappropriate in the determination process (step S203) is interpolated using the pixel values V of the pixels Px located within the array pitch ΔP of that unsuitable pixel Px, and the shape calculation (steps S102, S205) can be executed based on the result. Therefore, even when the high reflection region Ah and the low reflection region Al are mixed on the solder B, the three-dimensional shape Bs of the solder B can be calculated accurately.

また、プロジェクター32は、波長λgを有して、互いに異なる位相を有する4つの投影パターンT(S)のパターン光L(S)を半田Bに照射する。そして、主制御部110は、位相シフト法によって形状算出を実行する(ステップS102、S205)。かかる構成では、高反射領域Ahと低反射領域Alとが半田Bに混在する場合であっても、半田Bの三次元形状Bsを位相シフト法によって適切に算出することが可能となる。 Further, the projector 32 irradiates the solder B with pattern light L (S) of four projection patterns T (S) having a wavelength λg and having different phases. Then, the main control unit 110 executes the shape calculation by the phase shift method (steps S102 and S205). In such a configuration, even when the high reflection region Ah and the low reflection region Al are mixed in the solder B, the three-dimensional shape Bs of the solder B can be appropriately calculated by the phase shift method.

このように本実施形態では、外観検査装置1が本発明の「三次元計測装置」の一例に相当し、プロジェクター32が本発明の「プロジェクター」の一例に相当し、撮像カメラ31が本発明の「撮像カメラ」の一例に相当し、制御装置100が本発明の「制御部」の一例に相当し、パターン光L(S)が本発明の「光」の一例に相当し、投影パターンT(S)が本発明の「縞パターン」の一例に相当し、波長λgが本発明の「所定波長」の一例に相当し、半田Bが本発明の「対象物」の一例に相当し、三次元形状Bsが本発明の「三次元形状」の一例に相当し、画素Pxが本発明の「画素」の一例に相当し、高感度画素Phが本発明の「高感度画素」の一例に相当し、低感度画素Plが本発明の「低感度画素」の一例に相当し、配列ピッチΔPが本発明の「所定範囲」の一例に相当し、分光感度特性SP(R)、SP(B)、SP(G)が本発明の「分光感度特性」の一例に相当し、画素値Vが本発明の「画素値」の一例に相当し、ステップS203が本発明の「判定処理」の一例に相当する。 As described above, in the present embodiment, the visual inspection device 1 corresponds to an example of the "three-dimensional measuring device" of the present invention, the projector 32 corresponds to an example of the "projector" of the present invention, the imaging camera 31 corresponds to an example of the "imaging camera" of the present invention, the control device 100 corresponds to an example of the "control unit" of the present invention, the pattern light L(S) corresponds to an example of the "light" of the present invention, the projection pattern T(S) corresponds to an example of the "striped pattern" of the present invention, the wavelength λg corresponds to an example of the "predetermined wavelength" of the present invention, the solder B corresponds to an example of the "object" of the present invention, the three-dimensional shape Bs corresponds to an example of the "three-dimensional shape" of the present invention, the pixel Px corresponds to an example of the "pixel" of the present invention, the high-sensitivity pixel Ph corresponds to an example of the "high-sensitivity pixel" of the present invention, the low-sensitivity pixel Pl corresponds to an example of the "low-sensitivity pixel" of the present invention, the array pitch ΔP corresponds to an example of the "predetermined range" of the present invention, the spectral sensitivity characteristics SP(R), SP(B), and SP(G) correspond to an example of the "spectral sensitivity characteristic" of the present invention, the pixel value V corresponds to an example of the "pixel value" of the present invention, and step S203 corresponds to an example of the "determination process" of the present invention.

なお、本発明は上記実施形態に限定されるものではなく、その趣旨を逸脱しない限りにおいて上述したものに対して種々の変更を加えることが可能である。例えば、複数の画素Pxをベイヤー配列に従って配列する必要は必ずしもない。例えば、青(B)の画素Pxに代えて赤(R)の画素Pxを配置しても良い。この場合、プロジェクター32から赤(R)の波長のパターン光L(S)を投影しても良い。あるいは、赤(R)の画素Pxに代えて青(B)の画素Pxを配置しても良い。この場合、プロジェクター32から青(B)の波長のパターン光L(S)を投影しても良い。 The present invention is not limited to the above embodiment, and various modifications can be made to the above as long as they do not deviate from its gist. For example, it is not always necessary to arrange the plurality of pixels Px according to a Bayer array. For example, red (R) pixels Px may be arranged instead of the blue (B) pixels Px; in this case, pattern light L(S) of a red (R) wavelength may be projected from the projector 32. Alternatively, blue (B) pixels Px may be arranged instead of the red (R) pixels Px; in this case, pattern light L(S) of a blue (B) wavelength may be projected from the projector 32.

また、高感度画素Phと低感度画素Plとの個数の比率や、配列パターン等も適宜変更できる。 Further, the ratio of the number of high-sensitivity pixels Ph to the low-sensitivity pixels Pl, the arrangement pattern, and the like can be appropriately changed.

また、上記のステップS204において補間の可否を判断する具体的手法は上記の例に限られない。つまり、不適画素を挟んでX方向に並ぶ2個の画素Pxのペアおよび不適画素を挟んでY方向に並ぶ2個の画素Pxのペアのうち、いずれか一方のペアを構成する2個の画素Pxの信頼度が閾値以上、すなわち有効であれば、補間可能と判断しても良い。かかる例では、ステップS205の補間演算を次のように行えば良い。つまり、信頼度が有効なペアが1組のみの場合には、当該ペアを構成する2個の画素Pxの画素値Vの平均値により、不適画素の画素値Vが補間される。また、信頼度が有効なペアが2組ある場合には、これら2組のペアのうち、ペアを構成する2個の画素Pxの画素値Vの差(輝度差)の絶対値が小さい方のペアを構成する2個の画素Pxの画素値Vの平均値により、不適画素の画素値Vが補間される。図8は不適画素の補間の一例を示す図である。図8に示すように、不適画素Pxnを挟む2個の画素Pxgの画素値Vgの平均値によって、不適画素Pxnの画素値Vnを補間できる(線形補間)。これらの手法によれば、不適画素が高反射領域Ahと低反射領域Alの境界に位置する場合であっても、補間された画素値Vの誤差を抑えることができる。ただし、補間演算は、ここで例示した線形補間に限られず、他の周知の補間方法を用いて実行できる点は、上述と同様である。 Further, the specific method of determining whether interpolation is possible in step S204 is not limited to the above example. That is, of the pair of two pixels Px flanking the unsuitable pixel in the X direction and the pair of two pixels Px flanking it in the Y direction, interpolation may be determined to be possible if the two pixels Px constituting at least one of the pairs have reliabilities equal to or greater than the threshold value, that is, are valid. In such an example, the interpolation calculation in step S205 may be performed as follows. When only one pair has valid reliability, the pixel value V of the unsuitable pixel is interpolated using the average of the pixel values V of the two pixels Px constituting that pair. When two pairs have valid reliability, the pixel value V of the unsuitable pixel is interpolated using the average of the pixel values V of the two pixels Px of whichever pair has the smaller absolute difference (luminance difference) between its two pixel values V. FIG. 8 is a diagram showing an example of the interpolation of an unsuitable pixel. As shown in FIG. 8, the pixel value Vn of the unsuitable pixel Pxn can be interpolated using the average of the pixel values Vg of the two pixels Pxg sandwiching the unsuitable pixel Pxn (linear interpolation). According to these methods, even when the unsuitable pixel is located at the boundary between the high reflection region Ah and the low reflection region Al, the error of the interpolated pixel value V can be suppressed. However, as noted above, the interpolation calculation is not limited to the linear interpolation illustrated here and can be executed using other well-known interpolation methods.
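The pair-based variant described above can be sketched as follows; the function name and calling convention are illustrative assumptions. Preferring the pair with the smaller luminance difference is what suppresses the error when the unsuitable pixel straddles a reflectance boundary, since the pair that lies entirely within one region varies less.

```python
def interpolate_from_pairs(x_pair, y_pair, x_valid, y_valid):
    """x_pair / y_pair: (value_a, value_b) of the two pixels flanking the
    unsuitable pixel in the X / Y direction; *_valid: True when both members
    of the pair passed the reliability threshold."""
    candidates = [p for p, ok in ((x_pair, x_valid), (y_pair, y_valid)) if ok]
    if not candidates:
        return None  # neither pair is usable: interpolation not possible
    # Prefer the pair with the smaller absolute luminance difference,
    # which is less likely to straddle a high/low-reflectance boundary.
    a, b = min(candidates, key=lambda p: abs(p[0] - p[1]))
    return (a + b) / 2.0
```

For example, with an X pair of (100, 110) and a Y pair of (10, 90), both valid, the X pair wins (difference 10 vs. 80) and the interpolated value is 105.0.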

また、上記の例では、パターン光L(S)を撮像した撮像画像I(S)の画素Pxの画素値Vに対して補間を実行している。しかしながら、4つの撮像画像I(S)の画素値V0〜V3から算出される各画素Pxの角度θに対して補間を実行しても良い。あるいは、この角度θから算出される各画素Pxの高さに対して補間を実行しても良い。こうして、閾値以上の信頼度を有する画素値Vに基づき不適画素を補間しつつ、半田Bの三次元形状を算出することができる。 Further, in the above example, interpolation is performed on the pixel value V of the pixel Px of the captured image I (S) obtained by capturing the pattern light L (S). However, interpolation may be executed for the angle θ of each pixel Px calculated from the pixel values V0 to V3 of the four captured images I (S). Alternatively, interpolation may be executed for the height of each pixel Px calculated from this angle θ. In this way, the three-dimensional shape of the solder B can be calculated while interpolating unsuitable pixels based on the pixel value V having a reliability equal to or higher than the threshold value.

また、信頼度の算出方法は、上記の例に限られない。例えば、特開2014−119442号公報あるいは特許第3996560号公報に記載の方法によって信頼度を算出しても良い。 Further, the method of calculating the reliability is not limited to the above example. For example, the reliability may be calculated by the method described in Japanese Patent Application Laid-Open No. 2014-119442 or Japanese Patent No. 3996560.

また、三次元計測の対象物は、半田Bに限られない。 Further, the object of the three-dimensional measurement is not limited to the solder B.

1…外観検査装置(三次元計測装置)
31…撮像カメラ
32…プロジェクター
100…制御装置(制御部)
B…半田(対象物)
Bs…三次元形状
L(S)…パターン光(光)
Px…画素
Ph…高感度画素
Pl…低感度画素
ΔP…配列ピッチ(所定範囲)
SP(R)、SP(B)、SP(G)…分光感度特性
T(S)…投影パターン(縞パターン)
V…画素値
λg…緑色の波長(所定波長)
S203…判定処理
1 ... Visual inspection device (three-dimensional measuring device)
31 ... Imaging camera
32 ... Projector
100 ... Control device (control unit)
B ... Solder (object)
Bs ... Three-dimensional shape
L(S) ... Pattern light (light)
Px ... Pixel
Ph ... High-sensitivity pixel
Pl ... Low-sensitivity pixel
ΔP ... Array pitch (predetermined range)
SP(R), SP(B), SP(G) ... Spectral sensitivity characteristics
T(S) ... Projection pattern (striped pattern)
V ... Pixel value
λg ... Green wavelength (predetermined wavelength)
S203 ... Determination process

Claims (8)

1. A three-dimensional measuring device comprising:
a projector that irradiates an object with light of a predetermined wavelength;
an imaging camera having a plurality of pixels on which the light reflected by the object is incident, each of the plurality of pixels outputting a pixel value according to the intensity of the incident light; and
a control unit that executes a shape calculation of the three-dimensional shape of the object based on the pixel values,
wherein the plurality of pixels include a plurality of high-sensitivity pixels and a plurality of low-sensitivity pixels having a spectral sensitivity characteristic in which the ratio of output to input at the predetermined wavelength is lower than that of the spectral sensitivity characteristic of the high-sensitivity pixels.

2. The three-dimensional measuring device according to claim 1, wherein the high-sensitivity pixels and the low-sensitivity pixels are alternately arranged.

3. The three-dimensional measuring device according to claim 2, wherein the plurality of pixels include the high-sensitivity pixels and the low-sensitivity pixels in the same ratio.

4. The three-dimensional measuring device according to claim 3, wherein:
the predetermined wavelength is a green wavelength;
the plurality of pixels are arranged in a Bayer array in which red, green, and blue pixels are arranged in a predetermined pattern;
each of the plurality of high-sensitivity pixels is a green pixel of the Bayer array; and
the plurality of low-sensitivity pixels include equal numbers of blue pixels and red pixels of the Bayer array.

5. The three-dimensional measuring device according to any one of claims 1 to 4, wherein the control unit executes, for each of the plurality of pixels, a determination process of determining based on the pixel value whether the pixel value output from the pixel is appropriate, and executes the shape calculation based on the results of the determination process.

6. The three-dimensional measuring device according to claim 5, wherein the control unit executes the shape calculation based on a result of interpolating a high-sensitivity pixel whose pixel value is determined to be inappropriate in the determination process with the pixel values of the low-sensitivity pixels located within a predetermined range of that high-sensitivity pixel, or interpolating a low-sensitivity pixel whose pixel value is determined to be inappropriate in the determination process with the pixel values of the high-sensitivity pixels located within the predetermined range of that low-sensitivity pixel.

7. The three-dimensional measuring device according to any one of claims 1 to 6, wherein the projector irradiates the object with light of a plurality of stripe patterns having the predetermined wavelength and mutually different phases, and the control unit executes the shape calculation by a phase shift method.

8. A three-dimensional measuring method comprising:
a step of irradiating an object with light of a predetermined wavelength;
a step of causing the light reflected by the object to be incident on a plurality of pixels, the plurality of pixels outputting pixel values according to the intensity of the incident light; and
a step of executing a shape calculation of the three-dimensional shape of the object based on the pixel values,
wherein the plurality of pixels include a plurality of high-sensitivity pixels and a plurality of low-sensitivity pixels having a spectral sensitivity characteristic in which the ratio of output to input at the predetermined wavelength is lower than that of the spectral sensitivity characteristic of the high-sensitivity pixels.
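As a rough illustration of the pixel arrangement in claims 1, 3, and 4, the sketch below (assuming an RGGB mosaic layout; the function name is hypothetical) separates a raw Bayer mosaic into the green sites, which act as the high-sensitivity pixels Ph under green projection light, and the red/blue sites, which act as the low-sensitivity pixels Pl. The two sets are equal in size, and the low-sensitivity set contains equal numbers of red and blue sites, matching claims 3 and 4.

```python
import numpy as np

def split_bayer_rggb(raw):
    """Split an RGGB Bayer mosaic into high- and low-sensitivity pixel maps.

    Under green illumination, the G sites respond strongly (high-sensitivity
    pixels Ph) while the R and B sites respond weakly (low-sensitivity
    pixels Pl). Sites not belonging to a set are marked NaN.
    """
    h, w = raw.shape
    mask_g = np.zeros((h, w), dtype=bool)
    mask_g[0::2, 1::2] = True  # G sites in the R rows:  R G R G ...
    mask_g[1::2, 0::2] = True  # G sites in the B rows:  G B G B ...
    high = np.where(mask_g, raw, np.nan)   # Ph: green sites (half of all pixels)
    low = np.where(~mask_g, raw, np.nan)   # Pl: red and blue sites, in equal numbers
    return high, low
```

Because G and R/B sites alternate along every row and column of the mosaic, this layout also satisfies the alternating arrangement of claim 2.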
JP2020537988A 2018-08-24 2018-08-24 3D measuring device, 3D measuring method Active JP7051260B2 (en)

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/JP2018/031331 WO2020039575A1 (en) 2018-08-24 2018-08-24 Three-dimensional measuring device and three-dimensional measuring method

Publications (2)

Publication Number Publication Date
JPWO2020039575A1 true JPWO2020039575A1 (en) 2021-09-16
JP7051260B2 JP7051260B2 (en) 2022-04-11

Family

ID=69592824

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2020537988A Active JP7051260B2 (en) 2018-08-24 2018-08-24 3D measuring device, 3D measuring method

Country Status (6)

Country Link
US (1) US11796308B2 (en)
JP (1) JP7051260B2 (en)
KR (1) KR102513710B1 (en)
CN (1) CN112567199B (en)
DE (1) DE112018007930T5 (en)
WO (1) WO2020039575A1 (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
FR3134448B1 (en) * 2022-04-11 2024-04-12 Insidix TOPOGRAPHIC MEASURING METHOD AND TOPOGRAPHIC MEASURING MACHINE
WO2024062809A1 * 2022-09-21 2024-03-28 Sony Semiconductor Solutions Corporation Optical detecting device, and optical detecting system

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003014422A (en) * 2002-05-10 2003-01-15 Matsushita Electric Ind Co Ltd Real time range finder
JP2006162386A (en) * 2004-12-06 2006-06-22 Canon Inc Three-dimensional model generation device, three-dimensional model generation system, and three-dimensional model generation program
JP2009085739A (en) * 2007-09-28 2009-04-23 Sunx Ltd Shape measuring instrument and shape measuring method
WO2016136085A1 * 2015-02-27 2016-09-01 Sony Corporation Image processing device, image processing method and image capturing element
JP2017173259A (en) * 2016-03-25 2017-09-28 キヤノン株式会社 Measurement device, system, and goods manufacturing method

Family Cites Families (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP3414624B2 (en) * 1997-09-16 2003-06-09 松下電器産業株式会社 Real-time range finder
JP4256059B2 (en) 2000-10-04 2009-04-22 シーケーディ株式会社 3D measuring device
JP3996560B2 (en) 2003-08-18 2007-10-24 株式会社リコー Object shape measuring device
JP2009031150A (en) * 2007-07-27 2009-02-12 Omron Corp Three-dimensional shape measuring device, three-dimensional shape measurement method, three-dimensional shape measurement program, and record medium
JP5765651B2 (en) * 2011-02-01 2015-08-19 Jukiオートメーションシステムズ株式会社 3D measuring device
JP6238521B2 (en) 2012-12-19 2017-11-29 キヤノン株式会社 Three-dimensional measuring apparatus and control method thereof
JP6331308B2 (en) * 2013-09-26 2018-05-30 株式会社ニコン Shape measuring apparatus, structure manufacturing system, and shape measuring computer program
JP6364777B2 (en) * 2014-01-10 2018-08-01 凸版印刷株式会社 Image data acquisition system and image data acquisition method
CA2977481C (en) * 2015-02-24 2021-02-02 The University Of Tokyo Dynamic high-speed high-sensitivity imaging device and imaging method

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2003014422A (en) * 2002-05-10 2003-01-15 Matsushita Electric Ind Co Ltd Real time range finder
JP2006162386A (en) * 2004-12-06 2006-06-22 Canon Inc Three-dimensional model generation device, three-dimensional model generation system, and three-dimensional model generation program
JP2009085739A (en) * 2007-09-28 2009-04-23 Sunx Ltd Shape measuring instrument and shape measuring method
WO2016136085A1 * 2015-02-27 2016-09-01 Sony Corporation Image processing device, image processing method and image capturing element
JP2017173259A (en) * 2016-03-25 2017-09-28 キヤノン株式会社 Measurement device, system, and goods manufacturing method

Also Published As

Publication number Publication date
US11796308B2 (en) 2023-10-24
WO2020039575A1 (en) 2020-02-27
CN112567199A (en) 2021-03-26
DE112018007930T5 (en) 2021-05-06
KR20210031967A (en) 2021-03-23
CN112567199B (en) 2022-11-08
US20210310791A1 (en) 2021-10-07
KR102513710B1 (en) 2023-03-24
JP7051260B2 (en) 2022-04-11

Similar Documents

Publication Publication Date Title
JP5162702B2 (en) Surface shape measuring device
JP6184289B2 (en) 3D image processing apparatus, 3D image processing method, 3D image processing program, computer-readable recording medium, and recorded apparatus
US8199335B2 (en) Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, three-dimensional shape measuring program, and recording medium
TWI582383B (en) Three-dimensional measuring device
JP5443303B2 (en) Appearance inspection apparatus and appearance inspection method
JP6322335B2 (en) Appearance inspection device
KR20160007361A (en) Image capturing method using projecting light source and image capturing device using the method
JP2015045587A (en) Three-dimensional image processor, method of determining change in state of three-dimensional image processor, program for determining change in state of three-dimensional image processor, computer readable recording medium, and apparatus having the program recorded therein
US9243899B2 (en) Method of measuring a height of 3-dimensional shape measurement apparatus
TWI580926B (en) Three - dimensional measuring device
JP7051260B2 (en) 3D measuring device, 3D measuring method
KR101766468B1 (en) Method for 3D shape measuring using of Triple Frequency Pattern
JP2009180689A (en) Three-dimensional shape measuring apparatus
JP2010281778A (en) Three-dimensional shape measuring device
JP2009139285A (en) Solder ball inspection device, its inspection method, and shape inspection device
JP2009210509A (en) Three-dimensional shape measuring device and three-dimensional shape measuring computer program
KR101750883B1 (en) Method for 3D Shape Measuring OF Vision Inspection System
TW202113344A (en) Appearance inspection device, appearance inspection device calibration method, and program
JP6126640B2 (en) Three-dimensional measuring apparatus and three-dimensional measuring method
WO2023170814A1 (en) Operation device for three-dimensional measurement, three-dimensional measurement program, recording medium, three-dimensional measurement device, and operation method for three-dimensional measurement
JP7074400B2 (en) Optical measuring device
TW201719112A (en) Three-dimensional measurement device
JP2022049269A (en) Three-dimensional shape measuring method and three-dimensional shape measuring device
JP6733039B2 (en) Appearance inspection device, appearance inspection method
JP2021018081A (en) Imaging apparatus, measuring device, and measuring method

Legal Events

Date Code Title Description
A521 Request for written amendment filed

Free format text: JAPANESE INTERMEDIATE CODE: A523

Effective date: 20210217

A621 Written request for application examination

Free format text: JAPANESE INTERMEDIATE CODE: A621

Effective date: 20210217

TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model)

Free format text: JAPANESE INTERMEDIATE CODE: A01

Effective date: 20220329

A61 First payment of annual fees (during grant procedure)

Free format text: JAPANESE INTERMEDIATE CODE: A61

Effective date: 20220329

R150 Certificate of patent or registration of utility model

Ref document number: 7051260

Country of ref document: JP

Free format text: JAPANESE INTERMEDIATE CODE: R150