JP2018163172A - Measurement device - Google Patents

Measurement device

Info

Publication number
JP2018163172A
JP2018163172A (application number JP2018133428A)
Authority
JP
Japan
Prior art keywords
image
wavelength
light
test object
division element
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
JP2018133428A
Other languages
Japanese (ja)
Other versions
JP6611872B2 (en)
Inventor
Akihiro Hatada
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Canon Inc
Original Assignee
Canon Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Canon Inc filed Critical Canon Inc
Priority to JP2018133428A priority Critical patent/JP6611872B2/en
Publication of JP2018163172A publication Critical patent/JP2018163172A/en
Application granted granted Critical
Publication of JP6611872B2 publication Critical patent/JP6611872B2/en
Expired - Fee Related legal-status Critical Current
Anticipated expiration legal-status Critical


Abstract

PROBLEM TO BE SOLVED: To provide a measurement device that mitigates the deterioration in accuracy caused by a wavelength division element.

SOLUTION: A measurement device for measuring the shape of an object to be inspected comprises: a first illumination part that illuminates the object with pattern light of a first wavelength; a second illumination part that illuminates the object with light of a second wavelength different from the first wavelength; an imaging optical system including a wavelength division element; a first imaging element that forms a first image; a second imaging element that forms a second image; and a processing part that acquires information on the shape of the object by processing the data of the first image and the data of the second image. The first imaging element receives the light component of the first wavelength transmitted through the wavelength division element without being reflected by it even once, and forms the first image from that component. The second imaging element receives the light component of the second wavelength reflected at least once by the wavelength division element, and forms the second image from that component.

SELECTED DRAWING: Figure 1

Description

The present invention relates to a measurement apparatus that measures the shape of a test object.

With recent advances in robotics, robots have come to handle complex processes such as the assembly of industrial products. To realize robotic assembly, it is necessary to measure the relative position and orientation between an end effector, such as a robot hand, and the part to be gripped. Patent Literature 1 describes a method for measuring the position and orientation of a target part: both edge information obtained from a grayscale image and a range image obtained by a three-dimensional measurement method such as pattern projection are fitted to a model such as CAD data, and the position and orientation of the part are measured from the fit.

To obtain a grayscale image and a range image from a single image sensor, the two images must either be captured at different times or be separated from a single captured image. To keep pace with faster assembly processes, the position and orientation of the target part must be measured while the robot is moving. If the images are captured at different times, they therefore become misaligned with each other. If instead the two images are separated from a single captured image, the pattern light used for the range image is superimposed on the edges of the target part, causing false edge detections in the grayscale image and degrading the accuracy of the position-and-orientation measurement.

Patent Literature 2 therefore illuminates the test object simultaneously with grayscale-image illumination and range-image pattern illumination of different wavelengths. In that method, light reflected by the test object is collected through an imaging lens, split into the two wavelengths by a wavelength division element such as a dichroic prism, and captured by two sensors, yielding a grayscale image and a range image of the same instant.

Patent Literature 1: Japanese Patent No. 5393318
Patent Literature 2: Japanese Patent No. 5122729
Patent Literature 3: Japanese Patent No. 2517062

In the method disclosed in Patent Literature 2, the light beam is split by the reflecting surface of a dichroic prism to obtain the range image and the grayscale image. When a reflecting surface is used in this way, deformation or displacement of the dichroic prism caused by vibration, changes in the attitude of the device, or changes in ambient temperature displaces the reflected ray more than the transmitted ray. In particular, when the range image is obtained by the triangulation principle, as in pattern projection, the measurement error associated with the ray displacement is amplified depending on the configuration of the convergence angle.

An object of the present invention is therefore to provide a measurement apparatus that mitigates the loss of accuracy caused by the wavelength division element.

The present invention provides a measurement apparatus comprising: a first illumination unit that illuminates a test object with pattern light of a first wavelength; a second illumination unit that illuminates the test object with light of a second wavelength different from the first wavelength; an imaging optical system including a wavelength division element that splits the light from the test object, illuminated simultaneously by the first illumination unit and the second illumination unit, into a light component of the first wavelength and a light component of the second wavelength; a first image sensor that forms a first image; a second image sensor that forms a second image; and a processing unit that acquires information on the shape of the test object by processing the data of the first image and the data of the second image. The first image sensor receives the light component of the first wavelength that has passed through the wavelength division element without being reflected by it even once and forms the first image from that component; the second image sensor receives the light component of the second wavelength reflected at least once by the wavelength division element and forms the second image from that component.

According to the present invention, it is possible to provide a measurement apparatus that mitigates the loss of accuracy caused by the wavelength division element.

FIG. 1 shows the measurement apparatus of the first embodiment.
FIG. 2 shows the sensitivity of ray displacement to the position and orientation of the dichroic prism.
FIG. 3 shows the ratio of measurement errors between the range image and the grayscale image as a function of convergence angle.
FIG. 4 shows the dot pattern light used in the first embodiment.
FIG. 5 shows the measurement apparatus of the second embodiment.
FIG. 6 shows a modification of the imaging unit of the second embodiment.
FIG. 7 shows the sensitivity of ray displacement to the position and orientation of the dichroic prism.

[First Embodiment]
A measurement apparatus according to a first embodiment, which measures the shape of a test object, is described with reference to FIG. 1. The apparatus includes a first illumination unit 1 for a range image (first image) representing the three-dimensional shape of the test object, a second illumination unit 2 for a grayscale image (second image) representing the contour of the test object, an imaging unit 3, and a processing unit 4. The first illumination unit 1 illuminates the test object 5 with pattern light (first light) of a first wavelength. At the same time, the second illumination unit 2 illuminates the test object 5 with second light of a second wavelength different from the first wavelength.

First, measurement of the range image is described. The range image is acquired by a pattern projection method based on the principle of triangulation, and the three-dimensional shape of the test object 5 is generated as point cloud data. In this embodiment, the first illumination unit 1 for the range image includes a light source 6, a pattern light generation unit 7, and a projection lens 8. The pattern light generation unit 7 generates, from the first light of the first wavelength emitted by the light source 6, the pattern light used to acquire the range image. The projection lens 8 magnifies the pattern light and illuminates the test object 5 with it. Light reflected and scattered by the test object 5 is collected by the imaging lens 9 in the imaging unit 3 and received, via the dichroic prism 10 serving as the wavelength division element, by the first image sensor 11, which forms a first image of the test object 5 from the pattern light component (the light component of the first wavelength). The first image sensor 11 is, for example, a CCD or CMOS camera.

The processing unit 4 detects the peak coordinates of the pattern light from the data of the first image formed by the first image sensor 11 and calculates the range image based on the principle of triangulation. Since this embodiment assumes that the test object 5 is measured while it is moving, the distance is calculated from the data of a single first image of the pattern light. For example, as described in Patent Literature 3, the pattern light generation unit 7 generates dot pattern light encoded with dots; an example is shown in FIG. 4. The processing unit 4 detects the positions of the dots in the image obtained by illuminating the test object 5 with the dot pattern light and, based on their positional relationship, associates the projected pattern with the captured image, thereby acquiring a range image from a single image.
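The triangulation step can be sketched as follows, assuming a simple pinhole camera and projector geometry; every name and parameter value here (focal length, pixel pitch, projector angle) is illustrative and not the patent's implementation:

```python
import math

# Sketch: a projector dot cast at a known angle is observed by the camera
# at a pixel, and depth follows from the baseline geometry. The camera
# sits at the origin looking along +Z; the projector sits at (baseline, 0)
# and casts the dot at proj_angle back toward the camera side.

def depth_from_dot(x_px, f_mm, px_mm, baseline_mm, proj_angle_deg):
    """Intersect the camera ray through pixel x with the projector ray."""
    theta_c = math.atan(x_px * px_mm / f_mm)   # camera ray angle from Z axis
    theta_p = math.radians(proj_angle_deg)     # projector ray angle from Z axis
    # camera ray: X = Z*tan(theta_c); projector ray: X = b - Z*tan(theta_p)
    return baseline_mm / (math.tan(theta_c) + math.tan(theta_p))

# example: baseline 50 mm and working distance ~200 mm, as in the text
z = depth_from_dot(x_px=0, f_mm=16, px_mm=0.005, baseline_mm=50,
                   proj_angle_deg=math.degrees(math.atan(50 / 200)))
```

A dot detected on the optical axis (x_px = 0) then lies at Z = baseline / tan(projector angle), and any peak-coordinate shift on the sensor translates into a depth error, which is exactly the sensitivity analyzed below.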

Next, measurement of the grayscale image is described. The grayscale image is a monochrome grayscale image (second image) formed by the second image sensor 12 from the light component of the second wavelength. The processing unit 4 detects edges corresponding to the contours and ridge lines of the test object 5 from the grayscale image data and uses those edges as image features for measuring the position and orientation of the test object 5. In this embodiment, for grayscale measurement, the second illumination unit 2, such as a ring light, uniformly illuminates the test object 5 with the second light of the second wavelength. Light reflected and scattered by the test object 5 is imaged onto the second image sensor 12 by the imaging lens 9 via the dichroic prism 10 and captured. The processing unit 4 detects edges corresponding to the contours and ridge lines of the test object 5 from the captured second image data. Various edge-detection methods exist, such as the Canny method, and any of them may be used in this embodiment. The processing unit 4 calculates the position and orientation of the test object 5 by model fitting using CAD data of the test object 5 input in advance, the range image, and the grayscale image. In this embodiment, the imaging lens 9 and the dichroic prism 10, which is the wavelength division element, constitute the imaging optical system.
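The edge-detection step can be sketched with a simple gradient threshold. This is a simplified stand-in for the Canny method mentioned in the text, not the patent's processing; the Sobel kernels, threshold, and synthetic image are all illustrative assumptions:

```python
# Minimal grayscale edge detection: mark pixels whose Sobel gradient
# magnitude exceeds a threshold (a crude substitute for Canny).

def sobel_edges(img, threshold=100):
    """Return a binary edge map: True where gradient magnitude > threshold."""
    h, w = len(img), len(img[0])
    edges = [[False] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            # horizontal and vertical Sobel responses
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            edges[y][x] = (gx*gx + gy*gy) ** 0.5 > threshold

    return edges

# synthetic test object: bright square silhouette on a dark background
img = [[200 if 3 <= x < 7 and 3 <= y < 7 else 20 for x in range(10)]
       for y in range(10)]
edges = sobel_edges(img)
```

The edge map marks the square's boundary and leaves its uniform interior empty, which is the contour information the model fitting consumes.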

The point of the first embodiment is to use, as the wavelength division element, a dichroic prism 10 that separates the first light and the second light of different wavelengths into transmitted and reflected light, and thereby to suppress the measurement error caused by changes in the position, orientation, or shape of the prism 10. The dichroic prism 10 may deform or shift in position and orientation due to vibration, changes in the attitude of the measurement apparatus, changes in ambient temperature, and the like. In that case, the displacement of the reflected ray on the surface of the second image sensor 12 is larger than the displacement of the transmitted ray on the surface of the first image sensor 11. As an example, FIG. 2 shows the displacements of the transmitted and reflected rays on the surfaces of the first image sensor 11 and the second image sensor 12 caused by changes in the position and orientation of a dichroic prism 10 measuring 17 mm on a side. For the transmitted ray, a positional shift of the dichroic prism 10 causes no ray displacement; displacement occurs only when the prism rotates about an axis perpendicular to the optical axis of the imaging optical system. For the reflected ray, in contrast, ray displacement also arises from changes in the position and orientation of the reflecting surface of the dichroic prism 10, and the displacement of the reflected ray is several times larger than that of the transmitted ray.
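The asymmetry between the two rays can be checked with first-order optics (this is an illustrative calculation, not the data of FIG. 2): tilting the dividing surface by a small angle rotates the reflected ray by twice that angle, whereas the transmitted ray is only shifted laterally by refraction through the tilted glass. The plate thickness, refractive index, and sensor distance below are assumed values:

```python
import math

def reflected_shift(tilt_rad, sensor_dist_mm):
    """Spot shift on the sensor for a reflected ray: the rotation doubles."""
    return sensor_dist_mm * math.tan(2 * tilt_rad)

def transmitted_shift(tilt_rad, thickness_mm, n=1.5):
    """Lateral shift of a ray refracted through a tilted parallel slab."""
    beta = math.asin(math.sin(tilt_rad) / n)   # Snell's law inside the glass
    return thickness_mm * math.sin(tilt_rad - beta) / math.cos(beta)

tilt = math.radians(0.05)            # 0.05 degree tilt of the surface
r = reflected_shift(tilt, 30.0)      # sensor assumed 30 mm behind the prism
t = transmitted_shift(tilt, 17.0)    # 17 mm glass path, as in the text
```

With these assumptions the reflected-ray shift comes out several times the transmitted-ray shift, matching the trend described above.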

The range image in this embodiment is acquired by a pattern projection method based on the principle of triangulation. In this method, the measurement error ΔZ caused by a shift in the detected peak coordinates of the pattern light intensity is expressed by Equation (1), where δXc is the displacement [pixel] of the peak coordinate of the pattern light intensity in the coordinate system on the first image sensor surface, and θ is the convergence angle, i.e., the angle between the optical axis of the first illumination unit 1 for the range image and the optical axis of the imaging optical system.

ΔZ = δXc × (pixel size of the first image sensor on the object plane) / tan θ ... (1)

On the other hand, the measurement error ΔP(x, y) of the edge position coordinates detected in the grayscale image is given by Equation (2), where δPc(xc, yc) is the displacement [pixel] of the detected edge position in the coordinate system on the second image sensor surface.

ΔP(x, y) = δPc(xc, yc) × (pixel size of the second image sensor on the object plane) ... (2)

As Equation (2) shows, the edge-position error is simply the displacement in the sensor coordinate system multiplied by the pixel size of the image sensor on the object plane. FIG. 3 plots, using Equations (1) and (2), how the ratio of the error sensitivities of the range image and the grayscale image to ray displacement on the first and second image sensor surfaces varies with the convergence angle θ. For example, suppose the baseline length, i.e., the distance between the principal points of the first illumination unit 1 and the imaging unit 3, is 50 mm and the working distance to the test object 5 is 200 mm. In this case, the convergence angle is about 11°, and the error produced by the same amount of ray displacement is about four times larger for the range image.
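Equations (1) and (2) can be evaluated numerically. For equal pixel-level displacements (δXc = δPc) and equal object-plane pixel sizes, their ratio reduces to 1/tan θ, so small convergence angles amplify the range-image error. The function names and the 0.1 mm object-plane pixel size are assumptions for illustration:

```python
import math

def range_error(dxc_pixels, pixel_size_mm, theta_deg):
    """Eq. (1): depth error dZ from a pattern-peak shift of dxc pixels."""
    return dxc_pixels * pixel_size_mm / math.tan(math.radians(theta_deg))

def edge_error(dpc_pixels, pixel_size_mm):
    """Eq. (2): in-plane edge-position error dP."""
    return dpc_pixels * pixel_size_mm

# sensitivity ratio for a 1-pixel displacement: 1/tan(theta)
ratio_11deg = range_error(1, 0.1, 11) / edge_error(1, 0.1)
ratio_45deg = range_error(1, 0.1, 45) / edge_error(1, 0.1)
```

At θ = 45° the two errors are equal (ratio 1), while at the small convergence angles discussed in the text the same ray displacement produces a range-image error several times the grayscale-image error.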

In range-image measurement by pattern projection, the measurement range must be secured by stopping down the NA of the projection lens 8 to gain depth of field. To measure with high accuracy under such conditions, a high-luminance LED can be used as the light source 6. LED light sources are available in the three primary colors, red, green, and blue, and the highest-luminance wavelength is blue (450 to 470 nm). Therefore, a blue LED is used as the light source for the range image, and the dichroic prism 10 is given the optical property of transmitting light of the blue wavelength. This secures accuracy not only against changes in the position and orientation of the prism 10 but also in situations where a sufficient exposure time cannot be obtained, such as measurement of the moving test object 5.

FIG. 7 shows the relationship between the convergence angle and the sensitivity of the measurement error to the position and orientation of the prism 10. The solid line shows the case where the range image is measured with light transmitted through the prism 10 and the grayscale image with light reflected by it; the dotted line shows the opposite assignment. In this embodiment, under small convergence angles, the range image is more sensitive to ray displacement than the grayscale image. Accuracy is nevertheless secured by acquiring the range image from the transmission direction of the dichroic prism 10, where the ray displacement is small. Conversely, when the grayscale image is acquired from the reflection direction of the dichroic prism 10, the ray displacement in that direction is large; however, under small convergence angles the grayscale image's error sensitivity to ray displacement is low, so the grayscale measurement accuracy can still be secured.

In this embodiment, the convergence angle is set smaller than 39° based on FIG. 7; the first image sensor 11 receives the first light transmitted through the dichroic prism 10, and the second image sensor 12 receives the second light reflected by the dichroic prism 10. This makes it possible to secure comparable measurement accuracy for both the range image and the grayscale image, and thus the accuracy of the measurement apparatus as a whole. In this embodiment, the wavelength division element splits the incident light into transmitted light and reflected light. Some wavelength division elements, however, split incident light into light transmitted without being reflected even once, light reflected without being transmitted, and light both transmitted and reflected. In such a case, the first image sensor 11 receives the light transmitted without being reflected even once by the wavelength division element 10, and the second image sensor 12 receives the light reflected at least once by the wavelength division element.

[Second Embodiment]
A measurement apparatus according to a second embodiment is described with reference to FIG. 5. The apparatus includes a first illumination unit 1 for the range image, a second illumination unit 2 for the grayscale image, an imaging unit 3, and a processing unit 4. Descriptions of parts shared with the first embodiment are omitted.

The measurement apparatus of the second embodiment includes a parallel-plate dichroic mirror 13 as the wavelength division element. Compared with the prism-shaped wavelength division element of the first embodiment, the parallel-plate element is itself less rigid, so the displacement of the reflected ray becomes larger. With the parallel-plate dichroic mirror 13, however, the path length of the transmitted ray inside the medium is shorter, so the displacement of the transmitted ray becomes smaller. The second embodiment is therefore effective particularly under conditions where the convergence angle is small and the range-image error is highly sensitive to ray displacement. The parallel-plate type also has the advantage of being less expensive than the prism type.

It was explained above that ray displacement on the image sensor surface affects the measurement accuracy. In practice, however, it must also be assumed that the first image sensor 11 itself may shift relative to the rays even when the rays do not move. As a countermeasure, as shown in FIG. 6, a holding unit 14 is used that holds the first image sensor 11 fixed with respect to the imaging optical system. Compared with holding the first image sensor 11 and the imaging lens 9 separately in the housing 15, this increases the rigidity between the imaging lens 9 and the first image sensor 11, so the relative displacement of the first image sensor 11 is reduced.

Although embodiments of the present invention have been described above, the present invention is not limited to these embodiments, and various modifications and changes are possible within the scope of the gist of the invention.

1: first illumination unit. 2: second illumination unit. 3: imaging unit. 4: processing unit. 9: imaging lens. 10: dichroic prism (wavelength division element). 11: first image sensor. 12: second image sensor. 13: dichroic mirror.

One aspect of the present invention relates to a measurement apparatus comprising: a first illumination unit that illuminates an object with pattern light of a first wavelength; a second illumination unit that illuminates the object with light of a second wavelength different from the first wavelength; an optical system having a wavelength separation element that transmits the light of the first wavelength and reflects the light of the second wavelength out of the light from the object illuminated by the first illumination unit and the second illumination unit, the optical system forming a first image on a first image sensor with the first-wavelength light that has passed through the wavelength separation element without being reflected by it even once, and forming a second image on a second image sensor with the second-wavelength light reflected by the wavelength separation element; and a processing unit that acquires information indicating the three-dimensional shape of the object by processing the information of the first image formed on the first image sensor, and acquires information indicating the edges of the object by processing the information of the second image formed on the second image sensor.

Claims (9)

第1波長のパターン光で被検物を照明する第1照明部と、前記第1波長とは異なる第2波長の光で被検物を照明する第2照明部と、前記第1照明部および前記第2照明部により同時に照明された前記被検物からの光を前記第1波長の光成分と前記第2波長の光成分とに分割する波長分割素子を含む撮像光学系と、第1画像を形成する第1撮像素子と、第2画像を形成する第2撮像素子と、前記第1画像のデータおよび前記第2画像のデータを処理することにより前記被検物の形状の情報を取得する処理部と、を備える計測装置であって、
前記第1撮像素子は、前記波長分割素子で1回も反射されることなく前記波長分割素子を透過した前記第1波長の光成分を受光して前記第1波長の光成分の前記第1画像を形成し、前記第2撮像素子は、前記波長分割素子で少なくとも1回反射された前記第2波長の光成分を受光して前記第2波長の光成分の前記第2画像を形成することを特徴とする計測装置。
A first illumination unit that illuminates the test object with pattern light of the first wavelength, a second illumination unit that illuminates the test object with light of a second wavelength different from the first wavelength, the first illumination unit, and An imaging optical system including a wavelength division element that divides light from the object illuminated simultaneously by the second illumination unit into a light component of the first wavelength and a light component of the second wavelength; and a first image Information on the shape of the test object is obtained by processing the data of the first image and the data of the second image, the first image sensor forming the second image, the second image sensor forming the second image, A measuring device comprising a processing unit,
The first image sensor receives the light component of the first wavelength that has passed through the wavelength divider without being reflected by the wavelength divider once, and the first image of the light component of the first wavelength. The second imaging element receives the light component of the second wavelength reflected at least once by the wavelength division element and forms the second image of the light component of the second wavelength. A characteristic measuring device.
2. The measurement apparatus according to claim 1, wherein the angle formed by the optical axis of the first illumination unit and the optical axis of the imaging optical system is smaller than an angle determined by the sensitivity of the measurement error caused by the wavelength division element.
3. The measurement apparatus according to claim 1 or 2, wherein the first image is an image showing the three-dimensional shape of the test object, and the second image is an image showing the contour of the test object.
4. The measurement apparatus according to any one of claims 1 to 3, wherein the wavelength division element is a dichroic prism.
5. The measurement apparatus according to any one of claims 1 to 3, wherein the wavelength division element is a parallel-plate dichroic mirror.
6. The measurement apparatus according to any one of claims 1 to 5, wherein the first illumination unit includes a blue LED and the wavelength division element transmits light with a wavelength of 450 to 470 nm.
7. The measurement apparatus according to any one of claims 1 to 6, further comprising a holding unit that holds the first image sensor and the second image sensor fixed with respect to the imaging optical system.
8. The measurement apparatus according to any one of claims 1 to 7, wherein the pattern light has a dot pattern.
9. The measurement apparatus according to any one of claims 1 to 8, wherein the processing unit obtains information indicating the position and orientation of the test object by model-fitting the first image and the second image to CAD data of the test object.
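The model fitting of claim 9 can be illustrated by the rigid-alignment step at its core. The sketch below is not the patent's algorithm: it assumes known 2D point correspondences between CAD model points and observed image points, and recovers rotation and translation with the standard Kabsch/Procrustes least-squares solution.

```python
import numpy as np

def fit_pose(model_pts, observed_pts):
    """Estimate rotation R and translation t mapping model points onto
    observed points (known correspondences) by least squares -- the
    classic Kabsch/Procrustes step used inside model fitting."""
    mc = model_pts.mean(axis=0)
    oc = observed_pts.mean(axis=0)
    # Cross-covariance of the centered point sets.
    H = (model_pts - mc).T @ (observed_pts - oc)
    U, _, Vt = np.linalg.svd(H)
    # Guard against a reflection solution (det = -1).
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0] * (H.shape[0] - 1) + [d])
    R = Vt.T @ D @ U.T
    t = oc - R @ mc
    return R, t

# Hypothetical example: a square contour rotated 30 degrees and shifted.
theta = np.deg2rad(30.0)
R_true = np.array([[np.cos(theta), -np.sin(theta)],
                   [np.sin(theta),  np.cos(theta)]])
model = np.array([[0.0, 0.0], [1.0, 0.0], [1.0, 1.0], [0.0, 1.0]])
obs = model @ R_true.T + np.array([2.0, -1.0])
R, t = fit_pose(model, obs)
```

In the device, the three-dimensional shape points from the first image and the contour points from the second image would both enter such a fit, which is why registration errors between the two channels degrade the recovered pose.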
JP2018133428A 2018-07-13 2018-07-13 Measuring device Expired - Fee Related JP6611872B2 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018133428A JP6611872B2 (en) 2018-07-13 2018-07-13 Measuring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2018133428A JP6611872B2 (en) 2018-07-13 2018-07-13 Measuring device

Related Parent Applications (1)

Application Number Title Priority Date Filing Date
JP2015174016A Division JP6371742B2 (en) 2015-09-03 2015-09-03 Measuring device and acquisition method

Publications (2)

Publication Number Publication Date
JP2018163172A true JP2018163172A (en) 2018-10-18
JP6611872B2 JP6611872B2 (en) 2019-11-27

Family

ID=63859985

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2018133428A Expired - Fee Related JP6611872B2 (en) 2018-07-13 2018-07-13 Measuring device

Country Status (1)

Country Link
JP (1) JP6611872B2 (en)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH085348A (en) * 1994-06-20 1996-01-12 Matsushita Electric Ind Co Ltd Three-dimensional shape inspection method
JPH1194520A (en) * 1997-09-16 1999-04-09 Matsushita Electric Ind Co Ltd Real time range finder
JPWO2010137637A1 (en) * 2009-05-27 2012-11-15 株式会社ニコン Shape measuring device, shape measuring method, and manufacturing method
JP5122729B2 (en) * 2005-04-26 2013-01-16 照明 與語 3D shape measurement method
JP5393318B2 (en) * 2009-07-28 2014-01-22 キヤノン株式会社 Position and orientation measurement method and apparatus
US20150330775A1 * 2012-12-12 2015-11-19 The University Of Birmingham Simultaneous multiple view surface geometry acquisition using structured light and mirrors
JP2015206749A (en) * 2014-04-23 2015-11-19 株式会社ニコン Coupling method of three-dimensional data, shape measurement method, coupling device of three-dimensional data, shape measurement device, structure manufacturing method, structure manufacturing system and shape measurement program


Also Published As

Publication number Publication date
JP6611872B2 (en) 2019-11-27

Similar Documents

Publication Publication Date Title
JP6478725B2 (en) Measuring device and robot
EP3531066B1 (en) Three-dimensional scanning method including a plurality of lasers with different wavelengths, and scanner
US10782126B2 (en) Three-dimensional scanning method containing multiple lasers with different wavelengths and scanner
US10068348B2 (en) Method and apparatus for indentifying structural elements of a projected structural pattern in camera images
US20160267668A1 (en) Measurement apparatus
US10713810B2 (en) Information processing apparatus, method of controlling information processing apparatus, and storage medium
CN105698707B (en) A kind of grating three-dimensional shape measuring apparatus
JP2017020873A (en) Measurement device for measuring shape of measurement object
JP2007093412A (en) Three-dimensional shape measuring device
US6730926B2 (en) Sensing head and apparatus for determining the position and orientation of a target object
US20180284032A1 (en) Method of testing an object and apparatus for performing the same
JP5874252B2 (en) Method and apparatus for measuring relative position with object
US20180098053A1 (en) Imaging device, endoscope apparatus, and imaging method
US20170309035A1 (en) Measurement apparatus, measurement method, and article manufacturing method and system
US20170016716A1 (en) Method and apparatus for recording images in the aligning of vehicles having a color-selective beam splitter
JP6371742B2 (en) Measuring device and acquisition method
JP2016161351A (en) Measurement apparatus
JP6611872B2 (en) Measuring device
JP2018116032A (en) Measurement device for measuring shape of target measurement object
JP6508763B2 (en) Surface inspection device
JP2017173259A (en) Measurement device, system, and goods manufacturing method
JP2017049179A5 (en) Measuring device and acquisition method
JP2008164338A (en) Position sensor
JP6780533B2 (en) Shape measurement system and shape measurement method
JP2005221495A (en) Identifying method and system for position of stage using target images

Legal Events

A521 Request for written amendment filed (JAPANESE INTERMEDIATE CODE: A523; effective date: 2018-09-03)
A621 Written request for application examination (JAPANESE INTERMEDIATE CODE: A621; effective date: 2018-09-03)
A131 Notification of reasons for refusal (JAPANESE INTERMEDIATE CODE: A131; effective date: 2019-06-18)
A977 Report on retrieval (JAPANESE INTERMEDIATE CODE: A971007; effective date: 2019-06-19)
A521 Request for written amendment filed (JAPANESE INTERMEDIATE CODE: A523; effective date: 2019-08-09)
TRDD Decision of grant or rejection written
A01 Written decision to grant a patent or to grant a registration (utility model) (JAPANESE INTERMEDIATE CODE: A01; effective date: 2019-09-30)
A61 First payment of annual fees (during grant procedure) (JAPANESE INTERMEDIATE CODE: A61; effective date: 2019-10-29)
R151 Written notification of patent or utility model registration (ref document number: 6611872; country of ref document: JP; JAPANESE INTERMEDIATE CODE: R151)
LAPS Cancellation because of no payment of annual fees