JP2020041976A - Three-dimensional shape measuring device - Google Patents

Three-dimensional shape measuring device

Info

Publication number
JP2020041976A
Authority
JP
Japan
Prior art keywords
dimensional shape
shape
focus
inspection object
unit
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
JP2018171587A
Other languages
Japanese (ja)
Inventor
Tomohiko Tsuruta (鶴田 知彦)
Yasuyuki Takai (高井 康行)
Satoshi Kashiwabara (柏原 悟史)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Toyota Motor Corp
Soken Inc
Original Assignee
Toyota Motor Corp
Soken Inc
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Toyota Motor Corp, Soken Inc filed Critical Toyota Motor Corp
Priority to JP2018171587A priority Critical patent/JP2020041976A/en
Publication of JP2020041976A publication Critical patent/JP2020041976A/en
Pending legal-status Critical Current


Abstract

To provide a three-dimensional shape measuring device capable of suitably specifying the three-dimensional shape of the surface of an inspection object even when that shape is unknown.

SOLUTION: A three-dimensional shape measuring device 10 measures the three-dimensional shape of an inspection object. It comprises: an imaging unit 11 including a camera, whose distance to its in-focus position is fixed, for imaging the inspection object, the imaging distance from the camera to the inspection object being variable; an imaging control unit 12 that causes the imaging unit to image the inspection object at a plurality of imaging distances; an in-focus position calculation unit 13 that acquires each image captured by the imaging unit and calculates in-focus positions on the basis of the degree of focus in each portion of each image; a shape specification unit 14 that calculates an in-focus distribution, that is, the set of in-focus positions, on the basis of the positions calculated by the in-focus position calculation unit, and specifies the three-dimensional shape of the inspection object on the basis of the calculated distribution; and a shape extraction unit 15 that calculates a feature at each position on the specified three-dimensional shape and, on the basis of the calculated features, extracts a prescribed type of shape constituting the three-dimensional shape.

SELECTED DRAWING: Figure 1

Description

The present invention relates to three-dimensional shape measurement for measuring the three-dimensional shape of the surface of an object.

Inspecting whether fine foreign matter has adhered to the surface of a product is important for quality assurance. For this purpose, the fine three-dimensional shape of the surface is measured, for example, on the basis of the degree of focus of images of the product. Patent Document 1 discloses capturing a plurality of images of a bonded wire, as the three-dimensional shape, while varying the in-focus position, generating an in-focus position distribution in a cross section along the imaging direction on the basis of the degree of focus in each captured image, and extracting the three-dimensional shape of the wire.

Patent Document 1: JP-A-2017-92187 (Japanese Unexamined Patent Application Publication No. 2017-92187)

In the method disclosed in Patent Document 1, the position, direction, and thickness of the wire are known, so the shape of the wire is easy to identify from the in-focus position distribution; for an unknown shape such as a foreign substance, however, identification can be difficult.

The present invention has been made in view of the above problem, and its object is to provide a three-dimensional shape measuring device that can suitably specify the three-dimensional shape of the surface of an inspection object even when that shape is unknown.

To solve the above problem, one aspect of the present invention is a three-dimensional shape measuring device that measures the three-dimensional shape of an inspection object, comprising: an imaging unit that includes a camera for imaging the inspection object, the distance from the camera to its in-focus position being fixed, and in which the imaging distance from the camera to the inspection object is variable; an imaging control unit that causes the imaging unit to image the inspection object at a plurality of imaging distances; an in-focus position calculation unit that acquires each image captured by the imaging unit and calculates in-focus positions on the basis of the degree of focus in each portion of each image; a shape specification unit that calculates an in-focus distribution, which is the set of in-focus positions, on the basis of the in-focus positions calculated by the in-focus position calculation unit, and specifies the three-dimensional shape of the inspection object on the basis of the calculated in-focus distribution; and a shape extraction unit that calculates a feature at each position on the three-dimensional shape specified by the shape specification unit and extracts, on the basis of the calculated features, a prescribed type of shape constituting the three-dimensional shape.

According to the present invention, it is possible to provide a three-dimensional shape measuring device that can suitably specify the three-dimensional shape of the surface of an inspection object even when that shape is unknown.

FIG. 1 is a functional block diagram of a three-dimensional shape measuring device according to an embodiment of the present invention.
FIG. 2 is a schematic side view of an imaging unit according to the embodiment.
FIG. 3 is a flowchart showing processing according to the embodiment.
FIGS. 4 to 6 are diagrams each showing a captured image and its degree of focus according to the embodiment.
FIG. 7 is a diagram showing captured images and their in-focus positions according to the embodiment.
FIG. 8 is a diagram showing in-focus positions and an in-focus distribution according to the embodiment.
FIG. 9 is a diagram illustrating feature extraction from a three-dimensional shape according to the embodiment.
FIG. 10 is a diagram showing an in-focus distribution and an example in which the shape type of each part of the three-dimensional shape is color-coded.

(Embodiment)
The three-dimensional shape measuring device according to the present invention calculates an in-focus distribution of an inspection object on the basis of images of the surface of the inspection object, specifies the three-dimensional shape of the surface from the in-focus distribution, and further extracts features from the three-dimensional shape to identify the type of shape of each part of the three-dimensional shape.

Hereinafter, an embodiment of the present invention will be described in detail with reference to the drawings.

<Structure>
FIG. 1 shows a functional block diagram of the three-dimensional shape measuring device 10 according to the present embodiment. The three-dimensional shape measuring device 10 includes an imaging unit 11, an imaging control unit 12, an in-focus position calculation unit 13, a shape specification unit 14, and a shape extraction unit 15. FIG. 2 schematically shows an example of a side view of the imaging unit 11. The imaging unit 11 includes a camera 111, a camera support 112 that supports the camera 111, and a stage 113 on which an inspection object 900 is placed. The distance from the camera 111 to its in-focus position is fixed. The camera support 112 supports the camera 111 at a variable height, so that the imaging distance, that is, the distance between the camera 111 and the inspection object 900, can be varied. The inspection object 900 is, for example, a metal part and may carry a foreign body such as a fine protrusion (whisker) 901 sticking out from its surface.

The imaging control unit 12 controls the imaging unit 11 and can cause the camera 111 to image the inspection object 900 at a plurality of imaging distances. Each captured image is stored in a storage unit (not shown) together with the imaging distance at which it was captured. The in-focus position calculation unit 13 acquires each image from the storage unit, calculates the degree of focus of each portion of the image, and calculates in-focus positions on the basis of those degrees of focus. The shape specification unit 14 calculates, from the in-focus positions, an in-focus distribution, which is the set of in-focus positions, and specifies the three-dimensional shape of the inspection object 900 on the basis of the in-focus distribution. The shape extraction unit 15 extracts features from the specified three-dimensional shape and extracts a prescribed type of shape constituting the three-dimensional shape.
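Although the patent gives no numerical values, the geometry implied by this arrangement is simple enough to sketch: because the lens-to-focus distance is fixed, changing the camera height merely shifts which horizontal plane above the stage is in focus. The constants and names in the following Python sketch are illustrative assumptions only.

```python
# Minimal sketch of the fixed-focus / variable-height geometry described above.
# D_FOCUS_MM and the 10 um step are assumed values, not taken from the patent.
D_FOCUS_MM = 50.0  # assumed fixed lens-to-focus distance of the camera

def probed_height(camera_height_mm: float) -> float:
    """Height above the stage that is in focus for a given camera height."""
    return camera_height_mm - D_FOCUS_MM

# Sweeping the camera from 50.0 mm to 50.5 mm in 10 um steps scans the
# in-focus plane from the stage surface up to 0.5 mm above it.
heights = [50.0 + 0.01 * i for i in range(51)]
planes = [probed_height(h) for h in heights]   # 0.00, 0.01, ..., 0.50 mm
```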

<Process>
An example of the processing according to the present embodiment will now be described. FIG. 3 is a flowchart of the processing performed by the three-dimensional shape measuring device 10 to specify a three-dimensional shape. The processing starts with the inspection object 900 placed on the stage 113.

(Step S101): The imaging control unit 12 controls the imaging unit 11 so that the camera 111 images the same area of the inspection object 900 from the same imaging direction at different imaging distances. The initial height of the camera 111 can be set on the basis of the designed shape of the inspection object 900, for example CAD data acquired in advance. The range over which the imaging distance is varied can be set on the basis of, for example, the expected length of the fine protrusion 901.

(Step S102): The imaging control unit 12 acquires the rough shape of the surface of the inspection object 900. The rough shape may be obtained from the designed shape of the inspection object 900, such as CAD data, or from a measurement of the inspection object 900 by a 3D sensor or the like installed near the three-dimensional shape measuring device 10. Alternatively, as described later, the rough shape of the surface may be obtained by calculating, from the captured images, the imaging distance corresponding to the surface of the inspection object 900. Because the rough shape gives the imaging distance to the surface, images whose in-focus plane lies inside the inspection object 900 can be excluded from the subsequent processing, reducing the amount of computation.

(Step S103): The in-focus position calculation unit 13 acquires each image and calculates the degree of focus of each portion of the images captured at imaging distances for which the in-focus position lies at or above the surface of the inspection object 900. Taking a fixed region of each image as the target region, the in-focus position calculation unit 13 can calculate the degree of focus by any of various methods, for example from the absolute value of the brightness gradient, the variance of the brightness, or the edge strength within the target region. The brightness gradient is calculated, for example, for each pixel of the target region from the difference between its brightness value and the brightness values of one or more of its adjacent pixels; any of the various known calculation methods may be used as appropriate.
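As a concrete illustration of the gradient-based focus measure mentioned above, the following Python sketch computes the mean absolute brightness difference between adjacent pixels within a target region. The function name and region convention are assumptions, not part of the patent; the variance- and edge-strength-based variants would be computed analogously.

```python
# A minimal sketch of one possible gradient-based focus measure.
import numpy as np

def focus_degree(image: np.ndarray, region: tuple) -> float:
    """Mean absolute horizontal + vertical brightness difference in `region`.

    image  : 2D grayscale array
    region : (row_slice, col_slice) selecting the target area
    """
    patch = image[region].astype(float)
    # Differences between each pixel and its right / lower neighbour.
    dx = np.abs(np.diff(patch, axis=1))
    dy = np.abs(np.diff(patch, axis=0))
    return float(dx.mean() + dy.mean())

# Example: an in-focus patch with sharp edges scores higher than a blurred one.
sharp = np.tile(np.array([0.0, 255.0] * 8), (16, 1))
blurred = np.full((16, 16), 127.5)
assert focus_degree(sharp, (slice(None), slice(None))) > \
       focus_degree(blurred, (slice(None), slice(None)))
```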

FIGS. 4, 5, and 6 show examples of actual images and distributions of the degree of focus. In the first image, shown in FIG. 4, the degree of focus is high along the dotted line L, the target region, on the surface of the inspection object 900. The brightness therefore changes strongly and steeply along the dotted line L over a wide range of the imaged surface. Because many portions of the image have a high degree of focus, it can be detected that the surface of the inspection object 900 is in focus, and the imaging distance to the surface can be calculated.

The second image, shown in FIG. 5, covers the same area as the first image but was captured at a larger imaging distance. The degree of focus on the surface of the inspection object 900 along the dotted line L (at the same position as in the first image) is low, but the fine protrusion 901 sticking out from the surface toward the camera 111 can be identified, and the degree of focus on that portion is high. The brightness therefore changes strongly and steeply along the dotted line L only in the limited range where the fine protrusion 901 appears.

The third image, shown in FIG. 6, again covers the same area but was captured at a still larger imaging distance than the second image. Along the dotted line L, the degree of focus is low both on the surface of the inspection object 900 and on the fine protrusion 901, so the brightness changes only slightly over the entire range. In the examples of FIGS. 4 to 6, the mean absolute brightness gradient along the dotted line L decreases in the order first image, second image, third image.

(Step S104): On the basis of the calculated degrees of focus in each image, the in-focus position calculation unit 13 calculates as in-focus positions, for example, the positions on and above the surface of the inspection object 900 where the degree of focus is at or above a prescribed value. FIG. 7 shows examples of images captured at different imaging distances (left column) and, for each, an image indicating the in-focus positions calculated from it (right column). The images in the left column were captured with the imaging distance to the surface of the inspection object 900 decreasing from top to bottom. In the right column, the in-focus positions are shown bright, and a portion focused on the fine protrusion 901 can be confirmed above the surface of the inspection object 900. The shape specification unit 14 then calculates, from the in-focus positions in each image calculated by the in-focus position calculation unit 13, the set of in-focus positions (the range over which in-focus positions are distributed) as the in-focus distribution. FIG. 8 shows the images indicating the in-focus positions and an image of the in-focus distribution calculated from them. In the in-focus distribution, higher in-focus positions (in the z direction) are shown brighter, and part of the outline is traced with a dotted line for visibility.
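The following sketch shows, under assumed data structures, how per-image focus-degree maps could be thresholded and merged into an in-focus distribution, that is, the set of in-focus positions described in this step. None of the names or the threshold value come from the patent.

```python
# Minimal sketch: `focus_maps` maps each probed height z to a 2D array of
# focus degrees on a common (y, x) pixel grid (an assumed layout).
import numpy as np

def focus_distribution(focus_maps: dict[float, np.ndarray],
                       threshold: float) -> list[tuple[float, float, float]]:
    """Return (x, y, z) points whose focus degree is at or above `threshold`."""
    points = []
    for z, fmap in focus_maps.items():
        ys, xs = np.nonzero(fmap >= threshold)
        points.extend((float(x), float(y), z) for y, x in zip(ys, xs))
    return points

# Example: a flat surface in focus at z=0.0 plus a small protrusion whose tip
# is only in focus at z=0.3.
surface = np.ones((8, 8)); surface[4, 4] = 0.0   # tip is defocused here
tip = np.zeros((8, 8));    tip[4, 4] = 1.0       # only the tip is sharp here
cloud = focus_distribution({0.0: surface, 0.3: tip}, threshold=0.5)
# `cloud` now holds 63 surface points at z=0.0 and one tip point at z=0.3.
```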

(Step S105): The shape specification unit 14 specifies the three-dimensional shape of the inspection object 900 on the basis of its in-focus distribution. FIG. 9(a) shows an example of a cross section of the specified three-dimensional shape near the fine protrusion 901. As shown in FIG. 9(a), the three-dimensional shape is represented, for example, as the set of points representing the in-focus positions.

(Step S106): The shape extraction unit 15 calculates features from the specified three-dimensional shape. As features of the three-dimensional shape, the shape extraction unit 15 calculates, for example, at least one of the distribution direction of the points representing the in-focus positions, their distribution density, and the normal to the three-dimensional shape at each point, and extracts the shapes constituting each part of the three-dimensional shape on the basis of the results. FIG. 9(b) shows an example of the feature extraction. In this example, the shape extraction unit 15 calculates the normal at each point of the three-dimensional shape from the arrangement of the points representing the in-focus positions, then calculates the curvature of the three-dimensional shape at each point from the normals at neighboring points. On the basis of the curvature, the shape extraction unit 15 extracts shapes such as planes, curved surfaces, and protrusions from the parts of the three-dimensional shape. In the example shown in FIG. 9(c), the shape extraction unit 15 extracts, as a planar portion, the region marked with circles in which the points lie roughly in one plane and the curvature along every direction is small, and extracts, for example as a linear protruding shape, the region marked with squares in which the points are distributed along a narrow strip inclined from the planar portion, with small curvature along the in-plane direction of the page but large curvature in the direction perpendicular to the page. This is merely one example; the types of shape and the extraction method are not limited to it. On the basis of the calculated features and the extracted shapes, the shape extraction unit 15 may also calculate various dimensions of the three-dimensional shape or its parts, such as the surface area of a planar portion or the length of a linear portion.
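One common way to realize normal- and curvature-like features of the kind described here is a principal component analysis of each point's local neighbourhood; the sketch below follows that approach. The patent does not prescribe this particular method, and the neighbourhood radius, flatness threshold, and all names are illustrative assumptions.

```python
# Minimal sketch: PCA over each point's neighbourhood gives an approximate
# normal (smallest-eigenvalue axis) and a curvature-like "surface variation".
import numpy as np

def point_features(points: np.ndarray, radius: float):
    """For each (x, y, z) point, return (normal, surface_variation)."""
    normals, variations = [], []
    for p in points:
        nbrs = points[np.linalg.norm(points - p, axis=1) <= radius]
        centred = nbrs - nbrs.mean(axis=0)
        # Eigen-decomposition of the local covariance (PCA).
        eigval, eigvec = np.linalg.eigh(centred.T @ centred / len(nbrs))
        normals.append(eigvec[:, 0])                 # smallest-eigenvalue axis
        variations.append(eigval[0] / max(eigval.sum(), 1e-12))
    return np.array(normals), np.array(variations)

def classify(points: np.ndarray, radius: float = 1.5, flat_tol: float = 0.01):
    """Label each point 'plane' (low surface variation) or 'protrusion'."""
    _, variation = point_features(points, radius)
    return np.where(variation < flat_tol, "plane", "protrusion")

# Example: a flat 10x10 grid at z=0 plus a short vertical "whisker".
xx, yy = np.meshgrid(np.arange(10.0), np.arange(10.0))
plane = np.stack([xx.ravel(), yy.ravel(), np.zeros(100)], axis=1)
whisker = np.array([[5.0, 5.0, z] for z in (0.5, 1.0, 1.5)])
labels = classify(np.vstack([plane, whisker]))
# Grid points away from the whisker come out as 'plane'; the whisker points
# and the grid points immediately around its base as 'protrusion'.
```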

FIG. 10(a) shows the three-dimensional shape given by the in-focus distribution, with part of the outline supplemented by dotted lines, and FIG. 10(b) shows the three-dimensional shape color-coded according to the result of the shape extraction. As shown in FIG. 10, extracting the shape of each part on the basis of the features of the three-dimensional shape allows the fine protrusion 901 to be well distinguished and identified against the surface of the inspection object 900. If the shape extraction unit 15 uses the features of the rough surface shape acquired in step S102 to distinguish the rough-shape portion from the fine protrusion 901, which is not included in the rough shape, the fine protrusion 901 can be extracted particularly well.

(Effects)
According to the present invention, the in-focus distribution is calculated from the captured images, the three-dimensional shape of the inspection object is specified, features of the three-dimensional shape are extracted, and the shape of each part is extracted. As a result, an object such as a foreign body on the surface of the inspection object can be clearly distinguished from the flat surface of the inspection object and suitably identified even when its three-dimensional shape is unknown.

The present invention can be understood not only as a three-dimensional shape measuring device but also as a three-dimensional shape measuring method, a program executed by a computer of the three-dimensional shape measuring device, and a computer-readable non-transitory recording medium storing the program.

The present invention is useful for, for example, a device for measuring the three-dimensional shape of an object surface.

10 three-dimensional shape measuring device
11 imaging unit
12 imaging control unit
13 in-focus position calculation unit
14 shape specification unit
15 shape extraction unit
111 camera
112 camera support
113 stage
900 inspection object
901 fine protrusion

Claims (1)

A three-dimensional shape measuring device that measures a three-dimensional shape of an inspection object, comprising:
an imaging unit that includes a camera for imaging the inspection object, a distance from the camera to an in-focus position being fixed, and in which an imaging distance from the camera to the inspection object is variable;
an imaging control unit that causes the imaging unit to image the inspection object at a plurality of the imaging distances;
an in-focus position calculation unit that acquires each image captured by the imaging unit and calculates in-focus positions on the basis of a degree of focus in each portion of each image;
a shape specification unit that calculates an in-focus distribution, which is a set of the in-focus positions, on the basis of the in-focus positions calculated by the in-focus position calculation unit, and specifies the three-dimensional shape of the inspection object on the basis of the calculated in-focus distribution; and
a shape extraction unit that calculates a feature at each position of the three-dimensional shape of the inspection object specified by the shape specification unit and extracts, on the basis of the calculated features, a prescribed type of shape constituting the three-dimensional shape.
JP2018171587A 2018-09-13 2018-09-13 Three-dimensional shape measuring device Pending JP2020041976A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
JP2018171587A JP2020041976A (en) 2018-09-13 2018-09-13 Three-dimensional shape measuring device

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
JP2018171587A JP2020041976A (en) 2018-09-13 2018-09-13 Three-dimensional shape measuring device

Publications (1)

Publication Number Publication Date
JP2020041976A true JP2020041976A (en) 2020-03-19

Family

ID=69798086

Family Applications (1)

Application Number Title Priority Date Filing Date
JP2018171587A Pending JP2020041976A (en) 2018-09-13 2018-09-13 Three-dimensional shape measuring device

Country Status (1)

Country Link
JP (1) JP2020041976A (en)

Similar Documents

Publication Publication Date Title
TWI441095B (en) Distance evaluation methods and apparatuses, and machine readable medium thereof
JP6363863B2 (en) Information processing apparatus and information processing method
CN106052591B (en) Measuring device, measurement method, system and article production method
JP5567922B2 (en) Image processing apparatus and control method thereof
KR102368453B1 (en) Device and method for three-dimensional reconstruction of a scene by image analysis
JP6519265B2 (en) Image processing method
JP2017198671A (en) Image processor and image processing method
JP6570370B2 (en) Image processing method, image processing apparatus, program, and recording medium
JP6659098B2 (en) Image processing method, image processing apparatus, program, and recording medium
JP5822463B2 (en) Three-dimensional measuring apparatus, three-dimensional measuring method, and program
JP6581293B2 (en) Measurement of rotational position of lenticular lens sheet
JP6141497B2 (en) Method and measuring device for specifying dimensional characteristics of measurement object
US9972095B2 (en) Image measuring apparatus and non-temporary recording medium on which control program of same apparatus is recorded
JP6555211B2 (en) Edge extraction method for 2D images
JP2013205202A (en) Visual inspection apparatus for solder spike
JP6395429B2 (en) Image processing apparatus, control method thereof, and storage medium
JP2020041976A (en) Three-dimensional shape measuring device
JP6781963B1 (en) Measuring device and measuring method
JP2013002968A (en) Component height measuring method and apparatus therefor
JP2017037017A (en) Distance measurement device and distance measurement method
JP6818263B2 (en) Fracture surface analysis device and fracture surface analysis method
JPWO2013035847A1 (en) Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, computer-readable recording medium
JP2020038096A (en) Three-dimensional shape measuring device
Senthilnathan et al. Estimation of sparse depth based on an inspiration from SFF
JP5712790B2 (en) Line sensor camera calibration apparatus and calibration method