WO2020008713A1 - Measuring device - Google Patents

Measuring device (Dispositif de mesure)

Info

Publication number
WO2020008713A1
Authority
WO
WIPO (PCT)
Prior art keywords
positional relationship
imaging
unit
illumination unit
imaging target
Prior art date
Application number
PCT/JP2019/017337
Other languages
English (en)
Japanese (ja)
Inventor
山田 智明
Original Assignee
Dmg森精機株式会社
Priority date
Filing date
Publication date
Application filed by Dmg森精機株式会社
Publication of WO2020008713A1

Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques

Definitions

  • the present invention relates to an apparatus for measuring an object to be photographed, and particularly to a measuring apparatus for measuring an object to be photographed on a machine tool.
  • an image sensor can be attached to a moving part (for example, a main shaft) of the machine tool to measure the object to be photographed.
  • image unevenness, in particular unevenness in luminance, may occur.
  • an image processing apparatus is known that corrects luminance unevenness by performing shading correction (see, for example, Patent Document 1).
  • Patent Document 1 aims to easily calculate correction information for shading correction, which makes it possible to correct luminance unevenness.
  • the image sensor attached to the moving part of the machine tool needs to be small, lightweight, practical, and inexpensive in consideration of storage in an automatic tool changer. To cope with this, it is preferable to use an image sensor in which ring illumination such as LEDs is arranged around the imaging lens.
  • image unevenness in an imaging target having a three-dimensional structure occurs due to a variation in the angle of view of illumination viewed from each point on the surface of the imaging target.
  • when edges e and f of a convex portion are measured using such an image sensor, in which ring illumination is arranged around the image acquisition unit (imaging lens), a large error, indicated by the arrows E in the profile, may occur due to a difference in light distribution characteristics.
  • the present invention has been made in view of the above-described problem, and its purpose is to provide a measuring apparatus capable of realizing highly accurate measurement of an imaging target using an image sensor even when an error in light distribution characteristics occurs due to the illumination.
  • to this end, a measuring device includes an image acquisition unit, an illumination unit, and a control unit. The image acquisition unit acquires an image when the illumination unit and the imaging target are in a first positional relationship, and an image when the illumination unit and the imaging target are in a second positional relationship symmetrical to the first positional relationship. The control unit measures a surface position of the imaging target based on the images of the first positional relationship and the second positional relationship.
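Read purely as control flow, the arrangement summarized above can be sketched as follows. This is an illustrative sketch only: the names (Pose, camera, light, motion, combine) and their methods are assumptions made for this example, not an API taken from the disclosure.

```python
from dataclasses import dataclass


@dataclass
class Pose:
    x: float      # X-axis position of the moving unit
    y: float      # Y-axis position of the moving unit
    c_deg: float  # C-axis rotation about the Z-axis, in degrees


def measure_surface_position(camera, light, motion, pose_first: Pose,
                             pose_second: Pose, combine):
    """Acquire one image in each of two symmetric positional relationships
    and hand both to `combine`, which stands in for the control unit."""
    light.on()
    motion.move_to(pose_first)           # first positional relationship
    img_first = camera.acquire_image()
    motion.move_to(pose_second)          # second, symmetric positional relationship
    img_second = camera.acquire_image()
    light.off()
    # The control unit derives the surface position from both images,
    # e.g. by superimposing them or by averaging paired edge positions.
    return combine(img_first, img_second)
```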
  • FIG. 4 is a flowchart illustrating an example of a process of performing measurement while offsetting an error in light distribution characteristics using the measurement device illustrated in FIG. 1.
  • FIG. 5 is a flowchart illustrating another example of a process of canceling an error in light distribution characteristics and performing measurement using the measurement device illustrated in FIG. 1. A further figure schematically shows the influence of an error in the light distribution characteristics.
  • in FIGS. 1 and 3A to 3C, directions orthogonal to each other in the horizontal plane of the machine tool are defined as the X-axis direction and the Y-axis direction, the vertical direction of the machine tool is defined as the Z-axis direction, and the rotation direction around the Z-axis is shown as the C-axis direction.
  • FIG. 1 is a perspective view schematically showing a configuration of a measuring device according to one embodiment of the present invention.
  • FIG. 2 is a perspective view schematically illustrating an example of an image sensor provided with an illumination unit and an image acquisition unit.
  • the measurement device 2 includes an image sensor 10 and a control unit 20 that performs control for measurement.
  • the image sensor 10 is electrically connected to the control unit 20. Further, the image sensor 10 is attached to a tool spindle 30 which is a moving part of the machine tool.
  • the drive unit of the moving unit (tool spindle) 30 of the machine tool is electrically connected to the NC device 40, and under the control of the NC device 40 the moving unit (tool spindle) 30 can move in the X-axis, Y-axis, and Z-axis directions and can rotate about its spindle axis.
  • the control unit 20 of the measurement device 2 and the NC device 40 are electrically connected.
  • FIG. 1 shows a case where the photographing target W is attached to the table 50 of the machine tool.
  • the moving unit (tool main shaft) 30 to which the image sensor 10 is attached has a main axis oriented in the Z-axis direction (vertical direction), and the image sensor 10 can image the photographing target W from vertically above.
  • the image sensor 10 attached to the moving part (tool spindle) 30 of the machine tool needs to be small, lightweight, practical and low cost in consideration of storage in an automatic tool changer.
  • the image sensor 10 according to the present embodiment has an image acquisition unit 12 and an illumination unit 14 arranged around the image acquisition unit 12, as shown in FIG. More specifically, an illumination unit 14 that is a ring illumination using six LEDs is arranged around the imaging lens of the image acquisition unit 12. Further, the image sensor 10 includes a shank 16, and the shank 16 is inserted and fixed in a tool holder of a tool spindle 30, which is a moving member of a machine tool.
  • the image sensor 10 can thus acquire an image of the imaging target W with the image acquisition unit 12 while illuminating the imaging target W fixed on the table 50 from above with the illumination unit 14.
  • the image sensor 10 attached to the moving unit (tool spindle) 30 can be moved in the X-axis direction and the Y-axis direction under the control of the NC device 40, and can be rotated in the C-axis direction around the Z-axis.
  • the image sensor 10 including the ring illumination (illumination unit) 14 is small, lightweight, and inexpensive.
  • when edges e and f of a convex portion are measured using an image sensor in which ring illumination is arranged around the image acquisition unit (imaging lens), a large error, indicated by the arrows E in the profile, may occur due to a difference in light distribution characteristics. It may therefore be difficult to measure an imaging target with high accuracy.
  • highly accurate measurement of the imaging target can be realized. This will be described in detail below.
  • FIG. 3A is a diagram schematically illustrating an example of acquiring an image when the illumination unit 14 and the imaging target object W are in the first positional relationship.
  • the rectangular frame in FIG. 3A indicates the field of view (FOV) of the image acquisition unit 12.
  • FIG. 3A also shows the arrangement of the six LEDs L1 to L3 and R1 to R3 constituting the illumination unit 14 around the image acquisition unit 12.
  • the illumination unit 14 (L1 to L3, R1 to R3) and the imaging target W are in the first positional relationship, and the upper surface shape of the imaging target W appears at the upper left of the field of view (FOV) of the image acquisition unit 12.
  • the upper surface of the imaging target W is shown as a rectangle having four edges a to d.
  • the rotation center when rotating the image sensor 10 is indicated by P.
  • FIG. 3B is a diagram schematically illustrating an example in which the positional relationship between the illumination unit 14 and the imaging target object W is changed from the first positional relationship to the second positional relationship.
  • the encircled numeral (3) on the right side of FIG. 3B indicates the image captured when the illumination unit 14 and the imaging target W are in the first positional relationship.
  • under the control of the NC device 40, the moving unit (tool spindle) 30 to which the image sensor 10 is attached moves from the position Ps1 corresponding to the first positional relationship to the position Ps2 corresponding to the second positional relationship, and is further rotated 180 degrees in the C-axis direction around the Z-axis. Thereby, the illumination unit 14 and the imaging target W are brought into the second positional relationship.
  • in FIG. 3A, the position corresponding to the second positional relationship is indicated by a dotted line within the field of view (FOV) of the image acquisition unit 12 in the first positional relationship.
  • FIG. 3C is a diagram schematically illustrating an example of acquiring an image when the illumination unit 14 and the imaging target object W are in the second positional relationship. If the image sensor 10 were rotated 180 degrees without moving in the X-axis and Y-axis directions from the first positional relationship shown in FIG. 3A, the imaging target W would appear at the position shown by the dotted line in FIG. 3C.
  • in this way, symmetrical edges can be brought to the same measurement position in the field of view (FOV) of the image acquisition unit 12. Specifically, the edge a in the first positional relationship coincides with the edge c in the second, the edge b in the first coincides with the edge d in the second, the edge c in the first coincides with the edge a in the second, and the edge d in the first coincides with the edge b in the second. Therefore, even if an error occurs in the light distribution characteristics due to the illumination unit 14, the error can be offset.
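The edge correspondence described above can be checked with a small sketch. It assumes, for illustration only, that the net effect of the 180-degree rotation plus the X/Y translation is a point reflection of the imaged rectangle about its own centre; the coordinates are arbitrary example values in the FOV.

```python
# Midpoints of the four edges a-d in the first positional relationship,
# in arbitrary FOV coordinates (made-up example values).
edges_first = {
    "a": (-4.0, 3.0),   # top edge
    "b": (-2.5, 2.0),   # right edge
    "c": (-4.0, 1.0),   # bottom edge
    "d": (-5.5, 2.0),   # left edge
}
centre = (-4.0, 2.0)    # centre of the imaged rectangle


def point_reflect(point, about):
    """Point reflection of `point` about the point `about`."""
    return (2 * about[0] - point[0], 2 * about[1] - point[1])


# Where each edge ends up in the second positional relationship.
edges_second = {name: point_reflect(p, centre) for name, p in edges_first.items()}

# Opposite edges swap places: a <-> c and b <-> d occupy each other's FOV
# positions, so each FOV location is measured once under each illumination geometry.
assert edges_second["a"] == edges_first["c"]
assert edges_second["c"] == edges_first["a"]
assert edges_second["b"] == edges_first["d"]
assert edges_second["d"] == edges_first["b"]
```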
  • in other words, the image acquisition unit 12 of the image sensor 10 acquires an image when the illumination unit 14 and the imaging target W are in the first positional relationship, and an image when the illumination unit 14 and the imaging target W are in the second positional relationship symmetrical to the first positional relationship.
  • the control unit 20 measures the surface position of the imaging target based on the images of the first positional relationship and the second positional relationship.
  • here, the case where the illumination unit 14 and the imaging target object W are rotated relative to each other by 180 degrees is shown as the “symmetric positional relationship”, but this is merely an example; even if the relative position is slightly shifted, a relationship that is symmetric when viewed from the midpoint between the positions before and after the shift is also included in the “symmetric positional relationship”.
  • for example, it is conceivable for the control unit 20 of the measurement device 2 to superimpose the image captured when the illumination unit 14 and the imaging target W are in the first positional relationship on the image captured in the second positional relationship. As a result, errors in the light distribution characteristics can be effectively canceled, and highly accurate measurement of the imaging target can be efficiently realized.
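One straightforward way to superimpose the two captures, sketched here under the assumption that they are same-sized grayscale arrays in which the symmetric edges already coincide, is a pixel-wise average:

```python
import numpy as np


def superimpose(img_first: np.ndarray, img_second: np.ndarray) -> np.ndarray:
    """Superimpose the two captures by averaging them pixel by pixel.

    Assumes both captures are grayscale arrays of the same shape and that
    the imaging target occupies the same pixel region in both images."""
    if img_first.shape != img_second.shape:
        raise ValueError("captures must have the same shape")
    return (img_first.astype(np.float64) + img_second.astype(np.float64)) / 2.0
```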
  • alternatively, it is also conceivable to acquire measurement data from the image captured when the illumination unit 14 and the imaging target W are in the first positional relationship and measurement data from the image captured when they are in the second positional relationship. By offsetting the measurement data obtained in the first positional relationship against the measurement data obtained in the second positional relationship, it is possible to reliably measure the imaging target with high accuracy.
  • the first and second positional relationships are not limited to the case where the image sensor 10 to which the image acquisition unit 12 and the illumination unit 14 are fixed is moved and rotated as described above.
  • for example, the image acquisition unit may remain fixed with respect to the imaging target while only the illumination unit is moved, the imaging target side may be moved instead, or these may be combined.
  • in addition to rotating and translating the moving unit (tool spindle) 30 of the machine tool, the table 50 on which the imaging target W is placed may be rotated or translated, or both may be combined.
  • the control unit 20 may exist as a control device dedicated to the measurement device, or a control device of the machine tool may be used. Further, the measuring device 2 may have its own moving mechanism and may move from the first positional relationship to the second positional relationship without using the moving mechanism of the machine tool.
  • FIG. 4 is a flowchart showing one example of a process of measuring by offsetting an error in light distribution characteristics using the measurement device shown in FIG.
  • a signal of a preparation command is transmitted from the control unit 20 of the measuring device 2 to the image sensor 10, and after the image sensor 10 completes the preparation, the preparation completion signal is transmitted from the image sensor 10 to the control unit 20.
  • an instruction signal for moving the moving unit (tool spindle) 30 to which the image sensor 10 is attached to the position Ps1 corresponding to the first positional relationship is transmitted from the control unit 20 to the NC device 40.
  • the NC device 40 moves the moving unit (tool spindle) 30 to the position Ps1.
  • a signal indicating the completion of the movement is transmitted from the NC device 40 to the control unit 20.
  • the illumination unit 14 and the photographing target W are arranged in the first positional relationship.
  • a signal of a shooting command is transmitted from the control unit 20 to the image sensor 10.
  • the image sensor 10 performs photographing, and after photographing, transmits a signal indicating that photographing has been completed from the image sensor 10 to the control unit 20.
  • an instruction signal for moving the moving unit (tool spindle) 30 to which the image sensor 10 is attached to the position Ps2 corresponding to the second positional relationship is transmitted from the control unit 20 to the NC device 40.
  • the NC device 40 moves the moving unit (tool spindle) 30 to the position Ps2.
  • a signal indicating the completion of the movement is transmitted from the NC device 40 to the control unit 20.
  • the control unit 20 transmits to the NC device 40 an instruction signal for rotating the main shaft of the moving unit (tool main shaft) 30 to which the image sensor 10 is attached by 180 degrees.
  • the NC device 40 rotates the main shaft of the moving unit (tool main shaft) 30 by 180 degrees.
  • a signal indicating the completion of spindle rotation is transmitted from the NC device 40 to the control unit 20.
  • the illumination unit 14 and the photographing target W are arranged in the second positional relationship.
  • a signal of a shooting command is transmitted from the control unit 20 to the image sensor 10.
  • the image sensor 10 performs photographing, and after photographing, transmits a signal indicating the end of photographing from the image sensor 10 to the control unit 20.
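For reference, the signal exchange described in the steps above can be condensed into an ordered list. The (sender, receiver, message) tuple form is just an illustrative convention for this sketch; the content of each entry follows the text.

```python
# FIG. 4 sequence restated as (sender, receiver, message) tuples.
FIG4_SEQUENCE = [
    ("control unit 20", "image sensor 10", "preparation command"),
    ("image sensor 10", "control unit 20", "preparation complete"),
    ("control unit 20", "NC device 40",    "move moving unit (tool spindle) 30 to position Ps1"),
    ("NC device 40",    "control unit 20", "movement complete (first positional relationship)"),
    ("control unit 20", "image sensor 10", "shooting command"),
    ("image sensor 10", "control unit 20", "shooting complete (first image)"),
    ("control unit 20", "NC device 40",    "move moving unit (tool spindle) 30 to position Ps2"),
    ("NC device 40",    "control unit 20", "movement complete"),
    ("control unit 20", "NC device 40",    "rotate the spindle 180 degrees about the Z-axis"),
    ("NC device 40",    "control unit 20", "spindle rotation complete (second positional relationship)"),
    ("control unit 20", "image sensor 10", "shooting command"),
    ("image sensor 10", "control unit 20", "shooting complete (second image)"),
]

if __name__ == "__main__":
    for sender, receiver, message in FIG4_SEQUENCE:
        print(f"{sender} -> {receiver}: {message}")
```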
  • the measurement position Ps1a of the edge a of the imaging target W is calculated based on the imaging having the first positional relationship, and the measurement position Ps1c of the edge c of the imaging target W is calculated.
  • the measurement position Ps2a of the edge a of the photographing target W is calculated based on the imaging having the second positional relationship, and the measurement position Ps2c of the edge c of the photographing target W is calculated.
  • the errors in the measurement data can then be offset by using “Ps1a - Ps2a” and “Ps1c - Ps2c”.
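To make the offsetting concrete, the toy calculation below uses made-up numbers and assumes that the light-distribution error shifts the detected edges by +delta in the first positional relationship and by -delta in the second. Under that assumed sign-flipping model (one plausible reading of the cancellation), averaging the paired measurements removes the error, while "Ps1a - Ps2a" and "Ps1c - Ps2c" expose its size.

```python
# Toy illustration of the offsetting; all numbers are made up.
true_a, true_c = 10.000, 30.000   # true positions of edges a and c (mm)
delta = 0.012                     # assumed edge shift caused by the light-distribution error

ps1a, ps1c = true_a + delta, true_c + delta   # measured in the first positional relationship
ps2a, ps2c = true_a - delta, true_c - delta   # measured in the second (rotated 180 degrees)

# Averaging the paired measurements cancels the assumed error ...
assert abs((ps1a + ps2a) / 2 - true_a) < 1e-9
assert abs((ps1c + ps2c) / 2 - true_c) < 1e-9

# ... while "Ps1a - Ps2a" and "Ps1c - Ps2c" quantify it (about 2*delta each).
assert abs((ps1a - ps2a) - 2 * delta) < 1e-9
assert abs((ps1c - ps2c) - 2 * delta) < 1e-9
```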
  • FIG. 5 is a flowchart showing another example of the step of performing measurement by offsetting an error in light distribution characteristics using the measurement apparatus shown in FIG.
  • in the example described above, the image acquisition unit 12 of the image sensor 10 acquires images in the positional relationship obtained by rotating the relationship between the illumination unit 14 and the imaging target W by 180 degrees.
  • this example differs in that, taking a predetermined positional relationship between the illumination unit 14 and the imaging target W as 0 degrees, the image acquisition unit 12 of the image sensor 10 acquires images at rotations of 90 degrees, 180 degrees, and 270 degrees.
  • the moving unit (tool spindle) 30 to which the image sensor 10 is attached is moved in the X-axis direction and the Y-axis direction so that the symmetrical edge is located at the same position in the field of view.
  • by taking the predetermined positional relationship between the illumination unit 14 and the imaging target W as 0 degrees and acquiring images in the symmetrical positional relationships rotated by 90 degrees, 180 degrees, and 270 degrees, the image acquisition unit 12 allows the error in the light distribution characteristics to be offset, and highly accurate two-dimensional measurement of the imaging target can be realized. The other points are the same as in the example of the process described above, and further description is therefore omitted.
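A minimal sketch of this four-orientation variant is shown below. The callables acquire_at and detect_point are stand-ins supplied by the caller (assumptions for this sketch, not functions from the disclosure), and the detected positions are assumed to be expressed in a common workpiece coordinate frame.

```python
import numpy as np


def measure_with_four_rotations(acquire_at, detect_point):
    """Acquire images with the illumination/target relationship rotated
    0, 90, 180 and 270 degrees and average the detected feature position.

    `acquire_at(angle_deg)` returns an image; `detect_point(image)` returns
    an (x, y) position in workpiece coordinates."""
    angles = (0.0, 90.0, 180.0, 270.0)
    points = [np.asarray(detect_point(acquire_at(a)), dtype=float) for a in angles]
    # With captures symmetric in both axes, a direction-dependent
    # light-distribution bias tends to average out in the mean.
    return np.mean(points, axis=0)
```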
  • by comparing the case where both the illumination unit 14 and a white light source are turned on with the case where only one of them is turned on, a wavelength of the light emitted by the illumination unit 14 for which the difference in luminance or the ratio of luminance is large can be detected.
  • when the illumination unit 14 is a ring illumination including a red LED, a green LED, and a blue LED, a wavelength of the emitted light for which the difference in luminance or the ratio of luminance is large can be detected by changing the output of the LED of each wavelength.
  • as the white light source, it is conceivable to use the in-machine illumination of the machine tool or the lighting of the room in which the machine tool is installed.
  • the image acquisition unit 12 acquires an image of the imaging target W with both the illumination unit 14 and the white light source turned on, and further acquires an image of the imaging target W with only the illumination unit 14 or only the white light source turned on. Then, among the wavelengths of light that the illumination unit 14 can emit, the wavelength for which the difference in luminance or the ratio of luminance between the two acquired images is large in the region where the imaging target W is measured is determined. The image acquisition unit 12 measures the surface position of the imaging target W using the emitted light of the determined wavelength.
  • a wavelength having a large difference in luminance or a large luminance ratio in the measurement region of the imaging target W is used.
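The wavelength selection described above could be sketched as follows. The HxWx3 array layout, the channel order, and the use of the mean luminance difference inside the measurement region as the score are assumptions made for this illustration (the luminance ratio could be used as the score instead).

```python
import numpy as np


def choose_measurement_wavelength(img_both_on, img_one_on, measurement_mask,
                                  channels=("red", "green", "blue")):
    """Pick the illumination colour channel whose mean luminance inside the
    measurement region changes most between the two lighting states.

    `img_both_on` is taken with the illumination unit and the white light
    source on, `img_one_on` with only one of them on; both are HxWx3 arrays.
    `measurement_mask` is a boolean HxW array marking the measured region."""
    scores = {}
    for i, name in enumerate(channels):
        both = img_both_on[..., i][measurement_mask].astype(np.float64).mean()
        one = img_one_on[..., i][measurement_mask].astype(np.float64).mean()
        scores[name] = abs(both - one)   # luminance difference for this wavelength
    return max(scores, key=scores.get)   # wavelength with the largest difference
```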
  • Reference numerals: 2 Measurement device; 10 Image sensor; 12 Image acquisition unit; 14 Illumination unit; 16 Shank; 20 Control unit; 30 Moving unit (tool spindle); 40 NC device; 50 Table; W Imaging target (object to be photographed)

Abstract

This invention relates to a measuring device provided with an image acquisition unit, an illumination unit, and a control unit, the image acquisition unit acquiring an image in a case in which the illumination unit and an imaging target are in a first positional relationship, and an image in a case in which the illumination unit and the imaging target are in a second positional relationship symmetrical to the first positional relationship, and the control unit measuring the surface position of the imaging target on the basis of the images in the first positional relationship and the second positional relationship. Highly accurate measurement of the imaging target using an image sensor can thus be performed even when an error occurs in the light distribution characteristics due to the illumination.
PCT/JP2019/017337 2018-07-04 2019-04-24 Dispositif de mesure WO2020008713A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2018-127663 2018-07-04
JP2018127663A JP7083282B2 (ja) 2018-07-04 2018-07-04 Measuring device

Publications (1)

Publication Number Publication Date
WO2020008713A1 (fr) 2020-01-09

Family

ID=69060636

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2019/017337 WO2020008713A1 (fr) 2018-07-04 2019-04-24 Dispositif de mesure

Country Status (2)

Country Link
JP (1) JP7083282B2 (fr)
WO (1) WO2020008713A1 (fr)

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPS6361104A (ja) * 1986-08-30 1988-03-17 T Ii D:Kk Positional deviation inspection device
JPS63106508A (ja) * 1986-10-24 1988-05-11 Toshiba Corp Mounting state inspection method and apparatus
JP2007040801A (ja) * 2005-08-02 2007-02-15 Techno Horon:Kk Three-dimensional coordinate measuring device and method
JP2008026255A (ja) * 2006-07-25 2008-02-07 Kobe Steel Ltd Flaw inspection device and flaw inspection method
JP2013113793A (ja) * 2011-11-30 2013-06-10 Panasonic Corp Three-dimensional measuring device and illumination device used therefor

Family Cites Families (8)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JPH0599632A (ja) * 1991-10-07 1993-04-23 Bekutoru:Kk Illumination method for object measurement with a video camera, and illumination device used therefor
JP2000097669A (ja) 1998-09-18 2000-04-07 Canon Inc Lightwave interferometer apparatus and data processing method in the apparatus
US20110063437A1 (en) 2008-08-20 2011-03-17 Tatsumi Watanabe Distance estimating device, distance estimating method, program, integrated circuit, and camera
JP2010266330A (ja) 2009-05-14 2010-11-25 Yokogawa Electric Corp Planar motor
JP6241935B2 (ja) 2014-02-12 2017-12-06 東レエンジニアリング株式会社 Device for inspecting the application state of fiber-reinforced plastic tape
JP6345944B2 (ja) 2014-02-21 2018-06-20 株式会社ミツトヨ Oblique incidence interferometer
JP6547472B2 (ja) 2015-07-10 2019-07-24 日本製鉄株式会社 Shape measuring device
WO2017168469A1 (fr) 2016-03-28 2017-10-05 パナソニックIpマネジメント株式会社 Visual inspection apparatus and visual inspection method

Also Published As

Publication number Publication date
JP7083282B2 (ja) 2022-06-10
JP2020008348A (ja) 2020-01-16

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 19831224

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 19831224

Country of ref document: EP

Kind code of ref document: A1