WO2022244175A1 - Shape measuring device - Google Patents
- Publication number
- WO2022244175A1 (PCT/JP2021/019128)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- line light
- image
- light
- measurement area
- measurement
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2518—Projection by scanning of the object
- G01B11/2522—Projection by scanning of the object the position of the object changing and being recorded
Definitions
- the present disclosure relates to a shape measuring device.
- a contact measurement method and a non-contact measurement method are known as methods for measuring the three-dimensional shape of an object.
- the light section method is widely used.
- a light projecting unit irradiates an object with line light
- an imaging unit captures an image of the line light projected onto the object.
- the direction of the imaging visual field of the imaging unit is different from the irradiation direction of the line light.
- the three-dimensional shape of an object is measured based on images that are sequentially obtained by moving the position of the object relative to an optical system that includes the light projecting unit and the imaging unit.
- the resolution of the shape measuring device is the value obtained by dividing the imaging field of view of the imaging unit by the number of pixels. Therefore, when the measurement range in the height direction of the object is widened, the resolution is lowered.
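The resolution relationship described above can be sketched numerically. The field-of-view and pixel values below are hypothetical, chosen only to illustrate the trade-off:

```python
def resolution_mm_per_px(field_of_view_mm: float, num_pixels: int) -> float:
    """Resolution of the shape measuring device: the imaging field of view
    divided by the number of pixels (mm per pixel; smaller is finer)."""
    return field_of_view_mm / num_pixels

# Hypothetical sensor with 1000 pixels along the height direction.
narrow_range = resolution_mm_per_px(10.0, 1000)  # 0.01 mm/px
wide_range = resolution_mm_per_px(20.0, 1000)    # 0.02 mm/px

# Widening the height-direction measurement range with the same sensor
# lowers the resolution (the mm/px value grows).
assert wide_range > narrow_range
```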
- a shape measuring apparatus having two imaging units is known (see, for example, Patent Literature 1).
- the configuration of Patent Literature 1 has the problem of high cost because two imaging units are provided. Moreover, in the configuration of Patent Literature 1, the number of image data required for the measurement process increases, so the computational load also increases. Furthermore, the configuration of Patent Literature 1 requires synchronous control for synchronizing the images acquired by the two imaging units, which also complicates the apparatus.
- An object of the present disclosure is to provide a shape measuring device that is low in cost, has a low computational load, and is simple in configuration.
- a shape measuring apparatus according to the present disclosure includes: a first light projecting unit that emits first line light, which is linear light, and second line light, which is linear light; a first imaging unit that has a first imaging field of view, images an object passing through a first measurement region, which is a region where a first plane through which the first line light passes intersects the first imaging field of view, and images an object passing through a second measurement region different from the first measurement region, which is a region where a second plane through which the second line light passes intersects the first imaging field of view; and a measuring unit that measures the shape of the object based on a first image, which is an image of a portion of the object passing through the first measurement region that is irradiated with the first line light, and a second image, which is an image of a portion of the object passing through the second measurement region that is irradiated with the second line light.
- FIG. 1 is a perspective view showing a schematic configuration of a shape measuring device according to Embodiment 1 and a first object.
- FIG. 2 is a view of the shape measuring apparatus according to Embodiment 1 and the first object, as seen in the X-axis direction.
- FIG. 3 is a view of the shape measuring apparatus according to Embodiment 1 and the first object at the start of measurement, as seen in the X-axis direction.
- FIGS. 4A and 4B are diagrams showing an example of the transition of images captured by the imaging unit shown in FIGS. 1 to 3 when the first object passes through the first measurement area.
- FIG. 5(A) is a view of the shape measuring apparatus according to Embodiment 1 and a second object at the start of measurement, as seen in the X-axis direction; FIG. 5(B) is a view of the same apparatus and the second object passing through the second measurement area.
- FIGS. 6A to 6D are diagrams showing an example of the transition of images captured by the imaging unit when the second object passes through the first and second measurement areas.
- FIG. 7 is a view of the shape measuring device and the object according to Embodiment 2, as seen in the X-axis direction.
- FIG. 8 is a view of a shape measuring device and an object according to Embodiment 3, as seen in the X-axis direction.
- FIG. 9 is a view of a shape measuring device and an object according to Embodiment 4, as seen in the X-axis direction.
- a shape measuring device according to an embodiment of the present disclosure will be described below with reference to the drawings.
- the following embodiments are merely examples, and the embodiments can be combined as appropriate and each embodiment can be modified as appropriate.
- the shape measuring device measures the three-dimensional shape of an object to be measured (hereinafter also referred to as "object") by the light section method.
- the drawing shows the coordinate axes of the XYZ orthogonal coordinate system.
- the X-axis and Y-axis are coordinate axes parallel to the reference plane S on which the object is placed.
- the Y-axis is a coordinate axis parallel to the scanning direction of the shape measuring device.
- the Z-axis is a coordinate axis orthogonal to the X-axis and the Y-axis, and is the direction of the height of the object to be measured from the reference plane S.
- FIG. 1 is a perspective view showing a schematic configuration of a shape measuring apparatus 100 and a first target object 110 according to Embodiment 1.
- FIG. 2 is a diagram of the shape measuring apparatus 100 and the first object 110 shown in FIG. 1 as viewed in the X-axis direction.
- a first object 110 (hereinafter also referred to as “object 110”) is an example of a measurement object.
- the shape measuring apparatus 100 includes a light projecting section 10 (hereinafter also referred to as the "first light projecting section 10"), an imaging section 20 (hereinafter also referred to as the "first imaging section 20"), and a measuring section 50.
- the light projecting unit 10 emits a first line light L1 that is linear light and a second line light L2 that is linear light.
- the light projecting unit 10 emits two line lights.
- the first line light L1 and the second line light L2 are lights that spread in the X-axis direction.
- the first line light L1 and the second line light L2 are projected onto the surface of the first object 110.
- the shape of each image (pattern) of the first line light L1 and the second line light L2 projected onto the surface of the first object 110 is a line shape extending in the X-axis direction. Note that the light projecting unit 10 may emit three or more line lights.
- the second line light L2 is emitted from a position distant from the first line light L1 in the Y-axis direction, which is the predetermined scanning direction.
- the second line light L2 is parallel to the first line light L1 in the X-axis direction.
- the second line light L2 may be parallel to the first line light L1 in one predetermined direction, not limited to the X-axis direction.
- the light projecting unit that emits the first line light L1 and the light projecting unit that emits the second line light L2 may be separate bodies.
- the wavelength of the first line light L1 may be different from the wavelength of the second line light L2, and the power of the first line light L1 may be different from the power of the second line light L2.
- the light projecting section 10 has, for example, a spot light source 10a, a collimator lens 10b, and a cylindrical lens 10c.
- the collimator lens 10b collimates the light emitted from the spot light source 10a.
- the cylindrical lens 10c makes the pattern of the light collimated by the collimating lens 10b linear. Thereby, the first line light L1 and the second line light L2 are formed.
- the imaging unit 20 is, for example, a camera having a CMOS (Complementary Metal Oxide Semiconductor) image sensor.
- the imaging unit 20 has an imaging field of view F1 (hereinafter also referred to as “first imaging field of view F1”).
- the direction of the imaging visual field F1 is different from the irradiation direction of the first line light L1 and the second line light L2 by the light projecting unit 10 .
- the imaging unit 20 images the first object 110 passing through the first measurement area Z1.
- the first measurement area Z1 is an area where the first line light L1 and the imaging field of view F1 intersect.
- the first measurement area Z1 is an area where the first plane V1 through which the first line light L1 passes and the field of view F1 intersect.
- the first plane V1 is a virtual XZ plane.
- a height H1 of the first object 110 from the reference plane S is included in the first measurement area Z1.
- although the first line light L1 and the second line light L2 may not be strictly parallel to the XZ plane depending on the incident angle, they are irradiated along the XZ plane. Therefore, each of the first plane V1 and the second plane V2, through which the second line light L2 passes (described later), is referred to as a "virtual XZ plane".
- the imaging unit 20 acquires an image of a portion of the first object 110 passing through the first measurement area Z1, which is irradiated with the first line light L1.
- the portion of the first object 110 irradiated with the first line light L1 is, for example, the surface (hereinafter also referred to as the "upper surface") 111 of the first object 110 facing the +Z-axis direction.
- the imaging unit 20 also images the second object 120 (described later) passing through a second measurement area Z2 different from the first measurement area Z1.
- the second measurement area Z2 is an area where the second line light L2 and the imaging field of view F1 intersect.
- the second measurement area Z2 is an area where the second plane V2 through which the second line light L2 passes and the field of view F1 intersect.
- a second plane V2 is a virtual XZ plane.
- the second measurement area Z2 is arranged closer to the light projecting section 10 (that is, +Z-axis side) than the first measurement area Z1.
- the second measurement area Z2 is arranged adjacent to the first measurement area Z1 in the Z-axis direction.
- the second measurement area Z2 does not overlap the first measurement area Z1 in the Z-axis direction. If most of the second measurement area Z2 does not overlap the first measurement area Z1, part of the second measurement area Z2 may overlap the first measurement area Z1. That is, the second measurement area Z2 should include an area that does not overlap with the first measurement area Z1.
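The overlap condition just stated, that Z2 should include an area that does not overlap Z1, can be checked with simple interval arithmetic. The interval endpoints below are hypothetical:

```python
def overlap_len(a: tuple, b: tuple) -> float:
    """Length of the overlap between two Z-axis intervals given as (lo, hi)."""
    return max(0.0, min(a[1], b[1]) - max(a[0], b[0]))

z1 = (0.0, 10.0)    # hypothetical first measurement area Z1
z2 = (10.0, 20.0)   # Z2 adjacent to Z1 in the Z-axis direction, no overlap
assert overlap_len(z1, z2) == 0.0

# A partial overlap is acceptable as long as Z2 keeps a portion that does
# not overlap Z1 (here the part above Z = 10.0):
z2_partial = (9.0, 20.0)
assert overlap_len(z1, z2_partial) < (z2_partial[1] - z2_partial[0])
```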
- the incident angle of the second line light L2 with respect to the normal W of the reference plane S (for example, the angle θ shown in FIG. 2) is different from the incident angle of the first line light L1.
- the first line light L1 is incident along the normal line W on the object. Therefore, the incident angle of the first line light L1 to the object is 0 degrees.
- the angle of incidence of the second line light L2 on the object is an angle θ greater than 0 degrees.
- note that the incident angle of the second line light L2 may be the same as the incident angle of the first line light L1. That is, the second incident angle, which is the incident angle of the second line light L2, may be greater than or equal to the first incident angle, which is the incident angle of the first line light L1.
- by setting the incident angles in this way, the second measurement area Z2 can be made different from the first measurement area Z1.
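As a rough geometric sketch of why a different incident angle yields a different measurement area: a line light emitted from a source offset in the Y direction and tilted by θ crosses each height Z at a different Y position, so the range of heights at which it falls inside the imaging field of view shifts. All positions and angles below are hypothetical:

```python
import math

def line_light_y_at_height(y_src: float, z_src: float, theta_deg: float, z: float) -> float:
    """Y-position at which a line light emitted from (y_src, z_src) with
    incidence angle theta_deg (measured from the normal W) crosses height z.
    theta_deg = 0 corresponds to the first line light L1 (vertical incidence)."""
    return y_src + (z_src - z) * math.tan(math.radians(theta_deg))

# First line light (0 deg): stays at the same Y at every height.
print(line_light_y_at_height(0.0, 100.0, 0.0, 50.0))    # 0.0
# Second line light (emitted from an offset position, tilted toward -Y):
# its Y-position shifts with height, so it lies inside the imaging field
# of view over a different height interval (the second measurement area Z2).
print(line_light_y_at_height(5.0, 100.0, -10.0, 50.0))  # negative (shifted toward -Y)
```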
- the length A2 of the second measurement area Z2 is the same as the length A1 of the first measurement area Z1 in the Z-axis direction.
- the length A1 of the first measurement area Z1 may be shorter than the length A2 of the second measurement area Z2. That is, the length A1 of the first measurement area Z1 should be equal to or less than the length A2 of the second measurement area Z2.
- the shape measuring apparatus 100 further has a moving unit (not shown) that moves (conveys) the object in the scanning direction (-Y-axis direction in Embodiment 1).
- the moving unit may move the optical system 15 including the light projecting unit 10 and the imaging unit 20 in the scanning direction while the object is kept stationary. That is, in Embodiment 1, the optical system 15 and the object relatively move in the Y-axis direction.
- the imaging unit 20 sequentially captures images including the image of the line light projected onto the object.
- the measuring section 50 measures the shape of the object based on the image acquired by the imaging section 20 .
- the measurement unit 50 is, for example, a CPU (Central Processing Unit) having a memory.
- the measurement unit 50 calculates data indicating height information of the object based on the image acquired by the imaging unit 20. For example, when the object is the first object 110 having a height H1, the measurement unit 50 calculates the height H1 of the first object 110 based on the image of the portion of the first object 110 passing through the first measurement area Z1 that is irradiated with the first line light L1.
- the measurement unit 50 needs to determine whether the image of the line light projected onto the object is the image of the first line light L1 or the image of the second line light L2. A determination method used by the measurement unit 50 will be described below. First, the images captured by the imaging unit 20 when the first object 110 passes through the first measurement area Z1 will be described with reference to FIGS. 3, 4A, and 4B.
- FIG. 3 is a diagram of the shape measuring device 100 and the first object 110 viewed in the X-axis direction at the start of measurement.
- the first line light L1 is applied to the edge of the upper surface 111 of the first object 110 on the -Y-axis side.
- as the first object 110 moves in the -Y-axis direction, the imaging unit 20 acquires images that include the image of the first line light L1 projected onto the upper surface 111 of the first object 110.
- FIGS. 4A and 4B are diagrams showing an example of the transition of images captured by the imaging unit 20 shown in FIGS. 1 to 3 when the shape measuring device 100 measures the shape of the first object 110. To facilitate the explanation of the images shown in FIGS. 4A and 4B and FIGS. 6A to 6D described later, each image is given a Y1 axis and an X1 axis, which is a coordinate axis orthogonal to the Y1 axis.
- FIG. 4A is a schematic diagram showing an example of an image B10 acquired at time t < 0, that is, before the measurement starts. At this time, the first line light L1 is applied not to the first object 110 but to the reference plane S. Therefore, the image P10 of the first line light L1 projected onto the reference plane S appears in the image B10.
- the reference plane S is also irradiated with the second line light L2, but the image of the second line light L2 is not included within the imaging field F1. Therefore, the image of the second line light L2 does not appear in the image B10.
- FIG. 4B is a schematic diagram showing an example of an image B11 acquired at time t ≥ 0. At this time, the first line light L1 is applied to the reference plane S and to the upper surface 111 of the first object 110. Therefore, in addition to the image P10, the image P11 of the first line light L1 projected onto the upper surface 111 appears in the image B11.
- the position of the image P11 in the image B11 in the Y1-axis direction corresponds to the height H1 of the first object 110. Specifically, the distance between the images P10 and P11 in the Y1-axis direction corresponds to the height H1.
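A minimal sketch of this height calculation, assuming a hypothetical calibration factor that converts the Y1-axis pixel separation between P10 and P11 into millimeters:

```python
def height_from_image(p10_y1_px: float, p11_y1_px: float, mm_per_px: float) -> float:
    """Height of the object above the reference plane S, recovered from the
    Y1-axis separation of the line-light images in one captured frame.
    p10_y1_px: position of P10 (line light on the reference plane S).
    p11_y1_px: position of P11 (same line light on the object's upper surface).
    mm_per_px: hypothetical calibration factor of the imaging unit."""
    return abs(p11_y1_px - p10_y1_px) * mm_per_px

# 200 px of separation at 0.05 mm/px corresponds to a 10 mm object height.
print(height_from_image(120.0, 320.0, 0.05))  # 10.0
```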
- the second line light L2 is also applied to the reference plane S and the first object 110, but the image of the second line light L2 is not included within the imaging field of view F1. Therefore, the image of the second line light L2 does not appear in the image B11.
- the second object 120 also moves in the -Y-axis direction, like the first object 110 .
- the height H2 of the second object 120 from the reference plane S is included in the first measurement area Z1 and the second measurement area Z2.
- when the second object 120 moves in the -Y-axis direction, the imaging unit 20 acquires images of the portions of the second object 120 that are irradiated with the first line light L1 and the second line light L2.
- a portion of the second object 120 irradiated with the first line light L1 is an upper surface 121 of the second object 120 .
- the portions of the second object 120 that are irradiated with the second line light L2 are the upper surface 121 and the side surface 122 of the second object 120.
- the first line light L1 is applied to the edge of the upper surface 121 of the second object 120 on the -Y axis side.
- FIG. 5(B) is a view of the shape measuring apparatus 100 according to Embodiment 1 and the second object 120 passing through the second measurement area Z2 after the start of measurement, viewed in the X-axis direction.
- the side surface 122 of the second object 120 is irradiated with the second line light L2.
- the time at which the side surface 122 of the second object 120 is irradiated with the second line light L2 is t1. As time passes from time t1, the portion of the second object 120 irradiated with the second line light L2 changes from the side surface 122 to the top surface 121.
- as long as the height H2 of the second object 120 is included in the second measurement area Z2, the time t1 is constant regardless of the height H2.
- FIGS. 6A to 6D are diagrams showing an example of the transition of images captured by the imaging unit 20 shown in FIGS. 5(A) and 5(B) when the shape measuring device 100 measures the shape of the second object 120.
- FIG. 6A is a schematic diagram showing an example of an image B20 acquired at time t < 0.
- the first line light L1 does not irradiate the second object 120, but irradiates the reference plane S.
- the reference plane S is also irradiated with the second line light L2, but the image of the second line light L2 is not included within the imaging field of view F1. Therefore, the image of the second line light L2 does not appear in the image B20.
- FIG. 6(B) is a schematic diagram showing an example of an image B21 acquired after the start of measurement of the second object 120 and before time t1 has passed.
- the first line light L1 is applied to the reference plane S and the upper surface 121.
- however, the image of the first line light L1 projected onto the upper surface 121 is not included within the imaging field of view F1. Therefore, the image B21 does not show the image of the first line light L1 projected onto the upper surface 121 of the second object 120, but shows the image P10 of the first line light L1 projected onto the reference plane S.
- FIG. 6(C) is a schematic diagram showing an example of an image B22 acquired at time t1. At this time, the first line light L1 illuminates the reference plane S and the upper surface 121, and the second line light L2 illuminates the second object 120. The image of the first line light L1 projected onto the upper surface 121 is not included within the imaging field of view F1. Therefore, the image B22 shows the image P10 of the first line light L1 projected onto the reference plane S and the image P21 of the second line light L2 projected onto the side surface 122. The images P10 and P21 appear at the end of the image B22 on the -Y1-axis side.
- FIG. 6(D) is a schematic diagram showing an example of the image B23 acquired after the time t1 has passed.
- the reference surface S and the upper surface 121 are irradiated with the first line light L1.
- the second line light L2 irradiates the side surface 122 for a certain period of time and then irradiates the top surface 121 .
- the image of the first line light L1 projected onto the upper surface 121 is not included within the imaging field of view F1. Therefore, in the image B23, in addition to the image P10 of the first line light L1 projected onto the reference plane S, the image P22 of the second line light L2 projected onto the upper surface 121 or the side surface 122 is shown.
- when the first object 110 passes through the first measurement area Z1, the image B11 is obtained as the first image including the image P11 of the first line light L1. When the second object 120 passes through the second measurement area Z2, the images B22 and B23 are obtained as the second images including the images P21 and P22 of the second line light L2. That is, the line light image included in the image acquired by the imaging unit 20 differs depending on the height of the object.
- in addition, the transition of the images acquired by the imaging unit 20 differs depending on the height of the object. Specifically, when the second object 120 passes through the first measurement area Z1 and the second measurement area Z2, the images B20, B21, B22, and B23 are captured in this order. When the first object 110 passes through the first measurement area Z1, the image B10 and the image B11 are captured in this order.
- since the line light image included in the image acquired by the imaging unit 20 and the transition of the images differ according to the height of the object, the measurement unit 50 can determine whether a captured line light image is the image of the first line light L1 or the image of the second line light L2.
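One way to express this determination as code is the heuristic below. It is only an illustrative sketch of the cues the text describes (a second-line-light image first appears at the -Y1 edge of the frame, at the fixed time t1), not the patent's claimed implementation; all parameter names and tolerances are assumptions:

```python
def classify_line_image(first_seen_time: float, appear_pos_y1_px: float,
                        frame_edge_y1_px: float, t1: float,
                        pos_tol_px: float = 2.0, time_tol: float = 1e-3) -> str:
    """Decide whether a newly appearing line-light image was produced by the
    first line light (L1) or the second line light (L2).
    An L2 image first appears at the -Y1 edge of the frame at the fixed
    time t1 (independent of the object height), whereas an L1 image appears
    at a Y1 position that directly encodes the object height."""
    appears_at_edge = abs(appear_pos_y1_px - frame_edge_y1_px) <= pos_tol_px
    at_time_t1 = abs(first_seen_time - t1) <= time_tol
    return "L2" if (appears_at_edge and at_time_t1) else "L1"

# Image appearing at the frame edge exactly at t1 -> second line light.
print(classify_line_image(1.0, 0.0, 0.0, t1=1.0))    # L2
# Image appearing mid-frame before t1 -> first line light.
print(classify_line_image(0.5, 200.0, 0.0, t1=1.0))  # L1
```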
- as described above, the imaging unit 20 of the shape measuring apparatus 100 images the first object 110 passing through the first measurement region Z1 and the second object 120 passing through the second measurement region Z2.
- the measurement unit 50 measures the shape of the first object 110 based on the first image B11 of the portion of the first object 110 irradiated with the first line light L1.
- the measurement unit 50 also measures the shape of the second object 120 based on the second images B22 and B23 of the portion of the second object 120 irradiated with the second line light L2.
- the shape measuring apparatus 100 can widen the measurement range in the height direction of the object and measure the object with high accuracy by using the single imaging unit 20 . Therefore, it is possible to provide the shape measuring apparatus 100 that expands the measurement range at low cost and measures an object with high accuracy.
- one imaging unit 20 has a plurality of measurement areas (that is, the first measurement area Z1 and the second measurement area Z2) corresponding to the height of the object. Even so, the number of pixels in the images processed by the measurement unit 50 is the same. Therefore, the calculation load on the measurement unit 50 is reduced, and the calculation processing in the measurement unit 50 can be sped up.
- furthermore, in Embodiment 1, synchronization control of the images acquired by two imaging units is not required, unlike a configuration in which the shape measuring device is provided with two imaging units. Therefore, the simple shape measuring device 100 can be provided. Thus, according to Embodiment 1, it is possible to provide the shape measuring apparatus 100 that is low in cost, has a reduced calculation load, and is simple.
- FIG. 7 is a view of the shape measuring apparatus 200 and the object 110 according to Embodiment 2 as seen in the X-axis direction. In FIG. 7, components that are the same as or correspond to those shown in FIG. 2 are given the same reference numerals as in FIG. 2.
- the shape measuring apparatus 200 according to Embodiment 2 differs from the shape measuring apparatus 100 according to Embodiment 1 in that the length A22 of the second measurement area Z22 in the Z-axis direction differs from the length A21 of the first measurement area Z21. Except for this point, the shape measuring apparatus 200 according to Embodiment 2 is the same as the shape measuring apparatus 100 according to Embodiment 1.
- the shape measuring device 200 has a light projecting section 10, an imaging section 220, and a measuring section 50.
- the imaging unit 220 images the object 110 passing through the first measurement area Z21 and the second measurement area Z22.
- the first measurement area Z21 is an area where the first plane V1 through which the first line light L1 passes and the imaging field F1 of the imaging unit 220 intersect.
- the second measurement area Z22 is an area where the second plane V2 through which the second line light L2 passes and the imaging visual field F1 of the imaging unit 220 intersect.
- the length A22 of the second measurement area Z22 differs from the length A21 of the first measurement area Z21 in the Z-axis direction. Thereby, the resolution in the second measurement area Z22 can be changed with respect to the resolution in the first measurement area Z21.
- length A21 is shorter than length A22.
- the resolution of each of the first measurement area Z21 and the second measurement area Z22 is a value obtained by dividing the imaging field of view by the number of pixels.
- the number of pixels when imaging the object 110 passing through the first measurement region Z21 is the same as the number of pixels when imaging the object 110 passing through the second measurement region Z22.
- therefore, the resolution value in the first measurement area Z21 is smaller than the resolution value in the second measurement area Z22. Thereby, the resolution in the first measurement area Z21 can be made higher than the resolution in the second measurement area Z22.
- note that, conversely, the length A21 may be made longer so that the length A22 is shorter than the length A21. That is, one of the length A21 and the length A22 should be shorter than the other.
- the focal position of the imaging unit 220 should be included in whichever of the first measurement region Z21 and the second measurement region Z22 has the shorter length in the Z-axis direction.
- the length A22 of the second measurement area Z22 differs from the length A21 of the first measurement area Z21 in the Z-axis direction. Thereby, the resolution in the first measurement area Z21 can be changed with respect to the resolution in the second measurement area Z22.
- the length A21 of the first measurement area Z21 is shorter than the length A22 of the second measurement area Z22. Thereby, the resolution in the first measurement area Z21 can be increased with respect to the resolution in the second measurement area Z22.
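The resolution difference between the two measurement areas in Embodiment 2 can be illustrated numerically; the lengths and pixel count below are hypothetical:

```python
def region_resolution(region_length_mm: float, num_pixels: int) -> float:
    """Resolution within one measurement area: its Z-axis length divided by
    the number of pixels covering it (smaller value = finer resolution)."""
    return region_length_mm / num_pixels

# The same pixel count covers both areas, so the shorter area Z21 is
# sampled more finely than the longer area Z22.
a21, a22 = 8.0, 16.0                # hypothetical lengths, A21 < A22
r21 = region_resolution(a21, 800)   # 0.01 mm/px
r22 = region_resolution(a22, 800)   # 0.02 mm/px
assert r21 < r22
```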
- FIG. 8 is a diagram of the shape measuring apparatus 300 and the object 110 according to Embodiment 3 as viewed in the X-axis direction. In FIG. 8, components that are the same as or correspond to those shown in FIG. 2 are given the same reference numerals as in FIG. 2.
- the shape measuring apparatus 300 according to Embodiment 3 differs from the shape measuring apparatus 100 according to Embodiment 1 in that the focal position G1 of the first line light L31 is included in the first measurement area Z1 and the focal position G2 of the second line light L32 is included in the second measurement area Z2. Except for this point, the shape measuring apparatus 300 according to Embodiment 3 is the same as the shape measuring apparatus 100 according to Embodiment 1.
- the shape measuring device 300 has a light projecting section 310, an imaging section 20, and a measuring section 50.
- the light projecting section 310 has a collimator lens and a cylindrical lens (for example, the collimator lens 10b and the cylindrical lens 10c shown in FIG. 2 described above), and the arrangement of each of the collimator lens and the cylindrical lens is adjusted.
- a collimating lens collimates the light emitted from the spot light source.
- the cylindrical lens collects the pattern of light collimated by the collimating lens into a line.
- the light projecting unit 310 emits a first line light L31 that is a line of light and a second line light L32 that is a line of light.
- Each of the first line light L31 and the second line light L32 is condensed in the Y-axis direction.
- the width of the first line light L31 in the Y-axis direction is w1
- the width of the second line light L32 in the Y-axis direction is w2.
- the focal position G1 of the first line light L31, which is the position where the width w1 is narrowest, is included in the first measurement area Z1.
- when the object 110 passes through the first measurement region Z1, the object 110 is irradiated with the condensed first line light L31. Therefore, the positional accuracy of the image of the first line light L31 projected onto the object 110 is improved, so the calculation accuracy of the height H1 of the object 110 in the measurement unit 50 can be improved.
- the focal position G2 of the second line light L32, which is the position where the width w2 is narrowest, is included in the second measurement area Z2.
- by adjusting the arrangement of the collimator lens and the cylindrical lens of the light projecting unit 310, the focal position G1 can be included in the first measurement area Z1, and the focal position G2 can be included in the second measurement area Z2.
- as described above, in Embodiment 3, the focal position G1 of the first line light L31 is included in the first measurement area Z1, and the focal position G2 of the second line light L32 is included in the second measurement area Z2. Therefore, when the object passes through the first measurement area Z1 or the second measurement area Z2, the accuracy of calculating the height of the object in the measurement unit 50 can be improved.
- FIG. 9 is a diagram of the shape measuring apparatus 400 and the object 130 according to Embodiment 4 as viewed in the X-axis direction. In FIG. 9, components that are the same as or correspond to those shown in FIG. 2 are given the same reference numerals as in FIG. 2.
- a shape measuring device 400 according to Embodiment 4 differs from the shape measuring device 100 according to Embodiment 1 in that it further includes a second light projecting section 30 and a second imaging section 40 .
- the shape measuring device 400 according to the fourth embodiment is the same as the shape measuring device 100 according to the first embodiment except for this point.
- An object 130 shown in FIG. 9 is an example of a measurement object.
- the shape of the object 130 is different from the shape of each of the first object 110 and the second object 120 of the first embodiment.
- the shape of the object 130 when viewed in the X-axis direction is trapezoidal. That is, in Embodiment 4, the side surfaces 132 and 133 of the object 130 facing the Y-axis direction are inclined surfaces.
- When the shape measuring apparatus 100 according to Embodiment 1 measures the shape of the object 130,
- the first imaging unit 20 can image the pattern of line light projected onto the side surface 132 facing the −Y-axis direction.
- However, the first imaging unit 20 cannot capture the line light pattern projected onto the side surface 133.
- That is, there may be a blind-spot area (for example, the side surface 133 facing the +Y-axis direction) that cannot be imaged by the first imaging unit 20.
- The shape measuring device 400 has a first light projecting section 10, a first imaging section 20, a second light projecting section 30, a second imaging section 40, and a measuring section 50.
- the second light projecting section 30 emits the third line light L3, which is linear light, and the fourth line light L4, which is linear light.
- the third line light L3 and the fourth line light L4 are lights that spread in the X-axis direction.
- The third line light L3 and the fourth line light L4 are projected onto the surface of the object 130.
- The shape of each image of the third line light L3 and the fourth line light L4 projected onto the surface of the object 130 is a line extending in the X-axis direction.
- the fourth line light L4 is emitted from a position away from the third line light L3 in the Y-axis direction, which is the scanning direction.
- the fourth line light L4 is parallel to the third line light L3 in the X-axis direction.
- the fourth line light L4 may be parallel to the third line light L3 in one predetermined direction, not limited to the X-axis direction.
- The second light projecting section 30 is arranged on the +Y-axis side of the first light projecting section 10.
- the irradiation directions of the third line light L3 and the fourth line light L4 are different from the irradiation directions of the first line light L1 and the second line light L2.
- The third line light L3 and the fourth line light L4 are applied to the side surface 133 of the object 130, which faces the +Y-axis direction and cannot be imaged by the first imaging unit 20.
- the second imaging unit 40 has a second imaging field of view F2 different from the first imaging field of view F1.
- the orientation of the second imaging field F2 is different from the irradiation direction of the third line light L3 and the fourth line light L4 by the second light projecting section 30 .
- the second imaging unit 40 is, for example, a camera having a CCD image sensor.
- the second imaging unit 40 images the object 130 passing through the third measurement area Z3.
- the third measurement area Z3 is an area where the third plane V3 through which the third line light L3 passes and the second imaging field of view F2 intersect.
- The third plane V3 is a virtual XZ plane.
- In the example shown in FIG. 9, the height H3 of the object 130 from the reference plane S is included in the third measurement area Z3 and the first measurement area Z1.
- Although the third line light L3 and the fourth line light L4 may not be strictly parallel to the XZ plane depending on their incident angles, they are irradiated along the XZ plane. For this reason, the third plane V3 and the fourth plane V4 through which the fourth line light L4 (described later) passes are each referred to as a "virtual XZ plane".
- the second imaging unit 40 images an object passing through a fourth measurement area Z4 different from the third measurement area Z3.
- the fourth measurement area Z4 is an area where the fourth plane V4 through which the fourth line light L4 passes and the second imaging field F2 intersect.
- a fourth plane V4 is a virtual XZ plane.
- the fourth measurement area Z4 is arranged closer to the second light projecting section 30 (that is, +Z-axis side) than the third measurement area Z3.
- the fourth measurement area Z4 is arranged adjacent to the third measurement area Z3 in the Z-axis direction.
- The fourth measurement area Z4 does not overlap the third measurement area Z3 in the Z-axis direction. However, part of the fourth measurement area Z4 may overlap the third measurement area Z3 as long as most of it does not. That is, it is sufficient that the fourth measurement area Z4 includes an area that does not overlap the third measurement area Z3.
- the incident angle of the fourth line light L4 is different from the incident angle of the third line light L3. This allows the fourth measurement area Z4 to be a different area from the third measurement area Z3.
- the fourth line light L4 is incident on the object 130 along the normal to the reference plane S, so the incident angle of the fourth line light L4 is 0 degrees.
- the incident angle of the third line light L3 is greater than 0 degrees.
- Note that the incident angle of the fourth line light L4 may be the same as the incident angle of the third line light L3.
- Even in that case, the fourth measurement area Z4 can be made different from the third measurement area Z3.
- The optical axis of the second imaging unit 40 is arranged on the opposite side of the optical axis of the first imaging unit 20 across a center line C that passes through the center of the object 130 in the Y-axis direction and is perpendicular to the reference plane S. In other words, the second imaging section 40 is arranged on the opposite side of the first imaging section 20 with the first light projecting section 10 and the second light projecting section 30 interposed between them in the Y-axis direction.
- The measurement unit 50 measures the shape of the object 130 based on the image including the line-light image acquired by the first imaging unit 20 and the image including the line-light image acquired by the second imaging unit 40.
- Specifically, the measurement unit 50 measures the shape of the object 130 based on the first image, which is the image of the area 131a of the top surface 131 of the object 130 irradiated with the first line light L1, and the third image, which is the image of the area 131b of the top surface 131 irradiated with the third line light L3.
- In this way, the shape of the object 130 can be measured based on the third image acquired by the second imaging unit 40 even for the area 131b of the object 130 that is in the blind spot of the first imaging unit 20.
- Similarly, the measurement unit 50 measures the shape of the object 130 based on the second image, which is the image of the portion of the object 130 irradiated with the second line light L2, and the fourth image, which is the image of the portion irradiated with the fourth line light L4.
- the image acquired by the first imaging unit 20 and the image acquired by the second imaging unit 40 may be common.
- the measurement unit 50 calculates data indicating height information of the object 130 based on both the image acquired by the first imaging unit 20 and the image acquired by the second imaging unit 40.
- Alternatively, the measurement unit 50 may calculate the data based on one of the images acquired by the first imaging unit 20 and the second imaging unit 40.
- the shape measuring device 400 is provided with the second light projecting section 30 and the second imaging section 40 .
- This makes it possible to measure the shape of the occlusion portion of the object 130, that is, the area of the object 130 that is in the blind spot of the first imaging unit 20 (specifically, the side surface 133 facing the +Y-axis direction).
- 40 Second imaging unit, 50 Measuring unit, 100, 200, 300, 400 Shape measuring device
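As a rough numerical illustration of how two line lights can yield stacked, non-overlapping measurement areas along the Z axis, the following sketch assigns a surface height to one of two areas. The boundary values and function name are invented for this illustration; they do not come from the patent.

```python
def measurement_area_for_height(h, z1_range=(0.0, 20.0), z2_range=(20.0, 50.0)):
    """Pick which measurement area covers a surface at height h (mm).

    Z1 is the area nearer the reference plane, Z2 the area above it.
    The ranges are illustrative placeholders, not values from the patent.
    """
    lo1, hi1 = z1_range
    lo2, hi2 = z2_range
    if lo1 <= h < hi1:
        return "Z1"
    if lo2 <= h <= hi2:
        return "Z2"
    return None  # outside both measurement areas

# A 35 mm tall surface falls in the upper area Z2 under these example bounds.
area = measurement_area_for_height(35.0)
```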
Description
FIG. 1 is a perspective view showing a schematic configuration of the shape measuring device 100 according to Embodiment 1 together with a first object 110. FIG. 2 is a view of the shape measuring device 100 and the first object 110 shown in FIG. 1 as viewed in the X-axis direction. The first object 110 (hereinafter also called the "object 110") is an example of a measurement object.
According to Embodiment 1 described above, the imaging unit 20 of the shape measuring device 100 images the first object 110 passing through the first measurement area Z1 and images the second object 120 passing through the second measurement area Z2. The measurement unit 50 measures the shape of the first object 110 based on the first image B11 of the portion of the first object 110 irradiated with the first line light L1. The measurement unit 50 also measures the shape of the second object 120 based on the second images B22 and B23 of the portion of the second object 120 irradiated with the second line light L2. Thus, with a single imaging unit 20, the shape measuring device 100 can widen the measurement range in the height direction of the object while measuring the object with high accuracy. It is therefore possible to provide a shape measuring device 100 that widens the measurement range at low cost and measures an object with high accuracy.
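The light-section measurement summarized above recovers height from the lateral shift of a projected line. Below is a minimal sketch of that triangulation under the simplest assumed geometry (camera viewing along the reference-plane normal, line light at a known incident angle); the function name and all numbers are illustrative, not taken from the patent.

```python
import math

def height_from_line_offset(pixel_offset, mm_per_pixel, incident_angle_deg):
    """Height of a surface above the reference plane from the observed
    shift of the line-light image (classic light-section triangulation).

    When line light hits a surface of height H at incident angle theta
    (measured from the reference-plane normal), the line image seen from
    directly above shifts by d = H * tan(theta) on the reference plane,
    so H = d / tan(theta).
    """
    d = pixel_offset * mm_per_pixel  # shift converted to mm on the reference plane
    return d / math.tan(math.radians(incident_angle_deg))

# A line shifted by 40 px at 0.05 mm/px under 45-degree incidence: 2 mm height.
h = height_from_line_offset(40, 0.05, 45.0)
```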
FIG. 7 is a view of the shape measuring device 200 and the object 110 according to Embodiment 2 as viewed in the X-axis direction. In FIG. 7, components that are the same as or correspond to those shown in FIG. 2 are given the same reference numerals as in FIG. 2. The shape measuring device 200 according to Embodiment 2 differs from the shape measuring device 100 according to Embodiment 1 in that, in the Z-axis direction, the length A22 of the second measurement area Z22 differs from the length A21 of the first measurement area Z21. In other respects, the shape measuring device 200 according to Embodiment 2 is the same as the shape measuring device 100 according to Embodiment 1.
According to Embodiment 2 described above, the length A22 of the second measurement area Z22 in the Z-axis direction differs from the length A21 of the first measurement area Z21. This makes it possible to change the resolution in the first measurement area Z21 relative to the resolution in the second measurement area Z22.
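The resolution trade-off mentioned above can be illustrated numerically: if the same number of sensor rows covers a shorter measurement area, each row spans a smaller height step. The numbers below are made up for the sketch and are not from the patent.

```python
def height_resolution_mm(area_length_mm, pixels_in_area):
    """Approximate height resolution: the span of heights that maps onto a
    single pixel row. A shorter measurement area imaged by the same number
    of pixel rows resolves finer height steps.
    """
    return area_length_mm / pixels_in_area

# Example: a 10 mm area and a 40 mm area, each imaged on 500 sensor rows.
fine = height_resolution_mm(10.0, 500)    # 0.02 mm per row
coarse = height_resolution_mm(40.0, 500)  # 0.08 mm per row
```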
FIG. 8 is a view of the shape measuring device 300 and the object 110 according to Embodiment 3 as viewed in the X-axis direction. In FIG. 8, components that are the same as or correspond to those shown in FIG. 2 are given the same reference numerals as in FIG. 2. The shape measuring device 300 according to Embodiment 3 differs from the shape measuring device 100 according to Embodiment 1 in that the focal position G1 of the first line light L31 is included in the first measurement area Z1 and the focal position G2 of the second line light L32 is included in the second measurement area Z2. In other respects, the shape measuring device 300 according to Embodiment 3 is the same as the shape measuring device 100 according to Embodiment 1.
According to Embodiment 3 described above, the focal position G1 of the first line light L1 is included in the first measurement area Z1. As a result, when the object 110 passes through the first measurement area Z1, the accuracy with which the measurement unit 50 calculates the height H1 of the object 110 can be improved.
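One way to see why a focused (narrow) line improves positional accuracy is to look at how a line position is commonly extracted from an image column: an intensity-weighted centroid, which is more stable for a sharp, high-contrast profile. This is a generic sub-pixel method used as an illustration; the patent does not specify this extraction step.

```python
def line_center_subpixel(profile):
    """Intensity-weighted centroid of a line-light intensity profile taken
    along one image column, giving the line position in sub-pixel units.
    """
    total = sum(profile)
    if total == 0:
        raise ValueError("no line light detected in this column")
    return sum(i * v for i, v in enumerate(profile)) / total

# A symmetric, well-focused profile centered on pixel index 3.
center = line_center_subpixel([0, 1, 8, 20, 8, 1, 0])
```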
FIG. 9 is a view of the shape measuring device 400 and the object 130 according to Embodiment 4 as viewed in the X-axis direction. In FIG. 9, components that are the same as or correspond to those shown in FIG. 2 are given the same reference numerals as in FIG. 2. The shape measuring device 400 according to Embodiment 4 differs from the shape measuring device 100 according to Embodiment 1 in that it further includes the second light projecting section 30 and the second imaging section 40. In other respects, the shape measuring device 400 according to Embodiment 4 is the same as the shape measuring device 100 according to Embodiment 1.
According to Embodiment 4 described above, the shape measuring device 400 includes the second light projecting section 30 and the second imaging section 40. This makes it possible to measure the shape of the area of the object 130 that is in the blind spot of the first imaging unit 20 (specifically, the side surface 133 facing the +Y-axis direction), that is, the occlusion portion of the object 130.
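Combining the two imaging units' results so that each fills the other's blind spots can be sketched as merging two per-position height maps. This is a toy illustration: the patent does not specify a merging rule, so the choice of taking the maximum where both units see the surface is an assumption of this sketch.

```python
def merge_height_maps(map_a, map_b):
    """Combine per-position height values from two imaging units.

    None marks a position in a unit's blind spot; where both units see the
    surface, the maximum is kept (an assumed, illustrative tie-break).
    """
    merged = []
    for a, b in zip(map_a, map_b):
        if a is None and b is None:
            merged.append(None)   # neither unit sees this position
        elif a is None:
            merged.append(b)      # unit B fills unit A's blind spot
        elif b is None:
            merged.append(a)      # unit A fills unit B's blind spot
        else:
            merged.append(max(a, b))
    return merged

# Unit A misses the far slope (None); unit B fills it in.
m = merge_height_maps([0.0, 2.0, 3.0, None], [None, 2.0, 3.0, 1.5])
```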
Claims (9)
- 1. A shape measuring device comprising: a first light projecting unit that emits first line light, which is linear light, and second line light, which is linear light; a first imaging unit that has a first imaging field of view, that images an object passing through a first measurement area, which is an area where a first plane through which the first line light passes intersects the first imaging field of view, and that images the object passing through a second measurement area, which is an area where a second plane through which the second line light passes intersects the first imaging field of view and which differs from the first measurement area; and a measurement unit that measures a shape of the object based on a first image, which is an image of a portion of the object passing through the first measurement area that is irradiated with the first line light, and a second image, which is an image of a portion of the object passing through the second measurement area that is irradiated with the second line light.
- 2. The shape measuring device according to claim 1, wherein a second incident angle, which is an incident angle of the second line light on the object, is greater than or equal to a first incident angle, which is an incident angle of the first line light on the object.
- 3. The shape measuring device according to claim 2, wherein the second incident angle is larger than the first incident angle.
- 4. The shape measuring device according to any one of claims 1 to 3, wherein, in a direction of a height of the object from a reference plane on which the object is placed, one of a length of the second measurement area and a length of the first measurement area is less than or equal to the other.
- 5. The shape measuring device according to claim 4, wherein the length of the first measurement area is shorter than the length of the second measurement area.
- 6. The shape measuring device according to any one of claims 1 to 5, wherein a focal position of the first line light is included in the first measurement area.
- 7. The shape measuring device according to any one of claims 1 to 6, wherein a focal position of the second line light is included in the second measurement area.
- 8. The shape measuring device according to any one of claims 1 to 7, further comprising: a second light projecting unit that emits third line light, which is linear light, and fourth line light, which is linear light; and a second imaging unit that has a second imaging field of view different from the first imaging field of view, that images the object passing through a third measurement area, which is an area where a third plane through which the third line light passes intersects the second imaging field of view, and that images the object passing through a fourth measurement area, which is an area where a fourth plane through which the fourth line light passes intersects the second imaging field of view and which differs from the third measurement area, wherein the measurement unit measures the shape of the object based on the first image, the second image, a third image, which is an image of a portion of the object passing through the third measurement area that is irradiated with the third line light, and a fourth image, which is an image of a portion of the object passing through the fourth measurement area that is irradiated with the fourth line light.
- 9. The shape measuring device according to claim 8, wherein irradiation directions of the third line light and the fourth line light differ from irradiation directions of the first line light and the second line light.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/019128 WO2022244175A1 (ja) | 2021-05-20 | 2021-05-20 | 形状測定装置 |
CN202180098220.0A CN117295925A (zh) | 2021-05-20 | 2021-05-20 | 形状测定装置 |
JP2023522109A JP7450813B2 (ja) | 2021-05-20 | 2021-05-20 | 形状測定装置 |
DE112021007686.8T DE112021007686T5 (de) | 2021-05-20 | 2021-05-20 | Formmessvorrichtung |
Applications Claiming Priority (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
PCT/JP2021/019128 WO2022244175A1 (ja) | 2021-05-20 | 2021-05-20 | 形状測定装置 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2022244175A1 true WO2022244175A1 (ja) | 2022-11-24 |
Family
ID=84141494
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2021/019128 WO2022244175A1 (ja) | 2021-05-20 | 2021-05-20 | 形状測定装置 |
Country Status (4)
Country | Link |
---|---|
JP (1) | JP7450813B2 (ja) |
CN (1) | CN117295925A (ja) |
DE (1) | DE112021007686T5 (ja) |
WO (1) | WO2022244175A1 (ja) |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008175625A (ja) * | 2007-01-17 | 2008-07-31 | Konica Minolta Sensing Inc | 三次元測定装置及び携帯型計測器 |
JP2014130091A (ja) * | 2012-12-28 | 2014-07-10 | Canon Inc | 測定装置および測定方法 |
JP2020153718A (ja) * | 2019-03-18 | 2020-09-24 | 株式会社リコー | 測定装置及び造形装置 |
Family Cites Families (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP6287153B2 (ja) | 2013-12-12 | 2018-03-07 | 株式会社ニコン | センサユニット、形状測定装置、及び構造物製造システム |
2021
- 2021-05-20 WO PCT/JP2021/019128 patent/WO2022244175A1/ja active Application Filing
- 2021-05-20 CN CN202180098220.0A patent/CN117295925A/zh active Pending
- 2021-05-20 DE DE112021007686.8T patent/DE112021007686T5/de active Pending
- 2021-05-20 JP JP2023522109A patent/JP7450813B2/ja active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2008175625A (ja) * | 2007-01-17 | 2008-07-31 | Konica Minolta Sensing Inc | 三次元測定装置及び携帯型計測器 |
JP2014130091A (ja) * | 2012-12-28 | 2014-07-10 | Canon Inc | 測定装置および測定方法 |
JP2020153718A (ja) * | 2019-03-18 | 2020-09-24 | 株式会社リコー | 測定装置及び造形装置 |
Also Published As
Publication number | Publication date |
---|---|
DE112021007686T5 (de) | 2024-03-07 |
JPWO2022244175A1 (ja) | 2022-11-24 |
JP7450813B2 (ja) | 2024-03-15 |
CN117295925A (zh) | 2023-12-26 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
| 121 | Ep: the epo has been informed by wipo that ep was designated in this application | Ref document number: 21940789; Country of ref document: EP; Kind code of ref document: A1 |
| ENP | Entry into the national phase | Ref document number: 2023522109; Country of ref document: JP; Kind code of ref document: A |
| WWE | Wipo information: entry into national phase | Ref document number: 202180098220.0; Country of ref document: CN; Ref document number: 18290457; Country of ref document: US |
| WWE | Wipo information: entry into national phase | Ref document number: 112021007686; Country of ref document: DE |
| 122 | Ep: pct application non-entry in european phase | Ref document number: 21940789; Country of ref document: EP; Kind code of ref document: A1 |