WO2016020966A1 - Detection method, shape measurement method, shape measurement device, structure production method, structure production system, and shape measurement program - Google Patents

Detection method, shape measurement method, shape measurement device, structure production method, structure production system, and shape measurement program

Info

Publication number
WO2016020966A1
Authority
WO
WIPO (PCT)
Prior art keywords
measurement object
image
reference image
shape
light
Prior art date
Application number
PCT/JP2014/070460
Other languages
French (fr)
Japanese (ja)
Inventor
Yasuo Suzuki (鈴木 康夫)
Original Assignee
Nikon Corporation (株式会社ニコン)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation (株式会社ニコン)
Priority to PCT/JP2014/070460
Publication of WO2016020966A1

Classifications

    • GPHYSICS
    • G01MEASURING; TESTING
    • G01BMEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00Measuring arrangements characterised by the use of optical techniques
    • G01B11/24Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object

Definitions

  • The present invention relates to a detection method, a shape measurement method, a shape measurement device, a structure manufacturing method, a structure manufacturing system, and a shape measurement program.
  • the phase shift method is known as a method for measuring the three-dimensional shape of the measurement object.
  • a shape measuring apparatus using the phase shift method includes a projection unit, an imaging unit, and a control unit.
  • This projection unit projects a striped pattern light having a sinusoidal light intensity distribution (hereinafter referred to as structured light) onto a measurement object.
  • In the phase shift method, the phase of the stripes of the structured light is shifted four times, for example in steps of π/2 over one period (2π), and stripes with phases of 0, π/2, π, and 3π/2 are projected.
  • the imaging unit images the object from different angles with respect to the projection unit, and the projection unit, the measurement object, and the imaging unit are arranged so as to have a triangulation positional relationship.
  • the imaging unit images the measurement object and acquires four images.
  • the control unit applies data relating to the signal intensity of each pixel in the four images captured by the imaging unit to a predetermined arithmetic expression, and obtains the phase value of the fringes at each pixel according to the surface shape of the measurement target.
  • The control unit then calculates the three-dimensional coordinate data of the measurement object from the phase value of the fringes at each pixel using the principle of triangulation.
  • An apparatus using this phase shift method is disclosed in Patent Document 1, for example.
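  • As a concrete illustration of this four-step calculation, here is a minimal sketch (assuming NumPy; the function and array names are illustrative, not from the patent) that recovers the wrapped fringe phase per pixel from the four captured images:

```python
import numpy as np

def wrapped_phase(I1, I2, I3, I4):
    """Per-pixel wrapped fringe phase from four images captured with
    fringe phases shifted by 0, pi/2, pi, and 3*pi/2 (four-step phase
    shift). I1..I4 are 2-D arrays of pixel signal intensities."""
    # With I_k = A + B*cos(phi + delta_k):
    #   I4 - I2 = 2B*sin(phi),  I1 - I3 = 2B*cos(phi)
    # arctan2 resolves the quadrant, giving phase in (-pi, pi].
    return np.arctan2(I4 - I2, I1 - I3)
```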
  • According to a first aspect, there is provided a detection method including: capturing a first reference image of the measurement object with a measuring machine; capturing a second reference image of the measurement object with the measuring machine; capturing, simultaneously with or after the capture of the first reference image and before the capture of the second reference image, an image of the measurement object onto which structured light for shape measurement is projected from the measuring machine; detecting a feature region from the first reference image; detecting a feature region from the second reference image; and detecting a relative shake between the measurement object and the measuring machine based on the feature region in the first reference image and the feature region in the second reference image.
  • According to a second aspect, there is provided a shape measurement method including: capturing an image of the measurement object onto which structured light for shape measurement is projected from the measuring machine; detecting a relative shake between the measurement object and the measuring machine by the detection method of the first aspect; and calculating the shape of the measurement object based on the image of the measurement object and the shake.
  • According to a third aspect, there is provided a shape measuring apparatus for measuring the shape of the measurement object, including: a projection unit that projects at least structured light for shape measurement onto the measurement object; an imaging unit that captures an image of the measurement object; a control unit that causes the imaging unit to capture a first reference image and a second reference image of the measurement object, and that causes the imaging unit to capture, simultaneously with or after the capture of the first reference image and before the capture of the second reference image, an image of the measurement object onto which the structured light for shape measurement is projected from the projection unit; and a calculation unit that detects a feature region from the first reference image and a feature region from the second reference image, detects a relative shake between the measurement object and the apparatus based on the feature region in the first reference image and the feature region in the second reference image, and calculates the shape of the measurement object based on the image of the measurement object and the shake.
  • According to a fourth aspect, there is provided a structure manufacturing system including: a design apparatus that produces design information related to the shape of a structure; a molding apparatus that produces the structure based on the design information; the shape measuring apparatus of the third aspect, which measures the shape of the produced structure; and an inspection apparatus that compares the design information with shape information related to the shape of the structure obtained by the shape measuring apparatus.
  • According to a fifth aspect, there is provided a structure manufacturing method including: producing design information related to the shape of a structure; producing the structure based on the design information; measuring the shape of the produced structure by the shape measurement method described above; and comparing the shape information related to the shape of the structure obtained by the shape measurement method with the design information.
  • According to a sixth aspect, there is provided a shape measurement program that causes a computer included in a shape measuring apparatus, which measures the shape of the measurement object by capturing an image of the measurement object onto which structured light for shape measurement is projected, to execute: processing for capturing a first reference image of the measurement object; processing for capturing a second reference image of the measurement object; processing for capturing, simultaneously with or after the capture of the first reference image and before the capture of the second reference image, an image of the measurement object onto which the structured light for shape measurement is projected; processing for detecting a feature region from the first reference image; processing for detecting a feature region from the second reference image; and processing for detecting a relative shake between the measurement object and the shape measuring apparatus based on those feature regions.
  • According to the aspects of the present invention, it is possible to accurately detect the relative shake between the measurement object and the shape measuring apparatus, and to accurately measure the three-dimensional shape of the measurement object.
  • FIG. 1 is a diagram showing an example of the shape measuring apparatus according to the first embodiment. FIG. 2 is a block diagram showing an example of the detailed configuration of the shape measuring apparatus shown in FIG. 1. FIG. 3 is a diagram showing the intensity distributions of the structured light and the reference light in the projection region.
  • FIG. 1 is a diagram illustrating an example of a shape measuring apparatus according to the first embodiment.
  • In FIG. 1, the right direction of the drawing is the X1 axis, a direction orthogonal to the X1 axis is the Y1 axis, and the direction orthogonal to both the X1 axis and the Y1 axis is the Z1 axis.
  • the shape measuring device 1 is a device that measures the three-dimensional shape of the measuring object 2 using the phase shift method.
  • the shape measuring apparatus 1 includes a projection unit 10, an imaging unit 50, an arithmetic processing unit 60, a display device 70, and a housing 80.
  • The projection unit 10, the imaging unit 50, the arithmetic processing unit 60, and the display device 70 are housed in the housing 80 so that the shape measuring device 1 can be carried.
  • The projection unit 10 generates projection light 100 along the first direction D1 (the X1 axis direction in FIG. 1). The projection unit 10 then scans the generated projection light 100 along the second direction D2 (the Y1 axis direction in FIG. 1), which differs from the first direction, thereby projecting the structured light 101 and the reference light 102 onto the projection region 200.
  • the structured light 101 of the first embodiment is structured light used in the phase shift method.
  • The reference light 102 of the first embodiment is light used to detect a feature region on the measurement object 2 in order to detect the relative shake between the measurement object 2 and the shape measuring apparatus 1. Details of the structured light 101, the reference light 102, the projection region 200, and the feature region will be described later (see FIGS. 3 and 4).
  • the projection unit 10 includes a light generation unit 20, a projection optical system 30, and a scanning unit 40.
  • the light generation unit 20 generates the projection light 100.
  • the projection optical system 30 projects the projection light 100 generated by the light generation unit 20.
  • the projection light 100 emitted from the projection optical system 30 is projected toward the measurement object 2 or the vicinity of the measurement object 2 via the scanning unit 40.
  • The measurement object 2 has a corner 2a.
  • the scanning unit 40 scans the projection light 100 in the second direction D2 (Y1 axis direction in FIG. 1).
  • the imaging unit 50 is arranged at a position different from the position of the projection unit 10.
  • the imaging unit 50 images the measurement object 2 onto which the projection light 100 is projected from a direction different from the direction in which the projection unit 10 projects.
  • the imaging unit 50 captures an image of the measurement object 2 onto which the structured light 101 is projected (hereinafter referred to as “measurement image”).
  • the imaging unit 50 captures an image of the measurement object 2 onto which the reference light 102 is projected (hereinafter referred to as “reference image”).
  • the imaging unit 50 includes a light receiving optical system 51 and an imaging device 52.
  • the light receiving optical system 51 is an optical system that causes the imaging device 52 to form an image of a region including a portion on which the projection light 100 is projected on the surface of the measurement object 2.
  • the imaging device 52 generates image data of the measurement object 2 based on the image formed by the light receiving optical system 51 and stores the generated image data.
  • the arithmetic processing unit 60 controls the generation of the projection light 100 by the light generation unit 20. In addition, the arithmetic processing unit 60 controls the scanning unit 40 and the imaging unit 50 so that the scanning of the projection light 100 by the scanning unit 40 and the imaging of the measurement object 2 by the imaging unit 50 are synchronized. Further, the arithmetic processing unit 60 calculates the three-dimensional shape of the measurement object 2 based on the luminance data (signal intensity) of each pixel in the image data captured by the imaging unit 50.
  • FIG. 2 is a block diagram showing an example of a detailed configuration of the shape measuring apparatus 1 shown in FIG.
  • the projection unit 10 includes a laser controller 21, a laser diode (light source) 22, a projection optical system 30, and a scanning unit 40.
  • the light generation unit 20 illustrated in FIG. 1 includes a laser controller 21 and a laser diode 22.
  • the laser controller 21 controls the irradiation of the laser light by the laser diode 22 based on the command signal from the control unit 62.
  • the laser diode 22 is a light source that emits laser light based on a control signal from the laser controller 21.
  • the laser diode 22 includes, for example, a red laser diode that emits red light, a green laser diode that emits green light, and a blue laser diode that emits blue light.
  • the projection optical system 30 projects the projection light 100 as described above.
  • the projection optical system 30 includes one or a plurality of transmission optical elements or reflection optical elements.
  • The scanning unit 40 reflects the projection light 100 emitted from the projection optical system 30 using, for example, a reflective optical element such as a mirror, and scans the projection light 100 in the second direction D2 (the Y1 axis direction in FIG. 2) by changing the reflection angle.
  • As the scanning unit 40, for example, a MEMS (Micro Electro Mechanical Systems) mirror that changes the reflection angle of the projection light 100 by electrostatically resonating the mirror is used.
  • the second direction D2 is a direction on the measurement object 2 different from the first direction D1 (X1 axis direction in FIG. 2).
  • the first direction D1 and the second direction D2 are orthogonal to each other.
  • The MEMS mirror vibrates in the direction S (see FIG. 1) about the vibration center axis AX, reflecting the projection light 100 at a predetermined reflection angle while changing that angle.
  • the scanning width in the second direction D2 by the MEMS mirror (that is, the length in the second direction D2 in the projection region 200) is determined by the amplitude in the vibration direction S of the MEMS mirror.
  • the speed at which the projection light 100 is scanned in the second direction D2 by the MEMS mirror is determined by the angular speed (that is, the resonance frequency) of the MEMS mirror.
  • the projection light 100 can be scanned back and forth.
  • the start position of scanning with the projection light 100 is arbitrary. For example, in addition to starting the scanning of the projection light 100 from the end of the projection area 200, the scanning may be started from approximately the center of the projection area 200.
  • FIG. 3A is a diagram showing the intensity distribution of the structured light 101 in the projection region 200.
  • FIG. 3B is a diagram illustrating the intensity distribution of the reference light 102 in the projection region 200.
  • the projection light 100 is slit-like light having a predetermined length in the first direction D1.
  • the projection light 100 is scanned over a predetermined distance in the second direction D2, thereby forming a rectangular projection region 200.
  • the projection area 200 is an area onto which the structured light 101 and the reference light 102 are projected, and is an area defined by the first direction D1 and the second direction D2.
  • the projection area 200 includes part or all of the measurement object 2.
  • the structured light 101 shown in FIG. 3A is pattern light having a periodic light intensity distribution along the second direction D2.
  • a stripe pattern P having a sinusoidal periodic light intensity distribution along the second direction D2 is used as an example of the structured light 101.
  • the fringe pattern P is formed, for example, by setting the wavelength of the projection light 100 to a predetermined wavelength (eg, about 680 nm) and scanning in the second direction D2 while periodically changing the light intensity of the projection light 100.
  • The stripe pattern P is a light-dark pattern in which bright portions (white portions in FIG. 3A) and dark portions (black portions in FIG. 3A) alternate along the second direction D2.
  • The fringe pattern P can also be expressed as a shading pattern in which dark portions and bright portions alternate along the second direction D2.
  • Since the stripe pattern P forms a lattice, it can also be expressed as a lattice pattern.
  • Accordingly, the second direction D2 is also referred to as the light-dark direction, the shading direction, or the lattice direction.
  • the reference light 102 shown in FIG. 3B has a uniform light intensity (or light and dark, light and shade) in the first direction D1 and the second direction D2.
  • a uniform pattern Q formed of white light including red light, green light, and blue light is used as an example of the reference light 102.
  • the uniform pattern Q is formed, for example, by using the projection light 100 as white light and scanning in the second direction D2 while keeping the light intensity constant.
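  • As a rough sketch of how these two patterns could be synthesized (illustrative only; the parameters height, width, and period_px are assumptions, not values from the patent), the fringe pattern P and the uniform pattern Q can be generated as intensity maps that vary only along the second direction D2:

```python
import numpy as np

def fringe_pattern(height, width, period_px, phase=0.0):
    """Stripe pattern P: sinusoidal intensity along the second
    direction D2 (rows), constant along the first direction D1."""
    y = np.arange(height).reshape(-1, 1)
    intensity = 0.5 * (1.0 + np.cos(2 * np.pi * y / period_px + phase))
    return np.broadcast_to(intensity, (height, width))

def uniform_pattern(height, width, level=1.0):
    """Uniform pattern Q: constant intensity over the projection region."""
    return np.full((height, width), level)
```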
  • the imaging unit 50 includes a light receiving optical system 51, a CCD camera 52a, and an image memory 52b.
  • the imaging device 52 includes a CCD camera 52a and an image memory 52b.
  • the light receiving optical system 51 forms an image of a region including a portion on which the projection light 100 is projected on the surface of the measurement object 2 on the light receiving surface of the CCD camera 52a.
  • the CCD camera 52a is a camera using a charge-coupled device.
  • the image data generated by the CCD camera 52a is composed of signal intensity data for each pixel.
  • the image memory 52b stores image data generated by the CCD camera 52a.
  • FIG. 4 is a diagram showing the relationship between the projection area and the imaging area.
  • a region where the imaging unit 50 images the measurement object 2 (hereinafter referred to as an imaging region) will be briefly described with reference to FIG.
  • In FIG. 4, the right direction of the paper is the X1 axis, the downward direction of the paper is the Y1 axis, and the direction from the back of the paper toward the front is the Z1 axis.
  • the imaging region 210 indicates a region of the measurement object 2 that is imaged by the imaging unit 50.
  • the imaging area 210 is within the area of the projection area 200 and is narrower than the projection area 200. However, it is sufficient that the imaging region 210 does not protrude beyond at least the projection region 200.
  • the imaging area 210 may be the same area as the projection area 200. Note that the imaging area 210 may be an area larger than the projection area 200.
  • Scanning of the projection light 100 may start either outside the imaging region 210 (that is, outside the imaging field) or inside the imaging region 210 (that is, inside the imaging field).
  • The arithmetic processing unit 60 includes an operation unit 61, a control unit 62, a setting information storage unit 63, a capture memory 64, a calculation unit 65, an image storage unit 66, and a display control unit 67.
  • the operation unit 61 outputs an operation signal corresponding to a user operation to the control unit 62.
  • the operation unit 61 is, for example, a button or switch operated by the user. Further, for example, a touch panel is formed on the display device 70. This touch panel is also used as the operation unit 61.
  • the control unit 62 includes a first control unit 62a and a second control unit 62b.
  • the first control unit 62a controls the scanning unit 40 and the imaging unit 50.
  • the second controller 62b controls the light generator 20.
  • the control unit 62 executes the following control according to the program stored in the setting information storage unit 63.
  • the first control unit 62a outputs a command signal to the scanning unit 40 and the CCD camera 52a, and controls the imaging of the measurement object 2 by the CCD camera 52a to be synchronized with the scanning of the fringe pattern P by the scanning unit 40. In addition, the first control unit 62a performs control so as to synchronize imaging of one frame by the CCD camera 52a and scanning of the stripe pattern P a plurality of times.
  • the second control unit 62b can emit desired laser light combining red light, blue light, and green light from the laser diode 22 by outputting a command signal to the laser controller 21.
  • The second control unit 62b can adjust the light intensity of the laser light emitted from the laser diode 22 by outputting a command signal to the laser controller 21.
  • For example, the control unit 62 projects the stripe pattern P onto the measurement object 2 by synchronously controlling the laser controller 21 and the scanning unit 40: the projection light 100 of a predetermined wavelength is scanned in the second direction D2 while its light intensity is changed periodically.
  • Similarly, the control unit 62 projects the uniform pattern Q onto the measurement object 2 by synchronously controlling the laser controller 21 and the scanning unit 40: the white projection light 100 is scanned in the second direction D2 while its light intensity is kept constant.
  • the frequency of the MEMS mirror constituting the scanning unit 40 is set to, for example, 500 Hz (the oscillation cycle of the MEMS mirror is 2 ms for reciprocation).
  • The shutter speed (that is, the exposure time) of the CCD camera 52a is set to, for example, 40 ms.
  • the first control unit 62a performs control so that the projection light 100 is reciprocated 20 times, for example, by the scanning unit 40 during imaging of one frame by the CCD camera 52a.
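  • As a worked check of this timing, using the example values above, 20 round trips of the mirror fit exactly into one frame exposure:

```python
# MEMS mirror resonance frequency from the example above
mirror_freq_hz = 500                      # one round trip every 2 ms
round_trip_ms = 1000 / mirror_freq_hz     # 2.0 ms per round trip
round_trips_per_frame = 20
exposure_ms = round_trips_per_frame * round_trip_ms
print(exposure_ms)                        # 40.0 ms per frame
```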
  • the setting information storage unit 63 stores a program for causing the control unit 62 to execute control.
  • the setting information storage unit 63 stores a program for causing the calculation unit 65 to perform shake detection processing and a program for executing calculation processing of a three-dimensional shape.
  • the setting information storage unit 63 stores a program for causing the display control unit 67 to execute display control.
  • the setting information storage unit 63 also stores calibration information used when calculating the actual coordinate value of the measurement object 2 from the fringe phase of the fringe pattern P in the calculation process of the calculation unit 65.
  • The setting information storage unit 63 also stores a phase calculation program and data that the calculation unit 65 uses to obtain the phase of the fringe pattern P for each pixel based on information (e.g., luminance) of the image captured by the imaging unit 50 (CCD camera 52a).
  • the capture memory 64 captures and stores the image data stored in the image memory 52b.
  • the capture memory 64 stores a measurement image of the measurement object 2 imaged by projecting the fringe pattern P, a reference image of the measurement object 2 imaged by projecting the uniform pattern Q, and the like.
  • the capture memory 64 is provided with a plurality of storage areas.
  • the image data of the measurement image and the image data of the reference image are stored in different storage areas, for example.
  • The calculation unit 65 executes predetermined calculations according to the programs and calibration information stored in the setting information storage unit 63. For example, it detects the relative shake between the measurement object 2 and the shape measuring apparatus 1 from the image data of the reference images stored in the capture memory 64. This shake is change information regarding at least one of a relative position change and a relative posture change between the measurement object 2 and the shape measuring apparatus 1.
  • The position change and the posture change relate to the X1 direction, the Y1 direction, and the rotation direction around the Z1 axis (hereinafter referred to as the "θZ1 direction").
  • the calculation unit 65 calculates the three-dimensional shape data (three-dimensional shape coordinate data) of the measurement object 2 from the image data of the measurement image stored in the capture memory 64. In the calculation of the three-dimensional shape data, the calculation unit 65 can perform calculation based on the measurement image and the shake.
  • the calculation unit 65 detects a feature area (refer to FIG. 8 described later) in the reference image.
  • the feature region is a region that is included in the reference image of the measurement object 2, for example, and can be identified by the change in luminance with respect to other regions. In this case, the change in luminance is based on a change in the shape of the measurement object 2, a change in the light reflectance of the surface of the measurement object 2, and the like.
  • the feature region includes a plurality of regions arranged with a predetermined distance in the reference image, and includes, for example, three regions. In the present embodiment, a case will be described as an example where the reference image is set so that the region corresponding to the corner 2a of the measurement object 2 and its periphery is set as the feature region.
  • the feature region in this case includes three unit regions corresponding to the three straight lines gathered at the corner 2a (see FIG. 8).
  • the image storage unit 66 stores the three-dimensional shape data of the measurement object 2 calculated by the calculation unit 65.
  • the display control unit 67 executes display control of a three-dimensional image according to a program stored in the setting information storage unit 63. That is, the display control unit 67 reads the three-dimensional shape data stored in the image storage unit 66 in accordance with the operation of the operation unit 61 by the user or automatically. And the display control part 67 performs control which displays the image of the three-dimensional shape of the measuring object 2 on the display screen of the display apparatus 70 based on the read-out three-dimensional shape data.
  • the display device 70 is a device that displays a three-dimensional image of the measurement object 2.
  • a liquid crystal display device or an organic EL display device is used as the display device 70. In FIG. 1, the display device 70 is omitted.
  • The control unit 62, the calculation unit 65, and the display control unit 67 are configured by an arithmetic processing device such as a CPU (Central Processing Unit). That is, the arithmetic processing device performs the processing of the control unit 62, the calculation unit 65, and the display control unit 67 in accordance with the programs stored in the setting information storage unit 63.
  • This program includes a shape measurement program.
  • The shape measurement program causes the arithmetic processing device (control unit 62) to execute processing for capturing a first reference image of the measurement object 2, processing for capturing a second reference image of the measurement object 2, and processing for capturing, simultaneously with or after the capture of the first reference image and before the capture of the second reference image, a measurement image of the measurement object 2 onto which the fringe pattern P is projected.
  • The shape measurement program also causes the arithmetic processing device (calculation unit 65) to execute processing for detecting a feature region (described later) from the first reference image and from the second reference image.
  • The phase shift method is based on the principle of triangulation: fringe images of the measurement object 2 are captured while the fringe phase of the fringe pattern P, which has a sinusoidal light intensity distribution projected onto the measurement object 2, is shifted, and the three-dimensional shape is measured by analyzing those images.
  • In the present embodiment, four types of fringe patterns P are used, obtained by shifting the fringe phase by π/2 along the second direction D2.
  • The phase of the fringe pattern P can be rephrased as the phase of the sine wave that is its light intensity distribution; that is, the four types of fringe patterns P are generated by shifting that sine wave by π/2 along the second direction D2.
  • The reference fringe pattern P is the first fringe pattern (first phase light) P1, whose phase is defined as 0. The fringe pattern obtained by shifting the phase of the first fringe pattern P1 by π/2 is the second fringe pattern (second phase light) P2, the one shifted by π is the third fringe pattern (third phase light) P3, and the one shifted by 3π/2 is the fourth fringe pattern (fourth phase light) P4.
  • FIGS. 5A to 5D are views showing states in which the first fringe pattern P1 to the fourth fringe pattern P4 are projected onto a plane without the measurement object 2; each is an image of the imaging region 210 within the projection region 200.
  • FIG. 5A shows the first fringe pattern P1, FIG. 5B shows the second fringe pattern P2, FIG. 5C shows the third fringe pattern P3, and FIG. 5D shows the fourth fringe pattern P4.
  • The first fringe pattern P1 to the fourth fringe pattern P4 shown in FIGS. 5A to 5D are projected from the projection unit 10 onto the measurement object 2, and the measurement object 2 is imaged by the imaging unit 50, which is arranged at a different angle from the projection unit 10.
  • the projection unit 10, the measurement object 2, and the imaging unit 50 are arranged so as to have a triangulation positional relationship.
  • The imaging unit 50 captures four measurement images by imaging the measurement object 2 with each of the first fringe pattern P1 to the fourth fringe pattern P4 projected onto it. The arithmetic processing unit 60 then applies the signal intensity data of the four measurement images to the following (Equation 1) to obtain the fringe phase value φ at each pixel according to the surface shape of the measurement object 2.
  • φ(u, v) = tan⁻¹{(I4(u, v) − I2(u, v)) / (I1(u, v) − I3(u, v))} … (Equation 1)
  • (u, v) indicates the position coordinates of the pixel.
  • I1 is the signal intensity of the measurement image captured when the first fringe pattern P1 is projected, I2 when the second fringe pattern P2 is projected, I3 when the third fringe pattern P3 is projected, and I4 when the fourth fringe pattern P4 is projected.
  • By this calculation, the phase of the sinusoidally changing signal intensity can be obtained for each pixel of the image.
  • a line (equal phase line) obtained by connecting points having the same phase ⁇ (u, v) represents the shape of a cross section obtained by cutting an object along a certain plane in the same manner as the cutting line in the optical cutting method. Therefore, a three-dimensional shape (height information at each point of the image) is obtained by the principle of triangulation based on this phase ⁇ (u, v).
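  • The patent leaves this triangulation step at the level of principle; as a hedged sketch only, the following converts an unwrapped phase map to relative height under a simplified fringe projection geometry (the fringe pitch and triangulation angle are assumed illustrative parameters, not values from the patent):

```python
import numpy as np

def phase_to_height(phi_unwrapped, pitch_mm, theta_rad):
    """Relative height from unwrapped fringe phase under a simplified
    geometry: a phase change of 2*pi corresponds to one fringe pitch of
    displacement along D2, and height follows from the triangulation
    angle between projection and viewing directions."""
    displacement = phi_unwrapped * pitch_mm / (2 * np.pi)
    return displacement / np.tan(theta_rad)
```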
  • In the second fringe pattern P2, the positions of the stripes are shifted in the second direction D2 relative to the first fringe pattern P1 by a distance corresponding to the phase π/2. Likewise, in the third fringe pattern P3 the stripes are shifted by a distance corresponding to the phase π, and in the fourth fringe pattern P4 by a distance corresponding to the phase 3π/2, relative to the first fringe pattern P1. For this reason, on the imaging region 210, the stripes of the first fringe pattern P1 to the fourth fringe pattern P4 are projected at equal intervals in the second direction D2.
  • While FIGS. 5A to 5D show the fringe pattern P projected onto a plane, in which case the shape of its image does not change, when the fringe pattern P is projected onto the surface of the measurement object 2, the image of the fringe pattern P is deformed in the second direction D2 (the Y1 axis direction; see FIG. 3) according to the surface shape.
  • The shape measuring apparatus 1 is used with the user holding the housing 80, for example.
  • the shape measuring device 1 projects the first stripe pattern P1 to the fourth stripe pattern P4 onto the measurement object 2, and A measurement image of the measurement object 2 is captured for each of the stripe patterns.
  • a relative positional change may occur between the measuring object 2 and the shape measuring apparatus 1 due to, for example, camera shake of the user.
  • FIG. 6 is a diagram for explaining the relative shake between the measurement object 2 and the shape measuring apparatus 1.
  • the surface 2 f of the measurement object 2 is displayed as a curved surface, but the same explanation can be made when it is a flat surface.
  • With respect to this relative shake, the case where the measurement object 2 is not displaced and only the shape measuring apparatus 1 is displaced and the case where only the measurement object 2 is displaced and the shape measuring apparatus 1 is not displaced can be treated equivalently. Therefore, in explaining the relative shake between the measurement object 2 and the shape measuring apparatus 1 below, it is assumed for convenience that only the measurement object 2 is displaced and the shape measuring apparatus 1 is not.
  • Let the first timing be the time at which a measurement image is captured with the first fringe pattern P1 projected, and the second timing be the time at which a measurement image is captured with the second fringe pattern P2 projected.
  • If the surface 2f is displaced between the two timings, the position on the surface 2f imaged by a given pixel at the first timing (hereinafter, first position L1) differs from the position imaged at the second timing (hereinafter, second position L2).
  • Likewise, the position on the surface 2f where the first fringe pattern P1 is projected at the first timing and the position where the second fringe pattern P2 is projected at the second timing differ. That is, at the second timing the second fringe pattern P2 is projected at a position displaced from the position at which it would be projected if the shape measuring apparatus 1 and the measurement object 2 were not relatively displaced. Along with the displacement between the first position L1 and the second position L2 on the surface 2f, the phase of the projected fringe pattern P therefore changes between the first timing and the second timing (hereinafter, this amount of phase change is referred to as Δφ1).
  • As a result, the signal intensity I2(u1, v1) of the pixel (u1, v1) in the measurement image captured at the second timing deviates from the value that would be obtained if the shape measuring apparatus 1 and the measurement object 2 were not relatively displaced. Consequently, the fringe phase value φ(u1, v1) at each pixel also deviates from its correct value, and the measurement accuracy of the three-dimensional shape of the measurement object 2 obtained from the phase values decreases.
  • Therefore, the shape measuring apparatus 1 according to the present embodiment accurately detects the relative shake between the measurement object 2 and the shape measuring apparatus 1 and corrects for it, thereby accurately measuring the three-dimensional shape of the measurement object 2.
  • a shake detection method by the shape measuring apparatus 1 and an example of a shape measurement method using this detection method will be described.
  • In the detection method according to the present embodiment, the imaging unit 50 captures a measurement image of the measurement object 2 onto which the fringe pattern P for shape measurement is projected from the projection unit 10, and the relative shake between the measurement object 2 and the shape measuring apparatus 1 is detected.
  • the shape measuring method according to the present embodiment includes calculating the shape of the measuring object 2 based on the measurement image of the measuring object 2 and the above-described blur.
  • The imaging unit 50 of the shape measuring apparatus 1 is calibrated in advance and its internal parameters are known. This calibration is so-called camera calibration; the internal parameters include information on the focal length of the imaging unit 50, the position on the imaging surface of the imaging device 52 where the optical axis of the imaging unit 50 intersects it, and the distortion of the lenses included in the imaging unit 50.
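  • As a hedged sketch of what these internal parameters amount to in practice (the 3×3 intrinsic matrix form is standard camera-calibration convention, not notation from the patent, and the numerical values are placeholders), a pixel can be back-projected to a viewing ray as follows:

```python
import numpy as np

# Illustrative intrinsic matrix: fx, fy are focal lengths in pixels,
# (cx, cy) is where the optical axis meets the imaging surface.
K = np.array([[800.0,   0.0, 320.0],
              [  0.0, 800.0, 240.0],
              [  0.0,   0.0,   1.0]])

def backproject(u, v, K):
    """Unit direction of the viewing ray through pixel (u, v),
    in the camera coordinate frame (lens distortion ignored here)."""
    ray = np.linalg.inv(K) @ np.array([u, v, 1.0])
    return ray / np.linalg.norm(ray)
```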
  • FIG. 7 is a flowchart illustrating an example of the detection method and the shape measurement method according to the first embodiment. Further, FIG. 8 is a diagram schematically showing images captured in the detection method in the order of processing.
  • The control unit 62 remains in a standby state until it receives a signal from the operation unit 61 indicating that the shutter operation has been performed. When the shutter operation is performed, the distance to the measurement object 2 may be measured and the projection optical system 30 and the light receiving optical system 51 may be brought into focus.
  • When the shutter operation is performed, the control unit 62 outputs a command signal to the light generation unit 20 and the scanning unit 40 to project the first fringe pattern P1, whose phase is 0, onto the measurement object 2. In addition, the control unit 62 outputs a command signal to the imaging unit 50 to capture a measurement image of the measurement object 2 onto which the first fringe pattern P1 is projected (step S01).
  • the control unit 62 controls the operations of the light generation unit 20 and the scanning unit 40 so that the light generation of the projection light 100 by the light generation unit 20 (laser controller 21) and the scanning by the scanning unit 40 are synchronized.
  • the control unit 62 adjusts the light intensity of the projection light 100 having a predetermined wavelength so as to periodically change in a sinusoidal shape.
  • the control unit 62 causes the projection light 100 to scan in the second direction D2 at a predetermined speed.
  • the first stripe pattern P1 in which the light intensity (or light and dark, light and shade) periodically changes in a sine wave shape in the second direction D2 is projected onto the projection region 200.
  • the first fringe pattern P1 is projected onto the measurement object 2 arranged in the projection region 200. Note that the number of scans of the projection light 100 is arbitrarily set.
  • the CCD camera 52a images the surface of the measurement object 2 on which the first stripe pattern P1 is projected. As shown in FIG. 8, the first measurement image M1 of the measurement object 2 on which the first stripe pattern P1 is projected is acquired by this imaging. Then, the CCD camera 52a generates image data of the first measurement image M1. The image data of the first measurement image M1 is temporarily stored in the image memory 52b and then stored in a storage area provided in the capture memory 64.
  • For convenience of explanation, the first measurement image M1 in FIG. 8 is shown as if the first fringe pattern P1 projected onto a plane were captured as-is. In practice, since the first fringe pattern P1 is projected onto the surface of the measurement object 2, an image in which the first fringe pattern P1 is deformed according to the shape of the measurement object 2 is obtained. The same applies to the second measurement image M2 to the fourth measurement image M4 described later.
  • control unit 62 outputs a command signal to the light generation unit 20 and the scanning unit 40 to project the uniform pattern Q (see FIG. 3B) on the measurement object 2.
  • control unit 62 outputs a command signal to the imaging unit 50 to capture a reference image of the measurement object 2 on which the uniform pattern Q is projected (step S02).
  • the control unit 62 controls the operations of the light generation unit 20 and the scanning unit 40 so that the light generation of the projection light 100 by the light generation unit 20 (laser controller 21) and the scanning by the scanning unit 40 are synchronized.
  • the control unit 62 sets the projection light 100 to white light.
  • the control unit 62 adjusts so that the light intensity of the projection light 100 is constant.
  • the control unit 62 causes the projection light 100 to scan in the second direction D2 at a predetermined speed.
  • the uniform pattern Q adjusted so that the light intensity (or light and dark, light and shade) is uniform in the second direction D2 is projected onto the projection region 200. Therefore, the uniform pattern Q is projected onto the measurement object 2 arranged in the projection area 200.
  • the number of scans of the projection light 100 is arbitrarily set.
  • the CCD camera 52a images the surface of the measurement object 2 on which the uniform pattern Q is projected.
  • By this imaging, as shown in FIG. 8, a first reference image R01 of the measurement object 2 onto which the uniform pattern Q is projected is acquired.
  • the CCD camera 52a generates image data of the first reference image R01.
  • the image data of the first reference image R01 is once stored in the image memory 52b and then stored in a storage area provided in the capture memory 64.
  • Next, the control unit 62 outputs a command signal to the light generation unit 20 and the scanning unit 40 to project the second fringe pattern P2, whose phase is π/2, onto the measurement object 2.
  • control unit 62 outputs a command signal to the imaging unit 50 to capture a measurement image of the measurement object 2 on which the second stripe pattern P2 is projected (step S03).
  • By this imaging, as shown in FIG. 8, a second measurement image M2 of the measurement object 2 onto which the second fringe pattern P2 is projected is acquired.
  • Image data of the second measurement image M2 is generated by the CCD camera 52a, and this image data is stored in the capture memory 64.
  • control unit 62 outputs a command signal to the light generation unit 20 and the scanning unit 40 to project the uniform pattern Q onto the measurement object 2.
  • control unit 62 outputs a command signal to the imaging unit 50 to capture a reference image of the measurement object 2 on which the uniform pattern Q is projected (step S04).
  • By this imaging, as shown in FIG. 8, a second reference image R02 of the measurement object 2 onto which the uniform pattern Q is projected is acquired.
  • image data of the second reference image R02 is generated by the CCD camera 52a, and this image data is stored in the capture memory 64.
  • Subsequently, the control unit 62 alternately projects the fringe pattern P and the uniform pattern Q onto the measurement object 2 in the order of the third fringe pattern P3, the uniform pattern Q, the fourth fringe pattern P4, and the uniform pattern Q, and images the measurement object 2 onto which each pattern is projected (steps S05 to S08). As a result, as shown in FIG. 8, the third measurement image M3 of the measurement object 2 onto which the third fringe pattern P3 is projected, the third reference image R03 of the measurement object 2 onto which the uniform pattern Q is projected, the fourth measurement image M4 of the measurement object 2 onto which the fourth fringe pattern P4 is projected, and the fourth reference image R04 of the measurement object 2 onto which the uniform pattern Q is projected are obtained. Image data of each image is generated by the CCD camera 52a and stored in the capture memory 64.
  • the calculation unit 65 detects a feature region in the first reference image R01 to the fourth reference image R04 (step S09).
  • the computing unit 65 detects the feature region C corresponding to the corner 2a of the measurement object 2 and its periphery from the first reference image R01 to the fourth reference image R04.
  • The feature region C includes three unit regions Ca to Cc corresponding to the three straight edges that meet at the corner 2a.
  • For convenience, FIG. 8 shows only the feature region C in the first reference image R01 to the fourth reference image R04; in practice, an image showing the entire surface of the measurement object 2 is obtained.
  • Next, the calculation unit 65 detects the relative shake between the measurement object 2 and the shape measuring apparatus 1 (step S10).
  • the calculation unit 65 first calculates the three-dimensional shape of the measurement object 2 using the first measurement image M1 to the fourth measurement image M4.
  • That is, the calculation unit 65 obtains the initial phase distribution φ(u, v) of each pixel on the assumption that there is no shake due to camera shake or the like (that is, the phase change amount Δφ1 is 0).
  • The calculation unit 65 then performs phase connection (phase unwrapping) processing on the obtained initial phase distribution φ(u, v) to obtain a continuous phase distribution φ′(u, v).
  • The calculation unit 65 then calculates the coordinate data (x′, y′, z′) of the three-dimensional shape of the measurement object 2 from the obtained phase distribution φ′(u, v) using the principle of triangulation.
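  • Phase connection removes the 2π ambiguity of the arctangent; a minimal sketch of the idea (using NumPy's np.unwrap, an assumption for illustration rather than the patent's own procedure) is:

```python
import numpy as np

def phase_connect(phi_wrapped):
    """Phase connection along D2: wherever the wrapped phase jumps by
    more than pi between adjacent pixels, 2*pi is added or subtracted
    so that the phase distribution becomes continuous."""
    # axis=0 unwraps column-wise, i.e. along the second direction D2
    return np.unwrap(phi_wrapped, axis=0)
```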
  • The three-dimensional coordinate data calculated in this way is obtained in step S10 on the assumption that there is no relative shake between the measurement object 2 and the shape measuring apparatus 1 (the phase change amount Δφ1 is 0). The calculated coordinate data is therefore a rough value that may differ from the actual three-dimensional shape.
  • Next, the calculation unit 65 associates the calculated rough three-dimensional coordinate data of the measurement object 2 with the two-dimensional coordinates of the feature regions C of the first reference image R01 to the fourth reference image R04, and calculates the rotation and translation from the shape measuring apparatus 1 to the surface 2f.
  • For this calculation of rotation and translation, methods described in academic papers (e.g., V. Lepetit et al., "EPnP: An Accurate O(n) Solution to the PnP Problem", International Journal of Computer Vision, vol. 81, pp. 155-166, 2009) and other publicly known publications can be used.
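  • As a practical stand-in for such a PnP solver (this uses OpenCV's cv2.solvePnP, which is my suggestion and not named in the patent), the pose of the surface relative to the camera can be estimated from the 3-D feature coordinates and their 2-D image coordinates:

```python
import cv2
import numpy as np

def estimate_pose(points_3d, points_2d, K):
    """Rotation matrix R and translation vector t mapping the feature
    coordinates (from the rough 3-D shape) into the camera frame.
    points_3d: (N, 3) array, points_2d: (N, 2) array, N >= 4 for the
    default iterative solver."""
    ok, rvec, t = cv2.solvePnP(points_3d.astype(np.float64),
                               points_2d.astype(np.float64),
                               K, distCoeffs=None)
    R, _ = cv2.Rodrigues(rvec)   # axis-angle -> 3x3 rotation matrix
    return R, t
```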
  • For example, the calculation unit 65 calculates the rotation R1 and translation t1 from the shape measuring apparatus 1 to the surface 2f at the first timing based on the correspondence between the rough three-dimensional coordinate data and the two-dimensional coordinates of the feature region C of the first reference image R01.
  • Similarly, the calculation unit 65 calculates the rotation R2 and translation t2 from the shape measuring apparatus 1 to the surface 2f at the second timing based on the correspondence between the rough three-dimensional coordinate data and the two-dimensional coordinates of the feature region C of the second reference image R02.
  • Next, using the obtained R1, t1, R2, and t2, the calculation unit 65 obtains the rotation Ra and translation ta of the feature region C by [Equation 1].
  • Here, the rotations R1, R2, and Ra are represented by matrices, and the translations t1, t2, and ta are represented by vectors.
  • the rotation Ra and the translation ta are first change information about the shake that occurs between the first measurement image M1 and the second measurement image M2.
  • Similarly, the calculation unit 65 calculates the rotation R3 and translation t3 from the shape measuring apparatus 1 to the surface 2f at the third timing (when the third measurement image M3 is captured), based on the correspondence between the rough three-dimensional coordinate data and the two-dimensional coordinates of the feature region C of the third reference image R03.
  • Next, using the obtained R3 and t3 together with R2 and t2 above, the calculation unit 65 obtains the rotation Rb and translation tb of the feature region C by [Equation 2].
  • the rotation Rb and the translation tb are second change information about the shake that occurs between the second measurement image M2 and the third measurement image M3.
  • Likewise, the calculation unit 65 calculates the rotation R4 and translation t4 from the shape measuring apparatus 1 to the surface 2f at the fourth timing (when the fourth measurement image M4 is captured), based on the correspondence between the rough three-dimensional coordinate data and the two-dimensional coordinates of the feature region C of the fourth reference image R04. Next, using the obtained R4 and t4 together with R3 and t3 above, the calculation unit 65 obtains the rotation Rc and translation tc of the feature region C by [Equation 3]. The rotation Rc and translation tc are the third change information about the shake occurring between the third measurement image M3 and the fourth measurement image M4.
  • In this way, the calculation unit 65 obtains the first change information (Ra, ta) for the shake between the first measurement image M1 and the second measurement image M2, the second change information (Rb, tb) for the shake between the second measurement image M2 and the third measurement image M3, and the third change information (Rc, tc) for the shake between the third measurement image M3 and the fourth measurement image M4.
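  • [Equation 1] to [Equation 3] appear as images in the original publication and are not reproduced in this text, so the following is only my reconstruction of the standard composition of two camera-to-surface poses into a relative motion; treat the formulas as an assumption, not the patent's own equations:

```python
import numpy as np

def relative_motion(R1, t1, R2, t2):
    """Relative rotation/translation of the feature region between two
    timings, given camera-to-surface poses (R1, t1) and (R2, t2) with
    p_cam = R_i @ p_surf + t_i. Standard pose composition; assumed,
    not quoted from the patent."""
    Ra = R2 @ R1.T               # rotation from timing 1 to timing 2
    ta = t2 - Ra @ t1            # translation from timing 1 to timing 2
    return Ra, ta

# Rb, tb and Rc, tc would follow the same pattern:
# Rb, tb = relative_motion(R2, t2, R3, t3)
# Rc, tc = relative_motion(R3, t3, R4, t4)
```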
  • Next, the calculation unit 65 calculates the shape of the measurement object 2 based on the shake detection result and the first measurement image M1 to the fourth measurement image M4 (step S11).
  • That is, the calculation unit 65 corrects for the shake between the first measurement image M1 and the second measurement image M2, between the second measurement image M2 and the third measurement image M3, and between the third measurement image M3 and the fourth measurement image M4.
  • In the present embodiment, the distance between the first position L1 and the second position L2 is very small, so the change in the distance from the imaged position to the imaging unit 50 and the changes in the incident and reflection angles of the fringe pattern P at that position can be ignored. Therefore, the change in the light intensity of the fringe pattern P projected onto the first position L1 and the second position L2 can be regarded as being due only to the change in the phase of the fringe pattern P projected from the projection unit 10. Accordingly, in this embodiment, the amount of change in the phase of the fringe pattern P (for example, the change amount Δφ1) is obtained for each pixel, and the initial phase distribution φ(u, v) can be corrected taking this change amount into consideration.
  • To do so, the calculation unit 65 first calculates the position of the surface 2f at the second timing by transforming the position of the surface 2f at the first timing based on the first change information (rotation Ra, translation ta) obtained as described above. The position of the surface 2f at the first timing is calculated from the rough three-dimensional coordinate data.
  • the calculation unit 65 obtains the position coordinates of the first position L1 and the second position L2.
  • the calculation unit 65 obtains, for example, the coordinates of the intersection of the light beam back-projected from any one pixel (u1, v1) of the imaging unit 50 and the surface 2f at the first timing as the position coordinate of the first position L1. Further, the calculation unit 65 obtains the coordinates of the intersection of the light ray back projected from the pixel (u1, v1) and the surface 2f at the second timing as the position coordinate of the second position L2.
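  • A hedged sketch of this back-projection step (approximating the surface 2f near the pixel's footprint by a local plane, which is my simplification rather than the patent's stated procedure; the ray direction can come from a back-projection such as the one sketched earlier):

```python
import numpy as np

def ray_plane_intersection(ray_dir, plane_point, plane_normal):
    """Intersection of a ray back-projected through the camera origin
    with a local plane approximating the surface 2f. Used once with the
    surface pose at the first timing to get L1, and once with the
    shake-transformed pose at the second timing to get L2."""
    s = np.dot(plane_normal, plane_point) / np.dot(plane_normal, ray_dir)
    return s * ray_dir   # position coordinates in the camera frame
```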
  • FIGS. 9A and 9B are diagrams for explaining the principle of obtaining the phase change amount Δφ1.
  • FIG. 9A shows the intensity distribution of the fringe pattern P projected onto the first position L1.
  • FIG. 9B shows the intensity distribution of the fringe pattern P projected onto the first position L1 and the second position L2.
  • At the first timing, the first fringe pattern P1 is projected onto the measurement object 2, and a portion of the fringe pattern P corresponding to phase φL1 (taking the phase of one dark portion as 0) is projected onto the first position L1.
  • At this timing, an image whose luminance corresponds to the light intensity at phase φL1 of the fringe pattern P is captured by one pixel of the CCD camera 52a.
  • At the second timing, the second fringe pattern P2, whose phase is larger by π/2 than that of the first fringe pattern P1, is projected onto the measurement object 2. If there were no camera shake or the like, a portion corresponding to the phase (φL1 + π/2) of the fringe pattern P would be projected onto the first position L1, and an image with the luminance corresponding to the light intensity at phase (φL1 + π/2) would be captured by that pixel of the CCD camera 52a.
  • When shake occurs, however, the position imaged by that pixel of the imaging unit 50 at the second timing is the second position L2, which is shifted from the first position L1 in the Y1 direction (e.g., the +Y1 direction) of the fringe pattern P (second fringe pattern P2). The phase φL2 at the second position L2 is therefore a value different from the phase (φL1 + π/2) at the first position L1, and an image whose luminance corresponds to the light intensity at phase φL2 of the fringe pattern P is captured.
  • Since the fringe pattern P has a sinusoidal periodic light intensity distribution in the Y1 direction, the light intensity at the portion corresponding to the phase (φL1 + π/2) of the first position L1 differs from that at the portion corresponding to the phase φL2 of the second position L2. For this reason, the luminance captured by that pixel of the CCD camera 52a at the second timing differs between the image of the first position L1 and the image of the second position L2.
  • the change in the light intensity of the fringe pattern P projected to the first position L1 and the second position L2 can be regarded as being only due to the change in the phase of the fringe pattern P projected from the projection unit 10. .
  • Therefore, the difference in luminance between the image at the first position L1 and the image at the second position L2 captured by that pixel at the second timing can be regarded as being caused only by the change in the phase of the fringe pattern P.
  • Thus, the calculation unit 65 obtains the phase φL2 of the fringe pattern P projected onto the second position L2 based on the luminance of the image captured by that pixel, and then calculates, as the phase change amount Δφ1, how much the obtained phase φL2 deviates from the phase (φL1 + π/2) of the fringe pattern P that would have been projected onto the first position L1.
  • Using the same method, the calculation unit 65 calculates the phase change amount Δφ2 caused by the shake between the second fringe pattern P2 projected at the second timing and the third fringe pattern P3 projected at the third timing (a phase change deviating from the nominal phase difference π/2), and the phase change amount Δφ3 caused by the shake between the third fringe pattern P3 projected at the third timing and the fourth fringe pattern P4 projected at the fourth timing (likewise a phase change deviating from the nominal π/2).
  • the calculation unit 65 sets the initial phase of the fringe pattern P corresponding to the second timing to (π/2 + Δ1), the initial phase corresponding to the third timing to (π + Δ1 + Δ2), and the initial phase corresponding to the fourth timing to (3π/2 + Δ1 + Δ2 + Δ3), obtains an initial phase distribution Φ(u1, v1), and performs phase connection processing. Then, the calculation unit 65 calculates three-dimensional coordinate data (x1, y1, z1) from the obtained phase distribution Φ′(u1, v1) using the principle of triangulation.
  • the calculation unit 65 performs the above calculation for each pixel to calculate the coordinate data (x, y, z) of the three-dimensional shape of the measurement object 2.
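As a concrete illustration of the per-pixel calculation described above, the following is a minimal sketch for a single pixel with hypothetical intensity values; a least-squares form is used because the blur-corrected phase steps are no longer exactly π/2 apart. The function and variable names are illustrative, not taken from the patent.

```python
import numpy as np

def wrapped_phase(intensities, deltas):
    # Model: I_k = a + b*cos(phi + delta_k)
    #           = a + (b*cos(phi))*cos(delta_k) - (b*sin(phi))*sin(delta_k)
    # Solve the linear least-squares problem for a, b*cos(phi), b*sin(phi).
    m = np.column_stack([np.ones_like(deltas), np.cos(deltas), -np.sin(deltas)])
    a, b_cos, b_sin = np.linalg.lstsq(m, np.asarray(intensities), rcond=None)[0]
    return np.arctan2(b_sin, b_cos)  # wrapped fringe phase at this pixel

# Blur-corrected initial phases of the four fringe patterns, as in the text:
d1, d2, d3 = 0.05, -0.02, 0.01      # hypothetical phase change amounts
deltas = np.array([0.0, np.pi/2 + d1, np.pi + d1 + d2,
                   3*np.pi/2 + d1 + d2 + d3])
intensities = [120.0, 180.0, 140.0, 80.0]  # hypothetical pixel luminances
phi = wrapped_phase(intensities, deltas)
```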
  • the calculation unit 65 may perform the above correction again or repeatedly, using the coordinate data (x, y, z) calculated in this way as the rough coordinate data of the three-dimensional shape. As a result, the accuracy of the position of the surface 2f at the first timing is increased, and therefore more accurate detection is possible.
  • the calculation unit 65 stores the calculated coordinate data of the three-dimensional shape of the measurement object 2 in the image storage unit 66.
  • the display control unit 67 reads the coordinate data of the three-dimensional shape stored in the image storage unit 66 according to the operation of the operation unit 61 by the user or automatically.
  • the display control unit 67 displays the three-dimensional shape of the measurement object 2 on the display screen of the display device 70 based on the read coordinate data of the three-dimensional shape.
  • the three-dimensional shape is displayed as a point group that is a set of points in the three-dimensional space. This point cloud data can be output from the shape measuring apparatus 1.
  • the display device 70 may display not only the three-dimensional shape of the measurement object 2 but also a fringe image captured by the imaging unit 50. That is, the display control unit 67 may cause the display device 70 to display the fringe image captured by the imaging unit 50 based on the image data stored in the capture memory 64. According to such a configuration, the user can confirm whether or not the measurement object 2 has been accurately imaged at the imaging site based on the fringe image captured by the imaging unit 50.
  • the display device 70 may be configured to display at least one of the image captured by the imaging unit 50 and the three-dimensional shape calculated by the calculation unit 65.
  • at least one of the image captured by the imaging unit 50 and the three-dimensional shape calculated by the calculation unit 65 may be displayed on an external display device connected to the shape measuring device 1 wirelessly or by wire.
  • as described above, in the detection method for detecting the relative shake between the shape measuring apparatus 1 that measures the shape of the measurement object 2 and the measurement object 2, the first reference image R01 and the second reference image R02 are captured, the first measurement image M1 of the measurement object is captured after the first reference image R01 and before the second reference image R02, the feature region C is detected from the first reference image R01 and from the second reference image R02, and the relative shake between the measurement object 2 and the shape measuring device 1 is detected based on these feature regions C. Therefore, the relative shake between the measurement object 2 and the shape measuring device 1 can be detected with high accuracy.
  • FIGS. 10A to 10D are diagrams showing the intensity distribution of the spatial code pattern projected onto the measurement object 2, for example.
  • the spatial code pattern QA is a striped pattern light having a rectangular intensity distribution in the second direction D2.
  • in the spatial code pattern QA, bright portions (white portions in FIG. 10) and dark portions (black portions in FIG. 10) appear alternately.
  • description will be made using spatial code patterns QA1 to QA4 having four types of rectangular light intensity distributions with different frequencies in the second direction D2.
  • the spatial code pattern is, for example, a lattice pattern in which white and black are combined.
  • an image obtained by shifting the phase of the fringe pattern P is acquired, and the three-dimensional shape of the measurement object 2 is obtained by performing phase connection.
  • the three-dimensional shape of the measuring object 2 can be obtained accurately.
  • each fringe of the fringe pattern P is treated as a unit corresponding to one period of the sine wave in the sinusoidal intensity distribution of the fringe pattern P.
  • the fringe pattern P includes a fringe having a phase of 0 to 2π (with the phase reference of 0 projected on one end of the projection region 200), a fringe having a phase of 2π to 4π, and, in general, fringes with a phase range of 2(m − 1)π to 2mπ (where m is an integer).
  • when the measurement object 2 has a step, the elevation (the Y1 coordinate in FIG. 1) changes from the lower surface to the upper surface of the step. If this elevation difference is larger than the elevation difference corresponding to one period (that is, 2π) of the fringe pattern P in the area on the measurement object 2 corresponding to the area of any adjacent pixels of the imaging device 52 (that is, if any stripe of the fringe pattern P is displaced in the D1 direction by one period or more), it cannot be determined which phase range of the stripes of the fringe pattern P, which is a periodic sine wave, is projected on the upper surface of the step.
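Stated in symbols (a standard way of writing this ambiguity, not a formula quoted from the patent), the phase shift method only recovers the wrapped phase:

$$\varphi_{\text{wrapped}} = \varphi_{\text{true}} - 2(m-1)\pi, \qquad \varphi_{\text{true}} \in [\,2(m-1)\pi,\ 2m\pi\,),$$

and the fringe order m cannot be determined from the wrapped phase alone; the spatial code patterns described next supply this missing integer.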
  • in the spatial code pattern QA1 in (a), eight white lines and eight black lines are alternately arranged.
  • in the spatial code pattern QA2 in (b), four white lines and four black lines are alternately arranged.
  • in the spatial code pattern QA3 in (c), two white lines and two black lines are alternately arranged.
  • in the spatial code pattern QA4 in (d), the left half is white and the right half is black.
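The following sketch generates binary patterns with these stripe counts and combines the per-pixel observations into a region code; the width, bit order, and names are assumptions for illustration, not the patent's implementation.

```python
import numpy as np

def spatial_code_patterns(width=256, n_bits=4):
    """Binary stripe patterns, finest first (QA1: 16 stripes ... QA4: 2)."""
    x = np.arange(width)
    patterns = []
    for k in range(n_bits):
        period = width // 2 ** (n_bits - k)   # stripe width doubles each step
        patterns.append(((x // period) % 2 == 0).astype(np.uint8))
    return patterns                            # 1 = bright, 0 = dark

def decode_region(bits):
    """Combine one pixel's binarized observations into a region index,
    taking the first (finest) bit as least significant."""
    code = 0
    for i, b in enumerate(bits):
        code |= int(b) << i
    return code

patterns = spatial_code_patterns()
pixel_bits = [p[37] for p in patterns]   # observations at one pixel column
region = decode_region(pixel_bits)       # which of the 16 code regions
```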
  • FIG. 11 is a diagram schematically illustrating images captured in the detection method according to the second embodiment in the order of processing.
  • after capturing the first measurement image M1, the control unit 62 outputs an instruction signal to the projection unit 10 to project the spatial code pattern QA1.
  • the control unit 62 outputs an instruction signal to the imaging unit 50, and images the measurement object 2 on which the spatial code pattern QA1 is projected.
  • the first reference image RA1 is acquired.
  • the control unit 62 projects the spatial code patterns QA2 to QA4 and images the measurement object 2 on which the spatial code patterns QA2 to QA4 are projected.
  • the second reference image RA2 to the fourth reference image RA4 are acquired.
  • the calculation unit 65 detects a feature region in the first reference image RA1 to the fourth reference image RA4.
  • the calculation unit 65 sets a part of an image captured in a bright part of the spatial code patterns QA1 to QA4 as a feature region.
  • the second feature region CA2, which is displayed in common in the bright portions of the spatial code patterns QA2 to QA4, is set as a feature region.
  • for convenience, only the feature regions CA1 and CA2 are displayed in the first reference image RA1 to the fourth reference image RA4 in FIG. 11; actually, a part of the measurement object 2 corresponding to the bright portions of the spatial code patterns QA1 to QA4 is displayed.
  • the calculation unit 65 detects a relative shake between the first measurement image M1 and the second measurement image M2. For example, the calculation unit 65 obtains rough three-dimensional coordinate data as in the first embodiment and, from the correspondence between the three-dimensional shape and the two-dimensional coordinates of the feature region CA1 of the first reference image RA1 and of the feature region CA1 of the second reference image RA2, obtains the rotation and translation from the shape measuring device 1 to the surface 2f of the measurement object 2. Then, from the obtained rotation and translation, the calculation unit 65 calculates the rotation and translation of the feature region CA1 between the first reference image RA1 and the second reference image RA2 as information on the relative shake between the first measurement image M1 and the second measurement image M2.
  • the shake is caused by the movement of the shape measuring apparatus 1.
  • similarly, the calculation unit 65 obtains the rotation and translation of the feature region CA2 between the second reference image RA2 and the third reference image RA3 as information on the relative shake between the second measurement image M2 and the third measurement image M3.
  • the calculation unit 65 also obtains the rotation and translation of the feature region CA2 between the third reference image RA3 and the fourth reference image RA4 as information on the relative shake between the third measurement image M3 and the fourth measurement image M4.
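As one concrete way to obtain such shake information, a translation-only sketch using phase correlation between two reference images is shown below; the patent also considers rotation, which this simplified example does not recover. Names are illustrative.

```python
import numpy as np

def estimate_shift(img_a, img_b):
    """Estimate the 2-D translation of img_b relative to img_a by
    phase correlation (normalized cross-power spectrum)."""
    f_a = np.fft.fft2(img_a)
    f_b = np.fft.fft2(img_b)
    cross = f_a * np.conj(f_b)
    cross /= np.abs(cross) + 1e-12
    corr = np.fft.ifft2(cross).real
    dy, dx = np.unravel_index(np.argmax(corr), corr.shape)
    # Convert peak indices to signed shifts.
    if dy > img_a.shape[0] // 2:
        dy -= img_a.shape[0]
    if dx > img_a.shape[1] // 2:
        dx -= img_a.shape[1]
    return dy, dx
```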
  • the feature region is not limited to a common part in all reference images.
  • the calculation unit 65 calculates the shape of the measurement object 2 based on the shake detection result and the first measurement image M1 to the fourth measurement image M4.
  • the calculation unit 65 obtains the phase change amounts Δ1, Δ2, and Δ3 of the fringe pattern P for each pixel as in the first embodiment, and corrects the blur by obtaining the initial phase distribution Φ(u, v) in consideration of Δ1, Δ2, and Δ3.
  • the arithmetic unit 65 may associate the spatial code patterns QA1 to QA4 with each stripe pattern P.
  • the spatial code patterns QA1 to QA4 serve as structured light and reference light.
  • the respective fringes with phase ranges 2(m − 1)π to 2mπ (where m is an integer) of the sinusoidal fringe pattern P are identified in the respective regions within the projection region assigned a spatial code.
  • the spatial code patterns QA1 to QA4 are projected as the reference light, so that the relative shake between the measurement object 2 and the shape measuring apparatus 1 can be detected with high accuracy.
  • the shape of the measurement object 2 is calculated by correcting the detected blur.
  • the spatial code patterns QA1 to QA4 can also be used as the reference light, thereby reducing the overall imaging time.
  • in the second embodiment, by projecting the coarser spatial code patterns later than the finer ones among QA1 to QA4, the feature region becomes easier to detect as the imaging time elapses. In some cases, camera shake by the user is more likely to occur as the imaging time elapses; in such cases, measurement errors can be detected with high accuracy.
  • the present invention is not limited to this.
  • for example, the first reference image RB1 of the measurement object 2 onto which the uniform pattern Q is projected may be captured before imaging the first measurement image M1 of the measurement object 2 onto which the first fringe pattern P1 is projected.
  • the second reference image RB2 of the measurement object 2 onto which the uniform pattern Q is projected may be imaged.
  • the calculation unit 65 detects the feature region CB in the first reference image RB1 and the second reference image RB2, and based on these feature regions CB, detects the relative shake between the measurement object 2 and the shape measuring device 1 by the same method as described above. In this case, since the time required for the projection of the uniform pattern Q and the imaging of the reference images is reduced compared with the first embodiment, the overall imaging time can be reduced.
  • the reference image may be continuously captured between the first measurement image M1 to the fourth measurement image M4.
  • reference images RC1 and RC2 of the measurement object 2 onto which the spatial code pattern QA1 is projected are shown as continuously captured reference images.
  • the present invention is not limited to this. For example, after capturing a reference image of the measurement object 2 onto which the spatial code pattern is projected, a reference image of the measurement object 2 onto which the uniform pattern Q is projected may be captured.
  • the shape measuring device 201 may include a second imaging unit 150 in addition to the imaging unit 50.
  • one of the imaging unit 50 and the second imaging unit 150 is used for imaging the measurement object 2 on which structured light is projected, and the other is used for imaging the measurement object 2 on which reference light is projected. Can be used.
  • the projection unit 10 projects the structured light onto the measurement object 2 by, for example, setting the wavelength of the projection light 100 to a predetermined wavelength (e.g., about 680 nm) and scanning in the second direction D2 while changing the light intensity periodically.
  • in the second imaging unit 150, a filter 153 that blocks light having a wavelength corresponding to the predetermined wavelength and transmits light having other wavelengths is disposed between the light receiving optical system 151 and the imaging device 152.
  • the imaging unit 50 can capture a measurement image of the measuring object 2 onto which the structured light is projected.
  • in the second imaging unit 150, the structured light is blocked by the filter 153.
  • the second imaging unit 150 can acquire an image of the measurement object 2 by natural light as a reference image without projecting the reference light from the projection unit 10.
  • the measurement object 2 onto which the structured light is projected and the measurement object 2 onto which the reference light is projected are imaged by separate imaging units (the imaging unit 50 and the second imaging unit 150), so that the measurement image and the reference image of the measurement object 2 can be acquired at the same time. Thereby, the imaging time can be shortened.
  • the calculation unit 65 may determine, after detecting the shake, whether or not to perform the subsequent correction and measurement of the three-dimensional shape according to the degree of shake. In this case, when the degree of blur exceeds a preset threshold value, it can decide not to perform the subsequent correction and measurement of the three-dimensional shape. When making such a negative determination, the calculation unit 65 may output a predetermined signal to warn the user. The user who has received the warning may then measure the shape of the measurement object again.
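A minimal sketch of such a decision, with a hypothetical threshold and unit:

```python
BLUR_THRESHOLD = 0.5   # hypothetical limit on detected shake, e.g. in pixels

def should_measure(blur_amount, threshold=BLUR_THRESHOLD):
    """Return False when the detected relative shake exceeds the preset
    threshold, in which case the caller would warn the user instead of
    proceeding with correction and 3-D measurement."""
    return blur_amount <= threshold
```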
  • the first change information, the second change information, and the third change information are detected by using the first reference image R01 to the fourth reference image R04, respectively.
  • the present invention is not limited to this.
  • at least two of the first change information, the second change information, and the third change information may be detected, and the remaining change information may be estimated based on them. This can shorten the processing time.
  • the present invention is not limited to this.
  • the configuration including three unit regions has been described as an example of the feature region.
  • the configuration is not limited to this; for example, a configuration including one or two unit regions, or four or more unit regions, may be used.
  • the present invention is not limited to the case where the feature region is set in advance, and for example, the calculation unit 65 may automatically detect the feature region.
  • for example, the calculation unit 65 may obtain a pattern that is commonly included in the plurality of reference images by a pattern matching method or the like. In this case, a distinctive shape such as an L-shaped portion or a cross-shaped portion can be selected as the pattern.
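A brute-force sketch of locating such a pattern by normalized cross-correlation is shown below; a real system would use an optimized library routine, and the names here are illustrative.

```python
import numpy as np

def match_template(image, template):
    """Find the offset of `template` in `image` that maximizes the
    normalized cross-correlation score; both are 2-D float arrays."""
    ih, iw = image.shape
    th, tw = template.shape
    t = template - template.mean()
    t_norm = np.sqrt((t * t).sum())
    best_score, best_pos = -1.0, (0, 0)
    for y in range(ih - th + 1):
        for x in range(iw - tw + 1):
            w = image[y:y + th, x:x + tw]
            w = w - w.mean()
            score = (w * t).sum() / (np.sqrt((w * w).sum()) * t_norm + 1e-12)
            if score > best_score:
                best_score, best_pos = score, (y, x)
    return best_pos, best_score
```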
  • a mark (for example, a circular seal or a QR code mark) attached to the measurement object 2 may also be used as the feature region.
  • the positions and shapes of such marks are set in advance. Therefore, when the mark is used as the feature region, the feature region is set in advance.
  • although in the above example the feature region was an area set on an attached mark, a part of the measurement object 2 itself can also be used as the feature region.
  • FIG. 15 is a block diagram illustrating an example of an embodiment of a structure manufacturing system.
  • the structure manufacturing system SYS illustrated in FIG. 15 includes the shape measuring device 1 (or the shape measuring device 201), the design device 710, the molding device 720, the control device (inspection device) 730, and the repair device 740. .
  • the design device 710 creates design information related to the shape of the structure. Then, the design device 710 transmits the produced design information to the molding device 720 and the control device 730.
  • the design information is information indicating the coordinates of each position of the structure.
  • the measurement object is a structure.
  • the forming apparatus 720 forms a structure based on the design information transmitted from the design apparatus 710.
  • the molding process of the molding apparatus 720 includes casting, forging, cutting, or the like.
  • the shape measuring devices 1 and 201 measure the three-dimensional shape of the structure (measurement object 2) produced by the forming device 720, that is, the coordinates of the structure. Then, the shape measuring devices 1, 201 transmit information indicating the measured coordinates (hereinafter referred to as shape information) to the control device 730.
  • the control device 730 includes a coordinate storage unit 731 and an inspection unit 732.
  • the coordinate storage unit 731 stores design information transmitted from the design device 710.
  • the inspection unit 732 reads design information from the coordinate storage unit 731. Further, the inspection unit 732 compares the design information read from the coordinate storage unit 731 with the shape information transmitted from the shape measuring devices 1 and 201.
  • the inspection unit 732 determines whether or not the structure molded by the molding device 720 is a non-defective product. Whether or not the structure is a non-defective product is determined based on, for example, whether or not the error between the design information and the shape information is within a predetermined threshold range. If the structure is not molded according to the design information, the inspection unit 732 determines whether the structure can be repaired according to the design information. If it is determined that it can be repaired, the inspection unit 732 calculates a defective portion and a repair amount based on the comparison result. Then, the inspection unit 732 transmits information indicating a defective portion (hereinafter referred to as defective portion information) and information indicating a repair amount (hereinafter referred to as repair amount information) to the repair device 740.
  • the repair device 740 processes the defective portion of the structure based on the defective portion information and the repair amount information transmitted from the control device 730.
  • FIG. 16 is a flowchart showing processing by the structure manufacturing system SYS, and shows an example of an embodiment of a structure manufacturing method.
  • the design device 710 creates design information related to the shape of the structure (step S31).
  • the design device 710 transmits the produced design information to the molding device 720 and the control device 730.
  • the control device 730 receives the design information transmitted from the design device 710. Then, the control device 730 stores the received design information in the coordinate storage unit 731.
  • the molding apparatus 720 molds the structure based on the design information created by the design apparatus 710 (step S32). Then, the shape measuring devices 1 and 201 measure the three-dimensional shape of the structure formed by the forming device 720 (step S33). Thereafter, the shape measuring devices 1 and 201 transmit shape information, which is the measurement result of the structure, to the control device 730. Next, the inspection unit 732 compares the shape information transmitted from the shape measuring apparatuses 1 and 201 with the design information stored in the coordinate storage unit 731, and checks whether or not the structure has been molded according to the design information (step S34).
  • the inspection unit 732 determines whether or not the structure is a non-defective product (step S35). If it is determined that the structure is a non-defective product (step S35: YES), the process by the structure manufacturing system SYS is terminated. On the other hand, when the inspection unit 732 determines that the structure is not a non-defective product (step S35: NO), the inspection unit 732 determines whether or not the structure can be repaired (step S36).
  • when it is determined in step S36 that the structure can be repaired (step S36: YES), the inspection unit 732 calculates the defective portion of the structure and the repair amount based on the comparison result of step S34. Then, the inspection unit 732 transmits the defective portion information and the repair amount information to the repair device 740.
  • the repair device 740 performs repair (rework) of the structure based on the defective part information and the repair amount information (step S37). Then, the process proceeds to step S33. That is, the process after step S33 is performed again with respect to the structure which the repair apparatus 740 performed repair.
  • when it is determined in step S36 that the structure cannot be repaired (step S36: NO), the process by the structure manufacturing system SYS is terminated.
  • as described above, the structure manufacturing system SYS determines whether or not the structure has been manufactured according to the design information. Accordingly, it can be accurately determined whether or not the structure manufactured by the molding apparatus 720 is a non-defective product, and the determination time can be shortened. Further, in the structure manufacturing system SYS described above, when the inspection unit 732 determines that the structure is not a non-defective product, the structure can be repaired immediately.
  • the molding device 720 may execute the processing again instead of the repair device 740 executing the processing.
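A hedged sketch of the inspection in steps S34 to S36, assuming the design information and shape information are point arrays with known correspondence (a simplification; all names are illustrative):

```python
import numpy as np

def inspect(design_pts, measured_pts, tolerance):
    """Compare measured coordinates with design coordinates point by point.
    Returns whether the structure is non-defective, the indices of the
    defective portions, and the repair amounts (deviations) for them."""
    errors = np.linalg.norm(design_pts - measured_pts, axis=1)
    defective = np.flatnonzero(errors > tolerance)  # defective portion info
    repair_amounts = errors[defective]              # repair amount info
    return defective.size == 0, defective, repair_amounts
```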
  • the first direction D1 and the second direction D2 are orthogonal to each other, but they do not have to be orthogonal as long as they are different directions.
  • the second direction D2 may be set to an angle of 60 degrees or 80 degrees with respect to the first direction D1.
  • each drawing shows one or more optical elements, but unless the number to be used is specified, the number of optical elements used is arbitrary as long as the same optical performance is exhibited.
  • the light used by the light generation unit 20 and the like to generate the structured light 101 and the reference light 102 may be any of light having a wavelength in the visible light region, light having a wavelength in the infrared region, and light having a wavelength in the ultraviolet region.
  • when light having a wavelength in the visible light region is used, the user can recognize the projection region 200. When a red wavelength in the visible light region is used, damage to the measurement object 2 can be reduced.
  • the scanning unit 40 uses an optical element that reflects structured light, but is not limited thereto.
  • a diffractive optical element, a refractive optical element, parallel flat glass, or the like may be used.
  • the structured light may be scanned by vibrating a refractive optical element such as a lens with respect to the optical axis.
  • a part of the optical elements of the projection optical system 30 may also be used for this purpose.
  • the CCD camera 52a is used as the imaging unit 50, but the present invention is not limited to this.
  • an image sensor such as a CMOS image sensor (CMOS: Complementary Metal Oxide Semiconductor) may be used instead of the CCD camera.
  • the 4-bucket method is used in which the phase of the fringe pattern P used in the phase shift method is shifted four times during one period, but is not limited thereto.
  • for example, a 5-bucket method in which one period 2π of the phase of the fringe pattern P is divided into five, or a 6-bucket method in which the period is divided into six, may be used.
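For reference, the commonly used closed form for an N-bucket phase shift with equal steps (a standard textbook result, not a formula quoted from this patent) is

$$I_k = A + B\cos\!\left(\varphi + \frac{2\pi k}{N}\right),\qquad \varphi = \operatorname{atan2}\!\left(-\sum_{k=0}^{N-1} I_k \sin\frac{2\pi k}{N},\ \sum_{k=0}^{N-1} I_k \cos\frac{2\pi k}{N}\right),$$

which for N = 4 reduces to φ = atan2(I₃ − I₁, I₀ − I₂) with the buckets numbered from 0.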
  • the phase shift method is used, but the three-dimensional shape of the measurement object 2 may be measured using only the spatial code method described in the second embodiment.
  • the fringe pattern P is imaged before the spatial code pattern QA is imaged, but this may be reversed.
  • the stripe pattern P and the spatial code pattern QA are expressed in white and black.
  • the present invention is not limited to this, and either one or both of them may be in color.
  • the stripe pattern P and the spatial code pattern QA may be generated in white and red.
  • although the blur between the first measurement image M1 and the second measurement image M2, the blur between the second measurement image M2 and the third measurement image M3, and the blur between the third measurement image M3 and the fourth measurement image M4 were detected when the first measurement image M1 to the fourth measurement image M4 were captured, the present invention is not limited to this.
  • the projection unit 10, the imaging unit 50, the arithmetic processing unit 60, and the display device 70 have been described as housed in a portable case 80, but the configuration is not limited to this.
  • the arithmetic processing unit 60 and the display device 70 do not have to be arranged in the housing 80 and may be installed outside the housing 80.
  • the arithmetic processing unit 60 may not be housed in the portable case, and some functions of the arithmetic processing unit 60 (the arithmetic unit, the image storage unit, the display control unit, and the setting information storage unit) may be provided in an external computer, such as a personal computer of a notebook type or a desktop type.
  • the present invention is not limited to the portable shape measuring devices 1 and 201.
  • for example, the present invention can also be applied to a stationary shape measuring apparatus, such as a measuring machine in which a multi-joint arm is provided with a three-dimensional measuring unit, or a measuring machine in which a stage on which the measurement object 2 is placed or the three-dimensional measuring unit is configured to be movable.
  • in a shape measuring device in which the projection unit 10, the imaging unit 50, the arithmetic processing unit 60, and the display device 70 are housed in a portable case, the external measurement environment (temperature, humidity, atmospheric pressure, etc.) is particularly likely to change; however, even if the external environment changes, the shape of the measurement object 2 can be measured with high accuracy.
  • different positions of the measurement object 2 may each be measured and the respective measurement results connected, so that the three-dimensional shape of the entire measurement object 2 is measured.
  • an overlapping process for overlapping a part of the measurement results may be used. This overlapping process is performed by the calculation unit 65.
  • the overlapping process will be described.
  • the first portion of the measurement object 2 is imaged by the imaging unit 50.
  • the imaging unit 50 captures an image of the second part that partially overlaps the first part of the measurement object 2.
  • the calculation unit 65 calculates a three-dimensional shape for each of the first part and the second part. Further, the calculation unit 65 searches for the portion where the first part and the second part overlap, and connects the three-dimensional shapes of the first part and the second part by overlapping these portions. Note that the calculation unit 65 searches for pixels in a predetermined area having common coordinate data between the three-dimensional shape of the first part and that of the second part, and thereby judges the overlap between the first part and the second part.
  • the three-dimensional shape of the entire measurement object 2 is measured by repeatedly executing such processing until the entire measurement object 2 is imaged. According to this, even when the measuring object 2 is a large object, the three-dimensional shape of the entire measuring object 2 can be easily measured.
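A minimal sketch of the overlap search and connection, assuming both portions are given as point arrays in a common coordinate system (a brute-force version; a spatial index would be used in practice):

```python
import numpy as np

def connect(first_pts, second_pts, tol=1e-3):
    """Keep the first portion, drop second-portion points that coincide
    (within tol) with it, and stack the remainder into one point cloud."""
    d = np.linalg.norm(second_pts[:, None, :] - first_pts[None, :, :], axis=2)
    overlapping = d.min(axis=1) <= tol
    return np.vstack([first_pts, second_pts[~overlapping]])
```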
  • a part of the configuration of the shape measuring apparatus 1 may be realized by a computer.
  • for example, the arithmetic processing unit 60 may be realized by a computer.
  • in this case, according to the shape measurement program stored in the storage unit, the computer executes: a process of capturing the first reference image of the measurement object 2; a process of capturing the second reference image of the measurement object 2; a process of capturing a measurement image of the measurement object 2 obtained by projecting the fringe pattern P onto the measurement object 2, simultaneously with or after the imaging of the first reference image and before the imaging of the second reference image; a process of detecting a feature region from the first reference image; a process of detecting the feature region from the second reference image; a process of detecting the relative shake between the measurement object 2 and the shape measuring device 1 based on the feature region in the first reference image and the feature region in the second reference image; and a process of calculating the shape of the measurement object 2 based on the measurement image and the shake.
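Put as pseudo-Python, the flow such a program would drive might look as follows; `projector`, `camera`, and `compute` stand in for the projection unit 10, the imaging unit 50, and the calculation unit 65, and every method name here is an assumption, not the patent's API.

```python
def measure_shape(projector, camera, compute):
    projector.project("uniform")                 # reference light
    ref1 = camera.capture()                      # first reference image
    measurement_images = []
    for step in range(4):                        # fringe phases 0 .. 3*pi/2
        projector.project("fringe", step)
        measurement_images.append(camera.capture())
    projector.project("uniform")
    ref2 = camera.capture()                      # second reference image
    shake = compute.detect_shake(compute.feature_region(ref1),
                                 compute.feature_region(ref2))
    return compute.shape(measurement_images, shake)  # blur-corrected shape
```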

Abstract

[Problem] To accurately detect measurement error and accurately measure the three-dimensional shape of an object to be measured. [Solution] A detection method for detecting the displacement of a measurement instrument for measuring the shape of an object to be measured relative to the object to be measured, wherein: the measurement instrument photographs a first reference image of the object to be measured and a second reference image of the object to be measured; at the same time as or after the photographing of the first reference image and before the photographing of the second reference image, the measurement instrument photographs an image of the object to be measured having structured light for shape measurement projected thereon by the measurement instrument; a feature area is detected from both the first reference image and the second reference image; and relative displacement between the object to be measured and the measurement instrument is detected on the basis of the feature area in the first reference image and the feature area in the second reference image.

Description

Detection method, shape measuring method, shape measuring device, structure manufacturing method, structure manufacturing system, and shape measuring program
 The present invention relates to a detection method, a shape measuring method, a shape measuring device, a structure manufacturing method, a structure manufacturing system, and a shape measuring program.
 The phase shift method is known as a method for measuring the three-dimensional shape of a measurement object. A shape measuring apparatus using the phase shift method includes a projection unit, an imaging unit, and a control unit. The projection unit projects striped pattern light having a sinusoidal light intensity distribution (hereinafter referred to as structured light) onto the measurement object. At this time, the phase of the stripes of the structured light (in other words, of the sine wave that is the intensity distribution of the structured light) is shifted four times, for example by π/2 over one period (2π), so that four types of structured light with stripe phases of 0, π/2, π, and 3π/2 are projected. The imaging unit images the object from an angle different from that of the projection unit, and the projection unit, the measurement object, and the imaging unit are arranged so as to have a triangulation positional relationship. When the four types of structured light having different phases are projected onto the measurement object, the imaging unit images the measurement object and acquires four images. The control unit applies data relating to the signal intensity of each pixel in the four images captured by the imaging unit to a predetermined arithmetic expression, and obtains the phase value of the fringes at each pixel according to the surface shape of the measurement object. Then, a calculation unit calculates the three-dimensional coordinate data of the measurement object from the phase value of the fringes at each pixel using the principle of triangulation. An apparatus using this phase shift method is disclosed in Patent Document 1, for example.
US Pat. No. 5,450,204
 However, in the shape measuring apparatus described above, a plurality of types of structured light are projected onto the measurement object, and during the series of operations in which the measurement object is imaged for each of them, the measurement object and the shape measuring apparatus may move relative to each other and blur. In this case, the accuracy of the calculated three-dimensional coordinate data is lowered.
 In view of the circumstances described above, it is an object of the present invention to accurately detect a relative shake between a measurement object and a shape measuring device and to accurately measure the three-dimensional shape of the measurement object.
 According to the first aspect of the present invention, there is provided a detection method for detecting a relative shake between a measuring instrument that measures the shape of a measurement object and the measurement object, the method including: capturing a first reference image of the measurement object with the measuring instrument; capturing a second reference image of the measurement object with the measuring instrument; capturing, with the measuring instrument, an image of the measurement object onto which structured light for shape measurement is projected from the measuring instrument, simultaneously with or after the capture of the first reference image and before the capture of the second reference image; detecting a feature region from the first reference image; detecting the feature region from the second reference image; and detecting a relative shake between the measurement object and the measuring instrument based on the feature region in the first reference image and the feature region in the second reference image.
 According to the second aspect of the present invention, there is provided a shape measurement method including: capturing, with a measuring instrument, an image of a measurement object onto which structured light for shape measurement is projected from the measuring instrument; detecting a relative shake between the measurement object and the measuring instrument by the detection method of the first aspect; and calculating the shape of the measurement object based on the image of the measurement object and the shake.
 According to the third aspect of the present invention, there is provided a shape measuring apparatus that measures the shape of a measurement object, including: a projection unit that projects at least structured light for shape measurement onto the measurement object; an imaging unit that captures an image of the measurement object; a control unit that causes the imaging unit to capture a first reference image of the measurement object and a second reference image of the measurement object, and that, simultaneously with or after the capture of the first reference image and before the capture of the second reference image, causes the projection unit to project the structured light for shape measurement onto the measurement object and the imaging unit to capture an image of the measurement object onto which the structured light is projected; and a calculation unit that detects a feature region from the first reference image, detects the feature region from the second reference image, detects a relative shake between the measurement object and the measuring instrument based on the feature region in the first reference image and the feature region in the second reference image, and calculates the shape of the measurement object based on the image of the measurement object and the shake.
 According to the fourth aspect of the present invention, there is provided a structure manufacturing system including: a design apparatus that creates design information relating to the shape of a structure; a molding apparatus that produces the structure based on the design information; the shape measuring apparatus of the third aspect, which measures the shape of the produced structure; and an inspection apparatus that compares shape information relating to the shape of the structure obtained by the shape measuring apparatus with the design information.
 According to the fifth aspect of the present invention, there is provided a structure manufacturing method including: creating design information relating to the shape of a structure; producing the structure based on the design information; measuring the shape of the produced structure by the shape measuring method of the second aspect; and comparing shape information relating to the shape of the structure obtained by the shape measuring method with the design information.
 According to the sixth aspect of the present invention, there is provided a shape measurement program that causes a computer included in a shape measuring apparatus, which measures the shape of a measurement object by capturing an image of the measurement object onto which structured light for shape measurement is projected, to execute: a process of capturing a first reference image of the measurement object; a process of capturing a second reference image of the measurement object; a process of capturing an image of the measurement object onto which the structured light for shape measurement is projected, simultaneously with or after the capture of the first reference image and before the capture of the second reference image; a process of detecting a feature region from the first reference image; a process of detecting the feature region from the second reference image; a process of detecting a relative shake between the measurement object and the shape measuring apparatus based on the feature region in the first reference image and the feature region in the second reference image; and a process of calculating the shape of the measurement object based on the image of the measurement object and the shake.
 According to the aspects of the present invention, it is possible to accurately detect the relative shake between the measurement object and the shape measuring device, and to accurately measure the three-dimensional shape of the measurement object.
FIG. 1 is a diagram showing an example of the shape measuring apparatus according to the first embodiment.
FIG. 2 is a block diagram showing an example of the detailed configuration of the shape measuring apparatus shown in FIG. 1.
FIG. 3 is a diagram showing the intensity distributions of the structured light and the reference light in the projection region.
FIG. 4 is a diagram showing the relationship between the projection region and the imaging region.
FIG. 5 is a diagram showing a state in which the fringe patterns of the respective phases are projected on a plane without a measurement object.
FIG. 6 is a diagram schematically showing blurring.
FIG. 7 is a flowchart explaining an example of the detection method and the shape measuring method while explaining the operation of the shape measuring apparatus.
FIG. 8 is a diagram schematically showing images captured in the detection method according to the first embodiment in the order of processing.
FIG. 9 is a diagram schematically showing the correction method.
FIG. 10 is a diagram showing the intensity distribution of the spatial code patterns.
FIG. 11 is a diagram schematically showing images captured in the detection method according to the second embodiment in the order of processing.
FIG. 12 is a diagram schematically showing images captured in a detection method according to a modification in the order of processing.
FIG. 13 is a diagram schematically showing images captured in a detection method according to a modification in the order of processing.
FIG. 14 is a diagram showing an example of a shape measuring apparatus according to a modification.
FIG. 15 is a block diagram showing an example of an embodiment of the structure manufacturing system.
FIG. 16 is a flowchart showing an example of an embodiment of the structure manufacturing method.
<First Embodiment>
FIG. 1 is a diagram illustrating an example of a shape measuring apparatus according to the first embodiment. In FIG. 1, the right direction of the drawing is the X1 axis, a certain direction orthogonal to the X1 axis is the Y1 axis, and a direction orthogonal to the X1 axis and the Y1 axis is the Z1 axis. The shape measuring device 1 is a device that measures the three-dimensional shape of the measuring object 2 using the phase shift method. As shown in FIG. 1, the shape measuring apparatus 1 includes a projection unit 10, an imaging unit 50, an arithmetic processing unit 60, a display device 70, and a housing 80. The shape measuring device 1 is configured to be housed in a case 80 in which the projection unit 10, the imaging unit 50, the arithmetic processing unit 60, and the display device 70 can be carried.
 The projection unit 10 generates projection light 100 along the first direction D1 (the X1 axis direction in FIG. 1). The projection unit 10 then scans the generated projection light 100 along a second direction D2 (the Y1 axis direction in FIG. 1) different from the first direction, thereby projecting structured light 101 and reference light 102 onto the projection region 200. The structured light 101 of the first embodiment is the structured light used in the phase shift method. The reference light 102 of the first embodiment is light used to detect a feature region on the measurement object 2 in order to detect a relative shake between the measurement object 2 and the shape measuring apparatus 1. Details of the structured light 101, the reference light 102, the projection region 200, and the feature region will be described later (see FIGS. 3 and 4).
 As shown in FIG. 1, the projection unit 10 includes a light generation unit 20, a projection optical system 30, and a scanning unit 40. The light generation unit 20 generates the projection light 100. The projection optical system 30 projects the projection light 100 generated by the light generation unit 20. The projection light 100 emitted from the projection optical system 30 is projected toward the measurement object 2 or the vicinity of the measurement object 2 via the scanning unit 40. The measurement object 2 has, for example, one corner 2a. The scanning unit 40 scans the projection light 100 in the second direction D2 (the Y1 axis direction in FIG. 1).
 The imaging unit 50 is arranged at a position different from the position of the projection unit 10. The imaging unit 50 images the measurement object 2, onto which the projection light 100 is projected, from a direction different from the projection direction of the projection unit 10. For example, the imaging unit 50 captures an image of the measurement object 2 onto which the structured light 101 is projected (hereinafter referred to as a "measurement image"). Also, for example, the imaging unit 50 captures an image of the measurement object 2 onto which the reference light 102 is projected (hereinafter referred to as a "reference image").
 The imaging unit 50 includes a light receiving optical system 51 and an imaging device 52. The light receiving optical system 51 is an optical system that causes the imaging device 52 to form an image of a region including the portion of the surface of the measurement object 2 on which the projection light 100 is projected. For the light receiving optical system 51, for example, a plurality of lenses are used. The imaging device 52 generates image data of the measurement object 2 based on the image formed by the light receiving optical system 51 and stores the generated image data.
 The arithmetic processing unit 60 controls the generation of the projection light 100 by the light generation unit 20. The arithmetic processing unit 60 also controls the scanning unit 40 and the imaging unit 50 so that the scanning of the projection light 100 by the scanning unit 40 and the imaging of the measurement object 2 by the imaging unit 50 are synchronized. Further, the arithmetic processing unit 60 calculates the three-dimensional shape of the measurement object 2 based on the luminance data (signal intensity) of each pixel in the image data captured by the imaging unit 50.
 Next, detailed configurations of the projection unit 10, the imaging unit 50, and the arithmetic processing unit 60 included in the shape measuring apparatus 1 will be described with reference to FIG. 2. FIG. 2 is a block diagram showing an example of the detailed configuration of the shape measuring apparatus 1 shown in FIG. 1. When the three-axis coordinate system is set as shown in FIG. 1, in FIG. 2 the right direction of the paper is the X1 axis, the upward direction of the paper is the Z1 axis, and the direction from the back of the paper to the front is the Y1 axis. As shown in FIG. 2, the projection unit 10 includes a laser controller 21, a laser diode (light source) 22, the projection optical system 30, and the scanning unit 40. The light generation unit 20 shown in FIG. 1 includes the laser controller 21 and the laser diode 22.
 The laser controller 21 controls the irradiation of laser light by the laser diode 22 based on a command signal from the control unit 62. The laser diode 22 is a light source that emits laser light based on a control signal from the laser controller 21. The laser diode 22 includes, for example, a red laser diode that emits red light, a green laser diode that emits green light, and a blue laser diode that emits blue light.
 The projection optical system 30 projects the projection light 100 as described above. The projection optical system 30 includes one or more transmission optical elements or reflection optical elements.
 The scanning unit 40 reflects the projection light 100 emitted from the projection optical system 30 using, for example, a reflective optical element such as a mirror, and scans the projection light 100 in the second direction D2 (the Y1 axis direction in FIG. 2) by changing the reflection angle. As an example of the reflective optical element constituting the scanning unit 40, a MEMS (Micro Electro Mechanical Systems) mirror that changes the reflection angle of the projection light 100 by resonating the mirror with static electricity is used. The second direction D2 is a direction on the measurement object 2 different from the first direction D1 (the X1 axis direction in FIG. 2). For example, the first direction D1 and the second direction D2 are orthogonal to each other.
 As shown in FIG. 1, the MEMS mirror vibrates in the direction S (see FIG. 1) about the vibration center AX in the paper as an axis, reflecting the projection light 100 at a predetermined reflection angle while changing that angle. The scanning width in the second direction D2 by the MEMS mirror (that is, the length of the projection region 200 in the second direction D2) is determined by the amplitude of the vibration direction S of the MEMS mirror. The speed at which the projection light 100 is scanned in the second direction D2 is determined by the angular speed (that is, the resonance frequency) of the MEMS mirror. By vibrating the MEMS mirror, the projection light 100 can be scanned back and forth. The start position of the scanning of the projection light 100 is arbitrary; for example, besides starting the scanning from the end of the projection region 200, the scanning may be started from approximately the center of the projection region 200.
 FIG. 3A is a diagram showing the intensity distribution of the structured light 101 in the projection region 200. FIG. 3B is a diagram showing the intensity distribution of the reference light 102 in the projection region 200. When the three-axis coordinate system shown in FIG. 1 is set, in FIGS. 3A and 3B the right direction of the paper is the X1 axis, the downward direction of the paper is the Y1 axis, and the direction from the back of the paper toward the front is the Z1 axis.
 As shown in FIGS. 3A and 3B, the projection light 100 is slit-like light having a predetermined length in the first direction D1. The projection light 100 is scanned over a predetermined distance in the second direction D2, thereby forming a rectangular projection region 200. The projection region 200 is a region onto which the structured light 101 and the reference light 102 are projected, and is defined by the first direction D1 and the second direction D2. The projection region 200 includes part or all of the measurement object 2.
 The structured light 101 shown in FIG. 3A is pattern light having a periodic light intensity distribution along the second direction D2. In the first embodiment, a fringe pattern P having a sinusoidal periodic light intensity distribution along the second direction D2 is used as an example of the structured light 101. The fringe pattern P is formed, for example, by setting the wavelength of the projection light 100 to a predetermined wavelength (e.g., about 680 nm) and scanning in the second direction D2 while periodically changing the light intensity of the projection light 100. The fringe pattern P has a light-and-dark pattern in which bright portions (white portions in FIG. 3A) and dark portions (black portions in FIG. 3A) alternate along the second direction D2. The fringe pattern P can also be expressed as a shading pattern in which dark portions and light portions change gradually and, being a lattice-like pattern, as a lattice pattern. The second direction D2 is also referred to as the light-and-dark direction, the shading direction, or the lattice direction.
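Written out (a standard model, with symbols chosen here for illustration rather than taken from the patent), such a fringe can be described as

$$I(y) = A + B\cos\!\left(\frac{2\pi y}{\Lambda} + \varphi_0\right),$$

where y is the coordinate along the second direction D2, Λ is the fringe period, A is the mean intensity, B is the modulation amplitude, and φ₀ is the initial phase that the phase shift method steps by π/2.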
On the other hand, the reference light 102 shown in FIG. 3(b) has, in this embodiment, a uniform light intensity (uniform brightness or shading) in both the first direction D1 and the second direction D2. In the first embodiment, a uniform pattern Q formed of white light containing red, green, and blue light is used as an example of the reference light 102. The uniform pattern Q is formed, for example, by making the projection light 100 white light and scanning it in the second direction D2 while keeping its light intensity constant.
Next, as shown in FIG. 2, the imaging unit 50 includes a light-receiving optical system 51, a CCD camera 52a, and an image memory 52b. The imaging device 52 includes the CCD camera 52a and the image memory 52b. As described above, the light-receiving optical system 51 forms, on the light-receiving surface of the CCD camera 52a, an image of a region of the surface of the measurement object 2 that includes the portion onto which the projection light 100 is projected. The CCD camera 52a is a camera using a charge-coupled device.
The image data generated by the CCD camera 52a consists of signal-intensity data for each pixel; for example, the image data consists of signal-intensity data for 512 × 512 = 262,144 pixels. The image memory 52b stores the image data generated by the CCD camera 52a.
FIG. 4 shows the relationship between the projection region and the imaging region. The region of the measurement object 2 imaged by the imaging unit 50 (hereinafter referred to as the imaging region) is briefly described with reference to FIG. 4. With the three-axis coordinate system of FIG. 1, in FIG. 4 the rightward direction on the page is the X1 axis, the downward direction is the Y1 axis, and the direction from the back of the page toward the front is the Z1 axis.
As shown in FIG. 4, the imaging region 210 indicates the region of the measurement object 2 imaged by the imaging unit 50. The imaging region 210 lies within the projection region 200 and is narrower than the projection region 200. However, it suffices that the imaging region 210 does not extend beyond the projection region 200; for example, the imaging region 210 may coincide with the projection region 200. Alternatively, the imaging region 210 may be larger than the projection region 200.
When the projection region 200 is larger than the imaging region 210, scanning of the projection light 100 may start either outside the imaging region 210 (that is, outside the imaging field of view) or inside the imaging region 210 (that is, within the imaging field of view).
Next, as shown in FIG. 2, the arithmetic processing unit 60 includes an operation unit 61, a control unit 62, a setting-information storage unit 63, a capture memory 64, a calculation unit 65, an image storage unit 66, and a display control unit 67.
The operation unit 61 outputs an operation signal corresponding to a user operation to the control unit 62. The operation unit 61 is, for example, a button or switch operated by the user. A touch panel is also formed on the display device 70, for example, and this touch panel is likewise used as the operation unit 61.
The control unit 62 includes a first control unit 62a and a second control unit 62b. The first control unit 62a controls the scanning unit 40 and the imaging unit 50, and the second control unit 62b controls the light generation unit 20. The control unit 62 executes the following control in accordance with a program stored in the setting-information storage unit 63.
The first control unit 62a outputs command signals to the scanning unit 40 and the CCD camera 52a, and controls them so that imaging of the measurement object 2 by the CCD camera 52a is synchronized with scanning of the fringe pattern P by the scanning unit 40. The first control unit 62a also controls them so that the capture of one frame by the CCD camera 52a is synchronized with a plurality of scans of the fringe pattern P.
By outputting command signals to the laser controller 21, the second control unit 62b can cause the laser diode 22 to emit a desired laser light combining red, blue, and green light, and can adjust the light intensity of the laser light emitted from the laser diode 22. When projecting the fringe pattern P onto the measurement object 2, the control unit 62 (the first control unit 62a and the second control unit 62b) synchronously controls, for example, the laser controller 21 and the scanning unit 40 so as to scan the projection light 100 of a predetermined wavelength in the second direction D2 while periodically varying its light intensity. When projecting the uniform pattern Q onto the measurement object 2, the control unit 62 synchronously controls, for example, the laser controller 21 and the scanning unit 40 so as to scan the white projection light 100 in the second direction D2 while keeping its light intensity constant.
The frequency of the MEMS mirror constituting the scanning unit 40 is set to, for example, 500 Hz (an oscillation period of 2 ms per round trip). The shutter speed of the CCD camera 52a (its exposure time) is set to, for example, 40 ms. Accordingly, while the CCD camera 52a captures one image, the scanning unit 40 scans the projection light 100 across the projection region 200 forty times (twenty round trips). The first control unit 62a controls the scanning unit 40 so that, for example, the projection light 100 makes twenty round trips during the capture of one frame by the CCD camera 52a. However, the number of round-trip scans of the projection light 100 per frame can be set arbitrarily; for example, the number of scans captured in one frame can be adjusted by adjusting the shutter speed of the CCD camera 52a or the frequency of the MEMS mirror.
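The scan count implied by these example values can be checked with a short calculation:

    f_mirror_hz = 500.0     # example mirror frequency from the text
    exposure_s = 40e-3      # example CCD exposure time from the text

    round_trip_s = 1.0 / f_mirror_hz         # 2 ms per round trip
    round_trips = exposure_s / round_trip_s  # round trips per frame
    one_way_scans = 2 * round_trips          # one-way scans per frame
    print(round_trips, one_way_scans)        # -> 20.0 40.0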
The setting-information storage unit 63 stores a program for causing the control unit 62 to execute its control, a program for causing the calculation unit 65 to execute the blur-detection processing, a program for causing it to execute the three-dimensional-shape calculation processing, and a program for causing the display control unit 67 to execute display control. The setting-information storage unit 63 also stores calibration information used when the calculation unit 65 computes actual coordinate values of the measurement object 2 from the fringe phase of the fringe pattern P. The calibration information includes information on the focal length of the imaging unit 50, information on the position on the imaging surface of the imaging device 52 where the optical axis of the imaging unit 50 intersects it, and information on the distortion of the lenses and the like included in the imaging unit 50. In addition, the setting-information storage unit 63 stores a phase-calculation program, data, and the like for obtaining the phase of the fringe pattern P for each pixel, in the calculation processing of the calculation unit 65, based on information (e.g., luminance) of the image captured by the imaging unit 50 (CCD camera 52a).
The capture memory 64 takes in and stores the image data held in the image memory 52b. The capture memory 64 stores, among others, measurement images of the measurement object 2 captured with the fringe pattern P projected, and reference images of the measurement object 2 captured with the uniform pattern Q projected. The capture memory 64 is provided with a plurality of storage areas; the image data of the measurement images and the image data of the reference images are stored, for example, in different storage areas.
The calculation unit 65 executes predetermined calculations in accordance with the programs and calibration information stored in the setting-information storage unit 63. For example, it detects relative blur between the measurement object 2 and the shape measuring device 1 from the image data of the reference images stored in the capture memory 64. This blur is change information concerning at least one of a relative position change and a relative attitude change between the measurement object 2 and the shape measuring device 1. Here, the position change and attitude change concern the X1 direction, the Y1 direction, and the direction about the Z1 axis (hereinafter referred to as the "θZ1 direction"). The calculation unit 65 also calculates three-dimensional shape data (coordinate data of the three-dimensional shape) of the measurement object 2 from the image data of the measurement images stored in the capture memory 64. In calculating this three-dimensional shape data, the calculation unit 65 can perform the calculation based on the measurement images and the above blur.
The calculation unit 65 detects a feature region (described later; see FIG. 8) in a reference image. The feature region is, for example, a region contained in the reference image of the measurement object 2 that is identifiable because its luminance differs from that of other regions. In this case, the luminance variation stems from changes in the shape of the measurement object 2, changes in the light reflectance of its surface, and the like. The feature region includes a plurality of regions spaced a predetermined distance apart in the reference image, for example three regions. In this embodiment, the description takes as an example the case where the region of the reference image corresponding to the corner 2a of the measurement object 2 and its vicinity is set as the feature region. The feature region in this case includes three unit regions corresponding to the three straight edges meeting at the corner 2a (see FIG. 8).
The image storage unit 66 stores the three-dimensional shape data of the measurement object 2 calculated by the calculation unit 65. The display control unit 67 executes display control of the three-dimensional-shape image in accordance with a program stored in the setting-information storage unit 63. That is, the display control unit 67 reads the three-dimensional shape data stored in the image storage unit 66, either in response to operation of the operation unit 61 by the user or automatically, and executes control to display an image of the three-dimensional shape of the measurement object 2 on the display screen of the display device 70 based on the read data.
The display device 70 is a device that displays an image of the three-dimensional shape of the measurement object 2; for example, a liquid-crystal display device or an organic EL display device is used. The display device 70 is omitted from FIG. 1.
The control unit 62, the calculation unit 65, and the display control unit 67 described above are implemented by an arithmetic processing device such as a CPU (Central Processing Unit). That is, the arithmetic processing device performs the processing executed by the control unit 62, the processing executed by the calculation unit 65, and the processing executed by the display control unit 67, each in accordance with programs stored in the setting-information storage unit 63. These programs include a shape measurement program.
This shape measurement program causes the arithmetic processing device (control unit 62) to execute: processing to capture a first reference image of the measurement object 2; processing to capture a second reference image of the measurement object 2; and processing to capture, simultaneously with or after the capture of the first reference image and before the capture of the second reference image, a measurement image of the measurement object 2 with the fringe pattern P projected onto the measurement object 2. The shape measurement program also causes the arithmetic processing device (calculation unit 65) to execute: processing to detect a feature region (described later) in the first reference image; processing to detect that feature region (described later) in the second reference image; processing to detect relative blur between the measurement object 2 and the shape measuring device 1 based on the feature region in the first reference image and the feature region in the second reference image; and processing to calculate the shape of the measurement object 2 based on the detected blur and the measurement image of the measurement object 2.
Next, the principle of the phase shift method will be described.
The phase shift method measures a three-dimensional shape, based on the principle of triangulation, by analyzing fringe images (measurement images of the measurement object 2 onto which the fringe pattern P is projected) captured while shifting the phase of the fringes of the fringe pattern P, which has a sinusoidal light-intensity distribution, projected onto the measurement object 2. In this embodiment, the fringe pattern P takes four forms, obtained by shifting the fringe phase in steps of π/2 along the second direction D2. Here, the phase of the fringe pattern P can be restated as the phase of the sine wave that constitutes the light-intensity distribution of the fringe pattern P; that is, the four fringe patterns P are generated by shifting that sine wave by π/2 at a time along the second direction D2.
Hereinafter, the reference fringe pattern P is taken as a first fringe pattern (first phase light) P1, whose phase is set to 0. The fringe pattern P obtained by shifting the phase of the first fringe pattern P1 by π/2 is a second fringe pattern (second phase light) P2; the fringe pattern P obtained by shifting the phase of the first fringe pattern P1 by π is a third fringe pattern (third phase light) P3; and the fringe pattern P obtained by shifting the phase of the first fringe pattern P1 by 3π/2 is a fourth fringe pattern (fourth phase light) P4.
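For illustration, the four phase-shifted intensity profiles can be generated as follows; the width and fringe period are assumed values, and build_fringe is a hypothetical helper rather than a name from the specification. A cosine convention is used so that the profiles are consistent with (Expression 1) below:

    import numpy as np

    def build_fringe(width, period_px, phase):
        """Sinusoidal intensity profile along D2 with the given phase shift."""
        y = np.arange(width)
        return 0.5 + 0.5 * np.cos(2 * np.pi * y / period_px + phase)

    width, period_px = 512, 64  # assumed values
    # P1 to P4: phase shifts 0, pi/2, pi, 3*pi/2 as defined above.
    p1, p2, p3, p4 = (build_fringe(width, period_px, k * np.pi / 2)
                      for k in range(4))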
FIGS. 5(a) to 5(d) show the first fringe pattern P1 to the fourth fringe pattern P4 projected onto a plane without the measurement object 2, as images of the imaging region 210 within the projection region 200. FIG. 5(a) shows the first fringe pattern P1, FIG. 5(b) the second fringe pattern P2, FIG. 5(c) the third fringe pattern P3, and FIG. 5(d) the fourth fringe pattern P4.
In the phase shift method, the first fringe pattern P1 to the fourth fringe pattern P4 shown in FIGS. 5(a) to 5(d) are projected from the projection unit 10 onto the measurement object 2, and the measurement object 2 is imaged by the imaging unit 50, which is arranged at a different angle from the projection unit 10. The projection unit 10, the measurement object 2, and the imaging unit 50 are arranged in a triangulation positional relationship.
The imaging unit 50 images the measurement object 2 with each of the first fringe pattern P1 to the fourth fringe pattern P4 projected onto it, acquiring four measurement images. The arithmetic processing unit 60 then applies the data on the signal intensity of each of the four measurement images captured by the imaging unit 50 to the following (Expression 1), and obtains the fringe phase value φ at each pixel according to the surface shape of the measurement object 2.
φ(u, v) = tan⁻¹{(I4(u, v) − I2(u, v)) / (I1(u, v) − I3(u, v))}  … (Expression 1)

Here, (u, v) denotes the position coordinates of a pixel. I1 is the signal intensity of the measurement image captured when the first fringe pattern P1 is projected; similarly, I2, I3, and I4 are the signal intensities of the measurement images captured when the second fringe pattern P2, the third fringe pattern P3, and the fourth fringe pattern P4, respectively, are projected.
In this way, the phase of the sinusoidally varying signal intensity can be obtained for every pixel of the image. A line obtained by connecting points of equal phase φ(u, v) (an equiphase line) represents the cross-sectional shape of the object cut by a certain plane, just like a section line in the light-section method. Therefore, based on this phase φ(u, v), the three-dimensional shape (height information at each point of the image) is obtained by the principle of triangulation.
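A minimal per-pixel sketch of (Expression 1) follows. It assumes the four measurement images are available as floating-point arrays, and it uses the four-quadrant arctangent, which is equivalent to the quotient form above but remains well defined where I1 − I3 approaches zero:

    import numpy as np

    def fringe_phase(i1, i2, i3, i4):
        """Wrapped fringe phase phi(u, v) per (Expression 1), from the four
        measurement images captured under patterns P1 to P4."""
        return np.arctan2(i4 - i2, i1 - i3)

    # Check against a synthetic phase ramp (intensity A + B*cos(phi + k*pi/2)).
    w, period = 512, 64
    phi_true = 2 * np.pi * np.arange(w) / period
    frames = [0.5 + 0.5 * np.cos(phi_true + k * np.pi / 2) for k in range(4)]
    phi = fringe_phase(*frames)   # equals phi_true wrapped into (-pi, pi]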
As shown in FIGS. 5(a) to 5(d), each time the phase of the fringe pattern P shifts through 0, π/2, π, and 3π/2, the positions of the fringes on the imaging region 210 (the positions of the bright and dark portions of the fringes) are seen to shift by the phase difference. By shifting the phase of the fringe pattern P in this way, the bright and dark portions of the fringes are projected onto the imaging region 210 at positions displaced in the second direction D2 in accordance with the phase.
As shown in FIGS. 5(a) to 5(d), in the second fringe pattern P2, for example, the fringe positions are displaced in the second direction D2 relative to the first fringe pattern P1 by a distance corresponding to the phase π/2. In the third fringe pattern P3, the fringe positions are displaced in the second direction D2 relative to the first fringe pattern P1 by a distance corresponding to the phase π. Similarly, in the fourth fringe pattern P4, the fringe positions are displaced in the second direction D2 relative to the first fringe pattern P1 by a distance corresponding to the phase 3π/2. Consequently, on the imaging region 210, the fringe positions from the first fringe pattern P1 through the fourth fringe pattern P4 are projected displaced at equal intervals in the second direction D2.
Since FIGS. 5(a) to 5(d) show images of the fringe pattern P projected onto a plane, the shape of the fringe-pattern image does not change. When the measurement object 2 is present, the fringe pattern P is projected onto its surface, so the image of the fringe pattern P deforms along the second direction D2 (the Y1-axis direction in FIG. 3) according to the shape (height) of the measurement object 2.
Here, the shape measuring device 1 is handled, for example, with the user holding the housing 80. When the user images a stationary measurement object 2 while holding the housing 80, the shape measuring device 1 projects the first fringe pattern P1 to the fourth fringe pattern P4 onto the measurement object 2, and a measurement image of the measurement object 2 is captured for each fringe pattern. During this series of operations, a change in the relative positional relationship (blur) between the measurement object 2 and the shape measuring device 1 may occur, for example due to the user's hand shake.
FIG. 6 is a diagram for explaining relative blur between the measurement object 2 and the shape measuring device 1. In FIG. 6, for convenience of explanation, the surface 2f of the measurement object 2 is drawn as a curved surface, but the same explanation applies when it is a plane.
When relative blur occurs between the measurement object 2 and the shape measuring device 1, the case where the measurement object 2 does not move and only the shape measuring device 1 moves and the case where only the measurement object 2 moves and the shape measuring device 1 does not move can be treated as equivalent. Therefore, in the following explanation of the relative blur between the measurement object 2 and the shape measuring device 1, it is assumed for convenience that only the measurement object 2 moves and the shape measuring device 1 does not.
For example, if relative blur between the measurement object 2 and the shape measuring device 1 occurs between the time when a measurement image is captured with the first fringe pattern P1 projected (hereinafter, the "first timing") and the time when a measurement image is captured with, for example, the second fringe pattern P2 projected (hereinafter, the "second timing"), the surface 2f of the measurement object 2 is displaced relative to the shape measuring device 1 (the projection unit 10 and the imaging unit 50), as shown in FIG. 6.
In this case, for any given pixel (u1, v1) of the imaging unit 50 (CCD camera 52a), the displacement of the surface 2f means that the position on the surface 2f imaged at the first timing (hereinafter, the first position L1) differs from the position on the surface 2f imaged at the second timing (hereinafter, the second position L2). In other words, the position where the first fringe pattern P1 is projected at the first timing and the position where the second fringe pattern P2 is projected at the second timing differ on the surface 2f. Accordingly, at the second timing the second fringe pattern P2 is not projected onto the position on the surface 2f where it would be projected if the shape measuring device 1 and the measurement object 2 had not been relatively displaced; the projection position is displaced. That is, owing to the displacement between the first position L1 and the second position L2 on the surface 2f, the phase of the projected fringe pattern P changes by an additional amount, beyond the intended phase difference (π/2) between the first fringe pattern P1 projected at the first timing and the second fringe pattern P2 projected at the second timing (hereinafter, this amount of phase change is denoted δ1).
Because of this phase change, the signal intensity I2(u1, v1) of the pixel (u1, v1) in the measurement image captured at the second timing deviates from the value that would be obtained if the shape measuring device 1 and the measurement object 2 were not relatively displaced. Consequently, the fringe phase value φ(u1, v1) at each pixel deviates from the value that would be obtained without relative displacement, and the measurement accuracy of the three-dimensional shape of the measurement object 2 obtained from that phase value deteriorates.
In contrast, the shape measuring device 1 according to the first embodiment accurately detects the relative blur between the measurement object 2 and the shape measuring device 1 and corrects for this blur, thereby measuring the three-dimensional shape of the measurement object 2 with high accuracy. An example of the blur detection method performed by the shape measuring device 1 and an example of a shape measurement method using this detection method are described below.
The detection method according to this embodiment includes capturing, with the imaging unit 50, a measurement image of the measurement object 2 onto which the fringe pattern P for shape measurement is projected from the projection unit 10, and detecting the relative blur between the measurement object 2 and the shape measuring device 1. The shape measurement method according to this embodiment further includes calculating the shape of the measurement object 2 based on the measurement image of the measurement object 2 and the blur. In the following description, the imaging unit 50 of the shape measuring device 1 has been calibrated in advance and its intrinsic parameters are known. This calibration is so-called camera calibration, whose intrinsic parameters include information on the focal length of the imaging unit 50, information on the position on the imaging surface of the imaging device 52 where the optical axis of the imaging unit 50 intersects it, and information on the distortion of the lenses and the like included in the imaging unit 50.
FIG. 7 is a flowchart explaining an example of the detection method and the shape measurement method according to the first embodiment. FIG. 8 schematically shows the images captured in the detection method, in processing order.
When the user performs a shutter operation while the shape measuring device 1 is powered on, a signal indicating that the shutter operation has been performed is input from the operation unit 61 to the control unit 62. When no shutter operation is performed by the user, the device remains in a standby state. When the shutter operation is performed, the distance to the measurement object 2 may be measured and the projection optical system 30 and the imaging lens 51 may be focused.
When the shutter operation is performed, the control unit 62 outputs command signals to the light generation unit 20 and the scanning unit 40 to project the first fringe pattern P1 of phase 0 onto the measurement object 2, and outputs a command signal to the imaging unit 50 to capture a measurement image of the measurement object 2 onto which the first fringe pattern P1 is projected (step S01).
The control unit 62 controls the operations of the light generation unit 20 and the scanning unit 40 so that the generation of the projection light 100 by the light generation unit 20 (laser controller 21) is synchronized with the scanning by the scanning unit 40. As to the control of light generation by the light generation unit 20, the control unit 62 adjusts the light intensity of the projection light 100 of a predetermined wavelength so that it varies periodically in a sinusoidal manner. As to the control of scanning by the scanning unit 40, the control unit 62 scans the projection light 100 in the second direction D2 at a predetermined speed. As a result, the first fringe pattern P1, whose light intensity (brightness or shading) varies periodically and sinusoidally in the second direction D2, is projected onto the projection region 200, and thus onto the measurement object 2 arranged in the projection region 200. The number of scans of the projection light 100 is set arbitrarily.
Based on the command signal from the control unit 62, the CCD camera 52a images the surface of the measurement object 2 onto which the first fringe pattern P1 is projected. As shown in FIG. 8, this imaging acquires a first measurement image M1 of the measurement object 2 onto which the first fringe pattern P1 is projected. The CCD camera 52a then generates image data of the first measurement image M1, which is temporarily stored in the image memory 52b and then stored in a storage area provided in the capture memory 64.
For convenience of explanation, the first measurement image M1 in FIG. 8 is shown as an image of the first fringe pattern P1 projected onto a plane and captured as-is. In practice, since the first fringe pattern P1 is projected onto the surface of the measurement object 2, an image is obtained in which the first fringe pattern P1 is deformed according to the shape of the measurement object 2. The same applies to the second measurement image M2 to the fourth measurement image M4 described later.
Next, the control unit 62 outputs command signals to the light generation unit 20 and the scanning unit 40 to project the uniform pattern Q (see FIG. 3(b)) onto the measurement object 2, and outputs a command signal to the imaging unit 50 to capture a reference image of the measurement object 2 onto which the uniform pattern Q is projected (step S02).
The control unit 62 controls the operations of the light generation unit 20 and the scanning unit 40 so that the generation of the projection light 100 by the light generation unit 20 (laser controller 21) is synchronized with the scanning by the scanning unit 40. As to the control of light generation by the light generation unit 20, the control unit 62 makes the projection light 100 white light and adjusts it so that its light intensity is constant. As to the control of scanning by the scanning unit 40, the control unit 62 scans the projection light 100 in the second direction D2 at a predetermined speed. As a result, the uniform pattern Q, adjusted so that the light intensity (brightness or shading) is uniform in the second direction D2, is projected onto the projection region 200, and thus onto the measurement object 2 arranged in the projection region 200. The number of scans of the projection light 100 is set arbitrarily.
Based on the command signal from the control unit 62, the CCD camera 52a images the surface of the measurement object 2 onto which the uniform pattern Q is projected. This imaging acquires a first reference image R01 of the measurement object 2 onto which the uniform pattern Q is projected. The CCD camera 52a generates image data of the first reference image R01, which is temporarily stored in the image memory 52b and then stored in a storage area provided in the capture memory 64.
Next, the control unit 62 outputs command signals to the light generation unit 20 and the scanning unit 40 to project the second fringe pattern P2 of phase π/2 onto the measurement object 2, and outputs a command signal to the imaging unit 50 to capture a measurement image of the measurement object 2 onto which the second fringe pattern P2 is projected (step S03). As shown in FIG. 8, a second measurement image M2 of the measurement object 2 onto which the second fringe pattern P2 is projected is thereby acquired. The CCD camera 52a then generates image data of the second measurement image M2, and this image data is stored in the capture memory 64.
Next, the control unit 62 outputs command signals to the light generation unit 20 and the scanning unit 40 to project the uniform pattern Q onto the measurement object 2, and outputs a command signal to the imaging unit 50 to capture a reference image of the measurement object 2 onto which the uniform pattern Q is projected (step S04). As shown in FIG. 8, a second reference image R02 of the measurement object 2 onto which the uniform pattern Q is projected is thereby acquired. The CCD camera 52a then generates image data of the second reference image R02, and this image data is stored in the capture memory 64.
Thereafter, the control unit 62 projects the fringe pattern P and the uniform pattern Q onto the measurement object 2 alternately, in the order of the third fringe pattern P3, the uniform pattern Q, the fourth fringe pattern P4, and the uniform pattern Q, and causes the measurement object 2 to be imaged with the fringe pattern P or the uniform pattern Q projected (steps S05 to S08). As shown in FIG. 8, a third measurement image M3 of the measurement object 2 onto which the third fringe pattern P3 is projected, a third reference image R03 of the measurement object 2 onto which the uniform pattern Q is projected, a fourth measurement image M4 of the measurement object 2 onto which the fourth fringe pattern P4 is projected, and a fourth reference image R04 of the measurement object 2 onto which the uniform pattern Q is projected are thereby acquired in turn. The CCD camera 52a then generates image data of each image, and this image data is stored in the capture memory 64.
Next, the calculation unit 65 detects the feature regions in the first reference image R01 to the fourth reference image R04 (step S09). In this embodiment, the calculation unit 65 detects, in each of the first reference image R01 to the fourth reference image R04, the feature region C corresponding to the corner 2a of the measurement object 2 and its vicinity. The feature region C contains three unit regions Ca to Cc corresponding to the three straight edges meeting at the corner 2a. For convenience of explanation, the first reference image R01 to the fourth reference image R04 in FIG. 8 are shown with only the feature region C displayed; in practice, an image showing the entire surface of the measurement object 2 is obtained.
Next, the calculation unit 65 detects the relative blur between the measurement object 2 and the shape measuring device 1 (step S10). In step S10, the calculation unit 65 first calculates the three-dimensional shape of the measurement object 2 using the first measurement image M1 to the fourth measurement image M4. In doing so, the calculation unit 65 obtains the initial phase distribution φ(u, v) of each pixel on the assumption that there is no blur due to hand shake or the like (that is, the phase change δ1 is 0). The calculation unit 65 then performs phase-unwrapping (phase connection) processing on the obtained initial phase distribution φ(u, v), yielding a continuous phase distribution φ′(u, v). Using the principle of triangulation, the calculation unit 65 calculates coordinate data (x′, y′, z′) of the three-dimensional shape of the measurement object 2 from the obtained phase distribution φ′(u, v). The three-dimensional shape coordinate data calculated in this way is computed in step S10 on the assumption that there is no relative blur between the measurement object 2 and the shape measuring device 1 (that δ1 is 0), whereas in reality such blur may have occurred (δ1 may not be 0). The calculated three-dimensional shape coordinate data is therefore a rough value that may differ from the actual three-dimensional shape.
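A minimal sketch of this step follows, assuming the wrapped phase map from (Expression 1) and unwrapping along the fringe direction D2 with numpy's one-dimensional unwrapper; the conversion to height via triangulation is reduced to an assumed linear phase-to-height factor purely for illustration:

    import numpy as np

    def rough_shape(phi_wrapped, phase_to_height=1.0):
        """Rough 3-D reconstruction assuming no blur (delta1 = 0).

        phi_wrapped: wrapped phase map phi(u, v) from the four measurement
        images. phase_to_height: assumed linear triangulation factor."""
        # Phase unwrapping along D2 (axis 1) gives a continuous phi'(u, v).
        phi_cont = np.unwrap(phi_wrapped, axis=1)
        # Triangulation reduced to a linear mapping for this sketch.
        return phase_to_height * phi_cont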
Next, from the correspondence between the calculated rough three-dimensional-shape coordinate data of the measurement object 2 and the two-dimensional coordinates of the feature region C in each of the first reference image R01 to the fourth reference image R04, the calculation unit 65 calculates the rotation and translation from the shape measuring device 1 to the surface 2f. Known methods described in academic papers (e.g., V. Lepetit et al., "EPnP: An Accurate O(n) Solution to the PnP Problem", International Journal of Computer Vision, vol. 81, pp. 155-166, 2009) and in published documents can be used to calculate the rotation and translation in this case.
The calculation unit 65 first calculates the rotation R1 and translation t1 from the shape measuring device 1 to the surface 2f at the first timing, from the correspondence between the rough three-dimensional-shape coordinate data and the two-dimensional coordinates of the feature region C in the first reference image R01. Next, the calculation unit 65 calculates the rotation R2 and translation t2 from the shape measuring device 1 to the surface 2f at the second timing, from the correspondence between the rough three-dimensional-shape coordinate data and the two-dimensional coordinates of the feature region C in, for example, the second reference image R02.
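A minimal sketch of this pose estimation follows, using OpenCV's solvePnP with the EPnP solver cited above; the 3-D feature coordinates, their 2-D detections, and the camera intrinsics are assumed inputs, and the function name is illustrative:

    import numpy as np
    import cv2

    def pose_from_reference_image(pts3d, pts2d, camera_matrix, dist_coeffs):
        """Rotation matrix R and translation vector t of the surface relative
        to the device, from 3-D feature points (rough shape) and their 2-D
        coordinates in one reference image."""
        ok, rvec, tvec = cv2.solvePnP(
            pts3d.astype(np.float64),   # Nx3 rough 3-D coordinates
            pts2d.astype(np.float64),   # Nx2 feature coordinates in the image
            camera_matrix, dist_coeffs,
            flags=cv2.SOLVEPNP_EPNP)    # EPnP, per the cited paper
        if not ok:
            raise RuntimeError("PnP failed")
        R, _ = cv2.Rodrigues(rvec)      # rotation vector -> 3x3 matrix
        return R, tvec.reshape(3)

    # e.g. R1, t1 = pose_from_reference_image(pts3d, pts2d_R01, K, dist)
    #      R2, t2 = pose_from_reference_image(pts3d, pts2d_R02, K, dist)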
Next, using the obtained R1, t1, R2, and t2, the calculation unit 65 obtains the rotation Ra and translation ta of the feature region C by the following [Equation 1]. The rotations R1, R2, and Ra are expressed as matrices, and the translations t1, t2, and ta as vectors. The rotation Ra and translation ta constitute first change information about the blur occurring between the first measurement image M1 and the second measurement image M2.
[Equation 1]
Ra = R2 · R1⁻¹,  ta = t2 − Ra · t1
The calculation unit 65 also calculates the rotation R3 and translation t3 from the shape measuring device 1 to the surface 2f at the time the third measurement image M3 is captured (the third timing), from the correspondence between the rough three-dimensional-shape coordinate data and the two-dimensional coordinates of the feature region C in the third reference image R03. Next, using the obtained R3 and t3 together with R2 and t2 above, the calculation unit 65 obtains the rotation Rb and translation tb of the feature region C by the following [Equation 2]. The rotation Rb and translation tb constitute second change information about the blur occurring between the second measurement image M2 and the third measurement image M3.
[Equation 2]
Rb = R3 · R2⁻¹,  tb = t3 − Rb · t2
The calculation unit 65 further calculates the rotation R4 and translation t4 from the shape measuring device 1 to the surface 2f at the time the fourth measurement image M4 is captured (the fourth timing), from the correspondence between the rough three-dimensional-shape coordinate data and the two-dimensional coordinates of the feature region C in the fourth reference image R04. Next, using the obtained R4 and t4 together with R3 and t3 above, the calculation unit 65 obtains the rotation Rc and translation tc of the feature region C by the following [Equation 3]. The rotation Rc and translation tc constitute third change information about the blur occurring between the third measurement image M3 and the fourth measurement image M4.
[Equation 3]
Rc = R4 · R3⁻¹,  tc = t4 − Rc · t3
In this way, the detection method according to this embodiment detects the blur between the first measurement image M1 and the second measurement image M2 (first change information), the blur between the second measurement image M2 and the third measurement image M3 (second change information), and the blur between the third measurement image M3 and the fourth measurement image M4 (third change information).
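Under the reconstructed form of [Equation 1] to [Equation 3] above (the standard composition of two rigid-body poses, which is an assumption here), the change information can be computed as follows:

    import numpy as np

    def relative_motion(R_prev, t_prev, R_next, t_next):
        """Rigid motion (R, t) taking the surface pose at one timing to the
        pose at the next timing, composed from the two PnP poses.
        Assumes the reconstructed form Ra = R2 R1^-1, ta = t2 - Ra t1."""
        R = R_next @ R_prev.T      # rotation matrices: inverse == transpose
        t = t_next - R @ t_prev
        return R, t

    # e.g. Ra, ta = relative_motion(R1, t1, R2, t2)  # first change information
    #      Rb, tb = relative_motion(R2, t2, R3, t3)  # second change information
    #      Rc, tc = relative_motion(R3, t3, R4, t4)  # third change information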
After detecting the blur by the above detection method, the calculation unit 65 calculates the shape of the measurement object 2 based on the blur detection results and the first measurement image M1 to the fourth measurement image M4 (step S11). In step S11, the calculation unit 65 corrects the blur between the first measurement image M1 and the second measurement image M2, the blur between the second measurement image M2 and the third measurement image M3, and the blur between the third measurement image M3 and the fourth measurement image M4.
Here, in the case of blur caused by the user's hand shake or the like, the distance between the first position L1 and the second position L2 shown in FIG. 6 can be regarded as minute. For this reason, the change in light reflectance of the surface 2f between the first position L1 and the second position L2 can be ignored. Also, when the surface 2f moves as shown in FIG. 6, the distance from the imaged position to the imaging unit 50 changes, and the incidence and reflection angles of the fringe pattern P at that position change; as a result, the intensity of the reflected light of the fringe pattern P that is reflected at the imaged position and reaches the imaging unit 50 changes. In the case of blur caused by hand shake or the like, however, the distance between the first position L1 and the second position L2 is minute, so the effects of the change in distance from the imaged position to the imaging unit 50 and of the changes in the incidence and reflection angles of the fringe pattern P can be ignored. Therefore, the change in the light intensity of the fringe pattern P projected onto the first position L1 and the second position L2 can be regarded as due solely to the change in the phase of the fringe pattern P projected from the projection unit 10. Accordingly, in this embodiment, the blur can be corrected by obtaining, for each pixel, the amount of phase change of the fringe pattern P (e.g., the change δ1) and then obtaining the initial phase distribution φ(u, v) with this change taken into account.
The following describes, as an example, the case of obtaining the phase change δ1 for one arbitrary pixel (u1, v1) of the imaging unit 50 (CCD camera 52a). Specifically, the calculation unit 65 first calculates the position of the surface 2f at the second timing by transforming the position of the surface 2f at the first timing with the first change information (rotation Ra and translation ta) obtained above. The position of the surface 2f at the first timing is calculated based on the rough three-dimensional-shape coordinate data.
Next, the calculation unit 65 obtains the position coordinates of the first position L1 and the second position L2. In this case, the calculation unit 65 obtains, for example, the coordinates of the intersection of the ray back-projected from the arbitrary pixel (u1, v1) of the imaging unit 50 with the surface 2f at the first timing as the position coordinates of the first position L1, and the coordinates of the intersection of the ray back-projected from that pixel (u1, v1) with the surface 2f at the second timing as the position coordinates of the second position L2.
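A minimal sketch of this back-projection follows, assuming a pinhole camera with intrinsic matrix K and locally approximating the surface 2f by a plane (point p0, unit normal n); both assumptions are illustrative and not from the specification:

    import numpy as np

    def backproject_to_surface(u, v, K, p0, n):
        """Intersect the viewing ray of pixel (u, v) with a local plane
        approximation of the surface 2f (point p0, unit normal n), in the
        camera frame of the imaging unit 50."""
        ray = np.linalg.inv(K) @ np.array([u, v, 1.0])   # ray direction
        s = (n @ p0) / (n @ ray)                         # scale at the plane
        return s * ray                                   # intersection point

    # L1 from the surface at the first timing; L2 after moving the local
    # plane with the first change information (Ra, ta):
    # L1 = backproject_to_surface(u1, v1, K, p0, n)
    # L2 = backproject_to_surface(u1, v1, K, Ra @ p0 + ta, Ra @ n)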
Next, the calculation unit 65 obtains the phase change δ1 when the structured light 101 is projected from the projection unit 10 onto the first position L1 and the second position L2. FIGS. 9(a) and 9(b) illustrate the principle of obtaining the phase change δ1: FIG. 9(a) shows the intensity distribution of the fringe pattern P projected onto the first position L1, and FIG. 9(b) shows the intensity distribution of the fringe pattern P projected onto the first position L1 and the second position L2.
First, as shown in FIG. 9(a), the first fringe pattern P1 is projected onto the measurement object 2 at the first timing. At this time, the portion of the fringe pattern P corresponding to the phase φL1 (taking the phase of one dark portion as 0) is projected onto the first position L1, and the pixel of the CCD camera 52a that images the first position L1 captures an image whose luminance corresponds to the light intensity of the fringe pattern P at the phase φL1.
 また、図9(b)に示すように、第2タイミングでは、第1縞パターンP1よりもπ大きい位相の第2縞パターンP2が測定対象物2に投影される。このため、手ブレ等のぶれが無い場合、第1位置L1では、縞パターンPのうち上記の位相φL1にπ/2を加えた位相(φL1+π/2)に対応する部分が投影される。この場合、CCDカメラ52aの上記一画素では、縞パターンPのうち位相(φL1+π/2)での光強度に対応した輝度の像が撮像される。 Further, as shown in FIG. 9B, at the second timing, the second fringe pattern P2 having a phase larger by π than the first fringe pattern P1 is projected onto the measurement object 2. For this reason, when there is no camera shake or the like, a portion corresponding to a phase (φ L1 + π / 2) obtained by adding π / 2 to the phase φ L1 in the stripe pattern P is projected at the first position L1. The In this case, an image having a luminance corresponding to the light intensity at the phase (φ L1 + π / 2) in the stripe pattern P is captured by the one pixel of the CCD camera 52a.
 一方、手ブレ等によるぶれが生じた場合、撮像部50の上記画素では、撮像位置が第2位置L2となる。ここで、第2位置L2のY1方向の位置が第1位置L1に対して変化している場合、図9(b)に示すように、第2位置L2の位置は、縞パターンP(第2縞パターンP2)において、第1位置L1に対してY1方向(例、+Y1方向)にずれた位置となる。このため、第2位置L2の位相φL2は、第1位置L1の位相(φL1+π/2)とは異なる値となる。この場合、撮像部50の上記画素では、縞パターンPのうち位相φL2での光強度に対応した輝度の像が撮像される。 On the other hand, when camera shake or the like occurs, the imaging position of the pixel of the imaging unit 50 is the second position L2. Here, when the position in the Y1 direction of the second position L2 is changed with respect to the first position L1, as shown in FIG. 9B, the position of the second position L2 is a fringe pattern P (second pattern). In the fringe pattern P2), the first position L1 is shifted in the Y1 direction (eg, + Y1 direction). For this reason, the phase φ L2 of the second position L2 is a value different from the phase (φ L1 + π / 2) of the first position L1. In this case, in the pixels of the imaging unit 50, the luminance image of which corresponds to the light intensity of the phase phi L2 of the fringe pattern P is captured.
 縞パターンPはY1方向について正弦波状に周期的な光強度の分布を有しているため、第1位置L1の位相(φL1+π/2)に対応する部分と、第2位置L2の位相φL2に対応する部分とでは、光強度が異なる。このため、第2タイミングにおいてCCDカメラ52aの上記一画素で撮像される第1位置L1の像と、第2位置L2の像とでは、互いに輝度が異なる。 Since the fringe pattern P has a periodic light intensity distribution in a sine wave shape in the Y1 direction, the portion corresponding to the phase (φ L1 + π / 2) of the first position L1 and the phase φ of the second position L2 The light intensity is different from the portion corresponding to L2 . For this reason, the brightness at the first position L1 and the image at the second position L2 captured by the one pixel of the CCD camera 52a at the second timing are different from each other.
 本実施形態では、第1位置L1及び第2位置L2に投影される縞パターンPの光強度の変化は、投影部10から投影される縞パターンPの位相の変化のみによるものとみなすことができる。このため、第2タイミングにおいて上記一画素で撮像される第1位置L1の像と第2位置L2の像との間の輝度の差は、縞パターンPの位相の変化のみによって生じるものとみなすことができる。 In the present embodiment, the change in the light intensity of the fringe pattern P projected to the first position L1 and the second position L2 can be regarded as being only due to the change in the phase of the fringe pattern P projected from the projection unit 10. . For this reason, it is assumed that the difference in luminance between the image at the first position L1 and the image at the second position L2 captured by the one pixel at the second timing is caused only by the change in the phase of the fringe pattern P. Can do.
 そこで、演算部65は、上記画素で撮像された像の輝度に基づいて、第2位置L2に投影される縞パターンPの位相φL2を求める。そして、演算部65は、求めた位相φL2が、第1位置L1に投影される縞パターンPの位相(φL1+π/2)に対してどれだけずれているかを、位相の変化量δ1として算出する。 Therefore, the calculation unit 65 obtains the phase φ L2 of the fringe pattern P projected on the second position L2 based on the luminance of the image captured by the pixel. Then, the calculation unit 65 determines how much the obtained phase φ L2 is deviated from the phase (φ L1 + π / 2) of the fringe pattern P projected on the first position L1 as a phase change amount δ1. calculate.
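As a minimal sketch of this step, suppose the fringe intensity at a point follows I = A + B·cos(φ) with the offset A and amplitude B known for the pixel; this model and all names are assumptions for illustration, and the sign ambiguity of arccos would in practice be resolved, for example, from neighboring pixels.

    import numpy as np

    def phase_from_luminance(I, A, B):
        # Invert I = A + B*cos(phi); arccos only returns [0, pi], so the
        # branch would be disambiguated from the local fringe gradient.
        c = np.clip((I - A) / B, -1.0, 1.0)
        return np.arccos(c)

    def delta1(I_L2, A, B, phi_L1):
        phi_L2 = phase_from_luminance(I_L2, A, B)
        expected = phi_L1 + np.pi / 2        # phase expected at L1 without blur
        return (phi_L2 - expected + np.pi) % (2 * np.pi) - np.pi  # wrap to (-pi, pi]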
Using the same method, the calculation unit 65 also calculates the phase change amount δ2 caused by blur between the second fringe pattern P2 projected at the second timing and the third fringe pattern P3 projected at the third timing (a phase change separate from the phase difference (π/2)), and the phase change amount δ3 caused by blur between the third fringe pattern P3 projected at the third timing and the fourth fringe pattern P4 projected at the fourth timing (likewise a phase change separate from the phase difference (π/2)).

Next, the calculation unit 65 obtains the initial phase distribution φ(u1,v1), taking the initial phase of the fringe pattern P corresponding to the second timing as (π/2 + δ1), that corresponding to the third timing as (π + δ1 + δ2), and that corresponding to the fourth timing as (3π/2 + δ1 + δ2 + δ3), and performs phase unwrapping processing. From the resulting phase distribution φ'(u1,v1), the calculation unit 65 calculates three-dimensional coordinate data (x1, y1, z1) using the principle of triangulation. The calculation unit 65 performs the above calculation for every pixel and thereby calculates the coordinate data (x, y, z) of the three-dimensional shape of the measurement object 2. The calculation unit 65 may also perform the above correction again, or repeatedly, using the three-dimensional coordinate data (x, y, z) calculated in this way as the rough three-dimensional coordinate data mentioned above. This raises the accuracy of the position of the surface 2f at the first timing and thus enables still more accurate detection.
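Because the blur turns the nominal uniform steps into the corrected initial phases (0, π/2 + δ1, π + δ1 + δ2, 3π/2 + δ1 + δ2 + δ3), the per-pixel phase can no longer be read off from the standard 4-bucket formula; one standard alternative is a least-squares fit of the sinusoidal model to the four samples. The following Python sketch illustrates this under that assumption; the function name and interface are illustrative only.

    import numpy as np

    def phase_from_shifted_images(intensities, shifts):
        # intensities: the four luminance samples of one pixel.
        # shifts: the corrected initial phases, e.g.
        #   [0, pi/2 + d1, pi + d1 + d2, 3*pi/2 + d1 + d2 + d3].
        # Fits I_n = A + B*cos(phi + theta_n), i.e. the linear model
        #   I_n = A + (B*cos(phi))*cos(theta_n) + (B*sin(phi))*(-sin(theta_n)),
        # and returns the wrapped phase phi.
        th = np.asarray(shifts, dtype=float)
        M = np.column_stack([np.ones_like(th), np.cos(th), -np.sin(th)])
        x, *_ = np.linalg.lstsq(M, np.asarray(intensities, dtype=float), rcond=None)
        return np.arctan2(x[2], x[1])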
The calculation unit 65 stores the calculated three-dimensional coordinate data of the measurement object 2 in the image storage unit 66. The display control unit 67 reads the three-dimensional coordinate data stored in the image storage unit 66, either in response to an operation of the operation unit 61 by the user or automatically. Based on the read coordinate data, the display control unit 67 displays the three-dimensional shape of the measurement object 2 on the display screen of the display device 70. The three-dimensional shape is displayed as a point cloud, that is, a set of points in three-dimensional space. This point cloud data can be output from the shape measuring device 1.

The display device 70 may display not only the three-dimensional shape of the measurement object 2 but also the fringe images captured by the imaging unit 50. That is, the display control unit 67 may cause the display device 70 to display the fringe images captured by the imaging unit 50 based on the image data stored in the capture memory 64. With this configuration, the user can confirm at the imaging site, from the fringe images captured by the imaging unit 50, whether the measurement object 2 has been imaged correctly.

The display device 70 may also be configured to display at least one of the image captured by the imaging unit 50 and the three-dimensional shape calculated by the calculation unit 65. In this case, at least one of the image captured by the imaging unit 50 and the three-dimensional shape calculated by the calculation unit 65 may instead be displayed on an external display device connected to the shape measuring device 1 wirelessly or by wire.

As described above, according to the first embodiment, in the detection method for detecting the relative blur between the shape measuring device 1 that measures the shape of the measurement object 2 and the measurement object 2, the first reference image R01 and the second reference image R02 of the measurement object 2 are captured; the first measurement image M1 of the measurement object is captured after the capture of the first reference image R01 and before the capture of the second reference image R02; the feature region C is detected from the first reference image R01 and from the second reference image R02; and the relative blur between the measurement object 2 and the shape measuring device 1 is detected based on these feature regions C. The relative blur between the measurement object 2 and the shape measuring device 1 can therefore be detected with high accuracy.
<Second Embodiment>

Next, a second embodiment will be described. In this embodiment, a case will be described in which, when the first reference image R01 to the fourth reference image R04 are captured, light having a rectangular-wave intensity distribution serving as a spatial code (hereinafter referred to as a "spatial code pattern") is projected as the reference light. FIGS. 10(a) to 10(d) are diagrams showing the intensity distributions of spatial code patterns projected onto, for example, the measurement object 2.
As shown in FIGS. 10(a) to 10(d), the spatial code pattern QA is striped pattern light having a rectangular-wave intensity distribution in the second direction D2. In the spatial code pattern QA, bright portions (the white portions in FIG. 10) and dark portions (the black portions in FIG. 10) appear alternately. This embodiment is described using four spatial code patterns QA1 to QA4 having rectangular-wave light intensity distributions of different spatial frequencies in the second direction D2. In other words, each spatial code pattern is, for example, a lattice-like pattern combining white and black.

In the first embodiment, images are acquired while shifting the phase of the fringe pattern P, and the three-dimensional shape of the measurement object 2 is obtained by performing phase unwrapping. With this technique, the three-dimensional shape of the measurement object 2 can be obtained accurately when the surface shape of the measurement object 2 varies smoothly. For the following description, a fringe of the fringe pattern P is treated as a pattern whose unit is one period of the sine wave in the sinusoidal intensity distribution of the fringe pattern P. That is, the fringe pattern P contains fringes in the phase ranges 2(m−1)π to 2mπ (where m is an integer), including the fringe with phase 0 to 2π (taking the phase reference as 0) projected at one end of the projection region 200 and the fringe with phase 2π to 4π.

Based on the principle of triangulation as in the first embodiment, suppose, for example, that a convex step exists on the surface of the measurement object 2 and that the change in elevation (the Y1 coordinate in FIG. 1) from the lower surface to the upper surface of the step is larger, within the region on the measurement object 2 corresponding to any pair of adjacent pixels of the imaging device 52, than the elevation difference corresponding to one fringe period (that is, 2π) of the fringe pattern P (in other words, a fringe of the fringe pattern P is displaced in the D1 direction by one period or more). In that case it cannot be determined which phase range of the fringes of the periodically sinusoidal fringe pattern P is projected onto the upper surface of the step (for example, whether a fringe with phase 0 to 2π or a fringe with phase 2π to 4π is projected). That is, with the phase shift method alone, the absolute phase value corresponding to the fringe projected onto the step may not be obtainable.

To avoid this problem, there is a technique that combines the spatial code method with the phase shift method; such a combination is described, for example, in U.S. Patent No. 6,075,605. In this technique, separately from the sinusoidal fringe pattern P, spatial code patterns having several kinds of rectangular-wave intensity distributions are individually projected, and each imaged, over substantially the same region as that onto which the fringe pattern P having the sinusoidal intensity distribution is projected by the phase shift method. This makes it possible to identify each of the fringes of the phase ranges 2(m−1)π to 2mπ (where m is an integer) in the fringe pattern P projected onto the measurement object 2.

In the example shown in FIG. 10, the spatial code pattern QA1 of (a) has eight white lines and eight black lines arranged alternately, the spatial code pattern QA2 of (b) has four white lines and four black lines arranged alternately, and the spatial code pattern QA3 of (c) has two white lines and two black lines arranged alternately. In the spatial code pattern QA4 of (d), the left half is white and the right half is black.
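By way of illustration, binary stripe patterns with this structure (eight, four, and two pairs of lines, then a single half-and-half split) could be generated as in the following sketch; the function name and parameters are assumptions, not part of this disclosure.

    import numpy as np

    def spatial_code_patterns(width, n_bits=4):
        # Bit 0 is the finest pattern (QA1: 8 white and 8 black stripes);
        # the last bit splits the field into two halves (QA4).
        # Returns an array of shape (n_bits, width) with 0 = black, 1 = white.
        x = np.arange(width)
        patterns = []
        for b in range(n_bits):
            n_stripes = 2 ** (n_bits - b)      # 16, 8, 4, 2 stripes
            stripe_w = width // n_stripes
            patterns.append(((x // stripe_w) % 2 == 0).astype(np.uint8))
        return np.stack(patterns)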
FIG. 11 is a diagram schematically showing, in processing order, the images captured in the detection method according to the second embodiment.

As shown in FIG. 11, in this case, after the first measurement image M1 is captured, the control unit 62 outputs an instruction signal to the projection unit 10 to project the spatial code pattern QA1, and outputs an instruction signal to the imaging unit 50 to image the measurement object 2 onto which the spatial code pattern QA1 is projected. The first reference image RA1 is thereby acquired. Similarly, the control unit 62 causes the spatial code patterns QA2 to QA4 to be projected and causes the measurement object 2 onto which these patterns are projected to be imaged, whereby the second reference image RA2 to the fourth reference image RA4 are acquired.
Next, the calculation unit 65 detects feature regions in the first reference image RA1 to the fourth reference image RA4. In this embodiment, the calculation unit 65 sets, as a feature region, part of the image captured in a bright portion of the spatial code patterns QA1 to QA4. Here, a first feature region CA1, which appears in common in the bright portions of the spatial code patterns QA1 and QA2, and a second feature region CA2, which appears in common in the bright portions of the spatial code patterns QA2 to QA4, are used as the feature regions. For convenience of explanation, the first reference image RA1 to the fourth reference image RA4 in FIG. 11 are shown with only the feature regions CA1 and CA2 displayed; in practice, the portions of the measurement object 2 corresponding to the bright portions of the spatial code patterns QA1 to QA4 appear in the images.

Next, the calculation unit 65 detects the relative blur between the first measurement image M1 and the second measurement image M2. For example, the calculation unit 65 obtains rough three-dimensional coordinate data as in the first embodiment and, from the correspondence between that three-dimensional shape and the two-dimensional coordinates of the feature region CA1 in the first reference image RA1 and in the second reference image RA2, obtains the rotation and translation from the shape measuring device 1 to the surface 2f of the measurement object 2 for each reference image. From the rotations and translations thus obtained, the calculation unit 65 then obtains the rotation and translation of the feature region CA1 between the first reference image RA1 and the second reference image RA2 as information on the relative blur between the first measurement image M1 and the second measurement image M2 (in practice, blur caused by movement of the shape measuring device 1). Similarly, the calculation unit 65 obtains the rotation and translation of the feature region CA2 between the second reference image RA2 and the third reference image RA3 as information on the relative blur between the second measurement image M2 and the third measurement image M3, and the rotation and translation of the feature region CA2 between the third reference image RA3 and the fourth reference image RA4 as information on the relative blur between the third measurement image M3 and the fourth measurement image M4. As these examples show, the feature region need not be a portion common to all the reference images. Thereafter, the calculation unit 65 calculates the shape of the measurement object 2 based on the blur detection results and the first measurement image M1 to the fourth measurement image M4; as in the first embodiment, it obtains the phase change amounts δ1, δ2, and δ3 of the fringe pattern P for each pixel and corrects the blur by obtaining the initial phase distribution φ(u,v) that takes these change amounts into account.
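Although this disclosure does not fix a particular estimator, a common way to obtain a rotation R and translation t from corresponding three-dimensional feature points (such as points of a feature region located on the rough three-dimensional shape at two timings) is the Kabsch least-squares fit, sketched below in Python under that assumption; all names are illustrative.

    import numpy as np

    def rigid_transform(src, dst):
        # Least-squares rigid fit dst ~ R @ src + t (Kabsch algorithm).
        # src, dst: (N, 3) arrays of corresponding 3D points.
        c_src, c_dst = src.mean(axis=0), dst.mean(axis=0)
        H = (src - c_src).T @ (dst - c_dst)
        U, _, Vt = np.linalg.svd(H)
        d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
        R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
        t = c_dst - R @ c_src
        return R, t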
In this case, the calculation unit 65 may also associate the spatial code patterns QA1 to QA4 with each fringe pattern P, in which case the spatial code patterns QA1 to QA4 serve both as structured light and as reference light. Each fringe 2(m−1)π to 2mπ (where m is an integer) of the sinusoidal fringe pattern P is thereby identified within each of the regions into which the projection region is divided by the assigned spatial code.
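As an illustrative sketch of this combined use, the wrapped phase from the phase shift method can be promoted to an absolute phase by decoding the fringe order from the spatial code images. The sketch below assumes plain binary coding and a single luminance threshold; a Gray code and per-pixel thresholding would be more robust in practice, and all names are assumptions.

    import numpy as np

    def absolute_phase(wrapped_phase, code_images, threshold):
        # wrapped_phase: per-pixel phase in [0, 2*pi) from the phase shift method.
        # code_images: reference images of the code patterns, finest first.
        # The decoded code k selects the fringe order, so the absolute
        # phase is wrapped_phase + 2*pi*k.
        bits = [(img > threshold).astype(np.int64) for img in code_images]
        k = np.zeros_like(bits[0])
        for b in bits[::-1]:          # coarsest bit is the most significant
            k = (k << 1) | b
        return wrapped_phase + 2.0 * np.pi * k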
As described above, according to the second embodiment, projecting the spatial code patterns QA1 to QA4 as the reference light when capturing the first reference image R01 to the fourth reference image R04 allows the relative blur between the measurement object 2 and the shape measuring device 1 to be detected with high accuracy. Further, according to the second embodiment, since the shape of the measurement object 2 is calculated by correcting the measurement images based on the detected blur, the three-dimensional shape of the measurement object can be measured with high accuracy. In addition, when the three-dimensional shape of the measurement object 2 is measured by combining the phase shift method and the spatial code method, the spatial code patterns QA1 to QA4 can double as the reference light, which shortens the overall imaging time.

Also, in the second embodiment, projecting the coarser of the spatial code patterns QA1 to QA4 after the finer ones makes the feature regions easier to detect as the imaging time elapses. Hand shake by the user can become more likely as the imaging time elapses, and in such a case the measurement error can be detected with high accuracy.
<Modifications>

Next, modifications of the above embodiments will be described.

For example, in the first embodiment described above, the first reference image R01 to the fourth reference image R04 are each captured after the corresponding one of the first measurement image M1 to the fourth measurement image M4, but the invention is not limited to this. For example, as shown in FIG. 12, before the first measurement image M1 of the measurement object 2 onto which the first fringe pattern P1 is projected is captured, a first reference image RB1 of the measurement object 2 onto which the uniform pattern Q is projected may be captured; then, after the first measurement image M1 to the fourth measurement image M4 have been captured, a second reference image RB2 of the measurement object 2 onto which the uniform pattern Q is projected may be captured. In this case, the calculation unit 65 detects the feature regions CB in the first reference image RB1 and the second reference image RB2 and, based on these feature regions CB, detects the relative blur between the measurement object 2 and the shape measuring device 1 by the same technique as described above. Since the time required for projecting the uniform pattern Q and capturing the reference images is shorter than in the first embodiment, the overall imaging time can be shortened.
In the second embodiment described above, the case where the first measurement image M1 to the fourth measurement image M4 and the first reference image RA1 to the fourth reference image RA4 are captured alternately has been described as an example, but the invention is not limited to this. For example, as shown in FIG. 13, reference images may be captured consecutively between the first measurement image M1 and the fourth measurement image M4. In FIG. 13, reference images RC1 and RC2 of the measurement object 2 onto which the spatial code pattern QA1 is projected are shown as the consecutively captured reference images, but the invention is not limited to this. For example, after a reference image of the measurement object 2 onto which a spatial code pattern is projected is captured, a reference image of the measurement object 2 onto which the uniform pattern Q is projected may be captured.

In the above embodiments, the configuration in which the measurement object 2 onto which the structured light is projected and the measurement object 2 onto which the reference light is projected are imaged by the same imaging unit 50 has been described as an example, but the invention is not limited to this. For example, as shown in FIG. 14, a shape measuring device 201 may be provided with a second imaging unit 150 in addition to the imaging unit 50.

In this case, one of the imaging unit 50 and the second imaging unit 150 can be used to image the measurement object 2 onto which the structured light is projected, and the other can be used to image the measurement object 2 onto which the reference light is projected.

In the shape measuring device 201, the projection unit 10 projects the structured light onto the measurement object 2 by, for example, setting the wavelength of the projection light 100D to a predetermined wavelength (e.g., about 680 nm) and scanning in the second direction D2 while varying the light intensity periodically in a sinusoidal manner. In the second imaging unit 150, a filter 153 that blocks light of this predetermined wavelength and transmits light of other wavelengths is arranged between the light receiving optical system 151 and the imaging device 152. Thus, when the structured light is projected onto the measurement object 2, the imaging unit 50 can capture a measurement image of the measurement object 2 onto which the structured light is projected, while in the second imaging unit 150 the structured light is blocked by the filter 153. The second imaging unit 150 can therefore acquire, as a reference image, an image of the measurement object 2 formed by natural light, without the reference light being projected from the projection unit 10.

By imaging the measurement object 2 onto which the structured light is projected and the measurement object 2 onto which the reference light is projected with separate imaging units (the imaging unit 50 and the second imaging unit 150) in this way, the measurement image and the reference image of the measurement object 2 can be acquired simultaneously, and the imaging time can be shortened.

In the above embodiments, the case where the blur is corrected after it is detected has been described as an example, but the invention is not limited to this. For example, after detecting the blur, the calculation unit 65 may determine, according to the degree of blur, whether to proceed with the subsequent correction and three-dimensional shape measurement. In this case, when the degree of blur exceeds a preset threshold, a negative determination can be made for the subsequent correction and three-dimensional shape measurement. When the determination is negative, the calculation unit 65 may output a predetermined signal to warn the user, and the user who has received the warning may measure the shape of the measurement object again.
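A minimal sketch of such a gate, assuming the detected blur is expressed as a rotation R and translation t and that the threshold values are chosen by the implementer (none of these names come from this disclosure):

    import numpy as np

    def blur_within_tolerance(R, t, max_angle_rad, max_shift):
        # Returns False when the blur is too large, i.e. the device should
        # warn the user and the measurement should be repeated.
        angle = np.arccos(np.clip((np.trace(R) - 1.0) / 2.0, -1.0, 1.0))
        return angle <= max_angle_rad and np.linalg.norm(t) <= max_shift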
In the first embodiment described above, the case where the first change information, the second change information, and the third change information are each detected using the first reference image R01 to the fourth reference image R04 has been described as an example, but the invention is not limited to this. For example, at least two of the first change information, the second change information, and the third change information may be detected, and the remaining change information may be derived from them. The processing time can thereby be shortened.

In the above embodiments, the case where the region corresponding to the corner 2a of the measurement object 2 and its surroundings is set in advance as the feature region has been described as an example, but the invention is not limited to this. Any other portion common to a plurality of reference images may be used as the feature region. Also, although a configuration in which the feature region includes three unit regions has been described as an example, the invention is not limited to this; the feature region may include, for example, two unit regions, one unit region, or four or more unit regions.

Moreover, the feature region need not be set in advance; for example, the calculation unit 65 may detect the feature region automatically. As an example, after a plurality of reference images are captured, the calculation unit 65 may find a pattern commonly contained in the plurality of reference images by a pattern matching method or the like. In this case, a closed shape such as an L-shaped portion or a cross-shaped portion can be selected as the pattern. Marks arranged on or around the measurement object 2 for the overlapping processing described later (e.g., circular stickers or QR code marks) may also be used as feature regions. The positions and shapes of such marks are set in advance, so when marks are used as feature regions, the feature regions are set in advance. Furthermore, in the above embodiments the feature region is, for example, a region on the image of the measurement object 2, but the invention is not limited to this; a part of the measurement object 2 itself may be used as the feature region.
<Structure Manufacturing System and Structure Manufacturing Method>

FIG. 15 is a block diagram showing an example of an embodiment of a structure manufacturing system. The structure manufacturing system SYS shown in FIG. 15 includes the shape measuring device 1 (or the shape measuring device 201) described above, a design device 710, a molding device 720, a control device (inspection device) 730, and a repair device 740.
The design device 710 creates design information relating to the shape of a structure and transmits the created design information to the molding device 720 and the control device 730. Here, the design information is information indicating the coordinates of each position of the structure, and the measurement object is the structure.

The molding device 720 molds the structure based on the design information transmitted from the design device 710. The molding process of the molding device 720 includes casting, forging, cutting, or the like. The shape measuring device 1 or 201 measures the three-dimensional shape of the structure (measurement object 2) produced by the molding device 720, that is, the coordinates of the structure, and transmits information indicating the measured coordinates (hereinafter referred to as shape information) to the control device 730.

The control device 730 includes a coordinate storage unit 731 and an inspection unit 732. The coordinate storage unit 731 stores the design information transmitted from the design device 710. The inspection unit 732 reads the design information from the coordinate storage unit 731 and compares it with the shape information transmitted from the shape measuring device 1 or 201. Based on the comparison result, the inspection unit 732 inspects whether the structure has been molded in accordance with the design information.

The inspection unit 732 also determines whether the molded structure is a non-defective product. Whether the structure is non-defective is determined, for example, by whether the error between the design information and the shape information falls within a predetermined threshold range. When the structure has not been molded in accordance with the design information, the inspection unit 732 determines whether the structure can be repaired so as to conform to the design information. When it determines that repair is possible, the inspection unit 732 calculates the defective portions and the repair amounts based on the comparison result, and transmits information indicating the defective portions (hereinafter referred to as defective portion information) and information indicating the repair amounts (hereinafter referred to as repair amount information) to the repair device 740.
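A minimal sketch of such a tolerance check, assuming for illustration a point-by-point correspondence between the design coordinates and the measured coordinates (a real inspection would first register the measured point cloud to the design model; all names are assumptions):

    import numpy as np

    def inspect(design_pts, measured_pts, tol):
        # design_pts, measured_pts: (N, 3) arrays of corresponding coordinates.
        # Returns whether the structure passes, plus the defective indices
        # and their deviations, analogous to the defective portion and
        # repair amount information.
        dev = np.linalg.norm(measured_pts - design_pts, axis=1)
        defect = np.nonzero(dev > tol)[0]
        return defect.size == 0, defect, dev[defect]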
The repair device 740 processes the defective portions of the structure based on the defective portion information and the repair amount information transmitted from the control device 730.

FIG. 16 is a flowchart showing processing by the structure manufacturing system SYS, and shows an example of an embodiment of a structure manufacturing method. As shown in FIG. 16, the design device 710 creates design information relating to the shape of a structure (step S31) and transmits it to the molding device 720 and the control device 730. The control device 730 receives the design information transmitted from the design device 710 and stores it in the coordinate storage unit 731.

Next, the molding device 720 molds the structure based on the design information created by the design device 710 (step S32). The shape measuring device 1 or 201 then measures the three-dimensional shape of the structure molded by the molding device 720 (step S33) and transmits the shape information, which is the measurement result for the structure, to the control device 730. Next, the inspection unit 732 compares the shape information transmitted from the shape measuring device 1 or 201 with the design information stored in the coordinate storage unit 731, and inspects whether the structure has been molded in accordance with the design information (step S34).

Next, the inspection unit 732 determines whether the structure is a non-defective product (step S35). When it determines that the structure is non-defective (step S35: YES), the processing by the structure manufacturing system SYS ends. When the inspection unit 732 determines that the structure is not non-defective (step S35: NO), it determines whether the structure can be repaired (step S36).

When the inspection unit 732 determines that the structure can be repaired (step S36: YES), it calculates the defective portions and the repair amounts of the structure based on the comparison result of step S34 and transmits the defective portion information and the repair amount information to the repair device 740. The repair device 740 repairs (reworks) the structure based on the defective portion information and the repair amount information (step S37), and the processing returns to step S33; that is, the processing from step S33 onward is executed again for the structure repaired by the repair device 740. On the other hand, when the inspection unit 732 determines that the structure cannot be repaired (step S36: NO), the processing by the structure manufacturing system SYS ends.

As described above, in the structure manufacturing system SYS and the structure manufacturing method, the inspection unit 732 determines, based on the measurement results for the structure obtained by the shape measuring device 1 or 201, whether the structure has been produced in accordance with the design information. This makes it possible to determine accurately whether the structure produced by the molding device 720 is a non-defective product and to shorten the time required for that determination. In the structure manufacturing system SYS described above, when the inspection unit 732 determines that the structure is not non-defective, repair of the structure can be performed immediately.

In the structure manufacturing system SYS and the structure manufacturing method described above, the molding device 720 may be configured to execute the machining again instead of the repair device 740 executing it.
Although the present invention has been described using the embodiments, the technical scope of the present invention is not limited to the scope described in the above embodiments. Various changes or improvements can be made to the above embodiments without departing from the spirit of the present invention, and one or more of the requirements described in the above embodiments may be omitted. Such changed, improved, or omitted forms are also included in the technical scope of the present invention. The configurations of the above embodiments and modifications may also be applied in appropriate combinations. To the extent permitted by law, the disclosures of all published publications and U.S. patents relating to the X-ray apparatus and the like cited in the above embodiments and modifications are incorporated herein by reference and made a part of this description.

For example, in each of the embodiments and modifications described above, the first direction D1 and the second direction D2 are orthogonal, but they need not be orthogonal as long as they are different directions. For example, the second direction D2 may be set at an angle of 60 degrees or 80 degrees with respect to the first direction D1.

In each of the embodiments and modifications described above, one or more optical elements are shown in each drawing; however, unless a particular number is specified, any number of optical elements may be used as long as the same optical performance is obtained.

In each of the embodiments and modifications described above, the light with which the light generation unit 20 and the like generate the structured light 101 and the reference light 102 may be light of a wavelength in the visible region, the infrared region, or the ultraviolet region. Using light of a wavelength in the visible region allows the user to recognize the projection region 200, and within the visible region, using a red wavelength can reduce damage to the measurement object 2.

In each of the embodiments and modifications described above, the scanning unit 40 uses an optical element that reflects the structured light, but the invention is not limited to this. For example, a diffractive optical element, a refractive optical element, a parallel flat glass plate, or the like may be used, and the structured light may be scanned by vibrating a refractive optical element such as a lens with respect to the optical axis. Some of the optical elements of the projection optical system 30 may be used as this refractive optical element.

In each of the embodiments and modifications described above, the CCD camera 52a is used as the imaging unit 50, but the invention is not limited to this. For example, an image sensor such as a CMOS (Complementary Metal Oxide Semiconductor) image sensor may be used instead of the CCD camera.

In each of the embodiments and modifications described above, the 4-bucket method, in which the phase of the fringe pattern P used in the phase shift method is shifted four times during one period, is used, but the invention is not limited to this. For example, a 5-bucket method in which one period 2π of the phase of the fringe pattern P is divided into five, or a 6-bucket method in which it is divided into six, may be used.
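For uniform shifts of 2π/N, the wrapped phase follows from the standard N-step phase shift formula, which covers the 4-, 5-, and 6-bucket variants alike; the Python sketch below states that known formula, with the function name and interface assumed for illustration.

    import numpy as np

    def n_bucket_phase(images):
        # images: sequence of N per-pixel intensity arrays I_n sampled at
        # shifts theta_n = 2*pi*n/N. For I_n = A + B*cos(phi + theta_n):
        #   sum_n I_n*sin(theta_n) = -(N/2)*B*sin(phi)
        #   sum_n I_n*cos(theta_n) =  (N/2)*B*cos(phi)
        I = np.stack([np.asarray(im, dtype=float) for im in images])
        N = I.shape[0]
        th = (2 * np.pi * np.arange(N) / N).reshape((-1,) + (1,) * (I.ndim - 1))
        num = np.sum(I * np.sin(th), axis=0)
        den = np.sum(I * np.cos(th), axis=0)
        return np.arctan2(-num, den)    # wrapped phase phi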
Although the phase shift method is used in each of the embodiments and modifications described above, the three-dimensional shape of the measurement object 2 may instead be measured using only the spatial code method described in the second embodiment.

In the second embodiment described above, the fringe pattern P is imaged before the spatial code pattern QA, but this order may be reversed.

In each of the embodiments and modifications described above, the fringe pattern P and the spatial code pattern QA are shown in white and black, but the invention is not limited to this; either or both may be monochromatic. For example, the fringe pattern P and the spatial code pattern QA may be generated in white and red.

In each of the embodiments and modifications described above, when the first measurement image M1 to the fourth measurement image M4 are captured, the blur between the first measurement image M1 and the second measurement image M2, the blur between the second measurement image M2 and the third measurement image M3, and the blur between the third measurement image M3 and the fourth measurement image M4 are detected, but the invention is not limited to this. For example, when the first reference image R01 to the fourth reference image R04 are captured, the blur between the first reference image R01 and the second reference image R02, the blur between the second reference image R02 and the third reference image R03, and the blur between the third reference image R03 and the fourth reference image R04 may be detected.

In the embodiments and modifications described above, the configuration in which the projection unit 10, the imaging unit 50, the arithmetic processing unit 60, and the display device 70 are housed in a portable housing 80 has been described as an example, but the invention is not limited to this. For example, the arithmetic processing unit 60 and the display device 70 need not be arranged in the housing 80 and may be installed outside it; in that case, for example, a personal computer (including notebook and desktop types) can serve as the arithmetic processing unit 60 and the display device 70.

Not all functions of the arithmetic processing unit 60 need be housed in the portable housing; some functions of the arithmetic processing unit 60 (at least part of the calculation unit, the image storage unit, the display control unit, and the setting information storage unit) may be provided in an external computer. The present invention is also not limited to the portable shape measuring devices 1 and 201 and can be applied to stationary shape measuring apparatuses, such as a measuring machine in which a three-dimensional measuring unit is provided on an articulated arm, or a measuring machine in which the three-dimensional measuring unit is movable over a stage on which the measurement object 2 is placed.

Even in such cases, as in the embodiments described above, there is no need to synchronize the reciprocating vibration of the MEMS mirror with the light intensity emitted from the laser diode, so complicated and sophisticated synchronization control is unnecessary. When a shape measuring device in which the projection unit 10, the imaging unit 50, the arithmetic processing unit 60, and the display device 70 are housed in a portable housing is carried around, the external measurement environment (temperature, humidity, atmospheric pressure, and the like) is particularly likely to change; nevertheless, the shape of the measurement object 2 can be measured with high accuracy even when the external environment changes.

In each of the embodiments and modifications described above, when the measurement object 2 does not fit within the imaging field of view of the imaging unit 50, different portions of the measurement object 2 may each be measured and the measurement results joined together to measure the three-dimensional shape of the entire measurement object 2. When joining the measurement results, overlapping processing, in which parts of the measurement results are superimposed, may be used. This overlapping processing is performed by the calculation unit 65.

The overlapping processing is as follows. First, a first portion of the measurement object 2 is imaged by the imaging unit 50. Next, a second portion that partially overlaps the first portion of the measurement object 2 is imaged by the imaging unit 50. The calculation unit 65 calculates a three-dimensional shape for each of the first portion and the second portion, searches for the portion where the first portion and the second portion overlap, and joins the three-dimensional shapes of the first portion and the second portion by superimposing this overlapping portion. The calculation unit 65 determines the overlapping portion by searching for pixels of a predetermined region whose coordinate data is common to the three-dimensional shape of the first portion and the three-dimensional shape of the second portion.
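A minimal sketch of the joining step, assuming the overlap search has already produced index pairs of corresponding points and reusing the rigid_transform() function from the earlier sketch (again, all names are illustrative):

    import numpy as np

    def stitch(first_cloud, second_cloud, pairs):
        # first_cloud, second_cloud: (N, 3) point clouds of the two portions.
        # pairs: (M, 2) indices of points judged to lie in the overlap.
        src = second_cloud[pairs[:, 1]]
        dst = first_cloud[pairs[:, 0]]
        R, t = rigid_transform(src, dst)           # map second onto first
        second_aligned = second_cloud @ R.T + t
        return np.vstack([first_cloud, second_aligned])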
By repeating such processing until the entire measurement object 2 has been imaged, the three-dimensional shape of the entire measurement object 2 is measured. Accordingly, even when the measurement object 2 is a large object, the three-dimensional shape of the entire measurement object 2 can be measured easily.

Part of the configuration of the shape measuring device 1 may also be realized by a computer; for example, the arithmetic processing unit 60 may be realized by a computer. In this case, in accordance with a shape measurement program stored in a storage unit, the computer executes: processing for capturing a first reference image of the measurement object 2; processing for capturing a second reference image of the measurement object 2; processing for capturing, simultaneously with or after the capture of the first reference image and before the capture of the second reference image, a measurement image of the measurement object 2 onto which the fringe pattern P is projected; processing for detecting a feature region from the first reference image; processing for detecting the feature region from the second reference image; processing for detecting the relative blur between the measurement object 2 and the shape measuring device 1 based on the feature region in the first reference image and the feature region in the second reference image; and processing for calculating the shape of the measurement object 2 based on the measurement image of the measurement object 2 and the blur.

The requirements of the embodiments described above can be combined as appropriate, and some components may not be used. In addition, to the extent permitted by law, the disclosures of all published publications and U.S. patents relating to the detection methods, shape measurement methods, shape measuring devices, and the like cited in the above embodiments and modifications are incorporated herein by reference and made a part of this description.

C, CA ... feature region; D1 ... first direction; D2 ... second direction; SYS ... structure manufacturing system; P ... fringe pattern; Q ... uniform pattern; QA ... spatial code pattern; M1 to M4 ... first measurement image to fourth measurement image; R01 to R04, RA1 to RA4 ... first reference image to fourth reference image; 1 ... shape measuring device (shape measuring device, measuring machine); 2 ... measurement object; 10 ... projection unit; 20 ... light generation unit; 40 ... scanning unit; 50 ... imaging unit; 60 ... arithmetic processing unit; 62 ... control unit; 62a ... first control unit; 62b ... second control unit; 65 ... calculation unit; 70 ... display device; 100 ... projection light; 101 ... structured light; 102 ... reference light

Claims (21)

  1.  A detection method for detecting relative blur between a measuring machine that measures the shape of a measurement object and the measurement object, the method comprising:
     capturing a first reference image of the measurement object with the measuring machine;
     capturing a second reference image of the measurement object with the measuring machine;
     capturing, with the measuring machine, an image of the measurement object onto which structured light for shape measurement is projected from the measuring machine, simultaneously with or after the capture of the first reference image and before the capture of the second reference image;
     detecting a feature region from the first reference image;
     detecting the feature region from the second reference image; and
     detecting relative blur between the measurement object and the measuring machine based on the feature region in the first reference image and the feature region in the second reference image.
  2.  The detection method according to claim 1, wherein the feature region includes a plurality of regions.
  3.  The detection method according to claim 1 or 2, wherein
     the first reference image is captured by projecting first reference light onto the measurement object, and
     the second reference image is captured by projecting second reference light onto the measurement object.
  4.  The detection method according to claim 3, wherein the first reference light and the second reference light are light whose intensity distribution is set to be uniform, or light set so as to include light having a rectangular-wave intensity distribution serving as a spatial code.
  5.  The detection method according to any one of claims 1 to 4, wherein the structured light is light set so as to include light having a sinusoidal intensity distribution or light having a rectangular-wave intensity distribution serving as a spatial code.
  6.  The detection method according to any one of claims 1 to 5, wherein the blur is at least one of a relative position change and a relative attitude change between the measurement object and the measuring machine.
  7.  The detection method according to any one of claims 1 to 6, wherein capturing the image of the measurement object includes:
     projecting, as the structured light, first structured light having a predetermined intensity distribution and capturing a first image of the measurement object; and
     after the capture of the first image, projecting, as the structured light, second structured light having an intensity distribution different from that of the first structured light and capturing a second image of the measurement object.
  8.  The detection method according to claim 7, wherein
     the first reference image is captured simultaneously with or before the capture of the first image,
     the second reference image is captured simultaneously with or before the capture of the second image,
     the method includes capturing a third reference image of the measurement object with the measuring machine after the capture of the second image, and
     the detection of the blur includes: detecting first change information on at least one of a relative position change and a relative attitude change between the measurement object and the measuring machine based on the feature region in the first reference image and the second reference image; detecting second change information on at least one of a relative position change and a relative attitude change between the measurement object and the measuring machine based on the feature region in the second reference image and the third reference image; and detecting the blur based on the first change information and the second change information.
  9.  A shape measurement method comprising:
     capturing, with a measuring machine, an image of a measurement object onto which structured light for shape measurement is projected from the measuring machine;
     detecting relative blur between the measurement object and the measuring machine by the detection method according to any one of claims 1 to 8; and
     calculating the shape of the measurement object based on the image of the measurement object and the blur.
  10.  The shape measurement method according to claim 9, comprising:
     calculating phase information from the image of the measurement object; and
     correcting the calculated phase information based on the blur,
     wherein the shape of the measurement object is measured based on the corrected phase information.
  11.  A shape measuring device that measures the shape of a measurement object, the device comprising:
     a projection unit that projects at least structured light for shape measurement onto the measurement object;
     an imaging unit that captures an image of the measurement object;
     a control unit that causes the imaging unit to capture a first reference image of the measurement object, causes the imaging unit to capture a second reference image of the measurement object, and, simultaneously with or after the capture of the first reference image and before the capture of the second reference image, causes the projection unit to project the structured light for shape measurement onto the measurement object and causes the imaging unit to capture an image of the measurement object onto which the structured light is projected; and
     a calculation unit that detects a feature region from the first reference image, detects the feature region from the second reference image, detects relative blur between the measurement object and the shape measuring device based on the feature region in the first reference image and the feature region in the second reference image, and calculates the shape of the measurement object based on the image of the measurement object and the blur.
  12.  The shape measuring device according to claim 11, wherein the feature region includes a plurality of regions.
  13.  The shape measuring device according to claim 11 or 12, wherein
     the projection unit is capable of projecting first reference light and second reference light onto the measurement object, and
     the control unit causes the first reference image to be captured by projecting the first reference light onto the measurement object, and causes the second reference image to be captured by projecting the second reference light onto the measurement object.
  14.  The shape measuring device according to claim 13, wherein the first reference light and the second reference light are light whose intensity distribution is set to be uniform, or light set so as to include light having a rectangular-wave intensity distribution serving as a spatial code.
  15.  The shape measuring device according to any one of claims 11 to 14, wherein the structured light is light set so as to include light having a sinusoidal intensity distribution or light having a rectangular-wave intensity distribution serving as a spatial code.
  16.  The shape measuring device according to any one of claims 11 to 15, wherein the blur is at least one of a relative position change and a relative attitude change between the measurement object and the shape measuring device.
  17.  The shape measuring device according to any one of claims 11 to 16, wherein, as the capture of the image of the measurement object, the control unit:
     causes a first image of the measurement object to be captured by projecting, as the structured light, first structured light having a predetermined intensity distribution; and
     after the capture of the first image, causes a second image of the measurement object to be captured by projecting, as the structured light, second structured light having an intensity distribution different from that of the first structured light.
  18.  The shape measuring device according to claim 17, wherein
     the control unit causes the first reference image to be captured simultaneously with or before the capture of the first image, causes the second reference image to be captured simultaneously with or before the capture of the second image, and causes a third reference image of the measurement object to be captured after the capture of the second image, and
     the calculation unit detects first change information on at least one of a relative position change and a relative attitude change between the measurement object and the shape measuring device based on the feature region in the first reference image and the second reference image, detects second change information on at least one of a relative position change and a relative attitude change between the measurement object and the shape measuring device based on the feature region in the second reference image and the third reference image, and detects the blur based on the first change information and the second change information.
  19.  A structure manufacturing system comprising:
     a design device that creates design information on the shape of a structure;
     a molding device that produces the structure based on the design information;
     the shape measuring device according to any one of claims 11 to 18, which measures the shape of the produced structure; and
     an inspection device that compares shape information on the shape of the structure obtained by the shape measuring device with the design information.
  20.  A structure manufacturing method comprising:
     creating design information on the shape of a structure;
     producing the structure based on the design information;
     measuring the shape of the produced structure by the shape measurement method according to claim 9 or 10; and
     comparing shape information on the shape of the structure obtained by the shape measurement method with the design information.
  21.  A shape measurement program that causes a computer included in a shape measuring device, which measures the shape of a measurement object by capturing an image of the measurement object onto which structured light for shape measurement is projected, to execute:
     a process of capturing a first reference image of the measurement object;
     a process of capturing a second reference image of the measurement object;
     a process of capturing, simultaneously with or after the capture of the first reference image and before the capture of the second reference image, an image of the measurement object onto which the structured light for shape measurement is projected;
     a process of detecting a feature region from the first reference image;
     a process of detecting the feature region from the second reference image;
     a process of detecting relative blur between the measurement object and the shape measuring device based on the feature region in the first reference image and the feature region in the second reference image; and
     a process of calculating the shape of the measurement object based on the image of the measurement object and the blur.
PCT/JP2014/070460 2014-08-04 2014-08-04 Detection method, shape measurement method, shape measurement device, structure production method, structure production system, and shape measurement program WO2016020966A1 (en)

Priority Applications (1)

Application Number: PCT/JP2014/070460 (WO2016020966A1, en); Priority Date: 2014-08-04; Filing Date: 2014-08-04; Title: Detection method, shape measurement method, shape measurement device, structure production method, structure production system, and shape measurement program

Publications (1)

Publication Number: WO2016020966A1

Family ID: 55263274

Country Status (1): WO, WO2016020966A1 (en)

Patent Citations (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2012239829A (en) * 2011-05-24 2012-12-10 Olympus Corp Endoscope device and image acquisition method
JP2014035198A (en) * 2012-08-07 2014-02-24 Nikon Corp Shape measurement device, structure manufacturing system, shape measurement method, structure manufacturing method, and shape measurement program

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2019012534A1 (en) 2017-07-12 2019-01-17 Guardian Optical Technologies Ltd. Visual, depth and micro-vibration data extraction using a unified imaging device
EP3652703A4 (en) * 2017-07-12 2021-04-28 Guardian Optical Technologies Ltd. Visual, depth and micro-vibration data extraction using a unified imaging device
US11182915B2 (en) 2017-07-12 2021-11-23 Gentex Corporation Visual, depth and micro-vibration data extraction using a unified imaging device
US11706377B2 (en) 2017-07-12 2023-07-18 Gentex Corporation Visual, depth and micro-vibration data extraction using a unified imaging device

Legal Events

121  Ep: the epo has been informed by wipo that ep was designated in this application. Ref document number: 14899346; Country of ref document: EP; Kind code of ref document: A1
NENP  Non-entry into the national phase. Ref country code: DE
NENP  Non-entry into the national phase. Ref country code: JP
122  Ep: pct application non-entry in european phase. Ref document number: 14899346; Country of ref document: EP; Kind code of ref document: A1