WO2013035847A1 - Shape measurement device, structure manufacturing system, shape measurement method, structure manufacturing method, shape measurement program, and computer-readable recording medium - Google Patents


Info

Publication number
WO2013035847A1
Authority
WO
WIPO (PCT)
Prior art keywords
image
captured
feature amount
captured image
unit
Prior art date
Application number
PCT/JP2012/072913
Other languages
French (fr)
Japanese (ja)
Inventor
青木 洋
Original Assignee
Nikon Corporation (株式会社ニコン)
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Nikon Corporation (株式会社ニコン)
Publication of WO2013035847A1

Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01B: MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00: Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object
    • G01B 11/2513: Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, with several lines being projected in more than one direction, e.g. grids, patterns
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/50: Depth or shape recovery
    • G06T 7/521: Depth or shape recovery from laser ranging, e.g. using interferometry; from the projection of structured light
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/10: Image acquisition modality
    • G06T 2207/10016: Video; Image sequence
    • G06T 2207/10141: Special mode during image acquisition
    • G06T 2207/10152: Varying illumination
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30168: Image quality inspection
    • G06T 2207/30244: Camera pose

Definitions

  • The present invention relates to a shape measuring device, a structure manufacturing system, a shape measuring method, a structure manufacturing method, a shape measuring program, and a computer-readable recording medium.
  • A pattern projection type shape measuring apparatus using the phase shift method is known (for example, see Patent Document 1).
  • In such an apparatus, a grating pattern having a sinusoidal intensity distribution is projected onto a measurement object, and the measurement object is repeatedly imaged while the phase of the grating pattern is shifted at a constant pitch.
  • From these images, the phase distribution (phase image) of the grating pattern deformed according to the surface shape of the measurement object is obtained; the phase image is unwrapped (phase connection) and then converted into a height distribution (height image) of the measurement object.
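By way of illustration (this sketch is not part of the patent text), the 4-bucket variant of the phase shift method recovers the wrapped phase at each pixel from four fringe images whose grating initial phases differ by 90 degrees:

```python
import math

def recover_phase(i0, i1, i2, i3):
    """Wrapped phase from four fringe images with initial phases shifted
    by 90 degrees: I_k = A + B*cos(phi + k*pi/2).
    Then I3 - I1 = 2B*sin(phi) and I0 - I2 = 2B*cos(phi), so
    phi = atan2(I3 - I1, I0 - I2), wrapped to (-pi, pi]."""
    return [math.atan2(d3 - d1, d0 - d2)
            for d0, d1, d2, d3 in zip(i0, i1, i2, i3)]

# Synthetic check: a known phase profile is recovered from the four images.
true_phi = [-2.0, -0.5, 0.0, 1.0, 2.5]
images = [[100 + 40 * math.cos(p + k * math.pi / 2) for p in true_phi]
          for k in range(4)]
recovered = recover_phase(*images)
assert all(abs(a - b) < 1e-9 for a, b in zip(recovered, true_phi))
```

The recovered phase image is then phase-connected (unwrapped) and converted into a height image, as the passage describes.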
  • The present invention has been made in view of this situation, and provides a shape measuring device, a structure manufacturing system, a shape measuring method, a structure manufacturing method, and a shape measuring program that determine whether or not blur is present in a captured image used for measuring the shape of a measurement target.
  • One aspect of the present invention is a shape measuring apparatus comprising: an imaging unit that generates a captured image of a measurement target; an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the imaging direction of the imaging unit, so that the image captured by the imaging unit shows a grating pattern projected on the measurement target; a feature amount calculation unit that calculates, from the captured image, a feature amount indicating the degree of blur present in the captured image; and a determination unit that determines, based on the feature amount, whether or not blur is present in the captured image.
  • Another aspect of the present invention is a structure manufacturing system comprising: a design apparatus that creates design information on the shape of a structure; a molding apparatus that produces the structure based on the design information; the above-described shape measuring device, which measures the shape of the produced structure based on a captured image; and an inspection device that compares the shape information obtained by the measurement with the design information.
  • Another aspect of the present invention is a shape measuring method for a shape measuring device comprising an imaging unit that generates a captured image of a measurement target and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the imaging direction, so that the captured image shows a grating pattern projected on the measurement target; the method comprises calculating, from the captured image, a feature amount indicating the degree of blur present in the captured image, and determining, based on the feature amount, whether or not blur is present in the captured image.
  • Another aspect of the present invention is a structure manufacturing method comprising: creating design information on the shape of a structure; producing the structure based on the design information; measuring the shape of the produced structure based on a captured image using the shape measurement method described above; and comparing the shape information obtained by the measurement with the design information.
  • Another aspect of the present invention is a shape measurement program that causes the computer of a shape measuring apparatus, comprising an imaging unit that generates a captured image of a measurement target and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the imaging direction so that the captured image shows a grating pattern projected on the measurement target, to execute a step of calculating, from the captured image, a feature amount indicating the degree of blur present in the captured image, and a step of determining, based on the feature amount, whether or not blur is present in the captured image.
  • FIG. 1 is a block diagram showing a configuration of a shape measuring apparatus 10 according to the first embodiment of the present invention.
  • The shape measuring apparatus 10 includes an operation input unit 11, a display unit 12, an imaging unit 13, an irradiation unit 14, a storage unit 15, a feature amount calculation unit 16, a determination unit 17, and a point group calculation unit 18, and is configured as a computer terminal that measures the three-dimensional shape of the measurement object A by the phase shift method.
  • The shape measuring apparatus 10 detects whether or not blur is present in the captured images when capturing a plurality of phase-shifted images by the phase shift method.
  • Possible causes of blurring include, for example, movement of the measurement object during imaging, and camera shake when the user operates the portable shape measuring apparatus 10 by hand without fixing it on a tripod or the like.
  • Blurring that affects the measurement results can take two forms: blur in which the imaging range differs between multiple captured images, and blur caused by the measurement target moving or the camera shaking during the exposure time of a single captured image. These blurs may also occur simultaneously.
  • In the phase shift method, images in which a plurality of grating patterns with different initial phases, based on the N-bucket method, are projected onto the measurement object are captured, and the shape of the measurement object is measured from the luminance values of the same pixel in each image.
  • Normally the shape is measured from the fringe images with different initial phases, but in this embodiment a grating pattern having the same initial phase as at least one of the already-projected patterns is projected again, and the captured images of the grating patterns having the same initial phase are compared. In this way, the similarity of two or more captured images, taken at different timings, in which the same initial-phase grating pattern appears is calculated.
  • If the similarity is equal to or greater than a threshold, it can be determined that no blur occurred between the imaging timings; if it is below the threshold, it can be determined that blur occurred between the imaging timings.
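Formula (1) for the similarity is not reproduced in this text. As one plausible stand-in (an illustrative sketch, not the patent's own formula), zero-mean normalized cross-correlation gives a value near 1.0 for blur-free repeat shots of the same initial phase; the 0.95 threshold below is purely illustrative:

```python
import math

def similarity(img_a, img_b):
    """Zero-mean normalized cross-correlation of two images given as flat
    pixel lists; 1.0 means identical up to brightness and contrast."""
    ma = sum(img_a) / len(img_a)
    mb = sum(img_b) / len(img_b)
    da = [p - ma for p in img_a]
    db = [p - mb for p in img_b]
    den = math.sqrt(sum(x * x for x in da) * sum(y * y for y in db))
    return sum(x * y for x, y in zip(da, db)) / den if den else 0.0

def no_blur_between_shots(shot1, shot2, threshold=0.95):
    # Movement between the two same-initial-phase shots lowers the
    # correlation, so a value below the threshold flags blur.
    return similarity(shot1, shot2) >= threshold
```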
  • The shape measuring apparatus 10 determines whether such blur occurred during imaging and issues a warning if it did. This allows, for example, imaging to be performed again when blurring occurs.
  • the operation input unit 11 receives an operation input from a user.
  • the operation input unit 11 includes operation members such as a power button for switching on and off of the main power supply and a release button for receiving an instruction to start an imaging process.
  • The operation input unit 11 can also receive input via a touch screen.
  • The display unit 12 is a display that shows various types of information. For example, when the determination unit 17 (described later) determines that blur is present in a captured image, the display unit 12 displays a warning to that effect. The display unit 12 also displays, for example, the point cloud data indicating the three-dimensional shape of the measurement target calculated by the point group calculation unit 18.
  • the imaging unit 13 generates a captured image obtained by imaging the measurement target, and performs an imaging process for storing the generated captured image in the storage unit 15.
  • the imaging unit 13 operates in conjunction with the irradiation unit 14 and performs an imaging process in accordance with the timing at which the illumination light is projected onto the measurement target by the irradiation unit 14.
  • the imaging unit 13 generates a plurality of captured images obtained by capturing, for each initial phase, an image in which a plurality of lattice patterns having different initial phases based on the N bucket method are projected onto the measurement target by the irradiation unit 14.
  • When the determination unit 17 determines that blur may be present in the captured images, the user is warned of that possibility.
  • In response, the user can make the shape measuring apparatus 10 again acquire images in which a plurality of grating patterns with different initial phases, based on the N-bucket method, are projected onto the measurement object. If imaging is repeated until no blur warning is issued, shape measurement with reduced blurring can be performed.
  • Alternatively, the shape measuring apparatus 10 may automatically continue capturing images in which the plurality of grating patterns with different initial phases based on the N-bucket method are projected onto the measurement target until blur is no longer determined; in this case the measurement object irradiated with the illumination light is imaged again to generate new captured images.
  • When the imaging unit 13 images the measurement target, the irradiation unit 14 irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the imaging direction of the imaging unit 13, so that an image in which the grating pattern is projected onto the measurement target is captured.
  • Specifically, the irradiation unit 14 irradiates illumination light so that images in which a plurality of grating patterns, each having a spatial frequency of a certain period and initial phases differing by 90 degrees, are projected onto the measurement object based on the N-bucket method can be captured sequentially.
  • For example, as shown in FIG. 2, the irradiation unit 14 has a light source 1, and a collimator lens 2 and a cylindrical lens 3 that convert the light from the light source 1 into a linear intensity distribution whose longitudinal direction is orthogonal to the light irradiation direction. The light beam having this linear intensity distribution is swept across the measurement object, in the direction perpendicular to the beam's longitudinal direction, by a scanning mirror 4 (a MEMS (Micro Electro Mechanical Systems) mirror).
  • The light source 1 is provided with a light source control unit 5 that controls the intensity of the emitted light; the light source control unit 5 drives the scanning mirror 4 sequentially while modulating the intensity of the laser light.
  • That is, the intensity distribution of the laser light emitted from the light source 1 is shaped into a linear distribution in one direction perpendicular to the optical axis, and this linear beam is scanned, while its intensity is varied, in the direction perpendicular to both the optical axis and the beam's longitudinal direction.
  • As a result, pattern light with a striped pattern (sinusoidal grating) whose luminance varies sinusoidally in one direction is formed. In other words, the light of the light source is swept by the MEMS mirror in a direction perpendicular to the optical axis while its intensity is modulated sinusoidally, so that the grating-pattern illumination light is projected onto the measurement target.
  • Although laser light is projected here using MEMS technology, the illumination light can also be projected using, for example, a liquid crystal projector.
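As a minimal sketch of the drive signal (the names and normalization are assumptions; the actual control is hardware), the light source control unit would command one sinusoidally modulated intensity per mirror position, yielding one row of the fringe pattern per sweep:

```python
import math

def fringe_row(width_px, period_px, initial_phase_deg):
    """Intensity (0..1) commanded at each of width_px mirror positions
    during one sweep, producing a sinusoidal grating of the given
    period and initial phase."""
    phi0 = math.radians(initial_phase_deg)
    return [0.5 + 0.5 * math.cos(2 * math.pi * x / period_px + phi0)
            for x in range(width_px)]

# The four 90-degree-shifted patterns of the 4-bucket method (A..D).
patterns = {deg: fringe_row(640, 32, deg) for deg in (0, 90, 180, 270)}
```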
  • FIG. 3 shows examples of the measurement target onto which the irradiation unit 14 projects the illumination light with the initial phase shifted by 90 degrees: A has an initial phase of 0 degrees, B an initial phase shifted 90 degrees from A, C an initial phase shifted 180 degrees from A, and D an initial phase shifted 270 degrees from A.
  • In the 5-bucket method, five captured images A to E with the initial phase shifted by a constant angle are generated; in the 7-bucket method, seven captured images A to G are generated.
  • The order of imaging does not necessarily have to be A, B, C, D, E; for example, images can be captured in the order A, E, B, C, D.
  • Normally, the imaging process is performed while shifting the initial phase sequentially. However, the imaging unit 13 may capture images of grating patterns with other initial phases (for example, B, C, D) projected onto the measurement object between the multiple imaging timings at which an image of the grating pattern with the same initial phase (for example, A and E) is projected onto the measurement target.
  • the storage unit 15 stores the captured image generated by the imaging unit 13, the point cloud data calculated by the point cloud calculation unit 18, and the like.
  • the feature amount calculation unit 16 calculates a feature amount indicating the degree of blur existing in the captured image from the captured image generated by the imaging unit 13.
  • the feature amount calculation unit 16 calculates, as a feature amount, the similarity between a plurality of captured images obtained by capturing an image in which a lattice pattern having the same initial phase is projected onto the measurement target.
  • the similarity can be calculated by, for example, the following formula (1).
  • The feature amount calculation unit 16 may calculate, as the feature amount, the similarity of a single pair of captured images in which a grating pattern with the same initial phase is projected onto the measurement target, or the similarities of multiple such pairs. For example, when nine captured images of a grating pattern whose initial phase is shifted by 90 degrees are generated by the 9-bucket method, only the similarity between the first captured image (initial phase 0 degrees) and the fifth captured image (initial phase 360 degrees, that is, 0 degrees) may be calculated, or, as shown in the figure, the similarities between a plurality of pairs of captured images having the same initial phase may be calculated. Calculating the similarity over multiple pairs of captured images and determining blur from them makes it possible to detect blur more accurately.
  • The determination unit 17 determines whether blur is present in the captured images based on the feature amount calculated by the feature amount calculation unit 16. For example, the determination unit 17 stores a predetermined similarity threshold in its own storage area and compares the similarity calculated by the feature amount calculation unit 16 with that threshold. If the calculated similarity is equal to or greater than the threshold, the determination unit 17 determines that there is no blur; otherwise, it determines that blur is present.
  • The point group calculation unit 18 performs point group calculation processing, such as phase calculation and phase unwrapping, based on the plurality of captured images generated by the imaging unit 13, calculates point cloud data, and stores it in the storage unit 15.
  • FIG. 4 is a flowchart for explaining an operation example in which the shape measuring apparatus 10 performs the shape measuring process.
  • the imaging unit 13 starts imaging processing of the measurement target, and the irradiation unit 14 starts projection processing of irradiation light onto the measurement target.
  • the imaging unit 13 stores, for example, five captured images captured at the time of fringe projection with an initial phase of 0 degrees, 90 degrees, 180 degrees, 270 degrees, and 360 degrees in the storage unit 15 (step S3).
  • The feature amount calculation unit 16 reads, from among the plurality of captured images stored in the storage unit 15, captured images having the same initial phase (for example, 0 degrees and 360 degrees), and calculates the similarity between them (step S4).
  • The determination unit 17 compares the similarity calculated by the feature amount calculation unit 16 with a predetermined threshold (step S5). If the determination unit 17 determines that the similarity is below the threshold (step S5: NO), the display unit 12 displays a warning that the captured images are blurred (step S6). Since camera shake may have occurred, a selection input can be accepted from the user as to whether or not to continue the point cloud data calculation process. If an instruction not to continue is input to the operation input unit 11 (step S7: NO), the process returns to step S1; if an instruction to continue is input (step S7: YES), the process proceeds to step S8.
  • In step S5, when the determination unit 17 determines that the similarity calculated by the feature amount calculation unit 16 is equal to or greater than the threshold (step S5: YES), the point group calculation unit 18 calculates point cloud data based on the captured images stored in the storage unit 15 and stores it in the storage unit 15 (step S8). The display unit 12 then displays the calculated point cloud data (step S9).
  • In this way, among the plurality of captured images with different initial phases captured by the N-bucket method, the similarity between captured images having the same initial phase is calculated, and based on that similarity it can be determined whether or not blur is present.
  • The point cloud calculation performed by the point group calculation unit 18 based on the captured images is a relatively heavy process and can take several seconds (for example, 5 seconds) to complete. Therefore, in this embodiment, the similarity calculation and blur determination are performed after the imaging by the imaging unit 13 and before the point cloud calculation is started, and a warning is issued if blur is present.
  • The user can thus learn that the captured images are blurred before the high-load point cloud calculation is performed, and can input a re-imaging instruction. That is, compared with learning of the blur only when the point cloud data is displayed on the display unit 12 after the calculation time has elapsed, the occurrence of blur can be known at an earlier stage, before the point cloud calculation is performed. This reduces unnecessary waiting time before re-measuring the shape of the measurement target and enables efficient measurement without wasting power.
  • The shape measuring apparatus 10 according to this embodiment has the same configuration as in the first embodiment shown in FIG. 1.
  • While the first embodiment detects blur in which the imaging range differs between a plurality of captured images, this embodiment detects blur caused by the measurement target moving or the camera shaking during the exposure time of a single captured image.
  • the feature amount calculation unit 16 in the present embodiment calculates the sharpness (blurring degree) of the captured image generated by the imaging unit 13 as the feature amount.
  • The determination unit 17 stores in advance a sharpness threshold for determining whether blurring has occurred, and compares the sharpness calculated by the feature amount calculation unit 16 with it. If the calculated sharpness is equal to or greater than the threshold, the determination unit 17 determines that no blur is present in the captured image; otherwise, it determines that blur is present.
  • The sharpness of an image can be calculated by a predetermined formula; for example, it can be calculated based on the spatial frequency distribution.
  • For example, as shown in FIGS. 5 and 6, consider the frequency distribution curve A1 of the spatial frequencies in image data of the measurement object captured without blur and the frequency distribution curve B1 of the spatial frequencies in image data of the same object captured with blur. Comparing the two, the center of gravity of the region enclosed by curve A1 lies further toward the high spatial-frequency side than that of curve B1.
  • In FIGS. 5 and 6, the frequency corresponding to the position of the dotted line is the spatial frequency of the projected fringes. Therefore, to determine whether blur has occurred for a given measurement target, the spatial frequency corresponding to the center of gravity of the region enclosed by the frequency distribution curve is detected and compared with a preset spatial-frequency threshold. Attention may be paid to the spatial frequency of the fringe pattern appearing in the image data, or, if the surface of the measurement object has a fine pattern, to that fine pattern; by detecting whether the spatial frequency of the pattern has dropped, it can be determined whether the captured image is blurred.
  • The feature amount calculation unit 16 applies a Fourier transform to the captured image generated by the imaging unit 13 and stored in the storage unit 15, and takes the spatial frequency corresponding to the center of gravity obtained as described above as the sharpness (feature amount).
  • The determination unit 17 determines whether blur is present in the captured image by checking whether the feature amount calculated by the feature amount calculation unit 16 is equal to or greater than a predetermined threshold relative to the reference sharpness.
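As a sketch of the centroid idea (the patent gives no formula; the demonstration below substitutes a lower-frequency profile for a physically blurred one), a discrete Fourier transform of an intensity profile yields a magnitude spectrum whose centroid drops when high spatial frequencies are attenuated:

```python
import cmath
import math

def spectral_centroid(profile):
    """Centroid (in DFT bin index) of the magnitude spectrum of a 1-D
    intensity profile, DC term excluded. Blur attenuates high spatial
    frequencies and pulls the centroid toward zero."""
    n = len(profile)
    mean = sum(profile) / n
    centered = [p - mean for p in profile]          # remove the DC peak
    mags = [abs(sum(centered[t] * cmath.exp(-2j * math.pi * k * t / n)
                    for t in range(n)))
            for k in range(1, n // 2)]              # positive frequencies
    total = sum(mags)
    return sum((k + 1) * m for k, m in enumerate(mags)) / total if total else 0.0

# A sharp fringe profile concentrates energy at the fringe frequency;
# a low-frequency profile (standing in for a blurred one) scores lower.
n = 64
sharp = [math.cos(2 * math.pi * 10 * t / n) for t in range(n)]
smooth = [math.cos(2 * math.pi * 2 * t / n) for t in range(n)]
assert spectral_centroid(sharp) > spectral_centroid(smooth)
```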
  • Both the per-image blur determination of this embodiment and the similarity-based blur determination between multiple captured images shown in the first embodiment may be performed, enabling more efficient and accurate blur detection. For example, every time the imaging unit 13 generates a captured image with a different initial phase, a sharpness-based blur determination can be performed and a warning issued when the sharpness is below the threshold. This prevents unnecessary imaging processing and unnecessary point cloud data calculation.
  • FIG. 7 is a flowchart showing an operation example of such a shape measuring apparatus 10.
  • the imaging unit 13 starts imaging processing of the measurement target, and the irradiation unit 14 starts projection processing of irradiation light onto the measurement target.
  • the imaging unit 13 causes the storage unit 15 to store a captured image generated each time a lattice pattern having a different initial phase is projected onto the measurement target by the irradiation unit 14 (step S12).
  • the feature amount calculation unit 16 reads the captured image stored in the storage unit 15, obtains a spatial frequency distribution by Fourier transforming the read captured image, and acquires sharpness (step S13).
  • The determination unit 17 compares the sharpness calculated by the feature amount calculation unit 16 with a predetermined sharpness threshold (step S14). If the determination unit 17 determines that the sharpness is below the threshold (step S14: NO), the display unit 12 displays a warning that the captured image is blurred (step S15). If an instruction not to continue the process is input to the operation input unit 11 (step S16: NO), the process returns to step S10; if an instruction to continue is input (step S16: YES), the process proceeds to step S17.
  • In step S14, when the determination unit 17 determines that the sharpness calculated by the feature amount calculation unit 16 is equal to or greater than the predetermined threshold relative to the reference sharpness (step S14: YES), the determination unit 17 determines whether or not the specified number of captured images has been generated (step S17).
  • The specified number is, for example, 5 when imaging by the 5-bucket method and 7 when imaging by the 7-bucket method. If the determination unit 17 determines that the specified number of captured images has not yet been generated (step S17: NO), the process returns to step S11.
  • If the determination unit 17 determines that the specified number or more of captured images has been generated (step S17: YES), the process proceeds to step S18, and the same processing as steps S4 to S9 described in the first embodiment is performed (steps S18 to S23).
  • In this way, the feature amount calculation unit 16 calculates the sharpness of each captured image and the determination unit 17 performs the blur determination for each; only after all the sharpness values of the plurality of captured images have been determined to meet the threshold does the feature amount calculation unit 16 calculate the similarity between the captured images.
  • The shape measuring apparatus 10 of this embodiment has the same configuration as in the first embodiment, but its irradiation unit 14, in addition to the irradiation process of the first embodiment, irradiates illumination light having a uniform intensity distribution while the imaging unit 13 is imaging, so that an image of the measurement target under uniform illumination is captured.
  • The feature amount calculation unit 16 calculates, as the feature amount, the similarity between a composite image obtained by combining the plurality of captured images with different initial phases, captured as in the first embodiment, and the captured image of the measurement target irradiated with the uniform illumination light. Based on this, it is determined whether or not blur is present in the captured images.
  • an image X in FIG. 8 is an image obtained by combining four captured images.
  • an image Y in FIG. 8 is an image taken when the measurement target is illuminated with uniform illumination.
  • the presence / absence of blur may be detected by detecting the similarity between an image obtained by combining four captured images and an image obtained by illuminating the measurement target without performing intensity modulation.
  • the feature amount calculation unit 16 calculates the spatial frequency distribution of the composite image of the plurality of captured images with different initial phases and the spatial frequency distribution of the image captured under the uniform intensity distribution, and the similarity can be calculated by comparing the two distributions.
  • the determination unit 17 determines that there is no blur if the calculated similarity is equal to or greater than a certain level, and determines that there is blur if it is below that level.
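A minimal sketch of this embodiment's check, under the assumption that averaging the four 90-degree phase-shifted fringe images cancels the sinusoidal term and so approximates the uniformly illuminated image. The cosine similarity between magnitude spectra used here is an illustrative choice, not the patent's prescribed computation:

```python
import numpy as np

def spectrum(img):
    """Normalised magnitude spectrum of an image (DC component removed)."""
    f = np.abs(np.fft.fft2(img - img.mean()))
    return f / (np.linalg.norm(f) + 1e-12)

def spectral_similarity(img_a, img_b):
    """Cosine similarity between two magnitude spectra (1.0 = identical)."""
    return float(np.sum(spectrum(img_a) * spectrum(img_b)))

# Simulated target: a textured surface. Each fringe image is the texture
# modulated by a sinusoid; averaging the four 90-degree-shifted fringe
# images cancels the sinusoid, leaving the uniform-illumination image.
rng = np.random.default_rng(0)
texture = rng.uniform(50.0, 200.0, (64, 64))
x = 2 * np.pi * np.arange(64) / 16.0            # fringe phase along columns
fringes = [texture * (0.5 + 0.5 * np.cos(x + p))
           for p in (0.0, np.pi / 2, np.pi, 3 * np.pi / 2)]
composite = np.mean(fringes, axis=0)            # ~ uniformly lit image
```

If one of the four frames is displaced (simulated blur), the composite no longer matches the uniform-illumination image and the similarity drops.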
  • the shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the storage unit 15 of the shape measuring apparatus 10 of the present embodiment stores in advance a template image, which is an image in which a lattice pattern is projected onto the measurement target.
  • the storage unit 15 stores template images A′ (initial phase 0 degrees), B′ (initial phase 90 degrees), C′ (initial phase 180 degrees), and D′ (initial phase 270 degrees), in which the measurement target has been imaged in advance for each initial phase.
  • the feature amount calculation unit 16 reads the template image stored in the storage unit 15 and calculates the similarity between the read template image and the captured image generated by the imaging unit 13 for each corresponding initial phase.
  • the determination unit 17 determines that there is no blur if the similarity calculated by the feature amount calculation unit 16 is equal to or greater than a predetermined threshold, and determines that there is blur if the similarity is not greater than the threshold.
  • the shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the feature amount calculation unit 16 of the present embodiment applies a Fourier transform to each of the plurality of captured images captured by the imaging unit 13 and calculates the spatial frequency distribution as a feature amount.
  • the determination unit 17 determines that there is blur between the captured images when the spatial frequency distributions of the plurality of captured images differ, and determines that there is no blur between the captured images when the spatial frequency distributions are substantially the same.
  • if blur occurs only while one captured image is being captured, the spatial frequency distribution of that image is expected to differ from those of the other captured images. Therefore, when the spatial frequency distributions of a plurality of captured images disagree beyond a predetermined condition, it is determined that blur exists.
  • the shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the irradiation unit 14 of the present embodiment irradiates the measurement target with a lattice pattern based on the spatial code method, and the feature amount calculation unit 16 calculates a spatial code for each of the plurality of captured images as a feature amount.
  • the determination unit 17 determines that there is blur in the captured images if the coordinate positions based on the spatial codes calculated for the plurality of captured images differ, and determines that there is no blur in the captured images if the coordinate positions are substantially the same.
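A sketch of spatial-code decoding using a binary-reflected Gray code (a common choice for the spatial code method, assumed here for illustration; the patent does not fix the code). Each projector column gets a bit sequence across the stripe patterns, and blur is flagged when the coordinate decoded at the same pixel differs between two pattern sequences:

```python
def gray_encode(n: int) -> int:
    """Binary-reflected Gray code of n (adjacent codes differ in one bit)."""
    return n ^ (n >> 1)

def gray_decode(g: int) -> int:
    """Inverse of gray_encode."""
    n = 0
    while g:
        n ^= g
        g >>= 1
    return n

def decode_column(bits) -> int:
    """Recover a projector-column index from the on/off bits observed at one
    camera pixel across the stripe patterns (most significant bit first)."""
    g = 0
    for b in bits:
        g = (g << 1) | int(b)
    return gray_decode(g)

def codes_disagree(bits_run1, bits_run2) -> bool:
    """Blur check: True when the coordinate decoded at the same pixel
    differs between two pattern sequences."""
    return decode_column(bits_run1) != decode_column(bits_run2)
```

Gray codes are convenient here because a one-pixel misclassification changes the decoded position by at most one column, making larger disagreements a clear sign of motion.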
  • At least two determination processes can be combined to perform a determination process with higher accuracy.
  • alternatively, the user may select in advance at least one of the determination processes according to the first to sixth embodiments described above, and the selected determination process (or combination of determination processes) may be executed.
  • the display unit 12 displays a warning when the determination unit 17 determines that the captured image is blurred.
  • the shape measuring device 10 may also include a speaker that outputs sound, and a warning sound may be output.
  • a message may be transmitted to a terminal connected to the shape measuring apparatus 10 via a wired or wireless line to notify the terminal that the determination unit 17 has determined that the captured image is blurred.
  • FIG. 10 is a diagram showing a configuration of the structure manufacturing system 100 of the present embodiment.
  • the structure manufacturing system 100 of this embodiment includes the shape measuring device 10, the design device 20, the molding device 30, the control device (inspection device) 40, and the repair device 50 as described in the above embodiment.
  • the control device 40 includes a coordinate storage unit 41 and an inspection unit 42.
  • the design device 20 creates design information related to the shape of the structure, and transmits the created design information to the molding device 30.
  • the design apparatus 20 stores the created design information in the coordinate storage unit 41 of the control device 40.
  • the design information includes information indicating the coordinates of each position of the structure.
  • the molding apparatus 30 produces the above-described structure based on the design information input from the design apparatus 20.
  • molding by the molding apparatus 30 includes, for example, casting, forging, and cutting.
  • the shape measuring device 10 measures the coordinates of the manufactured structure (measurement object) and transmits information (shape information) indicating the measured coordinates to the control device 40.
  • the coordinate storage unit 41 of the control device 40 stores design information.
  • the inspection unit 42 of the control device 40 reads design information from the coordinate storage unit 41.
  • the inspection unit 42 compares the information (shape information) indicating the coordinates received from the shape measuring apparatus 10 with the design information read from the coordinate storage unit 41.
  • the inspection unit 42 determines whether or not the structure has been molded according to the design information based on the comparison result. In other words, the inspection unit 42 determines whether or not the manufactured structure is a non-defective product.
  • the inspection unit 42 determines whether the structure can be repaired when the structure is not molded according to the design information.
  • the inspection unit 42 calculates a defective portion and a repair amount based on the comparison result, and transmits information indicating the defective portion and information indicating the repair amount to the repair device 50.
  • the repair device 50 processes the defective portion of the structure based on the information indicating the defective portion received from the control device 40 and the information indicating the repair amount.
  • FIG. 11 is a flowchart showing the structure manufacturing method of the present embodiment.
  • each process of the structure manufacturing method illustrated in FIG. 11 is executed by each unit of the structure manufacturing system 100.
  • the design apparatus 20 creates design information related to the shape of the structure (step S31).
  • the molding apparatus 30 produces the structure based on the design information (step S32).
  • the shape measuring apparatus 10 measures the shape of the manufactured structure (step S33).
  • the inspection unit 42 of the control device 40 compares the shape information obtained by the shape measuring device 10 with the above-described design information to inspect whether or not the structure has been manufactured according to the design information (step S34).
  • the inspection unit 42 of the control device 40 determines whether or not the manufactured structure is a good product (step S35).
  • when the inspection unit 42 determines in step S35 that the manufactured structure is a non-defective product (step S35: YES), the structure manufacturing system 100 ends the process.
  • if the inspection unit 42 determines that the manufactured structure is not a non-defective product (step S35: NO), the inspection unit 42 determines whether the manufactured structure can be repaired (step S36).
  • when the inspection unit 42 determines in step S36 that the manufactured structure can be repaired (step S36: YES), the repair device 50 performs reworking of the structure (step S37), and the process returns to step S33.
  • when the inspection unit 42 determines in step S36 that the manufactured structure cannot be repaired (step S36: NO), the structure manufacturing system 100 ends the process.
  • since the shape measuring apparatus 10 according to the first to sixth embodiments can accurately measure the coordinates of a structure, it can be determined whether or not the manufactured structure is a non-defective product. Moreover, the structure manufacturing system 100 can perform reworking and repair of the structure when the structure is not a non-defective product.
  • the repair process performed by the repair device 50 in this embodiment may be replaced with a process in which the molding apparatus 30 executes the molding process again. In that case, when the inspection unit 42 of the control device 40 determines that repair is possible, the molding apparatus 30 re-executes the molding process (forging, cutting, and the like). Specifically, for example, the molding apparatus 30 cuts a portion of the structure that should originally have been cut but was not cut. Thereby, the structure manufacturing system 100 can produce the structure correctly.
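The S31 to S37 flow described above can be sketched as a loop. The `max_rounds` cap and all callable names are illustrative assumptions (the flowchart itself does not bound the number of repair cycles):

```python
def manufacture(design, mold, measure, inspect, can_repair, repair,
                max_rounds=3):
    """Sketch of the S31-S37 flow: mold, measure, inspect, and repair
    (re-measuring after each repair) until the part passes or is scrapped."""
    structure = mold(design)                      # S32: produce the structure
    for _ in range(max_rounds):
        shape = measure(structure)                # S33: shape measurement
        if inspect(shape, design):                # S34/S35: compare to design
            return structure, "good"
        if not can_repair(shape, design):         # S36: NO -> give up
            return structure, "scrap"
        structure = repair(structure, shape, design)  # S37, then re-measure
    return structure, "scrap"
```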
  • the blur determination processing may be performed by recording a program for realizing the functions of the processing units in the present invention on a computer-readable recording medium, and causing a computer system to read and execute the program recorded on the recording medium.
  • the “computer system” includes an OS and hardware such as peripheral devices.
  • the “computer system” includes a WWW system having a homepage providing environment (or display environment).
  • the “computer-readable recording medium” refers to portable media such as flexible disks, magneto-optical disks, ROMs, and CD-ROMs, and storage devices such as hard disks incorporated in computer systems.
  • the “computer-readable recording medium” also includes media that hold a program for a certain period of time, such as volatile memory (RAM) inside a computer system serving as a server or a client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
  • the program may be transmitted from a computer system storing the program in a storage device or the like to another computer system via a transmission medium or by a transmission wave in the transmission medium.
  • the “transmission medium” for transmitting the program refers to a medium having a function of transmitting information, such as a network (communication network) such as the Internet or a communication line (communication line) such as a telephone line.
  • the program may realize only a part of the functions described above. Furthermore, the program may realize the functions described above in combination with a program already recorded in the computer system, that is, a so-called differential file (differential program).
  • the comparison between images and between the spatial frequency distributions of images for calculating the similarity is not limited to the methods described above; any method may be used as long as it detects whether or not the measurement target or the shape measuring device moved relatively during image acquisition.
  • the present invention can be applied to a structure manufacturing system that can determine whether or not a manufactured structure is a non-defective product. Thereby, the inspection accuracy of the manufactured structure can be improved, and the manufacturing efficiency of the structure can be improved.

Abstract

An objective of the present invention is to assess whether blurring is present in a captured image for measuring the shape of a subject to be measured. A shape measurement device comprises: an image capture unit which generates a captured image of the subject to be measured; an illumination unit which shines upon the subject to be measured an illumination light which has a prescribed intensity distribution, from a direction different from the direction from which the image capture unit captures the image, such that the image which is captured by the image capture unit is captured as an image in which a lattice pattern is projected upon the subject to be measured; a feature value computation unit which computes, from the captured image, a feature value which denotes the degree of blurring which is present in the captured image; and an assessment unit which assesses whether blurring is present in the captured image on the basis of the feature value.

Description

Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and computer-readable recording medium
 The present invention relates to a shape measuring device, a structure manufacturing system, a shape measuring method, a structure manufacturing method, a shape measuring program, and a computer-readable recording medium.
 As a technique for measuring the surface shape (three-dimensional shape) of a measurement target without contact, a pattern projection type shape measuring apparatus using the phase shift method is known (see, for example, Patent Document 1). In this shape measuring apparatus, a grating pattern having a sinusoidal intensity distribution is projected onto the measurement object, and the measurement object is repeatedly imaged while the phase of the grating pattern is changed at a constant pitch. By applying the plurality of captured images (luminance change data) obtained in this way to a predetermined arithmetic expression, the phase distribution (phase image) of the grating pattern deformed according to the surface shape of the measurement object is obtained; the phase image is unwrapped (phase-connected) and then converted into a height distribution (height image) of the measurement object.
Patent Document 1: JP 2009-180689 A
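For the four-step (4-bucket) case of the phase shift method described above, the wrapped phase at each pixel follows from the four intensity samples by the standard phase-shift formula. A sketch with synthetic data (the unwrapping step is omitted; variable names are illustrative):

```python
import numpy as np

# Four intensity samples per pixel, the grating phase advanced by 90 degrees
# between exposures:  I_k = a + b * cos(phi + k*pi/2),  k = 0..3
def wrapped_phase(i0, i1, i2, i3):
    """Standard 4-step phase-shift formula; result wrapped to (-pi, pi]."""
    return np.arctan2(i3 - i1, i0 - i2)

# synthetic check: a known phase ramp is recovered
phi = np.linspace(-1.2, 1.2, 50)     # true phase, kept inside (-pi, pi)
a, b = 120.0, 80.0                   # background offset and fringe amplitude
frames = [a + b * np.cos(phi + k * np.pi / 2) for k in range(4)]
recovered = wrapped_phase(*frames)
```

Since I3 - I1 = 2b*sin(phi) and I0 - I2 = 2b*cos(phi), the arctangent cancels both the offset a and the amplitude b, which is why the method is robust to reflectance variations; any blur between the four exposures, however, breaks the per-pixel correspondence the formula relies on.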
 However, according to the findings of the present inventors, if blur occurs while capturing the plurality of images onto which grating patterns with different phases are projected, an error is considered to arise in the measured values of the three-dimensional shape. It is therefore desirable to determine whether or not blur has occurred in the captured images.
 The present invention has been made in view of such circumstances, and provides a shape measuring device, a structure manufacturing system, a shape measuring method, a structure manufacturing method, and a shape measuring program that determine whether or not blur exists in a captured image used for measuring the shape of a measurement target.
 In order to solve the above problem, the present invention is a shape measuring device comprising: an imaging unit that generates a captured image of a measurement target; an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit performs imaging, such that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target; a feature amount calculation unit that calculates, from the captured image, a feature amount indicating the degree of blur present in the captured image; and a determination unit that determines, based on the feature amount, whether or not blur exists in the captured image.
 The present invention is also a structure manufacturing system including: a design apparatus that creates design information on the shape of a structure; a molding apparatus that produces the structure based on the design information; the above shape measuring device, which measures the shape of the produced structure based on captured images; and an inspection apparatus that compares the shape information obtained by the measurement with the design information.
 The present invention is also a shape measuring method performed by a shape measuring device comprising an imaging unit that generates a captured image of a measurement target, and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit performs imaging, such that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target, the method comprising the steps of: calculating, from the captured image, a feature amount indicating the degree of blur present in the captured image; and determining, based on the feature amount, whether or not blur exists in the captured image.
 The present invention is also a structure manufacturing method including: creating design information on the shape of a structure; producing the structure based on the design information; measuring the shape of the produced structure based on captured images generated using the above shape measuring method; and comparing the shape information obtained by the measurement with the design information.
 The present invention is also a shape measuring program that causes a computer of a shape measuring device, which comprises an imaging unit that generates a captured image of a measurement target and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit performs imaging, such that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target, to execute the steps of: calculating, from the captured image, a feature amount indicating the degree of blur present in the captured image; and determining, based on the feature amount, whether or not blur exists in the captured image.
 As described above, according to the present invention, it is possible to determine whether or not blur exists in a captured image used for measuring the shape of a measurement target.
FIG. 1 is a block diagram showing a configuration example of a shape measuring apparatus according to a first embodiment of the present invention.
FIG. 2 is a schematic configuration diagram of an irradiation unit according to the first embodiment of the present invention.
FIG. 3 is a diagram showing examples of captured images according to the first embodiment of the present invention.
FIG. 4 is a flowchart showing an operation example of the shape measuring apparatus according to the first embodiment of the present invention.
FIG. 5 is a diagram showing a frequency distribution curve of spatial frequencies in image data in which no blur has occurred.
FIG. 6 is a diagram showing a frequency distribution curve of spatial frequencies in image data in which blur has occurred.
FIG. 7 is a flowchart showing an operation example of a shape measuring apparatus according to a second embodiment of the present invention.
FIG. 8 is a diagram showing examples of images compared in a third embodiment of the present invention.
FIG. 9 is a diagram showing examples of images compared in a fourth embodiment of the present invention.
FIG. 10 is a block diagram showing a configuration example of a shape measuring apparatus according to a seventh embodiment of the present invention.
FIG. 11 is a flowchart showing an operation example of the shape measuring apparatus according to the seventh embodiment of the present invention.
 Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
<First embodiment: comparing the similarity between a plurality of images having the same initial phase>
 FIG. 1 is a block diagram showing the configuration of a shape measuring apparatus 10 according to the first embodiment of the present invention. The shape measuring apparatus 10 includes an operation input unit 11, a display unit 12, an imaging unit 13, an irradiation unit 14, a storage unit 15, a feature amount calculation unit 16, a determination unit 17, and a point group calculation unit 18, and includes a computer terminal that measures the three-dimensional shape of a measurement target A by the phase shift method. When capturing a plurality of images whose phases are shifted by the phase shift method, the shape measuring apparatus 10 detects whether or not blur exists in the captured images. Possible causes of blur include, for example, movement of the measurement target during imaging, and camera shake when a user holds the portable shape measuring apparatus 10 by hand instead of fixing it on a tripod or the like. Blur that affects the measurement results can be blur in which the imaging range differs between the plurality of captured images, or blur caused by movement of the measurement target or camera shake during the exposure time of a single captured image. These types of blur may also occur simultaneously.
 In the present embodiment, images in which a plurality of grating patterns with different initial phases based on the N-bucket method are projected onto the measurement target are captured, and the shape of the measurement target is measured based on the luminance values of the same pixel in the respective images. Normally, the shape measurement is performed from the fringe images with different initial phases, but in the present embodiment, a grating pattern having the same initial phase as at least one of those grating patterns is additionally projected again, and the image captured at that time is compared with the earlier image of the grating pattern having the same initial phase. In this way, the similarity between two or more captured images with different imaging timings, in which grating patterns of the same initial phase are captured, is calculated. If the similarity is equal to or greater than a threshold, it can be determined that no blur occurred between the imaging timings; if the similarity is below the threshold, it can be determined that blur occurred between the imaging timings. The shape measuring apparatus 10 determines whether such blur has occurred during imaging, and issues a warning when blur has occurred. This makes it possible, for example, to perform imaging again when blur has occurred.
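The comparison just described (re-projecting a grating with an already-used initial phase and comparing the two frames) can be sketched with zero-mean normalised cross-correlation as the similarity measure; both the measure and the `threshold` value are illustrative assumptions, since the patent leaves the similarity computation open:

```python
import numpy as np

def ncc(img_a, img_b):
    """Zero-mean normalised cross-correlation; 1.0 for identical frames."""
    a = img_a - img_a.mean()
    b = img_b - img_b.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum()) + 1e-12
    return float((a * b).sum() / denom)

def moved_between_exposures(frame_first, frame_repeat, threshold=0.95):
    """True when two frames of the same initial phase disagree, i.e. the
    target or the device moved between the two imaging timings."""
    return ncc(frame_first, frame_repeat) < threshold
```

A lateral shift of the fringes between the two exposures drives the correlation down sharply, which is what makes the repeated-phase frame an effective motion probe.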
 The operation input unit 11 receives operation inputs from the user. For example, the operation input unit 11 includes operation members such as a power button for switching the main power supply on and off and a release button for receiving an instruction to start imaging processing. Alternatively, the operation input unit 11 can receive inputs via a touch panel.
 The display unit 12 is a display that shows various types of information. For example, when the determination unit 17 described later determines that blur exists in a captured image, the display unit 12 displays a warning to that effect. The display unit 12 also displays, for example, point cloud data indicating the three-dimensional shape of the measurement target calculated by the point group calculation unit 18.
 The imaging unit 13 performs imaging processing in which it generates a captured image of the measurement target and stores the generated image in the storage unit 15. The imaging unit 13 operates in conjunction with the irradiation unit 14, and performs imaging in accordance with the timing at which the irradiation unit 14 projects illumination light onto the measurement target. In the present embodiment, the imaging unit 13 generates a plurality of captured images by imaging, for each initial phase, images in which a plurality of grating patterns with different initial phases based on the N-bucket method are projected onto the measurement target by the irradiation unit 14. When the determination unit 17 determines that blur may exist in the captured images, the imaging unit 13 warns the user of the possibility of blur. In response to the warning, the user can cause the shape measuring apparatus 10 to again acquire images in which the plurality of grating patterns with different initial phases based on the N-bucket method are projected onto the measurement target. By continuing to capture images until no warning about possible blur is issued, shape measurement with a reduced influence of blur can be performed.
 Alternatively, the shape measuring apparatus 10 may automatically continue to capture images in which the plurality of grating patterns with different initial phases based on the N-bucket method are projected onto the measurement target until blur is no longer determined. In this way, the measurement target irradiated with the illumination light is imaged again to generate new captured images.
 The irradiation unit 14 irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit 13 performs imaging, so that an image in which a grating pattern is projected onto the measurement target is captured while the imaging unit 13 is imaging the measurement target. Here, the irradiation unit 14 emits the illumination light so that the imaging unit 13 can sequentially capture images in which a plurality of grating patterns, which have a spatial frequency of a constant period and whose initial phases differ by 90 degrees based on the N-bucket method, are projected onto the measurement target. For example, as shown in FIG. 2, the irradiation unit 14 has a light source 1, and a collimating lens 2 and a cylindrical lens 3 that convert the light from the light source 1 into a line-shaped intensity distribution whose longitudinal direction is orthogonal to the light irradiation direction. It also includes a scanning mirror 4 (a MEMS (Micro Electro Mechanical Systems) mirror) that scans the line-shaped light beam over the measurement target in the direction perpendicular to the longitudinal direction of the beam.
 The light source 1 is provided with a light source control unit 5 for controlling the intensity of the light emitted from the light source 1. By scanning the scanning mirror while the light source control unit 5 modulates the intensity of the laser light, thereby successively changing the deflection direction of the laser light, the image acquired by the imaging unit 13 is the same as that obtained when a fringe pattern is projected onto the measurement target.
 In other words, the intensity distribution of the laser light emitted from the light source 1 is shaped so as to have a line-shaped light intensity distribution in one direction perpendicular to the optical axis, and the line-shaped beam is scanned, while its intensity is varied, in the direction perpendicular to both the optical axis and the longitudinal direction of the line-shaped intensity distribution. In this way, pattern light with a striped pattern (sinusoidal grating) having a sinusoidal luminance variation in one direction is formed. The light of the light source is shifted in the direction perpendicular to the optical axis by the mirror based on MEMS technology while its intensity is varied sinusoidally. In this manner, illumination light forming a grating pattern is projected onto the measurement target. Although an example of projecting laser light using MEMS technology is shown here, the illumination light can also be projected using a liquid crystal projector or the like.
FIG. 3 shows an example of the measurement target onto which the irradiation unit 14 projects the illumination light while shifting the initial phase by 90 degrees: A with an initial phase of 0 degrees, B shifted by 90 degrees from A, C shifted by 180 degrees from A, and D shifted by 270 degrees from A. For example, in the 5-bucket method, five captured images A to E are generated with the initial phase shifted by a fixed angle each time, and in the 7-bucket method, seven captured images A to G are generated. The imaging order need not be A, B, C, D, E; the images could also be captured in the order A, E, B, C, D, for example, but in the present embodiment the imaging is performed while shifting the initial phase in order, as A, B, C, D, E. That is, between the imaging timings at which grating patterns having the same initial phase (for example, A and E) are captured, the imaging unit 13 captures images of grating patterns having other initial phases (for example, B, C, and D). By thus separating in time the imaging timings at which grating patterns of the same initial phase are captured, captured images can be compared across imaging timings between which camera shake is likely to have occurred, which improves the accuracy of blur detection.
The storage unit 15 stores the captured images generated by the imaging unit 13, the point cloud data calculated by the point cloud calculation unit 18, and the like.
 The feature amount calculation unit 16 calculates, from a captured image generated by the imaging unit 13, a feature amount indicating the degree of blur present in that image. In the present embodiment, the feature amount calculation unit 16 calculates, as the feature amount, the similarity between a plurality of captured images in which grating patterns having the same initial phase are projected onto the measurement target. The similarity can be calculated by, for example, the following equation (1).
Figure JPOXMLDOC01-appb-M000001
Here, the feature amount calculation unit 16 may calculate as the feature amount the similarity between a single pair of captured images in which grating patterns having the same initial phase are projected onto the measurement target, or it may calculate the similarity between a plurality of such pairs. For example, when nine captured images are generated by the 9-bucket method with the initial phase shifted by 90 degrees each time, the unit may calculate only the similarity between the first captured image (initial phase 0 degrees) and the fifth captured image (initial phase 360 degrees, i.e. 0 degrees), or, as shown in FIG. 3, it may calculate the similarity between each of the pairs having the same initial phase, such as A and E, B and F, C and G, and D and H. Determining blur from the similarities of a plurality of pairs in this way allows blur to be detected more accurately.
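Equation (1) itself is not reproduced in this text. As one illustrative possibility, the similarity between two captured images of the same initial phase could be computed as a zero-mean normalized cross-correlation (ZNCC), which is insensitive to overall brightness and contrast; this specific formula is an assumption for illustration, not necessarily the patent's equation (1):

```python
from math import sqrt

def similarity(img_a, img_b):
    """Zero-mean normalized cross-correlation between two equal-sized
    grayscale images given as 2-D lists.  Returns a value in [-1, 1];
    1 means identical up to brightness and contrast.
    NOTE: an illustrative stand-in for the patent's equation (1),
    which is not reproduced in this text."""
    a = [p for row in img_a for p in row]
    b = [p for row in img_b for p in row]
    mean_a = sum(a) / len(a)
    mean_b = sum(b) / len(b)
    da = [p - mean_a for p in a]
    db = [p - mean_b for p in b]
    denom = sqrt(sum(p * p for p in da) * sum(p * p for p in db))
    return sum(x * y for x, y in zip(da, db)) / denom
```

The determination unit would then compare this value with a threshold, e.g. `blur_suspected = similarity(img_0deg, img_360deg) < threshold`, where the image and threshold names are hypothetical.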
The determination unit 17 determines, based on the feature amount calculated by the feature amount calculation unit 16, whether blur is present in the captured images. For example, the determination unit 17 holds a predetermined similarity threshold in its own storage area and compares the similarity calculated by the feature amount calculation unit 16 with that threshold. If the calculated similarity is equal to or greater than the threshold, the determination unit 17 determines that no blur is present; otherwise, it determines that blur is present.
 The point cloud calculation unit 18 performs point cloud calculation processing, such as phase calculation and phase unwrapping, based on the plurality of captured images generated by the imaging unit 13, calculates point cloud data, and stores the data in the storage unit 15.
Next, an operation example of the shape measuring apparatus 10 according to the present embodiment will be described with reference to the drawings. FIG. 4 is a flowchart illustrating an operation example in which the shape measuring apparatus 10 performs shape measurement processing.
 When the user inputs an imaging instruction to the operation input unit 11 (step S1), the imaging unit 13 starts imaging the measurement target, and at the same time the irradiation unit 14 starts projecting the illumination light onto the target (step S2). The imaging unit 13 stores in the storage unit 15, for example, five images captured during fringe projection with initial phases of 0, 90, 180, 270, and 360 degrees (step S3). The feature amount calculation unit 16 reads, from among the captured images stored in the storage unit 15, the images having the same initial phase (for example, 0 degrees and 360 degrees), and calculates the similarity between the read images (step S4).
The determination unit 17 compares the similarity calculated by the feature amount calculation unit 16 with a predetermined threshold (step S5). If the determination unit 17 determines that the calculated similarity is not equal to or greater than the threshold (step S5: NO), the display unit 12 displays a warning indicating that blur has occurred in the captured images (step S6). At this point camera shake may have occurred, but a selection input can be accepted from the user as to whether to continue the point cloud data calculation. If an instruction not to continue the processing is input to the operation input unit 11 (step S7: NO), the process returns to step S1; if an instruction to continue is input (step S7: YES), the process proceeds to step S8.
If the determination unit 17 determines in step S5 that the similarity calculated by the feature amount calculation unit 16 is equal to or greater than the predetermined threshold (step S5: YES), the point cloud calculation unit 18 calculates point cloud data based on the captured images stored in the storage unit 15 and stores the data in the storage unit 15 (step S8). The display unit 12 then displays the calculated point cloud data (step S9).
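The control flow of steps S1 to S9 in FIG. 4 can be sketched as follows. All the callables here are hypothetical placeholders standing in for the units of FIG. 1; the sketch only shows how the blur check gates the expensive point cloud calculation:

```python
def measure(capture_images, compute_similarity, compute_point_cloud,
            threshold, ask_user_to_continue):
    """Control-flow sketch of FIG. 4 (steps S1-S9): capture the five
    phase-shifted images, compare the 0-degree and 360-degree images,
    and run the (heavy) point cloud calculation only when no blur is
    detected or the user chooses to continue anyway."""
    images = capture_images()                       # S2-S3: imaging + storage
    sim = compute_similarity(images[0], images[4])  # S4: 0 deg vs 360 deg
    if sim < threshold:                             # S5: blur suspected
        # S6: warn the user; S7: let the user decide
        if not ask_user_to_continue():
            return None                             # back to S1 (re-capture)
    return compute_point_cloud(images)              # S8-S9

# Example with stub callables: identical 0/360-degree images pass the check.
result = measure(lambda: [1, 2, 3, 4, 1],
                 lambda a, b: 1.0 if a == b else 0.0,
                 lambda ims: "cloud",
                 threshold=0.5,
                 ask_user_to_continue=lambda: False)
```

Returning `None` models going back to step S1 for re-imaging; a real implementation would loop rather than return.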
As described above, according to the present embodiment, among a plurality of captured images with different initial phases captured based on the N-bucket method, the similarity between the images having the same initial phase is calculated, and whether blur is present can be determined based on that similarity. This makes it possible to re-capture and re-measure when blur has occurred. The point cloud calculation performed by the point cloud calculation unit 18 based on the captured images is a comparatively heavy process that can take several seconds (for example, five seconds) to complete. In the present embodiment, therefore, after the imaging by the imaging unit 13 and before the point cloud calculation by the point cloud calculation unit 18 starts, the similarity determination and the blur determination are performed, and a warning is displayed when blur is present. The user can thus learn that blur is present in the captured images before the heavy point cloud calculation is performed, and can input a re-imaging instruction.
 That is, compared with learning of the blur only when the point cloud data is displayed on the display unit 12 after the several seconds of point cloud calculation have elapsed, the occurrence of blur can be known earlier, before the point cloud calculation is performed. This reduces the unnecessary waiting time before re-measuring the shape of the measurement target and allows the measurement to be performed efficiently without wasting power.
<Second Embodiment: Blur Determination Based on the Sharpness of a Captured Image>
 In the first embodiment, whether blur is present in the captured images is determined based on the similarity of a plurality of captured images generated by the N-bucket method, but blur can also be determined by other methods. The second embodiment of the present invention is described below. The shape measuring apparatus 10 according to this embodiment has the same configuration as the first embodiment shown in FIG. 1. The first embodiment detects blur in which the imaging range differs between a plurality of captured images; this embodiment detects blur caused by movement of the measurement target or camera shake during the exposure time of a single captured image.
The feature amount calculation unit 16 in this embodiment calculates the sharpness (degree of defocus) of a captured image generated by the imaging unit 13 as the feature amount. The determination unit 17 stores in advance a sharpness threshold for determining whether blur has occurred, and compares the sharpness calculated by the feature amount calculation unit 16 with that threshold. If the calculated sharpness is equal to or greater than the threshold, the determination unit 17 determines that no blur is present in the captured image; otherwise, it determines that blur is present.
The sharpness of an image can be calculated by a predetermined formula, for example based on the distribution of spatial frequencies. Consider, as shown in FIGS. 5 and 6, the spatial-frequency histogram curve A1 of image data of a given measurement target captured without blur and the spatial-frequency histogram curve B1 of image data of the same target captured with blur. Comparing the two, the centroid of the region enclosed by curve A1 lies at a higher spatial frequency than that of curve B1. In FIGS. 5 and 6, the frequency at the position of the dotted line is the spatial frequency of the projected fringes.
 Accordingly, to determine whether blur has occurred for a given measurement target, the spatial frequency corresponding to the centroid of the region enclosed by the spatial-frequency histogram curve is detected, and by comparing the detected spatial frequency with a preset spatial-frequency threshold, it can be determined whether blur has occurred in the captured image.
 The determination may focus on the spatial frequency of the fringe pattern appearing in the image data or, if the surface of the measurement target has a fine pattern, on that fine pattern: by detecting whether the spatial frequency of the pattern has decreased, it can be determined whether blur has occurred in the captured image.
In this case, the feature amount calculation unit 16 performs a Fourier transform on the captured image generated by the imaging unit 13 and stored in the storage unit 15, and calculates the spatial frequency corresponding to the centroid position obtained as described above as the sharpness (feature amount). The determination unit 17 determines whether blur is present in the captured image by determining whether the calculated feature amount is equal to or greater than a predetermined threshold relative to the reference sharpness.
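The centroid-based sharpness measure can be sketched as follows. This is a minimal illustration, not the patent's exact formula: it takes the row-averaged 1-D magnitude spectrum of the image, excludes the DC (mean brightness) term, and returns the centroid frequency, which drops when blur attenuates the high frequencies:

```python
import numpy as np

def sharpness(image):
    """Sharpness as the centroid (center of gravity) of the 1-D
    spatial-frequency magnitude spectrum, averaged over image rows.
    Blur attenuates high frequencies, so a blurred image yields a
    lower centroid frequency.  An illustrative sketch; the patent
    does not fix a specific formula."""
    spectrum = np.abs(np.fft.rfft(image, axis=1)).mean(axis=0)
    spectrum = spectrum.copy()
    spectrum[0] = 0.0  # ignore the DC (mean brightness) component
    freqs = np.arange(spectrum.size)
    return float((freqs * spectrum).sum() / spectrum.sum())

# A sharp stripe image versus a box-blurred copy of it.
x = np.arange(64)
stripe = np.tile(np.sign(np.sin(2 * np.pi * x / 8)), (16, 1))
kernel = np.ones(5) / 5
blurred = np.apply_along_axis(
    lambda row: np.convolve(row, kernel, mode="same"), 1, stripe)
```

The determination unit would then flag blur when `sharpness(image)` falls below a preset threshold, with the threshold chosen per measurement target as the text describes.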
Both the per-image blur determination of this embodiment and the blur determination based on the similarity between captured images shown in the first embodiment can be performed together, allowing blur to be detected more efficiently and more accurately. For example, a sharpness-based blur determination can be performed each time the imaging unit 13 generates a captured image with a different initial phase, and a warning can be issued as soon as the sharpness is determined not to be equal to or greater than the threshold. This prevents unnecessary imaging processing as well as unnecessary point cloud data calculation.
FIG. 7 is a flowchart showing an operation example of such a shape measuring apparatus 10. When the user inputs an imaging instruction to the operation input unit 11 (step S10), the imaging unit 13 starts imaging the measurement target, and at the same time the irradiation unit 14 starts projecting the illumination light onto the target (step S11). The imaging unit 13 stores in the storage unit 15 a captured image generated each time the irradiation unit 14 projects a grating pattern with a different initial phase onto the target (step S12). The feature amount calculation unit 16 reads the captured image stored in the storage unit 15, obtains its spatial frequency distribution by Fourier transform, and obtains the sharpness (step S13).
The determination unit 17 compares the sharpness calculated by the feature amount calculation unit 16 with a predetermined sharpness threshold (step S14). If the determination unit 17 determines that the calculated sharpness is not equal to or greater than the threshold (step S14: NO), the display unit 12 displays a warning indicating that blur has occurred in the captured image (step S15). If an instruction not to continue the processing is input to the operation input unit 11 (step S16: NO), the process returns to step S10; if an instruction to continue is input (step S16: YES), the process proceeds to step S17.
If the determination unit 17 determines in step S14 that the calculated sharpness is equal to or greater than the predetermined threshold relative to the reference sharpness (step S14: YES), the determination unit 17 determines whether a prescribed number of captured images or more have been generated (step S17). The prescribed number is, for example, five when imaging by the 5-bucket method and seven when imaging by the 7-bucket method. If the determination unit 17 determines that the prescribed number of captured images has not yet been generated (step S17: NO), the process returns to step S11. If it determines that the prescribed number or more have been generated (step S17: YES), the process proceeds to step S18, and the same processing as steps S4 to S9 of the first embodiment is performed (steps S18 to S23).
As described above, according to this embodiment, the feature amount calculation unit 16 calculates the sharpness of each captured image and the determination unit 17 performs the blur determination; only when all of the captured images have a sharpness equal to or greater than the threshold and are determined to be free of blur does the feature amount calculation unit 16 calculate the similarity between the captured images. This makes it possible to perform a blur determination for each captured image corresponding to an initial phase before imaging of all the patterns based on the plurality of initial phases is completed, and to output a warning at the moment blur is detected. Unnecessary imaging processing and point cloud data calculation are thereby avoided, and the shape of the measurement target can be measured efficiently.
<Third Embodiment: Comparing a Composite of Captured Images with Different Initial Phases against an Image Captured under Uniform Illumination>
 The shape measuring apparatus 10 of this embodiment is the same as that of the first embodiment, but in addition to the irradiation processing of the first embodiment, the irradiation unit 14 of this embodiment irradiates illumination light so that, while the imaging unit 13 is imaging, an image is captured in which illumination light having a uniform intensity distribution is projected onto the measurement target. The feature amount calculation unit 16 calculates, as the feature amount, the similarity between a composite of the plurality of captured images with different initial phases captured as in the first embodiment and the captured image of the measurement target irradiated with the uniform illumination light. From this, it is determined whether blur is present in the captured images.
That is, as shown in FIG. 8, the four captured images (A, B, C, D) captured with shifted initial phases are combined, and, for example, the average of the spatial frequency distribution is calculated. Image X in FIG. 8 is the composite of the four captured images, and image Y in FIG. 8 is the image captured when the measurement target is illuminated uniformly. If no blur has occurred between the captured images, the composite of the four captured images is the same image as the one obtained by scanning the line-shaped pattern of light without modulating the laser intensity. Accordingly, the presence or absence of blur may be detected by using as the feature amount the similarity between the composite of the four captured images and the image of the target illuminated without intensity modulation.
 Furthermore, according to the inventors' findings, the spatial frequency distribution of such a composite of captured images with shifted initial phases substantially matches that of a captured image in which illumination light having a uniform intensity distribution is projected onto the measurement target. As another method, therefore, the feature amount calculation unit 16 can calculate the similarity by comparing the spatial frequency distribution of the composite of the captured images with different initial phases against the spatial frequency distribution of the image captured with a uniform intensity distribution. The determination unit 17 determines that no blur is present if the calculated similarity is at or above a certain level, and that blur is present otherwise.
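The reason the composite matches the uniformly illuminated image is that sinusoids with initial phases 0, 90, 180, and 270 degrees sum to a constant. The sketch below demonstrates this cancellation on synthetic data; `composite_image`, the linear-gradient `reflectance`, and the fringe parameters are all hypothetical illustration choices, not values from the patent:

```python
import numpy as np

def composite_image(phase_images):
    """Average of the phase-shifted captures (image X in FIG. 8).
    With initial phases of 0/90/180/270 degrees the sinusoidal terms
    cancel, so in the absence of blur this equals the uniformly
    illuminated capture (image Y)."""
    return np.mean(phase_images, axis=0)

x = np.arange(64)
# Hypothetical object reflectance: a simple horizontal gradient.
reflectance = np.tile(np.linspace(0.2, 1.0, 64), (8, 1))
captures = [reflectance * 0.5 * (1.0 + np.sin(2 * np.pi * x / 16 + k * np.pi / 2))
            for k in range(4)]
uniform = reflectance * 0.5  # uniform illumination at the mean fringe intensity
```

Here `composite_image(captures)` reproduces `uniform`, whereas shifting some of the captures (simulating camera shake between exposures) breaks the match, which is what the similarity check detects.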
<Fourth Embodiment: Comparison with Template Images>
 The shape measuring apparatus 10 of this embodiment is the same as that of the first embodiment, but the storage unit 15 of this embodiment stores in advance template images, which are captured images in which grating patterns are projected onto the measurement target. For example, as shown in FIG. 9, the storage unit 15 stores template images of the measurement target captured in advance for each initial phase: A' (initial phase 0 degrees), B' (initial phase 90 degrees), C' (initial phase 180 degrees), and D' (initial phase 270 degrees). The feature amount calculation unit 16 reads the template images from the storage unit 15 and calculates the similarity between each template image and the captured image generated by the imaging unit 13 for the corresponding initial phase. The determination unit 17 determines that no blur is present if the calculated similarity is equal to or greater than a predetermined threshold, and that blur is present otherwise.
<Fifth Embodiment: Blur Determination Based on the Spatial Frequency Distributions of Multiple Images>
 The shape measuring apparatus 10 of this embodiment is the same as that of the first embodiment, but the feature amount calculation unit 16 of this embodiment performs a Fourier transform on each of the plurality of captured images captured by the imaging unit 13 and calculates the spatial frequency distribution as the feature amount. If the spatial frequency distributions differ between the captured images, the determination unit 17 determines that blur is present between them; if the distributions are substantially the same, it determines that no blur is present. For example, if blur has occurred in only one of the captured images with different initial phases, the spatial frequency distribution of that image alone is expected to differ from those of the other captured images. Accordingly, when the spatial frequency distributions of the captured images fail to match beyond a prescribed condition, it is determined that blur is present.
<Sixth Embodiment: Blur Determination Based on the Spatial Coding Method>
 The shape measuring apparatus 10 of this embodiment is the same as that of the first embodiment, but the irradiation unit 14 of this embodiment irradiates the measurement target with grating patterns based on the spatial coding method, and the feature amount calculation unit 16 calculates a spatial code for each of the captured images as the feature amount. The determination unit 17 determines that blur is present in the captured images if the coordinate positions based on the spatial codes calculated for the captured images differ, and that no blur is present if the coordinate positions substantially match.
At least two of the determination processes of the first to sixth embodiments described above may be combined to perform a more accurate determination. For example, the user may select in advance at least one of the determination processes of the first to sixth embodiments, and the determination process (or combination of determination processes) based on that selection may be executed.
 In the embodiments described above, the display unit 12 displays a warning when the determination unit 17 determines that blur is present in the captured images, but the shape measuring apparatus 10 may, for example, include a speaker and output a warning sound. Alternatively, the determination by the determination unit 17 that blur is present in the captured images may be reported by, for example, transmitting a message to a terminal connected to the shape measuring apparatus 10 via a wired or wireless line.
<Seventh Embodiment: Structure Manufacturing System>
 Next, a structure manufacturing system and a structure manufacturing method using the shape measuring apparatus 10 described above will be described.
 FIG. 10 is a diagram showing the configuration of the structure manufacturing system 100 of the present embodiment. The structure manufacturing system 100 includes the shape measuring apparatus 10 described in the above embodiments, a design apparatus 20, a molding apparatus 30, a control apparatus (inspection apparatus) 40, and a repair apparatus 50. The control apparatus 40 includes a coordinate storage unit 41 and an inspection unit 42.
 The design apparatus 20 creates design information relating to the shape of a structure and transmits the created design information to the molding apparatus 30. The design apparatus 20 also stores the created design information in the coordinate storage unit 41 of the control apparatus 40. The design information includes information indicating the coordinates of each position of the structure.
 The molding apparatus 30 produces the structure based on the design information input from the design apparatus 20. The molding performed by the molding apparatus 30 includes, for example, casting, forging, and cutting. The shape measuring apparatus 10 measures the coordinates of the produced structure (measurement target) and transmits information indicating the measured coordinates (shape information) to the control apparatus 40.
 The coordinate storage unit 41 of the control apparatus 40 stores the design information. The inspection unit 42 of the control apparatus 40 reads the design information from the coordinate storage unit 41 and compares the information indicating the coordinates (shape information) received from the shape measuring apparatus 10 with the design information read from the coordinate storage unit 41. Based on the comparison result, the inspection unit 42 determines whether the structure has been molded according to the design information; in other words, it determines whether the produced structure is non-defective. If the structure has not been molded according to the design information, the inspection unit 42 determines whether the structure can be repaired. If the structure can be repaired, the inspection unit 42 calculates the defective portions and the repair amounts based on the comparison result, and transmits information indicating the defective portions and information indicating the repair amounts to the repair apparatus 50.
 The repair apparatus 50 processes the defective portions of the structure based on the information indicating the defective portions and the information indicating the repair amounts received from the control apparatus 40.
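The comparison performed by the inspection unit 42 can be pictured, purely as an illustrative sketch (the function name, the array layout of the coordinates, and the tolerance parameter are assumptions, not part of the disclosure), as a per-point deviation check against the design coordinates:

```python
import numpy as np

def inspect(measured, design, tolerance):
    """Compare measured coordinates with design coordinates.
    Returns (is_good, repair_map), where repair_map lists the index
    and deviation of each point outside the tolerance."""
    deviation = np.linalg.norm(measured - design, axis=-1)
    bad = np.nonzero(deviation > tolerance)[0]
    is_good = bad.size == 0
    repair_map = [(int(i), float(deviation[i])) for i in bad]
    return is_good, repair_map
```

An empty repair map corresponds to a non-defective determination; a non-empty one identifies the defective portions and their repair amounts.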
 FIG. 11 is a flowchart showing the structure manufacturing method of the present embodiment. In the present embodiment, each process of the structure manufacturing method shown in FIG. 11 is executed by the corresponding unit of the structure manufacturing system 100.
 In the structure manufacturing system 100, the design apparatus 20 first creates design information relating to the shape of a structure (step S31). Next, the molding apparatus 30 produces the structure based on the design information (step S32). Next, the shape measuring apparatus 10 measures the shape of the produced structure (step S33). Next, the inspection unit 42 of the control apparatus 40 compares the shape information obtained by the shape measuring apparatus 10 with the above design information to inspect whether the structure has been produced according to the design information (step S34).
 Next, the inspection unit 42 of the control apparatus 40 determines whether the produced structure is non-defective (step S35). If the inspection unit 42 determines that the produced structure is non-defective (step S35: YES), the structure manufacturing system 100 ends the process. If the inspection unit 42 determines that the produced structure is defective (step S35: NO), it determines whether the produced structure can be repaired (step S36).
 If the inspection unit 42 determines that the produced structure can be repaired (step S36: YES), the repair apparatus 50 reworks the structure (step S37), and the process returns to step S33. If the inspection unit 42 determines that the produced structure cannot be repaired (step S36: NO), the structure manufacturing system 100 ends the process.
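The flow of steps S31 to S37 can be sketched as a simple loop. This is an illustration only: the callable parameters are hypothetical stand-ins for the apparatuses of FIG. 10, and the retry limit is an assumption (the flowchart itself places no bound on the repair loop).

```python
def manufacture(design_fn, mold_fn, measure_fn, inspect_fn, repair_fn,
                max_retries=3):
    """Sketch of the flow of FIG. 11: design -> mold -> measure ->
    inspect, with a repair loop returning to the measuring step."""
    design = design_fn()                  # step S31
    structure = mold_fn(design)           # step S32
    for _ in range(max_retries):
        shape = measure_fn(structure)     # step S33
        good, repairable = inspect_fn(shape, design)  # steps S34-S36
        if good:
            return structure              # non-defective: finished
        if not repairable:
            return None                   # cannot be repaired: abort
        structure = repair_fn(structure)  # step S37, then back to S33
    return None
```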
 In the structure manufacturing system 100 of the present embodiment, the shape measuring apparatus 10 of the first to sixth embodiments can accurately measure the coordinates of the structure, so the system can determine whether the produced structure is non-defective. Moreover, if the structure is defective, the structure manufacturing system 100 can rework and repair it.
 The repair process executed by the repair apparatus 50 in the present embodiment may be replaced by a process in which the molding apparatus 30 re-executes the molding process. In that case, if the inspection unit 42 of the control apparatus 40 determines that the structure can be repaired, the molding apparatus 30 re-executes the molding process (forging, cutting, or the like). Specifically, for example, the molding apparatus 30 cuts portions of the structure that should have been cut but were not. In this way, the structure manufacturing system 100 can produce the structure accurately.
 A program for realizing the functions of the processing units in the present invention may be recorded on a computer-readable recording medium, and the blur determination process may be performed by loading the program recorded on the recording medium into a computer system and executing it. The term "computer system" here includes an OS and hardware such as peripheral devices, and also includes a WWW system provided with a homepage providing environment (or display environment). The term "computer-readable recording medium" refers to portable media such as flexible disks, magneto-optical disks, ROMs, and CD-ROMs, and to storage devices such as hard disks built into computer systems. It further includes media that hold the program for a certain period of time, such as the volatile memory (RAM) inside a computer system serving as a server or a client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.
 The program may be transmitted from a computer system in which it is stored in a storage device or the like to another computer system via a transmission medium, or by transmission waves in a transmission medium. Here, the "transmission medium" that transmits the program refers to a medium having a function of transmitting information, such as a network (communication network) like the Internet or a communication line (communication circuit) like a telephone line. The program may realize only a part of the functions described above, or it may be a so-called difference file (difference program) that realizes the functions described above in combination with a program already recorded in the computer system.
 In addition, although the feature amount indicating the degree of blur present in a captured image has been obtained in the present invention by calculating the similarity between images or from the spatial frequency distribution of an image, the feature amount is not limited to these; any method may be used as long as it detects whether the measurement target or the shape measuring apparatus moved relatively while the images necessary for the measurement were being acquired.
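The two feature amounts mentioned above can be sketched concretely. The following is an illustrative example only (the function names are assumptions): a sharpness feature taken as the centroid of the spatial-frequency magnitude distribution, and a similarity feature taken as the normalized cross-correlation between two captured images.

```python
import numpy as np

def sharpness_by_spectral_centroid(image):
    """Sharpness feature: centroid (radial mean) of the image's
    spatial-frequency magnitude distribution. Blur attenuates high
    frequencies, pulling the centroid toward zero, so a lower value
    suggests more blur."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image)))
    h, w = image.shape
    fy, fx = np.meshgrid(np.arange(h) - h // 2,
                         np.arange(w) - w // 2, indexing="ij")
    radius = np.hypot(fy, fx)
    return float((radius * spectrum).sum() / spectrum.sum())

def similarity(img_a, img_b):
    """Similarity feature: zero-mean normalized cross-correlation
    between two captured images (1.0 for identical images)."""
    a = img_a - img_a.mean()
    b = img_b - img_b.mean()
    return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))
```

A determination unit would then compare these values against thresholds: a similarity below threshold, or a sharpness centroid that drops between captures, indicates blur.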
 The present invention can be applied to a structure manufacturing system that can determine whether a manufactured structure is non-defective. This makes it possible to improve the inspection accuracy of manufactured structures and thereby improve the manufacturing efficiency of the structures.
DESCRIPTION OF REFERENCE NUMERALS: 1...light source; 2, 3...lens; 4...scanning mirror; 10...shape measuring apparatus; 12...display unit; 13...imaging unit; 14...irradiation unit; 15...storage unit; 16...feature amount calculation unit; 17...determination unit; 20...design apparatus; 30...molding apparatus; 40...control apparatus; 42...inspection unit

Claims (23)

  1.  A shape measuring apparatus comprising:
     an imaging unit that generates a captured image of a measurement target;
     an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit captures images, such that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target;
     a feature amount calculation unit that calculates, from the captured image, a feature amount indicating a degree of blur present in the captured image; and
     a determination unit that determines, based on the feature amount, whether blur is present in the captured image.
  2.  The shape measuring apparatus according to claim 1, wherein the irradiation unit irradiates the illumination light such that images in which a plurality of lattice patterns having a spatial frequency of a constant period and different initial phases are projected onto the measurement target can be sequentially captured by the imaging unit, and
     the imaging unit generates a plurality of captured images capturing the images in which the plurality of lattice patterns having different initial phases are projected onto the measurement target.
  3.  The shape measuring apparatus according to claim 1 or 2, wherein the feature amount calculation unit calculates a similarity between a plurality of the captured images as the feature amount.
  4.  The shape measuring apparatus according to any one of claims 1 to 3, wherein the irradiation unit irradiates the measurement target with lattice patterns based on the N-bucket method, and
     the feature amount calculation unit calculates, as the feature amount, a similarity between a plurality of the captured images capturing images in which lattice patterns having the same initial phase are projected onto the measurement target.
  5.  The shape measuring apparatus according to claim 4, wherein the imaging unit captures images in which other patterns are projected onto the measurement target between a plurality of imaging timings at which images in which lattice patterns having the same initial phase are projected onto the measurement target are captured.
  6.  The shape measuring apparatus according to claim 4 or 5, wherein the feature amount calculation unit calculates, as the feature amount, similarities between a plurality of pairs of the captured images capturing images in which lattice patterns having the same initial phase are projected onto the measurement target.
  7.  The shape measuring apparatus according to any one of claims 1 to 6, further comprising a storage unit in which a template image, which is a captured image capturing an image of the lattice pattern projected onto the measurement target, is stored in advance,
     wherein the feature amount calculation unit calculates a similarity between the captured image generated by the imaging unit and the template image.
  8.  The shape measuring apparatus according to any one of claims 1 to 7, wherein the feature amount calculation unit calculates a sharpness of the captured image as the feature amount.
  9.  The shape measuring apparatus according to claim 8, wherein the feature amount calculation unit calculates a spatial frequency distribution of the captured image and calculates the sharpness based on a centroid position of the spatial frequency distribution.
  10.  The shape measuring apparatus according to claim 8, wherein the determination unit determines whether blur is present in the captured images based on both the similarity between the plurality of captured images calculated by the feature amount calculation unit and the sharpness of the captured images.
  11.  The shape measuring apparatus according to claim 10, wherein the feature amount calculation unit calculates the similarity between the plurality of captured images when the determination unit determines that the sharpness of every one of the plurality of captured images is equal to or greater than a predetermined threshold.
  12.  The shape measuring apparatus according to any one of claims 1 to 11, wherein the irradiation unit irradiates illumination light having a uniform intensity distribution such that an image in which the uniform illumination light is projected onto the measurement target is captured while the imaging unit is imaging, and
     the feature amount calculation unit calculates, as the feature amount, a similarity between a composite image obtained by combining a plurality of captured images having different initial phases and a captured image capturing the measurement target irradiated with the uniform illumination light.
  13.  The shape measuring apparatus according to any one of claims 1 to 12, wherein the feature amount calculation unit calculates spatial frequency distributions of the plurality of captured images as the feature amount.
  14.  The shape measuring apparatus according to any one of claims 1 to 13, wherein the feature amount calculation unit calculates, as the feature amount, the highest spatial frequency component among the spatial frequency components included in the captured image.
  15.  The shape measuring apparatus according to any one of claims 1 to 14, wherein the irradiation unit irradiates the measurement target with lattice patterns based on a spatial code method, and
     the feature amount calculation unit calculates a spatial code for each of the plurality of captured images as the feature amount.
  16.  The shape measuring apparatus according to any one of claims 1 to 15, further comprising a warning unit that outputs a warning that blur is present when the determination unit determines that blur is present in the captured image.
  17.  The shape measuring apparatus according to any one of claims 1 to 16, wherein, when the determination unit determines that blur is present in the captured image, the imaging unit captures the measurement target irradiated with the illumination light again and generates a captured image.
  18.  The shape measuring apparatus according to any one of claims 1 to 17, wherein the irradiation unit comprises: a light source; a light intensity modulation unit for modulating the intensity of light from the light source; a light intensity distribution conversion unit that converts the intensity distribution of the light from the light source into a line-shaped intensity distribution; and a scanning mirror arranged to scan the light from the light source over the measurement target in directions respectively perpendicular to the longitudinal direction of the line and to the projection direction of the light from the light source.
  19.  A structure manufacturing system comprising:
     a design apparatus that creates design information relating to a shape of a structure;
     a molding apparatus that produces the structure based on the design information;
     the shape measuring apparatus according to any one of claims 1 to 18, which measures the shape of the produced structure based on captured images; and
     an inspection apparatus that compares shape information obtained by the measurement with the design information.
  20.  A shape measuring method performed by a shape measuring apparatus comprising an imaging unit that generates a captured image of a measurement target, and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit captures images, such that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target, the method comprising:
     calculating, from the captured image, a feature amount indicating a degree of blur present in the captured image; and
     determining, based on the feature amount, whether blur is present in the captured image.
  21.  A structure manufacturing method comprising:
     creating design information relating to a shape of a structure;
     producing the structure based on the design information;
     measuring the shape of the produced structure based on captured images generated using the shape measuring method according to claim 20; and
     comparing shape information obtained by the measurement with the design information.
  22.  A shape measuring program causing a computer of a shape measuring apparatus, which comprises an imaging unit that generates a captured image of a measurement target and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit captures images, such that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target, to execute:
     calculating, from the captured image, a feature amount indicating a degree of blur present in the captured image; and
     determining, based on the feature amount, whether blur is present in the captured image.
  23.  A computer-readable recording medium on which the shape measuring program according to claim 22 is recorded.
PCT/JP2012/072913 2011-09-09 2012-09-07 Shape measurement device, structure manufacturing system, shape measurement method, structure manufacturing method, shape measurement program, and computer-readable recording medium WO2013035847A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
JP2011196865 2011-09-09
JP2011-196865 2011-09-09

Publications (1)

Publication Number Publication Date
WO2013035847A1 true WO2013035847A1 (en) 2013-03-14

Family

ID=47832286

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/072913 WO2013035847A1 (en) 2011-09-09 2012-09-07 Shape measurement device, structure manufacturing system, shape measurement method, structure manufacturing method, shape measurement program, and computer-readable recording medium

Country Status (3)

Country Link
JP (1) JPWO2013035847A1 (en)
TW (1) TW201329419A (en)
WO (1) WO2013035847A1 (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105027159A (en) * 2013-03-26 2015-11-04 凸版印刷株式会社 Image processing device, image processing system, image processing method, and image processing program
WO2016075978A1 (en) * 2014-11-12 2016-05-19 ソニー株式会社 Information processing device, information processing method, and program
US11313676B2 (en) * 2018-02-07 2022-04-26 Omron Corporation Three-dimensional measurement apparatus, three-dimensional measurement method, and three-dimensional measurement non-transitory computer readable medium

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004070421A (en) * 2002-08-01 2004-03-04 Fuji Photo Film Co Ltd System and method for image processing, imaging device, and image processor
US20060058972A1 (en) * 2004-09-15 2006-03-16 Asml Netherlands B.V. Method and apparatus for vibration detection, method and apparatus for vibration analysis, lithographic apparatus, device manufacturing method, and computer program
US20060120618A1 (en) * 2004-12-06 2006-06-08 Canon Kabushiki Kaisha Image processing apparatus, method of controlling thereof, and program
JP2007158488A (en) * 2005-12-01 2007-06-21 Sharp Corp Camera shake detecting apparatus
JP2007278951A (en) * 2006-04-10 2007-10-25 Alpine Electronics Inc Car body behavior measuring device
JP2008190962A (en) * 2007-02-02 2008-08-21 Aidin System Kk Three-dimensional measurement apparatus
US20100158490A1 (en) * 2008-12-19 2010-06-24 Sirona Dental Systems Gmbh Method and device for optical scanning of three-dimensional objects by means of a dental 3d camera using a triangulation method


Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105027159A (en) * 2013-03-26 2015-11-04 凸版印刷株式会社 Image processing device, image processing system, image processing method, and image processing program
EP2980752A4 (en) * 2013-03-26 2016-11-09 Toppan Printing Co Ltd Image processing device, image processing system, image processing method, and image processing program
JPWO2014157348A1 (en) * 2013-03-26 2017-02-16 凸版印刷株式会社 Image processing apparatus, image processing system, image processing method, and image processing program
US10068339B2 (en) 2013-03-26 2018-09-04 Toppan Printing Co., Ltd. Image processing device, image processing system, image processing method and image processing program
WO2016075978A1 (en) * 2014-11-12 2016-05-19 ソニー株式会社 Information processing device, information processing method, and program
KR20170085494A (en) * 2014-11-12 2017-07-24 소니 주식회사 Information processing device, information processing method, and program
US11189024B2 (en) 2014-11-12 2021-11-30 Sony Corporation Information processing apparatus and information processing method
KR102503872B1 (en) * 2014-11-12 2023-02-27 소니그룹주식회사 Information processing device, information processing method, and program
US11313676B2 (en) * 2018-02-07 2022-04-26 Omron Corporation Three-dimensional measurement apparatus, three-dimensional measurement method, and three-dimensional measurement non-transitory computer readable medium

Also Published As

Publication number Publication date
JPWO2013035847A1 (en) 2015-03-23
TW201329419A (en) 2013-07-16

Similar Documents

Publication Publication Date Title
JP5395507B2 (en) Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, and computer program
KR101639227B1 (en) Three dimensional shape measurment apparatus
DK2198780T3 (en) Method and Device for Optical Scanning of Three-Dimensional Objects Using a 3D Dental Camera Using Triangulation
JP5994787B2 (en) Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program
EP1946376B1 (en) Apparatus for and method of measuring image
WO2012057284A1 (en) Three-dimensional shape measurement device, three-dimensional shape measurement method, manufacturing method of structure, and structure manufacturing system
JP5385703B2 (en) Inspection device, inspection method, and inspection program
KR20190104367A (en) 3D shape measuring device, 3D shape measuring method and program
JP5595211B2 (en) Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, and computer program
US8970674B2 (en) Three-dimensional measurement apparatus, three-dimensional measurement method and storage medium
JP2009036589A (en) Target for calibration and device, method and program for supporting calibration
WO2013035847A1 (en) Shape measurement device, structure manufacturing system, shape measurement method, structure manufacturing method, shape measurement program, and computer-readable recording medium
JP2010181247A (en) Shape measurement apparatus and shape measurement method
KR101465996B1 (en) Method for measurement of high speed 3d shape using selective long period
JP2012093234A (en) Three-dimensional shape measurement device, three-dimensional shape measurement method, structure manufacturing method, and structure manufacturing system
JP2006084286A (en) Three-dimensional measuring method and its measuring device
JP2014035198A (en) Shape measurement device, structure manufacturing system, shape measurement method, structure manufacturing method, and shape measurement program
JP2017125707A (en) Measurement method and measurement device
EP2198780B1 (en) Method and device for optical scanning of three-dimensional objects by means of a dental 3D camera using a triangulation method
JP2008170282A (en) Shape measuring device
JP2010210410A (en) Three-dimensional shape measuring device
JP2016008837A (en) Shape measuring method, shape measuring device, structure manufacturing system, structure manufacturing method, and shape measuring program
JP4930834B2 (en) Shape measurement method
JP6460944B2 (en) Method for measuring position and orientation of measured object
JP2008216076A (en) Shape measuring method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12829855

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase

Ref document number: 2013532674

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12829855

Country of ref document: EP

Kind code of ref document: A1