WO2013035847A1 - Shape measurement device, structure manufacturing system, shape measurement method, structure manufacturing method, shape measurement program, and computer-readable recording medium - Google Patents


Info

Publication number
WO2013035847A1
Authority
WO
WIPO (PCT)
Prior art keywords: image, captured, feature amount, captured image, unit
Application number
PCT/JP2012/072913
Other languages: French (fr), Japanese (ja)
Inventor: 青木 洋
Original assignee: 株式会社ニコン (Nikon Corporation)
Priority: JP 2011-196865
Application filed by: 株式会社ニコン (Nikon Corporation)
Publication of WO2013035847A1

Classifications

    • G01B 11/2513 (Physics; Measuring; Testing): Measuring arrangements characterised by the use of optical means for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes, on the object, with several lines projected in more than one direction, e.g. grids, patterns
    • G06T 7/0002 (Image data processing; Image analysis): Inspection of images, e.g. flaw detection
    • G06T 7/521 (Image data processing; Depth or shape recovery): Depth or shape recovery from the projection of structured light
    • G06T 2207/10016 (Indexing scheme; Image acquisition modality): Video; image sequence
    • G06T 2207/10152 (Indexing scheme; Special mode during image acquisition): Varying illumination
    • G06T 2207/30168 (Indexing scheme; Subject of image): Image quality inspection
    • G06T 2207/30244 (Indexing scheme; Subject of image): Camera pose

Abstract

An objective of the present invention is to determine whether blur is present in a captured image used for measuring the shape of a measurement target. A shape measurement device comprises: an imaging unit that generates a captured image of the measurement target; an irradiation unit that irradiates the measurement target with illumination light having a prescribed intensity distribution from a direction different from the direction in which the imaging unit captures the image, such that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target; a feature amount calculation unit that calculates, from the captured image, a feature amount denoting the degree of blur present in the captured image; and a determination unit that determines, on the basis of the feature amount, whether blur is present in the captured image.

Description

Shape measuring device, structure manufacturing system, shape measuring method, structure manufacturing method, shape measuring program, and computer-readable recording medium

The present invention relates to a shape measuring device, a structure manufacturing system, a shape measuring method, a structure manufacturing method, a shape measuring program, and a computer-readable recording medium.

As a technique for measuring the surface shape (three-dimensional shape) of a measurement object in a non-contact manner, a pattern projection type shape measuring apparatus using the phase shift method is known (for example, see Patent Document 1). In this shape measuring apparatus, a grating pattern having a sinusoidal intensity distribution is projected onto a measurement object, and the measurement object is repeatedly imaged while the phase of the grating pattern is changed at a constant pitch. By applying the plurality of captured images (luminance change data) obtained in this manner to a predetermined arithmetic expression, the phase distribution (phase image) of the lattice pattern deformed according to the surface shape of the measurement object is obtained; the phase image is then unwrapped (phase connection) and converted into a height distribution (height image) of the measurement object.
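The phase-shift arithmetic described above can be illustrated with a minimal sketch. This is a standard textbook N-step formula, assuming the n-th image obeys I_n = a + b*cos(phi - 2*pi*n/N); it is not necessarily the exact arithmetic expression of Patent Document 1:

```python
import numpy as np

def phase_from_shifted_images(images):
    """Wrapped phase map from N phase-shifted fringe images (N-bucket method).

    Assumes the n-th image obeys I_n = a + b*cos(phi - 2*pi*n/N); the sign
    convention varies with the direction in which the phase is shifted.
    """
    stack = np.stack([np.asarray(im, dtype=float) for im in images])
    n = stack.shape[0]
    shifts = 2.0 * np.pi * np.arange(n) / n
    num = np.tensordot(np.sin(shifts), stack, axes=1)  # sum_n I_n * sin(d_n)
    den = np.tensordot(np.cos(shifts), stack, axes=1)  # sum_n I_n * cos(d_n)
    return np.arctan2(num, den)  # wrapped phase in (-pi, pi]
```

The returned phase is wrapped; the phase connection (unwrapping) and height conversion steps mentioned above would follow separately.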

Patent Document 1: JP 2009-180689 A

However, according to the knowledge of the present inventors, an error occurs in the measured value of the three-dimensional shape when blur occurs while capturing the plurality of captured images onto which grating patterns having different phases are projected. It is therefore desirable to determine whether or not blur has occurred in the captured images.

The present invention has been made in view of such circumstances, and provides a shape measuring device, a structure manufacturing system, a shape measuring method, a structure manufacturing method, and a shape measuring program that determine whether or not blur is present in a captured image used for measuring the shape of a measurement target.

In order to solve the above problem, the present invention is a shape measuring apparatus comprising: an imaging unit that generates a captured image of a measurement target; an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit captures the image, so that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target; a feature amount calculation unit that calculates, from the captured image, a feature amount indicating the degree of blur present in the captured image; and a determination unit that determines, based on the feature amount, whether or not blur is present in the captured image.

The present invention is also a structure manufacturing system including: a design apparatus that produces design information related to the shape of a structure; a molding apparatus that produces the structure based on the design information; the above-described shape measuring device, which measures the shape of the produced structure based on a captured image; and an inspection device that compares the shape information obtained by the measurement with the design information.

The present invention is also a shape measuring method for a shape measuring device comprising an imaging unit that generates a captured image of a measurement target and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit captures the image, so that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target, the method comprising: calculating, from the captured image, a feature amount indicating the degree of blur present in the captured image; and determining, based on the feature amount, whether or not blur is present in the captured image.

The present invention is also a structure manufacturing method comprising: creating design information related to the shape of a structure; creating the structure based on the design information; measuring the shape of the created structure based on a captured image using the shape measuring method described above; and comparing the shape information obtained by the measurement with the design information.

The present invention is also a shape measuring program that causes a computer of a shape measuring apparatus, which comprises an imaging unit that generates a captured image of a measurement target and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit captures the image so that the captured image is captured as an image in which a lattice pattern is projected onto the measurement target, to execute: a step of calculating, from the captured image, a feature amount indicating the degree of blur present in the captured image; and a step of determining, based on the feature amount, whether or not blur is present in the captured image.

As described above, according to the present invention, it is possible to determine whether or not there is a blur in the captured image for measuring the shape of the measurement target.

FIG. 1 is a block diagram showing a configuration example of the shape measuring apparatus according to the first embodiment of the present invention.
FIG. 2 is a schematic block diagram of the irradiation unit according to the first embodiment of the present invention.
FIG. 3 is a diagram showing an example of captured images according to the first embodiment of the present invention.
FIG. 4 is a flowchart showing an operation example of the shape measuring apparatus according to the first embodiment of the present invention.
FIG. 5 is a diagram showing the frequency distribution curve of the spatial frequency in image data in which no blur has occurred.
FIG. 6 is a diagram showing the frequency distribution curve of the spatial frequency in image data in which blur has occurred.
FIG. 7 is a flowchart showing an operation example of the shape measuring apparatus according to the second embodiment of the present invention.
FIG. 8 is a diagram showing an example of the images compared in the third embodiment of the present invention.
FIG. 9 is a diagram showing an example of the images compared in the fourth embodiment of the present invention.
FIG. 10 is a block diagram showing a configuration example of the shape measuring apparatus according to the seventh embodiment of the present invention.
FIG. 11 is a flowchart showing an operation example of the shape measuring apparatus according to the seventh embodiment of the present invention.

Hereinafter, an embodiment of the present invention will be described with reference to the drawings.
<First embodiment: Comparison of similarities between a plurality of images having the same initial phase>
FIG. 1 is a block diagram showing the configuration of a shape measuring apparatus 10 according to the first embodiment of the present invention. The shape measuring apparatus 10 includes an operation input unit 11, a display unit 12, an imaging unit 13, an irradiation unit 14, a storage unit 15, a feature amount calculation unit 16, a determination unit 17, and a point group calculation unit 18, and is a computer terminal that measures the three-dimensional shape of the measurement object A by the phase shift method. The shape measuring apparatus 10 detects whether or not blur is present in the captured images when capturing the plurality of phase-shifted captured images used by the phase shift method. Possible causes of blur include, for example, movement of the measurement object during imaging, or camera shake when the user holds the portable shape measuring apparatus 10 by hand rather than mounting it on a tripod or the like. Blur that affects the measurement results may be blur in which the imaging range differs between the multiple captured images, or blur caused by the measurement object moving or the camera shaking during the exposure time of a single captured image. These types of blur may also occur simultaneously.

In the present embodiment, images in which a plurality of grating patterns having different initial phases based on the N-bucket method are projected onto the measurement object are captured, and the shape of the measurement object is measured based on the luminance values of the same pixel in each image. Usually, the shape is measured from fringe images having different initial phases; in this embodiment, however, a lattice pattern having the same initial phase as at least one of those initial phases is projected again, and the captured images of the lattice patterns having the same initial phase are compared. In this way, the similarity of two or more captured images, captured at different timings, in which the same initial-phase lattice pattern appears is calculated. If the similarity is equal to or greater than a threshold, it can be determined that no blur occurred between the imaging timings; if it is below the threshold, it can be determined that blur occurred between the imaging timings. The shape measuring apparatus 10 determines whether such blur has occurred during imaging, and issues a warning if it has. Thereby, for example, imaging can be performed again when blur occurs.

The operation input unit 11 receives operation input from the user. For example, the operation input unit 11 includes operation members such as a power button for switching the main power supply on and off and a release button for receiving an instruction to start the imaging process. Alternatively, the operation input unit 11 can receive input via a touch panel.
The display unit 12 is a display that shows various types of information. For example, when the determination unit 17 described later determines that blur is present in a captured image, the display unit 12 displays a warning to that effect. The display unit 12 also displays, for example, the point cloud data indicating the three-dimensional shape of the measurement target calculated by the point group calculation unit 18.

The imaging unit 13 generates captured images of the measurement target and performs an imaging process that stores the generated captured images in the storage unit 15. The imaging unit 13 operates in conjunction with the irradiation unit 14 and performs the imaging process in accordance with the timing at which the illumination light is projected onto the measurement target by the irradiation unit 14. In the present embodiment, the imaging unit 13 generates, for each initial phase, a captured image in which one of the plurality of lattice patterns having different initial phases based on the N-bucket method is projected onto the measurement target by the irradiation unit 14. When the determination unit 17 determines that blur may be present in the captured images, the user is warned of the possibility of blur. In response to the warning, the user can cause the shape measuring apparatus 10 to again acquire images in which the plurality of lattice patterns having different initial phases based on the N-bucket method are projected onto the measurement object. If imaging is repeated in this way until no warning about possible blur is generated, shape measurement with reduced blur can be performed.
Note that the shape measuring apparatus 10 may automatically continue capturing images in which the plurality of lattice patterns having different initial phases based on the N-bucket method are projected onto the measurement target until blur is no longer determined to be present. In this case, the measurement object irradiated with the illumination light is imaged again to generate new captured images.

The irradiation unit 14 irradiates the measurement object with illumination light having a predetermined intensity distribution, from a direction different from the direction in which the imaging unit 13 captures the image, so that an image in which the lattice pattern is projected onto the measurement target is captured when the imaging unit 13 images the measurement target. Here, the irradiation unit 14 irradiates the illumination light so that images in which a plurality of lattice patterns having a spatial frequency of a certain period and initial phases differing by 90 degrees are projected onto the measurement object based on the N-bucket method can be captured sequentially. For example, as shown in FIG. 2, the irradiation unit 14 has a light source 1, and a collimator lens 2 and a cylindrical lens 3 that convert the light from the light source 1 into a linear light intensity distribution whose longitudinal direction is orthogonal to the light irradiation direction. The light beam having this linear intensity distribution is scanned across the measurement object, in the direction perpendicular to the beam's longitudinal direction, by a scanning mirror 4 (a MEMS (Micro Electro Mechanical Systems) mirror).

The light source 1 is further provided with a light source control unit 5 that controls the intensity of the light emitted from the light source 1. By having the light source control unit 5 modulate the intensity of the laser light while the scanning mirror sequentially changes the deflection direction of the laser light, the image acquired by the imaging unit 13 is the same as if a fringe pattern had been projected onto the measurement target.
In other words, the intensity distribution of the laser light emitted from the light source 1 is shaped to have a linear light intensity distribution in one direction perpendicular to the optical axis. The light beam having this linear intensity distribution is scanned, while its intensity is changed, in the direction perpendicular to both the optical axis and the longitudinal direction. As a result, pattern light of a striped pattern (sine lattice) having a sinusoidal luminance change in one direction is formed. That is, the light of the light source is shifted in the direction perpendicular to the optical axis by a mirror using MEMS technology while its intensity is changed sinusoidally. In this way, the illumination light of the lattice pattern is projected onto the measurement target. Although an example in which laser light is projected using MEMS technology is shown here, the illumination light can also be projected using a liquid crystal projector or the like.
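The sine lattice formed in this way can be sketched as follows. This is an illustration of the intensity distribution only; the function name and the [0, 1] normalization are assumptions, and the patent's actual MEMS drive waveform is not reproduced:

```python
import numpy as np

def sine_grating(width, height, period_px, initial_phase):
    """Intensity pattern of a sinusoidal lattice, with values in [0, 1].

    period_px is the fringe period in pixels; initial_phase is in radians.
    Every row is identical: the luminance varies sinusoidally in one
    direction only, as with the striped pattern described in the text.
    """
    x = np.arange(width)
    row = 0.5 + 0.5 * np.sin(2.0 * np.pi * x / period_px + initial_phase)
    return np.tile(row, (height, 1))
```

Shifting `initial_phase` by 90 degrees (pi/2 radians) between captures yields the phase-shifted patterns A, B, C, D described below.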

FIG. 3 is a diagram illustrating an example of a measurement target onto which the irradiation unit 14 projects the illumination light while shifting the initial phase by 90 degrees. Here, A has an initial phase of 0 degrees, B has an initial phase shifted 90 degrees from A, C has an initial phase shifted 180 degrees from A, and D has an initial phase shifted 270 degrees from A. For example, in the case of the 5-bucket method, five captured images A to E with the initial phase shifted by a constant angle are generated, and in the case of the 7-bucket method, seven captured images A to G with the initial phase shifted by a constant angle are generated. The order of imaging does not necessarily have to be A, B, C, D, E; for example, images can be captured in the order A, E, B, C, D, with the imaging process performed while shifting the initial phase accordingly. In other words, between the plurality of imaging timings at which images with a lattice pattern of the same initial phase (for example, A and E) are captured, the imaging unit 13 may capture images in which lattice patterns of other initial phases (for example, B, C, D) are projected onto the measurement object. By separating in time the imaging timings at which the same initial-phase lattice pattern is captured, captured images can be compared across an interval in which camera shake is more likely to have occurred, which increases the accuracy of blur detection.

The storage unit 15 stores the captured images generated by the imaging unit 13, the point cloud data calculated by the point group calculation unit 18, and the like.
The feature amount calculation unit 16 calculates, from the captured images generated by the imaging unit 13, a feature amount indicating the degree of blur present in the captured images. In the present embodiment, the feature amount calculation unit 16 calculates, as the feature amount, the similarity between a plurality of captured images in which a lattice pattern having the same initial phase is projected onto the measurement target. The similarity can be calculated by, for example, the following formula (1).

[Formula (1): similarity calculation formula, provided only as an image (JPOXMLDOC01-appb-M000001) in the source document]

Here, the feature amount calculation unit 16 may calculate, as the feature amount, the similarity of a single pair of captured images in which a lattice pattern having the same initial phase is projected onto the measurement target, or may calculate the similarities of multiple such pairs. For example, when nine captured images capturing a lattice pattern whose initial phase is shifted by 90 degrees are generated by the 9-bucket method, only the similarity between the first captured image, whose initial phase is 0 degrees, and the fifth captured image, whose initial phase is 360 degrees (that is, 0 degrees), may be calculated; or, as shown in FIG. 3, the similarities of multiple pairs of captured images whose initial phases are the same angle, such as A and E, B and F, C and G, and D and H, may be calculated. Determining blur from the similarities of multiple pairs of captured images in this way makes it possible to detect blur more accurately.
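Since formula (1) appears only as an image in the source, the following sketch uses zero-mean normalized cross-correlation as an illustrative stand-in for the similarity computation; the function names and the 0.95 threshold are assumptions, not values from the patent:

```python
import numpy as np

def zncc_similarity(img_a, img_b):
    """Zero-mean normalized cross-correlation between two captured images.

    Returns a value in [-1, 1]; a value of 1 means the two same-initial-phase
    images match exactly, i.e. no displacement occurred between exposures.
    """
    a = np.ravel(img_a).astype(float)  # astype copies, so inputs are untouched
    b = np.ravel(img_b).astype(float)
    a -= a.mean()
    b -= b.mean()
    denom = np.sqrt((a @ a) * (b @ b))
    if denom == 0.0:
        return 0.0  # flat images carry no pattern to compare
    return float((a @ b) / denom)

def has_blur(img_a, img_b, threshold=0.95):
    """Determination-unit logic: blur is judged present when the similarity
    of the two same-initial-phase images falls below the threshold."""
    return zncc_similarity(img_a, img_b) < threshold
```

For multiple pairs (A and E, B and F, and so on), the same check could be applied to each pair and blur flagged if any similarity falls below the threshold.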

The determination unit 17 determines whether blur is present in the captured images based on the feature amount calculated by the feature amount calculation unit 16. For example, the determination unit 17 stores a predetermined similarity threshold in its own storage area and compares the similarity calculated by the feature amount calculation unit 16 with that threshold. The determination unit 17 determines that no blur is present if the calculated similarity is equal to or greater than the threshold, and that blur is present if it is not.
The point group calculation unit 18 performs point group calculation processing, such as phase calculation and phase connection, based on the plurality of captured images generated by the imaging unit 13, calculates point cloud data, and stores it in the storage unit 15.

Next, an operation example of the shape measuring apparatus 10 according to the present embodiment will be described with reference to the drawings. FIG. 4 is a flowchart explaining an operation example in which the shape measuring apparatus 10 performs the shape measurement process.
When an imaging instruction is input from the user to the operation input unit 11 (step S1), the imaging unit 13 starts the imaging process of the measurement target, and the irradiation unit 14 starts the process of projecting the illumination light onto the measurement target (step S2). The imaging unit 13 stores in the storage unit 15, for example, five captured images captured during fringe projection with initial phases of 0, 90, 180, 270, and 360 degrees (step S3). The feature amount calculation unit 16 reads, from among the plurality of captured images stored in the storage unit 15, a plurality of captured images having the same initial phase (for example, 0 degrees and 360 degrees), and calculates the similarity between them (step S4).

The determination unit 17 compares the similarity calculated by the feature amount calculation unit 16 with a predetermined threshold (step S5). When the determination unit 17 determines that the similarity is not equal to or greater than the threshold (step S5: NO), the display unit 12 displays a warning that the captured images are blurred (step S6). Here, although camera shake may have occurred, a selection input can be accepted from the user as to whether or not to continue the point cloud data calculation process. If an instruction not to continue is input to the operation input unit 11 (step S7: NO), the process returns to step S1. If an instruction to continue is input (step S7: YES), the process proceeds to step S8.

In step S5, when the determination unit 17 determines that the similarity calculated by the feature amount calculation unit 16 is equal to or greater than the predetermined threshold (step S5: YES), the point group calculation unit 18 calculates point cloud data based on the captured images stored in the storage unit 15 and stores it in the storage unit 15 (step S8). The display unit 12 then displays the calculated point cloud data (step S9).
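The flow of steps S1 to S9 can be sketched as the following control loop. All callable names and the 0.95 threshold are illustrative stand-ins for the units described above, not identifiers from the patent:

```python
def shape_measurement_flow(capture_images, compute_similarity,
                           calculate_point_cloud, warn, ask_continue,
                           threshold=0.95):
    """Control flow of steps S1-S9 of the first embodiment.

    The callables are injected so the sketch stays independent of any
    particular hardware: capture_images plays the imaging/irradiation
    units, compute_similarity the feature amount calculation unit,
    calculate_point_cloud the point group calculation unit.
    """
    while True:
        images = capture_images()                # S2-S3: project fringes, capture
        similarity = compute_similarity(images)  # S4: same-initial-phase pair
        if similarity < threshold:               # S5: blur suspected
            warn("blur may be present in the captured images")  # S6
            if not ask_continue():               # S7: user chooses to retry
                continue                         # back to imaging (S1/S2)
        return calculate_point_cloud(images)     # S8-S9: point cloud, display
```

Note that the blur check runs before the heavy point cloud calculation, which is exactly the ordering the following paragraph motivates.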

As described above, according to the present embodiment, among the plurality of captured images with different initial phases captured based on the N-bucket method, the similarity between captured images having the same initial phase is calculated, and whether blur is present can be determined based on that similarity. This makes it possible to re-image and re-measure when blur occurs. The point cloud calculation process performed by the point group calculation unit 18 based on the captured images is relatively heavy, and may take several seconds (for example, 5 seconds) to complete. Therefore, in this embodiment, the similarity calculation and blur determination are performed after the imaging process by the imaging unit 13 and before the point cloud calculation process by the point group calculation unit 18 is started, and a warning is displayed if blur is present. The user can thus learn that blur is present in the captured images before the high-load point cloud calculation process is performed, and can input a re-imaging instruction. That is, compared with learning of the blur only when the point cloud data is displayed on the display unit 12 after the point cloud calculation has run, the occurrence of blur is known at an earlier stage, before the point cloud calculation is performed. This reduces unnecessary waiting time before re-measuring the shape of the measurement target, and allows the measurement process to proceed efficiently without consuming unnecessary power.

<Second Embodiment: Blur Determination Based on Sharpness of Captured Image>
In the first embodiment, an example was described in which whether blur is present in the captured images is determined based on the similarity of the plurality of captured images generated by the N-bucket method; however, the presence of blur can also be determined by other methods. The second embodiment of the present invention is described below. The shape measuring apparatus 10 according to this embodiment has the same configuration as that of the first embodiment shown in FIG. 1. The first embodiment detects blur in which the imaging range differs between the plurality of captured images; this embodiment, in contrast, detects blur caused by the measurement target moving, or by camera shake, during the exposure time of a single captured image.

In the present embodiment, the feature amount calculation unit 16 calculates the sharpness (degree of blur) of the captured image generated by the imaging unit 13 as the feature amount. The determination unit 17 stores in advance a sharpness threshold for determining whether blur has occurred, and compares the sharpness calculated by the feature amount calculation unit 16 with that threshold. If the calculated sharpness is equal to or greater than the threshold, the determination unit 17 determines that no blur is present in the captured image; if it is not, the determination unit 17 determines that blur is present.

The sharpness of the image can be calculated by a predetermined formula, for example based on the spatial frequency distribution. Consider, as shown in FIGS. 5 and 6, the frequency distribution curve A1 of the spatial frequency in image data in which the same measurement object is imaged without blur, and the frequency distribution curve B1 of the spatial frequency in image data in which blur has occurred. Comparing the two, the center of gravity of the figure enclosed by frequency distribution curve A1 lies further toward the high spatial frequency side than that of curve B1. In FIGS. 5 and 6, the frequency corresponding to the position of the dotted line is the spatial frequency of the projected fringes.
Therefore, to determine whether blur has occurred for a given measurement target, the spatial frequency corresponding to the position of the center of gravity of the region enclosed by the frequency distribution curve is detected, and whether blur has occurred in the captured image can be determined by comparing the detected spatial frequency with a preset spatial frequency threshold.
Note that attention may be paid to the spatial frequency of the fringe pattern appearing in the image data, and if there is a fine pattern on the surface of the measurement object, attention can be paid to that fine pattern instead. That is, by detecting whether the spatial frequency of the pattern has become low, it is possible to determine whether the captured image is blurred.

In this case, the feature amount calculation unit 16 performs a Fourier transform on the captured image generated by the imaging unit 13 and stored in the storage unit 15, and uses the spatial frequency corresponding to the center of gravity obtained as described above as the sharpness (feature amount). The determination unit 17 determines whether blur exists in the captured image by checking whether the feature amount calculated by the feature amount calculation unit 16 is equal to or greater than a predetermined threshold relative to a reference sharpness.
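As a sketch of the sharpness computation described above, the following assumes the amplitude spectrum of the capture is reduced to its radial-frequency center of gravity; the function names and the use of a radial centroid (rather than the 1-D distribution curve of the figures) are illustrative assumptions:

```python
import numpy as np

def sharpness_from_spectrum(image):
    """Return a sharpness score: the center of gravity (weighted mean) of
    radial spatial frequency in the image's amplitude spectrum. Blurred
    images lose high-frequency energy, so the centroid shifts lower."""
    spectrum = np.abs(np.fft.fft2(image))
    fy = np.fft.fftfreq(image.shape[0])[:, None]   # vertical frequencies
    fx = np.fft.fftfreq(image.shape[1])[None, :]   # horizontal frequencies
    radial = np.hypot(fy, fx)                      # radial frequency per bin
    return float((radial * spectrum).sum() / spectrum.sum())

def is_blurred(image, threshold):
    """Determination as in this embodiment: sharpness below the stored
    threshold means the captured image is judged to be blurred."""
    return sharpness_from_spectrum(image) < threshold
```

The threshold would in practice be calibrated against captures of the same measurement target known to be blur-free.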

Here, both the per-image blur determination of the present embodiment and the blur determination based on the similarity between a plurality of captured images shown in the first embodiment can be performed, enabling more efficient and accurate blur detection. For example, each time the imaging unit 13 generates a captured image with a different initial phase, blur determination based on sharpness can be performed, and a warning can be issued when the sharpness is determined not to reach the threshold. As a result, unnecessary imaging processing and unnecessary point cloud data calculation processing can be avoided.

FIG. 7 is a flowchart showing an operation example of such a shape measuring apparatus 10. When an imaging instruction is input from the user to the operation input unit 11 (step S10), the imaging unit 13 starts the imaging process of the measurement target, and the irradiation unit 14 starts the process of projecting irradiation light onto the measurement target (step S11). The imaging unit 13 stores in the storage unit 15 a captured image generated each time a lattice pattern having a different initial phase is projected onto the measurement target by the irradiation unit 14 (step S12). The feature amount calculation unit 16 reads the captured image stored in the storage unit 15, obtains the spatial frequency distribution by Fourier-transforming the read image, and acquires the sharpness (step S13).

The determination unit 17 compares the sharpness calculated by the feature amount calculation unit 16 with a predetermined sharpness threshold (step S14). When the determination unit 17 determines that the sharpness is not equal to or greater than the threshold (step S14: NO), the display unit 12 displays a warning that the captured image is blurred (step S15). If an instruction not to continue the process is input to the operation input unit 11 (step S16: NO), the process returns to step S10. If an instruction to continue the process is input to the operation input unit 11 (step S16: YES), the process proceeds to step S17.

In step S14, when the determination unit 17 determines that the calculated sharpness is equal to or greater than the predetermined threshold relative to the reference sharpness (step S14: YES), the determination unit 17 determines whether a prescribed number or more of captured images have been generated (step S17). The prescribed number is, for example, 5 when imaging by the 5-bucket method and 7 when imaging by the 7-bucket method. If the determination unit 17 determines that the prescribed number of captured images has not yet been generated (step S17: NO), the process returns to step S11. If the determination unit 17 determines that the prescribed number or more of captured images have been generated (step S17: YES), the process proceeds to step S18, and the same processing as steps S4 to S9 described in the first embodiment is performed (steps S18 to S23).
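The capture-check-repeat flow of steps S11 to S17 can be sketched as follows. The `capture` and `sharpness` callbacks are hypothetical stand-ins for the imaging unit 13 and feature amount calculation unit 16, and the retry-on-warning behavior is one possible realization of the user choosing to continue at step S16:

```python
def measure_with_blur_check(capture, sharpness, threshold,
                            n_required=5, max_retries=3):
    """Acquire `n_required` phase-shifted frames (5-bucket method => 5,
    7-bucket method => 7), re-capturing any frame whose sharpness falls
    below `threshold` instead of carrying blur into the phase analysis."""
    images = []
    while len(images) < n_required:          # step S17: prescribed number?
        k = len(images)
        for attempt in range(max_retries):
            img = capture(k)                 # steps S11-S12: project and capture
            if sharpness(img) >= threshold:  # steps S13-S14: blur check
                images.append(img)
                break
        else:
            # step S15-S16: warn; here we give up after max_retries
            raise RuntimeError(f"frame {k} still blurred after {max_retries} attempts")
    return images
```

Only after all frames pass does the point cloud calculation (steps S18 onward) begin, which is what avoids wasted computation.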

As described above, according to the present embodiment, the feature amount calculation unit 16 calculates the sharpness for each captured image, the determination unit 17 performs the blur determination, and only when it is determined that all sharpness values of the plurality of captured images are equal to or greater than the threshold and no blur exists does the feature amount calculation unit 16 calculate the similarity between the plurality of captured images. Thereby, before the imaging process for all patterns based on the plurality of initial phases is completed, blur determination can be performed for each captured image corresponding to an initial phase, and a warning can be output when blur is detected. For this reason, unnecessary imaging processing and point cloud data calculation processing can be prevented, and the shape of the measurement target can be measured efficiently.

<Third Embodiment: Comparison of Composite Image of Multiple Captured Images with Different Initial Phases and Image Captured with Uniform Intensity Distribution>
The shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the irradiation unit 14, in addition to performing an irradiation process similar to that of the first embodiment, irradiates illumination light so that, while the imaging unit 13 is imaging, an image in which illumination light having a uniform intensity distribution is projected onto the measurement target is also captured. The feature amount calculation unit 16 calculates, as the feature amount, the similarity between a composite image obtained by synthesizing a plurality of captured images with different initial phases captured in the same manner as in the first embodiment and the captured image of the measurement target irradiated with the uniform illumination light. Thereby, it is determined whether or not blur exists in the captured images.

That is, as shown in FIG. 8, four captured images (A, B, C, D) captured while shifting the initial phase are synthesized, for example by averaging. The image X in FIG. 8 is the image obtained by combining the four captured images, and the image Y in FIG. 8 is an image captured while the measurement target is illuminated uniformly, that is, with the light scanned in the line pattern without modulating the intensity of the laser beam. If no blurring has occurred among the plurality of captured images, these two should be substantially the same image. Therefore, the presence or absence of blur can be detected by detecting the similarity between the image obtained by combining the four captured images and the image obtained by illuminating the measurement target without intensity modulation.
Further, according to the knowledge of the inventors, the composite image of the captured images with shifted initial phases described above and the captured image in which illumination light having a uniform intensity distribution is projected onto the measurement target have substantially the same spatial frequency distribution. Therefore, as another method, the feature amount calculation unit 16 can calculate the similarity by comparing the spatial frequency distribution of the composite image of the plurality of captured images with different initial phases with the spatial frequency distribution of the image captured under the uniform intensity distribution. The determination unit 17 determines that there is no blur if the calculated similarity is equal to or greater than a certain level, and determines that there is blur otherwise.
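A minimal sketch of the first comparison above: the phase-shifted captures are averaged, and the composite is compared against the uniformly lit capture. The use of a normalized cross-correlation coefficient as the similarity measure is an assumption; the embodiment does not fix a specific measure:

```python
import numpy as np

def composite_matches_uniform(phase_images, uniform_image, min_similarity=0.9):
    """Average the phase-shifted captures; without blur the sinusoidal
    fringes cancel and the composite should resemble the uniformly lit
    image. Returns (no_blur_decision, similarity)."""
    composite = np.mean(phase_images, axis=0)
    a = composite - composite.mean()
    b = uniform_image - uniform_image.mean()
    ncc = float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))
    return ncc >= min_similarity, ncc
```

With four initial phases spaced 90 degrees apart the fringe terms cancel exactly, which is why the composite of blur-free captures matches the uniform image.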

<Fourth Embodiment: Comparison with Template Image>
The shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the storage unit 15 stores in advance template images, that is, captured images in which the lattice pattern is projected onto the measurement target. For example, as illustrated in FIG. 9, the storage unit 15 stores template images of the measurement target captured in advance for each initial phase: A′ (initial phase 0 degrees), B′ (initial phase 90 degrees), C′ (initial phase 180 degrees), and D′ (initial phase 270 degrees). The feature amount calculation unit 16 reads the template images stored in the storage unit 15 and calculates, for each corresponding initial phase, the similarity between the read template image and the captured image generated by the imaging unit 13. The determination unit 17 determines that there is no blur if the similarity calculated by the feature amount calculation unit 16 is equal to or greater than a predetermined threshold, and determines that there is blur otherwise.
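The per-phase template comparison can be sketched as below. The normalized cross-correlation similarity and the dictionary keyed by initial phase are illustrative assumptions; the embodiment only requires some similarity score compared against a threshold:

```python
import numpy as np

def template_similarity(captured, template):
    """Normalized cross-correlation between a capture and the pre-stored
    template for the same initial phase."""
    a = captured - captured.mean()
    b = template - template.mean()
    return float((a * b).sum() / np.sqrt((a ** 2).sum() * (b ** 2).sum()))

def blur_free(captured_by_phase, templates_by_phase, threshold=0.9):
    """Fourth-embodiment check: every phase's capture (A vs A', B vs B',
    and so on) must reach the similarity threshold against its template."""
    return all(
        template_similarity(captured_by_phase[p], templates_by_phase[p]) >= threshold
        for p in templates_by_phase
    )
```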

<Fifth Embodiment: Blur Determination Based on Spatial Frequency Distribution of Multiple Images>
The shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the feature amount calculation unit 16 performs a Fourier transform on each of the plurality of captured images captured by the imaging unit 13 and calculates the spatial frequency distribution as the feature amount. When the spatial frequency distributions differ among the plurality of captured images, the determination unit 17 determines that blur exists between the captured images; when the spatial frequency distributions are substantially the same, it determines that no blur exists. For example, when blurring occurs in only one of the plurality of captured images with different initial phases, the spatial frequency distribution of that image alone is considered to differ from those of the other captured images. Therefore, when the spatial frequency distributions disagree among the captured images beyond a predetermined condition, it is determined that blurring exists.
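The consistency check of this embodiment might be sketched as follows. Reducing each spectrum to its radial-frequency centroid and comparing against the median is an illustrative simplification of "substantially the same distribution"; the deviation tolerance is an assumed parameter:

```python
import numpy as np

def spectra_consistent(images, max_deviation=0.1):
    """Fifth-embodiment check: Fourier-transform each capture and flag
    blur when one image's spectrum deviates from the rest. Returns True
    when every centroid stays within `max_deviation` of the median."""
    centroids = []
    for img in images:
        spec = np.abs(np.fft.fft2(img))
        fy = np.fft.fftfreq(img.shape[0])[:, None]
        fx = np.fft.fftfreq(img.shape[1])[None, :]
        r = np.hypot(fy, fx)                       # radial frequency per bin
        centroids.append((r * spec).sum() / spec.sum())
    c = np.asarray(centroids)
    med = np.median(c)
    return bool(np.all(np.abs(c - med) <= max_deviation * med))
```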

<Sixth Embodiment: Blur Determination Based on Spatial Code Method>
The shape measuring apparatus 10 of the present embodiment is the same as that of the first embodiment, but the irradiation unit 14 irradiates the measurement target with a lattice pattern based on the spatial code method, and the feature amount calculation unit 16 calculates a spatial code for each of the plurality of captured images as the feature amount. The determination unit 17 determines that blur exists in the captured images if the coordinate positions based on the spatial codes calculated for the plurality of captured images differ, and that no blur exists if the coordinate positions are substantially the same.

Here, at least two of the determination processes according to the first to sixth embodiments described above can be combined to perform a determination with higher accuracy. For example, the user may select in advance at least one of the determination processes according to the first to sixth embodiments, and the determination process (or combination of determination processes) based on that selection may be executed.
Further, in the above-described embodiments, an example was described in which the display unit 12 displays a warning when the determination unit 17 determines that the captured image is blurred; however, the shape measuring device 10 may instead include a speaker that outputs a warning sound. Alternatively, a message notifying that the determination unit 17 has determined that the captured image is blurred may be transmitted to a terminal connected to the shape measuring apparatus 10 via a wired or wireless line.

<Seventh Embodiment: Structure Manufacturing System>
Next, a structure manufacturing system and a structure manufacturing method using the shape measuring apparatus 10 of this embodiment will be described.
FIG. 10 is a diagram showing the configuration of the structure manufacturing system 100 of the present embodiment. The structure manufacturing system 100 includes the shape measuring device 10 described in the above embodiments, a design device 20, a molding device 30, a control device (inspection device) 40, and a repair device 50. The control device 40 includes a coordinate storage unit 41 and an inspection unit 42.

The design device 20 creates design information related to the shape of the structure and transmits the created design information to the molding device 30. In addition, the design device 20 stores the created design information in the coordinate storage unit 41 of the control device 40. The design information includes information indicating the coordinates of each position of the structure.
The molding apparatus 30 produces the structure based on the design information input from the design apparatus 20. The molding performed by the molding apparatus 30 includes, for example, casting, forging, and cutting. The shape measuring device 10 measures the coordinates of the manufactured structure (measurement target) and transmits information indicating the measured coordinates (shape information) to the control device 40.

The coordinate storage unit 41 of the control device 40 stores the design information. The inspection unit 42 of the control device 40 reads the design information from the coordinate storage unit 41 and compares the information indicating the coordinates (shape information) received from the shape measuring apparatus 10 with it. Based on the comparison result, the inspection unit 42 determines whether the structure has been molded according to the design information, in other words, whether the manufactured structure is a non-defective product. When the structure has not been molded according to the design information, the inspection unit 42 determines whether the structure can be repaired. If it can be repaired, the inspection unit 42 calculates the defective portions and repair amounts based on the comparison result, and transmits information indicating the defective portions and information indicating the repair amounts to the repair device 50.
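The comparison performed by the inspection unit 42 can be sketched as below. The tolerance value, the dictionary-based data layout, and scalar per-point coordinates are illustrative assumptions, not the patent's specified implementation:

```python
def inspect_structure(measured, design, tolerance=0.05):
    """Compare each measured coordinate with its design coordinate and
    report out-of-tolerance points as (defective portion, repair amount).
    Returns (is_non_defective, list_of_defects)."""
    defects = []
    for position, design_value in design.items():
        measured_value = measured.get(position)
        if measured_value is None:
            continue  # point not measured; a real system would handle this
        deviation = abs(measured_value - design_value)
        if deviation > tolerance:
            defects.append((position, deviation))
    return len(defects) == 0, defects
```

The defect list corresponds to the information the inspection unit transmits to the repair device 50.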
The repair device 50 processes the defective portion of the structure based on the information indicating the defective portion received from the control device 40 and the information indicating the repair amount.

FIG. 11 is a flowchart showing the structure manufacturing method of the present embodiment. In the present embodiment, each process of the structure manufacturing method illustrated in FIG. 11 is executed by each unit of the structure manufacturing system 100.
In the structure manufacturing system 100, first, the design apparatus 20 creates design information related to the shape of the structure (step S31). Next, the molding apparatus 30 produces the structure based on the design information (step S32). Next, the shape measuring apparatus 10 measures the shape of the manufactured structure (step S33). Next, the inspection unit 42 of the control device 40 inspects whether the structure has been manufactured according to the design information by comparing the shape information obtained by the shape measuring device 10 with the design information (step S34).

Next, the inspection unit 42 of the control device 40 determines whether or not the manufactured structure is a non-defective product (step S35). When the inspection unit 42 determines that the manufactured structure is a non-defective product (step S35: YES), the structure manufacturing system 100 ends the process. When the inspection unit 42 determines that the manufactured structure is not a non-defective product (step S35: NO), the inspection unit 42 determines whether the manufactured structure can be repaired (step S36).

In the structure manufacturing system 100, when the inspection unit 42 determines that the manufactured structure can be repaired (step S36: YES), the repair device 50 executes rework of the structure (step S37), and the process returns to step S33. When the inspection unit 42 determines that the manufactured structure cannot be repaired (step S36: NO), the structure manufacturing system 100 ends the process.

In the structure manufacturing system 100 of the present embodiment, since the shape measuring apparatus 10 of the first to sixth embodiments can accurately measure the coordinates of the structure, it can be determined whether or not the manufactured structure is a non-defective product. Moreover, when the structure is not a non-defective product, the structure manufacturing system 100 can rework and repair the structure.

Note that the repair process performed by the repair apparatus 50 in this embodiment may be replaced with a process in which the molding apparatus 30 re-executes the molding process. In that case, when the inspection unit 42 of the control device 40 determines that the structure can be repaired, the molding apparatus 30 re-executes the molding process (forging, cutting, etc.). Specifically, for example, the molding apparatus 30 cuts a portion of the structure that should originally have been cut but was not. Thereby, the structure manufacturing system 100 can produce the structure correctly.

Note that a program for realizing the functions of the processing units in the present invention may be recorded on a computer-readable recording medium, and the blur determination process may be performed by reading the program recorded on the recording medium into a computer system and executing it. Here, the "computer system" includes an OS and hardware such as peripheral devices. The "computer system" also includes a WWW system having a homepage providing environment (or display environment). The "computer-readable recording medium" refers to a portable medium such as a flexible disk, a magneto-optical disk, a ROM, or a CD-ROM, or a storage device such as a hard disk incorporated in a computer system. Further, the "computer-readable recording medium" also includes a medium that holds a program for a certain period of time, such as a volatile memory (RAM) in a computer system serving as a server or a client when the program is transmitted via a network such as the Internet or a communication line such as a telephone line.

The program may be transmitted from a computer system storing the program in a storage device or the like to another computer system via a transmission medium, or by transmission waves in the transmission medium. Here, the "transmission medium" that transmits the program refers to a medium having a function of transmitting information, such as a network (communication network) like the Internet or a communication line like a telephone line. The program may realize only part of the functions described above. Furthermore, it may be a so-called difference file (difference program) that realizes the functions described above in combination with a program already recorded in the computer system.
In addition, as the feature amount indicating the degree of blurring present in the captured image in the present invention, any method may be used as long as it detects whether the measurement target or the shape measuring device moved relatively during image acquisition, such as calculating a similarity by comparing images or by comparing the spatial frequency distributions of images.

The present invention can be applied to a structure manufacturing system that can determine whether or not a manufactured structure is a non-defective product. Thereby, the inspection accuracy of the manufactured structure can be improved, and the manufacturing efficiency of the structure can be improved.

DESCRIPTION OF SYMBOLS 1 ... Light source, 2, 3 ... Lens, 4 ... Scanning mirror, 10 ... Shape measuring apparatus, 12 ... Display unit, 13 ... Imaging unit, 14 ... Irradiation unit, 15 ... Storage unit, 16 ... Feature amount calculation unit, 17 ... Determination unit, 20 ... Design device, 30 ... Molding device, 40 ... Control device, 42 ... Inspection unit

Claims (23)

  1. An imaging unit that generates a captured image of the measurement object;
    An irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit captures, so that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target;
    A feature amount calculation unit that calculates a feature amount indicating a degree of blur existing in the captured image from the captured image;
    A shape measuring apparatus comprising: a determination unit that determines whether or not blur exists in the captured image based on the feature amount.
  2. The irradiation unit irradiates the illumination light so that images in which a plurality of lattice patterns having a constant spatial frequency and different initial phases are projected onto the measurement target can be sequentially captured by the imaging unit,
    The shape measuring apparatus according to claim 1, wherein the imaging unit generates a plurality of captured images obtained by capturing an image in which a plurality of lattice patterns having different initial phases are projected on the measurement target.
  3. The shape measuring apparatus according to claim 1, wherein the feature amount calculation unit calculates a similarity between a plurality of the captured images as the feature amount.
  4. The irradiation unit irradiates the measurement target with a lattice pattern based on the N bucket method,
    The feature amount calculation unit calculates, as the feature amount, a similarity between a plurality of the captured images obtained by capturing an image in which a lattice pattern having the same initial phase is projected onto the measurement target. The shape measuring apparatus according to any one of claims 1 to 3.
  5. The shape measuring apparatus according to claim 4, wherein the imaging unit captures an image in which another pattern is projected onto the measurement target between the plurality of imaging timings at which images with a lattice pattern having the same initial phase projected onto the measurement target are captured.
  6. The feature amount calculation unit calculates, as the feature amount, a similarity between a plurality of sets of the captured images obtained by capturing images in which lattice patterns having the same initial phase are projected onto the measurement target. The shape measuring apparatus according to claim 4 or 5.
  7. A storage unit in which a template image that is a captured image obtained by capturing an image of the lattice pattern projected onto the measurement target is stored in advance;
    The shape measuring apparatus according to any one of claims 1 to 6, wherein the feature amount calculation unit calculates the similarity between the captured image generated by the imaging unit and the template image.
  8. The shape measuring device according to any one of claims 1 to 7, wherein the feature amount calculation unit calculates the sharpness of the captured image as the feature amount.
  9. The shape measurement device according to claim 8, wherein the feature amount calculation unit calculates a spatial frequency distribution of the captured image, and calculates the sharpness based on a centroid position of the spatial frequency distribution.
  10. The determination unit determines whether or not there is a blur in the captured image based on both the similarity between the captured images calculated by the feature amount calculation unit and the sharpness of the captured image. The shape measuring device according to claim 8.
  11. The shape measuring apparatus according to claim 10, wherein the feature amount calculation unit calculates the similarity between the plurality of captured images when the determination unit determines that all the sharpness values of the plurality of captured images are equal to or greater than the predetermined threshold.
  12. The irradiation unit irradiates the illumination light so that an image in which illumination light having a uniform intensity distribution is projected on the measurement target is captured when the imaging unit is imaging,
    The feature amount calculation unit calculates, as the feature amount, the similarity between a composite image obtained by combining a plurality of captured images having different initial phases and a captured image of the measurement target irradiated with the uniform illumination light. The shape measuring apparatus according to any one of claims 1 to 11.
  13. The shape measuring device according to any one of claims 1 to 12, wherein the feature amount calculating unit calculates a spatial frequency distribution of the plurality of captured images as the feature amount.
  14. The shape measuring apparatus according to any one of claims 1 to 13, wherein the feature amount calculation unit calculates the highest spatial frequency component among the spatial frequency components included in the captured image as the feature amount.
  15. The irradiation unit irradiates the measurement target with a lattice pattern based on a spatial code method,
    The shape measuring device according to claim 1, wherein the feature amount calculation unit calculates a spatial code for each of the plurality of captured images as the feature amount.
  16. The shape measuring device according to any one of claims 1 to 15, further comprising a warning unit that outputs a warning to the effect that blur exists when the determination unit determines that blur exists in the captured image.
  17. The shape measuring apparatus according to claim 16, wherein the imaging unit, when the determination unit determines that blur exists in the captured image, captures the measurement target irradiated with the illumination light again to generate a captured image.
  18. The shape measuring device according to any one of claims 1 to 17, wherein the irradiation unit includes a light source, a light intensity modulation unit that modulates the intensity of light from the light source, a light intensity distribution conversion unit that converts the intensity distribution of the light from the light source into a linear intensity distribution, and a scanning mirror arranged to scan the light from the light source over the measurement target in a direction perpendicular to both the longitudinal direction of the line and the projection direction of the light from the light source.
  19. A design device for creating design information on the shape of the structure;
    A molding apparatus for producing the structure based on the design information;
    The shape measuring apparatus according to any one of claims 1 to 18, which measures the shape of the manufactured structure based on a captured image;
    A structure manufacturing system comprising an inspection device that compares shape information obtained by the measurement with the design information.
  20. A shape measuring method for a shape measuring apparatus comprising an imaging unit that generates a captured image of the measurement target, and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit captures, so that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target, the method comprising:
    Calculating a feature amount indicating a degree of blur existing in the captured image from the captured image;
    And a step of determining whether or not blur is present in the captured image based on the feature amount.
  21. Creating design information on the shape of the structure;
    Producing the structure based on the design information;
    Measuring the shape of the fabricated structure based on a captured image generated using the shape measuring method according to claim 20;
    A structure manufacturing method including comparing shape information obtained by the measurement with the design information.
  22. A shape measurement program that causes a computer of a shape measuring apparatus, the apparatus comprising an imaging unit that generates a captured image of the measurement target and an irradiation unit that irradiates the measurement target with illumination light having a predetermined intensity distribution from a direction different from the direction in which the imaging unit captures, so that the image captured by the imaging unit is captured as an image in which a lattice pattern is projected onto the measurement target, to execute:
    Calculating a feature amount indicating a degree of blur existing in the captured image from the captured image;
    And a step of determining whether or not blur exists in the captured image based on the feature amount.
  23. A computer-readable recording medium on which the shape measurement program according to claim 22 is recorded.
PCT/JP2012/072913 2011-09-09 2012-09-07 Shape measurement device, structure manufacturing system, shape measurement method, structure manufacturing method, shape measurement program, and computer-readable recording medium WO2013035847A1 (en)

Priority Applications (2)

Application Number Priority Date Filing Date Title
JP2011196865 2011-09-09
JP2011-196865 2011-09-09

Publications (1)

Publication Number Publication Date
WO2013035847A1 true WO2013035847A1 (en) 2013-03-14

Family

ID=47832286

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/072913 WO2013035847A1 (en) 2011-09-09 2012-09-07 Shape measurement device, structure manufacturing system, shape measurement method, structure manufacturing method, shape measurement program, and computer-readable recording medium

Country Status (3)

Country Link
JP (1) JPWO2013035847A1 (en)
TW (1) TW201329419A (en)
WO (1) WO2013035847A1 (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105027159A (en) * 2013-03-26 2015-11-04 凸版印刷株式会社 Image processing device, image processing system, image processing method, and image processing program
WO2016075978A1 (en) * 2014-11-12 2016-05-19 ソニー株式会社 Information processing device, information processing method, and program

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2004070421A (en) * 2002-08-01 2004-03-04 Fuji Photo Film Co Ltd System and method for image processing, imaging device, and image processor
US20060058972A1 (en) * 2004-09-15 2006-03-16 Asml Netherlands B.V. Method and apparatus for vibration detection, method and apparatus for vibration analysis, lithographic apparatus, device manufacturing method, and computer program
US20060120618A1 (en) * 2004-12-06 2006-06-08 Canon Kabushiki Kaisha Image processing apparatus, method of controlling thereof, and program
JP2007158488A (en) * 2005-12-01 2007-06-21 Sharp Corp Camera shake detecting apparatus
JP2007278951A (en) * 2006-04-10 2007-10-25 Alpine Electronics Inc Car body behavior measuring device
JP2008190962A (en) * 2007-02-02 2008-08-21 Aidin System Kk Three-dimensional measurement apparatus
US20100158490A1 (en) * 2008-12-19 2010-06-24 Sirona Dental Systems Gmbh Method and device for optical scanning of three-dimensional objects by means of a dental 3d camera using a triangulation method

Cited By (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN105027159A (en) * 2013-03-26 2015-11-04 Toppan Printing Co., Ltd. Image processing device, image processing system, image processing method, and image processing program
EP2980752A4 (en) * 2013-03-26 2016-11-09 Toppan Printing Co Ltd Image processing device, image processing system, image processing method, and image processing program
JPWO2014157348A1 (en) * 2013-03-26 2017-02-16 Toppan Printing Co., Ltd. Image processing apparatus, image processing system, image processing method, and image processing program
US10068339B2 (en) 2013-03-26 2018-09-04 Toppan Printing Co., Ltd. Image processing device, image processing system, image processing method and image processing program
WO2016075978A1 (en) * 2014-11-12 2016-05-19 Sony Corporation Information processing device, information processing method, and program

Also Published As

Publication number Publication date
TW201329419A (en) 2013-07-16
JPWO2013035847A1 (en) 2015-03-23

Similar Documents

Publication Publication Date Title
US8199335B2 (en) Three-dimensional shape measuring apparatus, three-dimensional shape measuring method, three-dimensional shape measuring program, and recording medium
JP4480488B2 (en) Measuring device, computer numerical control device, and program
JP5001286B2 (en) Object reconstruction method and system
US20150204662A1 (en) Three-dimensional-shape measurement apparatus, three-dimensional-shape measurement method, and non-transitory computer-readable storage medium
EP1777487B1 (en) Three-dimensional shape measuring apparatus, program and three-dimensional shape measuring method
CN100507441C (en) Non-contact 3-D shape testing method and its device
DE60125025T2 (en) System for simultaneous projections of multiple phase-shifted patterns for the three-dimensional inspection of an object
JP2011185872A (en) Information processor, and processing method and program of the same
JP4913597B2 (en) High-speed multiple line 3D digitization method
KR101259835B1 (en) Apparatus and method for generating depth information
JP5690774B2 (en) Inspection method
US20100299103A1 (en) Three dimensional shape measurement apparatus, three dimensional shape measurement method, and computer program
TWI414750B (en) Dimensional shape measuring device, three-dimensional shape measurement method and three-dimensional shape measurement program
KR20010040339A (en) Method and apparatus for three dimensional inspection of electronic components
US20060119848A1 (en) Methods and apparatus for making images including depth information
JP5496008B2 (en) Position / orientation measuring apparatus, position / orientation measuring method, and program
US9329030B2 (en) Non-contact object inspection
JP2001066223A (en) Method for deciding quality of reflected light
US9147247B2 (en) Three-dimensional measurement apparatus, three-dimensional measurement method, and storage medium
US20120281087A1 (en) Three-dimensional scanner for hand-held phones
JP2009236917A (en) Method and apparatus for determining 3d coordinates of object
JP4255865B2 (en) Non-contact three-dimensional shape measuring method and apparatus
JP2012013453A (en) Three-dimensional measurement device, three-dimensional measurement method and program
JP5029618B2 (en) Three-dimensional shape measuring apparatus, method and program by pattern projection method
JP5467321B2 (en) 3D shape measuring method and 3D shape measuring apparatus

Legal Events

Date Code Title Description
121 EP: The EPO has been informed by WIPO that EP was designated in this application

Ref document number: 12829855

Country of ref document: EP

Kind code of ref document: A1

ENP Entry into the national phase in:

Ref document number: 2013532674

Country of ref document: JP

Kind code of ref document: A

NENP Non-entry into the national phase in:

Ref country code: DE

122 EP: PCT application non-entry in European phase

Ref document number: 12829855

Country of ref document: EP

Kind code of ref document: A1