WO2013061976A1 - Shape inspection method and device - Google Patents

Shape inspection method and device

Info

Publication number
WO2013061976A1
Authority
WO
WIPO (PCT)
Prior art keywords
shape
data
dimensional
dimensional shape
shape data
Prior art date
Application number
PCT/JP2012/077386
Other languages
English (en)
Japanese (ja)
Inventor
敦史 谷口
薫 酒井
丸山 重信
前田 俊二
Original Assignee
株式会社日立製作所
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Priority claimed from JP2011232468A external-priority patent/JP5913903B2/ja
Priority claimed from JP2012053956A external-priority patent/JP2013186100A/ja
Application filed by 株式会社日立製作所 filed Critical 株式会社日立製作所
Priority to CN201280052260.2A priority Critical patent/CN104024793B/zh
Publication of WO2013061976A1 publication Critical patent/WO2013061976A1/fr

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 Image analysis
    • G06T 7/0002 Inspection of images, e.g. flaw detection
    • G06T 7/0004 Industrial image inspection
    • G06T 7/001 Industrial image inspection using an image reference approach
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B 11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B 11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G01B 11/2513 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, with several lines being projected in more than one direction, e.g. grids, patterns
    • G01B 11/2545 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, with one projection direction and several detection directions, e.g. stereo
    • G06T 2207/00 Indexing scheme for image analysis or image enhancement
    • G06T 2207/10 Image acquisition modality
    • G06T 2207/10028 Range image; Depth image; 3D point clouds
    • G06T 2207/30 Subject of image; Context of image processing
    • G06T 2207/30108 Industrial image inspection
    • G06T 2207/30164 Workpiece; Machine component

Definitions

  • the present invention relates to, for example, a shape inspection method and an inspection apparatus for a processed product and a processing tool.
  • Patent Document 1 describes a three-dimensional shape measurement technique that covers a wide range by laser-beam scanning, in which the laser power is set so that the amount of reflected light remains constant even where color or shading is added to the three-dimensional shape.
  • Patent Document 2 describes an object shape evaluation apparatus in which measurement point data representing the shape of a measurement object, measured by the light-section method, is aligned with reference point data by iterative convergence processing, and the shape of the measurement object is evaluated based on the aligned measurement point data and reference point data.
  • In this apparatus, a weight is determined for each pair of points from the distance between adjacent measurement points or between adjacent reference points, and is used when obtaining the convergence evaluation value in the iterative convergence processing.
  • An object of the present invention is to provide a three-dimensional shape inspection method and apparatus that ensure high measurement accuracy regardless of the shape of the measurement target, by combining a plurality of three-dimensional shape measurement methods and surface measurement methods in a complementary manner.
  • To this end, the present invention provides a three-dimensional shape inspection apparatus comprising: a first three-dimensional shape sensor that acquires first shape data of an inspection target; a second three-dimensional shape sensor that acquires second shape data of the inspection target, different from the first shape data; and a complementary integration unit that corrects and integrates the first shape data and the second shape data.
  • The present invention also provides a three-dimensional shape inspection method in which first shape data of an inspection target is acquired, second shape data different from the first shape data is acquired, and the first shape data and the second shape data are corrected and integrated.
  • According to the present invention, it is possible to provide a three-dimensional shape inspection method and apparatus that ensure high measurement accuracy regardless of the shape of the measurement object, by complementarily combining a plurality of three-dimensional shape measurement methods and surface measurement methods.
  • 3D shape inspection requires measuring the 3D shape, comparing it with a reference model, quantifying the shape defect, and estimating the degree of influence from the quantified defect value.
  • The degree of influence quantifies the effect of the defect on an index representing the performance of the product, in the case of a product, or on an index representing the machining performance, in the case of a machining tool.
  • The techniques of Patent Documents 1 and 2 have several problems in this respect: measurement accuracy at edges and sharp corners tends to be insufficient; CAD data is often required for comparison with the measured data, so inspection is impossible when CAD data is not at hand; and they lack a function for estimating the degree of influence from the quantified shape defect value.
  • Fig. 1 shows the configuration of the 3D measurement device.
  • Sample 1 is held by holding mechanisms 101 and 102.
  • Sample 1 and the holding mechanisms 101 and 102 are connected as a whole to the servo motor 103, forming a rotation mechanism about the y-axis that rotates in the xz plane.
  • the holding mechanisms 101 and 102 have an appropriate holding force that does not cause a deviation between the rotation amount of the servo motor 103 and the rotation amount of the sample 1.
  • the relative position between the sample 1 and the image capturing unit 120 and the point group measuring unit 130 is set by the rotation of the servo motor 103.
  • Sample 1 is a processed product that requires quality assurance by three-dimensional shape measurement or a processing tool that requires shape measurement for processing accuracy management.
  • Sample 1, the holding mechanisms 101 and 102, and the servo motor 103 are all held on a base 105, which is mounted on an x stage 106, a y stage 107, and a θ stage 108.
  • The rotation direction of the θ stage 108 lies in the xy plane, and the θ axis is orthogonal to the xy plane.
  • The x stage 106, the y stage 107, the θ stage 108, and the base 105 are mounted on an anti-vibration surface plate 110.
  • The servo motor 103 is controlled by the control PC 140 through a motor controller 104, and the x stage 106, y stage 107, and θ stage 108 through a three-axis stage controller 109.
  • the surface state and shape of the sample 1 are measured by the image capturing unit 120 and the point group measuring unit 130.
  • The illumination unit 121 illuminates sample 1 from an arbitrary direction; the reflected, scattered, diffracted, and diffused light is imaged by the two-dimensional camera 123 through the lens 122 to capture the surface state and shape.
  • For the illumination unit 121, a lamp, an LED (Light Emitting Diode), or the like can be used. FIG. 1 shows illumination from a single direction, but illumination from several directions or ring illumination may also be used.
  • For the two-dimensional camera 123, a CCD (Charge Coupled Device) image sensor, a CMOS (Complementary Metal Oxide Semiconductor) image sensor, or the like can be used.
  • the pixel pitch of the two-dimensional camera 123 is set to be finer than the resolution determined from the magnification and aperture ratio of the lens 122.
  • the two-dimensional camera 123 is controlled by the control PC 140 via the camera controller 124, and the measurement result is output to the monitor 141.
  • Methods for obtaining shape from images captured by the two-dimensional camera 123 include the stereo method, based on triangulation; the lens focus method, which measures distance by sweeping the lens focus; and the moiré method, which projects a lattice pattern onto the object and measures the shape from the deformation of the pattern on the object surface. As a method for detecting surface unevenness, there is photometric stereo (illuminance difference stereo), which uses the difference in shading under different illumination directions to estimate the direction of the surface normal of the target object.
  • the point cloud measurement unit 130 includes a point cloud measurement sensor 131 and a sensor controller 132, is controlled by the control PC 140, and the measurement result is output to the monitor 141.
  • The point cloud measurement sensor 131 may be of a non-contact optical type, a contact probe type, or the like; it measures the shape of the object surface and outputs the three-dimensional coordinates of a large number of points as a point cloud. Many optical point cloud measurement methods have been proposed, and any of them can be applied to the present invention.
  • Methods usable in the present invention include the light-section method, based on triangulation; the TOF (Time Of Flight) method, which illuminates the object and measures distance from the time the light takes to return; the lens focus method, which measures distance by sweeping the lens focus; the moiré method, which projects a lattice pattern onto the object and measures the shape from the deformation of the pattern on the object surface; and interference methods such as white-light interferometry.
  • In recent years, point cloud measurement methods using an optical frequency comb, which has many optical frequency modes arranged at equal intervals in frequency space, and methods using a frequency feedback laser have also been proposed.
  • To measure the surface state and the shape of sample 1 complementarily, the apparatus of FIG. 1 includes a complementary integration unit 1401 that integrates the image from the two-dimensional camera 123 with the data from the point cloud measurement sensor 131.
  • Connected to the control PC 140 are: a storage unit holding CAD (Computer Aided Design) data 142 representing the 3D shape; a region specifying unit that specifies, from the CAD data 142, the regions whose shape data is to be acquired by the point cloud measurement sensor 131; a defect quantification unit 1402 that quantifies the shape defect value by comparing the CAD data 142, or self-reference shape data derived from the similarity of sample 1 itself, with the data integrated by the complementary integration unit 1401; and a determination unit that, based on the quantified defect value, refers to the database (DB) 143 and performs the OK/NG determination or grades the product.
  • An area specifying unit may also be provided that, based on the CAD data 142, determines where each method is weak and assigns the areas whose shape data is to be acquired by the point cloud measurement sensor 131 and by the two-dimensional camera 123.
  • the point cloud measurement method is suitable for grasping the global shape, but the measurement accuracy is reduced for local changes and minute irregularities.
  • With contact probe sensors, the shape measurement accuracy tends to drop at edges and acute corners, where it is difficult to scan the probe.
  • With optical sensors, shapes having edges, acute corners, or steep inclinations reflect light at angles very different from flat surfaces, and the measurement accuracy tends to drop.
  • the point cloud measurement method is not good at measuring minute irregularities such as the surface roughness of the sample.
  • When the two-dimensional camera captures a shape having edges, acute corners, or steep inclinations under a given illumination, the brightness difference from the surroundings, or the brightness change, becomes large in the acquired image. The edges, acute corners, and steeply inclined shapes that the point cloud measurement method handles poorly can therefore be restored from the image information. In addition, the surface state and the shape of minute unevenness can be derived from the shading in the image.
  • Conversely, the global shape that the point cloud measurement method handles well appears in the image as a flat intensity distribution with few characteristic parts, so it may be difficult to restore from the image alone.
  • Figure 2 shows the complementary integration procedure in shape measurement.
  • the CAD data representing the three-dimensional shape of the sample includes information representing the direction of the surface constituting the shape.
  • Each point cloud measurement method guarantees its measurement accuracy only for surfaces within a certain range of direction and inclination.
  • An area in which the measurement accuracy is guaranteed is an area that can be measured by the point cloud measurement sensor.
  • The CAD data is read (S100), the measurement region of sample 1 is set (S101), and the arrangement in which sample 1 of FIG. 1 is held, together with the relative position of the rotation axis and the point cloud measurement sensor, is calculated (S102).
  • The inclination of each surface with respect to the sensor is compared with a threshold determined by the characteristics of the point cloud measurement sensor (S103), dividing the surfaces into regions where point cloud measurement is possible and regions where it is not. For the regions at or below the threshold, a point cloud representing the shape is measured by the point cloud measurement sensor (S104). For the regions above the threshold, where the accuracy of the point cloud measurement sensor is not guaranteed, images are taken with the two-dimensional camera (S105) and shape data is estimated from the captured images (S106). The point cloud from the sensor and the shape data estimated from the images are registered using their position information (S107), and the desired shape data is obtained (S108). A sketch of the S103 decision follows.
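  • To illustrate the S103 decision, the tilt of each CAD surface patch relative to the sensor can be compared against the sensor-specific threshold. Below is a minimal Python sketch; the function name and the exact tilt criterion are illustrative assumptions, not taken from the patent.

```python
import numpy as np

def split_measurement_regions(normals, sensor_dir, tilt_threshold_deg):
    """Assign each CAD surface patch to the point cloud sensor or the 2D camera.

    normals:    (m, 3) unit normal vectors of the CAD surface patches
    sensor_dir: (3,) unit vector pointing from the surface toward the sensor
    Patches whose tilt is at or below the threshold go to the point cloud
    sensor (S104); the remaining patches are imaged by the camera (S105).
    """
    cos_ang = np.clip(normals @ sensor_dir, -1.0, 1.0)
    tilt_deg = np.degrees(np.arccos(np.abs(cos_ang)))  # angle between normal and sensor axis
    sensor_ok = tilt_deg <= tilt_threshold_deg
    return sensor_ok, ~sensor_ok
```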
  • By controlling the rotation amount of the servo motor 103, sample 1 can be imaged by the two-dimensional camera 123 from many viewpoints. Using these images from multiple viewpoints, shape data for the edges and acute corners that the point cloud measurement sensor handles poorly is estimated.
  • Three techniques are used for this estimation: the stereo method, a depth measurement method based on triangulation; the inverse rendering method, which estimates the tilt of the measured surface from the shading in the image; and the visual hull (view volume intersection) method, which estimates the contour of the sample from the silhouettes of multiple images.
  • the stereo method uses two images obtained by capturing the same visual field from different directions.
  • Images from different directions are acquired by rotating the servo motor 103 by a fixed amount, which controls the relative position between sample 1 and the two-dimensional camera 123.
  • For example, after the first image is taken, sample 1 is rotated 5 degrees by the servo motor 103 and the second image is acquired.
  • the depth is calculated from these two images using the principle of triangulation.
  • the shape data of the surface parallel to the rotation axis of the sample 1 can be acquired.
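  • The depth recovery itself follows the standard triangulation relation. A minimal sketch, treating the small rotation as an effective camera baseline (an approximation introduced here for illustration, not stated in the patent):

```python
import numpy as np

def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    """Standard triangulation: depth = focal length x baseline / disparity."""
    return focal_px * baseline_mm / np.asarray(disparity_px)

# A surface point 50 mm from the rotation axis, rotated by 5 degrees, shifts
# by roughly r * sin(theta) relative to the camera (small-angle approximation):
baseline_mm = 50.0 * np.sin(np.radians(5.0))            # about 4.4 mm
print(depth_from_disparity(2000.0, baseline_mm, 12.5))  # depth in mm
```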
  • In the inverse rendering method, the illumination state and the object reflectance are taken as known, and the geometric shape is estimated from an image obtained by the two-dimensional camera using the rendering equation; object shape data can be acquired from a single image.
  • The view volume intersection (visual hull) method derives the shape of the target object by constructing, for each viewpoint, the view volume, a cone whose vertex is the viewpoint and whose cross-section is the silhouette, and taking the common part of the view volumes over all viewpoints.
  • Here, the servo motor 103 is rotated through 360 degrees while the two-dimensional camera 123 images the surfaces parallel to the rotation axis of sample 1; a view volume is derived from the silhouette of each image, and shape data is obtained from the common part of the view volumes.
  • FIG. 4 shows an example of position verification (S107).
  • The acquired point cloud data (S200) and image data (S202) have different spatial resolutions, fixed when the point cloud measurement unit and the image capturing unit were designed. Therefore, as shown in FIG. 5, the spatial resolution of the point cloud data is adjusted to the same scale as the image data acquired by the image capturing unit (S201).
  • The scale adjustment may instead be applied to the image data, or to both the point cloud data and the image data, as long as the two end up with the same spatial resolution. If the point cloud data and the image data are designed to have the same spatial resolution, this scale adjustment (S201) is unnecessary.
  • Next, the point cloud data and the image data are coarsely aligned based on the difference in geometric arrangement between the point cloud measurement unit and the image capturing unit (S203).
  • After the coarse alignment (S203), detailed position matching is performed at feature points common to both data sets: points 161 where the displacement changes sharply in the point cloud data of FIG. 5, and locations 162 where the intensity change is large in the image (S204).
  • The position matching is performed so that the sum of the distances between the feature points is minimized.
  • Alternatively, the position matching may be performed so that the distances between the feature points settle at a statistically central value such as the median.
  • FIG. 6 shows a complementary integration flow used when acquiring shape data.
  • The resolution, determined by the lens 122 and the pixel size of the two-dimensional camera 123, is set high enough to capture the minute unevenness of interest, and an image is acquired.
  • FIG. 2 shows a method that calculates the region for point cloud measurement from the CAD data, whereas in the flow of FIG. 6 the region in which the two-dimensional image is used is determined from the comparison between the acquired point cloud data and the CAD data. In this way, complementary integration with the two-dimensional image is also possible.
  • An area for measuring the shape of the sample is set (S300), and point cloud measurement is performed (S301).
  • the CAD data (S302) of the sample is input, and the distance between corresponding points is calculated by aligning with the point group measured in (S301) (S303).
  • For this alignment, the ICP (Iterative Closest Point) method or a similar algorithm widely used for registering point clouds is used; a sketch follows.
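  • A minimal sketch of the S303 alignment and corresponding-point distances using the Open3D library's ICP implementation; the point arrays and the correspondence distance are placeholders.

```python
import numpy as np
import open3d as o3d

measured_xyz = np.random.rand(500, 3)   # placeholder for the measured point cloud
cad_xyz = measured_xyz + 0.01           # placeholder for points sampled from CAD

source = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(measured_xyz))
target = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(cad_xyz))

# Point-to-point ICP; max_correspondence_distance is in the data's units
reg = o3d.pipelines.registration.registration_icp(
    source, target, max_correspondence_distance=0.1,
    estimation_method=o3d.pipelines.registration.TransformationEstimationPointToPoint())
source.transform(reg.transformation)

# Mean corresponding-point distance, as used for the S304 threshold test
dists = np.asarray(source.compute_point_cloud_distance(target))
print("mean corresponding-point distance:", dists.mean())
```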
  • The distances between the measured point cloud and the corresponding points in the CAD data are averaged, and the average value is compared with a threshold (S304). For regions at or above the threshold, where the accuracy of the point cloud measurement sensor is not guaranteed, images are taken with the two-dimensional camera (S305) and shape data is estimated from the captured images (S306).
  • The point cloud from the point cloud measurement sensor and the shape data estimated from the images are registered using their position information (S307), and the desired shape data is acquired (S308). Because the decision whether to adopt the point cloud measurement result is made by comparing the actual measurement with the CAD data, outliers in the point cloud measurement, which the flow of FIG. 2 cannot catch, are removed, improving the measurement accuracy.
  • The threshold compared with the average corresponding-point distance is set to a value larger than the shape defects or surface unevenness to be inspected.
  • FIG. 7 shows a measurement flow in which both point cloud data and image data are acquired before CAD data comparison.
  • An area for measuring the shape of the sample is set (S400), point cloud measurement is performed on the set area (S401), and an image is also taken (S402).
  • The CAD data is read (S403), and for each surface it is determined whether the inclination θ between the sample surface and the point cloud measurement sensor is at or below the threshold determined by the characteristics of the sensor (S404), identifying the surface regions on either side of the threshold.
  • The shape data measured by the point cloud measurement sensor is adopted for the regions at or below the threshold (S406), and shape data is calculated from the shading of the image for the regions above the threshold (S405).
  • the point cloud measurement data and the shape data calculated from the image are aligned (S407), and the shape data is output (S408).
  • An overlapping portion is required when aligning the point cloud data and the image data; acquiring both data sets over the entire region lengthens the measurement, but ensures there is no shortage of overlap. By acquiring both the point cloud data and the image data before the CAD comparison, the more accurate of the two can be used at each location, further improving the measurement accuracy.
  • the measured shape is quantitatively compared with the reference shape.
  • As the reference shape, CAD data representing the ideal shape, self-reference shape data calculated using the similarity of the sample itself, or average shape data of a plurality of samples of the same shape is used; a combination of several of these may also be used.
  • the method for deriving each shape data, the method for quantifying the defect, and the combination are described below.
  • the difference between the measured shape data and CAD data is taken and the difference value is output as a shape defect value.
  • FIG. 8 shows a six-blade impeller 170 as an example of a sample to be measured.
  • The six blades 170a to 170f have the same shape. The impeller is mounted on the apparatus of FIG. 1 so that its central axis 1701 coincides with the y-axis, and it is rotated about the y-axis by the servo motor 103 while being measured by the point cloud measurement sensor 131 and the two-dimensional camera 123, by the method described above.
  • FIG. 9 shows a flow for deriving self-reference shape data.
  • First, the shape and surface data are acquired by the inspection apparatus of FIG. 1.
  • A waveform representing the shape of the impeller 170 is shown in FIG. 10A. The waveform profiles 171a to 171f of the blades 170a to 170f are identical by design. The similarity between the profiles 171a to 171f may be recognized manually by the operator of the apparatus or automatically by a pattern recognition technique. If there are three or more parts of the same shape, the self-reference shape data 171 is calculated from their statistical average (S503).
  • To remove the influence of shape defects and surface unevenness from the self-reference shape data, the statistical average is taken as the median, or as the mean with the maximum and minimum values omitted.
  • The self-reference shape data 171 and the profiles 171a to 171f are aligned (S504), their differences are calculated as shown in FIG. 10B (S505), and the difference values are output as shape defect values (S506). Because the similarity of the sample itself is used, shape defects can be quantified without CAD data; a sketch follows.
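  • A minimal sketch of S503 to S506, assuming the profiles have already been aligned and resampled onto a common axis:

```python
import numpy as np

def self_reference_defects(profiles):
    """profiles: (m, n) array of m aligned profiles of nominally identical
    parts (for example the six blades of the impeller), with m >= 3.

    The reference is a trimmed mean: the maximum and minimum at each sample
    point are dropped, so a defect on one blade does not contaminate the
    self-reference shape data (S503). np.median works equally well.
    """
    s = np.sort(profiles, axis=0)
    reference = s[1:-1].mean(axis=0)    # trimmed mean over the m profiles
    defects = profiles - reference      # per-blade shape defect values (S505)
    return reference, defects
```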
  • The derivation flow of the average shape data is shown in FIG. 11.
  • the shape / surface data is acquired by the inspection apparatus of FIG. 1 (S601).
  • A waveform representing the shape of the impeller 170 is shown in FIG. 12A, which shows the full set of profiles 172a to 172n obtained when the impeller is measured n times.
  • the average shape data 172 is calculated from the statistical average value of these profiles (S602).
  • To remove the influence of shape defects and surface unevenness from the average shape data, the statistical average is taken as the median, or as the mean with the maximum and minimum values omitted.
  • The average shape data 172 and the profiles 172a to 172n are aligned (S603), their differences are calculated as shown in FIG. 12B (S604), and the difference values are output as shape defect values (S605). Self-reference shape data cannot be used unless the sample itself contains similar parts; average shape data, being derived from a plurality of samples, permits comparative inspection even of samples that have no CAD data and no internal similarity.
  • Next, shape defect quantification that combines the three types of reference shape data, CAD data, self-reference shape data, and average shape data, is described.
  • First, the shape defect is quantified using the self-reference shape data and the CAD data, as shown in FIG. 13.
  • the shape / surface unevenness data is measured by the apparatus of FIG. 1 (S701), and the presence or absence of similarity is determined (S702).
  • The similarity may be recognized manually by the operator of the apparatus or automatically by a pattern recognition technique. If there is no similarity, the difference between the measured shape data and the CAD data is taken, and the difference value is output as the shape defect value (S703).
  • If there is similarity, difference values are calculated from the self-reference shape data for the similar parts, following the flow of FIG. 9 (S704), and from the CAD data for the dissimilar parts (S705), and the shape defect values are output (S706).
  • the shape defect is quantified using the self-reference shape data and the average shape data.
  • the shape / surface unevenness data is measured by the apparatus of FIG. 1 (S801), and the presence or absence of similarity is determined (S802).
  • The similarity may be recognized manually by the operator of the apparatus or automatically by a pattern recognition technique. If there is no similarity, the difference between the measured shape data and the average shape data is taken according to the flow of FIG. 11, and the difference value is output as the shape defect value (S803). If there is similarity, difference values are calculated from the self-reference shape data for the similar parts, following the flow of FIG. 9 (S804), and from the average shape data for the dissimilar parts (S805), and the shape defect values are output (S806).
  • When detecting defects such as irregularities on the sample surface, the defect may be detected from the difference in feature quantities between the normal parts and the defective parts of the sample surface imaged by the two-dimensional camera.
  • A feature quantity represents a feature of the image; examples include the higher-order local autocorrelation (HLAC) features.
  • the position of the two-dimensional camera, the position of illumination, and the tilt of the sample are also used as the feature amount.
  • As shown in FIG. 15, the position of the two-dimensional camera 123 is represented by the unit vector I 301 and the position of the illumination 121 by the unit vector S 302, in a Cartesian coordinate system. The tilt of the sample is measured using the point cloud measurement sensor: the normal vector 304 of the triangular patch 303 formed by three adjacent measurement points is used as the feature quantity representing the inclination of the sample.
  • CAD data may be used to derive the normal vector 304.
  • Because the image obtained by the two-dimensional camera 123 depends on the position of the illumination 121, the position of the camera, and the tilt of the sample, adding these to the feature quantities allows the defective parts to be discriminated more accurately. Furthermore, the shielded portions 306 caused by the shape of the sample can be calculated, so the defective part 305 and the normal parts can be discriminated more accurately by taking into account the change in image characteristics caused by the reduced luminance in shadows.
  • Fig. 16 shows the flowchart.
  • An image of the sample is measured with a two-dimensional camera (S1001).
  • For each pixel of the image acquired by the two-dimensional camera (or for values smoothed over several neighboring pixels), a feature vector is calculated consisting of the 25-dimensional HLAC features, the illumination position, the two-dimensional camera position, and the sample normal vector (S1002).
  • The boundary that discriminates the normal parts from the defective parts in the feature space is then determined (S1003), for example with an SVM (Support Vector Machine).
  • Defective parts are extracted using the boundary calculated in S1003 (S1004). The boundary of S1003 may be determined in advance using a pre-inspected sample.
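  • A minimal sketch of S1003 and S1004 using scikit-learn; the feature files, their layout, and the labels are illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC

# Per-pixel feature vectors: HLAC features plus illumination direction,
# camera direction, and surface normal (S1002). The .npy files are
# placeholders; labels (0 = normal, 1 = defect) come from a pre-inspected
# reference sample.
X_train = np.load("features_reference.npy")
y_train = np.load("labels_reference.npy")

clf = SVC(kernel="rbf")     # decision boundary in the feature space (S1003)
clf.fit(X_train, y_train)

X_test = np.load("features_inspected.npy")
defect_mask = clf.predict(X_test)       # defective-part extraction (S1004)
```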
  • a parameter representing the shape of the tool is designated (S901).
  • Here, the wear area 180 at the tip of the drilling tool in FIG. 18 and the angle 181 of the cutting edge are used as the evaluation parameters.
  • the shape of the drilling tool as a sample is measured by the above-described method, and compared with any of CAD data, self-reference shape data, or average shape data, and the shape defect value is quantified (S902).
  • the wear area and the angle of the blade edge are calculated from the shape defect value indicating the difference in shape by distance (S903).
  • the influence of the wear area and the blade edge angle on the machining result is estimated from past experimental results and machining simulation results.
  • FIG. 19 shows a GUI for inputting inspection conditions and outputting results.
  • the GUI is displayed on a monitor 141 mounted on the inspection apparatus.
  • As the inspection conditions, the inspection region and the allowable values for the degree of influence on machining, such as hole depth and roundness, are input; the cutting edge inclination, the wear area, and the estimated degree of influence on machining derived from the measurement results are output. The final determination is output as the inspection result 191.
  • As described above, using the point cloud measurement data and the two-dimensional camera image complementarily realizes three-dimensional shape measurement with higher accuracy than using either the point cloud measurement data or the two-dimensional camera image alone.
  • Judging whether a product has a shape defect, and grading its quality, against the performance required when the processed product or tool is used enables product quality and yield prediction, as well as life management that accounts for the deterioration and wear of machining tools and the resulting loss of machining accuracy.
  • A second embodiment of the present invention will be described with reference to FIG. 20, which shows the apparatus configuration.
  • the basic configuration is the same as that of the first embodiment.
  • It differs in that a polarization camera 200, which can analyze the polarization state of the reflected light, is used instead of the two-dimensional camera 123 of FIG. 1, and a ring illumination system 201 is used instead of the illumination unit 121.
  • In the polarization camera 200, a minute polarizer is attached in front of each pixel; four pixels form one group, and the four pixels in a group have different polarizer orientations. Since the polarization state of the light reflected from sample 1 is known to vary with the direction of the surface, the surface direction can be detected by integrating the different polarization information obtained by the four pixels.
  • Using the polarization information, the direction of the surface can be determined with high accuracy even where there is a color change, such as a stain, that is difficult to identify in an ordinary image.
  • the polarization information is highly responsive to minute irregularities such as scratches, and surface irregularities can be effectively revealed.
  • A third embodiment of the present invention will be described with reference to FIGS. 21 to 29.
  • In this embodiment, the distance measurement method and the strengths of images are combined to measure the three-dimensional shape and surface unevenness more accurately, and the method for quantifying shape defects in the three-dimensional information of the sample restored by complementary integration is described in detail below.
  • measurement data obtained by a plurality of measurement methods are complementarily integrated to improve the stability and accuracy of the measurement method.
  • the distance measurement method is suitable for grasping the global shape, but the measurement accuracy is often insufficient for local changes and minute irregularities.
  • Non-contact distance measurement applies optics such as lasers. Shapes having edges, acute corners, or steep inclinations reflect light at angles very different from flat surfaces, so the measurement accuracy tends to drop there.
  • These laser distance measurement methods irradiate the target with a beam shaped into a point or a line and measure the distance from the position of the reflected light; measuring a 3D shape therefore requires scanning the sample or the laser, and the scanning interval directly becomes the spatial resolution of the measurement.
  • The spatial resolution of image-based shape measurement depends on the pixel size of the two-dimensional camera 123 and the magnification of the lens 122 and is generally finer than the scanning interval, but image-based methods are poor at measuring parts of the shape that change gradually and show few features in the image. The distance measurement method is therefore better than image-based methods at grasping the global shape, but not suited to measuring local minute surface irregularities.
  • Fig. 21 shows the flow of shape inspection using the distance measurement method.
  • A measurement region is determined according to the performance of the distance measurement unit 130 (S1100); a point cloud representing coordinates in 3D space is acquired by the distance measurement unit 130 while the stages move sample 1 over the measurement region determined in S1100 (S1101); exceptional values caused by measurement errors of the distance measurement unit 130 are removed from the measured point cloud (S1102); and a mesh is fitted over the point cloud to obtain measured shape data (S1103).
  • The measured shape data is compared with CAD data, or with non-defective product shape data measured by the same process S1100 to S1103; the shape defect of the measured shape data is quantified (S1104), and an OK/NG determination is made against a threshold (S1105).
  • the distance measurement unit 130 is limited in measurement accuracy by the inclination of the surface of the sample 1.
  • the measurement area in which the measurement accuracy is guaranteed can be determined from the positional relationship between the distance measurement unit 130 and the measurement target.
  • the accuracy of the distance information measured by the distance measuring unit 130 depends on the distance between the distance measuring unit 130 and the sample 1, the inclination of the sample 1 with respect to the distance measuring unit, and the material of the sample 1. An area where necessary accuracy is ensured is set as an inspection area.
  • the shape of the sample 1 may be visually observed to determine a measurement region where measurement accuracy is expected to be guaranteed.
  • (S1101) The relative positions of the distance measurement unit 130 and sample 1 are controlled by the x stage 106, the y stage 107, and the θ stage 108; each stage is driven so as to cover the measurement region of sample 1, and a point cloud representing coordinates in 3D space is acquired. Since the distance measurement unit 130 measures distances from the surface of sample 1, the measurements are converted into 3D space coordinates using the position information of each stage. (S1102) Exceptional values arise in the measured point cloud from measurement errors of the distance measurement unit 130 and are generally removed based on the statistical properties of the point cloud.
  • For example, the spread of the positions of the points within a region of interest can be expressed as a standard deviation, and points lying more than N standard deviations away can be treated as exceptional values, as sketched below.
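  • One common realization of this filter, sketched with a k-nearest-neighbour statistic (the parameters are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def remove_exceptional_values(points, k=8, n_sigma=3.0):
    """points: (n, 3) measured point cloud. Removes points whose mean
    distance to their k nearest neighbours deviates from the global mean
    of that statistic by more than n_sigma standard deviations."""
    tree = cKDTree(points)
    d, _ = tree.query(points, k=k + 1)   # column 0 is the point itself
    mean_d = d[:, 1:].mean(axis=1)
    keep = np.abs(mean_d - mean_d.mean()) <= n_sigma * mean_d.std()
    return points[keep]
```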
  • (S1103) The point cloud is converted into a mesh format suitable for comparison with CAD; methods such as Ball-Pivoting and Power Crust have been proposed for this conversion.
  • S1104 The measurement data meshed in S1103 is compared with CAD data to quantify the shape defect.
  • Alternatively, the measured data can be compared with non-defective product shape data digitized by the same procedure S1100 to S1103 to quantify the shape defect.
  • a threshold value is set in advance for the shape defect value quantified in S1104, and OK / NG determination is automatically performed.
  • FIG. 22 shows a shape inspection flow using the stereo method.
  • A measurement region for stereo measurement is determined (S1200); images from different viewpoints are acquired with the two-dimensional camera 123 while sample 1 is rotated on the θ stage 108 over the measurement region determined in S1200 (S1201); and edges are extracted from the acquired images (S1202).
  • Corresponding points indicating the same locations are searched for among the edges extracted in S1202 from the different viewpoints (S1203); depth information is calculated from the viewpoint displacement and the positional deviation of the corresponding points, and a point cloud representing coordinates in 3D space is derived (S1204).
  • A mesh is fitted over the calculated point cloud to obtain measured shape data (S1205).
  • Measured shape data and CAD data or non-defective product shape data measured in the same process as S1200 to S1205 are compared, the shape defect of the measured shape data is quantified (S1206), a threshold value is provided, and OK / NG A determination is made (S1207).
  • (S1200) A region for shape measurement is determined by stereo measurement, and a location for acquiring an image is determined.
  • (S1201) A rotation angle of the θ stage 108 is set for the measurement region determined in S1200, and multiple images are acquired from different viewpoints while rotating. The appropriate rotation angle depends on the size of the object, but it is set finely enough that the same locations can be matched between the images.
  • S1202 Edges are extracted from the image acquired in step S1201. For the edge extraction, a Canny edge extraction method, a method using a Sobel filter, or the like can be used.
  • (S1203) The correspondence between the edge parts extracted in S1202 from the images taken at different viewpoints is computed.
  • For this correspondence search, the normalized correlation method or the like can be used, as sketched below.
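  • A minimal sketch of the S1203 correspondence search with OpenCV's normalized correlation; the window and search sizes are illustrative, and border handling is omitted.

```python
import cv2

def find_corresponding_point(img_a, img_b, x, y, win=15, search=60):
    """Locate in img_b the patch of img_a centred on (x, y) by normalized
    cross-correlation; (x, y) is assumed at least `win` pixels from the
    border. Returns the matched centre in img_b coordinates."""
    tpl = img_a[y - win:y + win + 1, x - win:x + win + 1]
    y0, x0 = max(0, y - search), max(0, x - search)
    roi = img_b[y0:y + search + 1, x0:x + search + 1]
    res = cv2.matchTemplate(roi, tpl, cv2.TM_CCOEFF_NORMED)
    _, _, _, best = cv2.minMaxLoc(res)   # location of maximum correlation
    return x0 + best[0] + win, y0 + best[1] + win
```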
  • (S1204) Depth information is calculated by the principle of triangulation from the viewpoint displacement, which is known from the stage movement, and the positional deviation of the corresponding points computed in S1203, and a point cloud representing coordinates in 3D space is derived.
  • (S1205) The point cloud calculated in S1204 is converted into a mesh format suitable for comparison with CAD; methods such as Ball-Pivoting and Power Crust have been proposed for this conversion.
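  • A minimal sketch of the meshing step using Open3D's Ball-Pivoting implementation; the pivot-radius heuristic is a common choice, not taken from the patent.

```python
import numpy as np
import open3d as o3d

points = np.random.rand(200, 3)    # placeholder for the point cloud from S1204
pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points))
pcd.estimate_normals()

# Pivot radii on the order of the mean point spacing
avg_d = np.mean(pcd.compute_nearest_neighbor_distance())
radii = o3d.utility.DoubleVector([1.5 * avg_d, 3.0 * avg_d])
mesh = o3d.geometry.TriangleMesh.create_from_point_cloud_ball_pivoting(pcd, radii)
```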
  • S1206 The measurement data meshed in S1205 is compared with CAD data to quantify the shape defect.
  • Alternatively, the measured data can be compared with non-defective product shape data digitized by the procedure S1200 to S1205 to quantify the shape defect.
  • S1207 A threshold value is set in advance for the shape defect value quantified in S1206, and OK / NG determination is automatically performed.
  • FIG. 23 shows a surface irregularity inspection flow using the illuminance difference stereo method.
  • A measurement region for illuminance difference stereo (photometric stereo) measurement is determined (S1300); the region determined in S1300 is illuminated from at least three different directions, and an image is acquired under each illumination.
  • The θ stage 108 is rotated so that images are acquired under each illumination over the entire measurement region (S1301).
  • Assuming the surface of sample 1 is Lambertian, the normal vectors of the surface of sample 1 are derived (S1302), and the reflectance and the illumination directions are calibrated using a reference sample or the like (S1303).
  • The calibrated normal vectors are then integrated to calculate shape data (S1304).
  • A mesh is fitted over the calculated point cloud to obtain measured shape data (S1305).
  • The measured shape data is compared with CAD data, or with non-defective product shape data measured by the same process S1300 to S1305; the shape defect of the measured shape data is quantified (S1306), and an OK/NG determination is made against a threshold (S1307).
  • S1300 A region for shape measurement is determined by illuminance difference stereo measurement, and a location for acquiring an image is determined.
  • (S1301) The measurement region determined in S1300 is illuminated from at least three different directions, and at least three images are acquired.
  • To cover the entire measurement region, the rotation angle of the θ stage 108 is set and images are acquired from different viewpoints while rotating. The appropriate rotation angle depends on the size of the object, but it is set finely enough that the same locations can be matched between the images.
  • (S1302) A matrix calculation is applied to the images acquired in S1301 to calculate the normal vectors: the surface normal vector is obtained from the intensity vector formed by the intensities of the images taken under different illumination directions with the two-dimensional camera 123, and the illumination direction matrix formed by the unit vectors representing those directions.
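  • Under the Lambertian assumption each pixel satisfies I = ρ L n, with L the illumination direction matrix; a minimal sketch of the S1302 matrix calculation:

```python
import numpy as np

def photometric_stereo_normals(images, light_dirs):
    """Per-pixel surface normals from >= 3 images under known lighting.

    images:     (k, h, w) intensities, one image per illumination direction
    light_dirs: (k, 3) unit vectors pointing toward each light source
    Assumes a Lambertian surface, I = rho * (L @ n).
    """
    k, h, w = images.shape
    I = images.reshape(k, -1)                           # (k, h*w) intensity vectors
    # Least-squares solve L @ g = I, where g = rho * n at every pixel
    g, *_ = np.linalg.lstsq(light_dirs, I, rcond=None)  # (3, h*w)
    rho = np.linalg.norm(g, axis=0)                     # albedo
    n = g / np.maximum(rho, 1e-12)                      # unit normals
    return n.reshape(3, h, w), rho.reshape(h, w)
```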
  • the calculated normal vector includes a systematic error.
  • (S1303) The systematic error contained in the normal vectors derived in S1302 can be calibrated using a sample of known shape that is made of the same material as sample 1 and contains three or more different surface inclinations.
  • For this reason, the illuminance difference stereo method is rarely used for absolute shape measurement.
  • (S1304) Integrating the normal vectors derived in S1302 and S1303 yields a point cloud representing the shape information; however, if the normal vector error has not been corrected in S1303, the point cloud inherits that error.
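  • One standard way to carry out the S1304 integration is the Frankot-Chellappa Fourier method, sketched below; the patent does not specify which integration algorithm is used.

```python
import numpy as np

def integrate_normals(n):
    """Integrate a normal map n of shape (3, h, w) into a height map,
    recovered up to an additive constant."""
    nz = np.where(np.abs(n[2]) < 1e-6, 1e-6, n[2])
    p, q = -n[0] / nz, -n[1] / nz                 # surface gradients dz/dx, dz/dy
    h, w = p.shape
    u, v = np.meshgrid(np.fft.fftfreq(w) * 2 * np.pi,
                       np.fft.fftfreq(h) * 2 * np.pi)
    denom = u**2 + v**2
    denom[0, 0] = 1.0                             # avoid dividing by zero at DC
    Z = (-1j * u * np.fft.fft2(p) - 1j * v * np.fft.fft2(q)) / denom
    Z[0, 0] = 0.0                                 # fix the free constant
    return np.real(np.fft.ifft2(Z))
```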
  • (S1305) The point cloud calculated in S1304 is converted into a mesh format suitable for comparison with CAD, using a method such as Ball-Pivoting or Power Crust.
  • the measurement data meshed in S1305 is compared with CAD data to quantify the shape defect.
  • a shape inspection method that complementarily integrates a plurality of shape measurement methods, which is a feature of this embodiment, will be described.
  • When the two-dimensional camera 123 captures a shape having edges, acute corners, or steep inclinations under a given illumination, the brightness difference from the surroundings, or the brightness change, becomes large in the acquired image. The edge portions, acute corners, and steeply inclined shapes that the distance measurement method handles poorly can therefore be restored from the image information (stereo method), and the surface state and the shape of minute unevenness can be derived from the shading in the image (illuminance difference stereo method).
  • Conversely, the global shape that the distance measurement unit 130 handles well appears in the image as a flat intensity distribution with few characteristic parts, so it may be difficult to restore from images.
  • a method for measuring the entire complex shape with high accuracy by exchanging information obtained by each of these methods in a complementary manner, and calibrating and integrating the information will be described.
  • Depending on the shape, some regions may be measurable by only one method; in such regions complementary integration cannot be performed, so that method's results are adopted directly in the integration.
  • FIG. 24 shows the flow, which integrates the flows of FIGS. 21 to 23; only the parts specific to the integration are described here.
  • A method of removing these exceptional values by comparing the methods against each other is described next, with details illustrated in the figure.
  • The figure shows the results of measuring the surface 2000 of sample 1 with the distance measurement unit 130 and with the illuminance difference stereo method; the surface 2000 includes a shape defect 2001.
  • The resolution of the measurement points 2002 from the distance measurement unit depends on the scanning resolution of the stage, while the resolution of the measurement points 2007, obtained by integrating the normal vectors 2006 derived by the illuminance difference stereo method, is determined by the lens magnification and the camera pixel size. In this embodiment, the stage scanning resolution is assumed to be about 100 µm and the pixel size divided by the lens magnification about 10 µm.
  • the experimental results include random exception values 2003 and 2008 depending on the stability of the distance measurement sensor and the camera.
  • Exceptional values are generally detected as outliers relative to the surrounding points: for example, a surface is fitted through adjacent measurement points, its inclination 2004 is derived, and exceptional values are flagged by the amount of angle change of the inclination. In the example shown, the dotted line is the result of the distance measurement unit 130 and the solid line is the result of the illuminance difference stereo method. In the illuminance difference stereo result, point 2011 can be judged an exceptional value because the angle change is locally large at a single point; in the distance measurement result, however, the low resolution makes it impossible to distinguish the defect 2009 from the exceptional value 2010 by the angle change alone.
  • Therefore, the difference between the angle changes obtained by the distance measurement unit and by the illuminance difference stereo is used as the index, and points whose difference is at or above the threshold A (2014) are judged to be exceptional values.
  • an exceptional value is randomly generated at the time of measurement, and the exceptional value can be easily removed by comparing between a plurality of methods.
  • Where the sampling interval of one method is coarser, it is complemented by the other method's data.
  • The threshold A (2014), which sets the degree of exceptional value removal, is a parameter determining the accuracy of the point cloud data; the user may set it to any value that exceeds the measurement accuracy of each measurement method.
  • edge extraction in an image calculates an intensity change between pixels, and recognizes a portion having a large intensity change as an edge.
  • Because such methods judge edges only from the intensity information of a single image, they fail when, owing to the material or the viewpoint at the time of image acquisition, an edge does not appear as an intensity change, as illustrated in the figure. Therefore, the surface normal vectors of sample 1 obtained by the illuminance difference stereo method are used instead: the angle change of the normal direction between adjacent pixels is computed from the normals derived from the multiple images, a threshold B (2015) is set, and locations where the angle change exceeds it are judged to be edges.
  • The threshold B (2015) is set by the user in view of the sharpness of the edges contained in the shape.
  • In the illuminance difference stereo method, deriving the normal vectors from the image intensities requires the light source directions and the reflectance of sample 1 to be known. However, the assumed light source directions contain an error relative to the actual ones, and the assumed reflectance is not an exact value. The normal vectors derived in S1402 therefore contain an error, and the shape derived in S1403 contains a systematic error. To correct this error, the results of the distance measurement unit and of the stereo method are used.
  • A coordinate transformation in 3D space can be written as equation (1): the original coordinates, in homogeneous form, are multiplied from the left by a 3x4 transformation matrix T of twelve coefficients to give the transformed coordinates, (x' y' z')^T = T (x y z 1)^T (1).
  • The twelve coefficients of the transformation matrix can be obtained by deriving the transformation equations between the original and transformed coordinates for three different planes. Details are described below with reference to the figure.
  • First, a range 2050 is set within the measurement result of sample 1 from the distance measurement unit 130, and the equation of the plane 2051 is derived in xyz space.
  • The range 2050 is set by placing a threshold C on the change of the normal vector direction over the measured surface: a region in which the change stays below the threshold C is treated as a plane.
  • the threshold value C for the direction change of the normal vector is designated by the user.
  • Next, the equation of the plane 2052, located at the same place as the plane 2051, is derived from the measurement result of the illuminance difference stereo method, and the coefficients of the first row of the transformation matrix are derived from the two equations. The coefficients of the second and third rows are derived in the same way from two other planes.
  • Finally, the illuminance difference stereo measurement result, expressed as (x y z 1), is transformed by the matrix into the (x' y' z') space, yielding shape information 2053 in which the systematic error has been corrected.
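  • The twelve coefficients can equivalently be estimated by least squares from corresponding points of the two measurements; a minimal sketch of this alternative route (the patent derives them from three plane equations):

```python
import numpy as np

def fit_transform(src, dst):
    """Fit the 3x4 matrix T of equation (1) so that dst ~= T @ [src; 1].

    src, dst: (n, 3) corresponding points from the two sensors, n >= 4.
    """
    src_h = np.hstack([src, np.ones((len(src), 1))])   # homogeneous (n, 4)
    T, *_ = np.linalg.lstsq(src_h, dst, rcond=None)    # (4, 3) solution
    return T.T                                          # (3, 4)

def apply_transform(T, pts):
    """Map (n, 3) points through the fitted 3x4 transform."""
    return (T @ np.hstack([pts, np.ones((len(pts), 1))]).T).T
```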
  • (S1412, S1413) When joining the point clouds acquired by the heterogeneous sensors, a conventional approach weights and adds the points according to corresponding-point distances; here, in addition, the normal vector of the triangle formed by a point of interest and at least two adjacent points is used as the weighting function.
  • A normal vector at the point of interest is derived from that point and its neighbors in the point clouds acquired by the distance measurement method and by the stereo method. When the inner product of this normal with the viewing direction is small, the weight of the points from the distance measurement and stereo methods is increased; as the inner product approaches 1, the weight of the points from the illuminance difference stereo method is increased.
  • (S1413) The normal vectors calculated directly by the illuminance difference stereo method are combined with the point cloud computed in S1412, again using the normal vector as the weight in the same manner as S1412. When the measurement density differs greatly between the point clouds of the different sensors, the data is interpolated so that the normal vectors connect smoothly. Using the normal vector for the point cloud joining takes into account the surface orientation, which the conventional corresponding-point-distance approach ignores, so the point clouds can be combined with higher accuracy.
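  • A minimal sketch of the weighting idea for the fusion step; the exact weighting function is not given in the patent, so the blend below is an assumption.

```python
import numpy as np

def fuse_point_clouds(p_range, p_ps, normals, view_dir):
    """p_range: (n, 3) corresponding points from the range/stereo methods
    p_ps:      (n, 3) corresponding points from illuminance difference stereo
    normals:   (n, 3) unit normals at the points; view_dir: (3,) unit vector.

    Surfaces facing the camera (inner product near 1) favour the photometric
    stereo points; steep surfaces favour the range/stereo points.
    """
    w = np.clip(np.abs(normals @ view_dir), 0.0, 1.0)[:, None]
    return (1.0 - w) * p_range + w * p_ps
```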
  • FIG. 29 shows the GUI (Graphical User Interface) of the shape measuring apparatus. The GUI is displayed on the PC display 400.
  • The threshold A is the parameter that sets the degree of exceptional value removal, the threshold B is the edge extraction parameter, and the threshold C is the parameter that determines which regions can be regarded as planes.
  • the measurement start button 404 is pressed to perform measurement.
  • the measurement result is displayed in a measurement result display window 405.
  • the CAD comparison button 406 compares the measurement result with the CAD data, and the difference from the CAD is displayed in the error display box 407.
  • The magnitude of the error is represented by a statistical maximum, mean, standard deviation, or the like. Depending on the preset allowable error value, NG is displayed in the OK/NG display box 408 when the error is large, and OK when it is small.
  • As described above, by correcting and complementarily integrating the data obtained by the distance measurement method, the stereo method, and the illuminance difference stereo method, the strengths of each three-dimensional shape measurement method are exploited, and three-dimensional shape inspection can be performed with high measurement accuracy regardless of the shape of the measurement target.
  • A fourth embodiment of the present invention will be described with reference to FIGS. 30 and 31.
  • the difference from the third embodiment is that only two types of measurement methods are used: a distance measurement method and an illuminance difference stereo method.
  • The inspection flow is shown in FIG. 30.
  • FIG. 31 shows the GUI.
  • the difference from the third embodiment is that there is no input box for threshold B, which is a parameter related to the stereo method.
  • Although this embodiment uses only the distance measurement method and the illuminance difference stereo method, the present invention is not limited to this combination; for example, the distance measurement method and the stereo method may be combined instead.

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • Theoretical Computer Science (AREA)
  • Length Measuring Devices By Optical Means (AREA)

Abstract

The present invention relates to a three-dimensional shape inspection method and device in which a high level of measurement accuracy is maintained regardless of the shape of the measured object, by complementarily combining several three-dimensional shape measurement methods and surface measurement methods. The present invention relates to a three-dimensional shape inspection device characterized in that it comprises a first three-dimensional shape sensor for acquiring first shape data of an inspection subject, a second three-dimensional shape sensor for acquiring second shape data of the inspection subject different from the first shape data, and a complementary integration unit for correcting and integrating the first shape data and the second shape data.
PCT/JP2012/077386 2011-10-24 2012-10-24 Procédé et dispositif d'inspection de forme WO2013061976A1 (fr)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201280052260.2A CN104024793B (zh) 2011-10-24 2012-10-24 形状检查方法及其装置

Applications Claiming Priority (4)

Application Number Priority Date Filing Date Title
JP2011232468A JP5913903B2 (ja) 2011-10-24 2011-10-24 形状検査方法およびその装置
JP2011-232468 2011-10-24
JP2012053956A JP2013186100A (ja) 2012-03-12 2012-03-12 形状検査方法およびその装置
JP2012-053956 2012-03-12

Publications (1)

Publication Number Publication Date
WO2013061976A1 (fr) 2013-05-02

Family

ID=48167806

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/JP2012/077386 WO2013061976A1 (fr) 2011-10-24 2012-10-24 Procédé et dispositif d'inspection de forme

Country Status (2)

Country Link
CN (1) CN104024793B (fr)
WO (1) WO2013061976A1 (fr)

Families Citing this family (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP6704813B2 (ja) * 2016-08-05 2020-06-03 キヤノン株式会社 計測装置、露光装置、および物品の製造方法
CN108088407B (zh) * 2017-12-15 2020-11-10 成都光明光电股份有限公司 光学玻璃制品形貌偏差校正方法及系统
CN109579733B (zh) * 2018-11-30 2020-12-18 广东省新材料研究所 一种激光3d打印成型尺寸精度快速测算方法
JP7257942B2 (ja) * 2019-11-29 2023-04-14 日立Astemo株式会社 表面検査装置および形状矯正装置、並びに表面検査方法および形状矯正方法
CN110986858A (zh) * 2019-11-30 2020-04-10 深圳市裕展精密科技有限公司 测量装置及测量方法
TWI817487B (zh) * 2021-05-13 2023-10-01 日商芝浦機械股份有限公司 檢測工具的形狀的裝置及檢測工具的形狀的方法
CN113334978B (zh) * 2021-07-07 2021-12-14 东莞市昂图智能科技有限公司 应用于cnc雕刻机的图像采集系统

Family Cites Families (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7286246B2 (en) * 2003-03-31 2007-10-23 Mitutoyo Corporation Method and apparatus for non-contact three-dimensional surface measurement
US7787692B2 (en) * 2003-09-25 2010-08-31 Fujifilm Corporation Image processing apparatus, image processing method, shape diagnostic apparatus, shape diagnostic method and program

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2005122706A (ja) * 2003-09-25 2005-05-12 Fuji Photo Film Co Ltd 形状診断装置、形状診断方法、及びプログラム
JP2005215917A (ja) * 2004-01-29 2005-08-11 Hitachi Plant Eng & Constr Co Ltd 施工図作成支援方法およびリプレースモデル作成方法
JP2007333462A (ja) * 2006-06-13 2007-12-27 Yokohama Rubber Co Ltd:The タイヤ型部材検査方法、タイヤ型部材検査装置、および型部材作製工程精度検査方法
JP2009058459A (ja) * 2007-09-03 2009-03-19 Nikon Corp 形状測定装置
JP2010160135A (ja) * 2008-12-09 2010-07-22 Toshiba Corp タービン発電機におけるステータコイルの接続組立の3次元形状測定方法及び3次元形状測定装置用冶具
JP2011175600A (ja) * 2010-02-25 2011-09-08 Canon Inc 認識装置及びその制御方法、コンピュータプログラム

Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
Szymon Rusinkiewicz et al., "Efficient Variants of the ICP Algorithm," Proceedings of the Third International Conference on 3-D Digital Imaging and Modeling, 2001, pp. 145-152 *

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN106489061A (zh) * 2014-08-29 2017-03-08 日立汽车系统株式会社 部件的制造方法和使用它的制造装置、体积测定方法
JP2017025503A (ja) * 2015-07-17 2017-02-02 清水建設株式会社 建設機器操作アシスト表示システム
JP2019158767A (ja) * 2018-03-15 2019-09-19 東芝テック株式会社 測定装置
JP7073149B2 (ja) 2018-03-15 2022-05-23 東芝テック株式会社 測定装置
JP2019045510A (ja) * 2018-12-07 2019-03-22 株式会社キーエンス 検査装置
WO2021080498A1 (fr) * 2019-10-23 2021-04-29 Winteria Ab Procédé et dispositif d'inspection de géométrie comportant des moyens de capture d'image et de balayage de forme
WO2023053238A1 (fr) * 2021-09-29 2023-04-06 日本電気株式会社 Dispositif de détection de forme, système de détection de forme et procédé de détection de forme

Also Published As

Publication number Publication date
CN104024793B (zh) 2017-02-15
CN104024793A (zh) 2014-09-03

Similar Documents

Publication Publication Date Title
WO2013061976A1 (fr) Procédé et dispositif d'inspection de forme
JP2013186100A (ja) 形状検査方法およびその装置
JP5480914B2 (ja) 点群データ処理装置、点群データ処理方法、および点群データ処理プログラム
US9858659B2 (en) Pattern inspecting and measuring device and program
US8581162B2 (en) Weighting surface fit points based on focus peak uncertainty
JP5913903B2 (ja) 形状検査方法およびその装置
JP7037876B2 (ja) 自動工業検査における3dビジョンの使用
US9171364B2 (en) Wafer inspection using free-form care areas
WO2013015013A1 (fr) Procédé d'inspection d'aspect et appareil pour celui-ci
WO2014136490A1 (fr) Procédé d'examen de forme et dispositif pour celui-ci
US20150346115A1 (en) 3d optical metrology of internal surfaces
CN109716495B (zh) 用于晶片中开口尺寸的光学测量的方法和系统
US10579890B2 (en) Automatic alignment of a 3D model to a test object
JP5385703B2 (ja) 検査装置、検査方法および検査プログラム
EP3531688A2 (fr) Différenciateurs de portee pour la mise au point automatique dans des systèmes d'imagerie optique
Zhang et al. Accuracy improvement in laser stripe extraction for large-scale triangulation scanning measurement system
WO2019212042A1 (fr) Dispositif de mesure et procédé de mesure de forme de vis
JP7353757B2 (ja) アーチファクトを測定するための方法
Bergström et al. Virtual projective shape matching in targetless CAD-based close-range photogrammetry for efficient estimation of specific deviations
US11168976B2 (en) Measuring device for examining a specimen and method for determining a topographic map of a specimen
Lin et al. Real-time image-based defect inspection system of internal thread for nut
JP2020512536A (ja) モデルベースのピーク選択を使用した3dプロファイル決定のためのシステム及び方法
JP6040215B2 (ja) 検査方法
TW201809592A (zh) 自動化三維量測
JP5191265B2 (ja) 光学顕微鏡装置及び光学顕微鏡用データ処理装置

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12844031

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12844031

Country of ref document: EP

Kind code of ref document: A1