EP2619525B1 - Method for optically probing an edge in or on a surface area ("Procédé de palpage optique d'un bord dans ou sur une zone superficielle") - Google Patents
Method for optically probing an edge in or on a surface area
- Publication number
- EP2619525B1 (application EP11760456.1A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- object surface
- reflectance
- surface region
- intensity
- edge
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Active
Classifications
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/02—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness
- G01B11/028—Measuring arrangements characterised by the use of optical techniques for measuring length, width or thickness by measuring lateral position of a boundary of the object
Definitions
- the present invention relates to a method of optically measuring or sensing an edge which is within or confines a surface area.
- the surface area of interest, which contains at least two partial surfaces adjoining the edge, is recorded by a camera.
- a determination of the edge location can be made on the basis of at least one edge location criterion in order to determine the position of the edge.
- a method for edge measurement is known, for example, from DE 102 15 135 A1.
- control information for influencing variables on the camera image is to be determined automatically in order to achieve an optimum measurement result.
- auxiliary parameters are obtained, and from these auxiliary parameters the control information is in turn derived.
- profiles of at least two auxiliary parameters are determined as a function of one or more influencing variables.
- as an auxiliary parameter, for example, the contrast, the homogeneity, the brightness and/or the slope can be used.
- the profiles are determined in such a way that all of them have either a maximum or a minimum at the optimum.
- the individual profiles of the auxiliary parameters are combined by weighted summation into an overall profile.
- the weighting factors can be determined experimentally, for example.
- at the optimum of the overall profile, the value of the influencing variable is taken as control information for that influencing variable. In this way, the setting for recording the camera image, and thus the edge determination, is optimized.
- for the edge determination, several different edge location criteria, for example a threshold criterion and a differential criterion, can be added in a weighted manner in order to improve the determination of the edge location.
- a transmitted light method for measuring tools is also known from the prior art.
- the detection characteristic of the camera is controlled as a function of a control criterion with the aim of optimizing the detection characteristic in order to be able to determine the contour of the tool accurately in transmitted light.
- the exposure time, the signal gain of the image processing sensor and / or the sensor offset are changed.
- the control criteria are derived from a comparison with nominal gray value profiles for the tool contour.
- a method and an apparatus for analyzing at least partially reflective surfaces are described in DE 10 2004 033 526 A1.
- the essence of this method is to perform surface and shape inspection of the device under test utilizing the movement of components of the device.
- the effect is utilized that a pattern reflected by the object surface changes when the relative position or the orientation of the camera relative to the object surface is changed. This pattern change can be evaluated in a control unit.
- US 49 12 336 also describes a device which serves to detect the surface shape.
- punctiform light sources are activated sequentially and in each case an image of the light reflected by the object surface is recorded.
- the device is intended to be suitable for surfaces with different reflection properties.
- the method is intended to be suitable for surfaces exhibiting mirror-like directional reflection as well as for surfaces with diffuse reflection properties, and also for hybrid surfaces.
- an extraction algorithm separates the components of the directed reflection from the components of the diffuse reflection for each measured image intensity in order to be able to calculate the surface orientation.
- a device for testing surface structures is known from WO 2010/003163 A2. There, a surface area is illuminated from different directions by means of light segments of a lighting device. An image is recorded for each illumination direction. These images are provided for further processing.
- the core of the method is to determine the reflectance of the object surface area of interest with pixel accuracy and to optically probe the edge with at least one known edge location criterion.
- the surface area is illuminated successively in different illumination states, for example with different groups of light sources or from different illumination directions, and a sequence of several camera images is recorded in each illumination state.
- This can be done for example by a lighting device with a plurality of light sources, wherein only one or a selected group of light sources is used for the intensity change in each case.
- a light source can be assigned to several groups, but no two groups are identical. Alternatively, at least one positionable, movable light source could be provided, the position of which is changed repeatedly.
- a sequence of multiple images is captured. For each illumination of the object surface area from a selected illumination direction, a sequence or a stack with a plurality of camera images is created in this way. Each camera image of a sequence is recorded at a different illumination intensity. For example, the illumination intensity is gradually increased from camera image to camera image. In particular, the illumination intensity difference between the camera images of a sequence is identical. Preferably, the relative position between the camera and the object surface remains unchanged during the recording of the sequences.
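The acquisition structure described above, one stack (sequence) of camera images per illumination state with the illumination intensity stepped by a constant increment between images, can be sketched with a toy simulation. The linear gray-value model and all array sizes below are illustrative assumptions, not part of the disclosure:

```python
import numpy as np

rng = np.random.default_rng(0)

n_states = 3   # illumination states (light-source groups / directions)
n_steps = 8    # camera images per sequence, one per intensity step
h, w = 4, 4    # toy sensor resolution

# Toy model: each illumination state sees its own per-pixel reflectance.
reflectance = rng.uniform(0.2, 1.0, size=(n_states, h, w))

# Illumination intensity increased stepwise with a constant step size.
intensities = np.linspace(10.0, 80.0, n_steps)

# One sequence (image stack) per illumination state; the gray value of a
# pixel grows linearly with the intensity, scaled by its reflectance.
sequences = np.empty((n_states, n_steps, h, w))
for s in range(n_states):
    for j, intensity in enumerate(intensities):
        sequences[s, j] = reflectance[s] * intensity
```

Each `sequences[s]` plays the role of one sequence S1 to Sn; the relative position of camera and object is assumed to stay fixed while a stack is recorded.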
- a spatially resolved reflectance image with up to pixel accuracy is determined. This is done in particular by determining, on the basis of the camera images of the sequence, a profile of a light-quantity variable for each pixel as a function of the illumination intensity.
- the light-quantity variable describes the quantity of light received by the camera for the respectively assigned pixel.
- the value of the light-quantity variable is, for example, the gray value of a pixel.
- the at least one selection criterion makes it possible to disregard invalid values of the light-quantity variable. For example, over-radiation (blooming) of a pixel may result in incorrect values of the light-quantity variable for that pixel and for adjacent pixels. Such invalid values are omitted when determining the profile of the light-quantity variable for the respective pixels. As a result, for different pixels, values of the light-quantity variable from different images of the same sequence may be disregarded.
- the intensity-dependent profile of the light-quantity variable - for example, its slope in the linearly increasing range - describes the reflectance for the relevant pixel. The reflectances of all pixels together form the reflectance image.
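The slope of the intensity-dependent gray-value profile can be estimated per pixel by a least-squares fit over the image stack. This is a sketch under the assumption that only values from the linear rising range are passed in, i.e. the selection criterion has already discarded invalid values; the function name is illustrative:

```python
import numpy as np

def reflectance_image(stack, intensities):
    """Per-pixel least-squares slope of gray value vs. illumination intensity.

    stack:       (n_steps, h, w) gray values of one sequence
    intensities: (n_steps,) illumination intensities I
    Returns an (h, w) reflectance image (slope per pixel).
    """
    I = np.asarray(intensities, dtype=float)
    g = stack.reshape(len(I), -1).astype(float)
    I_c = I - I.mean()
    # closed-form least-squares slope, evaluated for all pixels at once
    slope = (I_c[:, None] * (g - g.mean(axis=0))).sum(axis=0) / (I_c ** 2).sum()
    return slope.reshape(stack.shape[1:])

# toy sequence: gray value = 0.5 * I for every pixel
I = np.arange(1.0, 6.0)
stack = 0.5 * I[:, None, None] * np.ones((5, 2, 2))
R = reflectance_image(stack, I)   # slope 0.5 everywhere
```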
- the result reflectance image may preferably be formed by a weighted addition of the reflectances of the locally matching pixels from the reflectance images.
- the weighting factors can also be equal to zero, so that individual reflectance images are not taken into account in the calculation of the result reflectance image.
- a weighting factor greater than or equal to zero preferably results from a comparison of local reflectance changes of the reflectance images. This is intended to ensure a constructive superposition for the two edge surfaces delimiting the object surface area and to prevent a destructive superposition. In this comparison, for example, the location-dependent change of the reflectance in a considered image segment can be evaluated.
- only reflectance images whose location-dependent reflectance profiles are equal, or similar within a given tolerance range, are taken into account when determining the result reflectance image. For example, a reflectance image whose reflectance decreases or stays substantially the same from pixel to pixel in an image segment, while in all other reflectance images the reflectances increase from pixel to pixel in the locally corresponding image segment, is ignored.
- the reflectance images which are not to be taken into account can be assigned a weighting factor of zero for the weighted addition.
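A minimal sketch of the weighted addition of locally matching pixels; a weight of zero drops the corresponding reflectance image entirely. Normalizing by the sum of the weights is an implementation choice, not something the text prescribes:

```python
import numpy as np

def result_reflectance(reflectance_images, weights):
    """Weighted addition of reflectance images R1..Rn into a result image E."""
    imgs = np.asarray(reflectance_images, dtype=float)
    W = np.asarray(weights, dtype=float)
    if W.sum() == 0.0:
        raise ValueError("all weighting factors are zero")
    # weight each image, sum over the image index, normalize by the weight sum
    return np.tensordot(W, imgs, axes=1) / W.sum()

R1 = np.full((2, 2), 1.0)
R2 = np.full((2, 2), 3.0)
R3 = np.full((2, 2), 99.0)   # e.g. an unusable illumination state
E = result_reflectance([R1, R2, R3], weights=[1.0, 1.0, 0.0])  # R3 ignored
```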
- the determined weighting factors are assigned to the currently measured object or object surface area, or to the type of the object or object surface area, and stored.
- the properties of the reflectance images are substantially the same for object surface areas of the same type. This knowledge can be reused in later edge measurements of objects or object surface areas of the same type, for example when repeated edge measurements are made on structurally identical tools or workpieces. Sequences of camera images whose illumination state did not yield a usable reflectance image need not be recorded again. For later edge measurements on objects or object surface areas of the same type, it is then sufficient to record only those sequences whose reflectance images were used in the determination of the result reflectance image, for example those to which a weighting factor greater than zero was assigned.
- a minimum weighting factor can also be specified; for subsequent edge measurements on the same object type, only those sequences are recorded whose reflectance image previously received, in other measurements, a weighting factor at least equal to the minimum weighting factor. Such measures can significantly reduce the necessary measurement time.
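The reuse of stored weighting factors together with a minimum weighting factor might look as follows. The registry layout, sequence names and threshold value are illustrative assumptions:

```python
# Hypothetical store of weighting factors per object type, filled by
# earlier measurements (the layout is an assumption, not from the patent).
stored_weights = {
    "tool-type-A": {"S1": 0.7, "S2": 0.0, "S3": 0.2},
}

def sequences_to_record(object_type, min_weight=0.1,
                        all_sequences=("S1", "S2", "S3")):
    """Select which sequences are worth recording for a known object type."""
    weights = stored_weights.get(object_type)
    if weights is None:
        return list(all_sequences)   # unknown type: record everything
    return [s for s in all_sequences if weights.get(s, 0.0) >= min_weight]
```

For "tool-type-A" only S1 and S3 would be recorded again, which is where the reduction in measurement time comes from.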
- the result reflectance image thus obtained serves to determine the edge in or on the object surface area along measurement lines with at least one edge location criterion known per se. Since far more data is available for determining the edge profile than would be necessary, the edge determination according to the inventive method is an overdetermined problem, so that the position of the edge is determined more accurately than from a single camera image.
- the generation of a reflectance image for the edge determination offers the advantage that even edges formed only by two differently oriented surface sections, whose surface properties are otherwise indistinguishable, can be determined exactly. Local variations in the contour sharpness, for example due to macroscopic machining traces, do not lead to inaccurate edge determination.
- the process is robust and requires little computing power. It is particularly suitable for the optical probing of workpieces, such as tools in incident light.
- the device 10 serves for optically probing an edge 11 of an object surface area 12 of an object 9.
- in the object surface region 12, for example, two partial surfaces 18 adjoin the edge 11 to be determined.
- the device 10 has a camera 13, preferably a matrix camera.
- the device 10 includes a lighting device 14, with a plurality of light sources 15.
- the light sources 15 are arranged distributed around the lens 16 of the camera 13 or its optical axis 17. The distances of the light sources 15 from the optical axis 17 may be equal.
- a control device 20 controls both the camera 13 and the illumination device 14.
- the controller 20 may adjust the camera settings such as the focus position, the exposure time, the aperture.
- the control device 20 controls the illumination device 14. It selects the light source 15 or group of light sources 15 to be used for the current illumination state for changing the illumination intensity I and sets the illumination intensity I of the light source 15 or of the light sources 15 for the camera image B to be recorded by the camera 13.
- the illumination direction L1, L2 can be changed, from which the object surface area 12 is illuminated, wherein the object surface area 12 can also be illuminated simultaneously from a plurality of different illumination directions L1, L2.
- in the embodiment, the illumination direction L1, L2 is defined by the position of the light source 15 or of the group of light sources 15, as seen in the circumferential direction around the optical axis 17. In the illumination direction L1, L2, the light is incident on the object surface area 12 of interest.
- a light source 15 could be arranged to be movable about the optical axis 17.
- in a first method step 30, at least one sequence, each with a plurality of camera images B, is recorded under the control of the control device 20.
- a first light source 15 or a first group of light sources 15 of the illumination device 14 is used for intensity-variable illumination.
- the object surface area 12 is illuminated from one or more illumination directions L1, L2, whereby a first illumination state is predetermined.
- the intensity I of the light emitted onto the object surface area of interest 12 is changed.
- the first light source 15 or first group of light sources 15 intended for this purpose is driven accordingly by the control device 20.
- the intensity I is, for example, increased or decreased stepwise with constant step sizes, and a camera image B is recorded for each intensity value.
- sequences S2 to Sn are optionally recorded.
- the lighting condition is changed.
- a different light source 15 or another group of light sources 15 of the illumination device 14 is used for illuminating the object surface area 12 for each sequence S1 to Sn.
- alternatively, it is also possible to use a movable light source 15 and to change its position in the different illumination states.
- the object surface area 12 can be illuminated with colored light.
- At least one of the light sources 15 of the illumination device 14 can emit colored light. It is also possible to generate a basic brightness for all sequences over a portion of the light sources 15 in order to adapt the brightness change by the intensity-variable light source 15 or intensity-variable group of light sources 15 in the different illumination states from the basic brightness to the operating point of the camera 13.
- using the camera images B of a sequence S1 to Sn, a profile V1 to Vk of the light quantity received by the camera 13 from the object surface area 12 is determined locally for each pixel P1 to Pk. According to the resolution of the camera 13, depending on the number k of available pixels P, a corresponding number of intensity-dependent profiles V1(I) to Vk(I) thus arises per sequence S1 to Sn.
- gray levels GW are detected as a function of intensity by the camera 13, so that an intensity-dependent gray level profile V1 (I) to Vk (I) is determined for each pixel P1 to Pk of a sequence S1 to Sn.
- exemplary intensity-dependent gray value profiles V1(I), V2(I) are shown in FIG. 8.
- the intensity-dependent gray value gradients V1 (I) to Vk (I) are determined as a function of the illumination intensity I.
- the intensity-dependent gray level profile V1 (I) to Vk (I) and in particular its rising range can be adapted for each pixel P1 to Pk to the operating point of the camera 13.
- the basic brightness can also be changed, for example by a part of the light sources 15 of the illumination device 14 providing the desired basic brightness.
- the gray value minimum GW min and / or the gray value maximum GW max can be shifted over the basic brightness.
- the local change of gray values GW within a contiguous image segment 21 of a camera image B can serve as a selection criterion.
- the local gray value change in the image segment 21 can be compared with predetermined limit values.
- the local gray value profiles of the locally matching image segments 21 in the different camera images B can be compared. In doing so, a majority decision can be made.
- such deviations in the local gray value profile can occur, for example, through the over-radiation (blooming) of a pixel. This leads to erroneous gray values at that pixel. Because of crosstalk and the point spread function of the camera, all immediately adjacent pixels can also be affected.
- the over-radiation of a pixel PU can be caused, for example, by machining traces in the object surface area 12.
- the over-radiation of a pixel PU can be detected, for example, as follows:
- the absolute value of the gray value minimum GW min and/or of the gray value maximum GW max is appreciably greater than for the directly adjacent pixels PN (FIG. 3). If this situation is detected in an image segment 21, the gray values detected in this camera image B for the over-radiated pixel PU and for the neighboring pixels PN of this image segment 21 are disregarded in the determination of the intensity-dependent gray value profile V(I).
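The over-radiation test could be implemented roughly as follows: flag a pixel whose gray-value maximum clearly exceeds that of all its direct neighbours, and discard the neighbours as well, since crosstalk and the point spread function contaminate them. The threshold ratio is an assumed parameter:

```python
import numpy as np

def over_radiation_mask(gmax_img, ratio=1.5):
    """Mask the over-radiated pixel PU and its neighbours PN.

    gmax_img: (h, w) per-pixel gray value maximum over one sequence.
    """
    h, w = gmax_img.shape
    mask = np.zeros((h, w), dtype=bool)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            neigh = gmax_img[y - 1:y + 2, x - 1:x + 2].astype(float).copy()
            neigh[1, 1] = -np.inf   # exclude the centre pixel itself
            if gmax_img[y, x] > ratio * neigh.max():
                # drop PU and all 8 neighbours PN for this sequence
                mask[y - 1:y + 2, x - 1:x + 2] = True
    return mask

img = np.full((5, 5), 10.0)
img[2, 2] = 100.0             # over-radiated pixel PU
m = over_radiation_mask(img)  # the 3x3 block around (2, 2) is masked
```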
- the reflectance for the respectively corresponding pixel P1 to Pk is determined from the intensity-dependent gray value profiles V1(I) to Vk(I), and from the reflectances of the k pixels P1 to Pk a reflectance image R1 to Rn is determined for each sequence S1 to Sn, as illustrated schematically in FIG. 4.
- the reflectance can be, for example, the slope of the intensity-dependent gray value profile V1(I) to Vk(I) of the respective pixel P1 to Pk, or another parameter which characterizes the intensity-dependent gray value profile in the region of its linear rise.
- in a fourth method step 33, the individual reflectance images R1 to Rn are combined to form a common result reflectance image E.
- This is preferably done by a weighted addition of the reflectance images R1 to Rn.
- this weighted addition is shown by way of example in FIG. 5.
- the weighting factors W1 to Wn of the individual reflectance images R1 to Rn may be equal to zero or greater than zero. If one of the weighting factors W1 to Wn equals zero, the relevant reflectance image is not taken into account in the determination of the result reflectance image E, as illustrated by the example of the third reflectance image R3. This may be necessary because, for example, depending on the material properties of the object surface area 12, opposing reflectance profiles can occur.
- location-dependent reflectance profiles in the reflectance images R1 to Rn are compared, and only those reflectance images R1 to Rn whose location-dependent reflectance profiles superimpose constructively, so to speak, during the addition are taken into account in the calculation of the result reflectance image E.
- it is also possible to mark the current illumination state, characterized here by the light source 15 or group of light sources 15 currently used for the intensity change, as neutral for the current object type or the current object surface area type, and to store this identification in a memory of the control device 20. For later edge measurements on the same object types or object surface area types, this information can be used.
- the gradient of the reflectance profile can be determined as a function of location and compared with the corresponding location-dependent gradients of the reflectance profiles of the other reflectance images. If the difference between location-dependent gradients of different reflectance images R1 to Rn lies within a permissible tolerance range, these images can be taken into account in the determination of the result reflectance image E. If the location-dependent reflectance profile of a reflectance image R1 to Rn lies outside the tolerance range in one or more parameters, a weighting factor W1 to Wn of zero is assigned to this reflectance image R1 to Rn, so that it does not enter into the calculation of the result reflectance image E.
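The comparison of location-dependent gradients with a tolerance range can be sketched as follows. Using the elementwise median gradient as the reference profile and a fixed tolerance are assumptions made for illustration; the text only requires that images whose reflectance profiles disagree receive a weighting factor of zero:

```python
import numpy as np

def weights_from_gradients(reflectance_images, tol=0.1):
    """Weight 1 for images whose spatial gradient agrees with the consensus
    (median) gradient within `tol`, weight 0 otherwise."""
    imgs = np.asarray(reflectance_images, dtype=float)
    grads = np.gradient(imgs, axis=2)             # gradient along image rows
    ref = np.median(grads, axis=0)                # consensus gradient profile
    dev = np.abs(grads - ref).mean(axis=(1, 2))   # mean deviation per image
    return (dev <= tol).astype(float)

x = np.linspace(0.0, 1.0, 8)
rising = np.tile(x, (8, 1))
images = [rising, rising, 1.0 - rising]   # third image runs opposite
W = weights_from_gradients(images)        # third image is weighted out
```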
- the weighting factors W1 to Wn are preferably assigned to the type of the object 9 or the type of the object surface area 12 and stored. If objects 9 or object surface areas 12 of the same type are optically probed in future edge measurements, then the stored data can be used.
- based on the stored weighting factors W1 to Wn, it can thus be decided in the first method step 30 which illumination states are to be selected and which sequences of camera images B are to be recorded.
- the procedure is quasi-learning. The more different object types or object surface area types have already been edge-measured, the more a priori knowledge is available for future edge measurements.
- besides the weighting factors W1 to Wn, it is also possible to store other data characterizing the usability of specific illumination states.
- in a fifth method step 34, the result reflectance image E is evaluated.
- an edge location criterion K, by means of which the position or the course of the edge 11 in the result reflectance image E can be determined, is used.
- as edge location criterion K, an integral edge location criterion is preferably used.
- the integral of the location-dependent reflectance curve in the result reflectance image E is determined and evaluated along predeterminable measurement lines. It is understood that alternatively or additionally, other edge location criteria, such as a differential edge location criterion, could be used.
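One common reading of an integral edge location criterion on a single measurement line is to place an ideal step so that its integral equals that of the measured reflectance profile, which yields a subpixel edge position. The sketch below assumes a profile running between two plateau values:

```python
import numpy as np

def integral_edge_position(profile):
    """Subpixel edge position on one measurement line (integral criterion).

    Solves  area = r_low * x_e + r_high * (n - x_e)  for the edge
    abscissa x_e, where area is the trapezoid integral of the profile,
    n the line length in pixel units, and r_low/r_high the plateaus.
    """
    p = np.asarray(profile, dtype=float)
    r_low, r_high = p[0], p[-1]
    area = 0.5 * (p[0] + p[-1]) + p[1:-1].sum()   # trapezoid rule, unit spacing
    n = len(p) - 1
    return (r_high * n - area) / (r_high - r_low)

# ideal step halfway between pixel 4 and pixel 5
profile = np.array([0.0] * 5 + [1.0] * 5)
x_edge = integral_edge_position(profile)   # 4.5
```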
- the invention relates to a method for determining the position of an edge 11 in or on an object surface area 12 of interest by optical probing. For this purpose, the reflectance of the object surface area 12 is evaluated. Light is irradiated onto the object surface area 12 under different illumination states, in particular from different directions of light incidence, and in each illumination state a sequence S1 to Sn of camera images B is recorded. Each camera image B of a sequence S1 to Sn is recorded at a different illumination intensity I. From several or all camera images B of a sequence S1 to Sn, a reflectance image R1 to Rn is subsequently generated in each case.
- from the reflectance images R1 to Rn, a result reflectance image E is generated by weighted addition, in which the position of an edge 11 in or on the object surface area 12 of interest is determined with subpixel accuracy by applying at least one edge location criterion.
Claims (12)
- Method for optically determining an edge (11) in or on an object surface area (12) of an object (9), comprising the following steps: irradiating the object surface area (12) with a first group of light sources (15) or from a first illumination direction (L1) and recording a first sequence (S1) of several camera images (B) of the object surface area (12) at respectively different illumination intensities (I); recording at least one further sequence (S2 to Sn) of several camera images (B) of the object surface area (12) at respectively different illumination intensities (I), the object surface area (12) being irradiated for each of the sequences (S1 to Sn) with a different group of light sources (15) or from a different illumination direction (L2); forming a spatially resolved reflectance image (R1 to Rn) for each of the sequences (S1 to Sn) from at least some of the camera images (B) of the respective sequence (S1 to Sn), wherein a profile (V1 to Vk) of a light-quantity variable (GW) is determined for each pixel as a function of the illumination intensity (I), the light-quantity variable (GW) describing the quantity of light received by a camera (13) for the respectively associated pixel (P1 to Pk); and using at least one of the reflectance images (R1 to Rn) to determine at least one edge (11) in or on the object surface area (12).
- Method according to claim 1, characterized in that only those values of the light-quantity variable (GW) which satisfy at least one predetermined selection criterion are used for forming the intensity-dependent profile (V1 to Vk).
- Method according to claim 1, characterized in that one or more of the following criteria are used as selection criterion: values of the light-quantity variable (GW) below a minimum intensity value (Imin) of the illumination intensity (I) correspond at most to a minimum light-quantity value (Gmin); values of the light-quantity variable (GW) above a maximum intensity value (Imax) of the illumination intensity (I) correspond at least to a maximum light-quantity value (Gmax); from a minimum intensity value (Imin) up to a maximum intensity value (Imax) of the illumination intensity (I), the values of the light-quantity variable (GW) increase.
- Method according to claim 1, characterized in that a result reflectance image (E) is formed on the basis of at least one of the reflectance images (R1 to Rn).
- Method according to claim 4, characterized in that the result reflectance image (E) is formed by weighted addition of the reflectances of the locally corresponding pixels (P) from the reflectance images (R1 to Rn).
- Method according to claim 5, characterized in that a weighting factor (W1 to Wn) greater than or equal to zero is determined for each reflectance image (R1 to Rn), which results from a comparison of local reflectance profiles of the reflectance images (R1 to Rn).
- Method according to claim 5, characterized in that at least some of the determined weighting factors (W1 to Wn) are assigned to the type of the currently measured object (9) or of the currently measured object surface area (12) and are stored.
- Method according to claim 7, characterized in that the stored weighting factors (W1 to Wn) are taken into account in subsequent determinations of the position of an edge (11) for an object (9) or an object surface area (12) of the same type.
- Method according to claim 7, characterized in that, when determining the edge (11), it is first checked whether weighting factors (W1 to Wn) are stored in the memory for the type of object (9) or object surface area (12) currently to be measured and, if so, the selection of the sequences (S1 to Sn) of camera images (B) to be recorded is defined as a function of the stored weighting factors (W1 to Wn).
- Method according to claim 4, characterized in that the position of the at least one edge (11) is determined with the aid of the result reflectance image (E).
- Method according to claim 1, characterized in that, during the recording of a sequence (S1 to Sn), the relative position between a camera (13) and the object surface area (12) remains unchanged.
- Method for optically determining an edge (11) in or on an object surface area (12) of an object (9), comprising the following steps: (a) checking whether weighting factors (W1 to Wn), or other data characterizing the usability of specific illumination states, are stored in a memory for the type of object (9) or object surface area (12) currently to be measured; (b) carrying out the method according to one of the preceding claims if no entry was found in the memory in step (a); (c) carrying out the following method steps if an entry was found in the memory: (c1) selecting at least one sequence (Si) of camera images (B) to be recorded, as a function of the stored weighting factors (W1 to Wn) or of the other data characterizing the usability of specific illumination states; (c2) irradiating the object surface area (12) in an illumination state associated with the respective sequence (Si) to be recorded, and recording the at least one selected sequence (Si), each with several camera images (B) of the object surface area (12) at respectively different illumination intensities (I); (c3) forming a spatially resolved reflectance image (Ri) for each of the selected and recorded sequences (Si) from at least some of the camera images (B) of the respective sequence (Si), wherein a profile (V1 to Vk) of a light-quantity variable (GW) is determined for each pixel as a function of the illumination intensity (I), the light-quantity variable (GW) describing the quantity of light received by a camera (13) for the respectively associated pixel (P1 to Pk); (c4) using at least one of the reflectance images (Ri) to determine at least one edge (11) in or on an object surface area (12).
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102010037746A DE102010037746B4 (de) | 2010-09-23 | 2010-09-23 | Verfahren zum optischen Antasten einer Kante in oder an einem Oberflächenbereich |
PCT/EP2011/066135 WO2012038349A1 (fr) | 2010-09-23 | 2011-09-16 | Procédé de palpage optique d'un bord dans ou sur une zone superficielle |
Publications (2)
Publication Number | Publication Date |
---|---|
EP2619525A1 EP2619525A1 (fr) | 2013-07-31 |
EP2619525B1 true EP2619525B1 (fr) | 2015-07-22 |
Family
ID=44674785
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP11760456.1A Active EP2619525B1 (fr) | 2010-09-23 | 2011-09-16 | Procédé de palpage optique d'un bord dans ou sur une zone superficielle |
Country Status (5)
Country | Link |
---|---|
US (1) | US9280721B2 (fr) |
EP (1) | EP2619525B1 (fr) |
CN (1) | CN103370598B (fr) |
DE (1) | DE102010037746B4 (fr) |
WO (1) | WO2012038349A1 (fr) |
Families Citing this family (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN111787246B (zh) * | 2013-11-26 | 2023-07-14 | 株式会社尼康 | 摄像元件及摄像装置 |
JP6290651B2 (ja) * | 2014-02-27 | 2018-03-07 | 株式会社キーエンス | 画像測定器 |
JP6278741B2 (ja) * | 2014-02-27 | 2018-02-14 | 株式会社キーエンス | 画像測定器 |
JP2015169624A (ja) * | 2014-03-10 | 2015-09-28 | キヤノン株式会社 | 計測装置、計測方法及び物品の製造方法 |
JP6355487B2 (ja) * | 2014-08-29 | 2018-07-11 | 株式会社Screenホールディングス | エッジ位置検出装置およびエッジ位置検出方法 |
KR101726061B1 (ko) * | 2015-07-28 | 2017-04-12 | 주식회사 포스코 | 판 위치 측정 장치, 판 쏠림 제어 장치 및 판 쏠림 계산 방법 |
DE102016100437B4 (de) * | 2016-01-12 | 2018-08-02 | Stephan Krebs | Vorrichtung zur Druckbildkontrolle |
JP6522181B2 (ja) * | 2018-02-09 | 2019-05-29 | 株式会社キーエンス | 画像測定器 |
JP2020021105A (ja) * | 2018-07-30 | 2020-02-06 | キヤノン株式会社 | 画像処理装置、画像処理方法及びプログラム |
CN110991485B (zh) * | 2019-11-07 | 2023-04-14 | 成都傅立叶电子科技有限公司 | 一种目标检测算法的性能评估方法及系统 |
Family Cites Families (18)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US4912336A (en) | 1989-02-21 | 1990-03-27 | Westinghouse Electric Corp. | Surface shape and reflectance extraction system |
DE4123916C2 (de) * | 1990-07-19 | 1998-04-09 | Reinhard Malz | Verfahren und Vorrichtung zum beleuchtungsdynamischen Erkennen und Klassifizieren von Oberflächenmerkmalen und -defekten eines Objektes |
US6025905A (en) * | 1996-12-31 | 2000-02-15 | Cognex Corporation | System for obtaining a uniform illumination reflectance image during periodic structured illumination |
WO1999022224A1 (fr) * | 1997-10-29 | 1999-05-06 | Vista Computer Vision Ltd. | Systeme d'eclairage pour examen d'objets |
DE10215135A1 (de) | 2001-04-18 | 2002-10-24 | Zeiss Carl | Verfahren zur automatischen Regelung von Fokus und Beleuchtung, sowie zur objektivierten Antastung des Kantenortes in der optischen Präzisionsmesstechnik |
US6990255B2 (en) * | 2001-09-19 | 2006-01-24 | Romanik Philip B | Image defect display system |
US7003161B2 (en) * | 2001-11-16 | 2006-02-21 | Mitutoyo Corporation | Systems and methods for boundary detection in images |
DE10237426B4 (de) | 2002-08-12 | 2010-06-02 | Joachim Egelhof | Verfahren und Vorrichtung zum Vermessen von Werkzeugen |
US7738725B2 (en) * | 2003-03-19 | 2010-06-15 | Mitsubishi Electric Research Laboratories, Inc. | Stylized rendering using a multi-flash camera |
DE102004033526A1 (de) | 2004-07-08 | 2006-02-02 | Universität Karlsruhe (TH) Institut für Mess- und Regelungstechnik | Verfahren und Vorrichtung zur Analyse zumindest partiell reflektierender Oberflächen |
JP2006038639A (ja) * | 2004-07-27 | 2006-02-09 | Brother Ind Ltd | 端部位置検出装置及び方法、並びにプログラム |
DE102007003060A1 (de) | 2007-01-15 | 2008-07-17 | Technische Universität Ilmenau | Verfahren zur Bestimmung der Güte eines Messpunktes bei der Kantendetektion in der optischen Längenmesstechnik |
CN101324422B (zh) * | 2007-06-12 | 2011-01-19 | 北京普瑞微纳科技有限公司 | 白光干涉测量样品表面形状精细分布的方法及其装置 |
JP5268088B2 (ja) * | 2008-02-28 | 2013-08-21 | レーザーテック株式会社 | 表面形状測定装置及び表面形状測定方法 |
AT507018B1 (de) * | 2008-07-09 | 2012-11-15 | Profactor Res And Solutions Gmbh | Vorrichtung zur prüfung von gegenständen |
JP5243980B2 (ja) * | 2009-01-28 | 2013-07-24 | 新東エスプレシジョン株式会社 | 二次元測定装置 |
JP2010185324A (ja) | 2009-02-11 | 2010-08-26 | Denso Corp | 燃料噴射弁 |
US8280172B1 (en) * | 2011-03-22 | 2012-10-02 | Mitutoyo Corporation | Edge location measurement correction for coaxial light images |
- 2010
  - 2010-09-23 DE DE102010037746A patent/DE102010037746B4/de not_active Expired - Fee Related
- 2011
  - 2011-09-16 EP EP11760456.1A patent/EP2619525B1/fr active Active
  - 2011-09-16 WO PCT/EP2011/066135 patent/WO2012038349A1/fr active Application Filing
  - 2011-09-16 CN CN201180046030.0A patent/CN103370598B/zh active Active
- 2013
  - 2013-03-15 US US13/836,336 patent/US9280721B2/en not_active Expired - Fee Related
Also Published As
Publication number | Publication date |
---|---|
CN103370598A (zh) | 2013-10-23 |
DE102010037746B4 (de) | 2013-01-24 |
DE102010037746A1 (de) | 2012-03-29 |
EP2619525A1 (fr) | 2013-07-31 |
US20130208987A1 (en) | 2013-08-15 |
WO2012038349A1 (fr) | 2012-03-29 |
CN103370598B (zh) | 2016-07-13 |
US9280721B2 (en) | 2016-03-08 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
EP2619525B1 (fr) | Procédé de palpage optique d'un bord dans ou sur une zone superficielle | |
DE102010043052B4 (de) | Verfahren zur präzisen Dimensionsprüfung und Fokussierung | |
DE10081029B4 (de) | Bildbearbeitung zur Vorbereitung einer Texturnalyse | |
DE3937950C2 (fr) | ||
DE102011081668B4 (de) | Vorrichtung und Verfahren zur Bestimmung der Form des Endes einer Schweißnaht | |
EP1610270A2 (fr) | Procédé d'évaluation d'un materiel avec au moins une caractéristique d'identification | |
DE102017215334A1 (de) | Verfahren, Computerprogrammprodukt und Messsystem zum Betrieb mindestens eines Triangulations-Laserscanners zur Identifizierung von Oberflächeneigenschaften eines zu vermessenden Werkstücks | |
EP1379835B9 (fr) | Procede de reglage automatique de la distance focale et de l'eclairage, ainsi que de demarrage objective du palpage de la position d'arete dans la technique de mesure de precision optique | |
DE102014107143B4 (de) | System und Verfahren zur Messung der Verschiebung einer Objektoberfläche | |
DE102013209770B4 (de) | Verfahren zur Bestimmung von einstellbaren Parametern mehrerer Koordinatenmessgeräte sowie Verfahren und Vorrichtung zur Erzeugung mindestens eines virtuellen Abbilds eines Messobjekts | |
DE102016202928B4 (de) | Verbessertes Autofokusverfahren für ein Koordinatenmessgerät | |
DE4309802A1 (de) | Produktionsnahe Farbkontrolle mit bildgebenden Sensoren | |
EP3291172A1 (fr) | Procédé de détermination de données d'image de résultat | |
DE102010017604B4 (de) | Verfahren zum optischen Messen von Strukturen eines Objekts | |
DE102020120887B4 (de) | Verfahren zum erfassen einer einhängeposition eines auflagestegs und flachbettwerkzeugmaschine | |
DE102019208114B4 (de) | Vorrichtung und Verfahren zur 3D Vermessung von Objektkoordinaten | |
DE102011104435B3 (de) | Zerstörungsfreie Bestimmung von Werkstoffeigenschaften | |
DE102011000088A1 (de) | Verfahren zur Ermittlung eines Verfahrweges bei der Messung von Strukturen eines Objekts | |
DE102005045748A1 (de) | Messvorrichtung zum Vermessen eines Werkstücks | |
EP3798570B1 (fr) | Procédé d'étalonnage d'un système de mesure optique, système de mesure optique et objet d'étalonnage pour un système de mesure optique | |
EP1136787B1 (fr) | Reproduction d'un objet avec compensation de la distance ( taille, couleur ) | |
DE102008049859A1 (de) | Verfahren und Prüfsystem zur optischen Prüfung einer Kontur eines Prüfobjekts | |
EP3889895A1 (fr) | Procédé de recherche des bords d'un objet de mesure | |
DE102005036421B4 (de) | Verfahren zum automatischen Bestimmen der exakten Position von Passermarken | |
DE102018101995B3 (de) | Vorrichtung zur Messung nach dem Lichtschnitt-Triangulationsverfahren |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
PUAI | Public reference made under article 153(3) epc to a published international application that has entered the european phase |
Free format text: ORIGINAL CODE: 0009012 |
|
17P | Request for examination filed |
Effective date: 20130305 |
|
AK | Designated contracting states |
Kind code of ref document: A1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
DAX | Request for extension of the european patent (deleted) | ||
GRAP | Despatch of communication of intention to grant a patent |
Free format text: ORIGINAL CODE: EPIDOSNIGR1 |
|
INTG | Intention to grant announced |
Effective date: 20150313 |
|
GRAS | Grant fee paid |
Free format text: ORIGINAL CODE: EPIDOSNIGR3 |
|
GRAA | (expected) grant |
Free format text: ORIGINAL CODE: 0009210 |
|
AK | Designated contracting states |
Kind code of ref document: B1 Designated state(s): AL AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HR HU IE IS IT LI LT LU LV MC MK MT NL NO PL PT RO RS SE SI SK SM TR |
|
REG | Reference to a national code |
Ref country code: GB Ref legal event code: FG4D Free format text: NOT ENGLISH |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: EP |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: FG4D Free format text: LANGUAGE OF EP DOCUMENT: GERMAN |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: REF Ref document number: 738163 Country of ref document: AT Kind code of ref document: T Effective date: 20150815 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R096 Ref document number: 502011007394 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: LT Ref legal event code: MG4D |
|
REG | Reference to a national code |
Ref country code: NL Ref legal event code: MP Effective date: 20150722 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: FI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: LV Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: GR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151023 Ref country code: LT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: NO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151022 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: HR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: PT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151123 Ref country code: ES Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: IS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20151122 Ref country code: RS Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: SE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: PL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R097 Ref document number: 502011007394 Country of ref document: DE |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: DK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: EE Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: MC Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: CZ Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: SK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: LU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150916 |
|
PLBE | No opposition filed within time limit |
Free format text: ORIGINAL CODE: 0009261 |
|
STAA | Information on the status of an ep patent application or granted ep patent |
Free format text: STATUS: NO OPPOSITION FILED WITHIN TIME LIMIT |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: RO Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 |
|
26N | No opposition filed |
Effective date: 20160425 |
|
REG | Reference to a national code |
Ref country code: IE Ref legal event code: MM4A |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: IE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20150916 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SI Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 6 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: MT Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: SM Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: HU Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT; INVALID AB INITIO Effective date: 20110916 Ref country code: BG Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: NL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: CY Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: BE Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20150930 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 7 |
|
REG | Reference to a national code |
Ref country code: AT Ref legal event code: MM01 Ref document number: 738163 Country of ref document: AT Kind code of ref document: T Effective date: 20160916 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AT Free format text: LAPSE BECAUSE OF NON-PAYMENT OF DUE FEES Effective date: 20160916 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: TR Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 Ref country code: MK Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 |
|
REG | Reference to a national code |
Ref country code: FR Ref legal event code: PLFP Year of fee payment: 8 |
|
PG25 | Lapsed in a contracting state [announced via postgrant information from national office to epo] |
Ref country code: AL Free format text: LAPSE BECAUSE OF FAILURE TO SUBMIT A TRANSLATION OF THE DESCRIPTION OR TO PAY THE FEE WITHIN THE PRESCRIBED TIME-LIMIT Effective date: 20150722 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: GB Payment date: 20220929 Year of fee payment: 12 Ref country code: DE Payment date: 20220615 Year of fee payment: 12 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: FR Payment date: 20220930 Year of fee payment: 12 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: IT Payment date: 20220929 Year of fee payment: 12 |
|
PGFP | Annual fee paid to national office [announced via postgrant information from national office to epo] |
Ref country code: CH Payment date: 20221024 Year of fee payment: 12 |
|
REG | Reference to a national code |
Ref country code: DE Ref legal event code: R119 Ref document number: 502011007394 Country of ref document: DE |
|
REG | Reference to a national code |
Ref country code: CH Ref legal event code: PL |
|
GBPC | Gb: european patent ceased through non-payment of renewal fee |
Effective date: 20230916 |