EP2062221A1 - Method and device for the integration of optical 3D measuring and inspection systems into image processing systems - Google Patents
- Publication number
- EP2062221A1 (application number EP07802032A)
- Authority
- EP
- European Patent Office
- Prior art keywords
- sensor
- data
- interface
- image processing
- processing system
- Prior art date
- Legal status
- Withdrawn
Classifications
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T1/00—General purpose image data processing
- G06T1/0007—Image acquisition
Definitions
- the invention relates to a method according to the preamble of claim 1 and a device according to the preamble of claim 48 for the uniform integration of optical 3D measuring and inspection systems into image processing systems, and for the evaluation and display of 3D data on these systems.
- Figure 1 is an overview of known 3D systems
- Figure 2 is a schematic diagram of 2D cameras in conjunction with an image processing system
- FIG. 3 shows a schematic diagram of a 3D sensor in conjunction with an image processing system
- Figure 4a-f is a schematic diagram of a 3D sensor integrated into an image processing system
- FIG. 5 is a schematic diagram of the timing control taken over by the 3D sensor
- FIG. 6 is a schematic diagram of the conversion of 2D images into 3D data in the sensor
- FIG. 7 is a schematic diagram of an intelligent 3D sensor.
- Optical measurement and inspection methods increasingly rely on topographic, three-dimensional (3D) data of the component to be tested. These 3D methods are usually operated on the basis of specialized optical, electronic and mechanical hardware, in close connection with specially adapted control and evaluation software (FIG. 1). This specialization accounts for the peculiarities of the 3D measuring and testing procedure as well as the special characteristics of the 3D data. The majority of 3D systems are therefore isolated solutions in terms of hardware and software, far removed from standardization. Automatic evaluation of the result data is only partially available; often it has to be created specifically for the respective application. The strength of these systems lies in the 3D measuring method as such (the 3D sensor) and less in the evaluation software.
- the 3D sensors A, B,... are connected to the image processing system X and its software by means of one or more standardized interfaces, for example a known camera interface.
- the specific evaluation of the image data takes place with the aid of the software modules for the 3D sensors A, B, which include the sensor-specific features
- the 3D data supplied by the sensors A and B are, according to the invention, converted into 2D data and displayed as follows.
- first some 3D methods are considered in more detail.
- interferometric methods: methods of white-light interferometry and confocal microscopy are used, which, however, require long measurement times.
- inclination-measuring methods: methods that acquire the surface inclination directly are hereinafter referred to as inclination-measuring methods.
- computed tomography (CT): a component is irradiated with X-rays and images are taken of it. From these images, the three-dimensional structure of the component can be calculated.
- topographic data can be displayed and evaluated.
- the further processing of this data is done as follows.
- these data can be displayed and visualized in gray-tone-coded form.
- a certain minimum height is assigned a minimum gray value, e.g. zero, and a maximum height a maximum gray value, e.g. 255.
- for all height values in between, a linear assignment is preferred; other types of assignment are possible.
- for height data in particular, it makes sense to use more than the 256 levels corresponding to an 8-bit quantization, since height data are usually much finer in their resolution.
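The linear gray-value coding described in the bullets above can be sketched as follows. This is a minimal illustration assuming numpy; the function name `height_to_gray` and the choice of 16-bit output are my own, chosen to show a quantization finer than 8 bits.

```python
import numpy as np

def height_to_gray(z, levels=65536):
    """Map a height map linearly onto gray values 0 .. levels-1.

    The minimum height receives gray value 0, the maximum height the
    highest gray value; all heights in between are assigned linearly.
    More than 256 levels (here 16 bit) preserve the fine resolution
    typical of height data.
    """
    z = np.asarray(z, dtype=np.float64)
    z_min, z_max = z.min(), z.max()
    if z_max == z_min:                      # flat surface: single gray value
        return np.zeros_like(z, dtype=np.uint16)
    gray = (z - z_min) / (z_max - z_min) * (levels - 1)
    return np.round(gray).astype(np.uint16)

# Example: a tilted plane as a height map
z = np.linspace(0.0, 1.0, 11).reshape(1, -1)
g = height_to_gray(z)
```

Other (non-linear) assignments, as the text notes, are equally possible; only the mapping function would change.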
- another advantageous method is to produce slope data by differentiating height data. Under certain circumstances it may be necessary to smooth the data first, since differentiation makes the measurement uncertainty in the form of noise more prominent.
- the inclination-measuring methods provide the inclination data directly, which usually do not need to be smoothed.
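The smoothing-then-differentiation step can be sketched as below. This is an assumption-laden illustration (numpy, a plain 3x3 box filter as the smoothing, and the hypothetical function name `height_to_slope`); any low-pass filter would serve equally well.

```python
import numpy as np

def height_to_slope(z, dx=1.0, smooth=True):
    """Produce slope images dz/dx and dz/dy by differentiating a height map.

    Differentiation makes measurement noise more prominent, so the height
    map can first be smoothed; a 3x3 box filter is used here as a minimal
    choice.
    """
    z = np.asarray(z, dtype=np.float64)
    if smooth:
        h, w = z.shape
        padded = np.pad(z, 1, mode="edge")
        # 3x3 box filter: mean of the nine shifted copies of the map
        z = sum(padded[i:i + h, j:j + w]
                for i in range(3) for j in range(3)) / 9.0
    dzdy, dzdx = np.gradient(z, dx)   # numpy returns the row (y) axis first
    return dzdx, dzdy

# A tilted plane z = 0.5*x has constant slope 0.5 in x and 0 in y
y, x = np.mgrid[0:8, 0:8]
dzdx, dzdy = height_to_slope(0.5 * x, smooth=False)
```

For data from inclination-measuring sensors, this step is skipped entirely: the sensor already delivers dz/dx and dz/dy.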
- the inclination data can be displayed and visualized in gray tones. For example, a minimum gray value, e.g. zero, is associated with a particular minimum slope, and a maximum gray value, e.g. 255, with a maximum slope. For all inclination values in between, a linear assignment is preferred; other types of assignment are possible.
- a quantization with more than 8 bits, or as a floating-point number, makes sense.
- the inclination in the x direction, dz/dx, and in the y direction, dz/dy, are preferably displayed in two different images.
- tilt data can be displayed in a cylindrical coordinate system.
- the inclination in the radial direction dz/dr and in the tangential direction dz/dφ are shown separately. This is particularly advantageous for components with rotational symmetry, in particular turned parts.
- radial structures, e.g. turning grooves
- structures in the tangential direction, e.g. impact marks
- differentiation with respect to the arc length r·φ is also possible.
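The conversion from Cartesian slope images to the radial/tangential representation follows from the chain rule. The sketch below (numpy assumed; the function name and the test geometry are my own) takes the tangential derivative per arc length r·dφ, as suggested above for rotationally symmetric parts.

```python
import numpy as np

def slopes_to_polar(dzdx, dzdy, cx, cy):
    """Convert Cartesian slope images into radial and tangential slopes.

    For a rotation center (cx, cy), the chain rule gives
        dz/dr      =  cos(phi)*dz/dx + sin(phi)*dz/dy
        dz/ds_tang = -sin(phi)*dz/dx + cos(phi)*dz/dy
    where the tangential derivative is taken per arc length r*dphi.
    """
    h, w = dzdx.shape
    y, x = np.mgrid[0:h, 0:w]
    phi = np.arctan2(y - cy, x - cx)
    dzdr = np.cos(phi) * dzdx + np.sin(phi) * dzdy
    dzdt = -np.sin(phi) * dzdx + np.cos(phi) * dzdy
    return dzdr, dzdt

# A cone z = r has radial slope 1 and tangential slope 0 everywhere
h = w = 9
y, x = np.mgrid[0:h, 0:w]
r = np.hypot(x - 4, y - 4)
r[4, 4] = 1.0                      # avoid the singular center point
dzdx_c, dzdy_c = (x - 4) / r, (y - 4) / r
dzdr, dzdt = slopes_to_polar(dzdx_c, dzdy_c, 4, 4)
```

For a turned part, radial features (e.g. grooves) then stand out in dz/dr while tangential features appear in the tangential image.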
- other coordinate systems, e.g. a coordinate system with a diagonal orientation relative to the x and y axes, or other angles, are possible, depending on which preferred directions the component has.
- inclination images can now be evaluated automatically with existing image processing tools, for example based on threshold values. If a certain slope value is exceeded or undershot locally, this location of the component is marked as faulty. Alternatively, it can be regarded as an error if such a value is exceeded or fallen short of over an entire region of the component.
- the inclination images can also be inspected for defects using edge filters or other filters. Many other known methods of image processing are also applicable.
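The threshold-based evaluation described above is a one-liner in practice. A minimal sketch (numpy assumed; function name and verdict convention are my own):

```python
import numpy as np

def mark_defects(slope, limit):
    """Threshold evaluation of an inclination image.

    Every location where the slope magnitude exceeds the limit is
    marked as faulty; the part is rated good only if no pixel exceeds it.
    """
    defect_map = np.abs(slope) > limit
    return defect_map, not defect_map.any()

# A mostly flat slope image with one steep outlier
slope = np.zeros((5, 5))
slope[2, 3] = 0.8
defect_map, is_good = mark_defects(slope, limit=0.5)
```

The region-based variant mentioned in the text would additionally require the exceedance to cover a connected region of some minimum size before reporting a fault.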
- the following procedure can also be selected.
- the local height z(x, y) is differentiated twice, for example only in the x direction, d²z/dx², or only in the y direction, d²z/dy².
- an offset of the inclination, for example due to a tilt of the component, is thereby eliminated.
- when the Laplace operator was previously used in image processing on brightness images, a vivid interpretation of the result was possible only to a very limited extent.
- edges in the image are amplified, but it remained uncertain whether the image represented geometrical object edges of interest or perhaps inconsequential brightness differences on the component without a relevant topographical feature.
- applied to topographical data, the Laplace operator gains a very concrete meaning: it can be interpreted as the local curvature of the surface. If, for example, a small local spherical depression with radius R is detected, the Laplace operator yields a value that is approximately proportional to the inverse sphere radius 1/R, i.e. a measure of the curvature of the surface. A local elevation with radius R returns the value −1/R, i.e. a negative value.
- local elevations and depressions are distinguishable from each other by means of the sign.
- the value zero of the Laplace operator corresponds to a flat surface, which is also the desired shape for many technical products.
- only one further differentiation step, instead of two, is then required to obtain curvature data.
- the inclination in the x direction, dz/dx, is differentiated once more with respect to x; the inclination in the y direction, dz/dy, is differentiated with respect to y.
- Other variants are possible, for example a different choice of the coordinate system.
- the curvature images can likewise be evaluated automatically with existing image processing tools, for example based on threshold values. If a certain curvature value is locally exceeded or undershot, this location of the component is marked as faulty. In contrast to the inclination images, all directions are evaluated equally in the curvature image; a horizontal feature appears as clearly as a vertical one, making this a rotation-invariant test, which is desirable in many testing tasks. If certain directions are to be emphasized, this can also be achieved for curvature images by means of suitable directional filters.
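The curvature interpretation of the Laplacian can be checked on a synthetic height map. The sketch below (numpy assumed; a standard 5-point discrete Laplacian, my own choice of stencil) uses a paraboloid pit z ≈ (x² + y²)/(2R), the local approximation of a spherical depression of radius R; its Laplacian is 2/R, i.e. proportional to 1/R as stated in the text, and an elevation flips the sign.

```python
import numpy as np

def curvature_image(z):
    """Approximate the Laplacian d2z/dx2 + d2z/dy2 of a height map.

    On topographic data the Laplacian reads as local surface curvature:
    positive for a depression, negative for an elevation, zero for a
    flat surface. Border pixels are left at zero.
    """
    z = np.asarray(z, dtype=np.float64)
    lap = np.zeros_like(z)
    lap[1:-1, 1:-1] = (z[:-2, 1:-1] + z[2:, 1:-1] +
                       z[1:-1, :-2] + z[1:-1, 2:] -
                       4.0 * z[1:-1, 1:-1])
    return lap

# Spherical pit of radius R, approximated near its bottom
y, x = np.mgrid[-3:4, -3:4]
R = 10.0
pit = (x**2 + y**2) / (2 * R)
lap = curvature_image(pit)          # interior values: 2/R = 0.2
```

Because the Laplacian sums both second derivatives, the result is the same for horizontal and vertical features, matching the rotation invariance noted above.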
- in addition to the topographical 3D data, optical 3D sensors usually provide further information about the test object, namely how much light returns from each location of the surface to the sensor. This information corresponds to a common 2D camera image representing brightness. In connection with optical 3D sensors, this image is referred to as a texture image to distinguish it from topographical 3D images. Under certain illumination conditions of the 3D sensor, which are mostly given, the texture image corresponds to the local reflection coefficient ρ(x, y) of the test object. This designation is used in the figures as an alternative to the term "texture."
- the texture image is suitable for automatic evaluation in the image processing system, in particular for the position tracking of test operations. The contrast between the test object and the background enables precise and reliable localization of the test object in the image.
- topographical data and/or texture data of a test part can be compared with predetermined design data, in particular CAD data.
- the data of the test part can be compared in the form of point clouds, voxel data, triangular meshes or as a CAD file.
- a height image can be generated from the CAD data, which corresponds to the respective test view of the sensor and describes the desired state (reference image).
- the height image of the respective test part is compared with the height image derived from the CAD data, and a rating (good part, bad part, etc.) is derived therefrom.
- the above-mentioned methods of comparing height images of test parts with CAD data and / or reference images are by no means limited to height images. According to the invention, it is also possible to compare the inclination images and / or curvature images and / or texture images with reference images.
- the reference images of the slope and / or the curvature and / or texture may also be generated from CAD data and / or from images of good parts and / or smoothed images of the specimen itself.
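The reference-image comparison described above reduces, in its simplest form, to a tolerance check on the deviation image. A minimal sketch (numpy assumed; function name, verdict strings, and the flat nominal part are my own illustrative choices — the reference could equally be rendered from CAD data, averaged from good parts, or smoothed from the specimen itself):

```python
import numpy as np

def compare_to_reference(image, reference, tolerance):
    """Compare a height/slope/curvature/texture image with a reference
    image and derive a rating.

    The part is rated good when no local deviation exceeds the tolerance.
    """
    deviation = np.abs(np.asarray(image, float) - np.asarray(reference, float))
    return "good part" if deviation.max() <= tolerance else "bad part"

reference = np.zeros((4, 4))          # nominal flat part (stand-in for CAD render)
dented = reference.copy()
dented[1, 2] = 0.3                    # dent beyond tolerance
rating_ok = compare_to_reference(reference + 0.01, reference, tolerance=0.05)
rating_bad = compare_to_reference(dented, reference, tolerance=0.05)
```

The same comparison applies unchanged to inclination, curvature, and texture images, as the text emphasizes.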
- FIG. 4a shows the integration of a 3D sensor into an image processing system according to the invention.
- practically all 3D sensors have in common that they use a number n of camera images E1(x, y), E2(x, y), …, En(x, y) and several illuminations, usually also n in number: B1, B2, …, Bn.
- in the example shown, n = 4.
- the image processing system takes over the temporal control of the 3D sensor and its illumination as well as the image acquisition. The illumination is controlled via timing and the digital inputs/outputs (digital I/Os) of the image processing system; image acquisition is triggered by means of a trigger signal (hardware or software trigger).
- FIG. 4b shows the point in time at which a first illumination B1 is activated by the image processing system and an image E1 is triggered and recorded. The image is transmitted to the image processing system via a standard interface. This is followed by illumination B2 with image E2 (FIG. 4c), B3 with E3 (FIG. 4d), up to illumination B4 with E4 (FIG. 4e).
- the transmitted images are processed into 3D data in the software module of the image processing system according to the invention; these in turn are passed on as height data, inclination data, curvature data and/or texture data to the automatic evaluation (FIG. 4f).
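The acquisition sequence of FIGS. 4b-4e can be sketched as a simple control loop. Everything below is a hypothetical stand-in: the `Sensor` class is a stub standing in for a real camera/lighting driver (FireWire, CameraLink, Ethernet, USB, etc.); in a real system `set_light` would toggle digital I/Os and `trigger` would fire the hardware or software trigger.

```python
class Sensor:
    """Stub for a 3D sensor reachable over a standard camera interface."""
    def __init__(self, n_lights):
        self.n_lights = n_lights
        self.active_light = None
    def set_light(self, i):          # digital I/O: activate illumination Bi
        self.active_light = i
    def trigger(self):               # hardware/software trigger: capture Ei
        return ("image", self.active_light)

def acquire_sequence(sensor):
    """Acquisition driven by the image processing system: for each
    illumination B1..Bn, activate the light, trigger image E1..En, and
    collect the images for the 3D software module."""
    images = []
    for i in range(1, sensor.n_lights + 1):
        sensor.set_light(i)
        images.append(sensor.trigger())
    return images

frames = acquire_sequence(Sensor(n_lights=4))   # n = 4, as in FIG. 4
```

Because the loop is driven by the image processing system, the assignment of each image to its illumination is known implicitly; in the sensor-driven variant (FIG. 5) this assignment must be transmitted separately.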
- the time control can also be taken over by the 3D sensor (FIG. 5).
- timing by the image processing system has the advantage that the image processing system knows which illumination and which recording are currently active or being recorded. If the 3D sensor takes over the temporal control, the correct assignment of the images to the illuminations can be transmitted in another way, for example via an additional signal to the image processing system (dashed line). Alternatively, the assignment can also be transmitted via the standard interface.
- alternatively, the 3D data are already generated in the 3D sensor and, with the help of the standardized interface, passed on as height data, inclination data, curvature data and/or texture data to the automatic evaluation (FIG. 6).
- This calculation can be performed on a sensor-own microprocessor, an FPGA or other computing units.
- 3D data can also be passed on in other forms, such as point clouds, triangular meshes, CAD data sets or voxel data. This is particularly useful once sophisticated and standardized tools for the automatic evaluation of these data are available.
- both the 3D sensors and the image processing systems can be upgraded and extended to this new standard.
- a device is also explained here.
- Such a device is suitable for carrying out the method according to the invention.
- these are optical 3D sensors that can be integrated via a standardized interface into a computer unit and / or software for image processing, in particular 2D image processing.
- This 3D sensor can transmit 2D camera images to the image processing system.
- the 3D sensor can already be designed to generate internally finished 3D data and pass it on in one of the formats mentioned (height image, slope image, curvature image, texture image, point cloud, triangular mesh, CAD data sets, voxel data, etc.).
- the 3D sensor can also be designed to perform the automatic evaluation of the 3D data itself (FIG. 7).
- the method for uniform integration of optical 3D sensors in an image processing system is characterized in that different 3D sensors can be integrated into the same image processing system.
- a software module is preferably integrated into the image processing system, which is adapted to the particular features of the respective 3D sensor.
- the image processing system preferably performs an automatic inspection of components, the automated test being carried out in particular on the basis of height images and/or inclination images and/or curvature images and/or texture images and/or point clouds and/or voxel data and/or triangular meshes and/or CAD data.
- the automatic testing of components preferably takes place in such a way that a comparison of height images and / or inclination images and / or curvature images and / or texture images with one or more reference images is carried out.
- the reference image of the component to be tested is preferably generated from CAD data and / or a good part and / or the component to be tested itself.
- the 3D sensor transmits camera images to the image processing system, and a standardized interface, in particular a standardized camera interface, is used for the integration of the 3D sensor into the image processing system.
- the interface can be a FireWire interface, a CameraLink interface, an Ethernet interface, a USB interface, an LVDS interface, or an analog interface.
- the 3D sensor is controlled by means of the interface.
- the 3D sensor can have a lighting device that can be controlled with the aid of an interface.
- adjustment parameters of the 3D sensor can be set with the aid of the interface.
- the image processing system controls the 3D sensor.
- the image processing system can also control the lighting device of the 3D sensor.
- the 3D data are preferably generated in a software module of the image processing system; alternatively, the 3D data may be generated in the 3D sensor. It can also be provided that the 3D data are generated partly in the 3D sensor and partly in the image processing system.
- the automatic evaluation is preferably carried out in the image processing system, alternatively in the 3D sensor.
- the 3D sensor is preferably a tilt-measuring 3D sensor. However, it can also be a height-measuring or a curvature-measuring 3D sensor.
- the 3D data can be further processed and / or displayed as height data, as inclination data or as curvature data.
- inclination data or curvature data can be further processed and/or displayed relative to a Cartesian coordinate system. However, it can also be provided that inclination data or curvature data are processed and/or displayed relative to a polar coordinate system.
- the method is designed to test parts having different geometries, in particular to test essentially rotationally symmetric parts, and preferably processes and / or displays curvature data.
- 3D data in the method proposed here are displayed in gray scale.
- for example, a laser light section method, a fringe projection method, a method of interferometry, a method of white-light interferometry, a photometric stereo method, a method of photometric deflectometry or a method of confocal microscopy can be used.
- the present invention also relates to a 3D optical sensor suitable for carrying out a method of the type discussed here.
- the optical 3D sensor preferably has a standardized interface for integration into a computing unit and / or the software of image processing systems. Furthermore, a standardized camera interface can be provided for integration into a computing unit and / or the software of image processing systems.
- the interface of the optical 3D sensor may be a FireWire interface, a CameraLink interface, an Ethernet interface, a USB interface, an LVDS interface or an analog interface.
- the 3D sensor preferably has an interface with the aid of which it can be controlled externally.
- the 3D sensor preferably has an interface with the aid of which its lighting unit can be controlled.
- the optical 3D sensor preferably has an interface with the aid of which setting parameters can be controlled.
- the optical 3D sensor is a sensor according to the method of photometric deflectometry. But it can also be provided that it is a sensor according to the photometric stereo method, a laser light section sensor, a fringe projection sensor, an interferometric sensor, a white light interferometric sensor or a confocal microscope. In particular, it is provided that the 3D sensor itself performs an automatic evaluation of 3D data.
Abstract
The present invention relates to a method for integrating a 3D sensor into an image processing system.
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE200610041200 DE102006041200A1 (de) | 2006-09-02 | 2006-09-02 | Verfahren und Vorrichtung zur Einbindung von optischen 3D-Mess- und Prüfsystemen in Systeme der Bildverarbeitung |
PCT/EP2007/007614 WO2008028595A1 (fr) | 2006-09-02 | 2007-08-31 | Procédé et dispositif pour l'intégration de systèmes optiques de mesure en 3d et de vérification dans des systèmes de traitement d'images |
Publications (1)
Publication Number | Publication Date |
---|---|
EP2062221A1 (fr) | 2009-05-27 |
Family
ID=38787711
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
EP07802032A Withdrawn EP2062221A1 (fr) | 2006-09-02 | 2007-08-31 | Procédé et dispositif pour l'intégration de systèmes optiques de mesure en 3d et de vérification dans des systèmes de traitement d'images |
Country Status (3)
Country | Link |
---|---|
EP (1) | EP2062221A1 (fr) |
DE (1) | DE102006041200A1 (fr) |
WO (1) | WO2008028595A1 (fr) |
Families Citing this family (1)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN113538680A (zh) * | 2021-06-10 | 2021-10-22 | 无锡中车时代智能装备有限公司 | Three-dimensional measurement method and device based on binocular photometric stereo vision |
Application timeline:
- 2006-09-02: DE 102006041200 A1 filed (DE 200610041200), not active (ceased)
- 2007-08-31: EP 2062221 A1 filed (EP 07802032), not active (withdrawn)
- 2007-08-31: WO 2008/028595 A1 filed (PCT/EP2007/007614), active application filing
Non-Patent Citations (1)
Title |
---|
See references of WO2008028595A1 * |
Also Published As
Publication number | Publication date |
---|---|
WO2008028595A1 (fr) | 2008-03-13 |
DE102006041200A1 (de) | 2008-03-13 |
WO2008028595A8 (fr) | 2008-05-29 |
Legal Events
- PUAI: Public reference made under Article 153(3) EPC to a published international application that has entered the European phase (original code: 0009012)
- 17P: Request for examination filed, effective 2009-04-02
- AK: Designated contracting states (kind code of ref document: A1): AT BE BG CH CY CZ DE DK EE ES FI FR GB GR HU IE IS IT LI LT LU LV MC MT NL PL PT RO SE SI SK TR
- AX: Request for extension of the European patent, extension states: AL BA HR MK RS
- DAX: Request for extension of the European patent (deleted)
- 17Q: First examination report despatched, effective 2013-01-17
- STAA: Status: the application is deemed to be withdrawn
- 18D: Application deemed to be withdrawn, effective 2013-07-30