US20210325313A1 - Method and device for recognising and analysing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies
- Publication number
- US20210325313A1 (U.S. application Ser. No. 17/260,677)
- Authority
- US
- United States
- Prior art keywords
- illumination
- identified
- surface defects
- reflective surface
- pixels
- Prior art date
- Legal status
- Granted
Classifications
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/30—Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
- G01B11/306—Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces for measuring evenness
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/2513—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object with several lines being projected in more than one direction, e.g. grids, patterns
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01B—MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
- G01B11/00—Measuring arrangements characterised by the use of optical techniques
- G01B11/24—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
- G01B11/25—Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
- G01B11/254—Projection of a pattern, viewing through a pattern, e.g. moiré
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/0002—Inspection of images, e.g. flaw detection
- G06T7/0004—Industrial image inspection
- G06T7/001—Industrial image inspection using an image reference approach
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T7/00—Image analysis
- G06T7/10—Segmentation; Edge detection
- G06T7/13—Edge detection
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
- G01N2021/8809—Adjustment for highlighting flaws
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
- G01N2021/8822—Dark field detection
- G01N2021/8825—Separate detection of dark field and bright field
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8806—Specially adapted optical and illumination features
- G01N2021/8829—Shadow projection or structured background, e.g. for deflectometry
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/84—Systems specially adapted for particular applications
- G01N21/88—Investigating the presence of flaws or contamination
- G01N21/8851—Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
- G01N2021/8854—Grading and classifying of flaws
- G01N2021/888—Marking defects
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/20—Special algorithmic details
- G06T2207/20084—Artificial neural networks [ANN]
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30204—Marker
-
- G—PHYSICS
- G06—COMPUTING; CALCULATING OR COUNTING
- G06T—IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
- G06T2207/00—Indexing scheme for image analysis or image enhancement
- G06T2207/30—Subject of image; Context of image processing
- G06T2207/30248—Vehicle exterior or interior
- G06T2207/30252—Vehicle exterior; Vicinity of vehicle
Definitions
- the present invention relates to a method for recognizing and analyzing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies, and a device for recognizing and analyzing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies, in particular according to such a method, at least comprising: a first illumination device, in particular an illumination arch or a handheld module, for projecting an illumination pattern on a surface of the object; a camera, activatable by a light barrier in particular, for recording images in the form of a grid graphic made up of pixels; and a control and data analysis unit.
- a method and a device for recognizing surface defects in an object, preferably for recognizing paint defects on the surface of a motor vehicle body, are known from DE 37 12 513 A1, in which a light stripe is generated on the surface by means of an illumination system and guided over the object, wherein its relative movement with respect to the surface is recorded step-by-step and used to determine surface defects.
- DE 10 2010 015 566 A1 moreover discloses a method and system for measuring reflective surfaces, in which multiple patterns are generated on a luminescent surface, which are reflected on the reflective surface and are captured using a sensor surface of a camera, wherein the coordinates of surface points of the inspected surface are determined from the captured measurement data, from the known coordinates of the luminescent surface and sensor surface, and from the distance between sensor surface and/or luminescent surface and at least one support point arranged on the surface.
- Both methods are based on the “scanning” of the surface of the object to be inspected and the subsequent calculation of a point cloud, which depends on the relative velocity between object and respective device and/or the distance between device and object and which can then be used for a further surface analysis. These methods are computationally complex and elaborate in design.
- the present invention is based on the object of providing a method improved in comparison to the prior art and an improved device for recognizing and analyzing surface defects in three-dimensional objects having a reflective surface, which are usable quickly, cost-effectively, and in a mobile manner and advantageously enable objective automated damage classification of surface defects, in particular of hail damage on motor vehicle or aircraft bodies.
- This object is first achieved by a method for recognizing and analyzing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies, having the features of independent claim 1 , and by a device for carrying out such a method having the features of independent claim 14 .
- Further advantageous embodiments and refinements, which are usable individually or in combination with one another, are the subject matter of the dependent claims.
- the method according to the invention is distinguished in that surface defects are identified on the basis of the evaluation of at least one image recorded by at least one camera in the form of a grid graphic made up of pixels of an illumination pattern projected by at least one first illumination unit on at least a part of the surface on the basis of a two-dimensional grid coordinate system; accordingly, the device according to the invention is distinguished in relation to devices forming the generic type in that the control and data analysis unit is designed to identify surface defects on the surface of the object on the basis of one or more images recorded by means of the camera in the form of grid graphics made up of pixels on the basis of a two-dimensional grid coordinate system.
- the invention advantageously enables surface defects on the surface of three-dimensional objects having a reflective surface to be identified exclusively on the basis of items of two-dimensional image information with the aid of image processing algorithms.
- complex geometrical calculations, in particular triangulation calculations, as are regularly applied in the prior art, can advantageously be omitted.
- the solution according to the invention is thus fast, robust, and can be carried out in cooperation with a whole array of differently designed first illumination units, which predestines it in particular for mobile applications, for example as a handheld module.
- Advantageous embodiments increase the robustness of the method.
- FIG. 1 shows an embodiment of a device according to the invention for recognizing and analyzing surface defects in three-dimensional objects having a reflective surface using a first illumination unit designed as an illumination arch in a perspective view;
- FIG. 2 shows a further embodiment of the device according to the invention having a first illumination unit designed as an illumination arch, and a three-dimensional object to be inspected in a frontal view;
- FIG. 3 shows a part of an embodiment of a first illumination unit having three different illumination sections;
- FIG. 4 shows a part of an embodiment of a first illumination unit, and the illumination pattern projected by it on a three-dimensional object, a motor vehicle body here;
- FIG. 5 shows a grid graphic of an identified region of a part of a surface of a three-dimensional object, an engine hood of a motor vehicle here;
- FIG. 6 shows an enlargement of the section of the grid graphic framed by dashed lines in FIG. 5 , a region having surface defects (bottom left) and a region without surface defects, but change of the structure of the surface (top right) framed therein;
- FIG. 7 shows an enlargement of the region shown in FIG. 6 with surface defects (framed bottom left) having an identified and classified surface defect.
- FIG. 1 shows an embodiment of a device 1 according to the invention for recognizing and analyzing surface defects OF in three-dimensional objects 5 having a reflective surface 51 using a first illumination unit 2 designed as an illumination arch in a perspective view.
- the device 1 for recognizing and analyzing surface defects OF in three-dimensional objects 5 having a reflective surface 51 , in particular motor vehicle bodies, comprises at least one first illumination unit 2 for projecting an illumination pattern 21 onto a surface 51 of the object 5 ; a camera 3 , activatable in particular by a light barrier 11 , for recording images in the form of a grid graphic made up of pixels; and a control and data analysis unit 4 . It is distinguished in that the control and data analysis unit 4 is designed to identify surface defects OF on the surface 51 of the object 5 on the basis of one or more images recorded by the camera 3 in the form of grid graphics made up of pixels on the basis of a two-dimensional grid coordinate system.
- the first illumination unit 2 can be formed by an illumination arch having a frame 14 , preferably an aluminum frame.
- the first illumination unit 2 can also be designed as a handheld module (not shown), which can preferably be formed in the form of a portable lamp, (organic) light-emitting diode (LED/OLED) display, liquid crystal (LCD) display, or plasma display.
- the illumination pattern 21 generated by the first illumination unit 2 can be generated by a regular shading of a continuous luminescent surface, for example by a correspondingly designed film arranged on the luminescent surface.
- the first illumination unit 2 can also be formed by at least one liquid crystal (LCD) display, (organic) light-emitting diode (LED/OLED) display, or plasma display, which is either also shaded in regular sections and thus a desired illumination pattern 21 is generated, or an illumination pattern 21 is projected directly in the form of an image projection onto the surface 51 of a three-dimensional object 5 .
- the embodiment of the device 1 shown in FIG. 1 comprises, for example, nine cameras 3 , which are arranged in particular on the first illumination unit 2 , preferably, as shown here, on a frame 14 of a first illumination unit 2 designed as an illumination arch.
- a calibration means 31 is shown by way of example, with the aid of which, preferably during startup of the device 1 , a calibration of the one or more cameras 3 can advantageously be performed.
- a calibration plate having points spaced apart uniformly from one another at known intervals is shown for this purpose by way of example as the calibration means 31 in FIG. 1 .
- FIG. 2 shows a further embodiment of the device 1 according to the invention having a first illumination unit 2 designed as an illumination arch, and a three-dimensional object 5 to be inspected in a frontal view.
- the first illumination unit 2 can comprise at least two, preferably four rollers 13 for moving the first illumination unit 2 .
- the provision of rollers 13 on a first illumination unit 2 designed in particular as an illumination arch can advantageously enhance the mobility of the device 1 .
- the device 1 can thus, for example, with the aid of the rollers 13 , easily be rolled onto a transport trailer or rolled down from it and set up for use at any location.
- At least one light barrier 11 can advantageously also be provided, which is shown arranged by way of example in FIG. 2 on the frame 14 of the first illumination unit 2 above the rollers 13 .
- the at least one or more cameras 3 can advantageously be activated, in particular when a three-dimensional object 5 , preferably a motor vehicle, is moved through the first illumination unit 2 , in particular the illumination arch, and passes the light barrier 11 at the same time.
- the light barrier 11 can accordingly also activate the at least one or more cameras 3 , of course, if the three-dimensional object 5 is stationary and the device 1 , in particular the first illumination unit 2 , moves over the object 5 .
- the movement of the device 1 in relation to the object 5 can advantageously be implemented, with respect to the device 1 , by the rollers 13 arranged on the first illumination unit 2 and, with respect to the three-dimensional object 5 , in particular a motor vehicle, by its own drive including wheels. Alternatively or additionally, however, rails can also be provided, on which the device 1 and/or the three-dimensional object 5 moves.
- the one or more cameras 3 , the first 2 and second 201 illumination unit, and the light barrier 11 can moreover preferably communicate via wires or also wirelessly, for example via a WLAN or Bluetooth connection, with the control and data analysis unit 4 .
- the first illumination unit 2 , in particular the illumination arch, comprises at least one, preferably—as shown in FIG. 2 —two mirrors 12 , which are preferably arranged along a longitudinal axis LA adjacent to the floor on the illumination unit 2 , wherein the mirror or mirrors 12 are preferably pivotably arranged on the illumination unit 2 .
- Such mirrors 12 advantageously enable the illumination pattern 21 projected by the first illumination unit 2 on the surface 51 of the three-dimensional object 5 to also be projected, by reflection, on regions of the surface 51 which are close to the floor and/or curved, and on which an illumination pattern 21 cannot be depicted in the case of direct projection.
- the first illumination unit 2 , in particular the illumination arch, can also comprise at least one, preferably two second illumination units 201 , which are preferably arranged along a longitudinal axis LA adjacent to the floor on the first illumination unit 2 , wherein the at least one second illumination unit 201 is preferably arranged pivotably on the first illumination unit 2 and is designed to project an illumination pattern 21 , in particular a stripe pattern, on a surface 51 of the object 5 , in particular on the surface 51 of the object 5 close to the floor.
- the one or more second illumination units 201 can be designed here like the one or more first illumination units 2 , thus in the form of a lamp, (organic) light-emitting diodes (LED/OLED), liquid crystal (LCD) displays, or plasma displays and can generate the illumination pattern or patterns 21 as described above.
- One or more such second illumination units 201 can, like the mirrors 12 , advantageously enable an illumination pattern 21 to also be projected on regions of the surface 51 which are close to the floor and/or curved and on which an illumination pattern 21 cannot be imaged in the case of direct projection.
- light of various wavelengths, i.e., various colors up to UV or infrared rays, can be used to generate the illumination pattern 21 .
- the type of the one or more cameras 3 used is preferably selected in accordance with the wavelength of the light used, i.e., for example cameras 3 having a CCD detector for visible light, UV light, and/or infrared light.
- FIG. 3 shows a part of an embodiment of a first illumination unit 2 having three different illumination sections A, B, C.
- the illumination pattern 21 thus generated on the surface 51 of the three-dimensional object 5 accordingly also advantageously has three illumination sections A, B, C, which advantageously differ from one another in the respective stripe width.
- Illumination patterns 21 , in particular stripe patterns having smaller stripe widths, as are generated here, for example, by the illumination sections B and C, are advantageously suitable for recognizing and analyzing surface defects OF having smaller diameters, while illumination patterns 21 , in particular stripe patterns having comparatively greater stripe widths, as are generated in particular by illumination section A, are advantageously suitable for recognizing and analyzing surface defects OF having larger diameters.
- FIG. 4 shows a part of an embodiment of a first illumination unit 2 , and the illumination pattern 21 projected by it on a three-dimensional object 5 , a motor vehicle body here.
- Surface defects OF are now identified by the method according to the invention for recognizing and analyzing surface defects OF in three-dimensional objects 5 having a reflective surface 51 , in particular of motor vehicle bodies, on the basis of the evaluation of at least one image recorded by at least one camera 3 in the form of a grid graphic made up of pixels of an illumination pattern 21 projected by at least one first illumination unit 2 on at least a part of the surface 51 on the basis of a two-dimensional grid coordinate system.
- the illumination pattern 21 can preferably be a stripe pattern, in particular a stripe pattern having a sinusoidal intensity curve, wherein the stripe widths can differ from one another in sections (see illumination sections A, B, C), as already shown in FIG. 3 .
- one or more images in the form of grid graphics made up of pixels of at least a part of the surface 51 of the object 5 can advantageously be recorded by means of at least one camera 3 .
- the (optimum) exposure time of the one camera 3 or the multiple cameras 3 can be determined and configured, in particular by means of a control and data analysis unit 4 , on the basis of the color of the surface 51 of the object 5 , in particular the color of the motor vehicle body.
- the one or more cameras 3 are calibrated with the aid of a calibration means 31 , as shown in FIG. 1 .
- a region 511 of the recorded surface 51 of the object 5 , on which the illumination pattern 21 is projected, can now advantageously be identified within the recorded image or images, wherein a two-dimensional grid coordinate system having x and y coordinates for each pixel can preferably be associated with the identified region 511 .
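The identification of the region 511 and the association of a two-dimensional grid coordinate system could be sketched, for example, with a simple intensity threshold; the function name, the threshold value, and the use of NumPy are illustrative assumptions and not taken from the patent:

```python
import numpy as np

def identify_illuminated_region(image, threshold=0.2):
    """Identify the part of a grayscale image (values in [0, 1]) onto
    which the illumination pattern is projected, and attach a
    two-dimensional grid coordinate system (x and y per pixel) to it."""
    mask = image > threshold                     # pixels reached by the pattern
    ys, xs = np.nonzero(mask)
    if xs.size == 0:                             # no illuminated region found
        return None
    y0, y1 = ys.min(), ys.max()                  # bounding box of the region
    x0, x1 = xs.min(), xs.max()
    region = image[y0:y1 + 1, x0:x1 + 1]
    # x/y coordinates for every pixel of the identified region
    yy, xx = np.mgrid[0:region.shape[0], 0:region.shape[1]]
    return region, xx, yy
```

A real implementation would presumably segment the reflected pattern more robustly, e.g. against ambient light and reflections from neighboring surfaces.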
- FIG. 5 shows by way of example a grid graphic of such an identified region 511 of a part of a surface 51 of a three-dimensional object 5 , on the basis of the example of an engine hood of a motor vehicle here.
- intensity values are assigned to at least a part of the pixels.
- by means of edge detection starting from previously defined starting values, at least sections of the illumination pattern 21 projected on the surface 51 , in particular the stripes of a stripe pattern, can then be identified in the image or the images, wherein pixels having intensity values which are equal or similar up to a predetermined deviation value, in particular light-dark values, can be identified as associated.
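The grouping of pixels "identified as associated" could look, in a minimal form, like the following sketch: starting from a seed pixel, contiguous pixels of an image row are collected as long as their intensity stays within the predetermined deviation value. The function name and parameters are hypothetical:

```python
def group_stripe_pixels(row, seed_index, max_deviation=0.1):
    """Collect the contiguous pixels of one image row whose intensity
    differs from the seed pixel by at most `max_deviation`, i.e. the
    pixels identified as associated with one stripe."""
    seed = row[seed_index]
    members = [seed_index]
    for step in (-1, 1):                         # grow left, then right
        i = seed_index + step
        while 0 <= i < len(row) and abs(row[i] - seed) <= max_deviation:
            members.append(i)
            i += step
    return sorted(members)
```

In practice this would be applied per stripe, with the "previously defined starting values" of the patent supplying the seed positions.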
- a functional (mathematical) relationship, in particular a function graph, can be established for the respective group of pixels identified as associated.
- the light stripes 211 and dark stripes 212 shown, inter alia, in FIG. 5 can thus advantageously be approximated in the form of mathematical curves, for example as a polynomial or spline.
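The polynomial approximation of a stripe described above could be sketched, for example, with a least-squares fit; the helper name and the polynomial degree are illustrative assumptions:

```python
import numpy as np

def fit_stripe(xs, ys, degree=3):
    """Approximate one detected stripe by a polynomial y = f(x) and
    return the fitted curve together with its first derivative (the
    slope later used for classifying stripe sections)."""
    coeffs = np.polyfit(xs, ys, deg=degree)      # least-squares polynomial fit
    poly = np.poly1d(coeffs)
    return poly, poly.deriv()
```

A spline fit, as also mentioned in the text, would work analogously, e.g. via `scipy.interpolate`.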
- at least sections of the illumination pattern 21 projected on the surface 51 , in particular the stripes of a stripe pattern, can advantageously be classified in the image or the images on the basis of the first derivative of a found functional (mathematical) relationship, in particular on the basis of the slope of the function graph.
- Three light stripes 211 classified in this way (horizontally shaded, diagonally shaded, and checked) are shown by way of example in FIG. 7 , wherein each shading corresponds, for example, to a particular slope, preferably within a predetermined region in terms of a “neighborhood analysis”.
- surface defects OF are identified within the identified region 511 on the basis of characteristic changes and/or a disappearance of the first derivative of a found functional (mathematical) relationship, in particular the slope of a function graph, in the classified sections, in particular in the stripes of a stripe pattern.
- FIG. 6 shows for this purpose an enlargement of the section of the grid graphic framed by dashed lines in FIG. 5 , and in turn framed therein a region 512 with surface defects OF (bottom left) and a region 513 without surface defects OF, but change of the structure of the surface 51 (top right), wherein FIG. 7 finally shows an enlargement of the region 512 shown in FIG. 6 with surface defects OF (framed bottom left) with an identified and classified surface defect OF.
- the first derivatives, in particular the slopes, of the horizontally shaded and the checked stripe change in a characteristic way in mutually adjacent sections of the identified region 511 , wherein the curves of the horizontally shaded and the checked stripe run opposite to one another.
- the diagonally shaded light stripe 211 even breaks off completely in a likewise adjacent section of the identified region 511 located between the two other shaded light stripes 211 ; the first derivative of a functional (mathematical) relationship found for it accordingly disappears.
- the disappearance of individual stripes can accordingly advantageously be used as a criterion for identifying surface defects OF.
- structure changes in the surface 51 of a three-dimensional object 5 do not result in such a pattern, in particular do not result in a disappearance of the first derivative of a found functional (mathematical) relationship, in particular the slope of a function graph, and therefore a complete interruption of a light stripe 211 , but only, as shown in FIG. 6 in the framed region 513 , in similar changes of the first derivative of a found functional (mathematical) relationship, preferably the slope of a function graph, in particular having a similar curve (in the same direction).
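The two defect criteria described above, a characteristic change of the slope and its complete disappearance, could be scanned for as in the following sketch, where the slope of one classified stripe is assumed to be sampled into a 1D array and a broken-off stripe is marked with NaN; this data layout is an assumption for illustration:

```python
import numpy as np

def find_defect_candidates(slopes):
    """Return index positions along one classified stripe at which a
    surface defect is indicated: where the sampled slope disappears
    (NaN, i.e. the stripe breaks off) or flips sign (a characteristic
    change of the first derivative)."""
    slopes = np.asarray(slopes, dtype=float)
    candidates = set(np.nonzero(np.isnan(slopes))[0])     # stripe broke off
    sign = np.sign(slopes)
    flips = np.nonzero(sign[:-1] * sign[1:] < 0)[0] + 1   # slope sign flips
    candidates.update(flips.tolist())
    return sorted(candidates)
```

Distinguishing defects from mere structure changes would then, as described above, require comparing the curves of neighboring stripes (opposite vs. same-direction changes).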
- the size of individual identified surface defects OF can moreover advantageously be ascertained by determining a center point MP of the respective surface defect OF and at least one extremum E spaced apart from the center point MP.
- a model function MF, for example a circle or ellipse function, can be calculated starting from the center point MP, using the distance between the center point MP and an extremum E as the radius, and preferably projected into the respective grid graphic in order to represent the size of the respective surface defect OF in the image or the images.
- the maximum or minimum of a found function graph can be used as the extremum E.
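For the case of a circle as model function MF, the size determination from center point MP and extremum E could be sketched as follows; the function names are hypothetical:

```python
import math

def defect_circle(center, extremum):
    """Model the size of an identified defect as a circle around its
    center point MP, with the distance to the extremum E as radius.
    Points are (x, y) tuples in the grid coordinate system."""
    radius = math.dist(center, extremum)         # |MP - E|
    def on_circle(angle):                        # parametric circle point
        return (center[0] + radius * math.cos(angle),
                center[1] + radius * math.sin(angle))
    return radius, on_circle
```

Sampling `on_circle` over [0, 2π] would yield the pixels at which the model function can be drawn into the grid graphic.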
- the at least one illumination unit 2 and/or the at least one camera 3 can be moved in relation to the three-dimensional object 5 .
- the three-dimensional object 5 can also be moved in relation to the at least one illumination unit 2 and/or the at least one camera 3 .
- a motor vehicle can, for example, drive through a first illumination unit 2 designed as an illumination arch. Multiple images can then preferably be recorded, in particular by multiple cameras 3 , for recognizing and analyzing surface defects OF in this case.
- At least one marker M, in particular four markers M, can be arranged on the surface 51 of the object 5 , and the position of the marker or markers M can be determined within the recorded image or images.
- the determination of the position of one or more markers M within various recorded images can advantageously be used, preferably before beginning a further surface analysis, for combining multiple images into a two-dimensional grid coordinate system, in particular if the object 5 to be inspected, or its surface 51 , respectively, is larger than the illumination pattern 21 projected on the surface 51 of the object 5 .
- multiple images are advantageously recorded, in particular also by multiple cameras 3 , which can then advantageously be combined with one another in a grid coordinate system on the basis of an item of position information of at least one marker M in the various images.
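The combination of multiple images into one grid coordinate system on the basis of the marker positions could be sketched, in the simplest pure-translation case, as follows; a real implementation would likely also have to account for rotation and scale, and the function names are assumptions:

```python
import numpy as np

def offset_between_images(markers_a, markers_b):
    """Estimate the translation that maps image B into the grid
    coordinate system of image A from the positions of the same
    markers M seen in both images."""
    a = np.asarray(markers_a, dtype=float)
    b = np.asarray(markers_b, dtype=float)
    return (a - b).mean(axis=0)                  # average marker displacement

def merge_coordinates(points_b, offset):
    """Express pixel coordinates from image B in image A's grid system."""
    return np.asarray(points_b, dtype=float) + offset
```

With four markers, as preferred above, the averaging also smooths small detection errors in the individual marker positions.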
- identified surface defects OF are classified by a comparison to at least one model function MF, in particular to a circle function and/or ellipse function, and/or to a pattern of already classified surface defects OF, characteristic changes, and/or a disappearance of the first derivative of an already found functional (mathematical) relationship, and the measurement and/or evaluation data of surface defects OF classified in this way are stored in a database, in particular in a cloud database.
- identified surface defects OF can also be classified by a comparison to database entries, in particular to database entries of a database having measurement and/or evaluation data of already classified surface defects OF, and measurement and/or evaluation data of classified surface defects OF are stored in a database, in particular in a cloud database.
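The comparison to database entries of already classified surface defects could take, for instance, the form of a nearest-neighbor lookup over measurement data; the feature pair, the distance metric, and the threshold below are illustrative assumptions, not specified by the patent:

```python
import math

def classify_defect(features, database, max_distance=1.0):
    """Classify an identified defect by comparing its measurement data
    (here an assumed (radius, depth) feature pair) against database
    entries of already classified defects, returning the label of the
    closest entry within `max_distance`."""
    best_label, best_dist = None, math.inf
    for label, ref in database:
        dist = math.dist(features, ref)          # Euclidean feature distance
        if dist < best_dist:
            best_label, best_dist = label, dist
    return best_label if best_dist <= max_distance else "unclassified"
```

A deep-learning classifier, as the "deep-learning" strategy mentioned below suggests, could replace this distance comparison while keeping the same database interface.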
- the present invention relates to a method for recognizing and analyzing surface defects OF in three-dimensional objects 5 having a reflective surface 51 , in particular motor vehicle bodies, in which the surface defects OF are identified on the basis of the evaluation of at least one image recorded by at least one camera 3 in the form of a grid graphic made up of pixels of an illumination pattern 21 projected by at least one illumination unit 2 on at least a part of the surface 51 on the basis of a two-dimensional grid coordinate system, and also a device 1 for this purpose. It advantageously enables surface defects OF to be identified exclusively on the basis of items of two-dimensional image information with the aid of image processing algorithms, wherein no “environmental parameters” are required and complex geometrical calculations can advantageously be omitted.
- the solution according to the invention is thus fast, robust, and can be carried out in cooperation with differently designed first illumination units 2 , which predestines it in particular for mobile applications, for example as a handheld module.
- optimizing the method according to the invention by way of a “deep-learning” strategy is enabled.
Landscapes
- Engineering & Computer Science (AREA)
- General Physics & Mathematics (AREA)
- Physics & Mathematics (AREA)
- Computer Vision & Pattern Recognition (AREA)
- Chemical & Material Sciences (AREA)
- Theoretical Computer Science (AREA)
- Health & Medical Sciences (AREA)
- Life Sciences & Earth Sciences (AREA)
- Analytical Chemistry (AREA)
- Biochemistry (AREA)
- General Health & Medical Sciences (AREA)
- Immunology (AREA)
- Pathology (AREA)
- Quality & Reliability (AREA)
- Signal Processing (AREA)
- Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)
- Length Measuring Devices By Optical Means (AREA)
Abstract
Description
- The present invention relates to a method for recognizing and analyzing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies, and a device for recognizing and analyzing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies, in particular according to such a method, at least comprising: a first illumination device, in particular an illumination arch or a handheld module, for projecting an illumination pattern on a surface of the object; a camera, activatable by a light barrier in particular, for recording images in the form of a grid graphic made up of pixels; and a control and data analysis unit.
- The increase in extreme weather phenomena such as intensive rainfall, severe storms, and strong thunderstorms with massive hailstorms often has the result that the surfaces of three-dimensional objects, in particular motor vehicles, but also aircraft, which are subjected to these weather phenomena, are frequently damaged. If such damage is claimed by the owner with an insurance company, estimators are always employed, who estimate the respective damage and usually classify it manually. This method is time-consuming, costly, and sometimes not very transparent, since the damage classification can in certain circumstances depend greatly on the subjective assessments of individual estimators.
- Objective, automated methods for damage classification would therefore be desirable, but their development faces an array of difficulties to be overcome in automated surface inspection, in particular due to the usually reflective (specular) surfaces of the objects to be inspected.
- In this context, a method and a device for recognizing surface defects in an object, preferably for recognizing paint defects on the surface of a motor vehicle body, are known from DE 37 12 513 A1, in which a light stripe is generated on the surface by means of an illumination system and guided over the object, wherein its relative movement with respect to the surface is recorded step-by-step and used to determine surface defects. DE 10 2010 015 566 A1 moreover discloses a method and system for measuring reflective surfaces, in which multiple patterns are generated on a luminescent surface, which are reflected on the reflective surface and are captured using a sensor surface of a camera, wherein the coordinates of surface points of the inspected surface are determined from the captured measurement data, from the known coordinates of the luminescent surface and sensor surface, and from the distance between sensor surface and/or luminescent surface and at least one support point arranged on the surface. Both methods are based on the "scanning" of the surface of the object to be inspected and the subsequent calculation of a point cloud, which is dependent on a relative velocity between object and respective device and/or a distance between device and object and which can then be used for a further surface analysis. These methods are computationally and structurally complex.
- In addition to the above-mentioned prior art, reference is also made to documents DE 10 2016 006 780 A1, DE 10 2015 119 240 B3, U.S. Pat. No. 5,367,378 A, and EP 0 997 201 B1.
- Proceeding therefrom, the present invention is based on the object of providing a method improved in comparison to the prior art and an improved device for recognizing and analyzing surface defects in three-dimensional objects having a reflective surface, which are usable quickly, cost-effectively, and in a mobile manner and advantageously enable objective automated damage classification of surface defects, in particular of hail damage on motor vehicle or aircraft bodies.
- This object is first achieved by a method for recognizing and analyzing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies, having the features of independent claim 1, and by a device for carrying out such a method having the features of
independent claim 14. Further advantageous embodiments and refinements, which are usable individually or in combination with one another, are the subject matter of the dependent claims. - The method according to the invention is distinguished in that surface defects are identified on the basis of the evaluation of at least one image recorded by at least one camera in the form of a grid graphic made up of pixels of an illumination pattern projected by at least one first illumination unit on at least a part of the surface on the basis of a two-dimensional grid coordinate system; accordingly, the device according to the invention is distinguished in relation to devices forming the generic type in that the control and data analysis unit is designed to identify surface defects on the surface of the object on the basis of one or more images recorded by means of the camera in the form of grid graphics made up of pixels on the basis of a two-dimensional grid coordinate system.
- The invention advantageously enables surface defects on the surface of three-dimensional objects having a reflective surface to be identified exclusively on the basis of items of two-dimensional image information with the aid of image processing algorithms. “Environmental parameters,” for example, distances between the camera(s) among one another, but also between camera(s) and object and/or its surface, are not required. Moreover, complex geometrical calculations, in particular triangulation calculations, as are regularly applied in the prior art, can advantageously be omitted. The solution according to the invention is thus fast, robust, and can be carried out in cooperation with a whole array of differently designed first illumination units, which predestines it in particular for mobile applications, for example as a handheld module. Advantageous embodiments increase the robustness of the method.
- Additional details and further advantages of the invention will be described hereinafter on the basis of preferred exemplary embodiments, to which the present invention is not restricted, however, and in conjunction with the appended drawings.
- In the schematic figures:
-
FIG. 1 shows an embodiment of a device according to the invention for recognizing and analyzing surface defects in three-dimensional objects having a reflective surface using a first illumination unit designed as an illumination arch in a perspective view; -
FIG. 2 shows a further embodiment of the device according to the invention having a first illumination unit designed as an illumination arch, and a three-dimensional object to be inspected in a frontal view; -
FIG. 3 shows a part of an embodiment of a first illumination unit having three different illumination sections; -
FIG. 4 shows a part of an embodiment of a first illumination unit, and the illumination pattern projected by it on a three-dimensional object, a motor vehicle body here; -
FIG. 5 shows a grid graphic of an identified region of a part of a surface of a three-dimensional object, an engine hood of a motor vehicle here; -
FIG. 6 shows an enlargement of the section of the grid graphic framed by dashed lines in FIG. 5 , a region having surface defects (bottom left) and a region without surface defects, but change of the structure of the surface (top right) framed therein; and -
FIG. 7 shows an enlargement of the region shown in FIG. 6 with surface defects (framed bottom left) having an identified and classified surface defect. - In the following description of preferred embodiments of the present invention, identical reference signs refer to identical or comparable components.
-
FIG. 1 shows an embodiment of a device 1 according to the invention for recognizing and analyzing surface defects OF in three-dimensional objects 5 having a reflective surface 51 using a first illumination unit 2 designed as an illumination arch in a perspective view. - The device 1 according to the invention shown by way of example in
FIG. 1 for recognizing and analyzing surface defects OF in three-dimensional objects 5 having a reflective surface 51 , in particular motor vehicle bodies, comprises at least one first illumination unit 2 for projecting an illumination pattern 21 onto a surface 51 of the object 5 ; a camera 3 , activatable in particular by a light barrier 11 , for recording images in the form of a grid graphic made up of pixels; and a control and data analysis unit 4 . It is distinguished in that the control and data analysis unit 4 is designed to identify surface defects OF on the surface 51 of the object 5 on the basis of one or more images recorded by the camera 3 in the form of grid graphics made up of pixels on the basis of a two-dimensional grid coordinate system. - In this case, the
first illumination unit 2, as shown here, can be formed by an illumination arch having aframe 14, preferably an aluminum frame. Alternatively thereto, thefirst illumination unit 2 can also be designed as a handheld module (not shown), which can preferably be formed in the form of a portable lamp, (organic) light-emitting diode (LED/OLED) display, liquid crystal (LCD) display, or plasma display. Theillumination pattern 21 generated by thefirst illumination unit 2 can be generated by a regular shading of a continuous luminescent surface, for example by a correspondingly designed film arranged on the luminescent surface. Alternatively or additionally thereto, thefirst illumination unit 2 can also be formed by at least one liquid crystal (LCD) display, (organic) light-emitting diode (LED/OLED) display, or plasma display, which is either also shaded in regular sections and thus a desiredillumination pattern 21 is generated, or anillumination pattern 21 is projected directly in the form of an image projection onto the surface 51 of a three-dimensional object 5. - The embodiment of the device 1 shown in
FIG. 1 comprises, for example, nine cameras 3 , which are arranged in particular on the first illumination unit 2 , preferably, as shown here, on a frame 14 of a first illumination unit 2 designed as an illumination arch. - Below the
first illumination unit 2, in particular the illumination arch, a calibration means 31 is shown by way of example, with the aid of which, preferably during startup of the device 1, a calibration of the one or more cameras 3 can advantageously be performed. A calibration plate having points spaced apart uniformly from one another at known intervals is shown for this purpose by way of example as the calibration means 31 inFIG. 1 . -
FIG. 2 shows a further embodiment of the device 1 according to the invention having a first illumination unit 2 designed as an illumination arch, and a three-dimensional object 5 to be inspected in a frontal view. - In this embodiment of the device 1 , seven cameras 3 are arranged by way of example on the
frame 14 of a first illumination unit 2 designed as an illumination arch. As already shown in FIG. 1 , the first illumination unit 2 can comprise at least two, preferably four rollers 13 for moving the first illumination unit 2 . The provision of rollers 13 on a first illumination unit 2 designed in particular as an illumination arch can advantageously enhance the mobility of the device 1 . The device 1 can thus, for example, with the aid of the rollers 13 , easily be rolled onto a transport trailer or rolled down from it and set up for use at any location. - In addition, at least one light barrier 11 can advantageously also be provided, which is shown arranged by way of example in
FIG. 2 on the frame 14 of the first illumination unit 2 above the rollers 13 . With the aid of the light barrier 11 , the at least one or more cameras 3 can advantageously be activated, in particular when a three-dimensional object 5 , preferably a motor vehicle, is moved through the first illumination unit 2 , in particular the illumination arch, and passes the light barrier 11 at the same time. The light barrier 11 can accordingly also activate the at least one or more cameras 3 , of course, if the three-dimensional object 5 is stationary and the device 1 , in particular the first illumination unit 2 , moves over the object 5 . The movement of the device 1 in relation to the object 5 can advantageously be implemented with respect to the device 1 by the rollers 13 arranged on the first illumination unit 2 and with respect to the three-dimensional object 5 , in particular a motor vehicle, by its own drive including wheels. Alternatively or additionally, however, rails can also be provided, on which the device 1 and/or the three-dimensional object 5 moves. The one or more cameras 3 , the first illumination unit 2 and the second illumination unit 201 , and the light barrier 11 can moreover preferably communicate via wires or also wirelessly, for example via a WLAN or Bluetooth connection, with the control and data analysis unit 4 . - In addition, it has proven to be effective if the
first illumination unit 2, in particular the illumination arch, comprises at least one, preferably—as shown inFIG. 2 —twomirrors 12, which are preferably arranged along a longitudinal axis LA adjacent to the floor on theillumination unit 2, wherein the mirror ormirrors 12 are preferably pivotably arranged on theillumination unit 2.Such mirrors 12 advantageously enable theillumination pattern 21 projected by thefirst illumination unit 2 on the surface 51 of the three-dimensional object 5 advantageously to be projected by reflection also on regions of the surface 51 which are close to the floor and/or curved, and on which anillumination pattern 21 cannot be depicted in the case of direct projection. - Alternatively or additionally, the
first illumination unit 2, in particular the illumination arch, can also comprise at least one, preferably two second illumination units 201, which are preferably arranged along a longitudinal axis LA adjacent to the floor on thefirst illumination unit 2, wherein the at least one second illumination unit 201 is preferably arranged pivotably on thefirst illumination unit 2 and is designed to project anillumination pattern 21, in particular a stripe pattern, on a surface 51 of the object 5, in particular on the surface 51 of the object 5 close to the floor. The one or more second illumination units 201 can be designed here like the one or morefirst illumination units 2, thus in the form of a lamp, (organic) light-emitting diodes (LED/OLED), liquid crystal (LCD) displays, or plasma displays and can generate the illumination pattern orpatterns 21 as described above. One or more such second illumination units 201 can, like themirrors 12, advantageously enable anillumination pattern 21 to also be projected on regions of the surface 51 which are close to the floor and/or curved and on which anillumination pattern 21 cannot be imaged in the case of direct projection. - In general, various wavelengths, i.e., various colors up to UV or infrared rays, can be used for the illumination. The type of the one or more cameras 3 used is preferably selected in accordance with the wavelength of the light used, i.e., for example cameras 3 having a CCD detector for visible light, UV light, and/or infrared light.
-
FIG. 3 shows a part of an embodiment of a first illumination unit 2 having three different illumination sections A, B, C. The illumination pattern 21 thus generated on the surface 51 of the three-dimensional object 5 accordingly also advantageously has three illumination sections A, B, C, which advantageously differ from one another in the respective stripe width. Illumination patterns 21 , in particular stripe patterns having smaller stripe widths, as are generated here, for example, by the illumination sections B and C, are advantageously suitable for recognizing and analyzing surface defects OF having smaller diameter; while, in contrast, illumination patterns 21 , in particular stripe patterns having comparatively greater stripe widths, as are generated in particular by illumination section A, are advantageously suitable for recognizing and analyzing surface defects OF having larger diameter. -
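- The patent text itself contains no source code; purely as an illustrative sketch (all function and parameter names are hypothetical, not the patent's implementation), an illumination pattern 21 with a sinusoidal intensity curve and section-wise different stripe widths, as described for the illumination sections A, B, C, could be generated as follows:

```python
import numpy as np

def stripe_pattern(width, height, section_periods):
    """Sketch of an illumination pattern with a sinusoidal intensity
    curve, split into sections with different stripe widths (periods)."""
    image = np.zeros((height, width))
    for (x0, x1), period in section_periods.items():
        x = np.arange(x0, x1)
        # Intensity varies sinusoidally across the section, scaled to [0, 1].
        image[:, x0:x1] = 0.5 + 0.5 * np.sin(2 * np.pi * x / period)
    return image

# Three sections A, B, C with decreasing stripe width (period in pixels).
pattern = stripe_pattern(300, 100, {(0, 100): 40, (100, 200): 20, (200, 300): 10})
```

Wider stripes (section A) would then serve larger surface defects OF and narrower stripes (sections B and C) smaller ones, mirroring the passage above.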
FIG. 4 shows a part of an embodiment of a first illumination unit 2 , and the illumination pattern 21 projected by it on a three-dimensional object 5 , a motor vehicle body here. - Surface defects OF, as shown by way of example in
FIG. 4 , are now identified by the method according to the invention for recognizing and analyzing surface defects OF in three-dimensional objects 5 having a reflective surface 51 , in particular of motor vehicle bodies, on the basis of the evaluation of at least one image recorded by at least one camera 3 in the form of a grid graphic made up of pixels of an illumination pattern 21 projected by at least one first illumination unit 2 on at least a part of the surface 51 on the basis of a two-dimensional grid coordinate system. The illumination pattern 21 can preferably be a stripe pattern, in particular a stripe pattern having a sinusoidal intensity curve, wherein the stripe widths can differ from one another in sections (see illumination sections A, B, C), as already shown in FIG. 3 . - In one preferred embodiment of the invention, in a preferred method step, one or more images in the form of grid graphics made up of pixels of at least a part of the surface 51 of the object 5 can advantageously be recorded by means of at least one camera 3 . For this purpose, in a further method step, preferably, in particular after a manual or automated configuration of the focus of the one camera 3 or the multiple cameras 3 , the (optimum) exposure time of the one camera 3 or the multiple cameras 3 can be determined and configured, in particular by means of a control and
data analysis unit 4, on the basis of the color of the surface 51 of the object 5, in particular the color of the motor vehicle body. Moreover, it is advantageous if in one method step the one or more cameras 3 are calibrated with the aid of a calibration means 31, as shown inFIG. 1 . - In a further preferred method step, a region 511 of the recorded surface 51 of the object 5, on which the
illumination pattern 21 is projected, can now advantageously be identified within the recorded image or images, wherein a two-dimensional grid coordinate system having x and y coordinates for each pixel can preferably be associated with the identified region 511 . -
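- How the region 511 is identified is left open by the text above; one simple, hypothetical realization (thresholding against the stripe contrast; all names are assumptions, not the patent's implementation) could look like this:

```python
import numpy as np

def identify_region(image, contrast=0.2):
    """Locate the region onto which the stripe pattern is projected and
    attach a local two-dimensional grid coordinate system to it.

    A pixel counts as illuminated by the pattern if its deviation from
    the image mean exceeds `contrast`; the region is the bounding box
    of those pixels.
    """
    mask = np.abs(image - image.mean()) > contrast
    ys, xs = np.nonzero(mask)
    y0, y1 = ys.min(), ys.max() + 1
    x0, x1 = xs.min(), xs.max() + 1
    region = image[y0:y1, x0:x1]
    # x and y coordinates per pixel, zero point at the top-left corner.
    yy, xx = np.mgrid[0:y1 - y0, 0:x1 - x0]
    return region, xx, yy
```

The returned coordinate arrays play the role of the two-dimensional grid coordinate system associated with the identified region 511.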
FIG. 5 shows by way of example a grid graphic of such an identified region 511 of a part of a surface 51 of a three-dimensional object 5, on the basis of the example of an engine hood of a motor vehicle here. - It can be seen in both figures (
FIG. 4 andFIG. 5 ) how the light stripes 211 anddark stripes 212 of anexemplary illumination pattern 21 stand out on the surface 51 of a three-dimensional object 5, a motor vehicle body here. - Furthermore, it is preferred if in one method step, within an identified region 511, in particular starting from a zero point of the two-dimensional grid coordinate system, intensity values, in particular light-dark values, are assigned to at least a part of the pixels. By means of edge detection, starting from previously defined starting values, at least sections of the
illumination pattern 21 projected on the surface 51, in particular the stripes of a stripe pattern, can then be identified in the image or the images, wherein pixels of intensity values which are equal or similar up to a predetermined deviation value, in particular light-dark values, can be identified as associated. - In a further preferred embodiment of the invention, on the basis of a group of pixels identified as associated, a functional (mathematical) relationship, in particular a function graph, can be established for the respective group of pixels identified as associated. The light stripes 211 and
dark stripes 212 shown, inter alia, in FIG. 5 can thus advantageously be approximated in the form of mathematical curves, for example as a polynomial or spline. In a further preferred method step of the present invention, for each pixel of a group of pixels identified as associated, the first derivative of a found functional (mathematical) relationship, in particular the slope of a found function graph, can then be determined, and in addition, in a further preferred method step, at least sections of the illumination pattern 21 projected on the surface 51 , in particular the stripes of a stripe pattern, can advantageously be classified in the image or the images on the basis of the first derivative of a found functional (mathematical) relationship, in particular on the basis of a slope of the function graph. Three light stripes 211 (horizontally shaded, diagonally shaded, and checked shaded) classified in this way are shown by way of example in FIG. 7 , wherein each shading corresponds, for example, to a special slope, preferably within a predetermined region in terms of a "neighborhood analysis".
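- As an illustrative sketch of the approximation step described above (not the patent's own implementation; names are hypothetical), a group of pixels identified as associated can be fitted with a polynomial whose first derivative supplies the slope used for classifying the stripes:

```python
import numpy as np

def fit_stripe(xs, ys, degree=3):
    """Approximate a group of pixels identified as associated (one stripe)
    by a polynomial curve and return the curve and its first derivative."""
    curve = np.poly1d(np.polyfit(xs, ys, degree))
    slope = curve.deriv()  # first derivative of the found function graph
    return curve, slope

# A stripe whose centre line bends like a parabola.
xs = np.array([0.0, 1.0, 2.0, 3.0, 4.0])
ys = xs ** 2
curve, slope = fit_stripe(xs, ys, degree=2)
```

A spline fit (e.g. `scipy.interpolate.splrep`) would serve equally well where the stripe shape is not polynomial.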
-
FIG. 6 shows for this purpose an enlargement of the section of the grid graphic framed by dashed lines in FIG. 5 , and in turn framed therein a region 512 with surface defects OF (bottom left) and a region 513 without surface defects OF, but change of the structure of the surface 51 (top right), wherein FIG. 7 finally shows an enlargement of the region 512 shown in FIG. 6 with surface defects OF (framed bottom left) with an identified and classified surface defect OF. - In
FIG. 7 , such characteristic changes and/or a disappearance of the first derivative of a found functional (mathematical) relationship can be seen well on the basis of the shaded light stripes 211 . In the example illustrated here, the first derivatives, in particular the slopes, of the horizontally shaded and the checked stripe change in a characteristic way in mutually adjacent sections of the identified region 511 , wherein the curves of the horizontally shaded and the checked stripe run opposite to one another. The diagonally shaded light stripe 211 even breaks off completely in an adjacent section of the identified region 511 located between the two other shaded light stripes 211 ; the first derivative of the functional (mathematical) relationship found for it accordingly disappears. The disappearance of individual stripes can accordingly advantageously be used as a criterion for identifying surface defects OF.
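- The criterion just described can be sketched, in deliberately simplified and hypothetical form (not the patent's implementation), as a check for sign flips of the slope and for positions where a stripe breaks off:

```python
import numpy as np

def defect_candidates(slopes, present):
    """Return x indices of a classified stripe where the slope flips sign
    (characteristic change) or where the stripe breaks off entirely
    (`present` is False), i.e. where the first derivative disappears."""
    slopes = np.asarray(slopes, dtype=float)
    flips = set(np.nonzero(np.diff(np.sign(slopes)) != 0)[0].tolist())
    gaps = set(np.nonzero(~np.asarray(present))[0].tolist())
    return sorted(flips | gaps)
```

In practice such candidates would still be filtered against the similar, same-direction slope changes caused by folds or bends, as the following paragraph explains.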
FIG. 6 in the framedregion 513, in similar changes of the first derivative of a found functional (mathematical) relationship, preferably the slope of a function graph, in particular having a similar curve (in the same direction). - In a further preferred method step, the size of individual identified surface defects OF can moreover advantageously preferably be ascertained by determining a center point of the respective surface defect OF and at least one extremum E spaced apart from the center point MP. For this purpose, in particular a model function MF, for example a circle or ellipse function, can be calculated starting from the center point MP and the distance between center point MP and an extremum E as the radius and preferably projected in the respective grid graphic, in order to represent the size of the respective surface defect OF in the image or the images. For this purpose, for example, the maximum or minimum of a found function graph can be used as the extremum E.
- In a further preferred embodiment of the invention, in one method step, the at least one
illumination unit 2 and/or the at least one camera 3 can be moved in relation to the three-dimensional object (5). Alternatively thereto, the three-dimensional object 5 can also be moved in relation to the at least oneillumination unit 2 and/or the at least one camera 3. For example, a motor vehicle can drive through a first illumination unit designed as an illumination arch. Multiple images can then preferably be recorded, in particular multiple images by multiple cameras 3, for recognizing and analyzing surface defects OF in this case. In a further embodiment of the invention, it has proven to be effective if in one method step, at least one marker M, in particular four markers M, are arranged on the surface 51 of the object 5 and the position of the marker or markers M is determined within the recorded image or images. The determination of the position of one or more markers M within various recorded images can advantageously be used, preferably before beginning a further surface analysis, for combining multiple images into a two-dimensional grid coordinate system, in particular if the object 5 to be inspected, or its surface 51, respectively, is larger than theillumination pattern 21 projected on the surface 51 of the object 5. In this case and/or in the case of the relative movement of the object 5 with respect to the device 1 or vice versa, multiple images are advantageously recorded, in particular also by multiple cameras 3, which can then advantageously be combined with one another in a grid coordinate system on the basis of an item of position information of at least one marker M in the various images. 
- Finally, it is preferred if, in one method step, identified surface defects OF are classified by a comparison to at least one model function MF, in particular to a circle function and/or ellipse function, and/or a pattern of already classified surface defects OF, characteristic changes, and/or a disappearance of the first derivative of an already found functional (mathematical) relationship, and the measurement and/or evaluation data of surface defects OF classified in this way are stored in a database, in particular in a cloud database. Alternatively or additionally thereto, identified surface defects OF can also be classified by a comparison to database entries, in particular to database entries of a database having measurement and/or evaluation data of already classified surface defects OF, and measurement and/or evaluation data of classified surface defects OF are stored in a database, in particular in a cloud database. This advantageously enables the recognition and analysis of surface defects OF in three-dimensional objects 5 having a reflective surface 51 to be improved after each performed analysis and the performance of the method according to the invention or the device 1 according to the invention to be optimized automatically in terms of a "deep learning" or "machine learning" strategy.
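- One simple, hypothetical way to realize such a comparison to a circle-shaped model function MF is a residual test on a defect's boundary pixels (all names are assumptions; the resulting class and measurement data could then be stored in the database mentioned above):

```python
import math

def classify_defect(boundary, mp, radius, tolerance=0.15):
    """Compare identified boundary pixels of a defect to a circle model
    function MF around the center point MP; a small mean relative
    residual suggests a round, dent-like defect."""
    residual = sum(abs(math.dist(p, mp) - radius) for p in boundary) / len(boundary)
    return "dent-like" if residual <= tolerance * radius else "unclassified"
```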
- The present invention relates to a method for recognizing and analyzing surface defects OF in three-dimensional objects 5 having a reflective surface 51, in particular motor vehicle bodies, in which the surface defects OF are identified on the basis of the evaluation of at least one image recorded by at least one camera 3 in the form of a grid graphic made up of pixels of an
illumination pattern 21 projected by at least one illumination unit 2 on at least a part of the surface 51 on the basis of a two-dimensional grid coordinate system, and also a device 1 for this purpose. It advantageously enables surface defects OF to be identified exclusively on the basis of items of two-dimensional image information with the aid of image processing algorithms, wherein no "environmental parameters" are required and complex geometrical calculations can advantageously be omitted. The solution according to the invention is thus fast, robust, and can be carried out in cooperation with differently designed first illumination units 2 , which predestines it in particular for mobile applications, for example as a handheld module. In addition, optimizing the method according to the invention by way of a "deep-learning" strategy is enabled. -
- 1 device
- 11 light barrier
- 12 mirror
- 13 rollers
- 14 frame
- 2 first illumination unit
- 21 illumination pattern
- 211 light stripe
- 212 dark stripe
- 201 second illumination unit
- 3 camera
- 31 calibration means
- 4 control and data analysis unit
- 5 object, in particular motor vehicle body
- 51 surface
- 511 region
- 512 region with surface defects (OF)
- 513 region without surface defects (OF), but change of the structure of the surface (51)
- A first illumination section
- B second illumination section
- C third illumination section
- LA longitudinal axis
- M marker
- OF surface defects
- MF model function
- MP center point
- E extremum
Claims (20)
Applications Claiming Priority (3)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
DE102018118602.9 | 2018-07-31 | ||
DE102018118602.9A DE102018118602B3 (en) | 2018-07-31 | 2018-07-31 | Method and device for detecting and analyzing surface defects of three-dimensional objects with a reflective surface, in particular motor vehicle bodies |
PCT/DE2019/100655 WO2020025086A1 (en) | 2018-07-31 | 2019-07-15 | Method and device for recognising and analysing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies |
Publications (2)
Publication Number | Publication Date |
---|---|
US20210325313A1 true US20210325313A1 (en) | 2021-10-21 |
US11674907B2 US11674907B2 (en) | 2023-06-13 |
Family
ID=67514270
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US17/260,677 Active 2039-12-04 US11674907B2 (en) | 2018-07-31 | 2019-07-15 | Method and device for recognising and analysing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies |
Country Status (6)
Country | Link |
---|---|
US (1) | US11674907B2 (en) |
EP (1) | EP3830517A1 (en) |
AU (1) | AU2019312721A1 (en) |
CA (1) | CA3106398A1 (en) |
DE (2) | DE102018118602B3 (en) |
WO (1) | WO2020025086A1 (en) |
Cited By (2)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US20210375078A1 (en) * | 2020-05-28 | 2021-12-02 | Gm Cruise Holdings Llc | Automated vehicle body damage detection |
US20210396684A1 (en) * | 2019-09-11 | 2021-12-23 | Proov Station | Assembly for detecting defects on a motor vehicle bodywork |
Families Citing this family (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2021151412A1 (en) * | 2020-01-27 | 2021-08-05 | Jan Nabatian | Apparatus and method for automatically detecting damage to vehicles |
DE102020120559A1 (en) | 2020-08-04 | 2022-02-10 | Bayerische Motoren Werke Aktiengesellschaft | Method for processing at least one defect of a surface of a component, in particular in an area of at least one surface feature |
US11828711B2 (en) * | 2020-10-09 | 2023-11-28 | Virtek Vision International Inc | Method and system for inspecting repair or assembly operations |
DE102020131086A1 (en) * | 2020-11-24 | 2022-05-25 | Broetje-Automation Gmbh | Process for machining an aircraft structural component |
DE102021123880A1 (en) * | 2021-09-15 | 2023-03-16 | Isra Vision Gmbh | Method and device for detecting local defects on a reflective surface |
Citations (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO1997039339A1 (en) * | 1996-04-12 | 1997-10-23 | Autospect, Inc. | Surface defect inspection system and method |
JPH109839A (en) * | 1996-06-24 | 1998-01-16 | Nissan Motor Co Ltd | Surface flaw inspection apparatus |
CN102169095A (en) * | 2009-09-24 | 2011-08-31 | 凯德易株式会社 | Inspecting system and inspecting method |
WO2011144964A1 (en) * | 2010-05-17 | 2011-11-24 | Ford Espana S.L. | Inspection system and method of defect detection on specular surfaces |
FR3039276A1 (en) * | 2015-07-23 | 2017-01-27 | Snecma | METHOD FOR TREATING AND MONITORING A PRECONTROLLING SURFACE |
EP3279864A1 (en) * | 2016-08-05 | 2018-02-07 | Thomson Licensing | A method for obtaining parameters defining a pixel beam associated with a pixel of an image sensor comprised in an optical device |
WO2018130421A1 (en) * | 2017-01-11 | 2018-07-19 | Autoscan Gmbh | Mobile and automated apparatus for the detection and classification of damages on the body of a vehicle |
US20190302030A1 (en) * | 2018-03-30 | 2019-10-03 | Dent Impressions, Inc. | Vehicle lighting |
WO2021151412A1 (en) * | 2020-01-27 | 2021-08-05 | Jan Nabatian | Apparatus and method for automatically detecting damage to vehicles |
US20220178838A1 (en) * | 2019-03-29 | 2022-06-09 | Fraunhofer-Gesellschaft zur Förderung der angewandten Forschung e.V. | Method and apparatus for determining deformations on an object |
Family Cites Families (10)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
DE3021448A1 (en) * | 1980-06-06 | 1981-12-24 | Siemens AG, 1000 Berlin und 8000 München | Spatial deviation detection of surfaces from smooth planes - using optical grid and video image signal analysis |
US4863268A (en) | 1984-02-14 | 1989-09-05 | Diffracto Ltd. | Diffractosight improvements |
DE3712513A1 (en) | 1987-04-13 | 1988-11-03 | Roth Electric Gmbh | METHOD AND DEVICE FOR DETECTING SURFACE DEFECTS |
US5367378A (en) * | 1993-06-01 | 1994-11-22 | Industrial Technology Institute | Highlighted panel inspection |
DE19849802A1 (en) * | 1998-10-29 | 2000-05-04 | Volkswagen Ag | Paint defect detection and elimination |
JP2007183225A (en) | 2006-01-10 | 2007-07-19 | Toyota Motor Corp | Light radiation device, surface shape inspection system, and surface shape inspection method |
DE102010015566B4 (en) * | 2010-04-19 | 2013-10-02 | adomea GmbH | Method and system for measuring reflective surfaces |
US20140201022A1 (en) | 2013-01-16 | 2014-07-17 | Andre Balzer | Vehicle damage processing and information system |
DE102015119240B3 (en) | 2015-11-09 | 2017-03-30 | ATENSOR Engineering and Technology Systems GmbH | AUTOMATIC DETECTING AND ROBOT-BASED MACHINING OF SURFACE DEFECTS |
DE102016006780A1 (en) * | 2016-06-02 | 2017-12-07 | Eisenmann Se | Installation for optical inspection of surface areas of objects |
- 2018
- 2018-07-31 DE DE102018118602.9A patent/DE102018118602B3/en active Active
- 2019
- 2019-07-15 WO PCT/DE2019/100655 patent/WO2020025086A1/en unknown
- 2019-07-15 CA CA3106398A patent/CA3106398A1/en active Pending
- 2019-07-15 US US17/260,677 patent/US11674907B2/en active Active
- 2019-07-15 EP EP19748656.6A patent/EP3830517A1/en active Pending
- 2019-07-15 AU AU2019312721A patent/AU2019312721A1/en active Pending
- 2019-07-15 DE DE112019003826.5T patent/DE112019003826A5/en not_active Withdrawn
Also Published As
Publication number | Publication date |
---|---|
CA3106398A1 (en) | 2020-02-06 |
US11674907B2 (en) | 2023-06-13 |
WO2020025086A1 (en) | 2020-02-06 |
DE102018118602B3 (en) | 2019-11-28 |
DE112019003826A5 (en) | 2021-04-15 |
AU2019312721A1 (en) | 2021-02-04 |
EP3830517A1 (en) | 2021-06-09 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
US11674907B2 (en) | Method and device for recognising and analysing surface defects in three-dimensional objects having a reflective surface, in particular motor vehicle bodies | |
CN111989544B (en) | System and method for indoor vehicle navigation based on optical target | |
CN110057601B (en) | Method and apparatus for tire condition analysis | |
ES2910068T3 (en) | Mobile and automated device for the detection and classification of damage to the bodywork of a vehicle | |
JP6667065B2 (en) | Position estimation device and position estimation method | |
US20130057678A1 (en) | Inspection system and method of defect detection on specular surfaces | |
US20210396684A1 (en) | Assembly for detecting defects on a motor vehicle bodywork | |
US10313574B2 (en) | Device and method for recognizing inscriptions on vehicle tires | |
JP5418176B2 (en) | Pantograph height measuring device and calibration method thereof | |
US10482347B2 (en) | Inspection of the contoured surface of the undercarriage of a motor vehicle | |
EP3743713B1 (en) | Vehicle surface scanning system | |
EP3580073B1 (en) | Tread line scanner | |
JP2014066657A (en) | Vehicle body surface inspection device and surface inspection method | |
US20160259034A1 (en) | Position estimation device and position estimation method | |
JP4857909B2 (en) | Object detection method and object detection apparatus | |
JP4318579B2 (en) | Surface defect inspection equipment | |
EP3625517B1 (en) | Mobile platform with an arrangement for contactless distance determination according to the light intersection method | |
JP7043787B2 (en) | Object detection system | |
Seulin et al. | Dynamic lighting system for specular surface inspection | |
CN117091571A (en) | Image-based method of defining a scan region | |
CN113671944A (en) | Control method, control device, intelligent robot and readable storage medium | |
BRPI0901912A2 (en) | system and method for surface analysis of objects |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO UNDISCOUNTED (ORIGINAL EVENT CODE: BIG.); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
FEPP | Fee payment procedure |
Free format text: ENTITY STATUS SET TO SMALL (ORIGINAL EVENT CODE: SMAL); ENTITY STATUS OF PATENT OWNER: SMALL ENTITY |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: DOCKETED NEW CASE - READY FOR EXAMINATION |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: NON FINAL ACTION MAILED |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: RESPONSE TO NON-FINAL OFFICE ACTION ENTERED AND FORWARDED TO EXAMINER |
|
STPP | Information on status: patent application and granting procedure in general |
Free format text: FINAL REJECTION MAILED |
|
STCF | Information on status: patent grant |
Free format text: PATENTED CASE |