WO2022115928A1 - Methods and systems for the quality inspection of three-dimensional materials and surfaces in a virtual environment


Info

Publication number
WO2022115928A1
WO2022115928A1 (application PCT/BR2021/050533)
Authority
WO
WIPO (PCT)
Prior art keywords
virtual
light
defects
virtual surface
image
Prior art date
Application number
PCT/BR2021/050533
Other languages
English (en)
Portuguese (pt)
Inventor
Jorge AUGUSTO DE BONFIM GRIPP
Enivaldo AMARAL DE SOUZA
Renan PADOVANI
Original Assignee
Autaza Tecnologia Ltda - Epp
Priority date
Filing date
Publication date
Application filed by Autaza Tecnologia Ltda - Epp filed Critical Autaza Tecnologia Ltda - Epp
Publication of WO2022115928A1 publication Critical patent/WO2022115928A1/fr

Links

Classifications

    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00 - Image analysis
    • G06T 7/0002 - Inspection of images, e.g. flaw detection
    • G06T 7/0004 - Industrial image inspection
    • G06T 7/0006 - Industrial image inspection using a design-rule based approach
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01B - MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B 11/00 - Measuring arrangements characterised by the use of optical techniques
    • G01B 11/30 - Measuring arrangements characterised by the use of optical techniques for measuring roughness or irregularity of surfaces
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/17 - Systems in which incident light is modified in accordance with the properties of the material investigated
    • G01N 21/55 - Specular reflectivity
    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01N - INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00 - Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84 - Systems specially adapted for particular applications
    • G01N 21/88 - Investigating the presence of flaws or contamination
    • G - PHYSICS
    • G05 - CONTROLLING; REGULATING
    • G05B - CONTROL OR REGULATING SYSTEMS IN GENERAL; FUNCTIONAL ELEMENTS OF SUCH SYSTEMS; MONITORING OR TESTING ARRANGEMENTS FOR SUCH SYSTEMS OR ELEMENTS
    • G05B 19/00 - Programme-control systems
    • G05B 19/02 - Programme-control systems electric
    • G05B 19/18 - Numerical control [NC], i.e. automatically operating machines, in particular machine tools, e.g. in a manufacturing environment, so as to execute positioning, movement or co-ordinated operations by means of programme data in numerical form
    • G05B 19/4097 - Numerical control [NC] characterised by using design data to control NC machines, e.g. CAD/CAM

Definitions

  • The present application relates to methods and systems for the automatic quality inspection of materials and of virtual surfaces of materials.
  • The methods and systems described here can be used in the quality inspection of a wide variety of products, for example parts made of metal, plastic, resin, composite, glass or crystal, or mixtures thereof, as well as molds and various tooling, including tooling for stamping, injection, fiber application, resin application, composite manufacturing, packaging, glass or crystal.
  • The methods and systems described here can be used in the quality inspection of vehicles, such as bicycles, motor vehicles (motorcycles, cars, vans, trucks, buses), rail vehicles (trains, trams), vessels (boats, speedboats, ships), amphibious vehicles (screw-propelled vehicles, hydrofoils, hovercraft), aircraft (airplanes, helicopters, unmanned aerial vehicles) and spacecraft.
  • The car body manufacturing process goes through three main stages: stamping, for the fabrication of the individual components and sub-assemblies of the body; welding, to join the components and sub-assemblies; and, finally, painting of the body, to give it color and protection against corrosion.
  • The tooling used in the stamping presses is the starting point of the automobile manufacturing process. Before series production, however, the development of stamping tooling for automotive parts comprises 5 phases (BRAZIL. Inovar-Auto. Interministerial Ordinance MDIC/MCTI No.
  • I - Planning, with the specification of the raw material, equipment and means of production, including tooling processes or method plans, and virtual simulations of production parts, processes and equipment;
  • II - Project, involving drawings, calculations and simulations, modeling and technical details, according to the specifications of the planning area;
  • III - Construction of the tooling, based on project information, the list of materials and components, and the production process;
  • IV - Tests, with the manufacture of sample parts for tooling validation;
  • V - Finishing, which involves the execution of finishing processes to meet product and process specifications.
  • The tooling development process also has quality control steps.
  • The quality control aims to correct the virtual project in the Project stage or the tooling in the Tests stage, so as to achieve a good surface finish and the geometry specified in the Planning stage.
  • The design of tooling and stamped parts relies on the extensive use of computational techniques, both in the scope of design, through computer-aided design (CAD) software, and in the scope of analysis, through computer-aided engineering (CAE) software for simulating manufacturing processes.
  • The focus is not on the dimensional inspection of the part, but on the quality of its surface, which should not have undulations or unwanted marks.
  • The surface can be designed by a designer (CAD), generated by simulation software in a three-dimensional computational environment (SIM), or obtained as the result of three-dimensional scanning of the surface of a real material (SCAN).
  • The invention disclosed here uses three-dimensional calculations and analysis on the geometry of the three-dimensional surface itself, which can be composed, for example, of a three-dimensional point cloud or of a polygonal mesh.
  • Another possible technical field of the present invention is the glass manufacturing industry, whose objective is the production of glass with homogeneity and guaranteed specifications for application in windows of residences and commercial buildings, interior architecture, shelves, mirrors, appliance doors, windshields, vehicle windows, among others. It is unpleasant, for example, to see a distorted view of an object when looking through a glass window, just as it is unpleasant to look into a mirror that gives a distorted reflection. In the first case, there is a distortion in the refraction of the image through the glass; in the second, a distortion in the reflection from the glass.
  • The manufacture of float glass begins by melting the raw materials in a furnace: silica (sand), soda, limestone, magnesium, alumina and potassium.
  • The glass melted in the furnace floats on a layer of molten metal (tin), which allows it to lie flat and acquire a uniform surface.
  • The glass can be of mirror quality (used for silvered mirrors and reflective glass), process quality (used for processing into tempered or laminated glass) or architectural quality (for general architectural and decorative applications).
  • The quality inspection of the refraction of a sample of the glass produced is done in a laboratory by an inspector: the refraction test.
  • The glass (test specimen) is placed in a vertical position on a support capable of rotating about a vertical axis.
  • The inspector observing the glass stands still at a distance of 4.5 meters in front of the glass, while a screen with a zebra pattern (black stripes against white lighting, each stripe 25 millimeters thick and tilted at 45 degrees) is placed 4.5 meters behind the glass.
  • As the specimen rotates, the critical angle at which distortion of the zebra pattern is first observed is noted.
  • The inspected specimen has a height between 300 and 500 millimeters, a width between 800 and 900 millimeters, and typically a thickness of 2, 3, 3.15, 4, 5, 6, 8, 10 or 12 millimeters.
  • For glasses intended for the production of mirrors, an inspection is also carried out to evaluate the reflection of the glass: the reflection test.
  • The glass (test specimen) is placed in a horizontal position on a table next to the same screen with a diagonal zebra pattern (the same one used for the refraction test), and an observer evaluates the straightness of the lines reflected on the glass.
  • A reflected image showing curved lines would indicate glass not approved for use as a mirror.
  • One of the claimed subjects is a method for inspecting virtual surfaces of materials comprising the steps of: i. - loading a virtual surface on a computer; ii. - extracting features from the three-dimensional geometry of the virtual surface; iii. - processing the features through a calculation, selection or classification algorithm; and iv. - graphically visualizing the processed features; or v. - identifying, locating and/or classifying defects based on the processed features of the material's virtual surface.
  • Another claimed subject relates to a method for automatically inspecting the quality of optical distortion in partially or fully transparent materials, which distort light passing through them.
  • Said method comprises the steps of: i. - projecting light through an inspected material with a light source; ii. - capturing the refracted light with an image capture device; iii. - transmitting and processing the captured image; and
  • Another claimed subject relates to a method for automatically inspecting the quality of optical distortion in fully or partially reflective materials, which distort light incident upon them.
  • Said method comprises the steps of: i. - projecting light onto an inspected material with a light source; ii. - capturing the reflected light with an image capture device; iii. - transmitting and processing the captured image; and iv. - identifying and classifying defects.
  • Another claimed subject is a method for comparing features of two or more virtual surfaces in order to visualize, identify, locate and/or classify discrepancies between the two or more virtual surfaces and/or surface defects.
  • Said method comprises the steps of: i. - loading at least a first virtual surface and a second virtual surface into a computer; ii. - extracting features from the three-dimensional geometry of the virtual surfaces; iii. - virtually aligning at least a first virtual surface and a second virtual surface using an alignment algorithm and generating at least a third virtual surface; iv. - matching points on at least a first surface with points on at least a third surface, extracting features from each of the virtual surfaces; v.
  • One of the claimed subjects is a system for inspecting virtual surfaces of materials.
  • Such a system comprises the following features: i. - means for loading a virtual surface, such as a computer;
  • ii. - means for extracting features from the three-dimensional geometry of the virtual surface, such as software; iii. - means for processing the features through a calculation, selection or classification algorithm; and iv. - means for identifying, locating and/or classifying defects based on the extracted features of the material's virtual surface.
  • Another claimed subject refers to a system for the automatic quality inspection of optical distortion in partially or fully transparent materials, which distort the light that passes through them.
  • Such a system comprises the following features:
  • Another claimed subject is a system for the automatic quality inspection of optical distortion in fully or partially reflective materials, which distort the light that falls on them.
  • Such a system comprises the following features:
  • One of the claimed subjects also relates to a system for comparing the features of two or more virtual surfaces, in order to visualize, identify, locate and/or classify discrepancies between the two or more virtual surfaces and/or surface defects.
  • i. - means for loading at least a first virtual surface and a second virtual surface, such as a computer; ii. - means for virtually aligning at least a first virtual surface and a second virtual surface, such as an alignment algorithm, and generating at least a third virtual surface; iii. - means for matching points on at least a first surface with points on at least a third surface, extracting features from each of the virtual surfaces, such as an algorithm; iv. - means for comparing the features of the virtual surfaces based on the correspondence data between points of the virtual surfaces, generating comparison values; v. - means for adding the comparison values to at least a third surface, together with the respective extracted features; and vi. - means for calculating a value or class from the set of features, such as a calculation, selection or classification algorithm.
  • The present patent application relates to a method and system for the automatic inspection of the quality of materials, more specifically through the steps of: (i) capturing three-dimensional data from a surface, or capturing images of a pattern of light reflected by or distorted through the inspected material; (ii) processing the captured images or the three-dimensional point cloud; and (iii) identifying material defects.
  • The application refers to a method for the automatic inspection of virtual surfaces of materials.
  • Automatic inspection of the material's virtual surface can be useful in many industries, such as the automotive industry. Basically, any type of material can be submitted to the automatic inspection described here: opaque, translucent or transparent materials; matte, glossy or reflective materials.
  • The actual surfaces of the material can be digitized with a digitizing apparatus to generate a virtual surface of the material, which can be inspected according to the method described here.
  • The inspection of virtual material surfaces can also be applied to the virtual surface of materials created, designed or simulated in software.
  • The quality of three-dimensional models generated in a virtual environment, i.e. fully created, designed or simulated in design or modeling software, can be inspected so that unwanted ripples or any other imperfection in the virtual surface of the material can be detected, identified, located and/or classified.
  • The method of automatic inspection of virtual surfaces of materials provides an opportunity to correct any problem even during the design phase of the automobile or of the tooling created for the production of automobile parts, improving material quality and saving costs during manufacturing.
  • The virtual surface can be composed of three-dimensional points in a coordinate system.
  • The coordinate system can be Cartesian, spherical, cylindrical, curvilinear, among others. Each point on the surface is called a vertex.
  • The surface can be composed of a set of vertices (zero-dimensional, 0D), a set of lines (one-dimensional, 1D), a set of polygons (two-dimensional, 2D), a mesh of triangles (two-dimensional), or a combination thereof.
  • Different formats can be used for the file or set of computer files that contain the surface data, and there can be conversion between formats, for example the conversion of a polygonal mesh formed by a list of polygons (2D faces) into a point cloud formed by a list of points (0D).
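The mesh-to-point-cloud conversion described above can be sketched in a few lines of Python (this is an illustrative sketch, not the patent's implementation; the representation of faces as index tuples into a vertex array is an assumption):

```python
import numpy as np

def mesh_to_point_cloud(vertices, faces):
    """Convert a polygonal mesh (a vertex array plus faces given as tuples
    of vertex indices) into a point cloud: keep only the vertices actually
    referenced by at least one face, discarding all connectivity data."""
    used = sorted({i for face in faces for i in face})
    return np.asarray(vertices)[used]
```

Unreferenced vertices are dropped so that the resulting point cloud contains only points that belonged to the 2D faces.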
  • The calculated features can be associated with points (vertices), lines (edges) or polygons (faces), without loss of generality.
  • The surface can be altered beforehand, for example through smoothing, subsampling, supersampling, cropping, partial deletion, joining of surfaces, translation, rotation, warping, scaling or any other editing process.
  • The method for the automatic inspection of virtual surfaces comprises the steps of loading a virtual surface; extracting features from the three-dimensional geometry of the virtual surface; processing the features through a calculation, selection or classification algorithm; and identifying, locating and/or classifying defects based on the features extracted from the material's virtual surface.
  • The steps of detecting, identifying, locating or classifying defects can be performed independently, so it should be understood that the methods described in this document can detect and/or identify and/or locate and/or classify defects.
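As a rough, non-authoritative sketch of the load / extract / process / classify sequence above (the function names and the toy feature are hypothetical, chosen only for illustration):

```python
import numpy as np

def inspect_virtual_surface(points, extract_features, threshold):
    """Minimal sketch of the inspection pipeline: take a loaded virtual
    surface (here an (N, 3) array of points), extract one geometric
    feature per point, and mark points whose feature magnitude exceeds a
    threshold as belonging to a candidate defect region."""
    features = extract_features(points)        # e.g. a curvature per point
    defective = np.abs(features) > threshold   # boolean mask per point
    return features, defective

def toy_feature(points):
    """Toy stand-in for a real curvature feature: height of each point
    above the mean z of the surface."""
    return points[:, 2] - points[:, 2].mean()
```

On a flat surface with a single raised point, only that point would be flagged by the threshold step.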
  • The method described here can show the values of the calculated features, or values calculated from these features, numerically or through a color scale that displays these values on the virtual surface.
  • The features extracted from the virtual surface can be the curvatures computed at each point of the surface.
  • The main features are the maximum curvature K, the minimum curvature J, the mean curvature H (the average of the maximum and minimum curvatures) and the Gaussian curvature G (the product of the maximum and minimum curvatures), as well as any combination thereof.
  • The calculation of curvatures is done through discrete differential geometry techniques, considering, for each surface point, the surface area within a neighborhood around this point (the search radius), proportional to the size of the expected defects.
  • The coefficients a1, a2, a3, a4, a5, a6 are determined by solving a least squares problem (one of the coefficients can be arbitrarily chosen as zero).
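One common way to realize such a least-squares quadric fit is sketched below (an illustration, not the patent's exact parameterization: the coefficient ordering is an assumption, and the curvature formulas are the standard ones for a Monge patch z = f(x, y)):

```python
import numpy as np

def quadric_curvatures(neighborhood):
    """Fit z = c0 + c1*x + c2*y + c3*x^2 + c4*x*y + c5*y^2 to a point
    neighborhood by least squares, then evaluate the Gaussian curvature G,
    the mean curvature H, and the principal curvatures (K, J in the text's
    naming) at the origin from the fitted derivatives."""
    p = np.asarray(neighborhood, dtype=float)
    x, y, z = p[:, 0], p[:, 1], p[:, 2]
    A = np.column_stack([np.ones_like(x), x, y, x * x, x * y, y * y])
    c, *_ = np.linalg.lstsq(A, z, rcond=None)
    fx, fy = c[1], c[2]                      # first derivatives at (0, 0)
    fxx, fxy, fyy = 2 * c[3], c[4], 2 * c[5]  # second derivatives at (0, 0)
    w = 1 + fx ** 2 + fy ** 2
    G = (fxx * fyy - fxy ** 2) / w ** 2       # Gaussian curvature
    H = ((1 + fy ** 2) * fxx - 2 * fx * fy * fxy
         + (1 + fx ** 2) * fyy) / (2 * w ** 1.5)  # mean curvature
    disc = max(H * H - G, 0.0) ** 0.5
    return H + disc, H - disc, H, G           # K (max), J (min), H, G
```

On a patch sampled from a sphere of radius R, the fitted Gaussian curvature should be close to 1/R^2 and the mean curvature close to 1/R.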
  • A spherical neighborhood of each point p on the surface S is composed of the point p and all other points belonging to the surface S that are inside a ball centered on p with radius r (the search radius), that is, all points of S whose distance to p is less than r.
  • The region of space that delimits the neighborhood can have other shapes, such as a cube or a cylinder.
  • The search radius can be defined as a value in the coordinate system (e.g. 10 millimeters), or as a multiple of the average distance between points on the surface.
  • The minimal set corresponding to the neighborhood is composed of the surface points directly connected to the point p.
  • The derivative of each possible curvature can be used as a feature in the same way as the curvature itself.
  • The three-dimensional curvatures (second-order derivatives) and the three-dimensional wrinkling (third-order derivatives) of the points can be used, considering their sign or their modulus, to calculate the features, as well as combinations of several of these features.
  • The wrinkling is calculated as the standard deviation of the curvatures in a given neighborhood of the point p. Considering that the curvature Gi was calculated for each of the N points pi of the neighborhood, the wrinkling is dG = sqrt((1/N) * sum_i (Gi - Gm)^2), where the mean curvature in the neighborhood is Gm = (1/N) * sum_i Gi. Naturally, the same reasoning can be used to calculate the wrinklings dK, dJ and dH from the curvature values K, J and H, respectively.
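The wrinkling formula above is just the (population) standard deviation of the curvature values over the neighborhood; a direct sketch in NumPy:

```python
import numpy as np

def wrinkling(curvatures_in_neighborhood):
    """Wrinkling at a point: dG = sqrt((1/N) * sum_i (G_i - Gm)^2),
    where G_i are the curvature values at the N neighborhood points and
    Gm is their mean. The same function applies to K, J or H values."""
    g = np.asarray(curvatures_in_neighborhood, dtype=float)
    return np.sqrt(np.mean((g - g.mean()) ** 2))
```

A neighborhood with constant curvature has zero wrinkling, while varying curvature values yield a positive value.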
  • Each type of curvature or wrinkling can have its own threshold, above which the region that exceeds the threshold is marked as defective. For example, if the absolute value of the Gaussian curvature G exceeds a threshold t1, the point is considered to belong to a defective region.
  • Each type of curvature or wrinkling can also be calculated over more than one neighborhood search radius, for example the curvature G calculated using search radius r1 and/or search radius r2, each with its own thresholds.
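The per-feature thresholding with a union over feature types and search radii might look like the following (a sketch; the dictionary-of-feature-maps representation is an assumption for illustration):

```python
import numpy as np

def mark_defects(feature_maps, thresholds):
    """Mark a point as defective if, for any feature (e.g. curvature G at
    radius r1, curvature G at radius r2, wrinkling dG, ...), its absolute
    value exceeds that feature's own threshold. The defect mask is the
    union of the per-feature masks."""
    n_points = len(next(iter(feature_maps.values())))
    mask = np.zeros(n_points, dtype=bool)
    for name, values in feature_maps.items():
        mask |= np.abs(np.asarray(values, dtype=float)) > thresholds[name]
    return mask
```

Each feature only flags the points exceeding its threshold, and the final mask is their union.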
  • The neighborhood used for the calculation of wrinkling can be the same as or different from the neighborhood used for the calculation of curvatures.
  • Neighborhoods can differ in size and shape, including a spherical neighborhood, a spherical shell, a cube or a cylinder.
  • The surfaces can come from different stages of the project, for example the projected drawing (CAD), a stamping simulation (SIM), a digitization (scanning) of a real part (SCAN), or even different versions within the same type of these steps, for example two different simulations considering variations in the simulation parameters. Comparison involves comparing the features (curvatures and/or wrinklings) of each point on the surface S1 with the features of the corresponding point on the surface S2, in order to find discrepancies between the features.
  • Feature discrepancies can be calculated using an appropriate metric and incorporated as new features of the surface S1, to be later shown as indicative of a possible nonconformity between the surfaces. Additionally, the discrepancy can generate a classification, into a finite number of classes, in order to highlight the points with the greatest discrepancy.
  • An example of classification is marking a point with the class "outlier" if any of the feature discrepancies exceeds a value determined by a threshold (which can be different for each type of feature). Additionally, the identification of possible defects can be the union of these regions, considering different curvature and wrinkling calculations, multiple search radius scales for each of them, and the comparison between different surfaces.
  • Each point p1 of the surface S1 would have a correspondent in a similar position in S2.
  • The correspondence of the points between the surfaces S1 and S2 can be trivial if both have their points indexed and the indexing of the points follows the same distribution in three-dimensional space. If this is not the case, for example if the number of points of the surfaces S1 and S2 differs, an algorithm is initially needed to align the two surfaces so that they are superimposed. This alignment, which can usually be performed by applying a rotation matrix and a translation vector to each point on the surface, transforms the original surface S2 into a rotated and translated surface S3, so that the comparison becomes between S1 and S3.
  • Each point p1 of S1 must search in its neighborhood for the closest point p3 belonging to S3 in order to calculate the discrepancy between the features of p1 and the features of p3. If, for a predetermined neighborhood, more than one neighboring point belonging to S3 is found, it is possible to choose the one closest to point p1 in S1, or to compute a statistic (such as the average, maximum, minimum or mode) of the feature values of the points in S3 to consolidate them into the proper dimension for comparison with the features of point p1 in S1.
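The alignment and nearest-point matching steps can be sketched as follows, assuming the rotation matrix R and translation vector t are already known (estimating them, e.g. with ICP, is a separate problem not shown here; all names are illustrative):

```python
import numpy as np
from scipy.spatial import cKDTree

def align_and_match(S1, S2, R, t, feats1, feats2, max_dist):
    """Apply a known rotation R and translation t to S2, producing the
    aligned surface S3; then, for each point of S1, find the closest
    point of S3 and report the absolute discrepancy between their
    feature values (NaN where no neighbor lies within max_dist)."""
    S3 = np.asarray(S2) @ R.T + t            # S2 rotated and translated
    dist, idx = cKDTree(S3).query(S1, distance_upper_bound=max_dist)
    disc = np.full(len(S1), np.nan)
    ok = np.isfinite(dist)                   # points with a valid match
    disc[ok] = np.abs(np.asarray(feats1, dtype=float)[ok]
                      - np.asarray(feats2, dtype=float)[idx[ok]])
    return disc
```

If S2 is exactly S1 under the inverse transform, every point matches at distance ~0 and the discrepancies reduce to the raw feature differences.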
  • The patent application relates to a method and system for automatically inspecting the quality of optical distortion in partially or fully transparent materials, which distort light passing through them.
  • The system contains a light source, the inspected material, an image capture device, and a device capable of processing and analyzing data.
  • The light projected by the light source passes through the inspected material, is refracted due to the material's refractive index, and is captured by the image capture device, which transmits the image to a device capable of processing the captured image, identifying and classifying defects.
  • The material inspected can be glass, crystal, polymer, acrylic or any other type of material that exhibits transparency. Variations in refractive index, thickness or shape along the material cause ripples in the light and dark stripe patterns captured by the image capture device and sent to the device, which contains software for image processing as well as for the classification of optical distortion defects.
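One simple way to quantify the ripple in a captured stripe pattern, offered here only as an illustrative sketch (the patent does not specify this particular metric): locate a dark-to-light stripe edge in each image row, fit a straight line to the edge, and use the RMS residual as a distortion score.

```python
import numpy as np

def stripe_distortion(image, level=0.5):
    """Given a grayscale image of a nominally straight light/dark stripe
    pattern, locate the first dark-to-light transition in each row and
    measure how far that edge deviates from the best-fit straight line.
    The RMS residual serves as a simple optical-distortion score."""
    edges = []
    for row in np.asarray(image, dtype=float):
        bright = np.nonzero(row > level)[0]
        if len(bright):
            edges.append(bright[0])          # column of first bright pixel
    edges = np.array(edges, dtype=float)
    rows = np.arange(len(edges))
    coeff = np.polyfit(rows, edges, 1)       # best-fit straight edge
    residual = edges - np.polyval(coeff, rows)
    return np.sqrt(np.mean(residual ** 2))
```

A perfectly straight stripe edge scores (numerically) zero, while a wavy edge yields a positive score that grows with the distortion.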
  • The application refers to a method and system for automatically inspecting the quality of optical distortion in fully or partially reflective materials, which distort light incident upon them.
  • The system contains a light source, the inspected material, an image capture device, and a device capable of processing and analyzing data.
  • The light projected by the light source is reflected specularly on the surface of the material and captured by the image capture device, which transmits the image to a device capable of processing the captured image, identifying and classifying the defects.
  • The material inspected may be metal, ceramic, glass, composite, polymer, acrylic, crystal or any other type of reflective material.
  • The material can be flat, curved, regular, wavy, corrugated, concave, convex, or can comprise a mixture of such shapes.
  • Variations in the surface of the material, such as variations in flatness, undulations, elevations and dips, cause ripples in the light and dark stripe patterns captured by the image capture device, which are sent to the device that contains the image processing software as well as the software for classification of reflection distortion defects.
  • The systems and methods for optical distortion in refraction and for distortion in reflection can be applied to the quality inspection of flat glass.
  • The refraction test may find striations, grooves, lines of marks, or deformations in the image passing through the glass.
  • The reflection test can find variations in the surface of the glass, such as variations in flatness, undulations, swells and dips.
  • FIGURES 1A and 1B show the development cycle of stamped-part tooling;
  • FIGURE 2A shows the sequence to process a virtual surface, extract features, extract profiles, and identify, locate and/or classify surface defects;
  • FIGURE 2B illustrates an example of an osculating surface to a set of vertices.
  • The vertices are points on the surface S which belong to the neighborhood V around a point p.
  • FIGURE 2C illustrates an example of the comparison of two virtual surfaces and the alignment process from S2 to S1;
  • FIGURE 3 presents the processing flow of a virtual surface to extract features and visualize, identify, locate and/or classify surface defects;
  • FIGURE 4 presents the processing flow of a virtual surface to extract profiles and visualize, identify, locate and/or classify surface defects from the profiles;
  • FIGURE 5 presents the processing flow of two or more virtual surfaces for the comparison of their features, in order to visualize, identify, locate and/or classify the discrepancies between them or the surface defects;
  • FIGURE 6A shows the refraction test, for the automatic inspection of optical distortion quality in partially or totally transparent materials, which distort the light that passes through them;
  • FIGURE 6B shows the reflection test, for the automatic quality inspection of optical distortion in fully or partially reflective materials, which distort light falling on them;
  • FIGURE 6C shows two possible examples of images captured by the image capture apparatus 610;
  • FIGURE 6D shows a possible example of an image captured by the image capture device 640 and part of the processing of this image;
  • FIGURE 7A presents the refraction test processing flow;
  • FIGURE 7B shows the reflection test processing flow.
  • FIGURES 1A and 1B show the development cycle of stamped-part tooling.
  • The development of the tooling for stamping the parts comprises the phases of: Planning 110, with the specification of the raw material, equipment and means of production, including the tooling processes or method plans, and virtual simulations of production parts, processes and equipment; Project 120, involving drawings, calculations and simulations, modeling and technical details, according to the specifications of the planning area; Construction of the tooling 130, based on project information, the list of materials and components, and the production process; Tests 140, with the manufacture of sample parts for tooling validation; and Finishing 150, which involves the execution of finishing processes to meet product and process specifications.
  • Project 120 involves a learning cycle in a virtual environment, comprising the steps of: Adjustments and/or creation of virtual tooling 121, Stamping simulation 122, Virtual surface analysis 123, and Inspection 124.
  • In step 121, computational techniques are used, both in the scope of design, through computer-aided design (CAD) software, and in the scope of analysis, through computer-aided engineering (CAE) software for simulating manufacturing processes.
  • CAE software has the ability to simulate the geometry, dynamic behavior and material of the tooling and the stamped part, providing the surface shape of the stamped part.
  • This simulated stamped part is subjected to a set of analyses in step 123.
  • In step 124, based on the results of the analysis, the part is either considered approved (OK), and the tooling passes to Construction 130, or considered disapproved, returning to step 121 for tooling design adjustments.
  • Different types of analysis can be performed in step 123, such as: dimensional analysis, comparing whether the shape of the part simulated in 122 conforms to the design of step 110 and whether the thickness is within the tolerance values; mechanical analysis, checking whether the stamping process broke the part, generated any cracks or created any region with stresses or deformations outside the specifications; and surface analysis, checking whether the surface has ripples or unwanted marks.
  • Tests 140 involve a correction cycle in a real environment, consisting of the steps of Real Tooling Adjustments 141, Stamping parts 142, Inspection of stamped parts 143, and Inspection 144.
  • In step 141, the die and the punch have their geometric shapes adjusted through material thinning (generally using emery) or material addition (generally using welding), along with positioning adjustments, lubrication adjustments, and other stamping parameters.
  • Some stamped parts are produced using the tooling set in step 142.
  • the parts are inspected in step 143 using manual or automatic techniques, such as: visual inspection, reflection, touch, stone scratching, mechanical positioning assessment templates, coordinate measuring machines, optical scanning devices, processing and comparison software, among others.
  • Using digitization techniques with three-dimensional scanners, the parts produced in step 142 can be compared with Simulation 122 or with Project 110.
  • In step 144, based on the results of the dimensional, mechanical and surface analyses of the parts, the tooling can either pass (OK) and go to Finishing 150, or fail, returning to step 141 for tooling adjustments.
  • FIGURE 2A shows the sequence to process a virtual surface, extract features, extract profiles, identify, locate and/or classify surface defects.
  • a virtual surface 201 is loaded into a three-dimensional virtual environment and can be processed by calculating geometric features θ in 202, such as three-dimensional curvature or wrinkling (derived from curvature).
  • a value or class φ(θ) is calculated from these features, through a calculation, selection or classification algorithm.
  • The φ values for each point, edge or face of the surface can be visualized to identify, locate and/or classify defects on the material's virtual surface.
  • Visualization 204 shows φ as a value calculated from θ on a continuous scale.
  • Visualization 205 shows the identification of surface regions where φ is a class obtained through a classification algorithm.
  • profiles can be extracted by cutting the surface 201 through orthogonal planes or through surfaces that cross it.
  • the definition of direction, surface type, separation between profiles and number of profiles can be determined by a user or by the system.
  • the user's input can be the selection of only two points of the surface, p1 and p2, as shown in visualization 206, for the creation of the profile e1.
  • the orthogonal slice plane can be formed as the plane that contains the two points p1 and p2 and the average of the vectors normal to the surface at these points.
  • Profile e1 can be extracted as the set of points formed by the intersection between the surface edges and the orthogonal plane.
  • the cutting planes of the other profiles e2, e3 are planes parallel to the cutting plane of e1, spaced a predetermined distance from it.
  • the number of cutting planes and profiles can be determined by the user, as well as the distance between them.
  • Visualization 206 shows a set of profiles e1, e2 and e3 extracted from the three-dimensional surface and still visualized in the three-dimensional environment.
  • Visualization 207 shows a two-dimensional (2D) visualization of profile e1, obtained through a transformation from 3D to 2D coordinates onto the two coordinates contained in the cutting plane, with origin at p1 and with the abscissa along the vector between the points p1 and p2.
  • singular points can be extracted, such as points of maximum ordinate, minimum ordinate, inflection (zero second derivative), maximum abscissa and minimum abscissa.
  • features are calculated, such as the distances of height h, width w and height q shown in visualization 207.
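The singular-point and feature extraction described above can be sketched numerically. The following is an illustrative reconstruction, not the patented implementation; the function name, dictionary keys and the discrete-derivative choice are assumptions:

```python
import numpy as np

def profile_features(x, y):
    """Extract singular points and simple features from a 2D profile.
    Illustrative sketch: maxima/minima of the ordinate and inflection
    points (zero second derivative), as named in the description."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    i_max = int(np.argmax(y))                  # point of maximum ordinate
    i_min = int(np.argmin(y))                  # point of minimum ordinate
    d2 = np.gradient(np.gradient(y, x), x)     # discrete second derivative
    inflections = np.where(np.diff(np.sign(d2)) != 0)[0]
    return {
        "h": y[i_max] - y[i_min],              # height between extremes
        "w": abs(x[i_max] - x[i_min]),         # width between extremes
        "max": (x[i_max], y[i_max]),
        "min": (x[i_min], y[i_min]),
        "inflections": inflections,
    }

# Example: a profile with a single smooth bump
x = np.linspace(0.0, 10.0, 101)
y = np.exp(-((x - 5.0) ** 2))
f = profile_features(x, y)
```

For a real profile, the (x, y) pairs would come from the 2D coordinates obtained in the transformation shown in visualization 207.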
  • FIGURE 2B illustrates an example of an osculating surface R to a set of vertices.
  • the vertices are points in the neighborhood V of a point p on the surface S.
  • the neighborhood V around the point p can be defined by several shapes, including a spherical neighborhood, a spherical crown, a cube, or a cylinder.
  • The spherical neighborhood of each point p on the surface S is composed of the point p and all other points belonging to the surface S that are inside a ball centered on p with radius r, called the search radius, that is, all points of S whose distance to p is smaller than r.
  • the search radius can be defined as a value in the coordinate system (eg 10 millimeters), or as a multiple of the average distance of the surface points.
  • the minimum set corresponding to the neighborhood is composed of the surface points directly connected to the point p.
  • the osculating surface R is referenced by the orthogonal coordinate system (u,v,n), with coordinates u and v tangent to the surface S and the third coordinate n normal to the surface S at point p.
  • This coordinate system is positioned in such a way that the u coordinate is positioned in the direction of greatest curvature of the surface and v in the direction of least curvature.
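One possible way to fit such an osculating surface is a least-squares quadric in a local tangent frame; the sketch below is only an illustration under that assumption (the patent does not prescribe this exact formulation), with illustrative names:

```python
import numpy as np

def osculating_curvatures(p, neighbors, normal):
    """Fit a quadric z ~ (a*u^2 + 2b*u*v + c*v^2)/2 to the neighborhood V
    of p and return the principal curvatures (eigenvalues of the fitted
    shape matrix). Sketch only; the eigendirections correspond to the
    directions of greatest and least curvature described for FIGURE 2B."""
    n = normal / np.linalg.norm(normal)
    a0 = np.array([1.0, 0.0, 0.0])
    if abs(n @ a0) > 0.9:                      # avoid a near-parallel seed
        a0 = np.array([0.0, 1.0, 0.0])
    u0 = np.cross(n, a0); u0 /= np.linalg.norm(u0)
    v0 = np.cross(n, u0)                       # (u0, v0, n) local frame
    d = np.asarray(neighbors, dtype=float) - p
    U, V, Z = d @ u0, d @ v0, d @ n            # local coordinates
    A = np.column_stack([U**2 / 2, U * V, V**2 / 2])
    (a, b, c), *_ = np.linalg.lstsq(A, Z, rcond=None)
    return np.linalg.eigvalsh(np.array([[a, b], [b, c]]))

# Example: neighbors sampled on a sphere of radius 2 around its top point
R, r = 2.0, 0.3
t = np.linspace(0, 2 * np.pi, 12, endpoint=False)
pts = np.column_stack([r * np.cos(t), r * np.sin(t),
                       np.full(t.shape, np.sqrt(R**2 - r**2) - R)])
k = osculating_curvatures(np.zeros(3), pts, np.array([0.0, 0.0, 1.0]))
# both principal curvatures should be close to -1/R = -0.5
```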
  • FIGURE 2C illustrates an example of the comparison of two virtual surfaces and the alignment process from S2 to S1.
  • An alignment algorithm is used so that the two surfaces overlap.
  • This alignment, which can generally be summarized in a rotation matrix and a translation vector, transforms the original surface S2 into a rotated and translated surface S3, so that the comparison becomes between S1 and S3.
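Applying such an alignment is straightforward once the rotation matrix R and translation vector t are known; the following is a minimal sketch (the estimation of R and t themselves, e.g. by an ICP-type algorithm, is outside this fragment, and the names are illustrative):

```python
import numpy as np

def apply_alignment(points, R, t):
    """Transform the original surface S2 into the rotated and translated
    surface S3 = R * S2 + t, so that the comparison is made between S1
    and S3 (illustrative sketch)."""
    return np.asarray(points) @ np.asarray(R).T + np.asarray(t)

# Example: a 90-degree rotation about Z followed by a translation in X
theta = np.pi / 2
R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
              [np.sin(theta),  np.cos(theta), 0.0],
              [0.0,            0.0,           1.0]])
t = np.array([1.0, 0.0, 0.0])
S3 = apply_alignment([[1.0, 0.0, 0.0]], R, t)
```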
  • Each point p1 of S1 must search in its neighborhood for the closest point p3 that belongs to S3 to calculate the discrepancy between the characteristics of p1 and the characteristics of p3.
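The point matching and discrepancy calculation can be sketched as a brute-force nearest-neighbor search; a k-d tree would normally replace this for large surfaces, and the function name and NaN convention are assumptions:

```python
import numpy as np

def feature_discrepancy(S1_pts, feat1, S3_pts, feat3, radius):
    """For each point p1 of S1, find the closest point p3 of S3 within
    `radius` and return the discrepancy between their features
    (NaN where no neighbor is found). Brute-force sketch."""
    S1_pts = np.asarray(S1_pts, float); S3_pts = np.asarray(S3_pts, float)
    feat1 = np.asarray(feat1, float);   feat3 = np.asarray(feat3, float)
    disc = np.full(len(S1_pts), np.nan)
    for i, p1 in enumerate(S1_pts):
        d = np.linalg.norm(S3_pts - p1, axis=1)
        j = int(np.argmin(d))                  # closest point p3
        if d[j] <= radius:
            disc[i] = feat3[j] - feat1[i]
    return disc

# Example: two points of S1 matched against a slightly shifted S3
disc = feature_discrepancy([[0, 0, 0], [1, 0, 0]], [1.0, 2.0],
                           [[0.1, 0, 0], [1.05, 0, 0]], [1.5, 2.5], 0.2)
```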
  • FIGURE 3 presents the processing flow of a virtual surface to extract features and visualize, identify, locate and/or classify surface defects.
  • the flow starts by loading the virtual surface S 301.
  • the calculation of neighbors V 302 is performed.
  • the neighborhood V around point p can be defined by several shapes, among which a spherical neighborhood, a spherical crown, a cube, a cylinder, or the points directly connected to the point p.
  • a spherical neighborhood of each point p on the surface S is composed of the point p and all other points belonging to the surface S that are inside a ball centered on p with radius r, called the search radius, that is, all points of S whose distance to p is less than r.
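The spherical neighborhood and the search radius can be sketched as follows; this is a brute-force illustration with assumed names, and the second function sketches the alternative definition of the radius as a multiple of the average point spacing:

```python
import numpy as np

def spherical_neighborhood(points, i, r):
    """Indices of all points of S whose distance to p = points[i] is
    smaller than the search radius r (p itself included). Brute force."""
    d = np.linalg.norm(points - points[i], axis=1)
    return np.where(d < r)[0]

def radius_from_spacing(points, k=3.0):
    """Alternative definition: the search radius as a multiple k of the
    average nearest-neighbor distance of the surface points (sketch)."""
    d = np.linalg.norm(points[:, None, :] - points[None, :, :], axis=2)
    np.fill_diagonal(d, np.inf)                # ignore self-distances
    return k * d.min(axis=1).mean()

# Example: three collinear points
pts = np.array([[0.0, 0, 0], [0.5, 0, 0], [2.0, 0, 0]])
nb = spherical_neighborhood(pts, 0, 1.0)
```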
  • the β parameters of an osculating surface to the neighborhood V of each point are calculated, through the method of fitting a parameterized surface patch detailed in FIGURE 2B.
  • the β parameters can also be calculated using other methods of discrete differential geometry, such as the differentiation of the normal vectors around the point p or tensor averaging methods.
  • In step 304, the θ features corresponding to the different types of curvatures and wrinkles are calculated from the β parameters.
  • step 304 can contain one more step of calculation of neighborhood V, similar to step 302.
  • the calculation of this new neighborhood can differ in size and/or shape from the neighborhood of step 302.
  • the β parameters of the points neighboring the point p are used for the wrinkle calculation in this step 304.
  • a value or class φ(θ) is calculated from these features θ, through a calculation, selection or classification algorithm.
  • φ can be a continuous value or a class obtained through a classification algorithm.
  • the dimension of c can vary.
  • c can have only two values in a binary map (white or black), or c can be a value between 0 (black) and 255 (white) in a grayscale color map.
  • c can be composed of 3 values, for example, RGB (red, green, blue) or HSV (hue, saturation, value), or it can have associated transparency values.
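The mapping from a scalar value to the color value c can be sketched for the binary and grayscale cases; this is illustrative only, and clamping saturated values to the ends of the scale is an assumption:

```python
import numpy as np

def to_grayscale(values, vmin, vmax):
    """Map scalar values to c in 0 (black) .. 255 (white); values outside
    [vmin, vmax] are clamped to the ends of the scale (sketch)."""
    v = np.clip(np.asarray(values, dtype=float), vmin, vmax)
    return np.round(255 * (v - vmin) / (vmax - vmin)).astype(np.uint8)

def to_binary(values, threshold):
    """Binary map: white (255) where the value exceeds the threshold,
    black (0) elsewhere."""
    return np.where(np.asarray(values) > threshold, 255, 0).astype(np.uint8)

g = to_grayscale([0.0, 5.0, 10.0, 12.0], 0.0, 10.0)
b = to_binary([0.1, 0.9], 0.5)
```

An RGB color map would extend `to_grayscale` by mapping the clamped value through a lookup table of 3-value colors.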
  • the surface is visualized, with its points and/or edges and/or faces colored by the value of c.
  • the color scale can be displayed by indicating regions, as in 205, through a progressive color shift scale as in 204, or composed of a two-dimensional color map. Through visualization it is possible to identify, locate and/or classify defects in the virtual surface. It is also possible to observe the variation of some calculated parameter. Finally, generated visualizations and calculated data can be saved in step 316.
  • Decisions 309-315 show a set of changes 308 that the user can make to the processing flow. Each change takes effect in the corresponding step and in the subsequent steps, so that calculations prior to the changed step are reused. Reusing calculations speeds up processing time, which is highly valued in software working with large amounts of data.
  • the user can change the way the surface is viewed.
  • One way to change the view is to show or hide points, edges or faces, as well as to change the positioning of the surface and the degree of magnification of the view.
  • In the color scale, for example, the user can change from a continuous scale to a discrete scale, or the color tones of the scale can be switched between a rainbow scale, a grayscale and an iron scale.
  • Furthermore, the scale can be inverted, its minimum and maximum values can be changed, the color of the saturated points (values above the maximum or below the minimum) can be set to the extreme value of the scale or to another color, or the scale progression can be switched between linear and logarithmic.
  • the φ function is changed.
  • the parameters of the function can be changed, or there may be a set of preset functions the user can choose from, for example, to migrate from a calculation function as in 204 to a classification view as in 205.
  • the feature function θ is changed.
  • the features can be modified, for example, by including or omitting some of the wrinkles, some curvatures, or other features.
  • the calculation method or the neighborhood parameters can be changed.
  • the method for obtaining the β parameters is changed, which can be a type of regression using an osculating surface, as well as another type of fitting of a parameterized osculating surface patch, normal vector differentiation, or tensor averaging methods.
  • a new surface can be loaded for evaluation, or the same surface can be edited, for example, through smoothing, subsampling, supersampling, cropping, partial deletion, union between surfaces, translation, rotation, deformation, scaling or any other editing process.
  • FIGURE 4 presents the processing flow of a virtual surface to extract profiles and visualize, identify, locate and/or classify surface defects from the profiles.
  • Profiles are cuts of the surface, forming a curve constituted by a chained sequence of points.
  • the flow starts by loading the virtual surface S 401.
  • the user or the system determines the evaluation points 402.
  • the profile can be determined through two points p1 and p2, chosen by the user as the start and end points of the profile e1.
  • the 3D points are extracted from the profiles.
  • the profile e1 can be extracted through an orthogonal plane that contains the points p1 and p2 and is aligned with the average of the vectors normal to the surface at these points.
  • Profile e1 can be extracted as the set of points formed by the intersection between the surface edges and the orthogonal plane.
  • the system can already have a preferred direction for extracting profiles, requiring only one point p1 to define the orthogonal cutting plane.
  • the extraction can be of just one profile e1 or of a set of associated profiles, extracted through cutting planes parallel to the cutting plane of e1 and spaced from it by a predetermined distance.
  • the number of cutting planes and profiles can be determined by the user or by the system, as well as the distance between these planes.
  • the profiles are displayed in a three-dimensional environment.
  • Visualization 206 shows a set of profiles e1, e2 and e3 extracted from the three-dimensional surface and still visualized in the three-dimensional environment.
  • step 405 the three-dimensional points of the profile are transformed into two-dimensional points.
  • This change of coordinates is carried out through a translation of the points and a rotation.
  • the new origin becomes the point p1 and, after the rotation, two coordinates are contained in the orthogonal cutting plane and one coordinate is perpendicular to the cutting plane.
  • the first coordinate (abscissa) is in the direction of vector A between points p1 and p2 and the second coordinate (ordinate) is in the direction of vector D, vector perpendicular to the abscissa, but also contained in the cutting plane.
  • the value of the third coordinate in the direction of vector C, perpendicular to the cutting plane is zero at these points and, therefore, is disregarded.
  • the vector B is the average between the vectors n1 and n2, which are the vectors normal to the surface S at the point p1 and at the point p2.
  • the vector C is the cross product of A and B; vector D is the cross product between C and A.
  • the rotation matrix can be assembled through the coordinates, in reference to the XYZ axes, of the vectors An, Dn and Cn, i.e., the normalized A, D and C.
  • the translation vector is composed of the coordinates of p1, so that p1 becomes the origin of the system with vectors An, Dn and Cn.
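The change of coordinates described in the preceding paragraphs can be sketched directly from the vectors A, B, C and D; this is an illustrative reconstruction with assumed names:

```python
import numpy as np

def profile_to_2d(points3d, p1, p2, n1, n2):
    """Transform 3D profile points into the 2D cutting-plane coordinates:
    abscissa along A = p2 - p1, ordinate along D = C x A, where
    B = (n1 + n2)/2 and C = A x B; origin at p1, as described above."""
    p1 = np.asarray(p1, float)
    A = np.asarray(p2, float) - p1
    B = (np.asarray(n1, float) + np.asarray(n2, float)) / 2.0
    C = np.cross(A, B)
    D = np.cross(C, A)
    An, Dn, Cn = (v / np.linalg.norm(v) for v in (A, D, C))
    R = np.vstack([An, Dn, Cn])                # rows are the new axes
    local = (np.asarray(points3d, float) - p1) @ R.T
    return local[:, :2]                        # coordinate along Cn is ~0

# Example: a profile in the XZ plane, with normals along Z
out = profile_to_2d([[0, 0, 0], [1, 0, 0.5], [2, 0, 0]],
                    [0, 0, 0], [2, 0, 0], [0, 0, 1], [0, 0, 1])
```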
  • In step 406, the values of the extracted profile are displayed in two-dimensional coordinates (directions An and Dn).
  • In step 407, point features are extracted, such as ordinate, abscissa, slope (first derivative) and curvature (second derivative).
  • In step 408, singular points can be extracted, such as points of maximum ordinate, minimum ordinate, inflection (zero second derivative), maximum abscissa and minimum abscissa.
  • In step 409, features are calculated, such as the distances of height h, width w and height q shown in visualization 207.
  • FIGURE 5 presents the processing flow of two or more virtual surfaces for the comparison of their characteristics, in order to visualize, identify, locate and/or classify the discrepancies between them or the surface defects.
  • the flow starts by loading 501 of the virtual surfaces S1 and S2.
  • the surfaces can come from different stages of the project, among which, for example, the projected drawing (CAD), a stamping simulation (SIM), a digitization (scanning) of a real part (SCAN), or even different versions.
  • step 502 surface S2 is aligned to S1, generating surface S3, as shown in FIGURE 2C.
  • An alignment algorithm is used so that the two surfaces overlap.
  • This alignment, which can usually be summarized in a rotation matrix and a translation vector, transforms the original surface S2 into a rotated and translated surface S3, so that the comparison becomes between S1 and S3.
  • Each point p1 of S1 has features θ1 and each point p3 of S3 has features θ3.
  • step 503 the points of S1 and S3 are matched.
  • Each point p1 of S1 must search in its neighborhood for the closest point p3 belonging to S3 to calculate the discrepancy between the characteristics of p1 and the characteristics of p3.
  • the comparison δ is calculated from the features θ1 and θ3.
  • the comparison δ can be a single value or a vector of values, for example, the difference of all values of θ3 minus the values of θ1.
  • the comparison can also use only the difference of a specific feature, for example, the difference in the curvature G, or use some or all of the feature values. If, for a predetermined neighborhood, more than one neighboring point belonging to S3 is found, it is possible to choose the one closest to the point p1 in S1, or to compute a statistic (such as average, maximum, minimum or mode) of the feature values of the points in S3, consolidating them into the adequate dimension for the comparison with the features of the point p1 in S1.
  • In step 505, the values of the comparison δ are added to the surface S3, together with the features θ3.
  • In step 506, from the set of comparisons δ and features θ3, a value or class φ(θ3, δ) is calculated, through a calculation, selection or classification algorithm.
  • φ can be a continuous value or a class obtained through a classification algorithm.
  • c can have only two values in a binary map (white or black), or c can be a value between 0 (black) and 255 (white) in a grayscale color map.
  • c can be composed of 3 values, for example, RGB (red, green, blue) or HSV (hue, saturation, value), or it can have associated transparency values.
  • the visualization of the surface S3 is made, with its points and/or edges and/or faces colored by the value of c.
  • the color scale can be displayed by indicating regions, as in 205, through a progressive color shift scale as in 204, or composed of a two-dimensional color map. Through visualization it is possible to identify, locate and/or classify defects on the virtual surface. It is also possible to observe the variation of some calculated parameter. Finally, generated visualizations and calculated data can be saved in step 518.
  • Decisions 510-517 show a set of changes 509 that the user can make to the processing flow. Each change takes effect in the corresponding step and in the subsequent steps, so that calculations prior to the changed step are reused.
  • the reuse of calculations is a way of accelerating the processing time, which is something that is highly valued in software when working with a large amount of data.
  • the user can change the way the S3 surface is displayed.
  • One way to change the view is to show or hide points, edges or faces, as well as to change the positioning of the surface and the degree of magnification of the view.
  • In the color scale, for example, the user can change from a continuous scale to a discrete scale, or the color tones of the scale can be switched between a rainbow scale, a grayscale and an iron scale.
  • Furthermore, the scale can be inverted, its minimum and maximum values can be changed, the color of the saturated points (values above the maximum or below the minimum) can be set to the extreme value of the scale or to another color, or the scale progression can be switched between linear and logarithmic.
  • the function φ is changed.
  • the parameters of the function can be changed, or there may be a set of preset functions the user can choose from, for example, to migrate from a calculation function as in 204 to a classification view as in 205.
  • the comparison function δ is changed; for example, it can be a function with a single value or a vector of values, and it can use the difference of a specific feature or of several features.
  • the feature function θ is changed.
  • the features can be modified, for example, by including or omitting some of the wrinkles, some curvatures, or other features.
  • the type of correspondence between surface points S1 and S3 is changed.
  • the match can use the closest point or a set of close points, as well as change the criterion that defines proximity. A user would likely use this functionality to change the point-to-point comparison to a region-average comparison.
  • the alignment algorithm from S3 to S1 is changed, including the possibility of not performing alignment if it is not necessary.
  • one or both of the surfaces S1 and S2 can be replaced by a new surface for evaluation, or the same surface can be edited, for example, through smoothing, subsampling, supersampling, cropping, partial deletion, union between surfaces, translation, rotation, deformation, scaling or any other editing process.
  • FIGURE 6A shows the refraction test, for the automatic quality inspection of optical distortion in partially or fully transparent materials, which distort the light that passes through them.
  • the system contains a light source 601, the inspected material 605, an image capture apparatus 610, and an apparatus capable of processing and analyzing data 620.
  • Light projected by the light source 601 passes through the inspected material 605, is refracted due to the refractive index of the material 605, and is captured by the image capture apparatus 610.
  • the image captured by the image capture apparatus 610 is transmitted to an apparatus 620 capable of processing the captured image, identifying and classifying the defects.
  • the inspected material 605 can be glass, crystal, polymer, acrylic or any other type of material that exhibits transparency. Variations in the refractive index, thickness or shape along the material 605 cause ripple in the patterns of light stripes 621 and dark stripes 622 captured by the image capture apparatus 610 and sent to the apparatus 620.
  • Apparatus 620 contains software for image processing as well as optical distortion defect classification, as illustrated in FIGURE 6C and FIGURE 7A.
  • FIGURE 6B shows the reflection test, for the automatic quality inspection of optical distortion in fully or partially reflective materials, which distort the light that falls on them.
  • the system contains a light source 631, the inspected material 635, an image capture apparatus 640, and an apparatus capable of processing and analyzing data 650.
  • Light projected by the light source 631 is reflected specularly on the surface of the material 635, and captured by the image capture apparatus 640.
  • the image captured by the image capture apparatus 640 is transmitted to an apparatus 650 capable of processing the captured image, identifying and classifying the defects.
  • Light rays 634 from lights 632 are specularly reflected by inspected material 635, as indicated by reflected ray 636.
  • Inspected material 635 may be metal, ceramic, glass, composite, polymer, acrylic, crystal, or any other type of specularly reflective material.
  • the material 635 may be flat, curved, regular, wavy, corrugated, concave, convex, or may comprise a mixture of such shapes. Variations in the surface of the material 635, such as variations in flatness, undulations, elevations and dips, cause ripple in the patterns of light stripes 651 and dark stripes 652 captured by the image capture apparatus 640 and sent to the apparatus 650.
  • the apparatus 650 contains image processing software, as well as the classification of reflex distortion defects, as illustrated in FIGURE 6D and FIGURE 7B.
  • Light sources 601 and 631 generate a pattern of light, e.g. creating parallel lines of light 602 or 632 on a black background 603 or 633.
  • Lighting can be created by: a set of fluorescent tubes; tubular LED lamps; or a screen illuminated by a projector or laser; or an LCD, plasma, OLED or LED monitor; or a set of lamps that have in front of them a sheet of material that alternates between translucent regions and black matte regions; or any device capable of creating a pattern of light and shadow.
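For simulation or testing, such a light/shadow pattern can be synthesized as an image array; the sketch below is illustrative, with assumed dimensions and band widths:

```python
import numpy as np

def stripe_pattern(height, width, band_px):
    """Synthesize a pattern of parallel light lines (255) alternating with
    dark bands (0), each band_px pixels wide, like the patterns of sources
    601/631 (illustrative; any light/shadow pattern device would serve)."""
    cols = (np.arange(width) // band_px) % 2   # 0 = light band, 1 = dark
    row = np.where(cols == 0, 255, 0).astype(np.uint8)
    return np.tile(row, (height, 1))

img = stripe_pattern(4, 8, 2)                  # 4x8 image, 2-pixel bands
```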
  • the image capture devices 610 and 640 can be, for example, a video and/or photographic camera, infrared camera, ultraviolet camera or any set of electromagnetic sensors capable of capturing an image.
  • the frequency range of the electromagnetic waves emitted by the light apparatus and captured by the image capture apparatus is preferably in the visible spectrum, however, it can be comprised in the infrared or ultraviolet spectrum.
  • Devices 620 and 650 are, for example, a computer, mobile device, microprocessor or any other device capable of processing and analyzing data.
  • FIGURE 6A presents a form of automating the measurement of the optical distortion level due to the variation of the refractive index along the part, 605 being a flat glass, replacing the subjective verification currently carried out in the glass making industry, which is visual inspection by a human inspector.
  • FIGURE 6B shows a way of automating the measurement of the specular reflection distortion level on the surface along the part, replacing the subjective verification currently performed in the glass making industry for application to mirrors, which is visual inspection by a human inspector.
  • a possible implementation of the system in FIGURE 6A follows the dimensions of the Brazilian float glass inspection standard (ABNT NBR NM 294, 2004): the distance between the light source 601 and the inspected material 605 is 4.5 meters; the distance between the inspected material 605 and the image capture apparatus 610 is 4.5 meters; the light source 601 has light bands 25 millimeters wide and dark bands 25 millimeters wide; and, for example, screen 601 consists of a translucent white background crossed by parallel black bands and backlit with fluorescent lamps.
  • the inspected material 605 has a width (on the W axis) between 0.8 and 0.9 meters and comprises a quarter cut of the glass produced at the factory, which has a total width between 3.2 and 3.6 meters.
  • the main indication of defect in the sample is the distortion of the fringe position outside a trend line.
  • a possible implementation of the system of FIGURE 6B contains a light source 631 with light strips 25 millimeters wide and dark stripes 25 millimeters wide; and, for example, screen 631 consists of a translucent white background crossed by parallel black bands and backlit with fluorescent lamps.
  • the piece is placed horizontally on a table, in such a way that, if the cut of the glass is one of the two quarters of the edge of the produced glass, the edge of the glass is in the part closest to the capture device 640 and, consequently, appears in the bottom part of the image.
  • FIGURE 6C shows two possible examples of images captured by the image capture apparatus 610 .
  • a material is positioned between two stops 614 and 615, on a support 613, which rotates about an axis 616.
  • Material 605 is inspected at different rotation positions of the angle α in the XZ plane.
  • material 605 is a 3 millimeter thick clear flat glass, and the angle α is 70 degrees in image 611 and 35 degrees in image 612.
  • Material 605 is delimited horizontally between edges 624 and 625, where a vertical line from its beginning and end can be seen.
  • the illumination contains diagonal light 621 and dark 622 fringes that cross the material 605; in image 611 it is not possible to see distortions of these fringes, but in 612 they are slightly distorted, and this distortion is noticed in several fringes, in the same vertical line 623 along the material 605.
  • the region of interest 712 cropped at 611 or 612 for processing is delimited horizontally between 624 and 625 and vertically between 626 and 627.
  • FIGURE 6D shows a possible example of an image captured by the image capture device 640 and part of the processing of this image.
  • Image 641 shows a material 635, where it is possible to see the reflection of the light 632 and dark 633 fringes from the light source 631, forming the fringes in the image.
  • the image 641 is transformed through a binarization process into a binary image 642, that is, the pixels belonging to the light fringes 651 are separated from the rest of the image, which has a darker tone 652.
  • the extraction of the border line between the white fringes 651 and the dark background 652 is performed, resulting in image 643 with curves 653.
  • a trend line 654 is calculated and, at each point of the curve 653, the distance between the point and the trend line is considered as the distortion.
  • the greatest distortion is indicated with the value D.
  • the first point on the curve 653 with distortion above a minimum threshold z is denoted as the start of distortion 655, and the last point with distortion above the minimum threshold z is denoted as the end of distortion 656.
  • Length C is the distance between 655 and 656, and height A is measured between the bottom of curves 653 and point 655.
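The indicators D, C and A of FIGURE 6D can be sketched numerically; the polynomial degree of the trend line and the function name are assumptions of this illustration:

```python
import numpy as np

def distortion_metrics(y, x_curve, z, deg=1):
    """Distortion indicators of a fringe border curve x(y): D (maximum
    distance to a polynomial trend line), C (distance between the first
    and last points with distortion above the threshold z) and A (height
    of the distortion start above the bottom of the curve). Sketch only."""
    y = np.asarray(y, dtype=float)
    x = np.asarray(x_curve, dtype=float)
    trend = np.polyval(np.polyfit(y, x, deg), y)   # trend line 654
    dist = np.abs(x - trend)                        # distortion per point
    D = float(dist.max())
    above = np.where(dist > z)[0]
    if len(above) == 0:
        return D, 0.0, 0.0
    start, end = above[0], above[-1]                # points 655 and 656
    C = float(abs(y[end] - y[start]))               # length of distortion
    A = float(y[start] - y.min())                   # height above the bottom
    return D, C, A

# Example: a flat fringe border with a localized bump
y = np.arange(11.0)
x = np.zeros(11); x[4:7] = [0.5, 1.0, 0.5]
D, C, A = distortion_metrics(y, x, z=0.25)
```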
  • FIGURE 7A shows the refraction test processing flow.
  • the flow starts by positioning the sample 605 (step 701) on the support 613.
  • the shaft 616 containing the sample 605 is rotated through an angle α.
  • the image is captured by the device 610 and the angle α is read.
  • the angle reading can be done by an encoder, goniometer, potentiometer or any device capable of measuring an angle.
  • a decision is made as to whether the rotation has reached its end and, if not, the flow loops back, continuing to rotate the sample 703 and to capture the image and read the angle 704.
  • the angle α of the sample can vary from 75 to 15 degrees, with an interval of 1 degree between each captured image.
  • In step 705, processing of the captured images is performed, resulting in the identification of the image, and of the corresponding angle, with the least visible distortion in the sample.
  • the sample classification is performed, based on the obtained angle 721, the internal criteria (classification thresholds) of the industry 723 and the sample data 722, for example, the thickness of the sample 605.
  • the classification, as well as the image and the angle, are displayed on an apparatus 620, and in step 708 the images and inspection data of the sample 605 are saved.
  • Image processing 705 starts by binarizing the captured image 710, which is transformed into an image with two possible pixel values.
  • the edges of the sample are identified 711 for cropping the region of interest 712.
  • the lines between the light 621 and dark 622 fringes, called ya(x) curves, are extracted.
  • For each curve ya(x), a polynomial trend line t(x) is calculated.
  • the distortion yb(x) of each curve is consolidated into a distortion yc(x), which represents the vertical distortion in the image for each column x.
  • each column x crosses approximately 16 curves and, in the consolidation of the distortions into yc(x), it is possible to disregard from the average of the distortions, for example, the half of the curves with the least distortion and the two curves with the most distortion.
  • the values of yc(x) can contain averages in the x direction, that is, consider the average of more than one column.
  • Images captured with different angles ⁇ have a different number of columns x, as can be seen in the difference in width L between images 611 and 612.
  • a distortion transformation is performed as a function of x to a function of w (on the W axis indicated in FIGURE 6A, which rotates in conjunction with sample 605).
  • This change of variable (coordinate transformation) also adjusts for the fact that the left side of the sample, which is further from the camera, appears smaller than the right side.
  • a resampling 717 of yc(x) generates a set of yd(w) values, where w varies from 0 to 1 (0 to 100% of the width) with a constant step, for example, 0.05.
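The resampling 717 can be sketched with linear interpolation; the interpolation method and function name are assumptions of this illustration:

```python
import numpy as np

def resample_width(yc, step=0.05):
    """Resample the per-column distortion yc(x) onto yd(w), with w from 0
    to 1 (0 to 100% of the width) at a constant step, so that images with
    different pixel widths can be compared (linear-interpolation sketch)."""
    x = np.linspace(0.0, 1.0, len(yc))         # normalized column positions
    w = np.arange(0.0, 1.0 + step / 2, step)   # constant-step grid
    return w, np.interp(w, x, np.asarray(yc, dtype=float))

w, yd = resample_width(np.linspace(0.0, 1.0, 200))  # a linear yc(x)
```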
  • all the distortion values yd(w) can be stacked in a matrix Y containing the yd of each image and the angle of each image, as proposed in 718.
  • these data can be smoothed, for example, by averaging the values of each row of Y with the values of the 3 rows above and below it.
  • Each row of the matrix Y is divided into center and edge regions, and for each region a maximum yd value is found.
  • the maximum distortion threshold 720 is compared with the maximum yd of each row of Y at 719, looking for the row, and the angle, that corresponds to the least visible distortion in the sample 605.
  • FIGURE 7B shows the reflection test processing flow.
  • the flow starts at 751 by positioning sample 635 horizontally.
  • the image is captured by the device 640.
  • the captured image is processed, resulting in the extraction of features from the image.
  • the sample is classified based on the obtained characteristics 781, on the industry's internal criteria (classification thresholds) 783, and on the sample data 782, for example, the thickness of sample 635.
  • the classification, as well as the image and characteristics, is displayed on an apparatus 650, and in step 757 the images and inspection data of sample 635 are saved.
  • Image processing 755 starts by binarizing the captured image 760, converting it into an image with only two possible pixel values.
  • the extremes of the sample's edge 761 are identified for cropping the region of interest 762.
  • the lines between the light 651 and dark 652 fringes, called x(y) curves, are extracted.
  • for each curve x(y), a polynomial trendline t(y) is calculated.
  • the distortion d(y) of each curve is calculated as the distance between each point of x(y) and t(y) .
  • the beginning and end of distortion are calculated on each curve, as the first and last points, respectively, with distortion d(y) above a threshold z.
  • the median of the maximum and minimum distortion point values is selected at 767, establishing the median curve used for the calculation.
  • from the median curve, the maximum distortion value D of d(y) is calculated, along with the length C, as the distance between the start and end points of the distortion, and the height A, as the vertical measurement between the start of the distortion and the bottom of the curves.
  • in step 769, the alignment, smoothing, and normalization of the curves x(y) are performed for the extraction of their derivatives s(y) in step 770.
  • the derivative s(y) is related to the tangent to the curve, as shown in 657, and should remain constant for a defect-free sample. However, s(y) can change in samples with ripples, as indicated in FIGURE 6D at the bottom of 641, 642, and 643.
  • the median of the greatest angle changes in the derivative s(y) is calculated and called the F-bend in step 771.
  • the characteristics are gathered at 772: D-distortion, A-height, C-length, and F-bend.
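The angle-sweep flow above (trendline fit per curve, consolidation into yc(x), resampling onto yd(w), stacking into the matrix Y, and selection of the least-distorted angle) can be sketched as follows. This is a minimal illustration assuming NumPy; the function names and synthetic fringe data are ours, not from the patent, and the consolidation below uses a plain mean rather than the trimmed average described above.

```python
import numpy as np

def curve_distortion(ya, degree=3):
    """Distortion yb(x) of one fringe curve: distance from a polynomial trendline t(x)."""
    x = np.arange(len(ya))
    t = np.polyval(np.polyfit(x, ya, degree), x)
    return np.abs(ya - t)

def consolidate(curves, degree=3):
    """Consolidate the per-curve distortions into a single yc(x) per column."""
    yb = np.array([curve_distortion(c, degree) for c in curves])
    return yb.mean(axis=0)  # the patent allows discarding extreme curves here

def resample(yc, step=0.05):
    """Resample yc(x) onto a normalized width axis w in [0, 1] with a constant step."""
    w = np.arange(0.0, 1.0 + 1e-9, step)
    x = np.linspace(0.0, 1.0, len(yc))
    return np.interp(w, x, yc)

def least_distorted_angle(images_curves, angles):
    """Stack yd(w) of each image into Y and pick the angle with the smallest max distortion."""
    Y = np.array([resample(consolidate(cs)) for cs in images_curves])
    max_per_row = Y.max(axis=1)
    return angles[int(np.argmin(max_per_row))]
```

Because every image is resampled onto the same w axis, the rows of Y are comparable even though images captured at different angles have different numbers of columns.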
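The reflection-test feature extraction can likewise be sketched for a single fringe curve x(y). This is a hedged sketch assuming NumPy: the function name, the polynomial degree, and the threshold z are illustrative choices, not values fixed by the patent, and the F-bend feature from the derivative s(y) is omitted for brevity.

```python
import numpy as np

def reflection_features(xy, degree=3, z=0.05):
    """Extract distortion features from one fringe curve x(y).

    Returns (D, C, A): D = maximum distortion d(y); C = length between the
    first and last points where d(y) exceeds the threshold z; A = height
    between the start of the distortion and the bottom of the curve.
    """
    y = np.arange(len(xy))
    t = np.polyval(np.polyfit(y, xy, degree), y)  # polynomial trendline t(y)
    d = np.abs(xy - t)                            # distortion d(y)
    D = float(d.max())
    above = np.flatnonzero(d > z)                 # points with distortion above z
    if above.size == 0:
        return D, 0.0, 0.0                        # no distortion above threshold
    start, end = above[0], above[-1]
    C = float(end - start)                        # length of the distorted region
    A = float(xy[start] - xy.min())               # height from start to curve bottom
    return D, C, A
```

In practice the patent computes these values on the median curve selected at 767; the same function could simply be applied to that curve.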

Landscapes

  • Engineering & Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Analytical Chemistry (AREA)
  • Biochemistry (AREA)
  • Theoretical Computer Science (AREA)
  • Pathology (AREA)
  • Immunology (AREA)
  • General Health & Medical Sciences (AREA)
  • Health & Medical Sciences (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Chemical & Material Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Quality & Reliability (AREA)
  • Automation & Control Theory (AREA)
  • Manufacturing & Machinery (AREA)
  • Human Computer Interaction (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The present patent application relates to methods and systems for the quality inspection of materials and three-dimensional surfaces in a virtual environment. More particularly, the present application relates to methods for inspecting and comparing virtual surfaces based on features extracted from images and processed so as to allow the visualization, identification, localization and/or classification of defects on virtual surfaces, or of irregularities between two or more virtual surfaces.
PCT/BR2021/050533 2020-12-04 2021-12-02 Procédés et systèmes pour l'inspection de la qualité de matériaux et de surfaces tridimensionnelles dans un environnement virtuel WO2022115928A1 (fr)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
BR102020024851-0A BR102020024851A2 (pt) 2020-12-04 2020-12-04 Métodos e sistemas para a inspeção de qualidade de materiais e de superfícies tridimensionais em ambiente virtual
BRBR1020200248510 2020-12-04

Publications (1)

Publication Number Publication Date
WO2022115928A1 true WO2022115928A1 (fr) 2022-06-09

Family

ID=81852668

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/BR2021/050533 WO2022115928A1 (fr) 2020-12-04 2021-12-02 Procédés et systèmes pour l'inspection de la qualité de matériaux et de surfaces tridimensionnelles dans un environnement virtuel

Country Status (2)

Country Link
BR (1) BR102020024851A2 (fr)
WO (1) WO2022115928A1 (fr)

Citations (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5175601A (en) * 1991-10-15 1992-12-29 Electro-Optical Information Systems High-speed 3-D surface measurement surface inspection and reverse-CAD system
US5216481A (en) * 1990-12-19 1993-06-01 Toyo Glass Co., Ltd. Method of and apparatus for inspecting transparent object for defect
JPH05164696A (ja) * 1991-12-11 1993-06-29 Nissan Motor Co Ltd 塗装面評価装置
US5307152A (en) * 1992-09-29 1994-04-26 Industrial Technology Institute Moire inspection system
US6738507B2 (en) * 2001-01-09 2004-05-18 Ford Global Technologies, Llc Apparatus and method for correlating part design geometry, manufacturing tool geometry, and manufactured part geometry
US20070206182A1 (en) * 2003-10-21 2007-09-06 Daihatsu Motor Co., Ltd. Surface Defect Inspecting Method And Device
WO2018098551A1 (fr) * 2016-12-01 2018-06-07 Autaza Tecnologia Ltda - Epp Procédé et système d'inspection automatique de la qualité de matériaux

Also Published As

Publication number Publication date
BR102020024851A2 (pt) 2022-06-21

Similar Documents

Publication Publication Date Title
US11024020B2 (en) Method and system for automatic quality inspection of materials and virtual material surfaces
Boschetto et al. Design for manufacturing of surfaces to improve accuracy in Fused Deposition Modeling
KR101242984B1 (ko) 형상 검사 방법 및 장치
US6937235B2 (en) Three-dimensional object surface shape modeling apparatus, method and program
US4792232A (en) Method and apparatus for detection of undesirable surface deformities
CN104024792A (zh) 轮胎形状检查方法以及轮胎形状检查装置
KR20080033472A (ko) 면왜곡의 측정장치 및 방법
CN101680752A (zh) 形状评价方法、形状评价装置及三维检查装置
US7649545B2 (en) Inspection system and method
CN115656182A (zh) 基于张量投票主成分分析的板材点云缺陷检测方法
Wang et al. Structured-light three-dimensional scanning for process monitoring and quality control in precast concrete production.
Petruccioli et al. Assessment of close-range photogrammetry for the low cost development of 3D models of car bodywork components
WO2022115928A1 (fr) Procédés et systèmes pour l'inspection de la qualité de matériaux et de surfaces tridimensionnelles dans un environnement virtuel
US20210407064A1 (en) Method and device for geometric analysis of a part surface
Dar et al. Field surface roughness levelling of the lapping metal surface using specular white light
CN114463317A (zh) 一种基于计算机视觉的结构原位修补3d打印方法
JPH0623992B2 (ja) 板ガラスの検査方法
US7162398B2 (en) Method for evaluating the dynamic perspective distortion of a transparent body and method for supporting the designing of a three-dimensionally curved shape of a transparent body
CN111179248A (zh) 一种透明平滑曲面缺陷识别方法及检测装置
CN116433658B (zh) 类镜面缺陷检测方法、装置、电子设备及存储介质
Kalajahi et al. On detailed deviation zone evaluation of scanned surfaces for automatic detection of defected regions
Yu et al. The defect detection method for automobile surfaces based on a lighting system with light fields
Świłło et al. Hemming Process Evaluation by Using Computer Aided Measurement System and Numerical Analysis
JPH0682090B2 (ja) 板ガラスの透視二重像のシミュレーション方法
Schimpf Objective surface inspection and semi-automated material removal for metal castings

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 21899360

Country of ref document: EP

Kind code of ref document: A1

REG Reference to national code

Ref country code: BR

Ref legal event code: B01A

Ref document number: 112023010913

Country of ref document: BR

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 21899360

Country of ref document: EP

Kind code of ref document: A1