CN116559179B - Reflective surface morphology and defect detection method and system thereof


Info

Publication number
CN116559179B
Authority
CN
China
Prior art keywords
phase
image
map
detected object
processed
Prior art date
Legal status
Active
Application number
CN202310819746.4A
Other languages
Chinese (zh)
Other versions
CN116559179A (en)
Inventor
彭余
王国安
吴伟锋
孙久春
谢国栋
郑泽鹏
黄碧华
Current Assignee
Hypersen Technologies Co ltd
Original Assignee
Hypersen Technologies Co ltd
Priority date
Filing date
Publication date
Application filed by Hypersen Technologies Co ltd
Priority to CN202310819746.4A
Publication of CN116559179A
Application granted
Publication of CN116559179B


Classifications

    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01B MEASURING LENGTH, THICKNESS OR SIMILAR LINEAR DIMENSIONS; MEASURING ANGLES; MEASURING AREAS; MEASURING IRREGULARITIES OF SURFACES OR CONTOURS
    • G01B11/00 Measuring arrangements characterised by the use of optical techniques
    • G01B11/24 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures
    • G01B11/25 Measuring arrangements characterised by the use of optical techniques for measuring contours or curvatures by projecting a pattern, e.g. one or more lines, moiré fringes on the object
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/95 Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0004 Industrial image inspection
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8854 Grading and classifying of flaws
    • G01N2021/8874 Taking dimensions of defect into account
    • G PHYSICS
    • G01 MEASURING; TESTING
    • G01N INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N21/00 Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N21/84 Systems specially adapted for particular applications
    • G01N21/88 Investigating the presence of flaws or contamination
    • G01N21/8851 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N2021/8887 Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges based on image processing techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/20 Special algorithmic details
    • G06T2207/20048 Transform domain processing
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30108 Industrial image inspection
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02E REDUCTION OF GREENHOUSE GAS [GHG] EMISSIONS, RELATED TO ENERGY GENERATION, TRANSMISSION OR DISTRIBUTION
    • Y02E10/00 Energy generation through renewable energy sources
    • Y02E10/50 Photovoltaic [PV] energy

Abstract

The invention provides a method and a system for detecting the morphology and defects of a reflective surface. The detection method comprises: acquiring an image to be processed of a detected object; determining, based on the image to be processed of the detected object, a wrapped phase map, a background map and an original modulation degree map corresponding to the image to be processed, and obtaining a high-quality modulation degree map; generating a synthetic modulation degree map based on the high-quality modulation degree map; unwrapping the wrapped phase map corresponding to the image to be processed to obtain an absolute phase map of the detected object; obtaining a gloss ratio map and a diffuse reflection map based on the synthetic modulation degree map and the background map, and determining a true phase map of the detected object based on the absolute phase map; and detecting the surface morphology of the detected object based on one or more of the synthetic modulation degree map, the gloss ratio map, the diffuse reflection map, the true phase map and the background map, so as to output a detection result for the detected object. The system implements this method.

Description

Reflective surface morphology and defect detection method and system thereof
Technical Field
The invention relates to the technical field of defect detection, and in particular to a method and a system for detecting the morphology and defects of reflective surfaces.
Background
Because detected objects in different industrial fields differ in shape and material, existing surface morphology and defect detection devices, methods and systems are difficult to use universally: hardware such as the light source and the camera must be replaced, and high-quality surface morphology information, defect distribution and defect depth information still cannot be provided. As a result, the false detection rate of surface morphology and defect detection rises, and the maintenance cost of the detection device increases.
For example, Chinese patent document No. CN200710006717.7 discloses a surface defect inspection apparatus and a surface defect inspection method. The surface defect inspection apparatus includes: a linear light source that irradiates a rotating inspection object, obliquely to the sub-scanning direction, with pattern light having stripes of different brightness; a line sensor that captures a one-dimensional image of the inspection object in the main scanning direction from the light reflected by the object irradiated with the pattern light; a phase detection unit that detects the phase change in brightness of the captured line image; and an imaging position control unit that controls the position of the line sensor according to the phase change, so as to maintain a constant relative distance between the inspection object and the line sensor. However, the inventors' study found that this apparatus and method must adjust the camera position, which makes the inspection process cumbersome and inefficient; moreover, the use of a line-scan camera and a line light source limits the extension of the apparatus to other fields.
For example, Chinese patent document No. CN201680008920.5 discloses a surface defect inspection apparatus and method for a hot-dip coated steel sheet. In this apparatus and method, the light reflected by the inspected object is captured by two separate cameras, image processing is applied to the resulting regular (specular) reflection image and diffuse reflection image, and defect locations and defect types are judged by comparing the two images with a preset threshold value. However, the inventors' study found that the inspection process must combine the images of two cameras, which increases detection and maintenance costs, and the need for a preset threshold makes the accuracy and reliability of the surface defect inspection low.
For example, Chinese patent document No. CN201680036829.4 discloses a surface defect detection device, a surface defect detection method, and a steel manufacturing method. The device and method irradiate the same inspection target area with illumination light from different directions using two or more distinguishable light sources, and detect surface defects by applying differential processing to the captured reflected-light images. They are generally used for detecting oxide scale or harmless patterns on steel surfaces. However, the inventors' study found that the placement postures of the two light sources must be corrected until certain conditions are met, and the result of the differential processing can only represent binary concave/convex information of the detected surface defects; it cannot express the continuity of the surface depth and therefore cannot identify less obvious defects (such as scratches and fingerprints). The application range of the device and method is thus limited, i.e., their universality is low.
Disclosure of Invention
The invention mainly addresses the technical problem of providing a method and a system for detecting the morphology and defects of a reflective surface. The method and system can inspect a detected object, obtain high-quality surface morphology and depth information of its surface defects, significantly reduce the false detection rate of surface defects, and improve detection efficiency.
According to a first aspect, an embodiment provides a method for detecting reflective surface morphology and defects, comprising: acquiring an image to be processed of a detected object, wherein the image to be processed is obtained by projecting structured light onto the surface of the detected object and photographing that surface; determining, based on the image to be processed, a wrapped phase map, a background map and an original modulation degree map corresponding to the image to be processed, and removing clutter signals from the original modulation degree map to obtain a high-quality modulation degree map; generating a synthetic modulation degree map based on the high-quality modulation degree map; unwrapping the wrapped phase map corresponding to the image to be processed to obtain an absolute phase map of the detected object; obtaining a gloss ratio map and a diffuse reflection map based on the synthetic modulation degree map and the background map, and determining a true phase map of the detected object based on the absolute phase map; and detecting the surface morphology of the detected object based on the synthetic modulation degree map and one or more of the gloss ratio map, the diffuse reflection map, the true phase map and the background map, so as to output a detection result for the detected object.
In an embodiment, removing the clutter signals from the original modulation degree map to obtain the high-quality modulation degree map comprises: performing a Fourier transform on the light intensity distribution corresponding to the original modulation degree map to obtain a Fourier-transformed light intensity distribution; obtaining the frequency band of the clutter signals from the Fourier-transformed light intensity distribution; filtering the Fourier-transformed light intensity distribution based on that frequency band to remove the clutter signals from the original modulation degree map; and performing an inverse Fourier transform on the filtered light intensity distribution to obtain the high-quality modulation degree map.
In an embodiment, unwrapping the wrapped phase map corresponding to the image to be processed to obtain the absolute phase map corresponding to the image to be processed comprises: acquiring a phase numerical equation that relates the wrapped phase and the absolute phase corresponding to the image to be processed; and obtaining the absolute phase map corresponding to the image to be processed by solving the phase numerical equation.
In an embodiment, obtaining the absolute phase map corresponding to the image to be processed by solving the phase numerical equation comprises:
calculating the wrapped phase φ(x,y) corresponding to the image to be processed from the phase value at each pixel (x,y) of the detected object surface, applying second-order differencing to the wrapped phase to obtain the difference ρ(x,y) of adjacent wrapped phase differences, and performing a two-dimensional discrete-domain expansion of ρ(x,y) to obtain ρ'(m,n); expanding the wrapped phase φ(x,y) by a discrete transform to obtain the mapping between φ(x,y) and its spectrum value in the discrete transform domain, substituting the transformed wrapped phase into the phase numerical equation, and then calculating the spectrum value of the wrapped phase map in the two-dimensional discrete transform domain; and applying an inverse discrete cosine transform to the calculated spectrum value to obtain the absolute phase map corresponding to the image to be processed.
In an embodiment, determining the true phase map of the detected object based on the absolute phase map comprises: fitting the spatial surface of the absolute phase map with the first 36 Zernike polynomial terms to obtain a fitted surface, the fitted surface being the phase map of the carrier signal contained in the absolute phase map; and linearly subtracting the fitted surface from the absolute phase map to remove the carrier signal, so as to obtain the true phase map corresponding to the detected object.
In an embodiment, obtaining the gloss ratio map and the diffuse reflection map based on the synthetic modulation degree map and the background map comprises: dividing the synthetic modulation degree map by the background map to obtain the gloss ratio map; and subtracting the background map from the synthetic modulation degree map to obtain the diffuse reflection map.
In an embodiment, acquiring the image to be processed of the detected object comprises: correcting the Gamma value of a structured light projection unit so that the structured light it projects is not distorted, the structured light projection unit being used to generate the structured light and project it onto the detected object; setting the frequency of the structured light according to the defect type of the surface of the detected object; and projecting the structured light onto the surface of the detected object and photographing that surface to obtain the image to be processed of the detected object.
According to a second aspect, an embodiment provides a reflective surface morphology and defect detection system, comprising: a to-be-processed-image acquisition module configured to acquire an image to be processed of a detected object, the image to be processed being obtained by projecting structured light onto the surface of the detected object and photographing that surface; an image processing module configured to determine, based on the image to be processed, a wrapped phase map, a background map and an original modulation degree map corresponding to the image to be processed, and to remove clutter signals from the original modulation degree map to obtain a high-quality modulation degree map; to generate a synthetic modulation degree map based on the high-quality modulation degree map; to unwrap the wrapped phase map corresponding to the image to be processed to obtain an absolute phase map of the detected object; and to obtain a gloss ratio map and a diffuse reflection map based on the synthetic modulation degree map and the background map and determine a true phase map of the detected object based on the absolute phase map; and an analysis and detection module configured to detect the surface morphology of the detected object based on the synthetic modulation degree map and one or more of the gloss ratio map, the diffuse reflection map, the true phase map and the background map, so as to output a detection result for the detected object.
In an embodiment, the to-be-processed-image acquisition module includes an imaging unit and a structured light projection unit. The structured light projection unit is configured to generate structured light and project it onto the surface of the detected object; the imaging unit is configured to photograph the surface of the detected object to acquire the image to be processed of the detected object.
According to a third aspect, an embodiment provides a computer-readable storage medium. The computer-readable storage medium includes a program that can be executed by a processor to implement the reflective surface morphology and defect detection method described in any of the embodiments herein.
The beneficial effects of the application include:
acquiring an image to be processed of a detected object; determining, based on the image to be processed, a wrapped phase map, a background map and an original modulation degree map corresponding to the image to be processed, and removing clutter signals from the original modulation degree map to obtain a high-quality modulation degree map; generating a synthetic modulation degree map based on the high-quality modulation degree map; unwrapping the wrapped phase map corresponding to the image to be processed to obtain an absolute phase map of the detected object; obtaining a gloss ratio map and a diffuse reflection map based on the synthetic modulation degree map and the background map, and determining a true phase map of the detected object based on the absolute phase map; and detecting the surface morphology of the detected object based on the synthetic modulation degree map and one or more of the gloss ratio map, the diffuse reflection map, the true phase map and the background map, so as to output a detection result for the detected object. In other words, the information and morphology of different defects are characterized by the synthetic modulation degree map, the gloss ratio map, the diffuse reflection map, the background map, the contrast map and the like, so that the defect regions, defect types, two-dimensional/three-dimensional morphology and corresponding depth information of the detected object can be determined comprehensively.
Drawings
FIG. 1 is a schematic overall flow chart of a reflective surface topography and defect detection method according to an embodiment;
FIG. 2 is a flowchart of an embodiment of acquiring a to-be-processed image of a detected object;
FIG. 3 is a schematic flow chart of an absolute phase diagram corresponding to an image to be processed obtained by solving a phase numerical equation according to an embodiment;
FIG. 4 is a schematic block diagram of a reflective surface topography and defect detection system according to an embodiment;
FIG. 5 is a schematic diagram of another embodiment of a reflective surface topography and defect detection system.
Detailed Description
The application will be described in further detail below with reference to the drawings by means of specific embodiments, in which like elements in different embodiments are given like reference numerals. In the following embodiments, numerous specific details are set forth in order to provide a better understanding of the present application. However, those skilled in the art will readily recognize that, in different situations, some of these features may be omitted or replaced by other elements, materials or methods. In some instances, operations related to the present application are not shown or described in the specification, in order to avoid obscuring its core parts; a detailed description of such operations is also unnecessary, since those skilled in the art can understand them fully from the description herein and from general technical knowledge.
Furthermore, the described features, operations, or characteristics of the description may be combined in any suitable manner in various embodiments. Also, various steps or acts in the method descriptions may be interchanged or modified in a manner apparent to those of ordinary skill in the art. Thus, the various orders in the description and drawings are for clarity of description of only certain embodiments, and are not meant to be required orders unless otherwise indicated.
The numbering of the components itself, e.g. "first", "second", etc., is used herein merely to distinguish between the described objects and does not have any sequential or technical meaning. The term "coupled" as used herein includes both direct and indirect coupling (coupling), unless otherwise indicated.
For a clear understanding of the technical solution of the present application, some terms will be described herein.
In structured light three-dimensional measurement, a projection device such as a projector projects a known optical pattern, such as a light spot, grating or grid, onto the surface of the detected object; the pattern is modulated by the surface, an image of the modulated pattern is captured by a device such as a camera, and the image is then decoded, so that depth information of the detected object can be obtained according to the triangulation principle. Structured light three-dimensional measurement based on sinusoidal fringe grating images uses the triangulation principle and the phase characteristics of the sinusoidal fringe grating to relate the depth information at each point of the detected object surface to the phase value at that point. According to this measurement principle, the phase of the fringe grating image captured by the camera is obtained, from which the depth information of the detected object can be recovered.
The technical scheme of the present application will be described in detail with reference to examples.
Referring to fig. 1, a method for detecting reflective surface morphology and defects includes:
step S100: acquiring an image to be processed of a detected object, the image to be processed being obtained by projecting structured light onto the surface of the detected object and photographing that surface;
step S200: determining, based on the image to be processed of the detected object, a wrapped phase map, a background map and an original modulation degree map corresponding to the image to be processed, and removing clutter signals from the original modulation degree map to obtain a high-quality modulation degree map; generating a synthetic modulation degree map based on the high-quality modulation degree map; unwrapping the wrapped phase map corresponding to the image to be processed to obtain an absolute phase map of the detected object; and obtaining a gloss ratio map and a diffuse reflection map based on the synthetic modulation degree map and the background map, and determining a true phase map of the detected object based on the absolute phase map;
step S300: detecting the surface morphology of the detected object based on the synthetic modulation degree map and one or more of the gloss ratio map, the diffuse reflection map, the true phase map and the background map, so as to output a detection result for the detected object.
When structured light three-dimensional measurement is used to measure the detected object, a phase-shift method is generally adopted: several structured light patterns (such as sinusoidal fringe grating images) are projected onto the surface of the detected object, an imaging unit (such as a camera) captures the structured light modulated by the surface (i.e., the images to be processed), and the images to be processed are then processed and computed to obtain information such as the absolute phase at each point of the surface. The image to be processed is therefore the image (such as a deformed fringe image) captured by the imaging unit after the structured light (such as a sinusoidal fringe grating image) projected onto the detected object has been modulated by the surface topography information (such as the surface height) of the object. It will be appreciated that the structured light may be a sinusoidal fringe grating image, or a periodic fringe grating image of rectangular, sawtooth or other form.
In some embodiments, the structured light projected onto the surface of the inspected object is a sinusoidal fringe grating image. It is understood that the structured light projected onto the surface of the object to be detected may be a laser stripe, a gray code, or the like.
In some embodiments, several images to be processed may be obtained for one detected object. For example, if three-dimensional reconstruction is performed using the four-step phase-shift principle, four grating images (i.e., images to be processed) are obtained for one detected object, and the subsequent computation of the wrapped phase map, the background map, the original modulation degree map and so on is performed based on these four images.
In some embodiments, the detection result of the detected object includes: one or more of defect area, defect type, two-dimensional/three-dimensional morphology and depth information of the detected object.
In some embodiments, please refer to fig. 2, in the step S100: acquiring an image to be processed of a detected object, comprising:
step S110: correcting a Gamma value of a structured light projection unit to prevent the structured light projected by the structured light projection unit from generating distortion, wherein the structured light projection unit is used for generating the structured light and projecting the structured light to the detected object;
step S120: setting the frequency of the structured light according to the defect type of the surface of the detected object;
step S130: and projecting the structured light to the surface of the detected object, and shooting the surface of the detected object to obtain an image to be processed of the detected object.
When the structured light projected onto the surface of the detected object is a sinusoidal fringe grating image, the input of the structured light projection unit is a fringe pattern with an ideal sinusoidal waveform. However, the light source of the projection unit does not reach the brightness values of an ideal sinusoid over its emitting area, and the light it emits is refracted by the diffusion plate inside the unit and is therefore slightly non-uniform. As a result, the sinusoidal fringe grating image output by the projection unit is distorted, producing distorted fringes, and periodic high-frequency harmonics subsequently appear in the background map, the modulation degree map and other derived maps. The structured light projection unit therefore needs Gamma value correction in advance, so that the real waveform of the sinusoidal fringe grating image it emits is closer to the ideal sinusoid, i.e., so that the emitted sinusoidal fringe grating image is not distorted. The light intensity distribution of the distorted fringes can be approximated as:
I'_n(x,y) = [I_n(x,y)]^γ = A'(x,y) + Σ_{k≥1} B_k(x,y)·cos{k·[φ(x,y) + 2πn/N]}    (1),
where, in formula (1), N is the total number of images to be processed of the detected object, n is the index of the image to be processed, A'(x,y) is the background light intensity of the image to be processed, B_k(x,y) is the amplitude of the kth harmonic, I_n(x,y) is the light intensity of the ideal waveform, and γ is the unknown coefficient to be solved.
The Gamma value can be obtained by projecting several sinusoidal fringe grating images onto the surface of the detected object, capturing the grating images modulated by the surface (i.e., the images to be processed) with the imaging unit, and fitting the exponential model of formula (1) to the resulting images. An inverse transformation then yields the corrected sinusoidal fringe input for the structured light projection unit.
It should be noted that, since the Gamma value correction performed on the structured light projection unit that generates the structured light is common knowledge in the art, a detailed description of the specific process of the Gamma value correction is omitted here.
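Purely for illustration, a Gamma estimate of the kind described above might be obtained with a simple power-law fit between captured and ideal fringe intensities, after which the projector input is pre-distorted with the inverse exponent. The single-exponent model, the log-domain least-squares fit and all function names below are assumptions for this sketch, not the application's prescribed correction procedure.

```python
import numpy as np

def estimate_gamma(captured, ideal, eps=1e-6):
    """Fit a single exponent gamma such that captured ≈ ideal ** gamma.

    Both inputs are normalized intensities in (0, 1]; a log-log least-squares
    fit gives the exponent of the assumed power law.
    """
    c = np.clip(np.asarray(captured, float).ravel(), eps, 1.0)
    i = np.clip(np.asarray(ideal, float).ravel(), eps, 1.0)
    # log(c) = gamma * log(i)  ->  gamma = <log c, log i> / <log i, log i>
    return float(np.dot(np.log(c), np.log(i)) / np.dot(np.log(i), np.log(i)))

def precorrect_fringes(ideal, gamma):
    """Pre-distort the projector input with the inverse exponent so that the
    projected fringes come out close to the ideal sinusoid."""
    return np.clip(ideal, 0.0, 1.0) ** (1.0 / gamma)

# Illustrative use: a gamma of 2.2 distorts an ideal sinusoid; the exponent
# recovered from the distorted fringes is then used to pre-correct the input.
x = np.linspace(0, 4 * np.pi, 1000)
ideal = 0.5 + 0.5 * np.cos(x)
captured = ideal ** 2.2
gamma = estimate_gamma(captured, ideal)
corrected_input = precorrect_fringes(ideal, gamma)
```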
In some embodiments, the sinusoidal fringe grating image projected onto the surface of the inspected object may be a transverse sinusoidal fringe grating image, which may also be a longitudinal sinusoidal fringe grating image, which may also be an oblique sinusoidal fringe grating image.
In some embodiments, a corresponding program may be used to control the brightness of the structured light emitted by the structured light projection unit in a particular direction. The light source in the projection unit then shows alternating bright and dark areas (such as vertical stripes). The bright and dark pattern on the object placement surface (i.e., the reference plane), for example a longitudinal sinusoidal fringe grating image, is shifted transversely and periodically N times (N ≥ 2), and the imaging unit photographs the surface of the detected object once after each shift, thereby obtaining an N-step phase-shift sequence for the vertical fringes, where N is the number of shifts (i.e., N images to be processed are obtained). Similarly, an N-step phase-shift sequence for the horizontal fringes can be obtained by performing the same operation with a horizontal sinusoidal fringe grating image.
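As an illustration of generating such an N-step sequence for the projection unit, the sketch below builds N phase-shifted longitudinal (vertical-stripe) sinusoidal fringe images; the resolution, fringe period and 8-bit quantization are assumed values, not ones prescribed by the application.

```python
import numpy as np

def make_phase_shifted_fringes(width, height, period_px, n_steps, vertical=True):
    """Generate n_steps sinusoidal fringe grating images, each shifted by
    2*pi/n_steps, to be sent to the structured light projection unit."""
    axis = np.arange(width) if vertical else np.arange(height)
    frames = []
    for n in range(1, n_steps + 1):
        phase = 2 * np.pi * axis / period_px + 2 * np.pi * n / n_steps
        line = 0.5 + 0.5 * np.cos(phase)                     # ideal sinusoid in [0, 1]
        img = np.tile(line, (height, 1)) if vertical else np.tile(line[:, None], (1, width))
        frames.append(np.round(255 * img).astype(np.uint8))  # 8-bit projector input
    return frames

# e.g. a four-step sequence of vertical fringes for an assumed 1920x1080 projector
fringes = make_phase_shifted_fringes(1920, 1080, period_px=64, n_steps=4)
```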
In some embodiments, if the defect types of interest on the surface of the detected object are bumps, scratches and other prominent defects, a sinusoidal fringe grating image with a large period and a low frequency can be used; if the defect types are inconspicuous flaws such as stains, a sinusoidal fringe grating image with a high frequency can be used. Setting the frequency of the structured light according to the defect type of the surface therefore improves how finely the structured light captures the surface detail of the detected object.
It is well known that, for a common N-step phase shift with a constant step, the light intensity distribution of the nth image to be processed captured by the imaging unit is:
I_n(x,y) = A(x,y) + B(x,y)·cos[φ(x,y) + 2πn/N]    (2),
where, in formula (2), (x,y) are the pixel coordinates of the image to be processed, A(x,y) is the background light intensity of the measurement environment, B(x,y) is the amplitude (i.e., the modulation degree) of the modulated sinusoidal fringe grating image, φ(x,y) is the phase value at pixel (x,y) of the detected object surface, and n = 1, 2, 3, …, N.
Since the phase value φ(x,y) at each pixel of the image to be processed contains the depth information of the detected object, projecting three or more sinusoidal fringe grating images with different phase offsets onto the surface allows the unknown variables (such as the background light intensity and the modulation degree) to be determined; the distribution of the phase values over the captured images to be processed (i.e., the wrapped phase) can then be calculated, and from it detection results such as the depth information and three-dimensional contour of the detected object can be obtained. The specific steps are as follows:
First, the background map of the image to be processed is calculated. The background light intensity is:
A(x,y) = (1/N)·Σ_{n=1}^{N} I_n(x,y)    (3),
where, in formula (3), N is the total number of images to be processed corresponding to the detected object, n is the index of the image to be processed, and I_n(x,y) is the light intensity distribution of the nth image to be processed. Formula (3) shows that the background light intensity is obtained by summing the light intensity distributions of all the images to be processed and taking the average; the background map of the image to be processed then follows from the background light intensity.
Second, the original modulation degree of the image to be processed is calculated, and the original modulation degree map is obtained from it. From the expression for the light intensity distribution of the nth captured image, the modulation degree at pixel (x,y) can be derived as:
B(x,y) = (2/N)·{[Σ_{n=1}^{N} I_n(x,y)·sin(2πn/N)]² + [Σ_{n=1}^{N} I_n(x,y)·cos(2πn/N)]²}^{1/2}    (4),
so the original modulation degree map of the image to be processed can be determined from the calculated modulation degree.
Next, from the expression for the light intensity distribution of the nth captured image, the phase value at pixel (x,y) of the detected object surface can be derived as:
φ(x,y) = arctan{−[Σ_{n=1}^{N} I_n(x,y)·sin(2πn/N)] / [Σ_{n=1}^{N} I_n(x,y)·cos(2πn/N)]},  −π < φ(x,y) < π    (5),
so the wrapped phase can be calculated from formula (5). Once the phase values are determined, the wrapped phase map corresponding to the image to be processed is obtained.
The clutter signals in the original modulation degree map are removed by combining a clutter signal extraction algorithm with the correction and compensation of the sinusoidal fringe grating image, so as to obtain the high-quality modulation degree map, whose definition is markedly better than that of the original modulation degree map.
It is known that the height H of a point on the surface of the detected object and the phase difference Δφ at that point, measured before and after the detected object is placed on the reference plane, satisfy the following relationship:
H=(Δφ×L)/(2πfd+Δφ) (6),
where, in formula (6), L is the distance between the structured light projection unit and the object placement surface (i.e., the reference plane), f is the frequency of the structured light projected by the projection unit, and d is the horizontal distance between the imaging unit and the projection unit. L and d are fixed and can be obtained by system calibration. After the phase value at a point (x,y) of the detected object surface has been calculated as above, the phase difference Δφ at that point before and after the object is placed can be obtained; from this phase difference, the depth information at that point of the surface can be obtained indirectly, and the surface morphology of the detected object can thus be recovered.
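A minimal illustration of formula (6); the calibration values used below are made-up placeholders, since L, f and d come from system calibration in practice.

```python
import numpy as np

def height_from_phase(delta_phi, L, f, d):
    """Formula (6): H = (delta_phi * L) / (2*pi*f*d + delta_phi).

    delta_phi : phase difference before/after placing the object (radians)
    L         : projector-to-reference-plane distance (calibrated)
    f         : spatial frequency of the projected fringes
    d         : horizontal camera-to-projector distance (calibrated)
    """
    return delta_phi * L / (2 * np.pi * f * d + delta_phi)

# Hypothetical calibration values, for illustration only.
H = height_from_phase(delta_phi=0.3, L=500.0, f=0.1, d=120.0)
```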
In some embodiments, in step S200, removing the clutter signals from the original modulation degree map to obtain the high-quality modulation degree map includes: performing a Fourier transform on the light intensity distribution corresponding to the original modulation degree map to obtain a Fourier-transformed light intensity distribution; obtaining the frequency band of the clutter signals from the Fourier-transformed light intensity distribution; filtering the Fourier-transformed light intensity distribution based on that frequency band to remove the clutter signals from the original modulation degree map; and performing an inverse Fourier transform on the filtered light intensity distribution to obtain the high-quality modulation degree map. The light intensity distribution corresponding to the original modulation degree map may be expressed as:
I_n(x,ω) = A(x,ω) + B(x,ω)·cos(ω₀x + δ_n)    (7),
where, in formula (7), ω₀ is the spatial angular frequency of the structured light projected onto the detected object, ω is the spatial angular frequency of the interference fringes, x is the abscissa of the image coordinate system, and δ_n = 2πn/N is the phase-shift value. Performing a Fourier transform on the light intensity distribution corresponding to the original modulation degree map yields the Fourier-transformed light intensity distribution:
I_n(ω) = A·δ(ω) + B·π·exp(jδ_n)·[δ(ω + ω₀) + δ(ω − ω₀)]    (8),
where, in formula (8), j denotes the imaginary axis of the complex plane after the Fourier transform. The Fourier-transformed light intensity distribution represents the frequency content of the image signal: bright spots of different intensity reflect the degree of difference between a point and its neighborhood, and the purpose of the transform is to obtain the features and locations of changes in the image.
The higher-order harmonics (i.e., the kth-order harmonics) in the original modulation degree map can thus be identified from the Fourier-transformed light intensity distribution; their frequencies are integer multiples of the frequency of the original sinusoidal fringe grating, which gives the frequency band of the clutter signals.
Because the clutter signals in the original modulation degree map lie between the low-frequency and high-frequency components, a band-pass filter can be used to filter the clutter between the fundamental component and the high-frequency component; an inverse Fourier transform then yields the high-quality modulation degree map free of clutter signals. In practice, the band-pass filter that passes the in-band signal of a circular ring centered on the frequency origin is radially symmetric. Its transfer function may be:
H(u,v) = 1 if D₀ − W/2 ≤ D(u,v) ≤ D₀ + W/2, and H(u,v) = 0 otherwise    (9),
where D(u,v) = [(u − P/2)² + (v − Q/2)²]^{1/2}. In formula (9), W is the width of the annular band and D₀ is the center frequency of the ring; D(u,v) is the Euclidean distance from (u,v) to the center of the spectrum; P and Q are the width and height of the image, respectively. Choosing different values of D₀ and W changes the band of frequencies that is filtered.
It can be seen that, in some embodiments, on top of the Gamma value correction of the structured light projection unit, the above method can further be applied: a Fourier transform is performed on the light intensity distribution corresponding to the original modulation degree map to obtain the Fourier-transformed light intensity distribution; the frequency band of the clutter signals is obtained from that distribution; the residual periodic high-frequency harmonics (i.e., the clutter signals) are filtered out of the original modulation degree map with the band-pass filter based on that frequency band; and an inverse Fourier transform of the filtered distribution finally yields the high-quality modulation degree map with the clutter signals removed.
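One possible realization of this filtering step with NumPy's 2-D FFT is sketched below. Treating the operation as zeroing the ring-shaped band that contains the harmonic clutter is an interpretation of formula (9), and the center radius D0 and width W are assumed to be chosen from the harmonic band observed in the transformed modulation map; none of the names or values below come from the application.

```python
import numpy as np

def suppress_clutter_band(image, D0, W):
    """Remove a ring-shaped band of spatial frequencies from `image`.

    The ring is centered on radius D0 (in pixels of the centered spectrum)
    with width W, following the D(u, v) / D0 / W notation of formula (9).
    """
    Q, P = image.shape                           # height, width
    F = np.fft.fftshift(np.fft.fft2(image))      # centered spectrum
    v, u = np.mgrid[0:Q, 0:P]
    D = np.sqrt((u - P / 2.0) ** 2 + (v - Q / 2.0) ** 2)
    ring = (D >= D0 - W / 2.0) & (D <= D0 + W / 2.0)
    F[ring] = 0.0                                # zero the clutter band
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

# e.g. high_quality_mod = suppress_clutter_band(raw_modulation_map, D0=64, W=8)
```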
Existing phase retrieval methods produce phase truncation and jumps at flaws on the surface of the detected object (such as shadows, cracks and holes), which affects the accuracy of the phase result. The present application therefore solves, by means of a discrete transform, a phase numerical equation relating the wrapped phase and the absolute phase (the unwrapped phase) to obtain a continuous phase result (i.e., the absolute phase map), thereby avoiding phase jumps and truncation and better recovering the surface morphology of the detected object.
In some embodiments, in step S200, unwrapping the wrapped phase map corresponding to the image to be processed to obtain the absolute phase map corresponding to the image to be processed includes:
step S210: acquiring a phase numerical equation between the wrapped phase and the absolute phase;
step S220: obtaining the absolute phase map (i.e., the unwrapped phase image) corresponding to the image to be processed by solving the phase numerical equation.
The phase numerical equation between the wrapped phase and the absolute phase is pre-constructed. The phase numerical equation between the wrapped phase and the absolute phase can be expressed as:
(Δ²/Δx²)·φ(x,y) + (Δ²/Δy²)·φ(x,y) = ρ(x,y)    (10),
ρ (x, y) in the above equation (10) is the difference between adjacent wrapped phase differences.
In some embodiments, referring to fig. 3, step S220, obtaining the absolute phase map corresponding to the image to be processed by solving the phase numerical equation, includes the following steps:
step S221: calculating the wrapped phase φ(x,y) corresponding to the image to be processed from the phase value at each pixel (x,y) of the detected object surface, applying second-order differencing to the wrapped phase to obtain the difference ρ(x,y) of adjacent wrapped phase differences, and performing a two-dimensional discrete-domain expansion of ρ(x,y) to obtain ρ'(m,n);
step S222: expanding the wrapped phase φ(x,y) by a discrete transform to obtain the mapping between φ(x,y) and its spectrum value φ'(m,n) in the discrete transform domain, substituting the transformed wrapped phase into the phase numerical equation, and then calculating the spectrum value φ'(m,n) of the wrapped phase map in the two-dimensional discrete transform domain;
step S223: performing an inverse discrete cosine transform on the calculated spectrum value φ'(m,n) to obtain the absolute phase map corresponding to the image to be processed.
In the present application, the discrete cosine transform is introduced to solve the phase numerical equation in expanded form and obtain the spectrum value φ'(m,n) of the wrapped phase map in the two-dimensional discrete transform domain; subsequent calculations are then performed on φ'(m,n). The discrete transform handles the conversion from the continuous domain to the discrete domain. The expanded wrapped phase φ(x,y) can be expressed as:
φ(x,y) = (4/(M·N))·Σ_{m=0}^{M−1} Σ_{n=0}^{N−1} W₁(m)·W₂(n)·φ'(m,n)·cos[πm(2x+1)/(2M)]·cos[πn(2y+1)/(2N)]    (11),
W₁(m) = 1/2, m = 0    (12),
W₁(m) = 1, 1 ≤ m ≤ M−1    (13),
W₂(n) = 1/2, n = 0    (14),
W₂(n) = 1, 1 ≤ n ≤ N−1    (15),
where, in formula (11), M is the height of the wrapped phase map, N is its width, and φ'(m,n) is the spectrum value of the wrapped phase map at (m,n); also in formula (11), 0 ≤ x ≤ M−1 and 0 ≤ y ≤ N−1.
It can be seen that the wrapped phase φ(x,y) is expanded by the discrete transform to obtain the mapping between φ(x,y) and its discrete cosine transform spectrum value φ'(m,n).
In step S221, the wrapped phase φ(x,y) corresponding to the image to be processed is calculated from the phase value at each pixel (x,y) of the detected object surface; second-order differencing of the wrapped phase gives the difference ρ(x,y) of adjacent wrapped phase differences, and the two-dimensional discrete-domain expansion of ρ(x,y) gives ρ'(m,n).
In step S222, the wrapped phase φ(x,y) is expanded by the discrete transform to obtain the mapping between φ(x,y) and the spectrum value φ'(m,n) in the discrete transform domain; the transformed wrapped phase and ρ'(m,n) are substituted into the phase numerical equation, and the spectrum value φ'(m,n) of the wrapped phase in the two-dimensional discrete transform domain is then:
φ'(m,n) = ρ'(m,n) / {2·[cos(πm/M) + cos(πn/N) − 2]}    (16).
In step S223, an inverse discrete cosine transform is applied to the calculated spectrum value φ'(m,n), and the unwrapped phase values are obtained. These unwrapped phase values form the absolute phase map.
In some embodiments, in step S200, removing the carrier signal from the absolute phase map to obtain the true phase map includes:
fitting the spatial surface of the absolute phase map with the first 36 Zernike polynomial terms to obtain a fitted surface, the fitted surface being the phase map of the carrier signal;
and linearly subtracting the fitted surface from the absolute phase map to remove the carrier signal, so as to obtain the true phase map corresponding to the detected object.
The true phase map represents the height information of the corresponding points in the image to be processed. Fitting the spatial surface of the absolute phase map with the first 36 Zernike polynomial terms to obtain the fitted surface amounts to extracting the carrier signal from the absolute phase map with a numerical fitting algorithm; the fitted surface is the phase map of the carrier signal.
After phase retrieval and unwrapping of the interferogram group (such as the images to be processed), a tilted wavefront is obtained, and the relevant information of this wavefront cannot be seen directly by eye. An ideal wavefront closest to the wavefront to be solved therefore needs to be fitted with Zernike polynomials; from the fitted wavefront, the related aberration information and the like can be obtained, and the true phase map can be derived.
In actual imaging, the non-ideal characteristics of an optical imaging system deviate the path of light rays as they pass through each surface of the system, forming various aberrations that cause image blurring, dimensional changes, shape abnormalities and the like in the result. Aberration is an important index for evaluating the imaging quality of an optical system, and its correction is a critical step in optical system design. Aberrations are generally divided into seven types: spherical aberration, coma, distortion, field curvature, astigmatism, longitudinal (positional) chromatic aberration and lateral (magnification) chromatic aberration. Each type has its own causes and characteristics, so a mathematical model can be established from the characteristic analysis of the aberrations and used as an aberration function, which can then be expanded into a power series or a set of orthogonal polynomials. Since, in an optical system without rotational symmetry, the aberration function contains not only cos(mθ) terms but also sin(mθ) terms, it can be expanded in terms of the orthogonal Zernike unit-circle polynomials Z_j(r,θ):
W(r,θ) = Σ_j a_j·Z_j(r,θ)    (17),
where, in formula (17), W(r,θ) is the fitted surface, a_j is the coefficient of each orthogonal Zernike unit-circle polynomial term, r is the distance from the origin, and θ is the angle relative to the origin. In the polar coordinate system, the orthogonal Zernike unit-circle polynomial Z_j(r,θ) can be divided into three parts:
Z_j(r,θ) = R_n^m(r)·cos(mθ) for m > 0;  Z_j(r,θ) = R_n^|m|(r)·sin(|m|θ) for m < 0;  Z_j(r,θ) = R_n^0(r) for m = 0    (18),
where, in formula (18), Z_j(r,θ) is the jth Zernike mode, 0 ≤ r ≤ 1 and 0 ≤ θ ≤ 2π; m and n are, respectively, the angular and radial orders of the polynomial, with m ≤ n and n − m even; the radial polynomial R_n^m(r) is defined as:
R_n^m(r) = Σ_{s=0}^{(n−m)/2} [(−1)^s·(n−s)! / (s!·((n+m)/2 − s)!·((n−m)/2 − s)!)]·r^{n−2s}    (19).
Let the size of the fitted surface be ROW × COL, where ROW is the number of rows and COL is the number of columns. To simulate the 3D pattern of the aberration function, the required grid data must first be generated:
first, a row of X coordinates is set, with values ranging from −1 to 1; it is divided into COL equally spaced values, this row of data is copied into ROW rows, and the X matrix is established;
then, a column of Y coordinates is set, with values ranging from 1 to −1; it is divided into ROW equally spaced values, this column of data is copied into COL columns, and the Y matrix is established;
next, the X and Y matrices are combined and the corresponding Cartesian coordinate system is converted into a polar coordinate system, giving two further matrices R and Th (the grid data of r and θ, respectively), whose expressions are:
R(i,j) = [X(i,j)² + Y(i,j)²]^{1/2}    (20),
Th(i,j)=arctan(Y(i,j)/X(i,j)) (21)。
wherein X (i, j) represents the data of the ith row and the jth column of the X matrix; y (i, j) represents the data of the ith row and the jth column of the Y matrix.
For each aberration function, according to the polynomial expressions in the Zernike polynomial table, the r and θ values corresponding to each pixel in the R and Th matrices are combined to obtain the value of the aberration function at the corresponding point, giving a matrix of size ROW × COL for that aberration term. Each of the 36 aberration matrices is then reshaped from ROW × COL into a new (ROW·COL) × 1 column vector I, the data of each row of the original matrix being placed into the new vector in row order as column data. The 36 reshaped vectors are assembled, in polynomial order, into a Zernike matrix Z of size (ROW·COL) × 36. The 36 Zernike coefficients can then be calculated by solving the matrix equation, where the unknown matrix X has size 36 × 1:
ZX = I    (22).
The meaning of formula (22) can be understood as follows: for any pixel of the whole map (i.e., the absolute phase map), multiplying the 36 polynomial values at that point by the corresponding coefficients and summing them gives approximately the value of that point in the unwrapped phase map to be fitted; the fitted surface is obtained in this way.
In some embodiments, the fitted surface is linearly subtracted from the absolute phase map to remove the carrier signal and obtain the true phase map corresponding to the detected object surface. The first Zernike term is translation (piston), and the second and third terms are tilt along the x and y directions respectively; subtracting from the absolute phase map the sum of the products of the first three Zernike term matrices and their corresponding coefficients yields a true topography map (i.e., the true phase map) free of tilt and translation.
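For illustration, the least-squares solution of ZX = I in formula (22) and the subsequent carrier removal can be sketched as follows. The basis here uses only a handful of low-order Zernike-like terms written in Cartesian/polar form instead of the application's full 36-term table, and the grid construction mirrors the X/Y/R/Th matrices described above; these simplifications, and all names below, are assumptions made for brevity.

```python
import numpy as np

def low_order_zernike_basis(M, N, n_terms=6):
    """Build a few low-order Zernike-like terms (piston, x/y tilt, defocus,
    astigmatism) sampled on an M x N grid over the unit square/disk.
    This is a simplified stand-in for the 36-term Zernike table."""
    y, x = np.mgrid[1:-1:M * 1j, -1:1:N * 1j]     # Y runs 1 -> -1, X runs -1 -> 1
    r = np.sqrt(x ** 2 + y ** 2)
    th = np.arctan2(y, x)
    terms = [np.ones_like(r),              # piston (translation)
             r * np.cos(th),               # tilt in x
             r * np.sin(th),               # tilt in y
             2 * r ** 2 - 1,               # defocus
             r ** 2 * np.cos(2 * th),      # astigmatism 0/90
             r ** 2 * np.sin(2 * th)]      # astigmatism 45
    return np.stack([t.ravel() for t in terms[:n_terms]], axis=1)   # (M*N, n_terms)

def remove_carrier(absolute_phase, n_terms=6):
    """Fit the absolute phase map with the basis (ZX = I in a least-squares
    sense) and subtract the fitted surface to obtain the true phase map."""
    M, N = absolute_phase.shape
    Z = low_order_zernike_basis(M, N, n_terms)
    coeffs, *_ = np.linalg.lstsq(Z, absolute_phase.ravel(), rcond=None)
    fitted = (Z @ coeffs).reshape(M, N)
    return absolute_phase - fitted, coeffs
```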
In some embodiments, step S200, obtaining the gloss ratio map and the diffuse reflection map based on the synthetic modulation degree map and the background map, includes:
dividing the synthetic modulation degree map by the background map to obtain the gloss ratio map;
and subtracting the background map from the synthetic modulation degree map to obtain the diffuse reflection map.
The structured light projected onto the surface of the object to be detected by the structured light projection unit may be a horizontal sinusoidal stripe grating image, a vertical sinusoidal stripe grating image, or an oblique sinusoidal stripe grating image, and therefore, the high-quality modulation degree map may be classified into a high-quality modulation degree map corresponding to the horizontal sinusoidal stripe grating image, a high-quality modulation degree map corresponding to the vertical sinusoidal stripe grating image, and a high-quality modulation degree map corresponding to the oblique sinusoidal stripe grating image.
In some embodiments, the high-quality modulation degree map corresponding to the horizontal sinusoidal stripe grating image, the high-quality modulation degree map corresponding to the vertical sinusoidal stripe grating image, and the high-quality modulation degree map corresponding to the inclined sinusoidal stripe grating image may be generated into a composite modulation degree map by a linear addition method. For example, the high-quality modulation degree map corresponding to the horizontal sinusoidal fringe grating image may be linearly added to the high-quality modulation degree map corresponding to the vertical sinusoidal fringe grating image to generate the first composite modulation degree map, or the like.
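A minimal sketch of this linear addition, assuming the per-orientation high-quality modulation degree maps are available as equal-sized arrays (the uniform weights are an assumption; the text only specifies linear addition):

```python
import numpy as np

def synthesize_modulation(mod_maps, weights=None):
    # Linearly add per-orientation high-quality modulation maps into a composite map.
    mod_maps = [np.asarray(m, dtype=float) for m in mod_maps]
    if weights is None:
        weights = [1.0] * len(mod_maps)
    composite = np.zeros_like(mod_maps[0])
    for m, w in zip(mod_maps, weights):
        composite += w * m
    return composite

# e.g. horizontal + vertical orientations only -> the "first composite modulation degree map"
# composite_hv = synthesize_modulation([mod_horizontal, mod_vertical])
```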
It can be seen that the synthetic modulation degree map described above provides spatially more comprehensive defect recognition.
In some embodiments, all the images to be processed corresponding to the detected object are superimposed and averaged to obtain the background map. The obtained synthetic modulation degree map is divided by the background map to obtain the gloss ratio map, and the background map is subtracted from the synthetic modulation degree map to obtain the diffuse reflection map.
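These three per-pixel operations amount to an average, a division and a subtraction; a minimal sketch, with a small eps added as an assumption to guard against division by zero:

```python
import numpy as np

def background_gloss_diffuse(images_to_process, composite_modulation, eps=1e-6):
    # Background map: average of all to-be-processed images of the object.
    background = np.mean(np.stack([np.asarray(im, dtype=float) for im in images_to_process]), axis=0)
    gloss_ratio = composite_modulation / (background + eps)  # composite modulation divided by background
    diffuse = composite_modulation - background              # composite modulation minus background
    return background, gloss_ratio, diffuse
```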
In some embodiments, the contrast of the image to be processed may also be calculated, and a corresponding contrast map may be obtained.
In some embodiments, the carrier signal is removed from the absolute phase map obtained above, so as to obtain a true phase map. The synthetic modulation degree map and the gloss ratio map are respectively used for representing the fine textures of the surface of the detected object and the clear two-dimensional real appearance of defects on the surface of the detected object. The diffuse reflection map is used for representing the area and position information of defects on the surface of the detected object. The background map is used for representing the brightness information of the light intensity distribution of the image to be processed and the approximate morphology of the surface of the detected object. The real phase map is used for representing the depth information of defects on the surface of the detected object and the approximate morphology of the defects.
And detecting the appearance of the detected object based on the synthesized modulation degree diagram and one or more of the gloss ratio diagram, the diffuse reflection diagram, the true phase diagram and the background diagram so as to output a detection result of the detected object.
In some embodiments, all images (i.e., the synthetic modulation degree map, the gloss ratio map, the diffuse reflection map, the real phase map, the background map, the contrast map, etc.) may be displayed on a visualization device and analyzed to obtain defect information of the surface of the inspected object. For example, the analysis may consist of corresponding logic operations and a series of other subsequent processes performed on one or more of the synthetic modulation degree map, the gloss ratio map, the diffuse reflection map, the real phase map, the background map, the contrast map, and the like.
It can be seen that different defect information is characterized by the synthetic modulation degree map, the gloss ratio map, the diffuse reflection map, the real phase map, the background map, the contrast map and the like, so that the defect area, defect type, two-dimensional/three-dimensional morphology and corresponding depth information of the detected object can be comprehensively judged; fusing the synthetic modulation degree map, the gloss ratio map, the diffuse reflection map, the real phase map, the background map, the contrast map and the like achieves 2.5D detection of the surface defects.
In some embodiments, the brightness information of the surface of the detected object can be obtained from the background map and the diffuse reflection map, and it can then be approximately determined, based on the brightness information, which areas of the inspected object are highly reflective, which areas are light-absorbing, and the type of defect (such as black smudges, pits, and bumps with obvious reflection).
In some embodiments, fine texture information of the surface of the detected object and a clear two-dimensional morphology of the surface of the detected object can be clearly detected through the high-quality modulation degree chart or the synthetic modulation degree chart and the gloss ratio chart, so that defects (such as fine cracks, slight fingerprints and the like) which are difficult to distinguish by naked eyes can be detected.
In some embodiments, depth information of the detected object surface along the direction perpendicular to the detected surface (such as a horizontal plane) can be obtained from the real phase map. The real phase map clearly localizes defect areas with larger height values on the surface of the detected object, and the variation of the height value at each point on the surface can be seen at a glance from the real phase map, which helps to judge defects such as bumps and pits on the surface of the detected object.
It can be seen that in some embodiments, the present reflective surface topography and defect detection method/system achieves an omnidirectional 2.5D detection of surface defects by detecting the two-dimensional topography of the surface of the object under inspection, the depth values perpendicular to the surface, and the like.
The foregoing is a description of the reflective surface topography and defect detection method. Some embodiments of the application also disclose a reflective surface topography and defect detection system, which is described in detail below.
The technical idea of the reflective surface morphology and defect detection system is as follows: on the basis of the existing reflection and demodulation principle, a frequency-adjustable structured light projection unit is added. A signal from the structured light projection unit triggers the image pickup unit to repeatedly photograph, at varying phases, the reflected light of the detected object modulated by the structured light; phase demodulation is then carried out by an algorithm to obtain the height information of the surface of the detected object together with the synthetic modulation degree map, the contrast map, the diffuse reflection map and the real phase map. Finally, the results of the synthetic modulation degree map, the diffuse reflection map, the contrast map and the real phase map are combined to obtain the high-quality surface morphology of the detected object and the depth information of the surface defects, which greatly reduces the false detection rate of defects and improves detection efficiency.
Referring to fig. 4, the reflective surface topography and defect detection system includes:
a to-be-processed image acquisition module 100 configured to acquire a to-be-processed image of a detected object; the image to be processed is obtained by projecting structured light onto the surface of the object to be detected, and photographing the surface of the object to be detected 10;
the image processing module 200 is configured to determine a parcel phase diagram, a background diagram and an original modulation diagram corresponding to the image to be processed based on the image to be processed of the detected object, and remove clutter signals in the original modulation diagram to obtain a high-quality modulation diagram; generating a synthetic modulation degree map based on the high quality modulation degree map; resolving the parcel phase map corresponding to the image to be processed to obtain an absolute phase map of the detected object 10; obtaining a gloss ratio map and a diffuse reflection map based on the synthetic modulation map and the background map, and determining a real phase map of the detected object based on the absolute phase map;
the analysis detection module 300 is configured to detect the surface topography of the detected object based on the synthetic modulation degree map and one or more of the gloss ratio map, the diffuse reflection map, the true phase map and the background map, so as to output a detection result of the detected object.
In some embodiments, referring to fig. 5, the image acquisition module to be processed 100 includes: an imaging unit 110 and a structured light projection unit 120; the structured light projection unit 120 is configured to generate structured light and project the structured light to the surface of the object to be inspected 10; the image capturing unit 110 is configured to capture a surface of the detected object 10 to obtain a to-be-processed image of the detected object 10.
In some embodiments, the image processing program associated with the image processing module 200 and the image to be processed acquired by the image capturing unit 110 are stored in a storage medium of the controller. The storage medium is typically a hard disk or a flash memory. The image processing module 200 operates to load image data into the high-speed storage area, and the image processing module 200 performs a series of processing on the image to be processed in combination with an algorithm program to obtain a background image, a synthetic modulation image, a real phase image, and the like.
In some embodiments, a structured light pattern of a certain frequency is projected onto the surface of the inspected object 10 by the structured light projection unit 120. Because the surface of the detected object is reflective, the reflected light from the surface of the detected object 10 is captured repeatedly; after the captured images are processed by the image algorithm and the like, clear morphology and depth information of the defects on the surface of the detected object 10 are finally obtained, which in turn provides high-quality defect-judging images for defect detection processes such as surface flaws, concave-convex defects and dirt.
Surface morphology and defect detection is an important link in industrial production across many industries, and detection methods and equipment are numerous; optically based detection is currently performed with traditional auxiliary instruments such as interference microscopes. In contrast, the structured light projection unit 120 of the present application comprises a strip light source subunit, a stripe control subunit and a light source control subunit. The strip light source subunit includes light emitting elements and display elements capable of generating structured light (such as the stripe pattern described above). The stripe control subunit makes the sinusoidal profile of the structured light approach the ideal by controlling the light emission of the light emitting elements in specific regions. The light source control subunit includes a light-emitting-element control circuit, a fringe frequency control circuit, and communication components that interact with the image pickup unit 110. The light source control subunit controls the timing of the light emitting elements in the strip light source, so that the light emitted by the strip light source subunit passes through a diffusion plate, presents structured light of a certain frequency (such as a sinusoidal fringe grating) projected onto the detected object 10, and moves periodically a certain number of times along a certain direction at certain timing intervals.

The image pickup unit 110 mainly includes an image pickup element and an image pickup element controller. The image pickup element includes an image-sensor photosensitive element and a lens. The image pickup element controller mainly consists of an image-pickup process control subunit, an image-pickup-element and light-source synchronous control and communication subunit, and a communication and data transmission subunit for the upper computer of the image pickup device. The image pickup element controller can mechanically adjust the image pickup element so that its central axis forms a certain included angle with the central axis of the structured light projection unit 120, ensuring that the structured light emitted from the structured light projection unit 120 enters the photosensitive surface of the image sensor after being reflected by the surface of the detected object 10. The light source control subunit drives the light source to project structured light of a certain period onto the detected object 10 and simultaneously sends a signal to the image pickup element controller; the image pickup element captures the reflected light from the detected object 10, and the captured image data is sent to the image processing module 200 and stored on its storage medium.
The image processing module 200 includes a computer processing sub-module, an image processing algorithm sub-module, and an image visualization sub-module. The computer processing sub-module includes a central processing unit, a memory and a communication element. The image processing algorithm sub-module reads the images captured by the image pickup unit 110 and transmits them to the computer processing sub-module; the computer processing sub-module processes them with the specific algorithm program and finally outputs the detection result to the image visualization sub-module.
In some embodiments, the structured light projection unit 120 may employ a grating element. The structured light projection unit 120 may also be any other component capable of projecting a sinusoidal fringe grating; for example, a DLP projector may be used to project the sinusoidal fringe grating.
In some embodiments, the image capturing unit 110 may employ a general line camera or an area camera.
In some embodiments, the analysis detection module 300 may implement the appearance and defect detection of the surface of the detected object 10 using a deep learning algorithm or the like.
In some embodiments, the reflective surface topography and the hardware portion of the defect detection system may be integrated into a stand-alone industrial smart camera product.
In some embodiments, the included angle between the central axis of the camera component in the camera unit 110 and the central axis of the structured light projection unit 120 is in the range of 0 ° to 90 °.
In some embodiments, the inspected object 10 is a mobile phone screen. The image pickup unit 110 employs an area-array camera and a general industrial lens. The image resolution of the area-array camera is 2048 pixels wide and 1024 pixels high; the light-emitting surface of the structured light projection unit is a strip reflective light source 100 mm wide and 300 mm long, and the reflected fringe period is 1024 pixels. Using a four-step phase shift method, sinusoidal fringe gratings are projected onto the mobile phone screen along the transverse and longitudinal directions respectively; the fringes are shifted in steps of π/2, and the area-array camera is triggered to capture an image at each step, acquiring four images in each direction, thereby obtaining the four-step phase-shift original images (i.e., the images to be processed).
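For a four-step (π/2) acquisition such as this one, the background, modulation and wrapped phase of one fringe direction follow the standard four-step phase-shift relations; a sketch, assuming the four frames obey I_n = A + B·cos(φ + shift) with shifts 0, π/2, π and 3π/2:

```python
import numpy as np

def four_step_demodulate(I1, I2, I3, I4):
    # Standard four-step phase-shift demodulation (shifts 0, pi/2, pi, 3*pi/2).
    I1, I2, I3, I4 = (np.asarray(I, dtype=float) for I in (I1, I2, I3, I4))
    background = (I1 + I2 + I3 + I4) / 4.0                       # background light intensity A
    modulation = 0.5 * np.sqrt((I4 - I2) ** 2 + (I1 - I3) ** 2)  # modulation degree B
    wrapped_phase = np.arctan2(I4 - I2, I1 - I3)                 # wrapped phase in (-pi, pi]
    return background, modulation, wrapped_phase
```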
It will be appreciated that the object 10 to be inspected may be another industrial product or object requiring surface topography inspection.
The image processing module 200 processes the image to be processed and outputs a resultant image sequence, i.e., one or more of a background map, a high-quality modulation degree map, a synthetic modulation degree map, a gloss ratio map, a diffuse reflection map, a true phase map, and a contrast map.
The analysis and detection module 300 analyzes and detects the result image sequence, so that surface defects such as scratches, bumps and dirt that were previously difficult to detect on the mobile phone screen can now be detected and appear more clearly; at the same time, the surface morphology (such as height information) of the detected object 10 (such as the mobile phone screen) is also clear at a glance, which reduces the false detection rate of the detected object 10 and improves detection efficiency. The discrimination of various common defects by the analysis detection module 300 is based on the following:
(1) By combining the high-quality modulation degree map/the synthetic modulation degree map with the background map and the gloss ratio map, surface dirt on the detected object 10 can be discriminated (an illustrative sketch follows the list);
(2) By combining the high-quality modulation degree map/the synthetic modulation degree map with the gloss ratio map and the real phase map, fine indentations, irregular scratches, pits and bumps on the surface of the detected object 10 can be discriminated;
(3) By combining the high-quality modulation degree map/the synthetic modulation degree map with the gloss ratio map, cracks on the surface of the detected object 10 can be discriminated.
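The decision logic behind these combinations is not spelled out here; purely as an illustration of item (1), a joint threshold over the modulation, background and gloss-ratio maps could flag surface-dirt candidates (the function name and all threshold values are hypothetical):

```python
def dirt_candidates(modulation_map, background_map, gloss_ratio_map,
                    mod_thr=0.2, bg_thr=0.3, gloss_thr=0.15):
    # Illustrative only: dirt tends to appear dark, weakly modulated and low-gloss.
    return (modulation_map < mod_thr) & (background_map < bg_thr) & (gloss_ratio_map < gloss_thr)
```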
It can be seen that the analysis and detection module 300 may detect the surface morphology of the detected object 10 by combining one or more of the gloss ratio map, the diffuse reflection map, the true phase map and the background map on the basis of the high-quality modulation map/the synthetic modulation map, and may output the detection result of the detected object 10.
In some embodiments, the object 10 to be detected may be a product other than a cell phone screen.
In some embodiments, a mobile phone screen is used as the detected object 10. The image pickup unit 110 employs a line-scan camera and a general industrial lens. The image resolution of the line-scan camera is 2048 pixels wide and 1 pixel high; the light-emitting surface of the structured light projection unit is a strip reflective light source 100 mm wide and 300 mm long, and the reflected fringe period T is 1024 pixels. Using a four-step phase shift method, sinusoidal fringe gratings are projected onto the mobile phone screen along the transverse and longitudinal directions respectively; the fringes are shifted in steps of π/2 and the line-scan camera is controlled to capture an image at each step, acquiring four images in each direction. The same operation is repeated while the mobile phone screen and the line-scan camera are moved in parallel relative to each other over a certain distance, and the images captured at different moments are stitched together to form eight images with a resolution of 2048 pixels wide and 1024 pixels high, namely the four-step phase-shift original images (i.e., the images to be processed) of the longitudinal and transverse fringes. The image processing module 200 processes the images to be processed and outputs a resultant image sequence, i.e., one or more of a background map, a high-quality modulation degree map, a synthetic modulation degree map, a gloss ratio map, a diffuse reflection map, and a contrast map.
It can be seen that the reflective surface morphology and defect detection system outputs a resultant image sequence that is clear, comprehensive and distinctive, making it possible to detect in a single pass defects that previously could not all be detected at once, such as surface dirt, fine indentations, irregular scratches, pits, bumps and cracks, while reducing the defect false detection rate and improving defect detection efficiency; fusing the multiple images in the resultant image sequence makes the final detection result more robust; because the method is based on the principle of optical reflection, the surface of the detected object 10 is not damaged; and the detection result provides not only the clear defect morphology and distribution of the detected object 10 but also the height information of its surface defects.
The foregoing is illustrative of a reflective surface topography and defect detection system. A computer-readable storage medium is also disclosed in some embodiments of the application.
The computer readable storage medium includes a program executable by a processor to implement the reflective surface topography and defect detection method of any of the embodiments of the present application.
The application also provides a surface defect detection production line based on the reflective surface morphology and defect detection system. When the surface defect detection production line inspects the surface of the detected object 10, the detected object 10 is neither deformed nor damaged, and the detection process involves simple operation steps.
In some embodiments, the present reflective surface topography and defect detection system may be applied in a surface defect detection line. The surface defect inspection line may be provided with a plurality of the above-described reflective surface topography and defect inspection systems for inspecting the surface of the inspected object 10. Other related equipment can be integrated in the surface defect detection production line so as to finish the procedures of loading and unloading, sorting, cleaning treatment of surface dirt and the like in the detection process.
Reference is made to various exemplary embodiments herein. However, those skilled in the art will recognize that changes and modifications may be made to the exemplary embodiments without departing from the scope herein. For example, the various operational steps and components used to perform the operational steps may be implemented in different ways (e.g., one or more steps may be deleted, modified, or combined into other steps) depending on the particular application or taking into account any number of cost functions associated with the operation of the system.
In the above embodiments, it may be implemented in whole or in part by software, hardware, firmware, or any combination thereof. Additionally, as will be appreciated by one of skill in the art, the principles herein may be reflected in a computer program product on a computer readable storage medium preloaded with computer readable program code. Any tangible, non-transitory computer readable storage medium may be used, including magnetic storage devices (hard disks, floppy disks, etc.), optical storage devices (CD-ROM, DVD, Blu-ray disks, etc.), flash memory, and/or the like. These computer program instructions may be loaded onto a general purpose computer, special purpose computer, or other programmable data processing apparatus to produce a machine, such that the instructions which execute on the computer or other programmable data processing apparatus create means for implementing the functions specified. These computer program instructions may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the instructions stored in the computer-readable memory produce an article of manufacture including means which implement the function specified. The computer program instructions may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the instructions which execute on the computer or other programmable apparatus provide steps for implementing the functions specified.
While the principles herein have been shown in various embodiments, many modifications of structure, arrangement, proportions, elements, materials, and components, which are particularly adapted to specific environments and operative requirements, may be used without departing from the principles and scope of the present disclosure. The above modifications and other changes or modifications are intended to be included within the scope of this document.
The foregoing detailed description has been described with reference to various embodiments. However, those skilled in the art will recognize that various modifications and changes may be made without departing from the scope of the present disclosure. Accordingly, the present disclosure is to be considered as illustrative and not restrictive in character, and all such modifications are intended to be included within the scope thereof. Also, advantages, other advantages, and solutions to problems have been described above with regard to various embodiments. The benefits, advantages, solutions to problems, and any element(s) that may cause any benefit, advantage, or solution to occur or become more pronounced are not to be construed as a critical, required, or essential feature. The terms "comprises," "comprising," or any other variation thereof, are intended to cover a non-exclusive inclusion, such that a process, method, article, or apparatus that comprises a list of elements does not include only those elements but may include other elements not expressly listed or inherent to such process, method, system, article, or apparatus. Furthermore, the term "couple" and any other variants thereof are used herein to refer to physical connections, electrical connections, magnetic connections, optical connections, communication connections, functional connections, and/or any other connection.
Those skilled in the art will recognize that many changes may be made to the details of the above-described embodiments without departing from the underlying principles of the invention. Accordingly, the scope of the invention should be determined only by the following claims.

Claims (8)

1. A method for detecting a reflective surface topography and defects, comprising:
acquiring an image to be processed of a detected object; the image to be processed is obtained by projecting structured light having an adjustable frequency onto the surface of the object to be detected and photographing the surface of the object to be detected;
determining a parcel phase diagram, a background diagram and an original modulation diagram corresponding to the image to be processed based on the image to be processed of the detected object, a calculation expression of background light intensity of the image to be processed, a calculation expression of modulation degree and a calculation expression of phase value, and removing clutter signals in the original modulation diagram to obtain a high-quality modulation diagram; generating a synthetic modulation degree map based on the high quality modulation degree map; based on a pre-constructed phase numerical equation, calculating a parcel phase map corresponding to the image to be processed by adopting discrete cosine transform to obtain an absolute phase map of the detected object; obtaining a gloss ratio graph and a diffuse reflection graph based on the synthetic modulation degree graph and the background graph, fitting a space curved surface of the absolute phase graph based on a first 36-order Zernike polynomial, and performing linear operation to determine a real phase graph of the detected object;
Detecting the surface morphology of the detected object based on the synthetic modulation degree diagram and one or more of the gloss ratio diagram, the diffuse reflection diagram, the true phase diagram and the background diagram so as to output a detection result of the detected object;
wherein, the calculation expression of the background light intensity is:
A(x, y) = (1/N)·Σ_{n=1}^{N} I_n(x, y),
wherein, in the above formula, N represents the total number of the images to be processed corresponding to the detected object, n is the serial number of the image to be processed, and I_n(x, y) is the light intensity distribution of the n-th image to be processed;
the calculation expression of the modulation degree corresponding to the pixel point (x, y) on the surface of the detected object is:
M(x, y) = (2/N)·√( [Σ_{n=1}^{N} I_n(x, y)·sin(2πn/N)]² + [Σ_{n=1}^{N} I_n(x, y)·cos(2πn/N)]² );
the calculation expression of the phase value at the pixel point (x, y) of the detected object surface is:
φ(x, y) = arctan( [Σ_{n=1}^{N} I_n(x, y)·sin(2πn/N)] / [Σ_{n=1}^{N} I_n(x, y)·cos(2πn/N)] ),
wherein, in the above formula, φ(x, y) represents the wrapping phase at the pixel point (x, y) of the detected object surface;
the calculating the parcel phase map corresponding to the image to be processed to obtain an absolute phase map of the detected object includes:
acquiring a phase numerical equation, wherein the phase numerical equation is used for converting a wrapping phase and an absolute phase corresponding to the image to be processed;
Obtaining an absolute phase diagram corresponding to the image to be processed by solving the phase numerical equation;
the obtaining the absolute phase map corresponding to the image to be processed by solving the phase numerical equation comprises the following steps:
calculating the wrapping phase φ(x, y) corresponding to the image to be processed from the phase value corresponding to each pixel point (x, y) on the surface of the detected object, performing second-order differential processing on the wrapped phase to obtain the differential ρ(x, y) of adjacent wrapped phase differences, and performing two-dimensional discrete-domain expansion on the differential ρ(x, y) of adjacent wrapped phase differences to obtain ρ'(x, y);
unwrapping the wrapped phase φ(x, y) by discrete transformation to obtain the mapping relation between the wrapped phase φ(x, y) and the mapping spectrum values of the discrete transform domain, substituting the discrete-transform-expanded wrapped phase into the phase numerical equation, and further calculating the mapping spectrum values of the wrapped phase map in the two-dimensional discrete transform domain;
performing inverse discrete cosine transform on the calculated mapping spectrum value to obtain an absolute phase diagram corresponding to the image to be processed;
wherein, the phase numerical equation is:
(Δ²/Δx²)·ψ(x, y) + (Δ²/Δy²)·ψ(x, y) = ρ(x, y),
where ρ(x, y) is the differential of adjacent wrapped phase differences, and ψ(x, y) in the phase numerical equation represents the absolute phase at the pixel point (x, y) of the detected object surface.
2. The method of claim 1, wherein said removing clutter signals from said original modulation map to obtain a high quality modulation map comprises:
performing Fourier transform on the light intensity distribution corresponding to the original modulation degree diagram to obtain Fourier transform light intensity distribution;
obtaining the frequency band of the clutter signal based on the Fourier transform light intensity distribution;
filtering the Fourier transform light intensity distribution based on the frequency band of the clutter signals to remove the clutter signals in the original modulation degree diagram;
and carrying out inverse Fourier transform on the Fourier transform light intensity distribution subjected to the filtering treatment to obtain the high-quality modulation degree chart.
3. The method of claim 1, wherein determining a true phase map of the inspected object based on the absolute phase map comprises:
fitting the space curved surface of the absolute phase diagram by using a first 36-order Zernike polynomial to obtain a fitted curved surface, wherein the fitted curved surface is a phase diagram of a carrier signal in the absolute phase diagram;
And linearly subtracting the absolute phase map from the fitting curved surface to remove carrier signals in the absolute phase map so as to obtain a real phase map corresponding to the detected object.
4. The method of claim 1, wherein the obtaining a gloss ratio map and a diffuse reflection map based on the synthetic modulation map and the background map comprises:
dividing the synthesized modulation degree map by the background map to obtain the gloss ratio map;
and subtracting the background image from the synthesized modulation degree to obtain the diffuse reflection image.
5. The method for detecting reflective surface topography and defects according to claim 1, wherein the acquiring a to-be-processed image of the detected object comprises:
correcting a Gamma value of a structured light projection unit to prevent the structured light projected by the structured light projection unit from generating distortion, wherein the structured light projection unit is used for generating the structured light and projecting the structured light to the detected object;
setting the frequency of the structured light according to the defect type of the surface of the detected object;
and projecting the structured light to the surface of the detected object, and shooting the surface of the detected object to obtain an image to be processed of the detected object.
6. A reflective surface topography and defect detection system, comprising:
the image acquisition module to be processed is configured to acquire an image to be processed of the detected object; the image to be processed is obtained by projecting structured light having an adjustable frequency onto the surface of the object to be detected and photographing the surface of the object to be detected;
the image processing module is configured to determine a parcel phase diagram, a background diagram and an original modulation diagram corresponding to the image to be processed based on the image to be processed of the detected object and a calculation expression of background light intensity, a calculation expression of a modulation degree and a calculation expression of a phase value of the image to be processed, and remove clutter signals in the original modulation diagram to obtain a high-quality modulation diagram; generating a synthetic modulation degree map based on the high quality modulation degree map; based on a pre-constructed phase numerical equation, calculating a parcel phase map corresponding to the image to be processed by adopting discrete cosine transform to obtain an absolute phase map of the detected object; obtaining a gloss ratio graph and a diffuse reflection graph based on the synthetic modulation degree graph and the background graph, fitting a space curved surface of the absolute phase graph based on a first 36-order Zernike polynomial, and performing linear operation to determine a real phase graph of the detected object;
An analysis detection module configured to detect a surface topography of the detected object based on the synthetic modulation degree map and one or more of the gloss ratio map, the diffuse reflection map, the true phase map, and the background map, to output a detection result of the detected object;
wherein, the calculation expression of the background light intensity is:
A(x, y) = (1/N)·Σ_{n=1}^{N} I_n(x, y),
wherein, in the above formula, N represents the total number of the images to be processed corresponding to the detected object, n is the serial number of the image to be processed, and I_n(x, y) is the light intensity distribution of the n-th image to be processed;
the calculation expression of the modulation degree corresponding to the pixel point (x, y) on the surface of the detected object is:
M(x, y) = (2/N)·√( [Σ_{n=1}^{N} I_n(x, y)·sin(2πn/N)]² + [Σ_{n=1}^{N} I_n(x, y)·cos(2πn/N)]² );
the calculation expression of the phase value at the pixel point (x, y) of the detected object surface is:
φ(x, y) = arctan( [Σ_{n=1}^{N} I_n(x, y)·sin(2πn/N)] / [Σ_{n=1}^{N} I_n(x, y)·cos(2πn/N)] ),
wherein, in the above formula, φ(x, y) represents the wrapping phase at the pixel point (x, y) of the detected object surface;
the calculating the parcel phase map corresponding to the image to be processed to obtain an absolute phase map of the detected object includes:
acquiring a phase numerical equation, wherein the phase numerical equation is used for converting a wrapping phase and an absolute phase corresponding to the image to be processed;
Obtaining an absolute phase diagram corresponding to the image to be processed by solving the phase numerical equation;
the obtaining the absolute phase map corresponding to the image to be processed by solving the phase numerical equation comprises the following steps:
calculating the wrapping phase φ(x, y) corresponding to the image to be processed from the phase value corresponding to each pixel point (x, y) on the surface of the detected object, performing second-order differential processing on the wrapped phase to obtain the differential ρ(x, y) of adjacent wrapped phase differences, and performing two-dimensional discrete-domain expansion on the differential ρ(x, y) of adjacent wrapped phase differences to obtain ρ'(x, y);
unwrapping the wrapped phase φ(x, y) by discrete transformation to obtain the mapping relation between the wrapped phase φ(x, y) and the mapping spectrum values of the discrete transform domain, substituting the discrete-transform-expanded wrapped phase into the phase numerical equation, and further calculating the mapping spectrum values of the wrapped phase map in the two-dimensional discrete transform domain;
performing inverse discrete cosine transform on the calculated mapping spectrum value to obtain an absolute phase diagram corresponding to the image to be processed;
wherein, the phase numerical equation is:
(Δ²/Δx²)·ψ(x, y) + (Δ²/Δy²)·ψ(x, y) = ρ(x, y),
where ρ(x, y) is the differential of adjacent wrapped phase differences, and ψ(x, y) in the phase numerical equation represents the absolute phase at the pixel point (x, y) of the detected object surface.
7. The retroreflective surface topography and defect detection system of claim 6 wherein the image acquisition module to be processed comprises: an imaging unit and a structured light projection unit;
wherein the structured light projection unit is configured to generate structured light and project the structured light to a surface of the object to be detected; the image capturing unit is configured to capture a surface of the detected object to acquire a to-be-processed image of the detected object.
8. A computer readable storage medium comprising a program executable by a processor to implement the method of any one of claims 1 to 5.
Legal Events

PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant