CN116385445B - Visual technology-based electroplated hardware flaw detection method

Info

Publication number
CN116385445B
Authority
CN
China
Prior art keywords
image
area
difference
flaw
suspected
Prior art date
Legal status
Active
Application number
CN202310659443.0A
Other languages
Chinese (zh)
Other versions
CN116385445A (en
Inventor
邓志坚
唐建明
Current Assignee
Dongguan C Ray Automatics Technology Co ltd
Original Assignee
Dongguan C Ray Automatics Technology Co ltd
Priority date
Filing date
Publication date
Application filed by Dongguan C Ray Automatics Technology Co ltd filed Critical Dongguan C Ray Automatics Technology Co ltd
Priority to CN202310659443.0A priority Critical patent/CN116385445B/en
Publication of CN116385445A publication Critical patent/CN116385445A/en
Application granted granted Critical
Publication of CN116385445B publication Critical patent/CN116385445B/en

Classifications

    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06TIMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00Image analysis
    • G06T7/0002Inspection of images, e.g. flaw detection
    • G06T7/0004Industrial image inspection
    • G06T7/001Industrial image inspection using an image reference approach
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/40Extraction of image or video features
    • G06V10/44Local feature extraction by analysis of parts of the pattern, e.g. by detecting edges, contours, loops, corners, strokes or intersections; Connectivity analysis, e.g. of connected components
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/803Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of input or preprocessed data
    • GPHYSICS
    • G06COMPUTING; CALCULATING OR COUNTING
    • G06VIMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00Arrangements for image or video recognition or understanding
    • G06V10/70Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V10/77Processing image or video features in feature spaces; using data integration or data reduction, e.g. principal component analysis [PCA] or independent component analysis [ICA] or self-organising maps [SOM]; Blind source separation
    • G06V10/80Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level
    • G06V10/809Fusion, i.e. combining data from various sources at the sensor level, preprocessing level, feature extraction level or classification level of classification results, e.g. where the classifiers operate on the same input data
    • YGENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02PCLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30Computing systems specially adapted for manufacturing

Abstract

The invention relates to the technical field of image data processing, in particular to a visual technology-based electroplated hardware flaw detection method, which comprises the following steps: collecting a plurality of surface images of the electroplated hardware; obtaining two difference images according to the surface images; obtaining a suspected flaw edge point set of each difference image according to the edge pixel points, obtaining a suspected flaw area of each difference image according to the suspected flaw edge point set, and obtaining a regular change coefficient according to the suspected flaw areas; fusing the binary images of the two difference images to obtain a fused binary image, obtaining a first target area and a second target area according to the position distribution in the fused binary image, and obtaining a characteristic coefficient according to the gray-value changes of the two target areas and the regular change coefficient; and determining the flaw type (electroplating bubble or material hard pack) according to the characteristic coefficient. The invention thereby extracts the flaw type of the electroplated hardware more accurately.

Description

Visual technology-based electroplated hardware flaw detection method
Technical Field
The invention relates to the technical field of image data processing, in particular to a visual technology-based electroplated hardware flaw detection method.
Background
Treating hardware with an electroplating process improves the corrosion resistance of the surface, increases hardness, prevents surface abrasion and improves appearance. During electroplating, however, flaws such as electroplating bubbles and material hard packs can appear on the electroplated surface of the hardware, which degrades the surface performance of the electroplated part and greatly affects the use experience.
Conventional flaw detection algorithms generally follow the steps of image segmentation, feature extraction and classification, but they are time-consuming, rely on redundant windows and manually designed flaw features, and tend to identify the flaw type inaccurately; in particular, electroplating-bubble flaws and material hard-pack flaws are difficult to distinguish.
Therefore, the invention uses industrial cameras to collect a plurality of images of the surface of the electroplated hardware piece and performs preprocessing operations such as image enhancement on the images to obtain corresponding gray-level images; the plurality of gray-level images of the same electroplated hardware piece to be detected are analyzed, a characteristic coefficient is obtained from the gray-level changes caused by the forward and backward movement of the piece, and electroplating-bubble flaws and material hard-pack flaws are distinguished according to the characteristic coefficient.
Disclosure of Invention
The invention provides a visual technology-based electroplated hardware flaw detection method, which aims to solve the existing problems.
The invention discloses a visual technology-based electroplated hardware flaw detection method, which adopts the following technical scheme:
one embodiment of the invention provides a visual technology-based flaw detection method for electroplated hardware, which comprises the following steps:
collecting a plurality of surface images of the electroplated hardware; obtaining two difference images according to the surface image;
obtaining a suspected flaw edge point set of each difference image according to edge pixel points, obtaining a suspected flaw area of each difference image according to the suspected flaw edge point set, obtaining a regular change coefficient according to the suspected flaw area, fusing two binary images of two difference images to obtain a fused binary image, obtaining a first target area and a second target area according to position distribution in the fused binary image, and obtaining a characteristic coefficient according to gray value change conditions and regular change coefficients of the two target areas;
and determining the defect type of the electroplating bubble according to the characteristic coefficient.
Further, the obtaining two difference images according to the surface image comprises the following specific steps:
the acquired 3 surface images are respectively marked as a surface image A, a surface image B and a surface image C, the absolute value of the difference value of the gray values of two pixel points at the same position of the surface image A and the surface image B is marked as a difference value, the image formed by all the difference values of the surface image A and the surface image B is obtained and is marked as a difference image D1, and the image formed by all the difference values of the surface image C and the surface image B is obtained and is marked as a difference image D2.
Further, the obtaining the initial suspected flaw area of each difference image according to the edge pixel points comprises the following specific steps:
respectively carrying out edge detection on the two difference images through a Canny edge detection algorithm to respectively obtain a set Q1 formed by all edge pixel points in the difference image D1 and a set Q2 formed by all edge pixel points in the difference image D2;
obtaining the intersection Q1∩Q2 of the set Q1 and the set Q2; recording the difference set Q1−(Q1∩Q2) as the suspected flaw edge point set K1 of the difference image D1, and recording the difference set Q2−(Q1∩Q2) as the suspected flaw edge point set K2 of the difference image D2; wherein ∩ represents the set intersection, so that K1 and K2 consist of the edge pixel points that appear in only one of the two difference images.
Further, the step of obtaining the suspected flaw area of each difference image includes the following specific steps:
the connected domain area in the edge formed by the suspected flaw edge point set K1 of the difference image D1 is marked as an initial suspected flaw area M1 of the difference image D1;
the connected domain area in the edge formed by the suspected flaw edge point set K2 of the difference image D2 is marked as an initial suspected flaw area M2 of the difference image D2;
the method comprises the steps of obtaining a difference image D1 and a threshold value Y2 of the difference image D2 respectively through an Ojin threshold segmentation method, obtaining an area formed by pixels with gray values smaller than Y1 in an initial suspected flaw area M1 of the difference image D1, marking the area as a suspected flaw area N1 of the difference image D1, and obtaining an area formed by pixels with gray values smaller than Y2 in an initial suspected flaw area M2 of the difference image D2, marking the area as a suspected flaw area N2 of the difference image D2.
Further, the obtaining the regular change coefficient according to the suspected flaw area comprises the following specific steps:
the calculation formula of the rule change coefficient is as follows:
where H represents a regular change coefficient of the suspected defective region, N1 represents the suspected defective region of the difference image D1,the area of the suspected defective region N1 of the difference image D1, N2 of the difference image D2,area of suspected flaw area N2 representing difference image D2, +.>Representing absolute value>An exponential function based on a natural constant e is represented.
Further, the method for obtaining the fused binary image comprises the following specific steps:
the pixel points in the suspected flaw area in the difference image D1 are marked as 1, all other pixel points are marked as 0, so that a binary image of the difference image D1 is obtained, the pixel points in the suspected flaw area in the difference image D2 are marked as 1, all other pixel points are marked as 0, and a binary image of the difference image D2 is obtained; the binary image of the difference image D1 and the binary image of the difference image D2 are fused into a fused binary image, specifically: for the pixel points at any position in the fused binary image, only when the pixel point at the same position of the binary image of the difference image D1 and the binary image of the difference image D2 is 0, the pixel point at the same position in the fused binary image is 0, otherwise, the pixel point at the same position in the fused binary image is 1.
Further, the obtaining the first target area and the second target area includes the following specific steps:
obtaining the maximum value x_max and the minimum value x_min of the abscissa of all pixel points marked 1 in the fused binary image; recording the straight line that passes through the abscissa (x_max + x_min)/2 and is perpendicular to the x axis as the center line, and recording the area formed by all pixel points marked 1 on the right side of the center line as the target area;
the areas on the surface image a and the surface image C at the same positions as the target areas are obtained, respectively, and are denoted as a first target area T1 and a second target area T2, respectively.
Further, the obtaining the characteristic coefficient comprises the following specific steps:
the calculation formula of the characteristic coefficient S is specifically as follows:
wherein S represents a characteristic coefficient, T1 and T2 represent a first target region and a second target region, respectively,representing the gray value of the i-th pixel in the first target area,/th pixel>Represents the gray value of the i-th pixel in the second target area,the absolute value is represented, and H represents the regular change coefficient of the suspected flaw area.
Further, the determining the flaw type of the electroplating bubble according to the characteristic coefficient comprises the following specific steps:
presetting a first threshold Z1 and a second threshold Z2; if the characteristic coefficient S is greater than Z1, an electroplating-bubble flaw exists on the surface of the electroplated hardware; if the characteristic coefficient S is less than Z2, a material hard-pack flaw exists on the surface of the electroplated hardware; otherwise, no flaw exists on the surface.
The technical scheme of the invention has the following beneficial effects: based on the fact that electroplating-bubble flaws and material hard-pack flaws have different concave-convex characteristics, the gray-level changes caused by the movement of the electroplated hardware are accurately and effectively captured by arranging the positions of the cameras, the light source and the conveyor belt; the suspected flaw areas are obtained by analyzing the collected surface images illuminated from different directions; based on the fact that the two kinds of flaw differ in shape regularity, the regular change coefficient is obtained from the areas of the suspected flaw areas; the characteristic coefficient is then obtained from the gray-value changes of the two target areas, which correspond to the concave-convex characteristics, together with the regular change coefficient; and whether a flaw exists on the surface of the electroplated hardware and its specific type are thereby determined, so that the flaw type of the electroplated hardware is extracted more accurately.
Drawings
In order to illustrate the embodiments of the invention or the technical solutions of the prior art more clearly, the drawings required in the embodiments or in the description of the prior art are briefly introduced below. Obviously, the drawings in the following description show only some embodiments of the invention, and a person skilled in the art may obtain other drawings from them without inventive effort.
FIG. 1 is a flow chart of steps of a method for detecting flaws in electroplated hardware based on vision technology according to the present invention;
FIG. 2 is an illustration of an apparatus and specific arrangement for capturing surface images of plated hardware in accordance with the present invention;
FIG. 3 shows the concave-convex (relief) characteristics of an electroplating bubble flaw;
FIG. 4 shows the concave-convex (relief) characteristics of a material hard-pack flaw.
Detailed Description
In order to further describe the technical means and effects adopted by the invention to achieve the preset aim, the following detailed description is given below of a visual technology-based method for detecting defects of electroplated hardware, which is provided by the invention, with reference to the accompanying drawings and the preferred embodiment. In the following description, different "one embodiment" or "another embodiment" means that the embodiments are not necessarily the same. Furthermore, the particular features, structures, or characteristics of one or more embodiments may be combined in any suitable manner.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The following specifically describes a specific scheme of the method for detecting flaws of electroplated hardware based on visual technology.
Referring to fig. 1, a flowchart of a method for detecting defects of electroplated hardware based on vision technology according to an embodiment of the invention is shown, the method comprises the following steps:
s001, collecting a plurality of surface images of the electroplated hardware.
It should be noted that, the main purpose of the present invention is to detect the surface flaws of the electroplated hardware, so that it is necessary to collect the surface image of the electroplated hardware first.
Specifically, a light source is arranged above the middle position of a conveyor belt, and three industrial cameras are arranged on the left side, the middle and the right side of the light source respectively; the industrial cameras are used to acquire three surface images of the electroplated hardware, image enhancement and graying are performed on the three acquired surface images, and the processed images are recorded as surface image A, surface image B and surface image C respectively. Image enhancement and graying are well-known techniques and are not described in detail here.
Referring to fig. 2, the apparatus for capturing a surface image of a plated hardware and a specific arrangement manner of the apparatus are shown, wherein a position below an industrial camera on the left side of a light source in the drawing is denoted as a position a, and a surface image a of the plated hardware is captured at the position a; marking a position right below the industrial camera in the figure as a position B, and collecting a surface image B of the electroplated hardware at the position B; the position below the industrial camera on the right side of the light source in the figure is marked as a position C, and a surface image C of the electroplated hardware is acquired at the position C.
To this end, several surface images of the electroplated hardware are obtained.
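For illustration only, the following Python/OpenCV sketch shows one possible realisation of the acquisition-side preprocessing described above; the patent only states that image enhancement and graying are performed, so the CLAHE enhancement operator and the file names used here are assumptions.

import cv2

def preprocess(path):
    # Read one captured surface image (the file name is hypothetical)
    img = cv2.imread(path)
    # Graying: convert the colour image to a gray-level image
    gray = cv2.cvtColor(img, cv2.COLOR_BGR2GRAY)
    # Image enhancement: CLAHE is an assumed choice, the patent does not name the operator
    clahe = cv2.createCLAHE(clipLimit=2.0, tileGridSize=(8, 8))
    return clahe.apply(gray)

# Surface images A, B and C collected at positions a, b and c
img_a = preprocess("surface_a.png")
img_b = preprocess("surface_b.png")
img_c = preprocess("surface_c.png")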
S002, obtaining two difference images according to the surface images, obtaining initial suspected flaw areas of each difference image according to edge pixel points, obtaining two suspected flaw areas according to a threshold value, obtaining a regular change coefficient according to the areas of the suspected flaw areas, obtaining target areas according to the positions of the suspected flaw areas, further obtaining a first target area and a second target area, and calculating a characteristic coefficient according to gray value change conditions and the regular change coefficient of the two target areas.
It should be noted that the primary difference between an electroplating-bubble flaw and a material hard-pack flaw lies in morphology: an electroplating bubble is a convex protrusion, generally of a certain size and height and irregular in shape, whereas a material hard pack presents a concave surface, is usually larger and deeper than an electroplating bubble, and is regular in shape. Images of the electroplated hardware can be collected at positions a, b and c on the two sides of the light source; because the illumination directions at positions a and c differ, the two kinds of flaw produce different gray-level changes in the images as the piece moves along the conveyor belt.
It should be further noted that the specific manifestations of the two defects are: when the surface defect of the electroplated hardware is an electroplating bubble, the corresponding defect area has a convex feature: referring to FIG. 3, the behavior of the light facing and back surfaces at positions a and c under illumination from a light source is shown; when the surface defect of the electroplated hardware is a material hard pack, the corresponding defect area has concave characteristics: referring to fig. 4, the behavior of the light facing and back surfaces at positions a and c under illumination from a light source is shown.
1. Two difference images are obtained according to the surface image, and an initial suspected flaw area of each difference image is obtained according to the edge pixel points.
It should be noted that, since different flaws behave differently under illumination from different directions, the gray-level change region of the surface image caused by the change of the light-source direction can be obtained by differencing the surface images collected at the different positions; the flaw region can then be obtained by analyzing this gray-level change region, so the flaw region in the surface image is obtained by image differencing.
Specifically, the absolute value of the difference value of the gray values of two pixel points at the same position of the surface image A and the surface image B is recorded as a difference value, an image formed by all the difference values of the surface image A and the surface image B is obtained and recorded as a difference image D1, and an image formed by all the difference values of the surface image C and the surface image B is obtained and recorded as a difference image D2; and respectively carrying out edge detection on the two difference images through a Canny edge detection algorithm to respectively obtain a set Q1 formed by all edge pixel points in the difference image D1 and a set Q2 formed by all edge pixel points in the difference image D2.
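A minimal sketch of this step, continuing the variables img_a, img_b and img_c from the preprocessing sketch above; the Canny thresholds 50/150 are assumptions, as the patent does not specify them.

import cv2
import numpy as np

# Difference images: per-pixel absolute differences of gray values
D1 = cv2.absdiff(img_a, img_b)
D2 = cv2.absdiff(img_c, img_b)

# Canny edge detection on each difference image (thresholds are assumed)
edges1 = cv2.Canny(D1, 50, 150)
edges2 = cv2.Canny(D2, 50, 150)

# Q1, Q2: sets of edge pixel coordinates (row, column) in D1 and D2
Q1 = {tuple(p) for p in np.argwhere(edges1 > 0)}
Q2 = {tuple(p) for p in np.argwhere(edges2 > 0)}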
Further, the suspected flaw edge point set K1 of the difference image D1 is obtained, specifically:

K1 = Q1 − (Q1 ∩ Q2)

wherein K1 represents the suspected flaw edge point set of the difference image D1, Q1 represents the set of all edge pixel points in the difference image D1, Q2 represents the set of all edge pixel points in the difference image D2, ∩ represents the set intersection and − represents the set difference; that is, K1 consists of the edge pixel points that appear in D1 but not in D2.
The connected domain region in the edge composed of the suspected flaw edge point set K1 of the difference image D1 is denoted as an initial suspected flaw region M1 of the difference image D1.
Further, the suspected flaw edge point set K2 of the difference image D2 is obtained, specifically:

K2 = Q2 − (Q1 ∩ Q2)

wherein K2 represents the suspected flaw edge point set of the difference image D2, Q1 represents the set of all edge pixel points in the difference image D1, Q2 represents the set of all edge pixel points in the difference image D2, ∩ represents the set intersection and − represents the set difference; that is, K2 consists of the edge pixel points that appear in D2 but not in D1.
The connected domain area in the edge composed of the suspected flaw edge point set K2 of the difference image D2 is denoted as an initial suspected flaw area M2 of the difference image D2.
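Continuing the sketch, the set operations and the extraction of the connected domain region enclosed by the remaining edges might be realised as follows; the morphological closing and contour filling used to turn an edge point set into a filled region are assumptions, since the patent does not fix that operator (OpenCV 4.x return signature assumed).

import cv2
import numpy as np

# Edge points unique to each difference image (the common points are removed)
common = Q1 & Q2
K1 = Q1 - common
K2 = Q2 - common

def fill_edge_region(points, shape):
    # Rasterize the edge point set
    mask = np.zeros(shape, np.uint8)
    for r, c in points:
        mask[r, c] = 255
    # Bridge small gaps so that the edges form closed curves (assumed step)
    mask = cv2.morphologyEx(mask, cv2.MORPH_CLOSE, np.ones((5, 5), np.uint8))
    # Fill the interior of every closed edge curve
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
    filled = np.zeros(shape, np.uint8)
    cv2.drawContours(filled, contours, -1, 255, thickness=cv2.FILLED)
    return filled

M1 = fill_edge_region(K1, D1.shape)  # initial suspected flaw region of D1
M2 = fill_edge_region(K2, D2.shape)  # initial suspected flaw region of D2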
2. And obtaining two suspected flaw areas according to the threshold value, and obtaining a regular change coefficient according to the area of the suspected flaw areas.
It should be noted that the shadow cast by the illumination depends on the position of the flaw area; in the images collected at position a and position c the shadow appears as pixels whose gray values are smaller than a threshold, so the number of such pixels characterizes the size of the shadow area.
Specifically, the threshold Y1 of the difference image D1 and the threshold Y2 of the difference image D2 are obtained by the Otsu threshold segmentation method; the region composed of the pixels whose gray values are smaller than Y1 in the initial suspected flaw area M1 of the difference image D1 is recorded as the suspected flaw area N1 of the difference image D1, and the region composed of the pixels whose gray values are smaller than Y2 in the initial suspected flaw area M2 of the difference image D2 is recorded as the suspected flaw area N2 of the difference image D2.
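Continuing the sketch, Otsu thresholding and the selection of the darker pixels inside M1 and M2 could look like this; reading the "gray values smaller than Y1/Y2" as gray values of the difference images themselves follows the wording of the specific steps and is an interpretation.

import cv2

# Otsu's method returns one global threshold per difference image
Y1, _ = cv2.threshold(D1, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
Y2, _ = cv2.threshold(D2, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)

# Suspected flaw regions: pixels inside M1 / M2 whose gray value is below the threshold
N1 = (M1 > 0) & (D1 < Y1)
N2 = (M2 > 0) & (D2 < Y2)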
It should be further noted that, in addition to their different concave-convex characteristics, the two kinds of flaw also differ in shape regularity; this can be measured specifically by the difference between the areas of the suspected flaw areas of the two difference images.
Further, the regular change coefficient is obtained from the areas of the suspected flaw areas of the two difference images; the specific calculation formula is:

H = exp( − | S(N1) − S(N2) | )

where H represents the regular change coefficient of the suspected flaw area, N1 and N2 represent the suspected flaw areas of the difference images D1 and D2, S(N1) and S(N2) represent their areas (numbers of pixels), |·| represents the absolute value, and exp(·) represents the exponential function with the natural constant e as its base.
It should be noted that H describes how the area of the shadow region presented in the surface image by a surface flaw of the electroplated hardware changes when the shooting position changes. The closer H is to 1, the smaller the difference between the shadow areas of the suspected flaw area produced at position a and position c, i.e. the more regular the shape of the suspected flaw area; since the shape of a material hard pack appears more regular under the light shadow, the flaw is then more likely to be a material hard pack. Conversely, the more likely the flaw is an electroplating bubble.
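A one-line sketch of this coefficient under the reconstruction given above; the un-normalized form H = exp(-|S(N1) - S(N2)|) is an assumption, and a normalized area difference would fit the description equally well.

import numpy as np

# Areas (pixel counts) of the two suspected flaw regions
area_n1 = int(np.count_nonzero(N1))
area_n2 = int(np.count_nonzero(N2))

# Regular change coefficient: close to 1 when the two shadow areas are similar,
# i.e. when the flaw shape is regular (more likely a material hard pack)
H = float(np.exp(-abs(area_n1 - area_n2)))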
3. And obtaining a target area according to the position of the suspected flaw area, and further obtaining a first target area and a second target area.
The shadow positions of an electroplating-bubble flaw and a material hard-pack flaw differ in the image, so a gray-level region on the same side of the flaw can be obtained, and the two flaws can be distinguished according to the gray-level change of this same-side region.
Specifically, the pixel points in the suspected flaw area in the difference image D1 are marked as 1, all the other pixel points are marked as 0, so as to obtain a binary image of the difference image D1, the pixel points in the suspected flaw area in the difference image D2 are marked as 1, and all the other pixel points are marked as 0, so as to obtain a binary image of the difference image D2; the binary image of the difference image D1 and the binary image of the difference image D2 are fused into a fused binary image, specifically: for the pixel points at any position in the fused binary image, only when the pixel point at the same position of the binary image of the difference image D1 and the binary image of the difference image D2 is 0, the pixel point at the same position in the fused binary image is 0, otherwise, the pixel point at the same position in the fused binary image is 1.
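The fusion rule described above is a per-pixel logical OR of the two binary images, for example:

import numpy as np

# Binary images: 1 inside the suspected flaw region, 0 elsewhere
B1 = N1.astype(np.uint8)
B2 = N2.astype(np.uint8)

# A fused pixel is 0 only when it is 0 in both binary images, i.e. logical OR
fused = np.logical_or(B1, B2).astype(np.uint8)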
Further, the maximum value x_max and the minimum value x_min of the abscissa of all pixel points marked 1 in the fused binary image are obtained; the straight line that passes through the abscissa (x_max + x_min)/2 and is perpendicular to the x axis is recorded as the center line, and the region composed of all pixel points marked 1 on the right side of the center line is recorded as the target region.
The areas on the surface image A and the surface image C at the same positions as the target region are obtained respectively, and are denoted as a first target area T1 and a second target area T2 respectively.
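Continuing the sketch, the center line and the two target regions might be extracted as follows; taking the midpoint of x_min and x_max as the center-line abscissa is the reading adopted above.

import numpy as np

# Column indices (abscissas) that contain at least one pixel marked 1
cols = np.where(fused.any(axis=0))[0]
x_max, x_min = int(cols.max()), int(cols.min())

# Center line: vertical line through the midpoint of x_min and x_max
x_center = (x_max + x_min) / 2.0

# Target region: all pixels marked 1 to the right of the center line
target_mask = fused.astype(bool)
target_mask[:, : int(np.ceil(x_center))] = False

# First / second target regions: the gray values of images A and C at those positions
T1 = img_a[target_mask]
T2 = img_c[target_mask]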
4. And calculating the characteristic coefficient according to the gray value change condition and the rule change coefficient of the two target areas.
It should be noted that the most intuitive distinguishing characteristics of electroplating-bubble flaws and material hard-pack flaws on the surface of the electroplated hardware are their concave-convex characteristics and whether their morphology is regular. Therefore, the concave-convex characteristic of a flaw can be determined from the gray-value change of the gray-change region on the right side of the same flaw in the image, and the regular change coefficient H is then used to determine whether the morphology of the flaw is regular, thereby distinguishing the two kinds of flaw.
Specifically, the characteristic coefficient S is obtained from the gray-value changes of the first target area T1 and the second target area T2 together with the regular change coefficient H; the calculation formula is specifically:

S = H · ( Σ_{i∈T1} G1_i − Σ_{i∈T2} G2_i )

where S represents the characteristic coefficient, T1 and T2 represent the first and second target areas respectively, G1_i represents the gray value of the i-th pixel point in the first target area, G2_i represents the gray value of the i-th pixel point in the second target area, Σ represents summation over all pixel points of the corresponding area, and H represents the regular change coefficient of the suspected flaw area.
Σ_{i∈T1} G1_i and Σ_{i∈T2} G2_i are the sums of the gray values of all pixel points in the first and second target areas respectively, and their difference reflects the overall change between the regions of surface image A and surface image C at the same positions as the target region: if the difference is greater than 0, the region of surface image A has larger gray values than the region of surface image C, the flaw has a convex characteristic, and the flaw is more likely to be an electroplating bubble; if the difference is negative, the region of surface image A has smaller gray values than the region of surface image C, the flaw has a concave characteristic, and the flaw is more likely to be a material hard pack. H describes how the area of the shadow region presented by the flaw in the surface image changes with the shooting position: the closer H is to 1, the more regular the shape of the suspected flaw area and the more likely the flaw is a material hard pack; conversely, the more likely it is an electroplating bubble. Thus, when S is positive the flaw has a convex characteristic and, the less regular its shape, the more likely it is an electroplating bubble; when S is negative the flaw has a concave characteristic and, the closer H is to 1, the more likely it is a material hard pack.
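Under the reconstruction S = H · (ΣG1 − ΣG2) used above (an assumption, since the printed formula is unreadable), the computation is a few lines:

import numpy as np

# Sums of gray values over the two target regions
sum_t1 = float(np.sum(T1, dtype=np.float64))
sum_t2 = float(np.sum(T2, dtype=np.float64))

# Characteristic coefficient: the sign of (sum_t1 - sum_t2) encodes convex vs concave,
# and H weights how regular the suspected flaw region is
S = H * (sum_t1 - sum_t2)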
S003, determining the flaw type of the electroplating bubble according to the characteristic coefficient.
Specifically, a first threshold Z1 is preset; this embodiment is described with Z1 = 5 as an example, which is not a specific limitation, and Z1 may be determined according to the specific implementation. A second threshold Z2 is preset; this embodiment is described with Z2 = −1.5 as an example, which likewise is not a specific limitation, and Z2 may be determined according to the specific implementation.
Further, if S is greater than Z1, an electroplating-bubble flaw exists on the surface of the electroplated hardware; if S is less than Z2, a material hard-pack flaw exists on the surface of the electroplated hardware; otherwise, no flaw exists on the surface.
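As a sketch of this decision rule, with the example thresholds Z1 = 5 and Z2 = -1.5 from this embodiment:

# Example thresholds from the embodiment; in practice they are implementation-dependent
Z1, Z2 = 5.0, -1.5

if S > Z1:
    flaw_type = "electroplating bubble"   # convex flaw
elif S < Z2:
    flaw_type = "material hard pack"      # concave flaw
else:
    flaw_type = "no flaw"
print(flaw_type)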
In summary, based on the fact that electroplating-bubble flaws and material hard-pack flaws have different concave-convex characteristics, the invention accurately and effectively captures the gray-level changes caused by the movement of the electroplated hardware by arranging the positions of the cameras, the light source and the conveyor belt; the suspected flaw areas are obtained by analyzing the collected surface images illuminated from different directions; based on the fact that the two kinds of flaw differ in shape regularity, the regular change coefficient is obtained from the areas of the suspected flaw areas; the characteristic coefficient is then obtained from the gray-value changes of the two target areas, which correspond to the concave-convex characteristics, together with the regular change coefficient; and whether a flaw exists on the surface of the electroplated hardware and its specific type are thereby determined, so that the flaw type of the electroplated hardware is extracted more accurately.
It should be noted that: the sequence of the embodiments of the present invention is only for description, and does not represent the advantages and disadvantages of the embodiments. The processes depicted in the accompanying drawings do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing are also possible or may be advantageous.
In this specification, each embodiment is described in a progressive manner, and identical and similar parts of each embodiment are all referred to each other, and each embodiment mainly describes differences from other embodiments.

Claims (5)

1. The visual technology-based electroplated hardware flaw detection method is characterized by comprising the following steps of:
collecting a plurality of surface images of the electroplated hardware; obtaining two difference images according to the surface image;
obtaining a suspected flaw edge point set of each difference image according to edge pixel points, obtaining a suspected flaw area of each difference image according to the suspected flaw edge point set, obtaining a regular change coefficient according to the suspected flaw area, fusing two binary images of two difference images to obtain a fused binary image, obtaining a first target area and a second target area according to position distribution in the fused binary image, and obtaining a characteristic coefficient according to gray value change conditions and regular change coefficients of the two target areas;
determining the flaw type of the electroplating bubble according to the characteristic coefficient;
the method for obtaining the regular change coefficient according to the suspected flaw area comprises the following specific steps:
the calculation formula of the regular change coefficient is as follows:

H = exp( − | S(N1) − S(N2) | )

where H represents the regular change coefficient of the suspected flaw area, N1 represents the suspected flaw area of the difference image D1, N2 represents the suspected flaw area of the difference image D2, S(N1) and S(N2) represent the areas (numbers of pixels) of N1 and N2 respectively, |·| represents the absolute value, and exp(·) represents the exponential function with the natural constant e as its base;
the method for obtaining the fused binary image comprises the following specific steps:
the pixel points in the suspected flaw area in the difference image D1 are marked as 1, all other pixel points are marked as 0, so that a binary image of the difference image D1 is obtained, the pixel points in the suspected flaw area in the difference image D2 are marked as 1, all other pixel points are marked as 0, and a binary image of the difference image D2 is obtained; the binary image of the difference image D1 and the binary image of the difference image D2 are fused into a fused binary image, specifically: for the pixel points at any position in the fused binary image, only when the pixel point at the same position of the binary image of the difference image D1 and the binary image of the difference image D2 is 0, the pixel point at the same position in the fused binary image is 0, otherwise, the pixel point at the same position in the fused binary image is 1;
the characteristic coefficient obtaining method comprises the following specific steps:
the calculation formula of the characteristic coefficient S is specifically as follows:

S = H · ( Σ_{i∈T1} G1_i − Σ_{i∈T2} G2_i )

where S represents the characteristic coefficient, T1 and T2 represent the first target area and the second target area respectively, G1_i represents the gray value of the i-th pixel point in the first target area, G2_i represents the gray value of the i-th pixel point in the second target area, Σ represents summation over all pixel points of the corresponding area, and H represents the regular change coefficient of the suspected flaw area;
the method for determining the flaw type of the electroplating bubble according to the characteristic coefficient comprises the following specific steps:
presetting a first threshold Z1 and a second threshold Z2; if the characteristic coefficient S is greater than Z1, an electroplating-bubble flaw exists on the surface of the electroplated hardware; if the characteristic coefficient S is less than Z2, a material hard-pack flaw exists on the surface of the electroplated hardware; otherwise, no flaw exists on the surface.
2. The visual technology-based method for detecting defects of electroplated hardware according to claim 1, wherein the step of obtaining two difference images from the surface image comprises the following specific steps:
the acquired 3 surface images are respectively marked as a surface image A, a surface image B and a surface image C, the absolute value of the difference value of the gray values of two pixel points at the same position of the surface image A and the surface image B is marked as a difference value, the image formed by all the difference values of the surface image A and the surface image B is obtained and is marked as a difference image D1, and the image formed by all the difference values of the surface image C and the surface image B is obtained and is marked as a difference image D2.
3. The visual technology-based method for detecting defects of electroplated hardware according to claim 1, wherein the step of obtaining an initial suspected defective area of each difference image according to edge pixel points comprises the following specific steps:
respectively carrying out edge detection on the two difference images through a Canny edge detection algorithm to respectively obtain a set Q1 formed by all edge pixel points in the difference image D1 and a set Q2 formed by all edge pixel points in the difference image D2;
obtaining the intersection Q1∩Q2 of the set Q1 and the set Q2; recording the difference set Q1−(Q1∩Q2) as the suspected flaw edge point set K1 of the difference image D1, and recording the difference set Q2−(Q1∩Q2) as the suspected flaw edge point set K2 of the difference image D2; wherein ∩ represents the set intersection.
4. The visual technology-based method for detecting defects in electroplated hardware according to claim 1, wherein the step of obtaining suspected defective areas of each difference image comprises the following specific steps:
the connected domain area in the edge formed by the suspected flaw edge point set K1 of the difference image D1 is marked as an initial suspected flaw area M1 of the difference image D1;
the connected domain area in the edge formed by the suspected flaw edge point set K2 of the difference image D2 is marked as an initial suspected flaw area M2 of the difference image D2;
the method comprises the steps of obtaining a difference image D1 and a threshold value Y2 of the difference image D2 respectively through an Ojin threshold segmentation method, obtaining an area formed by pixels with gray values smaller than Y1 in an initial suspected flaw area M1 of the difference image D1, marking the area as a suspected flaw area N1 of the difference image D1, and obtaining an area formed by pixels with gray values smaller than Y2 in an initial suspected flaw area M2 of the difference image D2, marking the area as a suspected flaw area N2 of the difference image D2.
5. The visual technology-based method for detecting defects of electroplated hardware according to claim 1, wherein the steps of obtaining the first target area and the second target area comprise the following specific steps:
obtaining the maximum value x_max and the minimum value x_min of the abscissa of all pixel points marked 1 in the fused binary image; recording the straight line that passes through the abscissa (x_max + x_min)/2 and is perpendicular to the x axis as the center line, and recording the area formed by all pixel points marked 1 on the right side of the center line as the target area;
the areas on the surface image a and the surface image C at the same positions as the target areas are obtained, respectively, and are denoted as a first target area T1 and a second target area T2, respectively.
CN202310659443.0A 2023-06-06 2023-06-06 Visual technology-based electroplated hardware flaw detection method Active CN116385445B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310659443.0A CN116385445B (en) 2023-06-06 2023-06-06 Visual technology-based electroplated hardware flaw detection method


Publications (2)

Publication Number Publication Date
CN116385445A CN116385445A (en) 2023-07-04
CN116385445B true CN116385445B (en) 2023-08-11

Family

ID=86969805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310659443.0A Active CN116385445B (en) 2023-06-06 2023-06-06 Visual technology-based electroplated hardware flaw detection method

Country Status (1)

Country Link
CN (1) CN116385445B (en)

Families Citing this family (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116958144B (en) * 2023-09-20 2024-01-12 东莞市南谷第电子有限公司 Rapid positioning method and system for surface defect area of new energy connecting line
CN117238758A (en) * 2023-11-14 2023-12-15 深圳天狼芯半导体有限公司 Method for passivating SiC MOS interface defects by sacrificial oxidation NANO-P doping EPI
CN117237646B (en) * 2023-11-15 2024-01-30 深圳市润海电子有限公司 PET high-temperature flame-retardant adhesive tape flaw extraction method and system based on image segmentation
CN117274248B (en) * 2023-11-20 2024-02-02 滨州三元家纺有限公司 Visual detection method for fabric printing and dyeing flaws and defects
CN117314924B (en) * 2023-11-30 2024-02-09 湖南西欧新材料有限公司 Image feature-based electroplated product surface flaw detection method
CN117372432B (en) * 2023-12-08 2024-02-09 深圳市希格莱特科技有限公司 Electronic cigarette surface defect detection method and system based on image segmentation

Citations (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2002250700A (en) * 2001-02-23 2002-09-06 Matsushita Electric Works Ltd Method and device for inspecting pattern
JP2003149160A (en) * 2001-11-13 2003-05-21 Nec Corp Appearance inspection method and appearance inspection system
JP2006017668A (en) * 2004-07-05 2006-01-19 Toyota Motor Corp Surface flaw detecting method and surface flaw detector
CN102509300A (en) * 2011-11-18 2012-06-20 深圳市宝捷信科技有限公司 Defect detection method and system
WO2014134880A1 (en) * 2013-03-06 2014-09-12 京东方科技集团股份有限公司 Detection method and device for backlight module defects
CN109816652A (en) * 2019-01-25 2019-05-28 湖州云通科技有限公司 A kind of intricate casting defect identification method based on gray scale conspicuousness
CN110044910A (en) * 2019-05-09 2019-07-23 河南大学 A kind of automobile sets glass box components detection system and a detection method
CN112927189A (en) * 2021-01-28 2021-06-08 江苏大学 Method for eliminating edge reflection light spots in visual inspection of surface flaws of electroplated workpiece
CN114755236A (en) * 2022-04-25 2022-07-15 镇江亿诺伟视智能科技有限公司 System and method for detecting surface defects of electroplated part with revolution curved surface

Family Cites Families (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US7676110B2 (en) * 2003-09-30 2010-03-09 Fotonation Vision Limited Determination of need to service a camera based on detection of blemishes in digital images


Also Published As

Publication number Publication date
CN116385445A (en) 2023-07-04


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant