CN116402781A - Defect detection method, device, computer equipment and medium - Google Patents


Info

Publication number
CN116402781A
CN116402781A (application CN202310341573.XA)
Authority
CN
China
Prior art keywords
image
target
feature
workpiece
images
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202310341573.XA
Other languages
Chinese (zh)
Inventor
Name withheld at the inventor's request
Current Assignee
Guangdong Lyric Robot Automation Co Ltd
Original Assignee
Guangdong Lyric Robot Intelligent Automation Co Ltd
Priority date
Filing date
Publication date
Application filed by Guangdong Lyric Robot Intelligent Automation Co Ltd filed Critical Guangdong Lyric Robot Intelligent Automation Co Ltd
Priority to CN202310341573.XA
Publication of CN116402781A

Classifications

    • G PHYSICS
        • G06 COMPUTING; CALCULATING OR COUNTING
            • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
                • G06T 7/00 Image analysis
                    • G06T 7/0002 Inspection of images, e.g. flaw detection
                    • G06T 7/10 Segmentation; Edge detection
                        • G06T 7/13 Edge detection
                        • G06T 7/136 Segmentation; Edge detection involving thresholding
                • G06T 5/00 Image enhancement or restoration
                    • G06T 5/50 Image enhancement or restoration using two or more images, e.g. averaging or subtraction
                • G06T 2207/00 Indexing scheme for image analysis or image enhancement
                    • G06T 2207/20 Special algorithmic details
                        • G06T 2207/20212 Image combination
                            • G06T 2207/20221 Image fusion; Image merging
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
        • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
            • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
                • Y02P 90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
                    • Y02P 90/30 Computing systems specially adapted for manufacturing


Abstract

Embodiments of the present application provide a defect detection method, a defect detection device, computer equipment and a medium, belonging to the technical field of image detection. The method comprises the following steps: illuminating at least part of the region of a workpiece to be detected and acquiring an image of the workpiece each time, so as to obtain a plurality of workpiece images, wherein the workpiece images include an integral image acquired while the whole workpiece is illuminated; performing feature division on the integral image to obtain a feature region; synthesizing the integral image with the plurality of workpiece images to obtain a target synthetic image; and performing feature detection on the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm to determine target defect information corresponding to the feature region. Embodiments of the present application can accurately extract defect edges and improve defect detection precision.

Description

Defect detection method, device, computer equipment and medium
Technical Field
The present disclosure relates to the field of image detection technologies, and in particular, to a defect detection method, device, computer device, and medium.
Background
During appearance inspection of a device, manual visual inspection of product appearance quality is inefficient, labor-intensive and inaccurate, and is strongly affected by subjective factors such as the skill level of the inspector; this easily causes the appearance quality of carrier-tape products to fluctuate and degrades detection accuracy. The related art typically requires collecting photographs of the device, which a user then analyzes to determine whether the device has defects or damage. In appearance inspection, most common defect detection methods use an edge extraction algorithm to analyze the contour, the flatness of edges and concave-convex regions; however, as illumination conditions change, the prior art cannot detect fine defects and the edge extraction algorithm cannot extract edges accurately. This makes defect features difficult to distinguish and reduces defect detection accuracy.
Disclosure of Invention
The main purpose of the embodiments of the present application is to provide a defect detection method, device, computer equipment and medium that can accurately extract defect edges and improve defect detection accuracy.
To achieve the above object, a first aspect of an embodiment of the present application proposes a defect detection method, including:
illuminating at least part of the region of the workpiece to be detected and acquiring an image of the workpiece each time, to obtain a plurality of workpiece images, wherein the workpiece images include an integral image acquired while the whole workpiece to be detected is illuminated;
performing feature division on the integral image to obtain a feature region;
image synthesis is carried out on the integral image and the workpiece images to obtain a target synthetic image;
and carrying out feature detection on the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm, and determining target defect information corresponding to the feature region.
In some embodiments, the feature detection of the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm, and determining target defect information corresponding to the feature region, includes:
extracting edge information of the target synthetic image according to the edge detection algorithm to obtain an edge image corresponding to the characteristic region;
performing threshold setting on the edge image based on the global threshold segmentation algorithm to obtain a target threshold;
extracting gray values of all elements in the edge image to obtain a plurality of gray value information;
and carrying out binarization processing on all the gray value information according to the target threshold value to determine the target defect information.
In some embodiments, the extracting edge information from the target synthetic image according to the edge detection algorithm to obtain an edge image corresponding to the feature region includes:
performing feature sampling on the target synthetic image based on a preset sampling direction to obtain a gray value curve;
conducting derivative operation on the gray value curve according to the edge detection algorithm to obtain a plurality of gray mutation positioning points;
generating a region of interest according to a plurality of gray scale mutation positioning points;
and carrying out target interception on the target synthetic image according to the region of interest to obtain the edge image corresponding to the characteristic region.
In some embodiments, the binarizing all the gray value information according to the target threshold value to determine the target defect information includes:
comparing the target threshold value with each piece of gray value information to obtain a first element set and a second element set, wherein the elements in the first element set are elements with the gray value information larger than or equal to the target threshold value, and the elements in the second element set are elements with the gray value information smaller than the target threshold value;
marking elements in the first element set according to a preset first state value, and determining a characteristic region;
marking elements in the second element set according to a preset second state value, and determining a normal area;
and carrying out feature expression on the feature region and the normal region to obtain the target defect information.
In some embodiments, the image synthesis of the integral image and the workpiece images to obtain a target synthetic image includes:
performing weighted bias processing on the plurality of workpiece images to obtain a plurality of bias images;
determining a plurality of offset channels from the bias images, and synthesizing the bias images with the offset channels to obtain a synthetic image;
calculating the intersection ratio of the synthetic image and the feature region based on a preset speckle tool, to obtain an intersection ratio value;
and determining the target synthetic image according to the intersection ratio value, the synthetic image and a preset intersection ratio condition.
In some embodiments, the calculating of the intersection ratio of the synthetic image and the feature region based on the preset speckle tool includes:
determining a feature width, a feature height and center point coordinates from the feature region;
binarizing the gray values of the synthetic image to obtain a first feature region of the synthetic image;
performing threshold segmentation on the first feature region with the speckle tool to obtain a second feature region;
determining a synthesized feature width, a synthesized feature height and synthesized center point coordinates from the second feature region;
and calculating the intersection ratio from the feature width, the feature height, the center point coordinates, the synthesized feature width, the synthesized feature height and the synthesized center point coordinates, to obtain the intersection ratio value.
In some embodiments, the determining of the target synthetic image according to the intersection ratio value, the synthetic image and the preset intersection ratio condition includes:
comparing the intersection ratio value with the preset intersection ratio condition;
determining the target synthetic image from the synthetic image when the intersection ratio value meets the intersection ratio condition;
or,
when the intersection ratio value does not meet the intersection ratio condition, continuing the weighted bias processing of the plurality of workpiece images to obtain an iterated intersection ratio value, until the iterated intersection ratio value meets the intersection ratio condition.
A second aspect of an embodiment of the present application proposes a defect detection apparatus, the apparatus comprising:
the image acquisition module is used for illuminating at least a part of the area of the workpiece to be detected each time and carrying out image acquisition on the workpiece to be detected to obtain a plurality of workpiece images, wherein the workpiece images comprise integral images obtained by carrying out integral illumination on the workpiece to be detected;
the feature division module is used for carrying out feature division on the integral image to obtain a feature region;
the image synthesis module is used for carrying out image synthesis on the integral image and the workpiece images to obtain a target synthesis image;
and the defect determining module is used for carrying out feature detection on the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm, and determining target defect information corresponding to the feature region.
A third aspect of the embodiments of the present application proposes a computer device comprising a memory and a processor, wherein the memory stores a computer program, which when executed by the processor is configured to perform the defect detection method according to any of the embodiments of the first aspect of the present application.
A fourth aspect of the embodiments of the present application proposes a storage medium being a computer readable storage medium storing a computer program for performing the defect detection method according to any one of the embodiments of the first aspect of the present application when the computer program is executed by a computer.
The defect detection method, device, computer equipment and medium provided by the embodiments of the present application have the following beneficial effects. First, at least part of the region of the workpiece to be detected is illuminated each time while an image of the workpiece is acquired, yielding a plurality of workpiece images; the integral image among them is feature-divided to obtain a specific feature region, which facilitates subsequent accurate identification of defects within that region. Then the integral image and the plurality of workpiece images are synthesized into a target synthetic image, giving the synthetic image uniform illumination and improving defect detection accuracy. Finally, feature detection is performed on the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm, so the edge contour of the feature region can be determined accurately and the target defect information corresponding to the feature region determined, achieving accurate detection of defect information and improving the accuracy of defect localization on the workpiece to be detected.
Drawings
FIG. 1 is a flow chart of a defect detection method provided by one embodiment of the present application;
fig. 2 is a specific flowchart of step S104 in fig. 1;
fig. 3 is a specific flowchart of step S201 in fig. 2;
fig. 4 is a specific flowchart of step S204 in fig. 2;
fig. 5 is a specific flowchart of step S103 in fig. 1;
fig. 6 is a specific flowchart of step S503 in fig. 5;
fig. 7 is a specific flowchart of step S504 in fig. 5;
FIG. 8 is a schematic diagram of a defect detecting device according to an embodiment of the present disclosure;
fig. 9 is a schematic hardware structure of a computer device according to an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
It should be noted that, although functional blocks are divided in the device diagrams and a logical order is shown in the flowcharts, in some cases the steps shown or described may be performed in an order different from the block division or the flowchart. The terms "first", "second" and the like in the description, the claims and the above drawings are used to distinguish similar elements and do not necessarily describe a particular sequence or chronological order.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs. The terminology used herein is for the purpose of describing embodiments of the present application only and is not intended to be limiting of the present application.
The defect detection method provided by the embodiments of the present application may be applied at a terminal, at a server, or in software running on either. In some embodiments, the terminal may be a smartphone, tablet, notebook, desktop computer, smart watch, or the like; the server may be configured as an independent physical server, a server cluster or distributed system formed of several physical servers, or a cloud server providing basic cloud computing services such as cloud services, cloud databases, cloud computing, cloud functions, cloud storage, network services, cloud communication, middleware services, domain name services, security services, content delivery networks (Content Delivery Network, CDN), big data and artificial intelligence platforms; the software may be an application implementing the above method, but is not limited to that form.
Embodiments of the present application may be used in a variety of general-purpose or special-purpose computer system environments or configurations. For example: personal computers, server computers, hand-held or portable devices, tablet devices, multiprocessor systems, microprocessor-based systems, set top boxes, programmable consumer electronics, network PCs, minicomputers, mainframe computers, distributed computing environments that include any of the above systems or devices, and the like. The application may be described in the general context of computer-executable instructions, such as program modules, being executed by a computer. Generally, program modules include routines, programs, objects, components, data structures, etc. that perform particular tasks or implement particular abstract data types. The application may also be practiced in distributed computing environments where tasks are performed by remote processing devices that are linked through a communications network. In a distributed computing environment, program modules may be located in both local and remote computer storage media including memory storage devices.
Referring to fig. 1, fig. 1 is a flowchart of a defect detection method according to an embodiment of the present application. In some embodiments, the defect detection method includes, but is not limited to, steps S101 to S104.
Step S101, illuminating at least a part of the area of the workpiece to be detected each time and collecting images of the workpiece to be detected to obtain a plurality of workpiece images;
the workpiece image includes an integral image obtained by integral illumination and collection of the workpiece to be measured.
In some embodiments, a light source emitter is arranged above the workpiece to be detected and is switched on and off by a light source controller. During each illumination of the workpiece, the rear, left, front and right segments of the light source emitter and the overall lighting are controlled in sequence, so that at least part of the region of the workpiece is illuminated each time; images of the workpiece are acquired under each of these region-wise illumination patterns, producing a plurality of workpiece images for subsequent image synthesis.
It can be understood that, during image acquisition of the workpiece to be detected, the rear-quarter light source emitter may be switched on first and the camera triggered to capture a first image; the left-quarter, front-quarter and right-quarter light source emitters and finally the overall light source emitter are then switched on in turn, producing a second, third and fourth image and the integral image. The first to fourth images together with the integral image constitute the plurality of workpiece images.
It should be noted that, in the present embodiment, the opening sequence and the irradiation area of the light source emitter can be adjusted according to the needs of the user, and the present embodiment is not limited specifically.
Step S102, carrying out feature division on the whole image to obtain a feature region;
in some embodiments, the integral image is subjected to feature division, and an obvious feature is defined in the integral image, so that a feature area is obtained, and the defect feature is conveniently calculated later.
Step S103, performing image synthesis on the whole image and the plurality of workpiece images to obtain a target synthetic image;
in some embodiments, image synthesis is performed on the whole image and the plurality of workpiece images, where the image synthesis in the embodiment is based on multi-channel image synthesis, so as to obtain a target synthesis image, so that subsequent positioning operation on target defect information is facilitated.
Step S104, feature detection is carried out on the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm, and target defect information corresponding to the feature area is determined.
It should be noted that the preset edge detection algorithm may be the Sobel operator, a differential edge detection method, the Laplace edge detection operator, or the like; in this embodiment the Sobel operator is used.
In some embodiments, steps S101 to S104 proceed as follows. First, at least part of the region of the workpiece to be detected is illuminated each time while an image of the workpiece is acquired, yielding a plurality of workpiece images; the integral image among them is feature-divided to obtain a specific feature region, which facilitates subsequent accurate recognition of defects within that region. Then the integral image and the plurality of workpiece images are synthesized into a target synthetic image, giving the synthetic image uniform illumination and improving defect detection accuracy. Finally, feature detection is performed on the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm; the edge contour of the feature region can be determined accurately, so the target defect information corresponding to the feature region can be determined, achieving accurate detection of defect information and improving the accuracy of defect localization on the workpiece to be detected.
Referring to fig. 2, fig. 2 is a specific flowchart of step S104 provided in the embodiment of the present application. In some embodiments, step S104 specifically includes, but is not limited to, steps S201 to S204.
Step S201, extracting edge information of a target synthetic image according to an edge detection algorithm to obtain an edge image corresponding to a characteristic region;
step S202, threshold setting is carried out on the edge image based on a global threshold segmentation algorithm to obtain a target threshold;
step S203, extracting gray values of all elements in the edge image to obtain a plurality of gray value information;
step S204, binarizing all gray value information according to the target threshold value to determine target defect information.
In steps S201 to S204 of some embodiments, edge information is first extracted from the target synthetic image with the edge detection algorithm, locating the position of the feature region and yielding the edge image corresponding to it. A threshold is then set on the edge image with the global threshold segmentation algorithm, giving a target threshold, i.e. a designated gray-value threshold. Gray values are extracted for all elements of the edge image, giving the gray-value information of each element. Finally, all gray-value information is binarized against the target threshold, so that pixels satisfying the threshold are extracted and the target defect information is determined. This achieves accurate extraction of the target defect information, facilitates its subsequent localization, and improves the machining precision of the workpiece.
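The edge extraction and global thresholding of steps S201 to S204 can be sketched in a few lines of NumPy. This is a minimal illustration, not the patent's implementation: the kernels match the Sobel operator the description names, but the iterative mean-split threshold is only one common realization of global threshold segmentation, and all function names are our own.

```python
import numpy as np

def sobel_magnitude(img):
    """Gradient magnitude with 3x3 Sobel kernels (border pixels left at zero)."""
    kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]], dtype=float)
    ky = kx.T
    h, w = img.shape
    out = np.zeros((h, w), dtype=float)
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            patch = img[y - 1:y + 2, x - 1:x + 2].astype(float)
            out[y, x] = np.hypot((patch * kx).sum(), (patch * ky).sum())
    return out

def global_threshold(img, eps=0.5):
    """Iterative mean-split global threshold: move t to the midpoint of the
    two class means until it stabilises, giving the 'target threshold'."""
    t = float(img.mean())
    while True:
        lo, hi = img[img < t], img[img >= t]
        new_t = ((lo.mean() if lo.size else t) + (hi.mean() if hi.size else t)) / 2
        if abs(new_t - t) < eps:
            return new_t
        t = new_t
```

The double loop keeps the sketch dependency-free; a production system would use a vectorized convolution instead.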
Referring to fig. 3, fig. 3 is a specific flowchart of step S201 provided in the embodiment of the present application. In some embodiments, step S201 specifically includes, but is not limited to, steps S301 to S304.
Step S301, performing feature sampling on a target synthetic image based on a preset sampling direction to obtain a gray value curve;
step S302, deriving a gray value curve according to an edge detection algorithm to obtain a plurality of gray mutation positioning points;
step S303, generating a region of interest according to a plurality of gray scale mutation positioning points;
and step S304, carrying out target interception on the target synthetic image according to the region of interest to obtain an edge image corresponding to the region of interest.
In steps S301 to S304 of some embodiments, during edge information extraction from the target synthetic image, the image is first feature-sampled along a preset sampling direction to obtain a gray-value curve in that direction. A derivative operation is then applied to the gray-value curve according to the edge detection algorithm; the points of abrupt gray-value change give a plurality of gray mutation positioning points, from which the edges are determined and a region of interest (Region Of Interest, ROI) is generated. Finally, the region of interest is cropped from the target synthetic image to obtain the edge image corresponding to it.
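The sampling-and-derivative scheme of steps S301 to S304 amounts to differentiating a 1-D gray-value profile and keeping the positions where the derivative jumps. A small sketch under our own naming (row-wise sampling and the jump threshold are illustrative assumptions; the patent does not fix either):

```python
import numpy as np

def gray_mutation_points(img, row, jump=50):
    """Sample the gray-value curve along one row (the preset sampling
    direction) and return indices where a first-order difference -- a
    discrete derivative -- reaches the jump threshold."""
    profile = img[row, :].astype(float)
    deriv = np.diff(profile)
    return (np.flatnonzero(np.abs(deriv) >= jump) + 1).tolist()

def roi_from_points(points):
    """Region of interest spanned by the first and last mutation points."""
    return (min(points), max(points))
```

The returned (start, end) span would then be used to crop the edge image out of the target synthetic image.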
Referring to fig. 4, fig. 4 is a specific flowchart of step S204 provided in the embodiment of the present application. In some embodiments, step S204 specifically includes, but is not limited to, steps S401 to S404.
Step S401, comparing a target threshold value with each gray value information to obtain a first element set and a second element set;
the elements in the first element set are elements whose gray value information is greater than or equal to the target threshold value, and the elements in the second element set are elements whose gray value information is less than the target threshold value.
Step S402, marking elements in a first element set according to a preset first state value, and determining a characteristic region;
step S403, marking elements in the second element set according to a preset second state value, and determining a normal area;
and step S404, carrying out feature expression on the feature area and the normal area to obtain target defect information.
In steps S401 to S404 of some embodiments, the gray-value information of each element is compared with the target threshold: elements greater than or equal to the target threshold are placed in the first element set, and elements smaller than the target threshold in the second element set. Elements of the first set are marked with a preset first state value so that they are displayed, determining the feature region (displayed state); elements of the second set are marked with a preset second state value so that they are transparent, i.e. hidden, determining the normal region. Finally, feature expression is applied to the feature region and the normal region so that the defect feature region stands out as much as possible, yielding the target defect information; this localizes the target defect information and improves defect detection accuracy.
It should be noted that in this embodiment a 0/255 binarization is applied to the first and second element sets: the preset first state value is 255 and the second state value is 0, so elements of the first set are set to 255 and elements of the second set to 0. This marks defect information distinctly and facilitates subsequent localization and inspection of the workpiece.
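The 0/255 marking above reduces to a single thresholded assignment; a minimal sketch (the function name is ours, the state values 255 and 0 follow this embodiment):

```python
import numpy as np

def mark_defects(gray, target_threshold, first_state=255, second_state=0):
    """Binarize the edge image: the first element set (gray >= threshold) is
    marked with the first state value and shown; the second set is marked
    with the second state value and hidden, so only the defect/feature
    region remains visible."""
    return np.where(gray >= target_threshold, first_state, second_state).astype(np.uint8)
```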
Referring to fig. 5, fig. 5 is a specific flowchart of step S103 provided in the embodiment of the present application. In some embodiments, step S103 specifically includes, but is not limited to, steps S501 to S504.
Step S501, carrying out weighted bias processing on a plurality of workpiece images to obtain a plurality of bias images;
in some embodiments, each workpiece image is subjected to weighted bias processing to obtain a plurality of bias images, so that the brightness of the bias images is uniform, and the situation that the brightness of the images is too bright or too dark is avoided.
During the weighted bias processing of the workpiece images, the original gray values of each workpiece image are first acquired; the gray values of each image are then weighted and biased using a preset offset value and weight value, producing a plurality of bias images. This lowers the gray values of bright areas of the workpiece image and raises those of dark areas, improving the overall uniformity of image brightness.
It is noted that, during the weighted bias processing, an image weight value for the workpiece image is first determined from a randomly generated weight matrix, and an image offset value is determined from the preset offset values. The image gray value is then weighted and biased with this weight and offset to obtain a target gray value, and the brightness of the workpiece image is adjusted accordingly, lowering bright gray values and raising dark ones. Applying this to each workpiece image yields a plurality of bias images with uniform brightness.
It should be noted that, the specific process of weighting bias calculation is shown in the following formula (1):
y=a1*x+b1 (1)
wherein a1 represents an image weight value, b1 represents an image offset value, x represents an image gray value, and y is a weighted offset target gray value.
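Formula (1) applied per pixel can be sketched as follows. Clipping the result to the valid 8-bit range is our assumption; the patent only gives y = a1*x + b1:

```python
import numpy as np

def weighted_bias(img, a1, b1):
    """Apply y = a1 * x + b1 to every gray value x, then clip to 0..255 so
    bright regions are pulled down and dark regions lifted without overflow."""
    y = a1 * img.astype(float) + b1
    return np.clip(y, 0, 255).astype(np.uint8)
```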
Step S502, determining a plurality of offset channels according to the offset images, and performing image synthesis on the offset images and the offset channels to obtain a synthetic image;
in some embodiments, the offset channel corresponding to each offset image is determined from the offset images, and the offset images and their offset channels are synthesized into a complete new synthetic image, thereby realizing image synthesis.
During image synthesis, each pixel of the offset images is acquired first; the offset images are then ordered to obtain an offset sequence, which prevents the image order from being scrambled during synthesis and corrupting regions of the synthetic image; finally, the pixels and the offset channel corresponding to each offset image are accumulated in the order of the offset sequence to obtain the synthetic image.
It should be noted that the coefficient multiplying each pixel of an offset image does not exceed 1, which keeps every accumulated gray value within 255 and avoids data overflow.
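A minimal sketch of the accumulation described above, assuming the per-image coefficients are supplied by the caller and sum to at most 1 so the accumulated gray value cannot overflow the 8-bit range (the equal 0.5 weights in the usage are illustrative only):

```python
import numpy as np

def synthesize(offset_images, coefficients):
    """Weighted pixel-wise accumulation of the ordered offset images.
    Coefficients summing to at most 1 keep every accumulated gray value
    within the 8-bit range, avoiding data overflow."""
    assert sum(coefficients) <= 1.0 + 1e-9, "coefficients must not exceed 1 in total"
    acc = np.zeros(offset_images[0].shape, dtype=np.float64)
    for img, c in zip(offset_images, coefficients):
        acc += c * img.astype(np.float64)  # accumulate in float, never in uint8
    return np.clip(acc, 0, 255).astype(np.uint8)

imgs = [np.full((2, 2), 200, dtype=np.uint8),
        np.full((2, 2), 100, dtype=np.uint8)]
composite = synthesize(imgs, [0.5, 0.5])  # 0.5*200 + 0.5*100 = 150
```

Accumulating in a float buffer and casting back only at the end is what actually prevents the uint8 wrap-around the note warns about.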
Step S503, performing intersection-ratio calculation on the synthetic image and the feature region based on a preset speckle tool to obtain an intersection ratio;
in some embodiments, the intersection ratio (intersection over union, IoU) between the synthetic image and the defect feature information is calculated based on a preset speckle (blob-analysis) tool, which improves the image precision of the synthetic image and addresses the problem of uneven defect imaging.
Step S504, determining a target synthetic image according to the intersection ratio, the synthetic image, and a preset intersection-ratio condition.
In some embodiments, the intersection ratio is compared with the preset intersection-ratio condition to judge whether the current synthetic image meets the requirement, and the target synthetic image is determined according to the result, thereby realizing uniform imaging.
Referring to fig. 6, fig. 6 is a specific flowchart of step S503 provided in the embodiment of the present application. In some embodiments, step S503 specifically includes, but is not limited to, steps S601 to S605.
Step S601, determining a feature width, a feature height, and center point coordinates according to the feature region;
in some embodiments, the feature width, feature height, and center point coordinates of the feature are first determined from the feature region of the integral image and denoted w0, h0, and (x0, y0).
Step S602, performing gray-value binarization on the synthetic image to obtain a first feature region of the synthetic image;
in some embodiments, gray-value binarization of the synthetic image highlights the defect features in it, yielding the first feature region and improving the efficiency of the defect feature search.
Step S603, performing threshold segmentation on the first feature region based on the preset speckle tool to obtain a second feature region;
in some embodiments, threshold segmentation is performed on the first feature region based on the preset speckle tool, where the areas of the segmented regions of the synthetic image are taken into account during segmentation and the segmented results are screened by area to obtain the second feature region.
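The patent relies on a preset speckle (blob) tool for this step; as a stand-in only, the binarization and area-screened segmentation could be sketched with a simple 4-connected flood labelling (the threshold and minimum area below are illustrative assumptions):

```python
from collections import deque
import numpy as np

def binarize(image, threshold):
    """Gray-value binarization: pixels at or above the threshold become
    foreground (255), the rest background (0)."""
    return np.where(image >= threshold, 255, 0).astype(np.uint8)

def screen_by_area(binary, min_area):
    """Tiny stand-in for a blob tool's area screening: label 4-connected
    foreground regions and keep only those of sufficient area."""
    h, w = binary.shape
    seen = np.zeros((h, w), dtype=bool)
    kept = np.zeros((h, w), dtype=np.uint8)
    for sy in range(h):
        for sx in range(w):
            if binary[sy, sx] == 255 and not seen[sy, sx]:
                region, queue = [], deque([(sy, sx)])
                seen[sy, sx] = True
                while queue:  # flood-fill one connected region
                    y, x = queue.popleft()
                    region.append((y, x))
                    for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                        if 0 <= ny < h and 0 <= nx < w and \
                           binary[ny, nx] == 255 and not seen[ny, nx]:
                            seen[ny, nx] = True
                            queue.append((ny, nx))
                if len(region) >= min_area:  # screen out small blobs
                    for y, x in region:
                        kept[y, x] = 255
    return kept

composite = np.array([[200, 200, 10],
                      [200, 200, 10],
                      [10,  10, 200]], dtype=np.uint8)
binary = binarize(composite, 128)
second_region = screen_by_area(binary, min_area=2)  # drops the 1-pixel blob
```

In practice a vision library's connected-components routine would replace the hand-rolled flood fill; the sketch only shows the binarize-then-screen-by-area logic.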
Step S604, determining a synthesized feature width, a synthesized feature height, and synthesized center point coordinates according to the second feature region;
in some embodiments, the synthesized feature width w1, the synthesized feature height h1, and the synthesized center point coordinates (x1, y1) are determined from the second feature region.
Step S605, performing intersection-ratio calculation on the feature width, feature height, and center point coordinates together with the synthesized feature width, synthesized feature height, and synthesized center point coordinates to obtain the intersection ratio.
In some embodiments, the intersection ratio between the second feature region and the feature region is calculated from the widths, heights, and center coordinates, so as to judge whether the synthetic image meets the standard and to realize uniform defect feature imaging.
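The intersection-ratio computation from (w0, h0, (x0, y0)) and (w1, h1, (x1, y1)) can be sketched as a standard box IoU, treating both feature regions as axis-aligned rectangles (an assumption the patent does not state explicitly):

```python
def iou_from_boxes(x0, y0, w0, h0, x1, y1, w1, h1):
    """Intersection over union of two axis-aligned boxes given as
    (center x, center y, width, height)."""
    # Convert center/size to corner coordinates.
    ax1, ay1, ax2, ay2 = x0 - w0 / 2, y0 - h0 / 2, x0 + w0 / 2, y0 + h0 / 2
    bx1, by1, bx2, by2 = x1 - w1 / 2, y1 - h1 / 2, x1 + w1 / 2, y1 + h1 / 2
    # Overlap extents; negative overlap means the boxes are disjoint.
    iw = max(0.0, min(ax2, bx2) - max(ax1, bx1))
    ih = max(0.0, min(ay2, by2) - max(ay1, by1))
    inter = iw * ih
    union = w0 * h0 + w1 * h1 - inter
    return inter / union if union else 0.0

# Identical boxes give 1.0; disjoint boxes give 0.0.
full = iou_from_boxes(10, 10, 4, 4, 10, 10, 4, 4)
none = iou_from_boxes(0, 0, 2, 2, 10, 10, 2, 2)
```

An intersection ratio close to 1 means the segmented defect region in the synthetic image coincides with the feature region of the integral image, i.e. the imaging is uniform.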
Referring to fig. 7, fig. 7 is a specific flowchart of step S504 provided in the embodiment of the present application. In some embodiments, step S504 specifically includes, but is not limited to, steps S701 to S703.
Step S701, comparing the intersection ratio with a preset intersection-ratio condition;
step S702, determining the target synthetic image according to the synthetic image when the intersection ratio satisfies the intersection-ratio condition.
In steps S701 to S702 of some embodiments, the intersection ratio is compared with the preset intersection-ratio condition; when the intersection ratio satisfies the condition, the synthetic image has reached the specified standard and may be used directly as the target synthetic image.
It should be noted that the intersection-ratio condition may be set according to the user's needs, and this embodiment does not specifically limit it.
Step S703, when the intersection ratio does not satisfy the intersection-ratio condition, continuing the weighted offset processing on the plurality of workpiece images to obtain an iterated intersection ratio, until the iterated intersection ratio satisfies the intersection-ratio condition.
In some embodiments, if the intersection ratio does not satisfy the preset intersection-ratio condition, the synthetic image does not yet meet the specified standard; steps S501 to S503 are repeated, that is, the weighted offset processing is performed again on the plurality of workpiece images and the weights are iteratively optimized, until the iterated intersection ratio satisfies the condition, at which point the iterated synthetic image corresponding to that intersection ratio is taken as the target synthetic image.
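The iterate-until-the-condition-holds loop can be sketched as below; the uniform re-sampling of weights is a placeholder assumption, since the patent only states that the weights are randomly generated and iteratively optimized:

```python
import random

def search_composite(compose_with_weights, compute_iou, iou_threshold,
                     max_iters=100, seed=0):
    """Re-draw random weights and recompose until the synthetic image's
    intersection ratio against the feature region meets the preset
    condition; fall back to the best candidate if the budget runs out."""
    rng = random.Random(seed)
    best = None
    for _ in range(max_iters):
        # Placeholder weight sampling; three source images assumed.
        weights = [rng.uniform(0.1, 1.0) for _ in range(3)]
        composite = compose_with_weights(weights)
        iou = compute_iou(composite)
        if best is None or iou > best[0]:
            best = (iou, composite)
        if iou >= iou_threshold:  # condition met: this is the target image
            return composite, iou
    return best[1], best[0]

# Stub callables stand in for steps S501-S503.
composite, iou = search_composite(lambda w: sum(w), lambda c: 1.0, 0.9)
```

The two callables stand in for the weighted-offset/synthesis step and the speckle-tool IoU step, so the loop structure mirrors repeating steps S501 to S503.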
Referring to fig. 8, an embodiment of the present application further provides a defect detection apparatus, which may implement the defect detection method, where the apparatus includes:
the image acquisition module 801 is configured to illuminate at least a partial area of the workpiece to be measured each time and capture an image of the workpiece, obtaining a plurality of workpiece images, wherein the workpiece images include an integral image obtained by illuminating and capturing the workpiece to be measured as a whole;
the feature division module 802 is configured to perform feature division on the integral image to obtain a feature region;
the image synthesis module 803 is configured to perform image synthesis on the integral image and the plurality of workpiece images to obtain a target synthetic image;
the defect determining module 804 is configured to perform feature detection on the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm, and determine target defect information corresponding to the feature region.
The specific processing procedure of the defect detection apparatus of this embodiment is the same as that of the defect detection method described above and is not repeated here.
The embodiment of the application also provides a computer device comprising a memory and a processor, where the memory stores a computer program that, when executed by the processor, performs the defect detection method of the embodiments of the present application.
Referring to fig. 9, fig. 9 is a schematic hardware structure of a computer device according to an embodiment of the present application.
The hardware structure of the computer device is described in detail below with reference to fig. 9. The computer device includes: a processor 910, a memory 920, an input/output interface 930, a communication interface 940, and a bus 950.
The processor 910 may be implemented by a general-purpose CPU (Central Processing Unit), a microprocessor, an application-specific integrated circuit (Application Specific Integrated Circuit, ASIC), or one or more integrated circuits, and is configured to execute related programs to implement the technical solutions provided in the embodiments of the present application;
the memory 920 may be implemented in the form of a read-only memory (Read Only Memory, ROM), a static storage device, a dynamic storage device, or a random access memory (Random Access Memory, RAM). The memory 920 may store an operating system and other application programs; when the technical solutions provided in the embodiments of the present application are implemented in software or firmware, the relevant program code is stored in the memory 920 and invoked by the processor 910 to perform the defect detection method of the embodiments of the present application;
an input/output interface 930 for inputting and outputting information;
the communication interface 940 is configured to implement communication interaction between the device and other devices, and may implement communication in a wired manner (e.g., USB, network cable, etc.), or may implement communication in a wireless manner (e.g., mobile network, WIFI, bluetooth, etc.); and a bus 950 for transferring information between components of the device (e.g., processor 910, memory 920, input/output interface 930, and communication interface 940);
wherein processor 910, memory 920, input/output interface 930, and communication interface 940 implement communication connections among each other within the device via a bus 950.
The embodiment of the present application also provides a storage medium, which is a computer-readable storage medium storing a computer program; when the computer program is executed by a computer, it is used to perform the defect detection method of the above embodiments of the present application.
The memory, as a non-transitory computer readable storage medium, may be used to store non-transitory software programs as well as non-transitory computer executable programs. In addition, the memory may include high-speed random access memory, and may also include non-transitory memory, such as at least one magnetic disk storage device, flash memory device, or other non-transitory solid state storage device. In some embodiments, the memory optionally includes memory remotely located relative to the processor, the remote memory being connectable to the processor through a network. Examples of such networks include, but are not limited to, the internet, intranets, local area networks, mobile communication networks, and combinations thereof.
The embodiments described in the embodiments of the present application are for more clearly describing the technical solutions of the embodiments of the present application, and do not constitute a limitation on the technical solutions provided by the embodiments of the present application, and as those skilled in the art can know that, with the evolution of technology and the appearance of new application scenarios, the technical solutions provided by the embodiments of the present application are equally applicable to similar technical problems.
It will be appreciated by those skilled in the art that the solutions shown in fig. 1-7 are not limiting to embodiments of the present application, and may include more or fewer steps than illustrated, or may combine certain steps, or different steps.
The above described apparatus embodiments are merely illustrative, wherein the units illustrated as separate components may or may not be physically separate, i.e. may be located in one place, or may be distributed over a plurality of network elements. Some or all of the modules may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
Those of ordinary skill in the art will appreciate that all or some of the steps of the methods, systems, functional modules/units in the devices disclosed above may be implemented as software, firmware, hardware, and suitable combinations thereof.
The terms "first," "second," "third," "fourth," and the like in the description of the present application and in the above-described figures, if any, are used for distinguishing between similar objects and not necessarily for describing a particular sequential or chronological order. It is to be understood that the data so used may be interchanged where appropriate such that embodiments of the present application described herein may be implemented in sequences other than those illustrated or otherwise described herein. Furthermore, the terms "comprises," "comprising," and "having," and any variations thereof, are intended to cover a non-exclusive inclusion, such that a process, method, system, article, or apparatus that comprises a list of steps or elements is not necessarily limited to those steps or elements expressly listed but may include other steps or elements not expressly listed or inherent to such process, method, article, or apparatus.
It should be understood that in this application, "at least one" means one or more, and "a plurality" means two or more. "And/or" describes an association between associated objects and indicates that three relationships may exist; for example, "A and/or B" may mean: only A, only B, or both A and B, where A and B may be singular or plural. The character "/" generally indicates an "or" relationship between the associated objects. "At least one of" a list of items means any combination of those items, including any combination of single or plural items. For example, at least one of a, b, or c may mean: a, b, c, "a and b", "a and c", "b and c", or "a and b and c", where a, b, and c may be singular or plural.
In the several embodiments provided in this application, it should be understood that the disclosed apparatus and method may be implemented in other ways. For example, the apparatus embodiments described above are merely illustrative, e.g., the division of elements is merely a logical functional division, and there may be additional divisions of actual implementation, e.g., multiple elements or components may be combined or integrated into another system, or some features may be omitted, or not performed. Alternatively, the coupling or direct coupling or communication connection shown or discussed with each other may be an indirect coupling or communication connection via some interfaces, devices or units, which may be in electrical, mechanical or other form.
The units described as separate units may or may not be physically separate, and units shown as units may or may not be physical units, may be located in one place, or may be distributed over a plurality of network units. Some or all of the units may be selected according to actual needs to achieve the purpose of the solution of this embodiment.
In addition, each functional unit in each embodiment of the present application may be integrated in one processing unit, or each unit may exist alone physically, or two or more units may be integrated in one unit. The integrated units may be implemented in hardware or in software functional units.
The integrated units, if implemented in the form of software functional units and sold or used as stand-alone products, may be stored in a computer readable storage medium. Based on such understanding, the technical solution of the present application may be embodied in essence or a part contributing to the prior art or all or part of the technical solution in the form of a software product stored in a storage medium, including multiple instructions to cause a computer device (which may be a personal computer, a server, or a network device, etc.) to perform all or part of the steps of the methods of the embodiments of the present application. And the aforementioned storage medium includes: a U-disk, a removable hard disk, a Read-Only Memory (ROM), a random access Memory (Random Access Memory, RAM), a magnetic disk, or an optical disk, or other various media capable of storing a program.
Preferred embodiments of the present application are described above with reference to the accompanying drawings, and thus do not limit the scope of the claims of the embodiments of the present application. Any modifications, equivalent substitutions and improvements made by those skilled in the art without departing from the scope and spirit of the embodiments of the present application shall fall within the scope of the claims of the embodiments of the present application.

Claims (10)

1. A method of defect detection, the method comprising:
illuminating at least a part of the area of the workpiece to be detected each time and carrying out image acquisition on the workpiece to be detected to obtain a plurality of workpiece images, wherein the workpiece images comprise integral images obtained by carrying out integral illumination on the workpiece to be detected and acquiring;
performing feature division on the integral image to obtain a feature region;
image synthesis is carried out on the integral image and the workpiece images to obtain a target synthetic image;
and carrying out feature detection on the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm, and determining target defect information corresponding to the feature region.
2. The defect detection method according to claim 1, wherein the performing feature detection on the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm, and determining target defect information corresponding to the feature region, comprises:
extracting edge information of the target synthetic image according to the edge detection algorithm to obtain an edge image corresponding to the characteristic region;
performing threshold setting on the edge image based on the global threshold segmentation algorithm to obtain a target threshold;
extracting gray values of all elements in the edge image to obtain a plurality of gray value information;
and carrying out binarization processing on all the gray value information according to the target threshold value to determine the target defect information.
3. The defect detection method according to claim 2, wherein the extracting edge information of the target synthetic image according to the edge detection algorithm to obtain an edge image corresponding to the feature region comprises:
performing feature sampling on the target synthetic image based on a preset sampling direction to obtain a gray value curve;
conducting derivative operation on the gray value curve according to the edge detection algorithm to obtain a plurality of gray mutation positioning points;
generating a region of interest according to a plurality of gray scale mutation positioning points;
and carrying out target interception on the target synthetic image according to the region of interest to obtain the edge image corresponding to the region of interest.
4. The defect detection method of claim 2, wherein the binarizing all the gray value information according to the target threshold value to determine the target defect information comprises:
comparing the target threshold value with each piece of gray value information to obtain a first element set and a second element set, wherein the elements in the first element set are elements with the gray value information larger than or equal to the target threshold value, and the elements in the second element set are elements with the gray value information smaller than the target threshold value;
marking elements in the first element set according to a preset first state value, and determining a characteristic region;
marking elements in the second element set according to a preset second state value, and determining a normal area;
and carrying out feature expression on the feature region and the normal region to obtain the target defect information.
5. The defect detection method according to claim 1, wherein the performing image synthesis on the integral image and the plurality of workpiece images to obtain the target synthetic image comprises:
performing weighted offset processing on the plurality of workpiece images to obtain a plurality of offset images;
determining a plurality of offset channels according to the offset images, and performing image synthesis on the offset images and the offset channels to obtain a synthetic image;
performing intersection-ratio calculation on the synthetic image and the feature region based on a preset speckle tool to obtain an intersection ratio;
and determining the target synthetic image according to the intersection ratio, the synthetic image, and a preset intersection-ratio condition.
6. The defect detection method according to claim 5, wherein the performing intersection-ratio calculation on the synthetic image and the feature region based on the preset speckle tool to obtain the intersection ratio comprises:
determining feature width, feature height and center point coordinates according to the feature region;
performing gray value binarization processing on the synthetic image to obtain a first characteristic region of the synthetic image;
threshold segmentation is carried out on the first characteristic region based on the speckle tool, and a second characteristic region is obtained;
determining a synthesized feature width, a synthesized feature height and synthesized center point coordinates according to the second feature region;
and calculating the intersection ratio of the feature width, the feature height, the center point coordinate, the synthesized feature width, the synthesized feature height and the synthesized center point coordinate to obtain the intersection ratio.
7. The defect detection method according to claim 5, wherein the determining the target synthetic image according to the intersection ratio, the synthetic image, and a preset intersection-ratio condition comprises:
comparing the intersection ratio with the preset intersection-ratio condition;
determining the target synthetic image according to the synthetic image under the condition that the intersection ratio satisfies the intersection-ratio condition;
or,
continuing to perform weighted offset processing on the plurality of workpiece images to obtain an iterated intersection ratio under the condition that the intersection ratio does not satisfy the intersection-ratio condition, until the iterated intersection ratio satisfies the intersection-ratio condition.
8. A defect detection apparatus, the apparatus comprising:
the image acquisition module is used for illuminating at least a partial area of the workpiece to be detected each time and carrying out image acquisition on the workpiece to be detected to obtain a plurality of workpiece images, wherein the workpiece images comprise an integral image obtained by carrying out integral illumination on the workpiece to be detected and acquiring an image;
the feature division module is used for carrying out feature division on the integral image to obtain a feature region;
the image synthesis module is used for carrying out image synthesis on the integral image and the workpiece images to obtain a target synthesis image;
and the defect determining module is used for carrying out feature detection on the target synthetic image based on a preset edge detection algorithm and a global threshold segmentation algorithm, and determining target defect information corresponding to the feature region.
9. A computer device comprising a memory and a processor, wherein the memory has stored therein a computer program which, when executed by the processor, is adapted to carry out the defect detection method according to any one of claims 1 to 7.
10. A storage medium, characterized in that the storage medium is a computer-readable storage medium, wherein the computer-readable storage medium stores a computer program for performing the defect detection method according to any one of claims 1 to 7 when the computer program is executed by a computer.
CN202310341573.XA 2023-03-31 2023-03-31 Defect detection method, device, computer equipment and medium Pending CN116402781A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202310341573.XA CN116402781A (en) 2023-03-31 2023-03-31 Defect detection method, device, computer equipment and medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202310341573.XA CN116402781A (en) 2023-03-31 2023-03-31 Defect detection method, device, computer equipment and medium

Publications (1)

Publication Number Publication Date
CN116402781A true CN116402781A (en) 2023-07-07

Family

ID=87013738

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202310341573.XA Pending CN116402781A (en) 2023-03-31 2023-03-31 Defect detection method, device, computer equipment and medium

Country Status (1)

Country Link
CN (1) CN116402781A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN117129527A (en) * 2023-08-30 2023-11-28 江苏瑞意隆建设工程有限公司 Urban road paving quality detection method and system



Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination