CN116091419A - Defect identification method and device based on similarity, electronic equipment and storage medium

Publication number
CN116091419A
CN116091419A
Authority
CN
China
Prior art keywords
defect
information
defects
image
combination
Prior art date
Legal status
Pending
Application number
CN202211639365.XA
Other languages
Chinese (zh)
Inventor
易振彧
田倬韬
徐佳锋
蔡淳昊
张岳晨
刘枢
吕江波
沈小勇
Current Assignee
Shenzhen Smartmore Technology Co Ltd
Original Assignee
Shenzhen Smartmore Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Shenzhen Smartmore Technology Co Ltd
Priority to CN202211639365.XA
Publication of CN116091419A
Legal status: Pending

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0004: Industrial image inspection
    • G06T 7/0006: Industrial image inspection using a design-rule based approach
    • G06T 7/001: Industrial image inspection using an image reference approach
    • G06V: IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00: Arrangements for image or video recognition or understanding
    • G06V 10/70: Arrangements for image or video recognition or understanding using pattern recognition or machine learning
    • G06V 10/74: Image or video pattern matching; Proximity measures in feature spaces
    • G06V 10/761: Proximity, similarity or dissimilarity measures
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/30: Subject of image; Context of image processing
    • G06T 2207/30108: Industrial image inspection
    • G06T 2207/30141: Printed circuit board [PCB]
    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Engineering & Computer Science (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Quality & Reliability (AREA)
  • Computing Systems (AREA)
  • Artificial Intelligence (AREA)
  • Health & Medical Sciences (AREA)
  • Databases & Information Systems (AREA)
  • Evolutionary Computation (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Software Systems (AREA)
  • Multimedia (AREA)
  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The application provides a defect identification method and device based on similarity, an electronic device and a storage medium. The method comprises the following steps: acquiring an image to be detected; determining a first type of defects in the image to be detected and defect information of each defect in the first type of defects, the defect information including at least one of position information, angle information, gradient information, connection point information and texture information; in the first type of defects, determining a relationship score of each defect combination according to the defect information of each defect, where a defect combination is formed by any two defects and the relationship score characterizes the degree of similarity between the two pieces of defect information corresponding one-to-one to the two defects in the combination; and, among the various defect combinations, identifying the defect combinations whose relationship scores conform to a preset score as a second type of defects, so as to obtain a defect identification result of the image to be detected. By the method, the accuracy of defect identification of the PCB can be improved.

Description

Defect identification method and device based on similarity, electronic equipment and storage medium
Technical Field
The present disclosure relates to the field of image processing technologies, and in particular, to a defect identification method and apparatus based on similarity, an electronic device, and a storage medium.
Background
With the wide application of products such as automotive electronics, communication equipment, transformers, inductance devices and power modules in daily life, and with the rapid development of electronic information and communication technology, the market has placed higher demands on high-transmission, high-voltage electronic products. As one of the basic carrier components of such electronic products, the wireless charging coil directly influences the performance of the corresponding electronic product.
Since the wireless charging coil is composed of a plurality of regions, each of which is very small and complex, a large number of crack defects inevitably arise during production, and these crack defects appear in various forms. It is therefore necessary to detect the crack defects of the wireless charging coil so that subsequent repair work can be carried out. At present, crack defect detection of wireless charging coils is mainly performed either by manual sampling inspection or by a "conveyor belt + camera" method; in the latter, workers only need to check the quality of the wireless charging coil from the pictures captured by the camera, which improves coil inspection efficiency.
However, both the existing manual sampling inspection and the "conveyor belt + camera" method often judge two different cracks as the same crack, or judge two parts of the same crack as two different cracks. The accuracy of crack defect identification is therefore not high and cannot meet the defect identification requirements for wireless charging coil images in current industrial scenarios.
Disclosure of Invention
The application provides a defect identification method and device based on similarity, electronic equipment and a computer readable storage medium, which can improve the accuracy of defect identification on a PCB.
In a first aspect, the present application provides a defect identification method based on similarity, including:
acquiring an image to be detected;
determining a first type of defect in an image to be detected and defect information of each defect in the first type of defect; the defect information includes at least one of position information, angle information, gradient information, connection point information, and texture information;
in the first type of defects, determining a relationship score of each defect combination according to the defect information of each defect; a defect combination is formed by any two defects, and the relationship score is used for characterizing the degree of similarity between the two pieces of defect information corresponding one-to-one to the two defects in the defect combination;
and identifying, among the various defect combinations, the defect combinations whose relationship scores conform to a preset score as a second type of defects, so as to obtain a defect identification result of the image to be detected.
In a second aspect, the present application further provides a defect identifying device based on similarity, including:
the acquisition unit is used for acquiring the image to be detected;
the determining unit is used for determining the first type of defects in the image to be detected and defect information of each defect in the first type of defects; the defect information includes at least one of position information, angle information, gradient information, connection point information, and texture information;
a scoring unit, configured to determine, in the first type of defects, a relationship score of each defect combination according to the defect information of each defect; a defect combination is formed by any two defects, and the relationship score is used for characterizing the degree of similarity between the two pieces of defect information corresponding one-to-one to the two defects in the defect combination;
and the identification unit is used for identifying, among the various defect combinations, the defect combinations whose relationship scores conform to a preset score as a second type of defects, so as to obtain a defect identification result of the image to be detected.
In a third aspect, the present application further provides an electronic device, where the electronic device includes a memory and a processor, where the memory stores a computer program, and where the processor implements a similarity-based defect identification method as described above when executing the computer program.
In a fourth aspect, the present application further provides a computer readable storage medium having stored thereon a computer program which, when executed by a processor, implements a similarity-based defect identification method as described above.
In a fifth aspect, the present application also provides a computer program product comprising a computer program which, when executed by a processor, implements a similarity-based defect identification method as described above.
According to the similarity-based defect identification method and device, the electronic device and the computer readable storage medium provided above, on the one hand, the relationship score of each defect combination in the first type of defects is determined using the position information, angle information, gradient information, connection point information and texture information of each defect in the first type of defects, which simplifies the subsequent PCB defect identification process, improves the efficiency of PCB defect identification and ensures real-time identification; on the other hand, some defects in the first type of defects are identified as the second type of defects using the relationship score of each defect combination in the first type of defects, which improves the accuracy of PCB defect identification and ensures high robustness of PCB defect identification.
Drawings
In order to more clearly illustrate the embodiments of the present application or the technical solutions in the prior art, the drawings that are needed in the embodiments will be briefly described below, and it is obvious that the drawings in the following description are only some embodiments of the present application, and that other drawings can be obtained according to these drawings without inventive effort for a person skilled in the art.
Fig. 1 is an application environment diagram of a defect identification method based on similarity according to an embodiment of the present application;
FIG. 2 is a flowchart of a first similarity-based defect identification method according to an embodiment of the present disclosure;
FIG. 3 is a schematic flow chart of determining relationship scores between defects according to an embodiment of the present application;
FIG. 4 is a schematic flow chart of a first method for identifying defects of a second type based on relationship scores between defects according to an embodiment of the present application;
FIG. 5 is a flow chart for identifying a second type of defect according to a relationship score between defects according to an embodiment of the present application;
FIG. 6 is a schematic flow chart of determining the total defect number of an image to be detected according to an embodiment of the present application;
fig. 7 is a schematic flow chart of region segmentation of an image to be detected according to an embodiment of the present application;
FIG. 8 is a schematic diagram of an interface for identifying separation areas in an image to be detected according to an embodiment of the present disclosure;
FIG. 9 is a flowchart of a second similarity-based defect identification method according to an embodiment of the present disclosure;
FIG. 10 is a block diagram of a defect identification device based on similarity according to an embodiment of the present application;
FIG. 11 is a block diagram of an electronic device provided by an embodiment of the present application;
FIG. 12 is a block diagram of a computer-readable storage medium provided by an embodiment of the present application;
fig. 13 is a block diagram of a computer program product provided by an embodiment of the present application.
Detailed Description
In order to make the objects, technical solutions and advantages of the present application more apparent, the present application will be further described in detail with reference to the accompanying drawings and examples. It should be understood that the specific embodiments described herein are for purposes of illustration only and are not intended to limit the present application.
The term "and/or" in embodiments of the present application refers to any and all possible combinations including one or more of the associated listed items. Also described are: as used in this specification, the terms "comprises/comprising" and/or "includes" specify the presence of stated features, integers, steps, operations, elements, and/or components, but do not preclude the presence or addition of one or more other features, integers, steps, operations, elements, and/or components, and/or groups thereof.
The terms "first," "second," and the like in this application are used for distinguishing between different objects and not for describing a particular sequential order. Furthermore, the terms "comprise" and "have," as well as any variations thereof, are intended to cover a non-exclusive inclusion. For example, a process, method, system, article, or apparatus that comprises a list of steps or elements is not limited to only those listed steps or elements but may include other steps or elements not listed or inherent to such process, method, article, or apparatus.
In addition, although the terms "first", "second", etc. may be used several times in this application to describe various operations (or various elements or various applications or various instructions or various data), these operations (or elements or applications or instructions or data) should not be limited by these terms. These terms are only used to distinguish one operation (or element or application or instruction or data) from another. For example, without departing from the scope of the present application, a first number of pixel features may be referred to as a second number of pixel features, and similarly a second number of pixel features may be referred to as a first number of pixel features; both are corresponding sets of pixel features on circuit boards, but they are not the corresponding set of pixel features on the same circuit board.
The defect identification method based on the similarity, provided by the embodiment of the application, can be applied to an application environment shown in fig. 1. Wherein the electronic device 102 communicates with the server 104 via a communication network. The data storage system may store data that the server 104 needs to process. The data storage system may be integrated on the server 104 or may be located on a cloud or other network server.
In some embodiments, referring to fig. 1, the server 104 first obtains an image to be detected; then determines the first type of defects in the image to be detected and the defect information of each defect in the first type of defects, the defect information including at least one of position information, angle information, gradient information, connection point information and texture information; then, in the first type of defects, determines a relationship score of each defect combination according to the defect information of each defect, where a defect combination is formed by any two defects and the relationship score characterizes the degree of similarity between the two pieces of defect information corresponding one-to-one to the two defects in the combination; and finally, among the various defect combinations, identifies the defect combinations whose relationship scores conform to a preset score as the second type of defects, so as to obtain a defect identification result of the image to be detected.
In some embodiments, the electronic device 102 (e.g., a mobile terminal or a fixed terminal) may be implemented in a variety of forms. The electronic device 102 may be a mobile terminal capable of determining the relationship score of each defect combination based on the defect information of each defect, such as a mobile phone, a smart phone, a notebook computer, a portable handheld device, a personal digital assistant (PDA, Personal Digital Assistant) or a tablet computer (PAD), or the electronic device 102 may be a fixed terminal capable of determining the relationship score of each defect combination based on the defect information of each defect, such as an automated teller machine (Automated Teller Machine, ATM), an access-control all-in-one machine, a digital TV or a desktop computer.
In the following, it is assumed that the electronic device 102 is a fixed terminal. However, those skilled in the art will appreciate that configurations according to embodiments disclosed herein can also be applied to mobile-type electronic devices 102, if there are operations or elements specifically for mobile purposes.
In some embodiments, the image processing components and data processing components running by server 104 may load any of a variety of additional server applications and/or middle tier applications being executed, including, for example, HTTP (hypertext transfer protocol), FTP (file transfer protocol), CGI (common gateway interface), RDBMS (relational database management system), and the like.
In some embodiments, the server 104 may be implemented as a stand-alone server or as a cluster of servers. The server 104 may be adapted to run one or more application services or software components that provide the electronic device 102 described in the foregoing disclosure.
In some embodiments, the application services may include, for example, a service for extracting the first type of defect of the image to be detected and defect information of each defect in the first type of defect, a service for providing the user with a relationship score for determining each defect combination based on the defect information of the first type of defect after the first type of defect is extracted, and so on. The software components may include, for example, an APP or a client having a defect recognition function for the PCB image.
In some embodiments, the server 104 has an APP or client that performs defect detection functions on PCB images, and includes a portal port that provides one-to-one application services to users in the foreground and a plurality of business systems that perform data processing in the background to extend the application of defect identification functions to the APP or client so that users can perform use and access of PCB defect identification functions anywhere at any time.
In some embodiments, the PCB defect recognition function of the APP or client may be a computer program running in user mode to accomplish some specific job or jobs, which may interact with the user and have a visual user interface. Wherein, APP or client may include two parts: a Graphical User Interface (GUI) and an engine (engine) with which a user can be provided with a digitized client system of various application services in the form of a user interface.
In some embodiments, a user may input corresponding code data or control parameters to an APP or client via an input device to execute an application service of a computer program and display the application service in a user interface. For example, when the first type of defects in the image to be detected and the defect information of each defect in the first type of defects need to be extracted, a user operates through an input device and displays the first type of defects through a user interface. Alternatively, the input device may be a touch screen input, a key input, a voice input, or a pupil focus input, among others.
In some embodiments, the operating system on which the APP or client runs may include various versions of the Microsoft Windows, Apple and/or Linux operating systems, various commercial or UNIX-like operating systems (including but not limited to various GNU/Linux operating systems, Google operating systems, etc.), and/or mobile operating systems, as well as other online or offline operating systems, which are not particularly limited herein.
In some embodiments, as shown in fig. 2, a defect identifying method based on similarity is provided, and the method is applied to the server 104 in fig. 1 for illustration, and the method includes the following steps:
step S11, obtaining an image to be detected.
In some embodiments, a user acquires a shot image of a circuit board to be tested in real time through an image pickup device in the electronic device, and then the electronic device sends the acquired shot image to a server for subsequent data processing to obtain an image to be tested of the circuit board to be tested.
In some embodiments, the captured image of the circuit board to be tested may be captured by a camera device or other device in the electronic device in advance and stored in a third party mechanism (such as an image database, a cloud storage platform, etc.), and when the server responds to receiving an instruction for starting to obtain the image to be tested of the circuit board to be tested selected by the user, the server directly obtains the captured image from the corresponding third party mechanism as the image to be tested.
In some embodiments, the camera in the electronic device is an automated optical inspection device (Automated Optical Inspection, AOI) or an automated visual inspection (automated vision inspection, AVI), where AOI or AVI is a device that integrates image sensing technology, data processing technology, motion control technology, which is based on optical principles to detect common defects encountered in PCB circuit board solder production. When automatic detection is carried out, the AOI machine or the AVI machine automatically scans the PCB through the camera, and the shooting image of the circuit board to be detected is acquired.
In some embodiments, the AOI machine or AVI machine may itself be mounted with an image capturing device of one of a depth camera, a 3D camera, a monocular camera, a binocular camera, or the like, and generate corresponding control information according to user input to capture a captured image of the circuit board under test.
In some embodiments, the captured image of the circuit board under test is a PCB (printed circuit board) image. The PCB surface is divided into a circuit board area and a non-circuit board area.
In some embodiments, the circuit board area of the PCB carries a PCB circuit etched with a chemical agent, on which the circuit elements and their sharp corners are tiny and dense (that is, the sharp-corner regions protrude outward from the four right-angled parts of a square region; two adjacent straight lines of the square region form mutually intersecting oblique lines at their approaching ends, two adjacent oblique lines intersect to form a sharp corner, and these oblique lines are called sharp-corner line segments). Owing to the tension of the agent, a large number of defects (for example holes, mouse bites, open circuits, short circuits, burrs and residual copper) are inevitably produced by the manufacturing process during PCB production, and the sharp corners on the PCB circuit are often falsely detected points (bright openings, substrate reflection, local oxidation, dirty spots and the like can produce additional false points on the circuit elements). Thus, the image to be detected acquired by the AOI machine or the AVI machine is a PCB image containing various defects and sharp corners.
Step S12, determining the first type of defects in the image to be detected and defect information of each defect in the first type of defects.
In some embodiments, the server extracts feature information from the acquired image to be detected through a preset neural network, and then performs preliminary defect identification based on the feature information of the image to be detected to determine the first type of defects in the image to be detected. The first type of defects are coarsely screened defects obtained by this preliminary identification; that is, their number is large, and they may include defects that are identified repeatedly, single defects that are split and identified as several defects, or defects that are missed, so the identification accuracy of the first type of defects is not high.
In some embodiments, the server cuts the image to be detected with the first type of defects identified, then extracts all the corners of the first type of defects of the image to be detected by using a Harris corner detection method, and finally, performs feature analysis on the extracted corners by using a preset neural network (such as a convolutional neural network, a semantic segmentation network and the like) to identify defect information of each defect in the first type of defects.
In some embodiments, the server may also input the image to be detected with the first type of defect identified into a pre-trained UNet convolutional neural network model to directly identify defect information of each defect in the first type of defect from the convolutional neural network model.
In some embodiments, the defect information of each defect in the first type of defect includes at least one of position information, angle information, gradient information, connection point information, and texture information of each defect.
In some embodiments, gradient information refers to the directional change of the intensity or color of the image to be detected.
In some embodiments, the texture information is represented by the amount of gradient change between a pixel point in the image to be detected and each pixel point in its surrounding 4×4 neighborhood.
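By way of illustration only, the following Python sketch shows one possible way to compute such a gradient-based texture descriptor for a pixel from its surrounding 4×4 range; the use of OpenCV Sobel gradients, the function name texture_descriptor and the border handling are assumptions of this sketch and are not prescribed by the present application.

```python
import numpy as np
import cv2

def texture_descriptor(gray: np.ndarray, x: int, y: int, win: int = 4) -> np.ndarray:
    """Describe the texture at (x, y) by the gradient differences between the
    centre pixel and every pixel in the surrounding win x win neighbourhood.
    A sketch only; the exact formulation is an assumption."""
    # Per-pixel gradient magnitude (Sobel), used as the "gradient" of each pixel.
    gx = cv2.Sobel(gray, cv2.CV_32F, 1, 0, ksize=3)
    gy = cv2.Sobel(gray, cv2.CV_32F, 0, 1, ksize=3)
    mag = cv2.magnitude(gx, gy)

    half = win // 2
    patch = mag[max(y - half, 0):y + half, max(x - half, 0):x + half]
    # Amount of gradient change between the centre pixel and its neighbours.
    return (patch - mag[y, x]).ravel()
```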
Step S13, determining the relation score of each defect combination according to the defect information of each defect in the first type of defects.
In some embodiments, the relationship score is used to characterize, in the first type of defects, the degree of similarity between the two pieces of defect information corresponding one-to-one to the two defects in a defect combination, where a defect combination is formed by any two defects.
In some embodiments, the server inputs the first type of defect and the defect information of the first type of defect identified in the image to be detected into a preset decision tree to calculate the similarity degree of the corresponding defect information between the two defects in each defect combination, so as to obtain the relation score of each defect combination.
In some embodiments, the relationship score for each defect combination includes respective degrees of similarity for location information, angle information, gradient information, connection point information, and texture information for each defect between each two first type defects in the image to be detected.
Step S14: identifying, among the various defect combinations, the defect combinations whose relationship scores conform to the preset score as the second type of defects, to obtain a defect identification result of the image to be detected.
In some embodiments, the server obtains the defect identification result of the image to be detected according to the relationship between the degrees of similarity of the position information, angle information, gradient information, connection point information and texture information of each pair of first-type defects in the image to be detected and the corresponding preset threshold constants. If the calculated degree of similarity of the various defect information is smaller than the corresponding preset threshold constant, the defect identification result of the PCB is that the two corresponding defects belong to two independent defects; if the degree of similarity of the various defect information is greater than or equal to the corresponding preset threshold constant, the defect identification result of the PCB is that the two corresponding defects belong to one single defect.
In some embodiments, according to the defect identification result of the PCB, the server fuses the two first-type defects identified as belonging to one single defect to obtain a new, fused defect. The server then identifies the new defects obtained after fusion as the second type of defects, thereby obtaining the final defect identification result of the image to be detected.
In the above similarity-based defect identification method, on the one hand, the relationship score of each defect combination in the first type of defects is determined using the position information, angle information, gradient information, connection point information and texture information of each defect in the first type of defects, which simplifies the subsequent PCB defect identification process, improves the efficiency of PCB defect identification and ensures real-time identification; on the other hand, some defects in the first type of defects are identified as the second type of defects using the relationship score of each defect combination in the first type of defects, which improves the accuracy of PCB defect identification and ensures high robustness of PCB defect identification.
It will be appreciated by those skilled in the art that the methods disclosed in the above embodiments may be implemented in more specific ways. For example, the above-described embodiment of the similarity-based defect identification method is merely illustrative.
In an exemplary embodiment, referring to fig. 3, fig. 3 is a flow chart illustrating an embodiment of determining a relationship score between defects in the present application. In step S13, the server determines, among the defects of the first type, a relationship score for each defect combination according to the defect information of each defect, specifically by:
step S131, for each defect information, determining a sub-relationship score corresponding to the defect information for each defect combination.
In some embodiments, the sub-relationship scores for each defect combination corresponding to the defect information are used to characterize a degree of similarity between two defects in each defect combination corresponding to one of the defect information of the location information, the angle information, the gradient information, the connection point information, and the texture information.
In some embodiments, the server determining sub-relationship scores for each defect combination corresponding to defect information may include, for example: for the location information, a distance size between two defects within each defect combination corresponding to the coordinate location is determined to determine a sub-relationship score for the location information corresponding to each defect combination. The larger the distance between two defects corresponding to the coordinate positions is, the smaller the sub-relation score of the corresponding position information is, and on the contrary, the smaller the distance between two defects corresponding to the coordinate positions is, the larger the sub-relation score of the corresponding position information is.
In some embodiments, the server determining sub-relationship scores for each defect combination corresponding to defect information may include, for example: for angle information, a degree of difference between two defects within each defect combination corresponding to an angle direction is determined to determine a sub-relationship score for each defect combination corresponding to the angle information. The greater the degree of difference between the two defects corresponding to the angular direction, the smaller the sub-relationship score corresponding to the angular direction, whereas the lesser the degree of difference between the two defects corresponding to the angular direction, the greater the sub-relationship score corresponding to the angular direction.
In some embodiments, the server determining sub-relationship scores for each defect combination corresponding to defect information may include, for example: for the gradient information, a degree of difference between the two defects within each defect combination corresponding to the color intensity variation is determined to determine a sub-relationship score for each defect combination corresponding to the gradient information. The greater the degree of difference between the two defects corresponding to the color intensity variation, the smaller the sub-relationship score of the corresponding gradient information, whereas the lesser the degree of difference between the two defects corresponding to the color intensity variation, the greater the sub-relationship score of the corresponding gradient information.
In some embodiments, the server determining sub-relationship scores for each defect combination corresponding to defect information may include, for example: for the connection point information, a distance size corresponding to the connection point between two defects within each defect combination is determined to determine a sub-relationship score for the connection point information corresponding to each defect combination. The larger the distance between the two defects corresponding to the connection point is, the smaller the sub-relation score of the information corresponding to the connection point is, otherwise, the smaller the distance between the two defects corresponding to the connection point is, the larger the sub-relation score of the information corresponding to the connection point is.
In some embodiments, the server determining sub-relationship scores for each defect combination corresponding to defect information may include, for example: for texture information, a degree of difference between two defects within each defect combination corresponding to a change in texture direction is determined to determine a sub-relationship score for the texture information corresponding to each defect combination. The greater the degree of difference between the two defects corresponding to the change of the texture direction, the smaller the sub-relationship score of the corresponding texture information, and on the contrary, the smaller the degree of difference between the two defects corresponding to the change of the texture direction, the greater the sub-relationship score of the corresponding texture information.
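By way of illustration only, the sketch below shows one way to turn such a distance or degree of difference into a sub-relationship score, so that a larger difference yields a smaller score; the exponential mapping, the scale parameter and the example coordinates are assumptions of this sketch rather than details taken from the present application.

```python
import numpy as np

def sub_score(difference: float, scale: float = 1.0) -> float:
    """Map a non-negative distance/difference to a similarity score in (0, 1].
    The description only requires that a larger difference gives a smaller
    score; the exponential form and 'scale' are illustrative assumptions."""
    return float(np.exp(-difference / scale))

# Example: sub-relationship score for the position information of one defect
# pair, assuming 'a' and 'b' are the (x, y) centre coordinates of the defects.
a, b = np.array([120.0, 80.0]), np.array([128.0, 83.0])
p_position = sub_score(np.linalg.norm(a - b), scale=50.0)
```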
Step S132, determining the relation score of each defect combination according to the various sub-relation scores corresponding to each defect combination.
In each defect combination, the sub-relationship scores corresponding to the various types of defect information are combined and spliced to obtain the spliced relationship score corresponding to that defect combination.
In some embodiments, corresponding to step S132, the server performs a combined splice on the sub-relationship score P1 of the corresponding texture information, the sub-relationship score P2 of the corresponding connection point information, the sub-relationship score P3 of the corresponding gradient information, the sub-relationship score P4 of the corresponding angle direction, and the sub-relationship score P5 of the corresponding position information between the two defects to obtain spliced relationship scores (P1, P2, P3, P4, P5).
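By way of illustration only, the combined splicing of the five sub-relationship scores could be represented as a simple tuple, as in the following sketch; the field names and example values are assumptions.

```python
from typing import NamedTuple

class RelationScore(NamedTuple):
    """Spliced relationship score (P1..P5) for one defect combination, in the
    (texture, connection point, gradient, angle, position) order used above.
    Field names are illustrative."""
    p1_texture: float
    p2_connection: float
    p3_gradient: float
    p4_angle: float
    p5_position: float

# Splicing the five sub-relationship scores of one defect pair:
score = RelationScore(p1_texture=0.91, p2_connection=0.85,
                      p3_gradient=0.78, p4_angle=0.88, p5_position=0.95)
```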
In an exemplary embodiment, referring to fig. 4, fig. 4 is a flowchart illustrating an embodiment of identifying a second type of defect according to a relationship score between defects in the present application. In step S14, the server identifies, among the various defect combinations, the defects whose relationship scores meet the preset scores as defects of the second type, specifically by:
and A1, determining the size relation between each seed relation score and the corresponding preset score in the relation scores after combination and splicing in each defect combination.
In some embodiments, a preset relationship score R1 corresponding to texture information, a preset relationship score R2 corresponding to connection point information, a preset relationship score R3 corresponding to gradient information, a preset relationship score R4 corresponding to an angle direction, and a preset relationship score R5 corresponding to position information between two defects are preset in the server in step S14. Further, the server determines the magnitude relation between each sub-relation score Pi and each corresponding preset relation score Ri.
Step A2: identifying the two defects in any defect combination having at least one sub-relationship score smaller than the corresponding preset score as the second type of defects.
In some embodiments, corresponding to step S14, the server identifies two defects in the defect combination having at least one sub-relationship score Pi smaller than the corresponding category of preset score Ri as the second type of defect according to the magnitude relationship of each sub-relationship score Pi and each corresponding preset relationship score Ri. That is, in the various sub-relationship scores corresponding to the defect combinations calculated by the server, if at least one of the various sub-relationship scores corresponding to the distance between the positions of the two defects, the degree of difference in color intensity variation, the degree of difference in angle direction variation, the distance between the connection points, and the degree of difference in texture gradient variation is smaller than the corresponding preset threshold constant, the server identifies the two defects in the corresponding defect combination as the second type of defects. If two defects in the defect combination belong to the second type of defects, the two defects are two independent defects.
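By way of illustration only, the following sketch expresses steps A1 and A2 as a single comparison; the preset score values R1-R5 used in the example are assumptions.

```python
from typing import Sequence

def is_two_independent_defects(sub_scores: Sequence[float],
                               preset_scores: Sequence[float]) -> bool:
    """Step A1/A2 sketch: if at least one sub-relationship score Pi is smaller
    than its corresponding preset score Ri, the two defects of the combination
    are identified as second-type (independent) defects."""
    return any(p < r for p, r in zip(sub_scores, preset_scores))

# Example with assumed sub-scores (P1..P5) and preset scores (R1..R5):
print(is_two_independent_defects([0.91, 0.85, 0.40, 0.88, 0.95],
                                 [0.60, 0.60, 0.60, 0.60, 0.60]))  # True
```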
In an exemplary embodiment, referring to fig. 5, fig. 5 is a flowchart illustrating another embodiment of identifying defects of the second type according to the relationship scores between defects in the present application. In step S132 and step S14, the server determines a relationship score of each defect combination according to each sub-relationship score corresponding to each defect combination, and then, in each defect combination, the server identifies a defect combination whose relationship score meets a preset score as a second type defect, which is implemented by the following steps:
and B1, respectively fusing sub-relation scores corresponding to various defect information with corresponding weight coefficients in each defect combination.
In some embodiments, corresponding to step S132, the server fuses the sub-relationship score P1 of the corresponding texture information, the sub-relationship score P2 of the corresponding connection point information, the sub-relationship score P3 of the corresponding gradient information, the sub-relationship score P4 of the corresponding angle direction, and the sub-relationship score P5 of the corresponding position information between the two defects according to the corresponding preset weight coefficients to obtain the fused relationship scores (a×P1, b×P2, c×P3, d×P4, e×P5).
Step B2: in each defect combination, taking the sum of the sub-relationship scores fused with their corresponding weight coefficients as the relationship score of that defect combination.
In some embodiments, corresponding to step S132, the server takes the sum of the sub-relationship scores fused with the corresponding weight coefficients as the relationship score between the two defects in the corresponding defect combination, i.e., Px = a×P1 + b×P2 + c×P3 + d×P4 + e×P5, where x denotes a defect combination.
Step B3: identifying the two defects in any defect combination whose relationship score is smaller than the preset score as the second type of defects.
In some embodiments, corresponding to step S14, the server identifies two defects in the defect combination with the corresponding relationship score Px smaller than the preset score Rx as the second type of defects according to the magnitude relationship between the relationship score Px and the preset score Rx in the defect combination. If two defects in the defect combination belong to the second type of defects, the two defects are two independent defects.
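By way of illustration only, the following sketch expresses steps B1 to B3 as a weighted sum compared against a preset score; the weight coefficients a-e and the preset score Rx in the example are assumptions.

```python
def fused_relationship_score(sub_scores, weights):
    """Step B1/B2 sketch: Px = a*P1 + b*P2 + c*P3 + d*P4 + e*P5.
    The weight values are assumptions; the description leaves them unspecified."""
    return sum(w * p for w, p in zip(weights, sub_scores))

# Step B3 sketch: compare Px against the preset score Rx (value assumed).
sub_scores = [0.91, 0.85, 0.40, 0.88, 0.95]   # P1..P5
weights = [0.3, 0.2, 0.2, 0.15, 0.15]         # a..e (assumed)
Rx = 0.75                                     # preset score (assumed)
Px = fused_relationship_score(sub_scores, weights)
independent = Px < Rx   # True -> second type (two independent defects)
```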
In an exemplary embodiment, referring to fig. 6, fig. 6 is a flowchart illustrating an embodiment of determining the total defect number of the image to be detected in the present application. After step S14, the server may also perform the following implementation:
and C1, fusing two defects in the defect combination with the relation score not conforming to the preset score in various defect combinations to obtain a third type of fused defects.
In some embodiments, if the server calculates that, among the defect combinations, there are defect combinations whose combined and spliced relationship scores are greater than or equal to the corresponding preset scores, the two defects in each such defect combination are fused to obtain a fused third-type defect. Alternatively, if the server calculates that there are defect combinations whose weight-fused relationship scores are greater than or equal to the corresponding preset scores, the server fuses the two defects in each such defect combination to obtain a fused third-type defect.
Step C2: taking the number of defects corresponding to the second type of defects and the third type of defects as the total number of defects in the image to be detected.
In some embodiments, the second type of defects are the two defects in a defect combination whose combined and spliced relationship score and/or weight-fused relationship score is smaller than the corresponding preset score, and they can be regarded as two independent defects in the image to be detected; a third-type defect is the new defect obtained by fusing the two defects in a defect combination whose combined and spliced relationship score and/or weight-fused relationship score is greater than or equal to the corresponding preset score, and it can be regarded as one independent defect in the image to be detected. Therefore, the server takes the sum of the numbers of second-type defects and third-type defects as the total number of defects in the image to be detected.
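By way of illustration only, the following sketch counts the total number of defects by fusing every defect pair judged to be the same defect and counting the resulting groups; the use of a union-find structure to propagate fusions across chains of pairs is an implementation assumption and is not prescribed by the present application.

```python
def total_defect_count(num_defects: int, conforming_pairs) -> int:
    """Sketch of steps C1/C2: fuse every defect pair whose relationship score
    meets the preset score, then count the resulting defects."""
    parent = list(range(num_defects))

    def find(i):
        # Path-halving union-find lookup.
        while parent[i] != i:
            parent[i] = parent[parent[i]]
            i = parent[i]
        return i

    for i, j in conforming_pairs:        # pairs identified as the same defect
        parent[find(i)] = find(j)

    return len({find(i) for i in range(num_defects)})

# Example: 5 first-type defects; defects 0 and 1 are two halves of one crack.
print(total_defect_count(5, [(0, 1)]))   # 4
```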
In an exemplary embodiment, referring to fig. 7, fig. 7 is a flow chart illustrating an embodiment of performing region segmentation on an image to be detected in the present application. Prior to step S12, the server may also perform the following implementation:
and D1, carrying out image recognition on the image to be detected to obtain the characteristic information of the image to be detected.
In some embodiments, the feature information at least includes gray information, color information, texture information, and shape information of each pixel in the image to be detected.
Step D2: performing region segmentation on the image to be detected based on the feature information to obtain a plurality of segmented regions.
In some embodiments, the average similarity of the feature information corresponding to each pixel point in each partition area is greater than a preset similarity, and the average similarity of the feature information corresponding to each pixel point in different partition areas is less than or equal to the preset similarity.
In some embodiments, the server divides the image to be detected into a plurality of mutually disjoint areas (segmented images) according to the feature information, so that the feature information shows consistency or similarity in the same area (i.e. the average similarity of the feature information is greater than the preset similarity), and shows obvious differences between different areas (i.e. the average similarity of the feature information is less than or equal to the preset similarity).
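By way of illustration only, the following sketch divides an image into mutually disjoint regions whose grey-level features are internally similar; the concrete choice of Otsu thresholding plus connected components is an assumption of this sketch, since the present application does not fix a particular segmentation algorithm.

```python
import cv2
import numpy as np

def segment_regions(image_bgr: np.ndarray):
    """Minimal segmentation sketch: group pixels with similar grey-level
    features by Otsu thresholding and connected-component labelling."""
    gray = cv2.cvtColor(image_bgr, cv2.COLOR_BGR2GRAY)
    _, binary = cv2.threshold(gray, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)
    num_labels, labels = cv2.connectedComponents(binary)
    # labels[y, x] == k assigns pixel (x, y) to segmented region k (0 = background).
    return num_labels, labels
```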
In some embodiments, the server may locate boundaries of each crack defect (e.g., edge lines, edge curves, etc. of the crack defect) in the image to be detected by segmenting the image into segmented images.
In some embodiments, before the image to be detected is segmented, the server may perform edge enhancement on the image to be detected, so that each crack defect contour in the image to be detected is more prominent, or the contrast of the image to be detected is increased, so that the image area can be segmented more accurately.
In some embodiments, the edge enhancement of the image to be detected performed by the server may include first performing edge detection on the image to be detected, then highlighting the image edges and weakening or completely removing the image regions other than the edges according to the edge detection result, and finally performing binarization processing on the image. After edge enhancement, the boundary brightness and the crack defect contours of the image to be detected are more prominent than the brightness and crack defect contours around the edges in the original image, and the contrast is higher.
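By way of illustration only, the following sketch performs such edge enhancement by detecting edges, weakening the regions outside the edges and binarizing the result; the Canny thresholds and the weakening factor are assumed values.

```python
import cv2
import numpy as np

def edge_enhance(gray: np.ndarray) -> np.ndarray:
    """Edge-enhancement sketch: detect edges, suppress non-edge regions, then
    binarize, so that crack contours stand out with higher contrast."""
    edges = cv2.Canny(gray, 50, 150)                 # edge detection
    weakened = (gray * 0.3).astype(np.uint8)         # weaken non-edge areas
    enhanced = np.where(edges > 0, 255, weakened)    # keep edges bright
    _, binary = cv2.threshold(enhanced.astype(np.uint8), 127, 255,
                              cv2.THRESH_BINARY)     # binarization
    return binary
```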
In some embodiments, before determining the first type of defect in the image to be detected and determining the defect information of each defect in the first type of defect, the server may further perform image recognition on the image to be detected to obtain at least one separation region in the image to be detected.
In some embodiments, the separation region is an image region in the image to be detected that has been previously masked. Referring to fig. 8, fig. 8 is an interface schematic diagram of an embodiment of identifying a separation region in an image to be detected in the present application. The region S1 is a part of the image region in the image to be detected, and the region S2 is an image region that has been previously masked in the image to be detected. In the region S1, there is one defect P0, and since the region S2 masks a part of the region S1 and also masks a part of the defect P0, the defect P0 may be identified as two defects in the subsequent defect identification process, which may result in inaccurate defect identification results.
In some embodiments, for the case where a pre-masked image region exists in the image to be detected, the server may first divide the image to be detected into a plurality of segmented regions based on the feature information of the image to be detected, then identify the pre-masked image region within each segmented region and treat it as a separation region, and finally remove the separation region from each segmented region, so as to perform the similarity-based defect identification method on the image regions within each segmented region that are not masked by a separation region.
In some embodiments, for the case where pre-masked image regions exist in the image to be detected, the server may first identify all the pre-masked image regions in the image to be detected and treat them as separation regions, then remove the separation regions from the image to be detected, and finally, based on the feature information of the image to be detected, divide the portions of the image to be detected that are not masked by the separation regions into a plurality of segmented regions, so as to perform the similarity-based defect identification method.
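By way of illustration only, the following sketch removes the separation regions before defect identification by zeroing out the pre-masked pixels; the zero-fill strategy and the mask representation are assumptions of this sketch.

```python
import numpy as np

def remove_separation_regions(gray: np.ndarray,
                              separation_mask: np.ndarray) -> np.ndarray:
    """Exclude pre-masked separation regions so that a crack partially covered
    by a mask is not split into two detections. Pixels flagged in
    'separation_mask' are dropped from later analysis (zero-fill assumed)."""
    cleaned = gray.copy()
    cleaned[separation_mask > 0] = 0
    return cleaned
```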
In order to more clearly illustrate the defect identification method provided by the embodiments of the present disclosure, a specific embodiment of the similarity-based defect identification method is described below. In an exemplary embodiment, referring to fig. 9, fig. 9 is a flowchart illustrating a similarity-based defect identification method applied to the server 104 according to another exemplary embodiment, which specifically includes the following steps:
step S21: and acquiring an image to be detected.
In some embodiments, on the detection line of the charging coil, the detector includes a lighting lamp and a camera, and when the charging coil passes the detector, the detector captures an image of the charging coil to be detected, so as to obtain an image to be detected.
In some embodiments, there are multiple detectors on the detection line of the charging coil. The lighting lamps of different detectors may be identical or different in light source, angle, pattern and field of view (precision), and detectors with different lighting modes may be adopted according to the required precision of charging coil defect identification and the types of crack defects.
Step S22: and carrying out image segmentation on the image to be detected to obtain a segmented image.
In some embodiments, the server first performs image recognition on the image to be detected to obtain feature information of the image to be detected, and then performs image segmentation on the image to be detected according to the feature information to obtain a segmented image.
In some embodiments, the characteristic information includes at least one of gray scale, color, spatial texture, geometry of the image to be detected.
In some embodiments, the server divides the image to be detected into a plurality of mutually disjoint areas (segmented images) according to the feature information, so that the feature information shows consistency or similarity in the same area and obviously different areas.
In some embodiments, the server may locate boundaries of each crack defect (e.g., edge lines, edge curves, etc. of the crack defect) in the image to be detected by segmenting the image into segmented images.
In some embodiments, the server may perform edge enhancement on the image to be detected before performing image segmentation on the image to be detected, so as to make the outlines of the crack defects in the image to be detected more prominent or increase the contrast of the image to be detected.
In some embodiments, the edge enhancement of the image to be detected performed by the server may include first performing edge detection on the image to be detected, then highlighting the image edges and weakening or completely removing the image regions other than the edges according to the edge detection result, and finally performing binarization processing on the image. After edge enhancement, the boundary brightness and the crack defect contours of the image to be detected are more prominent than the brightness and crack defect contours around the edges in the original image, and the contrast is higher.
Step S23: and determining a separation region in the image to be detected, and obtaining the separation region.
In some embodiments, since part of the captured image to be detected may have been masked in advance, subsequent crack identification may split the same crack defect into two crack defects.
In some embodiments, the server first identifies the separation region by performing image recognition on the image to be detected, and then locates the separation region to obtain relevant position parameter information of the separation region.
Step S24: and eliminating the separation areas in the images to be detected so as to extract parameter information of the coarse screen cracks and the preliminary screen cracks from each of the separation images.
In some embodiments, the server extracting parameter information for the coarse screen crack and the preliminary screen crack in each of the segmented images includes image scanning each of the segmented images based on AOI to obtain parameter information for the crack defect (i.e., the preliminary screen crack) and the preliminary screen crack in the image to be detected.
In some embodiments, the parameter information of the prescreening flaw includes location information, gradient information, angle information (directionality), connection point information, texture information, and the like of the flaw. Wherein, the various parameter information is vectorization parameter information.
In some embodiments, gradient information refers to directional changes in image intensity or color.
In some embodiments, texture information is represented by the amount of gradient change between a pixel in the image and each pixel in its surrounding 4×4 range.
Step S25: and carrying out connectivity judgment on every two primary screening cracks according to the parameter information of the primary screening cracks.
In some embodiments, the server inputs the parameter information of the primary screening crack in each of the divided images into a preset decision tree to make a connection judgment for every two primary screening cracks. The preset decision tree can be obtained by training based on an ID3 algorithm, a C4.5 algorithm or a CART algorithm.
In some embodiments, the server determines the distance between every two primary screen cracks according to the position information of each primary screen crack; the server respectively determines the degree of difference of the color intensity change between every two primary screen cracks according to the gradient information of each primary screen crack; the server respectively determines the degree of difference of the angle direction change between every two primary screen cracks according to the angle information of each primary screen crack; the server respectively determines the distance between the connection points of every two primary screen cracks according to the connection point information of every primary screen crack; and the server respectively determines the difference degree of the gradient change of the textures between every two primary screening cracks according to the texture information of each primary screening crack.
Further, the server performs connectivity judgment on every two primary screening cracks in each segmented image according to at least one of the calculated distance size of the position, the difference degree of the color intensity change, the difference degree of the angle direction change, the distance size of the connecting point and the difference degree of the texture gradient change, so as to obtain a connectivity judgment result.
In some embodiments, if at least one of the calculated distance of the positions, degree of difference of the color intensity change, degree of difference of the angle direction change, distance of the connection points and degree of difference of the texture gradient change exceeds its respective preset threshold constant, the connectivity judgment result is that the two corresponding primary screening cracks belong to two independent crack defects; if none of them exceeds its respective preset threshold constant, the connectivity judgment result is that the two corresponding primary screening cracks belong to one independent crack defect.
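A minimal sketch of this thresholding rule; the per-measure threshold constants would be preset, and the dict keys follow the hypothetical measure names used above.

def judge_by_thresholds(measures: dict, thresholds: dict) -> str:
    # Any measure above its preset threshold constant means two independent
    # crack defects; otherwise the two primary screening cracks are one defect.
    independent = any(measures[k] > thresholds[k] for k in measures)
    return ("two independent crack defects" if independent
            else "one independent crack defect")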
In another embodiment, the server fuses each of the calculated distance of the positions, degree of difference of the color intensity change, degree of difference of the angle direction change, distance of the connection points and degree of difference of the texture gradient change with its respective preset weight coefficient, and then calculates the sum of the five weighted values. If the sum exceeds a corresponding preset threshold constant, the connectivity judgment result is that the two corresponding primary screening cracks belong to two independent crack defects; if the sum does not exceed the corresponding preset threshold constant, the connectivity judgment result is that the two corresponding primary screening cracks belong to one independent crack defect.
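The weighted-fusion variant could look like the following sketch; the weight coefficients and the threshold are assumed values.

def judge_by_weighted_sum(measures: dict, weights: dict, threshold: float) -> str:
    # Fuse each measure with its preset weight coefficient and compare the
    # sum with a single preset threshold constant.
    fused = sum(measures[k] * weights[k] for k in measures)
    return ("two independent crack defects" if fused > threshold
            else "one independent crack defect")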
In some embodiments, the decision tree is a tree structure in which each internal node represents a judgment on one attribute (i.e., one kind of parameter information of the primary screening cracks), each branch represents the output of a judgment result (i.e., the connectivity judgment score of two crack defects for one kind of parameter information, such as the distance of the positions, the degree of difference of the color intensity change, the degree of difference of the angle direction change, the distance of the connection points, or the degree of difference of the texture gradient change), and each leaf node represents a classification result (i.e., the connectivity judgment result of whether the two corresponding primary screening cracks belong to one independent crack defect).
In some embodiments, training the decision tree includes first placing the training data of each attribute (i.e., each kind of parameter information of the crack defects used for training) at the corresponding root node, and then selecting, by traversal, the optimal feature for each kind of parameter information (i.e., determining the optimal preset threshold constant by traversal, for example starting the training from a low preset threshold constant and traversing until a preset threshold constant is found that can distinguish whether two cracks belong to one independent crack defect). Finally, according to the optimal features corresponding to the various kinds of parameter information, the training data are assigned to the two corresponding leaf nodes, so as to generate the decision tree.
The crack defects whose parameter information corresponds to the training data divided into one of the leaf nodes are regarded as the same crack defect, i.e., the cracks are connected. The crack defects whose parameter information corresponds to the training data divided into the other leaf node are regarded as two different crack defects, i.e., the cracks are not connected.
When the attribute represented by a certain root node cannot yield a judgment (i.e., the optimal preset threshold constant cannot be determined by traversal), the corresponding root node is split into two child nodes to determine the optimal feature (i.e., another kind of parameter information is introduced, the introduced parameter information is fused with the current parameter information to obtain new parameter information, and the optimal preset threshold constant of the new parameter information is then determined by traversal).
Step S26: obtaining fine screening cracks of the image to be detected according to the connectivity judgment results.
In some embodiments, the server merges the two primary screening cracks whose connectivity judgment result is that they belong to one independent crack defect, so as to obtain a merged crack defect. The server then takes the merged crack defects, together with the primary screening cracks whose connectivity judgment result is that they belong to two independent crack defects, as the fine screening cracks, thereby determining the fine screening cracks of the image to be detected.
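Merging the primary screening cracks judged to belong to the same defect can be sketched with a small union-find structure; the crack count and the list of connected pairs below are hypothetical.

class CrackMerger:
    def __init__(self, n: int):
        self.parent = list(range(n))

    def find(self, i: int) -> int:
        while self.parent[i] != i:
            self.parent[i] = self.parent[self.parent[i]]  # path halving
            i = self.parent[i]
        return i

    def union(self, i: int, j: int) -> None:
        self.parent[self.find(i)] = self.find(j)

merger = CrackMerger(n=4)
for i, j in [(0, 1), (2, 3)]:   # pairs judged "one independent crack defect"
    merger.union(i, j)

groups: dict = {}
for k in range(4):
    groups.setdefault(merger.find(k), []).append(k)
fine_screening_cracks = list(groups.values())   # [[0, 1], [2, 3]]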
According to the above scheme, on the one hand, the relationship scores among the defects of the first type are determined by using the position information, angle information, gradient information, connection point information and texture information of those defects, so that the subsequent PCB defect identification flow can be carried out easily, the efficiency of PCB defect identification is improved, and the real-time performance of identification is ensured; on the other hand, some of the defects of the first type are identified as defects of the second type by using the relationship scores among the defects of the first type, which improves the accuracy of PCB defect identification and ensures higher robustness of PCB defect identification.
It should be understood that, although the steps in the flowcharts of fig. 2-9 are shown in the order indicated by the arrows, these steps are not necessarily performed in that order. Unless explicitly stated herein, the execution order of the steps is not strictly limited, and the steps may be executed in other orders. Moreover, at least some of the steps of fig. 2-9 may include multiple sub-steps or stages that are not necessarily performed at the same time but may be performed at different times; the execution order of these sub-steps or stages is not necessarily sequential, and they may be performed in turn or alternately with at least some of the sub-steps or stages of other steps.
It should be understood that the same or similar parts of the method embodiments described above in this specification may refer to each other; each embodiment focuses on its differences from the other embodiments, and for the same or similar parts reference may be made to the descriptions of the other method embodiments.
Fig. 10 is a block diagram of a defect identifying device based on similarity according to an embodiment of the present application. Referring to fig. 10, the similarity-based defect recognition apparatus 10 includes:
an acquisition unit 11 for acquiring an image to be detected;
a determining unit 12, configured to determine a first type of defect in the image to be detected, and defect information of each defect in the first type of defect; the defect information includes at least one of position information, angle information, gradient information, connection point information, and texture information;
a scoring unit 13, configured to determine a relationship score of each defect combination according to the defect information of each defect in the first type of defects; wherein any two defects are taken as one defect combination, and the relationship score is used for representing the degree of similarity between the two pieces of defect information that correspond one-to-one to the two defects in the defect combination;
and the identifying unit 14 is configured to identify, among the various defect combinations, a defect combination whose relationship score matches the preset score as a second type of defect, so as to obtain a defect identification result of the image to be detected.
In some embodiments, in determining the relationship score of each defect combination according to the defect information of each defect in the first type of defect, the scoring unit 13 is specifically configured to:
determining, for each of the defect information, a sub-relationship score for each of the defect combinations corresponding to the defect information; the sub-relationship score is used for representing the similarity degree of one defect information in the position information, the angle information, the gradient information, the connection point information and the texture information between two defects in each defect combination;
and determining the relation score of each defect combination according to the various sub-relation scores corresponding to each defect combination.
In some embodiments, the scoring unit 13 is specifically configured to, for each defect information, determine a sub-relationship score for each defect combination corresponding to the defect information:
determining, for the location information, a distance between two defects within each defect combination corresponding to the coordinate location to determine a sub-relationship score for the location information corresponding to each defect combination;
determining, for the angle information, a degree of difference between two defects within each defect combination corresponding to the angle direction to determine a sub-relationship score for the angle information corresponding to each defect combination;
determining, for the gradient information, a degree of difference between two defects within each defect combination corresponding to the change in color intensity to determine a sub-relationship score for the gradient information corresponding to each defect combination;
determining, for the connection point information, a distance between two defects within each defect combination corresponding to the connection point to determine a sub-relationship score for the connection point information corresponding to each defect combination;
for texture information, a degree of difference between two defects within each defect combination corresponding to a change in texture direction is determined to determine a sub-relationship score for the texture information corresponding to each defect combination.
In some embodiments, the scoring unit 13 is specifically configured to, in determining the relationship score for each defect combination according to the respective sub-relationship scores corresponding to each defect combination:
in each defect combination, sub-relation scores corresponding to various defect information are combined and spliced to obtain a combined and spliced relation score corresponding to each defect combination; or,
in each defect combination, fusing sub-relation scores corresponding to various defect information with corresponding weight coefficients respectively; and in each defect combination, taking the sum value of the sub-relationship scores after the corresponding weight coefficients are fused as the relationship score of the defect combination.
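Both options of the scoring unit, namely combining and splicing the sub-relationship scores into a vector or fusing them with weight coefficients into a single sum, can be sketched as follows; the dict keys, weights and the mode names are assumptions.

def relationship_score(sub_scores: dict, weights: dict, mode: str = "weighted_sum"):
    if mode == "splice":
        # Combined-and-spliced relationship score: one entry per kind of
        # defect information, each later compared with its own preset score.
        return [sub_scores[k] for k in sorted(sub_scores)]
    # Weighted-sum relationship score: a single scalar compared with a single
    # preset score.
    return sum(sub_scores[k] * weights[k] for k in sub_scores)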
In some embodiments, in identifying, among the various defect combinations, the defect combination whose relationship score meets the preset score as the second type of defect, the identifying unit 14 is specifically configured to:
in each defect combination, determining the size relation between each sub-relationship score in the combined and spliced relationship score and the corresponding preset score; identifying two defects in the defect combination with at least one sub-relationship score smaller than the corresponding preset score as second type defects; or,
and identifying two defects in the defect combination with the corresponding relation score smaller than the preset score as the second type of defects.
In some embodiments, the defect recognition device 10 based on similarity is specifically further configured to:
in various defect combinations, fusing two defects in the defect combination with the relation score not conforming to the preset score to obtain a fused third type of defect;
and taking the defect number corresponding to the second type of defects and the third type of defects as the total defect number of the image to be detected.
In some embodiments, the defect recognition device 10 based on similarity is specifically further configured to:
performing image recognition on the image to be detected to obtain characteristic information of the image to be detected; the characteristic information at least comprises gray information, color information, texture information and shape information of each pixel point in the image to be detected;
Based on the characteristic information, carrying out region segmentation on the image to be detected to obtain a plurality of segmented regions;
the average similarity of the feature information corresponding to the pixel points in each segmented region is larger than the preset similarity, and the average similarity of the feature information corresponding to the pixel points in different segmented regions is smaller than or equal to the preset similarity.
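As a sketch of this acceptance criterion for the segmented regions, one could use cosine similarity over the per-pixel feature vectors; the similarity measure and the preset similarity value of 0.9 are illustrative assumptions.

import numpy as np

PRESET_SIMILARITY = 0.9   # hypothetical value

def mean_pairwise_similarity(features: np.ndarray) -> float:
    # Average cosine similarity between every pair of pixel feature vectors
    # inside one region (features has shape (n_pixels, n_features), n >= 2).
    norm = features / (np.linalg.norm(features, axis=1, keepdims=True) + 1e-9)
    sims = norm @ norm.T
    n = len(features)
    return float((sims.sum() - n) / (n * (n - 1)))   # exclude self-similarity

def cross_similarity(fa: np.ndarray, fb: np.ndarray) -> float:
    # Average cosine similarity between pixels drawn from two different regions.
    na = fa / (np.linalg.norm(fa, axis=1, keepdims=True) + 1e-9)
    nb = fb / (np.linalg.norm(fb, axis=1, keepdims=True) + 1e-9)
    return float((na @ nb.T).mean())

def segmentation_is_valid(region_features: list) -> bool:
    within = all(mean_pairwise_similarity(f) > PRESET_SIMILARITY
                 for f in region_features)
    across = all(cross_similarity(fa, fb) <= PRESET_SIMILARITY
                 for i, fa in enumerate(region_features)
                 for fb in region_features[i + 1:])
    return within and across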
In some embodiments, the defect recognition device 10 based on similarity is specifically further configured to:
performing image recognition on the image to be detected to obtain a separation region in the image to be detected; the separation area is an image area which is pre-covered in the image to be detected; and
and eliminating the corresponding separation region from each segmented region, so as to execute the similarity-based defect identification method on the image to be detected in each segmented region after the separation region is eliminated.
The specific manner in which the various modules perform their operations in the apparatus of the above embodiments has been described in detail in the embodiments of the method and will not be described again here.
Fig. 11 is a block diagram of an electronic device 20 provided in an embodiment of the present application. For example, the electronic device 20 may be a server. Referring to fig. 11, the electronic device 20 includes a processing component 21, which may be a processor set including one or more processors, and memory resources represented by a memory 22. The memory 22 stores a computer program, such as an application program; the computer program stored in the memory 22 may include one or more modules, each corresponding to a set of executable instructions. Furthermore, the processing component 21 is configured to implement the similarity-based defect identification method described above when executing the computer program.
In some embodiments, the electronic device 20 is a server whose computing system may run one or more operating systems, including any of the operating systems discussed above as well as any commercially available server operating system. The server may also run any of a variety of additional server applications and/or middle-tier applications, including HTTP (hypertext transfer protocol) servers, FTP (file transfer protocol) servers, CGI (common gateway interface) servers, database servers, and the like. Exemplary database servers include, but are not limited to, those commercially available from International Business Machines (IBM) and the like.
In some embodiments, the processing component 21 generally controls the overall operation of the electronic device 20, such as operations associated with display, data processing, data communication, and recording. The processing component 21 may include one or more processors to execute computer programs so as to perform all or part of the steps of the methods described above. Further, the processing component 21 may include one or more modules that facilitate interactions between the processing component 21 and other components. For example, the processing component 21 may include a multimedia module to facilitate interactions between the multimedia component and the processing component 21.
In some embodiments, the processor in the processing component 21 may also be referred to as a CPU (Central Processing Unit). The processor may be an electronic chip with signal processing capabilities. The processor may also be a general purpose processor, a digital signal processor (Digital Signal Processor, DSP), an application specific integrated circuit (Application Specific Integrated Circuit, ASIC), a field programmable gate array (Field-Programmable Gate Array, FPGA) or other programmable logic device, a discrete gate or transistor logic device, or discrete hardware components. A general purpose processor may be a microprocessor, or the processor may be any conventional processor or the like. In addition, the processor may be implemented jointly by integrated circuit chips.
In some embodiments, memory 22 is configured to store various types of data to support operations at electronic device 20. Examples of such data include instructions, collected data, messages, pictures, videos, etc. for any application or method operating on electronic device 20. The memory 22 may be implemented by any type of volatile or non-volatile memory device or combination thereof, such as Static Random Access Memory (SRAM), electrically erasable programmable read-only memory (EEPROM), erasable programmable read-only memory (EPROM), programmable read-only memory (PROM), read-only memory (ROM), magnetic memory, flash memory, magnetic disk, optical disk, or graphene memory.
In some embodiments, the memory 22 may be a memory bank, a TF card, or the like, and may store all information in the electronic device 20, including the input raw data, the computer programs, the intermediate operation results and the final operation results. It stores and retrieves information based on the location specified by the processor. With the memory 22, the electronic device 20 has a memory function and can operate properly. According to purpose, the memory 22 of the electronic device 20 may be divided into a main memory (internal memory) and a secondary memory (external memory), that is, into an internal memory and an external memory. The external memory is usually a magnetic medium, an optical disc, or the like, and can store information for a long time. The internal memory refers to the storage component on the motherboard used to store the data and programs currently being executed; it only stores programs and data temporarily, and the data is lost when the power supply is switched off.
In some embodiments, the electronic device 20 may further include: a power supply component 23 configured to perform power management of the electronic device 20, a wired or wireless network interface 24 configured to connect the electronic device 20 to a network, and an input/output (I/O) interface 25. The electronic device 20 may operate based on an operating system stored in the memory 22, such as Windows Server, Mac OS X, Unix, Linux, FreeBSD, or the like.
In some embodiments, power supply assembly 23 provides power to the various components of electronic device 20. Power supply components 23 may include a power management system, one or more power supplies, and other components associated with generating, managing, and distributing power for electronic device 20.
In some embodiments, wired or wireless network interface 24 is configured to facilitate wired or wireless communication between electronic device 20 and other devices. The electronic device 20 may access a wireless network based on a communication standard, such as WiFi, an operator network (e.g., 2G, 3G, 4G, or 5G), or a combination thereof.
In some embodiments, the wired or wireless network interface 24 receives broadcast signals or broadcast-related information from an external broadcast management system via a broadcast channel. In one exemplary embodiment, the wired or wireless network interface 24 also includes a Near Field Communication (NFC) module to facilitate short range communications. For example, the NFC module may be implemented based on Radio Frequency Identification (RFID) technology, infrared data association (IrDA) technology, ultra Wideband (UWB) technology, bluetooth (BT) technology, and other technologies.
In some embodiments, input output (I/O) interface 25 provides an interface between processing component 21 and a peripheral interface module, which may be a keyboard, click wheel, button, or the like. These buttons may include, but are not limited to: homepage button, volume button, start button, and lock button.
Fig. 12 is a block diagram of a computer-readable storage medium 30 provided in an embodiment of the present application. The computer-readable storage medium 30 has stored thereon a computer program 31; wherein the computer program 31 when executed by the processor implements the similarity-based defect recognition method as described above.
If the functional units in the various embodiments of the present application are integrated and implemented in the form of software functional units and sold or used as independent products, they may be stored in the computer-readable storage medium 30. Based on such understanding, the technical solution of the present application, in essence, or the part contributing to the prior art, or all or part of the technical solution, may be embodied in the form of a software product: the computer-readable storage medium 30 includes several instructions in a computer program 31 for enabling a computer device (which may be a personal computer, a system server, a network device, or the like), an electronic device (such as an MP3 or MP4 player, a smart terminal such as a mobile phone, a tablet computer or a wearable device, or a desktop computer, or the like), or a processor to perform all or part of the steps of the methods of the embodiments of the present application.
Fig. 13 is a block diagram of a computer program product 40 provided by an embodiment of the present application. The computer program product 40 comprises a computer program comprising program instructions 41, the program instructions 41 being executable by a processor of the electronic device 20 for implementing the similarity-based defect identification method as described above.
It will be appreciated by those skilled in the art that the embodiments of the present application may be provided as a similarity-based defect identification method, a similarity-based defect identification apparatus 10, an electronic device 20, a computer-readable storage medium 30, or a computer program product 40. Accordingly, the present application may take the form of an entirely hardware embodiment, an entirely software embodiment, or an embodiment combining software and hardware aspects. Furthermore, the present application may take the form of a computer program product 40 embodied on one or more computer-usable storage media (including but not limited to disk storage, CD-ROM, optical storage, and the like) having computer-usable program code embodied therein.
The present application is described with reference to flowchart illustrations and/or block diagrams of a similarity-based defect identification method, a similarity-based defect identification apparatus 10, an electronic device 20, a computer-readable storage medium 30, or a computer program product 40 according to embodiments of the application. It will be understood that each flowchart and/or block of the flowchart illustrations and/or block diagrams, and combinations of flowcharts and/or block diagrams, can be implemented by computer program product 40. These computer program products 40 may be provided to a processor of a general purpose computer, special purpose computer, embedded processor, or other programmable data processing apparatus to produce a machine, such that the program instructions 41, which execute via the processor of the computer or other programmable data processing apparatus, create means for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
These computer program products 40 may also be stored in a computer-readable memory that can direct a computer or other programmable data processing apparatus to function in a particular manner, such that the program instructions 41 stored in the computer program product 40 produce an article of manufacture including instruction means which implement the function specified in the flowchart flow or flows and/or block diagram block or blocks.
These program instructions 41 may also be loaded onto a computer or other programmable data processing apparatus to cause a series of operational steps to be performed on the computer or other programmable apparatus to produce a computer implemented process such that the program instructions 41 which execute on the computer or other programmable apparatus provide steps for implementing the functions specified in the flowchart flow or flows and/or block diagram block or blocks.
It should be noted that the descriptions of the above methods, apparatuses, electronic devices, computer-readable storage media, computer program products and the like according to the method embodiments may further include other implementations, and specific implementations may refer to descriptions of related method embodiments, which are not described herein in detail.
Other embodiments of the disclosure will be apparent to those skilled in the art from consideration of the specification and practice of the disclosure herein. This disclosure is intended to cover any variations, uses, or adaptations of the disclosure that follow the general principles of the disclosure and include common knowledge or customary technical means in the art not disclosed herein. It is intended that the specification and examples be considered as exemplary only, with the true scope and spirit of the disclosure being indicated by the following claims.
It is to be understood that the present disclosure is not limited to the precise arrangements and instrumentalities shown in the drawings, and that various modifications and changes may be effected without departing from the scope thereof. The scope of the present disclosure is limited only by the appended claims.

Claims (11)

1. A similarity-based defect identification method, characterized by comprising the following steps:
acquiring an image to be detected;
determining a first type of defects in the image to be detected and defect information of each defect in the first type of defects; the defect information comprises at least one of position information, angle information, gradient information, connection point information and texture information;
determining a relationship score of each defect combination according to the defect information of each defect in the first type of defects; wherein any two defects are taken as one defect combination, and the relationship score is used for representing the degree of similarity between the two pieces of defect information that correspond one-to-one to the two defects in the defect combination;
And in the various defect combinations, identifying the defect combination with the relation score conforming to a preset score as a second type of defect so as to obtain a defect identification result of the image to be detected.
2. The method of claim 1, wherein said determining a relationship score for each defect combination from the defect information for each defect in the first type of defect comprises:
determining, for each of the defect information, a sub-relationship score for each defect combination corresponding to the defect information; the sub-relationship score is used for representing the similarity degree of one defect information corresponding to the position information, the angle information, the gradient information, the connection point information and the texture information between two defects in each defect combination;
and determining the relation score of each defect combination according to various sub-relation scores corresponding to each defect combination.
3. The method of claim 2, wherein,
the determining the relationship score of each defect combination according to the corresponding various sub-relationship scores of each defect combination comprises the following steps:
in each defect combination, combining and splicing sub-relation scores corresponding to various defect information to obtain a combined and spliced relation score corresponding to each defect combination;
The identifying, as a second type of defect, a defect combination whose relationship score meets a preset score among the various defect combinations includes:
in each defect combination, determining the size relation between each sub-relationship score in the combined and spliced relationship score and the corresponding preset score;
and identifying two defects in the defect combination with at least one sub-relationship score smaller than the corresponding preset score as defects of the second type.
4. The method of claim 2, wherein said determining a relationship score for each of said defect combinations based on respective sub-relationship scores for each of said defect combinations comprises:
in each defect combination, fusing the sub-relation scores corresponding to the various defect information with the corresponding weight coefficients respectively;
in each defect combination, taking the sum value of sub-relationship scores after the corresponding weight coefficients are fused as the relationship score of the defect combination;
the identifying, as a second type of defect, a defect combination whose relationship score meets a preset score among the various defect combinations includes:
and identifying two defects in the defect combination corresponding to the relation score smaller than the preset score as second type defects.
5. The method of claim 2, wherein determining, for each of the defect information, a sub-relationship score for each defect combination corresponding to the defect information comprises at least one of:
determining, for the location information, a distance magnitude between two defects within each defect combination corresponding to a coordinate location to determine a sub-relationship score for each of the defect combinations corresponding to the location information;
determining, for the angle information, a degree of difference between two defects within each defect combination corresponding to an angle direction to determine a sub-relationship score for each of the defect combinations corresponding to the angle information;
determining, for the gradient information, a degree of difference between two defects within each defect combination corresponding to a color intensity variation to determine a sub-relationship score for each of the defect combinations corresponding to the gradient information;
determining, for the connection point information, a distance size between two defects within each defect combination corresponding to a connection point to determine a sub-relationship score for each of the defect combinations corresponding to the connection point information;
for the texture information, a degree of difference between two defects within each defect combination corresponding to a change in texture direction is determined to determine a sub-relationship score for each of the defect combinations corresponding to the texture information.
6. The method according to claim 1, wherein the method further comprises:
in various defect combinations, fusing two defects in the defect combination of which the relation scores do not accord with the preset scores to obtain a fused third type of defects;
and taking the defect number corresponding to the second type of defects and the third type of defects as the total defect number of the image to be detected.
7. The method of claim 1, wherein prior to said determining the first type of defect in the image to be detected and the defect information for each defect in the first type of defect, the method further comprises:
performing image recognition on the image to be detected to obtain characteristic information of the image to be detected; the characteristic information at least comprises gray information, color information, texture information and shape information of each pixel point in the image to be detected;
based on the characteristic information, carrying out region segmentation on the image to be detected to obtain a plurality of segmented regions;
the average similarity of the feature information corresponding to the pixel points in each segmented region is larger than a preset similarity, and the average similarity of the feature information corresponding to the pixel points in different segmented regions is smaller than or equal to the preset similarity.
8. The method of claim 7, wherein,
before determining the first type of defect in the image to be detected and the defect information of each defect in the first type of defect, the method further comprises:
performing image recognition on the image to be detected to obtain a separation region in the image to be detected; the separation area is an image area which is covered in advance in the image to be detected;
the method further comprises the steps of:
and eliminating the corresponding separation region from each segmented region, so as to execute the similarity-based defect identification method on the image to be detected in each segmented region after the separation region is eliminated.
9. A similarity-based defect recognition apparatus, comprising:
the acquisition unit is used for acquiring the image to be detected;
a determining unit, configured to determine a first type of defect in the image to be detected, and defect information of each defect in the first type of defect; the defect information comprises at least one of position information, angle information, gradient information, connection point information and texture information;
a scoring unit, configured to determine a relationship score of each defect combination according to the defect information of each defect in the first type of defects; wherein any two defects are taken as one defect combination, and the relationship score is used for representing the degree of similarity between the two pieces of defect information that correspond one-to-one to the two defects in the defect combination;
And the identification unit is used for identifying the defect combination with the relation score conforming to the preset score as a second type of defect in the various defect combinations so as to obtain a defect identification result of the image to be detected.
10. An electronic device comprising a memory storing a computer program and a processor implementing the similarity-based defect identification method according to any one of claims 1 to 8 when the computer program is executed by the processor.
11. A computer readable storage medium having a computer program stored thereon, wherein the computer program, when executed by a processor, implements the similarity-based defect identification method of any one of claims 1 to 8.
CN202211639365.XA 2022-12-20 2022-12-20 Defect identification method and device based on similarity, electronic equipment and storage medium Pending CN116091419A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211639365.XA CN116091419A (en) 2022-12-20 2022-12-20 Defect identification method and device based on similarity, electronic equipment and storage medium

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211639365.XA CN116091419A (en) 2022-12-20 2022-12-20 Defect identification method and device based on similarity, electronic equipment and storage medium

Publications (1)

Publication Number Publication Date
CN116091419A true CN116091419A (en) 2023-05-09

Family

ID=86209437

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211639365.XA Pending CN116091419A (en) 2022-12-20 2022-12-20 Defect identification method and device based on similarity, electronic equipment and storage medium

Country Status (1)

Country Link
CN (1) CN116091419A (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN118657177A (en) * 2024-08-13 2024-09-17 昆明理工大学 Circuit board defect identification transducer network distributed reasoning method based on IEC61499 standard


Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination