CN111353974B - Method and device for detecting image boundary defects - Google Patents


Info

Publication number
CN111353974B
CN111353974B (application CN202010104450.0A)
Authority
CN
China
Prior art keywords
image
boundary line
detected
boundary
template
Prior art date
Legal status
Active
Application number
CN202010104450.0A
Other languages
Chinese (zh)
Other versions
CN111353974A
Inventor
邢志伟
姚毅
Current Assignee
Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Original Assignee
Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Priority date
Filing date
Publication date
Application filed by Suzhou Lingyunguang Industrial Intelligent Technology Co Ltd
Priority to CN202010104450.0A
Publication of CN111353974A
Application granted
Publication of CN111353974B


Classifications

    • G06T 7/10: Image analysis; segmentation; edge detection
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G01N 21/95: Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N 2021/9513: Liquid crystal panels
    • Y02P 90/30: Computing systems specially adapted for manufacturing


Abstract

The application discloses a method and a device for detecting image boundary defects. The boundary line of the image in an image to be detected is extracted and compared with the boundary line in a template image to determine whether the boundary line in the image to be detected conforms to a preset form.

Description

Method and device for detecting image boundary defects
Technical Field
The application belongs to the field of image processing, and particularly relates to a method and a device for detecting image boundary defects.
Background
In the field of image processing there is an increasing demand for detecting defects at image boundaries. For example, defect detection for liquid crystal displays (LCDs) includes checking whether the edge display of the LCD screen conforms to a predetermined shape, such as whether a straight edge displays as straight, whether an arc edge displays as a smooth arc, and whether an arc has the predetermined curvature. The conventional approach uses an industrial camera system to capture images of the lit LCD in sharp focus, and then uses a vision processing system to analyse the images; for example, whether the LCD is a good product can be judged by whether the boundary of the liquid crystal screen image conforms to the preset form.
Owing to process and technique limitations, a certain proportion of LCD products are defective. For example, incomplete filling of the liquid crystal, contamination of the liquid crystal, and occlusion of the LCD display area caused by assembly deviation lead to defects such as poor display at the LCD edge and incomplete display of edge portions.
Fig. 1a shows a normal LCD display image, fig. 1b shows a defective LCD display image at an arc-shaped boundary portion, fig. 1c shows another defective LCD display image at an arc-shaped boundary portion, and fig. 1d shows an LCD display image having a large dark spot defect at an aperture boundary.
Defects on an image boundary are commonly detected with a neighborhood detection method or a whole-background fitting method. However, these methods cannot judge the extending direction of the boundary in advance and cannot determine the neighborhood relation of the pixel points on the image boundary, so a detection blind area exists at the image boundary and such defects are not detected reliably.
Disclosure of Invention
To solve the above problems, the present application provides a method for detecting image boundary defects. The method first obtains a template boundary map, then obtains a boundary map to be processed from the image to be processed, and compares the two to determine whether the image boundary line in the image to be processed has defects. The method can determine image boundary defects rapidly and accurately.
The application aims to provide the following aspects:
in a first aspect, the present application provides a method for detecting an image boundary defect, the method comprising:
acquiring an image to be detected and a template boundary line image;
acquiring a boundary line image to be detected, wherein the boundary line image to be detected is an image recorded with image boundary lines in the image to be detected;
obtaining a matching image, wherein the matching image is a fusion image with the highest degree of coincidence between the boundary line image to be detected and the template boundary line image;
and extracting boundary defects according to the matching image.
In one implementation manner, the acquiring the boundary line image to be detected includes:
acquiring boundary lines of images in the image to be detected;
and removing the images except the boundary line in the image to be detected.
In one implementation, the acquiring the matching image includes:
obtaining an optimal matching position, wherein the optimal matching position is determined by a boundary line image to be detected and a template boundary line image;
and obtaining a matching image, wherein the matching image is obtained by fusing the boundary line image to be detected and the template boundary line image at the optimal matching position.
Further, the obtaining the best matching location may include:
moving the boundary line image to be detected and/or the template boundary line image according to a preset step length;
acquiring an intersection area, wherein the intersection area is the intersection area of the boundary line image to be detected and the corresponding boundary line on the template boundary line image;
and acquiring the optimal matching position, wherein the optimal matching position is the position with the largest intersection area.
In one implementation, the extracting the boundary defect from the matching image includes:
calculating the distance between each pixel point on the boundary line to be detected and the corresponding pixel point on the template boundary line;
and extracting boundary defects according to the distances, wherein the defects are a neighborhood pixel point set with the distances larger than a distance threshold.
In one implementation manner, the template boundary line image may be a preset image, or may be obtained according to a template image.
Further, acquiring the template boundary line image from the template image may include the steps of:
acquiring boundary lines of images in the template image;
and removing the images except the boundary line in the template image.
In a second aspect, the present application also provides an apparatus for detecting an edge defect of an image, the apparatus comprising:
the image acquisition unit is used for acquiring an image to be detected and a template boundary line image;
the image processing device comprises a boundary line image acquisition unit, a detection unit and a detection unit, wherein the boundary line image acquisition unit is used for acquiring a boundary line image to be detected, and the boundary line image to be detected is an image recorded with image boundary lines in the image to be detected;
the image matching unit is used for acquiring a matching image, wherein the matching image is a fusion image with the highest degree of coincidence between the boundary line image to be detected and the template boundary line image;
and the boundary defect extraction unit is used for extracting boundary defects according to the matched images.
In one implementation, the boundary line image acquisition unit includes:
a boundary line extraction subunit, configured to obtain a boundary line of an image in the image to be detected;
and the image processing subunit is used for removing images except the boundary line in the image to be detected.
In one implementation, the image matching unit includes:
the position matching subunit is used for acquiring an optimal matching position, and the optimal matching position is determined by the boundary line image to be detected and the template boundary line image;
and the matching image acquisition subunit is used for acquiring a matching image, and the matching image is obtained by fusing the boundary line image to be detected and the template boundary line image at the optimal matching position.
Further, the location matching subunit may include:
the image moving subunit is used for moving the boundary line image to be detected and/or the template boundary line image according to a preset step length;
the area calculating subunit is used for obtaining an intersection area, wherein the intersection area is the intersection area of the boundary line image to be detected and the corresponding boundary line on the template boundary line image;
and the position determining subunit is used for acquiring the optimal matching position, wherein the optimal matching position is the position with the largest intersection area.
In one implementation, the boundary defect extraction unit includes:
a distance calculating subunit, configured to calculate a distance between each pixel point on the boundary line to be detected and a corresponding pixel point on the boundary line of the template;
and the boundary defect extraction subunit is used for extracting boundary defects according to the distances, wherein the defects are a neighborhood pixel point set with the distances larger than a distance threshold value.
In a third aspect, the present application also provides a program for detecting an edge defect of an image, the program being configured to implement the steps of the method for detecting an edge defect of an image according to the first aspect.
In a fourth aspect, a computer readable storage medium has stored thereon computer instructions which, when executed by a processor, implement the steps of the method for detecting image edge defects according to the first aspect.
In a fifth aspect, a detection apparatus includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method for detecting image edge defects of the first aspect.
Compared with the prior art, the method provided by the application extracts the boundary line of the image in the image to be detected and compares it with the boundary line in the template image to determine whether it conforms to the preset form. The method has no detection blind area and can accurately detect defects in both the form and the relative position of the image boundary line.
Drawings
FIG. 1a shows an LCD display screen image showing normal;
FIG. 1b shows an LCD display screen image with a defect in the arcuate border portion;
FIG. 1c shows another LCD display screen image with a defect in the arcuate border portion;
FIG. 1d shows an LCD display image with large dark spot defects at the hole boundaries;
FIG. 2 is a flowchart illustrating a method for detecting image boundary defects according to the present application;
FIG. 3 illustrates a template boundary line image;
FIG. 4 shows a boundary line image to be detected generated from the image to be detected shown in FIG. 1 b;
fig. 5 shows a matching image of the template boundary line image shown in fig. 3 and the boundary line image to be detected shown in fig. 4.
Detailed Description
Reference will now be made in detail to exemplary embodiments, examples of which are illustrated in the accompanying drawings. When the following description refers to the accompanying drawings, the same numbers in different drawings refer to the same or similar elements, unless otherwise indicated. The implementations described in the following exemplary examples do not represent all implementations consistent with the application. Rather, they are merely examples of methods consistent with aspects of the application as detailed in the accompanying claims.
The following describes in detail a method and apparatus for detecting image boundary defects according to embodiments of the present application.
First, a brief description will be given of a usage scenario of the present solution.
The method and the device are particularly suitable for checking, from an image of the product to be detected, whether the boundary shape of the product complies with requirements. For example, a mobile phone display screen is generally detected by powering on the screen, collecting an image in that state, and then analysing the collected image to determine whether the screen is a qualified product. Specifically, the scheme provided by the application can detect whether the boundary shape of the display screen is the preset circular arc, whether a hole formed in the display screen is at the preset position, and even whether the hole has the preset shape. It should be appreciated that the boundary defects to be detected are subtle defects that are difficult to observe with the naked eye, rather than obvious defects visible to the naked eye. The display screen includes a mobile phone display screen, a computer display screen and the like, and may be an LCD display screen, an OLED display screen, etc.
In the following examples, the scheme of the application is illustrated by detecting whether the boundary shape of the arc of the display screen of the LCD mobile phone, the boundary shape of the dug round hole and the distance between the round holes meet the preset conditions.
Fig. 2 is a flowchart of a method for detecting image boundary defects according to the present application, as shown in fig. 2, the method includes steps S101 to S104 as follows:
s101, acquiring an image to be detected and a template boundary line image.
As shown in fig. 1b, in this example, the image to be detected may be an image including an image of a product to be detected.
In this example, the image to be detected may be acquired by an industrial camera.
Fig. 3 shows a template boundary line image, which is an image obtained by extracting image boundary lines from a template image in this example, wherein the template image is similar to an image in an image to be detected, and the image in the template image is a standard image, that is, the boundary form of the image and the relative positional relationship of each boundary satisfy preset conditions, which can be used as a reference basis.
In this example, the template boundary line image may be a preset image, or may be obtained according to a template image.
In this instance, the method of obtaining the template boundary line image may include the following steps S111 and S112:
step S111, a template image is acquired.
In this example, the template image is an image including a template image, and the form of the template image and the relative position of each image satisfy a preset condition, for example, may be a qualified product image.
In this example, the mode of collecting the template image is the same as the mode of collecting the image to be detected, so as to ensure that the template image has higher similarity with the image to be detected, and the results obtained by collecting the same processing method have comparability.
And step S112, obtaining a template boundary line image according to the template image.
The template boundary line image is an image generated after extracting a boundary line from the template image, and specifically may include the following steps S1121 and S1122:
step S1121, extracting a boundary line of the image in the template image.
The method for extracting the image boundary line in the template image is not particularly limited; any prior-art method for extracting image boundary lines may be adopted, for example Canny operator edge detection or Sobel operator edge detection, provided the boundary lines of the image in the template image are detected completely.
In step S1122, the images except the boundary line in the template image are removed.
In this example, the method for removing the non-boundary content is not particularly limited, and any prior-art method may be used. For example, the gray values of all non-boundary pixels may be adjusted to a uniform value that differs from the gray value of the boundary lines; preferably the adjusted gray value contrasts strongly with that of the boundary lines, so that the boundary lines stand out in the image.
As shown in fig. 3, only the image border line in the template border line image is white, and the rest is filled with black background.
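The extraction-and-blackening procedure of steps S1121 and S1122 can be sketched as follows. The gradient-magnitude threshold used here is a simplified, hypothetical stand-in for the Canny/Sobel detectors named above, and the threshold value is purely illustrative:

```python
import numpy as np

def boundary_line_image(img, grad_thresh=40):
    """Extract image boundary lines (step S1121) and remove everything
    else (step S1122): pixels whose gradient magnitude exceeds the
    threshold become white (255); the rest is filled with black (0),
    as in the template boundary line image of Fig. 3."""
    gy, gx = np.gradient(img.astype(float))
    edges = np.hypot(gx, gy) > grad_thresh  # assumed edge criterion
    out = np.zeros_like(img, dtype=np.uint8)
    out[edges] = 255  # only the boundary line stays white
    return out

# synthetic "display" image: bright rectangle on a dark background
img = np.zeros((20, 20), dtype=np.uint8)
img[5:15, 5:15] = 200
bmap = boundary_line_image(img)
```

In a real pipeline the gradient step would be replaced by a proper Canny or Sobel detector; only the white-line-on-black-background output format is taken from the text.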
S102, acquiring a boundary line image to be detected, wherein the boundary line image to be detected is an image recorded with the image boundary line in the image to be detected.
In this example, a border line image to be detected is generated according to the image to be detected, and then the border line image to be detected is compared with a template border line image, so as to obtain whether the border line of the image in the image to be detected has a defect.
In this example, the acquiring the boundary line image to be detected may include the following steps S121 and S122:
step S121, extracting a boundary line of the image in the image to be detected.
The method for obtaining the boundary line of the image in the image to be detected is not particularly limited; any prior-art method may be used, for example Canny operator edge detection or Sobel operator edge detection, provided the boundary lines of the image in the image to be detected can be detected completely. Preferably, the method is the same as the one used to extract the boundary line in the template image.
Step S122, removing the images except the boundary line in the image to be detected.
In this example, the method used in this step is the same as that used in step S1122, and the specific method can be seen in step S1122.
Fig. 4 shows a boundary line image to be detected generated from the image to be detected shown in fig. 1b, wherein only the boundary line of the image of the boundary line image to be detected is white, and the rest is filled with black background as shown in fig. 4.
Further, the boundary line in the boundary line image to be detected is similar to the boundary line in the template boundary line image.
And S103, acquiring a matching image, wherein the matching image is a fusion image with the highest coincidence degree between the boundary line image to be detected and the template boundary line image.
In this example, a local image of the part to be detected is first cut out of the image to be detected and then matched and fused with the template image, which improves the matching degree between the two at the part to be detected. The inventors found that if the entire image to be detected is matched against the entire template image, matching accuracy at the part to be detected may be sacrificed in balancing the whole image, lowering the matching degree at that part and thus the detection accuracy.
Further, to ensure that the image to be detected and the template image are equal in size, a window of fixed size may be used to intercept both, for example a window 200 pixels wide and 200 pixels high.
In this example, for convenience of description, the boundary line image to be detected and the template boundary line image refer to the partial image after being cut out unless otherwise specified.
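The fixed-size interception described above can be sketched like this; the `top`/`left` anchor coordinates of the window are assumptions, since the text fixes only the window size:

```python
import numpy as np

def crop_window(img, top, left, size=200):
    """Intercept a fixed-size window so the detected crop and the
    template crop are equal in size; regions where the window overruns
    the image are padded with black. (top, left) is a hypothetical
    anchor -- the patent specifies only the window size."""
    out = np.zeros((size, size), dtype=img.dtype)
    patch = img[top:top + size, left:left + size]
    out[:patch.shape[0], :patch.shape[1]] = patch
    return out

img = np.arange(100, dtype=np.uint8).reshape(10, 10)
w = crop_window(img, 2, 3, size=4)    # fully inside the image
w2 = crop_window(img, 8, 8, size=4)   # overruns: padded with zeros
```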
In this example, the acquiring the matching image includes the following steps S131 and S132:
step S131, obtaining a best matching position, where the best matching position is determined by the boundary line image to be detected and the template boundary line image.
In this instance, any one of the image matching methods in the related art may be used to match the boundary line image to be detected with the template boundary line image, so as to determine the matching position of the boundary line image to be detected with the template boundary line image, for example, step S131 may include the following steps S1311 to S1313:
step S1311, moving the boundary line image to be detected and/or the template boundary line image according to a preset step.
In this example, the preset step size may be one pixel, or may be other step sizes.
In this example, one image may be fixed while the other image is moved, wherein the fixed image may be selected from one of the border line image to be detected and the template border line image, and the moved image is the other image. In this step, if not specifically described, the boundary line image to be detected and the template boundary line image are both partial images taken by the window.
Taking a preset step length as one pixel as an example, the matching position can be determined by horizontally or vertically moving the boundary line image to be detected or the template boundary line image by one pixel at a time.
Step S1312, obtaining an intersection area, where the intersection area is an area where the boundary line image to be detected intersects with the corresponding boundary line on the template boundary line image.
In this example, the area of intersection of the boundary line to be detected and the boundary line of the template is calculated once after each movement, and the area of intersection may be the number of pixels where the boundary lines overlap.
In step S1313, the best matching position is obtained, where the best matching position is the position with the largest intersection area.
In this example, after the best matching position is obtained, the boundary line image to be detected and the template boundary line image in the window are respectively intercepted.
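Steps S1311 to S1313 amount to an exhaustive shift-and-count search. The sketch below assumes a bounded search range of ± `max_shift` pixels and binary (0/1) boundary maps, neither of which is fixed by the text:

```python
import numpy as np

def best_match_offset(det, tpl, max_shift=5):
    """Move the detected boundary map one pixel at a time (step S1311),
    count the overlapping boundary pixels as the intersection area
    (step S1312), and keep the offset with the largest area (step S1313)."""
    best_area, best_off = -1, (0, 0)
    for dy in range(-max_shift, max_shift + 1):
        for dx in range(-max_shift, max_shift + 1):
            shifted = np.roll(np.roll(det, dy, axis=0), dx, axis=1)
            area = int(np.count_nonzero(shifted & tpl))
            if area > best_area:
                best_area, best_off = area, (dy, dx)
    return best_off, best_area

# template boundary: vertical line at column 7;
# detected boundary: the same line displaced to column 9
tpl = np.zeros((16, 16), dtype=np.uint8)
tpl[:, 7] = 1
det = np.zeros((16, 16), dtype=np.uint8)
det[:, 9] = 1
offset, area = best_match_offset(det, tpl)
```

The wrap-around of `np.roll` is harmless here only because the lines sit away from the image border; a production implementation would crop or zero-pad instead.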
Step S132, obtaining a matching image, wherein the matching image is obtained by fusing a boundary line image to be detected and a template boundary line image at the optimal matching position.
The fusion method in this example is not particularly limited, and any image fusion method in the prior art may be adopted.
Fig. 5 shows a matching image of the template boundary line image shown in fig. 3 and the boundary line image to be detected shown in fig. 4, and as shown in fig. 5, the fused image has a closed area which is not coincident at the defect.
S104, extracting boundary defects according to the matched images.
Based on the matching image, whether the boundary line to be detected meets a preset condition or not can be determined by utilizing the distance between each pixel point on the boundary line to be detected and the corresponding pixel point of the boundary line of the template. For example, a distance threshold may be set between the two, and if the actual distance between the two is greater than or equal to the preset distance, the product corresponding to the boundary line to be detected may be considered as a suspected defective product.
The distance threshold may be specifically set according to the detection accuracy, and may be set to 3 pixels, for example.
In this example, the present step may include the following steps S141 and S142:
step S141, calculating the distance between each pixel point on the boundary line to be detected and the corresponding pixel point on the boundary line of the template.
The method for calculating the distance between each pixel point on the boundary line to be detected and the corresponding pixel point on the boundary line of the template is not particularly limited, and any method in the prior art for calculating the distance between two lines in the image, such as a cosine method, can be adopted.
Step S142, extracting boundary defects according to the distances, wherein the defects are a neighborhood pixel point set with the distances larger than a distance threshold.
In this example, as shown in the white box in fig. 5, the defect is a set of neighboring pixels, where the distance between each pixel on the boundary line to be detected and the corresponding pixel on the boundary line of the template is greater than the distance threshold.
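Steps S141 and S142 can be sketched as follows. The nearest-neighbour Euclidean distance used here is an assumption; the text does not fix the distance measure (a cosine method is only mentioned as one option):

```python
import numpy as np

def boundary_defect_pixels(det, tpl, dist_thresh=3.0):
    """For each pixel on the detected boundary line, compute the distance
    to the nearest pixel on the template boundary line (step S141) and
    keep the pixels whose distance exceeds the threshold (step S142)."""
    det_pts = np.argwhere(det > 0)
    tpl_pts = np.argwhere(tpl > 0)
    defects = []
    for p in det_pts:
        d = np.min(np.hypot(*(tpl_pts - p).T))  # nearest template pixel
        if d > dist_thresh:
            defects.append(tuple(p))
    return defects

# template: vertical boundary at column 5; detected boundary bulges
# out to column 10 for rows 4-6 (a 5-pixel deviation)
tpl = np.zeros((12, 12), dtype=np.uint8)
tpl[:, 5] = 1
det = tpl.copy()
det[4:7, 5] = 0
det[4:7, 10] = 1
bad = boundary_defect_pixels(det, tpl)
```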
In this example, after step S142, step S105 and step S106 may be further included:
step S105, obtaining defect characteristics according to the boundary defects.
In this example, the defect features include a length, a width, a gray level difference of the defect region, and the like, wherein the gray level difference of the defect region refers to a difference between the gray level of the defect region and the gray level of the defect expansion region.
And the defect area is a closed area surrounded by a template boundary line and a boundary line to be detected.
In this example, the gray level difference of the defective region may be obtained by a method including the steps of:
acquiring a gray value of a defect image, wherein the gray value of the defect image is an average value of gray values of all pixel points in the defect image, and the defect image is an image block corresponding to a defect area on an image to be detected;
the method comprises the steps of obtaining gray values of a defect expansion image, wherein the gray values of the defect expansion image are average values of gray values of all pixel points in the defect expansion image, and the defect expansion image is an image obtained after the defect image is expanded according to preset conditions.
And S106, selecting defective products according to the defect characteristics.
In this example, the condition for selecting the defective product according to the defect feature may be specifically set according to different selection accuracy. For example, a defect width threshold, a defect length threshold, and a defect gray level difference threshold may be set, and if the width of the defect is greater than the defect width threshold, the length of the defect is greater than the defect length threshold, and the defect gray level difference is greater than the defect gray level difference threshold, the product corresponding to the defect is defective.
The application mainly applies the idea of template matching: taking the template boundary line as a reference, boundary defects are determined by judging the deviation between the boundary line to be detected and the template boundary line. Mapping these deviations back to the image to be detected reveals the defects of the image edge line, and whether the corresponding product is defective can then be judged from the severity of the defects.
Specifically, firstly, a good product image is extracted to serve as a template image, a boundary line of an image is extracted to serve as a template boundary line image, then an image to be detected is obtained, the boundary line of the image is extracted to serve as a boundary line image to be detected, the boundary line image to be detected is matched with the template boundary line image to obtain a matched image, a defect area is extracted from the matched image, and an image block corresponding to the defect area on the image to be detected is the defect of a product to be detected. Further, whether the product having the defect is a defective product may be specifically determined according to the defect characteristics of the defect region, for example, the length, width, gray level difference, and the like of the defect.
The detection method provided by the application compensates for the detection blind area that conventional neighborhood detection and background fitting methods have for boundary display abnormalities of the liquid crystal screen, and can stably and effectively detect boundary defects of the sample to be detected that are wider than 3 pixels.
Correspondingly, the application also provides a device for detecting the image edge defect, which comprises:
the image acquisition unit is used for acquiring an image to be detected and a template boundary line image;
the image processing device comprises a boundary line image acquisition unit, a detection unit and a detection unit, wherein the boundary line image acquisition unit is used for acquiring a boundary line image to be detected, and the boundary line image to be detected is an image recorded with image boundary lines in the image to be detected;
the image matching unit, which is used for acquiring a matching image, wherein the matching image is a fused image in which the boundary line image to be detected and the template boundary line image have the highest degree of overlap;
and the boundary defect extraction unit is used for extracting boundary defects according to the matched images.
In one implementation, the boundary line image acquisition unit includes:
a boundary line extraction subunit, configured to obtain a boundary line of an image in the image to be detected;
and the image processing subunit is used for removing images except the boundary line in the image to be detected.
In one implementation, the image matching unit includes:
the position matching subunit is used for acquiring an optimal matching position, and the optimal matching position is determined by the boundary line image to be detected and the template boundary line image;
and the matching image acquisition subunit is used for acquiring a matching image, and the matching image is obtained by fusing the boundary line image to be detected and the template boundary line image at the optimal matching position.
Further, the location matching subunit may include:
the image moving subunit is used for moving the boundary line image to be detected and/or the template boundary line image according to a preset step length;
the area calculating subunit is used for obtaining an intersection area, wherein the intersection area is the area of intersection between the boundary line in the boundary line image to be detected and the corresponding boundary line in the template boundary line image;
and the position determining subunit is used for acquiring the optimal matching position, wherein the optimal matching position is the position with the largest intersection area.
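The three subunits above amount to an exhaustive shift search that maximizes the intersection area of the two boundary-line images. A sketch, assuming both images are binary boundary masks of equal size (the search range and step length are illustrative, not values fixed by the patent):

```python
import numpy as np

def best_match(test_edge, tmpl_edge, max_shift=3, step=1):
    """Move the boundary line image to be detected over the template
    boundary line image in `step`-pixel increments and keep the shift
    whose boundary intersection area is largest."""
    best_shift, best_area = (0, 0), -1
    for dy in range(-max_shift, max_shift + 1, step):
        for dx in range(-max_shift, max_shift + 1, step):
            shifted = np.roll(np.roll(test_edge, dy, axis=0), dx, axis=1)
            area = int(np.sum(shifted & tmpl_edge))  # intersection area
            if area > best_area:
                best_shift, best_area = (dy, dx), area
    return best_shift, best_area
```

Note that `np.roll` wraps around the image border, which is harmless while the boundary stays away from the edges of the frame; a zero-padded shift would avoid the wrap entirely.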
In one implementation, the boundary defect extraction unit includes:
a distance calculating subunit, configured to calculate a distance between each pixel point on the boundary line to be detected and a corresponding pixel point on the boundary line of the template;
and the boundary defect extraction subunit is used for extracting boundary defects according to the distances, wherein a defect is a set of neighboring pixel points whose distances are larger than a distance threshold.
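The two subunits can be sketched as a nearest-neighbor distance check from each aligned test-boundary pixel to the template boundary. This brute-force version assumes both inputs are non-empty binary masks; a distance transform over the template boundary would give the same result faster.

```python
import numpy as np

def boundary_defects(test_edge, tmpl_edge, dist_thr=3.0):
    """For each pixel on the aligned boundary line to be detected,
    compute the distance to the nearest template boundary pixel and
    collect the pixels whose distance exceeds dist_thr."""
    ty, tx = np.nonzero(tmpl_edge)  # template boundary coordinates
    defects = []
    for y, x in zip(*np.nonzero(test_edge)):
        d = np.min(np.hypot(ty - y, tx - x))  # nearest template pixel
        if d > dist_thr:
            defects.append((int(y), int(x), float(d)))
    return defects
```

Contiguous flagged pixels would then be grouped into defect regions whose length, width, and gray-level difference feed the screening rule described earlier.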
In a third aspect, the present application also provides a program for detecting an edge defect of an image, the program being configured to implement the steps of the method for detecting an edge defect of an image according to the first aspect.
In a fourth aspect, a computer readable storage medium has stored thereon computer instructions which, when executed by a processor, implement the steps of the method for detecting image edge defects according to the first aspect.
In a fifth aspect, a detection apparatus includes: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of detecting image edge defects of the first aspect described above.
The application has been described in detail in connection with the specific embodiments and exemplary examples thereof, but such description is not to be construed as limiting the application. It will be understood by those skilled in the art that various equivalent substitutions, modifications or improvements may be made to the technical solution of the present application and its embodiments without departing from the spirit and scope of the present application, and these fall within the scope of the present application. The scope of the application is defined by the appended claims.

Claims (5)

1. A method for detecting image boundary defects, the method comprising:
acquiring an image to be detected and a template boundary line image;
acquiring a boundary line image to be detected, wherein the boundary line image to be detected is an image recorded with image boundary lines in the image to be detected;
acquiring a matching image, wherein acquiring the matching image comprises: obtaining an optimal matching position, the optimal matching position being determined by the boundary line image to be detected and the template boundary line image, wherein obtaining the optimal matching position comprises the following steps: moving the boundary line image to be detected and the template boundary line image according to a preset step length; acquiring an intersection area, wherein the intersection area is the area of intersection between the boundary line in the boundary line image to be detected and the corresponding boundary line in the template boundary line image; and obtaining the optimal matching position, wherein the optimal matching position is the position with the largest intersection area; and obtaining the matching image, wherein the matching image is obtained by fusing the boundary line image to be detected and the template boundary line image at the optimal matching position, and the matching image is a fused image in which the boundary line image to be detected and the template boundary line image have the highest degree of overlap;
extracting boundary defects according to the matching image, wherein extracting the boundary defects according to the matching image comprises the following steps: calculating the distance between each pixel point on the boundary line to be detected and the corresponding pixel point on the template boundary line; and extracting the boundary defects according to the distances, wherein a defect is a set of neighboring pixel points whose distances are larger than a distance threshold.
2. The method of claim 1, wherein the acquiring the boundary line image to be detected comprises:
acquiring boundary lines of images in the image to be detected;
and removing the images except the boundary line in the image to be detected.
3. An apparatus for detecting image boundary defects, the apparatus comprising:
the image acquisition unit is used for acquiring an image to be detected and a template boundary line image;
the boundary line image acquisition unit, which is used for acquiring a boundary line image to be detected, wherein the boundary line image to be detected is an image recording the image boundary lines in the image to be detected;
an image matching unit, configured to acquire a matching image, wherein acquiring the matching image comprises: obtaining an optimal matching position, the optimal matching position being determined by the boundary line image to be detected and the template boundary line image, wherein obtaining the optimal matching position comprises the following steps: moving the boundary line image to be detected and the template boundary line image according to a preset step length; acquiring an intersection area, wherein the intersection area is the area of intersection between the boundary line in the boundary line image to be detected and the corresponding boundary line in the template boundary line image; and obtaining the optimal matching position, wherein the optimal matching position is the position with the largest intersection area; and obtaining the matching image, wherein the matching image is obtained by fusing the boundary line image to be detected and the template boundary line image at the optimal matching position, and the matching image is a fused image in which the boundary line image to be detected and the template boundary line image have the highest degree of overlap;
a boundary defect extracting unit, configured to extract boundary defects according to the matching image, wherein extracting the boundary defects according to the matching image comprises: calculating the distance between each pixel point on the boundary line to be detected and the corresponding pixel point on the template boundary line; and extracting the boundary defects according to the distances, wherein a defect is a set of neighboring pixel points whose distances are larger than a distance threshold.
4. A computer readable storage medium having stored thereon computer instructions which, when executed by a processor, implement the steps of the method of detecting image boundary defects of any of the preceding claims 1 to 2.
5. A detection apparatus, characterized in that the detection apparatus comprises: at least one processor; and a memory communicatively coupled to the at least one processor; wherein the memory stores instructions executable by the at least one processor to cause the at least one processor to perform the method of detecting image boundary defects of any one of claims 1 to 2.
CN202010104450.0A 2020-02-20 2020-02-20 Method and device for detecting image boundary defects Active CN111353974B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010104450.0A CN111353974B (en) 2020-02-20 2020-02-20 Method and device for detecting image boundary defects

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010104450.0A CN111353974B (en) 2020-02-20 2020-02-20 Method and device for detecting image boundary defects

Publications (2)

Publication Number Publication Date
CN111353974A CN111353974A (en) 2020-06-30
CN111353974B true CN111353974B (en) 2023-08-18

Family

ID=71197055

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010104450.0A Active CN111353974B (en) 2020-02-20 2020-02-20 Method and device for detecting image boundary defects

Country Status (1)

Country Link
CN (1) CN111353974B (en)

Families Citing this family (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113870264B (en) * 2021-12-02 2022-03-22 湖北全兴通管业有限公司 Tubular part port abnormity detection method and system based on image processing
CN117315303B (en) * 2023-11-28 2024-03-22 深圳市联嘉祥科技股份有限公司 Cable old damage degree analysis method, device and equipment based on image feature matching

Citations (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2006145377A (en) * 2004-11-19 2006-06-08 Toyota Motor Corp Method and apparatus for detecting defect on painted surface
JP2008185438A (en) * 2007-01-30 2008-08-14 Canon Chemicals Inc Method and device for detecting defect on cylindrical surface
CN102020036A (en) * 2010-11-02 2011-04-20 昆明理工大学 Visual detection method for transparent paper defect of outer package of strip cigarette
JP2011085820A (en) * 2009-10-16 2011-04-28 Sony Corp Device and method for defect correction
JP2011085821A (en) * 2009-10-16 2011-04-28 Sony Corp Device and method for defect correction
CN103134476A (en) * 2013-01-28 2013-06-05 中国科学院研究生院 Sea and land boundary detection method based on level set algorithm
CN105279492A (en) * 2015-10-22 2016-01-27 北京天诚盛业科技有限公司 Iris identification method and device
CN106855520A (en) * 2017-02-10 2017-06-16 南京航空航天大学 A kind of workpiece, defect detection method based on machine vision
CN108710883A (en) * 2018-06-04 2018-10-26 国网辽宁省电力有限公司信息通信分公司 A kind of complete conspicuousness object detecting method using contour detecting
CN109325930A (en) * 2018-09-12 2019-02-12 苏州优纳科技有限公司 Detection method, device and the detection device of boundary defect
CN109767445A (en) * 2019-02-01 2019-05-17 佛山市南海区广工大数控装备协同创新研究院 A kind of high-precision PCB defect intelligent detecting method


Non-Patent Citations (1)

* Cited by examiner, † Cited by third party
Title
刘建春. 基于机器视觉的金属边缘细微缺陷检测方法的研究 [Liu Jianchun. Research on a machine-vision-based method for detecting fine defects on metal edges]. 《工艺与检测》 (Process and Inspection). 2018, full text. *

Also Published As

Publication number Publication date
CN111353974A (en) 2020-06-30

Similar Documents

Publication Publication Date Title
CN109100363B (en) Method and system for distinguishing defects of attached foreign matters from dust
CN111353974B (en) Method and device for detecting image boundary defects
US7978904B2 (en) Pattern inspection apparatus and semiconductor inspection system
CN112577969B (en) Defect detection method and defect detection system based on machine vision
CN109752392B (en) PCB defect type detection system and method
CN110007493B (en) Method for detecting broken bright spots in liquid crystal display screen
WO2022110219A1 (en) Inspection method, device, and system for display panel
CN115272280A (en) Defect detection method, device, equipment and storage medium
CN108871185B (en) Method, device and equipment for detecting parts and computer readable storage medium
CN110567965A (en) Smartphone glass cover plate edge visual defect detection method
CN110376211B (en) Wet-process-gummed synthetic leather hemming on-line detection device and method
CN107515481A (en) The detection method and device of a kind of display panel
CN105303573A (en) Method and system of pin detection of gold needle elements
US11010884B2 (en) System and method for evaluating displays of electronic devices
CN115601359A (en) Welding seam detection method and device
KR102242996B1 (en) Method for atypical defects detect in automobile injection products
CN113538603A (en) Optical detection method and system based on array product and readable storage medium
WO2022222467A1 (en) Open circular ring workpiece appearance defect detection method and system, and computer storage medium
CN114677348A (en) IC chip defect detection method and system based on vision and storage medium
CN117058411B (en) Method, device, medium and equipment for identifying edge appearance flaws of battery
CN110602299B (en) Appearance defect detection method based on different inclination angles
CN115829981A (en) Visual detection method for end face damage of retired bearing
CN115147386A (en) Defect detection method and device for U-shaped pipe and electronic equipment
CN115546140A (en) Display panel detection method and system and electronic device
KR20140082335A (en) Method and apparatus of inspecting mura of flat display

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 100094 Beijing city Haidian District Cui Hunan loop 13 Hospital No. 7 Building 7 room 701

Applicant after: Lingyunguang Technology Co.,Ltd.

Address before: 100094 Beijing city Haidian District Cui Hunan loop 13 Hospital No. 7 Building 7 room 701

Applicant before: LUSTER LIGHTTECH GROUP Co.,Ltd.

TA01 Transfer of patent application right

Effective date of registration: 20230620

Address after: 215100 No.6, Shuangqi Road, Wuzhong Economic Development Zone, Suzhou City, Jiangsu Province

Applicant after: Suzhou lingyunguang Industrial Intelligent Technology Co.,Ltd.

Address before: 100094 701, 7 floor, 7 building, 13 Cui Hunan Ring Road, Haidian District, Beijing.

Applicant before: Lingyunguang Technology Co.,Ltd.

GR01 Patent grant