CN115690011A - Synthetic adhesive non-woven fabric stain detection method based on optical information - Google Patents

Synthetic adhesive non-woven fabric stain detection method based on optical information

Info

Publication number
CN115690011A
Authority
CN
China
Prior art keywords
image
evaluation index
difference
gray level
cluster
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202211270650.9A
Other languages
Chinese (zh)
Inventor
吉冠
吴华栋
于柠华
杜春
杨巧凤
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Jiangsu Xinyuan Medical Technology Co ltd
Original Assignee
Jiangsu Xinyuan Medical Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Jiangsu Xinyuan Medical Technology Co ltd filed Critical Jiangsu Xinyuan Medical Technology Co ltd
Priority to CN202211270650.9A priority Critical patent/CN115690011A/en
Publication of CN115690011A publication Critical patent/CN115690011A/en
Pending legal-status Critical Current

Classifications

    • Y: GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02: TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P: CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P 90/00: Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P 90/30: Computing systems specially adapted for manufacturing

Landscapes

  • Investigating Materials By The Use Of Optical Means Adapted For Particular Applications (AREA)

Abstract

The invention relates to the technical field of material defect detection and analysis, in particular to a synthetic adhesive non-woven fabric stain detection method based on optical information. The method comprises the steps of obtaining a first image of the non-woven fabric adhesive tape under a white light source and a second image under an alternative light source and calculating the difference degree between the first image and the second image, wherein the non-woven fabric adhesive tape is a suspected defect adhesive tape when the difference degree is larger than a preset threshold value; adjusting the color of the alternative light source to shoot a plurality of second images of the suspected defect adhesive tape, calculating the difference degree between each second image and the first image, taking a second image whose difference degree is larger than the preset threshold value as a contrast image, and subtracting the contrast image from the first image to obtain a difference image; and dividing the difference image into a plurality of sub-areas, clustering all the sub-areas to obtain a plurality of clustering clusters, and obtaining the stain area of the suspected defect adhesive tape according to the best clustering effect. The accuracy of the stain area analysis is thereby improved.

Description

Synthetic adhesive non-woven fabric stain detection method based on optical information
Technical Field
The invention relates to the technical field of material defect detection and analysis, in particular to a synthetic adhesive non-woven fabric stain detection method based on optical information.
Background
Medical non-woven fabric adhesive tape is an important medical resource in treatment and wound dressing. Because the non-woven tape itself is sticky, stains may contaminate the tape during manufacture, so surface stain detection is an important part of quality inspection in the production of non-woven tapes.
The existing method for detecting whether stains exist on the surface of the non-woven fabric is usually manual inspection, which consumes a large amount of human resources and is inefficient; moreover, visual inspection is often inaccurate because the surface texture of the non-woven fabric obscures the stains.
Disclosure of Invention
In order to solve the above technical problems, the present invention provides a method for detecting stains on a synthetic adhesive nonwoven fabric based on optical information, the method comprising the steps of:
acquiring a first image of a non-woven fabric adhesive tape under a white light source and a second image of the non-woven fabric adhesive tape under an alternative light source, and calculating the difference degree between the first image and the second image, wherein when the difference degree is greater than a preset threshold value, the non-woven fabric adhesive tape is a suspected defect adhesive tape;
adjusting the color of an alternative light source to shoot a plurality of second images of the suspected defect adhesive tape, calculating the difference degree between each second image and the first image, taking the second image with the difference degree larger than a preset threshold value as a contrast image, and subtracting the first image and the contrast image to obtain a difference image;
dividing the difference image into a plurality of sub-areas, clustering all the sub-areas to obtain a plurality of cluster clusters, and obtaining a stain area of the suspected defect adhesive tape according to the optimal clustering effect;
the evaluation method of the clustering effect comprises the following steps:
acquiring a first evaluation index based on the intra-class minimum principle of the cluster, and acquiring a second evaluation index based on the inter-class maximum principle of all clusters;
acquiring a cluster with the most elements in all cluster clusters as a reference cluster, and obtaining a third evaluation index based on the difference between the gray gradient direction and the expected gradient direction of each pixel point in the reference cluster;
dividing all the clustering clusters into a normal group and an abnormal group, and obtaining a fourth evaluation index based on the difference between the corresponding image characteristics of the normal group and the abnormal group;
and obtaining an evaluation index of the clustering effect according to the first evaluation index, the second evaluation index, the third evaluation index and the fourth evaluation index, wherein the best clustering effect is obtained when the evaluation index is the maximum.
Preferably, the step of calculating the degree of difference between the first image and the second image includes:
respectively obtaining the gray level intermediate value of a first image and the gray level intermediate value of a second image, and obtaining the difference degree between the first image and the second image based on the gray level intermediate values.
Preferably, the step of obtaining the difference degree between the first image and the second image based on the intermediate gray value includes:
acquiring an overlapping gray level range of the first image and the second image based on a gray level intermediate value of the first image and a gray level intermediate value of the second image, updating the gray level range of the first image and the gray level range of the second image based on the overlapping gray level range, and acquiring the probability of each gray level in the updated gray level range of the first image and the probability of each gray level in the updated gray level range of the second image; and obtaining the difference degree between the first image and the second image based on the difference of the probabilities between the gray levels of the corresponding positions.
Preferably, the difference between the first image and the contrast image is the difference between the gray value of each pixel in the first image and the gray value of the corresponding pixel in the contrast image.
Preferably, the method for dividing the difference image into a plurality of sub-regions is a region growing algorithm.
Preferably, the method for acquiring the first evaluation index includes:
acquiring an average gray value and a center coordinate corresponding to each sub-region, and calculating the first evaluation index based on the average gray value and the center coordinate as follows:
$$Z_{1}=\sum_{n=1}^{N}\sum_{i=1}^{m}\sqrt{\left(g_{i}-\bar{g}_{n}\right)^{2}+\left\lVert c_{i}-\bar{c}_{n}\right\rVert^{2}}$$
wherein:
$Z_{1}$ represents the first evaluation index;
$\bar{g}_{n}$ represents the average gray value of all sub-regions in the $n$-th clustering cluster;
$\bar{c}_{n}$ represents the center coordinate of the $n$-th clustering cluster;
$g_{i}$ represents the average gray value of the $i$-th sub-region in the clustering cluster;
$c_{i}$ represents the center coordinate of the $i$-th sub-region in the clustering cluster;
$m$ represents the number of all sub-regions in the clustering cluster;
$N$ represents the number of all clustering clusters.
Preferably, the calculation formula of the second evaluation index is:
$$Z_{2}=\sum_{n=1}^{N}\sqrt{\left(\bar{g}_{n}-\bar{g}\right)^{2}+\left\lVert \bar{c}_{n}-\bar{c}\right\rVert^{2}}$$
wherein:
$Z_{2}$ represents the second evaluation index;
$\bar{g}_{n}$ represents the average gray value of all sub-regions in the $n$-th clustering cluster;
$\bar{c}_{n}$ represents the center coordinate of the $n$-th clustering cluster;
$\bar{g}$ represents the average gray value of all clustering clusters;
$\bar{c}$ represents the center coordinate of all clustering clusters;
$N$ represents the number of all clustering clusters.
Preferably, the step of obtaining a third evaluation index based on a difference between the gradient direction of each pixel point in the reference cluster and an expected gradient direction includes:
and obtaining the average value of the difference between the gray gradient direction and the expected gradient direction of all the pixel points in the reference cluster as a third evaluation index.
Preferably, the step of obtaining a fourth evaluation index based on a difference between the corresponding image features of the normal group and the abnormal group includes:
acquiring a gray level co-occurrence matrix corresponding to the normal group and a gray level co-occurrence matrix corresponding to the abnormal group, and calculating a description operator corresponding to each gray level co-occurrence matrix, wherein the description operator is contrast and energy; and obtaining a fourth evaluation index based on the difference between the descriptor corresponding to the normal group and the descriptor corresponding to the abnormal group.
Preferably, the evaluation index is in a negative correlation with the first evaluation index, in a positive correlation with the second evaluation index, in a negative correlation with the third evaluation index, and in a positive correlation with the fourth evaluation index.
The invention has the following beneficial effects: whether a stain area exists is preliminarily judged based on the first image and the second image of the non-woven fabric double-sided adhesive tape under different point light sources, and suspected defect adhesive tapes that may carry stains are further analyzed to obtain a difference image. The difference image is divided into regions and clustered, and the stain area and the normal area are taken from the clustering clusters obtained when the clustering effect is best. Evaluating the clustering effect improves the accuracy of the cluster analysis, and detecting the stain area from a more accurate clustering result makes the detection result more reliable and accurate.
Drawings
In order to more clearly illustrate the embodiments of the present invention or the technical solutions and advantages of the prior art, the drawings used in the description of the embodiments or the prior art will be briefly described below. It is obvious that the drawings in the following description are only some embodiments of the present invention, and other drawings can be obtained by those skilled in the art without creative effort.
FIG. 1 is a flowchart of a method for detecting stains on a synthetic adhesive non-woven fabric based on optical information according to an embodiment of the present invention.
Detailed Description
To further illustrate the technical means and effects adopted by the present invention to achieve the intended objects, a detailed description of the method for detecting stains on synthetic adhesive non-woven fabric based on optical information, including its specific implementation, structure, features and effects, is given below with reference to the accompanying drawings and preferred embodiments. In the following description, different references to "one embodiment" or "another embodiment" do not necessarily refer to the same embodiment. Furthermore, particular features, structures, or characteristics may be combined in any suitable manner in one or more embodiments.
Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this invention belongs.
The method is suitable for stain detection on the double-sided adhesive tape of synthetic adhesive non-woven fabric: it analyzes the differences between images acquired under different light sources and obtains the final stain area based on the best clustering effect, improving the accuracy of analysis and identification.
The following describes a specific scheme of the method for detecting stains on synthetic adhesive non-woven fabric based on optical information in detail with reference to the accompanying drawings.
Referring to fig. 1, a flow chart of a method for detecting stains on a synthetic adhesive nonwoven fabric based on optical information according to an embodiment of the present invention is shown, the method including the following steps:
and S100, acquiring a first image of the non-woven fabric adhesive tape under a white light source and a second image of the non-woven fabric adhesive tape under an alternative light source, and calculating the difference degree between the first image and the second image, wherein when the difference degree is greater than a preset threshold value, the non-woven fabric adhesive tape is a suspected defect adhesive tape.
Specifically, because the non-woven fabric double-sided adhesive tape is translucent, a point light source and a camera are arranged to acquire images of the tape: the point light source is placed on one side of the tape and the camera on the other side. The point light source is a white light source, and the camera collects an image of the non-woven fabric double-sided adhesive tape to obtain a first image. To increase the accuracy of the analysis, an identical point light source whose color differs from white is additionally arranged at the camera side, with the two point light sources and the camera on the same line; the camera then collects a second image of the non-woven fabric double-sided adhesive tape under this point light source.
Furthermore, because the non-woven double-sided adhesive tape has certain texture characteristics, stains on the surface of the non-woven fabric are difficult to detect directly. However, when a stain contaminates the double-sided adhesive, it may permeate the material and change the light transmittance of the stained area, so whether a stain area exists on the non-woven double-sided adhesive tape can be preliminarily judged based on the color of the light source.
When no stain area exists on the non-woven double-sided adhesive tape, the outlines of the gray level histograms of the first image and the second image, acquired under different light sources, are consistent. When a stain area exists, the surface of the tape changes and its sensitivity to light becomes inconsistent, so the shapes of the gray level histograms of the first image and the second image under the different light sources differ; the difference degree between the first image and the second image is therefore calculated. The specific method for acquiring the difference degree between the first image and the second image is as follows:
firstly, the gray level intermediate value of the first image and the gray level intermediate value of the second image are respectively obtained, and the difference degree between the first image and the second image is obtained based on the gray level intermediate values.
Acquiring a gray level intermediate value corresponding to the first image; counting the occurrence frequency of each gray level in the first image to obtain the occurrence probability of each gray level, accumulating corresponding probabilities from the minimum value of the gray levels, and when the sum of the accumulated probabilities is equal to 0.5, recording the corresponding gray level as the gray level intermediate value of the first image, namely:
$$\sum_{i=g_{\min}}^{T_{1}}p_{i}=0.5$$
wherein:
$g_{\min}$ represents the minimum gray level in the first image;
$p_{i}$ represents the probability corresponding to the $i$-th gray level in the first image;
$T_{1}$ represents the gray level at which the accumulated probability reaches 0.5, i.e. the gray level intermediate value of the first image.
That is, the probabilities corresponding to the gray levels of the first image are accumulated in order from the smallest gray level to the largest; when the accumulation reaches the $T_{1}$-th gray level, the sum of the probabilities of all accumulated gray levels is 0.5, so the $T_{1}$-th gray level is taken as the gray level intermediate value.
The gray level intermediate value of the second image is obtained in the same way, and the gray level intermediate values of the first image and the second image are recorded as $T_{1}$ and $T_{2}$, respectively.
Then, acquiring an overlapping gray level range of the first image and the second image based on the gray level intermediate value of the first image and the gray level intermediate value of the second image, updating the gray level range of the first image and the gray level range of the second image based on the overlapping gray level range, and acquiring the probability of each gray level in the updated gray level range of the first image and the probability of each gray level in the updated gray level range of the second image; and obtaining the difference degree between the first image and the second image based on the difference of the probabilities between the gray levels of the corresponding positions.
Specifically, the overlapping gray level range between the first image and the second image is obtained based on the gray level intermediate value of each image. Let the original gray level range of the first image be $[a_{1},\,b_{1}]$ and that of the second image be $[a_{2},\,b_{2}]$, where $a_{1}$ and $b_{1}$ represent the minimum and maximum gray levels in the first image, and $a_{2}$ and $b_{2}$ represent the minimum and maximum gray levels in the second image.
The overlapping gray level range is the range shared by the two images when each range is measured relative to its own gray level intermediate value, with lower half-width $r^{-}=\min\left(T_{1}-a_{1},\,T_{2}-a_{2}\right)$ and upper half-width $r^{+}=\min\left(b_{1}-T_{1},\,b_{2}-T_{2}\right)$.
Based on the overlapping gray level range, the gray level range of the first image is updated to $[T_{1}-r^{-},\,T_{1}+r^{+}]$, and the gray level range of the second image is updated to $[T_{2}-r^{-},\,T_{2}+r^{+}]$.
As an example, if the gray level intermediate value of the first image is 40 and that of the second image is 50, the half-widths $r^{-}$ and $r^{+}$ are computed from the two original gray level ranges as above, and each image's gray level range is updated to an interval of the same width positioned around its own intermediate value.
And finally, acquiring the probability of each updated gray level in the first image and the probability of each updated gray level in the second image, and acquiring the difference degree between the first image and the second image based on the difference between the probabilities as follows:
$$D=\sum_{k=-r^{-}}^{\,r^{+}}\left|\,p^{(1)}_{T_{1}+k}-p^{(2)}_{T_{2}+k}\right|$$
wherein:
$D$ represents the difference degree;
$p^{(1)}_{T_{1}+k}$ represents the probability of the gray level at offset $k$ from the intermediate value $T_{1}$ within the updated gray level range of the first image;
$p^{(2)}_{T_{2}+k}$ represents the probability of the gray level at the same offset $k$ from the intermediate value $T_{2}$ within the updated gray level range of the second image;
$T_{1}$ and $T_{2}$ represent the gray level intermediate values of the first image and the second image, respectively;
$-r^{-}$ and $r^{+}$ correspond to the minimum and maximum gray levels of the overlapping gray level range, expressed as offsets from the intermediate values.
That is, a gray level in the updated range of the first image and a gray level in the updated range of the second image correspond to each other when they lie at the same offset from their respective gray level intermediate values. When the difference degree is greater than the preset threshold value, a large difference exists between the first image and the second image, so the currently photographed non-woven fabric double-sided adhesive tape may carry stains; it is marked as a suspected defect adhesive tape and requires further analysis.
Preferably, the threshold value of the difference degree is set to be 0.05 in the embodiment of the invention, namely when the difference degree between the first image and the second image is greater than 0.05, the non-woven double-sided adhesive tape needs to be further analyzed.
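To make the computation above concrete, the following Python sketch computes the gray level intermediate value of each image, aligns the two histograms at their intermediate values, and accumulates the absolute probability differences over the common range. It is a minimal illustration under the reading of the overlapping range given above, assuming 8-bit grayscale inputs; the function names (`gray_median`, `difference_degree`) and the use of NumPy are choices of this sketch, not of the patent.

```python
import numpy as np

def gray_median(img):
    """Gray level at which the cumulative probability first reaches 0.5."""
    hist = np.bincount(img.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    return int(np.searchsorted(np.cumsum(prob), 0.5))

def difference_degree(img1, img2):
    """Sum of absolute probability differences at corresponding offsets
    from each image's gray level intermediate value (assumed reading)."""
    t1, t2 = gray_median(img1), gray_median(img2)
    a1, b1 = int(img1.min()), int(img1.max())
    a2, b2 = int(img2.min()), int(img2.max())
    r_lo = min(t1 - a1, t2 - a2)          # common lower half-width
    r_hi = min(b1 - t1, b2 - t2)          # common upper half-width
    p1 = np.bincount(img1.ravel(), minlength=256) / img1.size
    p2 = np.bincount(img2.ravel(), minlength=256) / img2.size
    d = 0.0
    for k in range(-r_lo, r_hi + 1):      # gray levels at corresponding positions
        d += abs(p1[t1 + k] - p2[t2 + k])
    return d

# usage: the tape is flagged as a suspected defect tape when the degree exceeds 0.05
# suspected = difference_degree(first_img, second_img) > 0.05
```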
Step S200, adjusting the color of the alternative light source to shoot a plurality of second images of the suspected defect adhesive tape, calculating the difference degree between each second image and the first image, taking the second image with the difference degree larger than a preset threshold value as a contrast image, and subtracting the first image and the contrast image to obtain a difference image.
Specifically, the non-woven double-sided adhesive tape obtained in step S100 is a suspected defect adhesive tape that may carry stains. Because a stain differs from the non-woven material itself, the point light source is changed to further determine the stain area. In the embodiment of the invention, a new second image of the suspected defect adhesive tape is obtained by adjusting the color of the point light source and is recorded as a contrast image. The contrast image is confirmed as follows: new second images of the suspected defect adhesive tape are acquired under point light sources of different colors, and the difference degree between each second image and the first image of step S100 is calculated with the method of step S100; the color of the point light source whose second image yields a difference degree greater than the preset threshold value is taken as the determined color, and that second image is the contrast image.
Preferably, in the embodiment of the present invention, the preset threshold is set to 0.1, that is, when the difference between the second image and the first image is greater than 0.1, the color of the corresponding point light source is the final color, and the second image is the contrast image at this time.
It should be noted that, when the difference degrees between all the second images and the first image are less than 0.1, the second image corresponding to the largest difference degree among all the difference degrees is selected as the final contrast image.
Further, the contrast image is subtracted from the first image to obtain a difference image; that is, the difference is taken between the gray value of each pixel point in the first image and the gray value of the corresponding pixel point in the contrast image:
$$d(x,y)=g_{1}(x,y)-g_{c}(x,y)$$
wherein:
$g_{1}(x,y)$ represents the gray value of the pixel point with coordinates $(x,y)$ in the first image;
$g_{c}(x,y)$ represents the gray value of the pixel point with coordinates $(x,y)$ in the contrast image;
$d(x,y)$ represents the difference value of the pixel point.
By analogy, the difference value of each pixel point between the first image and the contrast image is obtained, and the image formed by the difference values of all pixel points is taken as the difference image.
And step S300, dividing the difference image into a plurality of sub-areas, clustering all the sub-areas to obtain a plurality of cluster clusters, and obtaining a stain area of the suspected defect adhesive tape according to the optimal clustering effect.
The difference image corresponding to the suspected defect tape is obtained in step S200 and is preliminarily divided into a plurality of sub-regions. In the embodiment of the invention, the division uses a region growing algorithm: any point in the difference image serves as a seed point, region growing is performed with the seed point as the center, and the growing condition is that a point is consistent with the gray value of the seed point. Proceeding in this way, the difference image is divided into a plurality of sub-regions.
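A region-growing split of the kind described can be sketched as follows. It assumes the growing condition means that a neighboring pixel joins the region when its gray value matches the seed's gray value within a small tolerance; the `tol` parameter and the 4-neighborhood are assumptions of this sketch.

```python
import numpy as np
from collections import deque

def region_grow_labels(diff_img, tol=0):
    """Label connected sub-regions of the difference image by growing from
    unlabeled seed points; a neighbor joins a region when its gray value
    differs from the seed's by at most `tol` (an assumed parameter)."""
    h, w = diff_img.shape
    labels = np.zeros((h, w), dtype=np.int32)   # 0 = unlabeled
    current = 0
    for sy in range(h):
        for sx in range(w):
            if labels[sy, sx]:
                continue
            current += 1
            seed_val = int(diff_img[sy, sx])
            labels[sy, sx] = current
            queue = deque([(sy, sx)])
            while queue:
                y, x = queue.popleft()
                for ny, nx in ((y - 1, x), (y + 1, x), (y, x - 1), (y, x + 1)):
                    if 0 <= ny < h and 0 <= nx < w and not labels[ny, nx] \
                            and abs(int(diff_img[ny, nx]) - seed_val) <= tol:
                        labels[ny, nx] = current
                        queue.append((ny, nx))
    return labels   # labels 1..current identify the sub-regions
```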
Further, acquiring the average gray value and the center coordinate of all pixel points in each sub-region, wherein the center coordinate is the coordinate of the center point in the sub-region; and clustering all the sub-areas according to the average gray value and the central coordinate of each sub-area, so that the areas with the stains are divided into the same category, and the stain areas are obtained. In the embodiment of the invention, a k-means clustering algorithm of a self-adaptive k value is adopted, and the optimal clustering effect is obtained by continuously updating the k value, so that a stain area in the suspected defect adhesive tape is obtained. When all the subregions are clustered, the distance between any two subregions is as follows:
$$d_{ab}=\sqrt{\left(g_{a}-g_{b}\right)^{2}+\left\lVert c_{a}-c_{b}\right\rVert^{2}}$$
wherein:
$d_{ab}$ represents the distance between the two sub-regions;
$g_{a}$ represents the average gray value of the $a$-th sub-region;
$g_{b}$ represents the average gray value of the $b$-th sub-region;
$c_{a}$ represents the center coordinate of the $a$-th sub-region;
$c_{b}$ represents the center coordinate of the $b$-th sub-region.
Preferably, in the embodiment of the invention, the k value range in the k-means clustering algorithm is set as [1,20], and different clustering clusters are obtained by continuously updating the value of k; and selecting the one with the best clustering effect in multiple clustering as a final clustering result.
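Since the distance defined above is exactly the Euclidean distance on the feature vector (average gray value, center row, center column), the clustering can be run with an off-the-shelf k-means on those three features. The sketch below assumes scikit-learn's `KMeans` and the sub-region labels produced by the region-growing sketch; the scoring callback is meant to be the clustering-effect evaluation built from the four indices defined later in this section.

```python
import numpy as np
from sklearn.cluster import KMeans

def subregion_features(diff_img, labels):
    """Per sub-region feature vector: [average gray value, center row, center col]."""
    feats = []
    for lab in range(1, labels.max() + 1):
        ys, xs = np.nonzero(labels == lab)
        feats.append([diff_img[ys, xs].mean(), ys.mean(), xs.mean()])
    return np.asarray(feats)

def best_clustering(features, k_range=range(1, 21), score_fn=None):
    """Try each candidate k and keep the assignment whose evaluation index is largest."""
    best_assign, best_score = None, -np.inf
    for k in k_range:
        if k > len(features):
            break
        assign = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(features)
        score = score_fn(features, assign) if score_fn else 0.0
        if score > best_score:
            best_assign, best_score = assign, score
    return best_assign
```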
The evaluation method of the clustering effect comprises the following steps: acquiring a first evaluation index based on an intra-class minimum principle of a cluster, and acquiring a second evaluation index based on an inter-class maximum principle of the cluster; acquiring a cluster with the most elements in all the cluster clusters as a reference cluster, and obtaining a third evaluation index based on the difference between the gradient direction of each pixel point in the reference cluster and the expected gradient direction; dividing all clustering clusters into normal groups and abnormal groups, and obtaining a fourth evaluation index based on the difference between the corresponding image characteristics of the normal groups and the abnormal groups; and obtaining an evaluation index of the clustering effect according to the first evaluation index, the second evaluation index, the third evaluation index and the fourth evaluation index, wherein the best clustering effect is obtained when the evaluation index is the maximum.
Firstly, the clustering effect should satisfy the condition that the difference between all sub-regions in each clustering cluster is minimum, namely the first evaluation index for obtaining the clustering effect based on the principle of minimum intra-class difference is as follows:
$$Z_{1}=\sum_{n=1}^{N}\sum_{i=1}^{m}\sqrt{\left(g_{i}-\bar{g}_{n}\right)^{2}+\left\lVert c_{i}-\bar{c}_{n}\right\rVert^{2}}$$
wherein:
$Z_{1}$ represents the first evaluation index;
$\bar{g}_{n}$ represents the average gray value of all sub-regions in the $n$-th clustering cluster;
$\bar{c}_{n}$ represents the center coordinate of the $n$-th clustering cluster;
$g_{i}$ represents the average gray value of the $i$-th sub-region in the clustering cluster;
$c_{i}$ represents the center coordinate of the $i$-th sub-region in the clustering cluster;
$m$ represents the number of all sub-regions in the clustering cluster;
$N$ represents the number of all clustering clusters.
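Under this reconstruction the intra-class term can be computed directly from the sub-region features and the cluster assignment; the sketch below sums the per-cluster terms, which is an assumed aggregation.

```python
import numpy as np

def first_index(features, assign):
    """Sum, over clusters, of each sub-region's distance (in gray value and
    center coordinates) to its own cluster mean; smaller means tighter clusters."""
    z1 = 0.0
    for lab in np.unique(assign):
        members = features[assign == lab]      # rows: [gray, center_y, center_x]
        center = members.mean(axis=0)
        z1 += np.sqrt(((members - center) ** 2).sum(axis=1)).sum()
    return z1
```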
Then, the clustering effect should satisfy the condition that the difference between each clustering cluster is the largest, that is, the second evaluation index for obtaining the clustering effect based on the principle of the largest difference between the clusters is as follows:
$$Z_{2}=\sum_{n=1}^{N}\sqrt{\left(\bar{g}_{n}-\bar{g}\right)^{2}+\left\lVert \bar{c}_{n}-\bar{c}\right\rVert^{2}}$$
wherein:
$Z_{2}$ represents the second evaluation index;
$\bar{g}_{n}$ represents the average gray value of all sub-regions in the $n$-th clustering cluster;
$\bar{c}_{n}$ represents the center coordinate of the $n$-th clustering cluster;
$\bar{g}$ represents the average gray value of all clustering clusters;
$\bar{c}$ represents the center coordinate of all clustering clusters;
$N$ represents the number of all clustering clusters.
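The inter-class term can be sketched the same way, comparing each cluster's mean feature vector with the mean over all clusters.

```python
import numpy as np

def second_index(features, assign):
    """Sum of distances from each cluster's (gray, center) mean to the mean over
    all clusters; larger means better-separated clusters."""
    centers = np.array([features[assign == lab].mean(axis=0)
                        for lab in np.unique(assign)])
    overall = centers.mean(axis=0)
    return np.sqrt(((centers - overall) ** 2).sum(axis=1)).sum()
```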
Because the stain area on the non-woven double-sided adhesive tape occupies only a small part, the cluster containing the most elements among all the clustering clusters is the normal part of the suspected defect adhesive tape. The expected gradient direction of each pixel point in this cluster is related to its position: under the irradiation of a point light source, the expected gradient direction of a pixel point diverges outward from the pixel point directly facing the point light source. The actual gray gradient direction of each pixel point in the clustering result may deviate from this, so the average value of the difference between the gray gradient direction and the expected gradient direction over all pixel points in the reference cluster is taken as the third evaluation index.
Specifically, the cluster including the most pixel points in all the clusters is obtained as a reference cluster, and the gray gradient direction of each pixel point in the reference cluster is obtained, and the calculation method of the gradient direction is a known technology and is not described again. The expected gradient direction of the pixel point can be obtained according to the position of the point light source and the position of the pixel point as follows:
$$\theta_{e}=\arctan\left(\frac{y-y_{0}}{x-x_{0}}\right)$$
wherein:
$\theta_{e}$ represents the expected gradient direction of the pixel point;
$(x,y)$ represents the coordinate position of the pixel point;
$(x_{0},y_{0})$ represents the coordinate position of the pixel point directly facing the point light source, i.e. the coordinate position of the central pixel point of the first image.
The difference between the gray gradient direction and the expected gradient direction of a pixel point is taken as the absolute value of the difference between the two directions. By analogy, the difference corresponding to every pixel point in the reference cluster is obtained, and the third evaluation index of the clustering effect is obtained from these differences as follows:
$$Z_{3}=\frac{1}{M}\sum_{i=1}^{M}\Delta\theta_{i}$$
wherein:
$Z_{3}$ represents the third evaluation index;
$\Delta\theta_{i}$ represents the difference between the gray gradient direction and the expected gradient direction of the $i$-th pixel point in the reference cluster;
$M$ represents the number of all pixel points in the reference cluster.
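A sketch of the third term is given below. It assumes the gray gradient direction is taken from Sobel derivatives, the expected direction is the angle of the vector from the central pixel (the point facing the light source) to the pixel, and the angular difference is wrapped into [0, π]; the Sobel operator and the wrapping step are implementation choices, not requirements stated in the patent.

```python
import numpy as np
from scipy import ndimage

def third_index(first_img, ref_mask):
    """Mean absolute difference between each reference-cluster pixel's gray
    gradient direction and the direction expected under the point light source."""
    gy = ndimage.sobel(first_img.astype(float), axis=0)
    gx = ndimage.sobel(first_img.astype(float), axis=1)
    grad_dir = np.arctan2(gy, gx)

    h, w = first_img.shape
    cy, cx = h // 2, w // 2                     # pixel directly facing the light source
    ys, xs = np.nonzero(ref_mask)               # pixels of the reference cluster
    expect_dir = np.arctan2(ys - cy, xs - cx)   # divergent, radial direction

    diff = np.abs(grad_dir[ys, xs] - expect_dir)
    diff = np.minimum(diff, 2 * np.pi - diff)   # wrap the angular difference
    return diff.mean()
```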
Further, considering that the texture difference between a stain area and a normal area of the non-woven fabric synthetic adhesive is large, all the clustering clusters obtained by clustering are preliminarily classified: the cluster with the most pixel points is recorded as the normal group, and all other clustering clusters are recorded as the abnormal group. The gray level co-occurrence matrix corresponding to the normal group and the gray level co-occurrence matrix corresponding to the abnormal group are obtained, the description operators of each gray level co-occurrence matrix, namely contrast and energy, are calculated, and a fourth evaluation index is obtained based on the difference between the description operators of the normal group and those of the abnormal group.
Specifically, the gray level co-occurrence matrix of the image in the region corresponding to the normal group and that of the region corresponding to the abnormal group are obtained, and the corresponding description operators are computed from each matrix. In the embodiment of the invention the description operators are ASM (angular second moment) energy and CON (contrast): the ASM energy reflects the uniformity of the image distribution and the coarseness of the texture, and the CON contrast reflects the clarity of the image and the depth of the texture. The gray level co-occurrence matrix and the calculation of its description operators are known techniques and are not described in detail.
The fourth evaluation index for obtaining the clustering effect based on the description operators corresponding to the normal group and the abnormal group is as follows:
$$Z_{4}=\left|\,CON_{1}-CON_{2}\right|+\left|\,ASM_{1}-ASM_{2}\right|$$
wherein:
$Z_{4}$ represents the fourth evaluation index;
$CON_{1}$ represents the contrast of the gray level co-occurrence matrix corresponding to the normal group;
$CON_{2}$ represents the contrast of the gray level co-occurrence matrix corresponding to the abnormal group;
$ASM_{1}$ represents the energy of the gray level co-occurrence matrix corresponding to the normal group;
$ASM_{2}$ represents the energy of the gray level co-occurrence matrix corresponding to the abnormal group.
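The texture comparison can be sketched with scikit-image's gray level co-occurrence matrix utilities. The sum of absolute descriptor differences, the single distance/angle pair, and masking non-group pixels to zero are assumptions of this sketch; the patent does not specify on which image the co-occurrence matrices are computed, so the image is passed in as a parameter.

```python
import numpy as np
from skimage.feature import graycomatrix, graycoprops

def glcm_descriptors(img, mask):
    """Contrast and energy of the GLCM computed over the masked region."""
    region = np.where(mask, img, 0).astype(np.uint8)
    glcm = graycomatrix(region, distances=[1], angles=[0],
                        levels=256, symmetric=True, normed=True)
    return (graycoprops(glcm, 'contrast')[0, 0],
            graycoprops(glcm, 'energy')[0, 0])

def fourth_index(img, normal_mask, abnormal_mask):
    """Difference between the normal and abnormal groups' texture descriptors."""
    con_n, asm_n = glcm_descriptors(img, normal_mask)
    con_a, asm_a = glcm_descriptors(img, abnormal_mask)
    return abs(con_n - con_a) + abs(asm_n - asm_a)
```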
And obtaining the evaluation index of the clustering effect according to the obtained first evaluation index, the second evaluation index, the third evaluation index and the fourth evaluation index of the clustering effect, wherein the evaluation index is in a negative correlation with the first evaluation index, in a positive correlation with the second evaluation index, in a negative correlation with the third evaluation index and in a positive correlation with the fourth evaluation index. The evaluation index is then:
$$W=\left(Z_{2}+Z_{4}\right)\cdot\exp\left(-\left(Z_{1}+Z_{3}\right)\right)$$
wherein:
$W$ represents the evaluation index of the clustering effect;
$Z_{1}$ represents the first evaluation index;
$Z_{2}$ represents the second evaluation index;
$Z_{3}$ represents the third evaluation index;
$Z_{4}$ represents the fourth evaluation index;
$\exp(\cdot)$ represents the exponential function.
In the embodiment of the present invention, the normalized first, second, third and fourth evaluation indexes are used in the calculation of the evaluation index. By analogy, the evaluation indexes of the clustering effects corresponding to different k values in the k-means clustering algorithm are obtained; when the evaluation index is maximal, the corresponding clustering effect is the best. Among all the clustering clusters obtained at that point, the clustering cluster containing the most pixel points is the normal area of the suspected defect adhesive tape, and the remaining areas are the stain areas of the suspected defect adhesive tape.
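One scoring function consistent with the stated relationships (decreasing in the first and third indexes, increasing in the second and fourth, with an exponential) is sketched below, together with the min-max normalization across candidate k values; the exact combination is an assumption, since only the monotonic relationships are recoverable from the text. A small wrapper that computes the four indices for a given assignment and calls `clustering_score` can serve as the `score_fn` callback in the earlier k-means sketch.

```python
import numpy as np

def clustering_score(z1, z2, z3, z4):
    """Larger is better: decreases with z1, z3 and increases with z2, z4."""
    return (z2 + z4) * np.exp(-(z1 + z3))

def pick_best_k(index_table):
    """index_table: {k: (z1, z2, z3, z4)} raw values for each candidate k.
    Min-max normalize each index across k, score, and return the best k."""
    ks = sorted(index_table)
    raw = np.array([index_table[k] for k in ks], dtype=float)
    span = raw.max(axis=0) - raw.min(axis=0)
    norm = (raw - raw.min(axis=0)) / np.where(span == 0, 1, span)
    scores = [clustering_score(*row) for row in norm]
    return ks[int(np.argmax(scores))]
```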
In summary, the embodiment of the present invention provides a synthetic adhesive non-woven fabric stain detection method based on optical information. Images of the non-woven double-sided adhesive tape under different light sources are analyzed and subtracted to obtain a difference image; the difference image is clustered multiple times and each clustering result is evaluated. In the clustering with the best effect, the cluster containing the most pixel points is the normal area of the double-sided tape, and the areas corresponding to the other clusters are its stain areas. Obtaining the stain area from a more accurate clustering result improves the accuracy of the analysis.
It should be noted that: the precedence order of the above embodiments of the present invention is only for description, and does not represent the merits of the embodiments. In addition, the processes depicted in the accompanying figures do not necessarily require the particular order shown, or sequential order, to achieve desirable results. In some embodiments, multitasking and parallel processing may also be possible or may be advantageous.
The embodiments in the present specification are described in a progressive manner, and the same and similar parts among the embodiments are referred to each other, and each embodiment focuses on the differences from the other embodiments.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and should not be taken as limiting the scope of the present invention, which is intended to cover any modifications, equivalents, improvements, etc. within the spirit of the present invention.

Claims (10)

1. A synthetic adhesive non-woven fabric stain detection method based on optical information is characterized by comprising the following steps:
acquiring a first image of a non-woven fabric adhesive tape under a white light source and a second image of the non-woven fabric adhesive tape under an alternative light source, and calculating the difference degree between the first image and the second image, wherein when the difference degree is greater than a preset threshold value, the non-woven fabric adhesive tape is a suspected defect adhesive tape;
adjusting the color of an alternative light source to shoot a plurality of second images of the suspected defect adhesive tape, calculating the difference degree between each second image and the first image, taking the second image with the difference degree larger than a preset threshold value as a contrast image, and subtracting the first image and the contrast image to obtain a difference image;
dividing the difference image into a plurality of sub-areas, clustering all the sub-areas to obtain a plurality of cluster clusters, and obtaining a stain area of the suspected defect adhesive tape according to the optimal clustering effect;
the evaluation method of the clustering effect comprises the following steps:
acquiring a first evaluation index based on an intra-class minimum principle of the cluster, and acquiring a second evaluation index based on an inter-class maximum principle of all clusters;
acquiring a cluster with the most elements in all cluster clusters as a reference cluster, and acquiring a third evaluation index based on the difference between the gray gradient direction and the expected gradient direction of each pixel point in the reference cluster;
dividing all the clustering clusters into a normal group and an abnormal group, and obtaining a fourth evaluation index based on the difference between the corresponding image characteristics of the normal group and the abnormal group;
and obtaining an evaluation index of the clustering effect according to the first evaluation index, the second evaluation index, the third evaluation index and the fourth evaluation index, wherein the best clustering effect is obtained when the evaluation index is the maximum.
2. The method of claim 1, wherein the step of calculating the difference between the first image and the second image comprises:
respectively obtaining the gray level intermediate value of a first image and the gray level intermediate value of a second image, and obtaining the difference degree between the first image and the second image based on the gray level intermediate values.
3. The method as claimed in claim 2, wherein the step of obtaining the difference between the first image and the second image based on the gray level intermediate value comprises:
acquiring an overlapping gray level range of the first image and the second image based on a gray level intermediate value of the first image and a gray level intermediate value of the second image, updating the gray level range of the first image and the gray level range of the second image based on the overlapping gray level range, and acquiring the probability of each gray level in the updated gray level range of the first image and the probability of each gray level in the updated gray level range of the second image; and obtaining the difference degree between the first image and the second image based on the difference of the probabilities between the gray levels of the corresponding positions.
4. The method as claimed in claim 1, wherein the difference between the first image and the contrast image is the difference between the gray level of each pixel in the first image and the gray level of the corresponding pixel in the contrast image.
5. The method as claimed in claim 1, wherein the method of dividing the difference image into a plurality of sub-regions is a region growing algorithm.
6. The method for detecting stains on synthetic adhesive non-woven fabric based on optical information as claimed in claim 1, wherein the method for obtaining the first evaluation index comprises:
acquiring an average gray value and a center coordinate corresponding to each sub-region, and calculating the first evaluation index based on the average gray value and the center coordinate as follows:
$$Z_{1}=\sum_{n=1}^{N}\sum_{i=1}^{m}\sqrt{\left(g_{i}-\bar{g}_{n}\right)^{2}+\left\lVert c_{i}-\bar{c}_{n}\right\rVert^{2}}$$
wherein:
$Z_{1}$ represents the first evaluation index;
$\bar{g}_{n}$ represents the average gray value of all sub-regions in the $n$-th clustering cluster;
$\bar{c}_{n}$ represents the center coordinate of the $n$-th clustering cluster;
$g_{i}$ represents the average gray value of the $i$-th sub-region in the clustering cluster;
$c_{i}$ represents the center coordinate of the $i$-th sub-region in the clustering cluster;
$m$ represents the number of all sub-regions in the clustering cluster;
$N$ represents the number of all clustering clusters.
7. The method of claim 6, wherein the second evaluation index is calculated by the following formula:
$$Z_{2}=\sum_{n=1}^{N}\sqrt{\left(\bar{g}_{n}-\bar{g}\right)^{2}+\left\lVert \bar{c}_{n}-\bar{c}\right\rVert^{2}}$$
wherein:
$Z_{2}$ represents the second evaluation index;
$\bar{g}_{n}$ represents the average gray value of all sub-regions in the $n$-th clustering cluster;
$\bar{c}_{n}$ represents the center coordinate of the $n$-th clustering cluster;
$\bar{g}$ represents the average gray value of all clustering clusters;
$\bar{c}$ represents the center coordinate of all clustering clusters;
$N$ represents the number of all clustering clusters.
8. The method as claimed in claim 1, wherein the step of obtaining a third evaluation index based on the difference between the gradient direction of each pixel point in the reference cluster and the expected gradient direction comprises:
and obtaining the average value of the difference between the gray gradient direction and the expected gradient direction of all the pixel points in the reference cluster as a third evaluation index.
9. The method as claimed in claim 1, wherein the step of obtaining a fourth evaluation index based on the difference between the corresponding image features of the normal group and the abnormal group comprises:
acquiring a gray level co-occurrence matrix corresponding to the normal group and a gray level co-occurrence matrix corresponding to the abnormal group, and calculating a description operator corresponding to each gray level co-occurrence matrix, wherein the description operator is contrast and energy; and obtaining a fourth evaluation index based on the difference between the descriptor corresponding to the normal group and the descriptor corresponding to the abnormal group.
10. The method as claimed in claim 1, wherein the evaluation index is negatively correlated to the first evaluation index, positively correlated to the second evaluation index, negatively correlated to the third evaluation index, and positively correlated to the fourth evaluation index.
CN202211270650.9A 2022-10-18 2022-10-18 Synthetic adhesive non-woven fabric stain detection method based on optical information Pending CN115690011A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202211270650.9A CN115690011A (en) 2022-10-18 2022-10-18 Synthetic adhesive non-woven fabric stain detection method based on optical information

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202211270650.9A CN115690011A (en) 2022-10-18 2022-10-18 Synthetic adhesive non-woven fabric stain detection method based on optical information

Publications (1)

Publication Number Publication Date
CN115690011A true CN115690011A (en) 2023-02-03

Family

ID=85066805

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202211270650.9A Pending CN115690011A (en) 2022-10-18 2022-10-18 Synthetic adhesive non-woven fabric stain detection method based on optical information

Country Status (1)

Country Link
CN (1) CN115690011A (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN116758071A (en) * 2023-08-17 2023-09-15 青岛冠宝林活性炭有限公司 Intelligent detection method for carbon electrode dirt under visual assistance
CN116758071B (en) * 2023-08-17 2023-11-03 青岛冠宝林活性炭有限公司 Intelligent detection method for carbon electrode dirt under visual assistance

Similar Documents

Publication Publication Date Title
CN115311292B (en) Strip steel surface defect detection method and system based on image processing
US11341648B2 (en) Colony contrast gathering
CN114419025A (en) Fiberboard quality evaluation method based on image processing
CN114723704B (en) Textile quality evaluation method based on image processing
CN115330800A (en) Automatic segmentation method for radiotherapy target area based on image processing
CN116977358B (en) Visual auxiliary detection method for corrugated paper production quality
CN117392469B (en) Perovskite battery surface coating detection method and system based on machine vision
CN115249246A (en) Optical glass surface defect detection method
CN115187602A (en) Injection molding part defect detection method and system based on image processing
CN110910367A (en) Bioreactor cell culture quality evaluation method
CN115239736B (en) Method for monitoring quality of mixed material of abrasive layer for production of diamond-impregnated wheel
CN115690011A (en) Synthetic adhesive non-woven fabric stain detection method based on optical information
CN116152242B (en) Visual detection system of natural leather defect for basketball
CN117557820B (en) Quantum dot optical film damage detection method and system based on machine vision
CN115965607A (en) Intelligent traditional Chinese medicine tongue diagnosis auxiliary analysis system
CN117274293B (en) Accurate bacterial colony dividing method based on image features
CN115841491B (en) Quality detection method for porous metal material
CN110428437B (en) GGO segmentation method based on edge-sensitive SLIC and quadratic density clustering
CN113853607A (en) System and method for monitoring bacterial growth and predicting colony biomass of colonies
CN116468689A (en) Flaw identification method based on gray scale characteristics
CN114693646B (en) Corneal endothelial cell active factor analysis method based on deep learning
CN117611583B (en) Artificial intelligence-based aluminum composite panel defect detection method and system
CN117974633B (en) Intelligent tomato pest detection method based on image processing
CN117593303B (en) Defect detection method and system for quantum dot optical film
CN117314899B (en) Carbon fiber plate quality detection method based on image characteristics

Legal Events

Date Code Title Description
PB01: Publication
SE01: Entry into force of request for substantive examination