CN109961437B - Method for detecting significant fabric defects based on machine teaching mode - Google Patents


Info

Publication number
CN109961437B
Authority
CN
China
Legal status: Active
Application number
CN201910270956.6A
Other languages
Chinese (zh)
Other versions
CN109961437A
Inventor
李岳阳
杜帅
罗海驰
樊启高
朱一昕
佘雪
李美佳
Current Assignee
Jiangnan University
Original Assignee
Jiangnan University
Priority date
Filing date
Publication date
Application filed by Jiangnan University
Priority to CN201910270956.6A
Publication of CN109961437A
Application granted
Publication of CN109961437B
Legal status: Active


Classifications

    • G: PHYSICS
    • G01: MEASURING; TESTING
    • G01N: INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
    • G01N 21/00: Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
    • G01N 21/84: Systems specially adapted for particular applications
    • G01N 21/88: Investigating the presence of flaws or contamination
    • G01N 21/8851: Scan or image signal processing specially adapted therefor, e.g. for scan signal adjustment, for detecting different kinds of defects, for compensating for structures, markings, edges
    • G01N 2021/8887: based on image processing techniques
    • G01N 21/95: Investigating the presence of flaws or contamination characterised by the material or shape of the object to be examined
    • G01N 21/956: Inspecting patterns on the surface of objects
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/0002: Inspection of images, e.g. flaw detection
    • G06T 7/0004: Industrial image inspection
    • G06T 7/10: Segmentation; Edge detection
    • G06T 7/11: Region-based segmentation
    • G06T 7/136: Segmentation; Edge detection involving thresholding
    • G06T 2207/00: Indexing scheme for image analysis or image enhancement
    • G06T 2207/20: Special algorithmic details
    • G06T 2207/20081: Training; Learning


Abstract

The invention discloses a method for detecting salient fabric defects based on a machine-teaching mode, belonging to the technical field of textiles. By introducing a teaching model and evaluating the propagation difficulty of superpixel sets through fuzzy connectivity, the order of saliency propagation is progressively controlled and optimized, overcoming the weaknesses of saliency propagation in existing defect-detection methods. The method rests on the premise that different pixels have different propagation difficulties, draws on established results of cognitive science, and offers reference value for improving the stability and accuracy of saliency-based defect-detection algorithms.

Description

Method for detecting significant fabric defects based on machine teaching mode
Technical Field
The invention relates to a method for detecting significant fabric defects based on a machine teaching mode, and belongs to the technical field of textiles.
Background
During fabric production, defects are inevitable. With the traditional method of manual inspection, the result is strongly affected by human subjectivity, the miss rate is high, and labor costs are high. With the development of computer technology, automatic fabric-defect detection by machine vision saves labor, offers high stability and production efficiency, and is gradually being developed and applied.
At present, fabric-defect detection methods based on generic visual saliency are effective on common warp-knitted plain-weave fabrics, but on jacquard-type complex fabrics they are insufficiently stable, detect poorly, and are difficult to use in actual production.
It is therefore necessary to develop a method that can effectively detect defects on complex fabrics and effectively improve the accuracy of the detection algorithm.
Disclosure of Invention
To address the instability and low detection rate of generic saliency detection, the invention provides a salient-fabric-defect detection method based on a machine-teaching mode, comprising the following steps:
preprocessing the image with the Harris corner detection method to obtain an image containing a convex hull;
clustering related pixels with the superpixel segmentation method of linear iterative clustering, and achieving region segmentation of the image through iterative updating;
constructing a rough saliency map by fusing the image background, the edges, and the convex hull;
introducing the idea of fuzzy connectivity into the teaching model to progressively refine the rough saliency image: evaluating the propagation difficulty of the superpixels with fuzzy connectivity and ranking them; assigning a certain number of simple superpixels for learning in the teaching-and-learning mode, adjusting the task load according to the learning feedback, and completing saliency propagation through continued iterative optimization until the set of unmarked superpixels is empty;
classifying the pixels of the refined saliency image by threshold segmentation to obtain the final result.
In particular, a first object of the present application is to provide a method for detecting salient fabric defects, the method comprising:
S1: preprocessing an image to be detected to construct a rough saliency image;
S2: progressively refining the rough saliency image with a teaching model to obtain a refined saliency image;
S3: segmenting the refined saliency image by threshold segmentation to obtain the defect-detection result.
Optionally, step S1, preprocessing an image to be detected to construct a rough saliency image, comprises:
acquiring a defect fabric image by corner detection, the defect fabric image being an image containing a convex hull;
dividing the image to be detected into different superpixel sets by a superpixel segmentation method, each superpixel set being a cluster of associated pixels.
Optionally, step S2, progressively refining the rough saliency image with a teaching model to obtain a refined saliency image, comprises: evaluating the propagation difficulty of each superpixel through fuzzy connectivity, and ranking the superpixels by propagation difficulty;
performing easy-to-difficult saliency propagation on the ranked superpixels with the teaching model.
Optionally, evaluating the propagation difficulty of each superpixel through fuzzy connectivity comprises:
constructing a fuzzy space, and describing the proximity relations among the superpixels in the constructed fuzzy space;
evaluating the uniformity component and the feature component of the object of interest in the image to be detected through membership functions; defining the fuzzy relation among the superpixels in the fuzzy space by the fuzzy connectivity, and ranking the superpixels by propagation difficulty.
Optionally, the uniformity component μ_ψ(c, d) is:

μ_ψ(c, d) = e^( −|f(c) − f(d)|² / (2σ_ψ²) )

where σ_ψ is the standard deviation;

the feature component μ_φ(c, d) is:

μ_φ(c, d) = w_o(c, d) if w_o(c, d) ≥ w_b(c, d), and 1 − w_b(c, d) otherwise

where

w_o(c, d) = min[ W_o(f(c)), W_o(f(d)) ]
w_b(c, d) = max[ W_b(f(c)), W_b(f(d)) ]

and W_o and W_b are, respectively, the target and background membership functions of pixel intensity. The target membership function is:

W_o(x) = e^( −(x − m_o)² / (2k_o²) )

where m_o and k_o are the mean and standard deviation of the object of interest in the image to be detected; the background membership function is

W_b(x) = e^( −(x − m_b)² / (2k_b²) )

where m_b and k_b are the mean and standard deviation of the background in the image to be detected.
Optionally, performing easy-to-difficult saliency propagation on the ranked superpixels with the teaching model comprises:
assigning a number of the simpler, higher-ranked superpixels for learning first, and adjusting the task load according to the learning feedback;
completing saliency propagation through continued iterative optimization until the set of unmarked superpixels is empty.
Optionally, acquiring the defect fabric image by corner detection means acquiring it by the Harris corner detection method, comprising:
building a mathematical model, moving a local window over the image to be detected, defining a corner response function R, and locating corners by judging the magnitude of R;
connecting the corners to form a convex hull surrounding the defect target, the defect fabric image being the image containing the convex hull.
Optionally, the superpixel segmentation method is a linear iterative clustering algorithm, comprising:
initializing seed points;
reselecting each seed point within its Q′ × Q′ neighborhood: computing the gradient of every pixel in the Q′ × Q′ neighborhood, and moving the seed point to the position of minimum gradient in that neighborhood;
assigning a class label to every pixel in the neighborhood around each seed point;
computing the distance between each pixel and the seed points in its neighborhood;
computing the color distance and the spatial distance between the k-th cluster center and the i-th pixel with the inter-pixel distance formulas, and computing the normalized distance measure;
iterating the optimization until the error converges to 0.
Optionally, step S3, segmenting the refined saliency image by threshold segmentation to obtain the defect-detection result, comprises:
classifying the pixels of the refined saliency image;
letting F(x, y) be the gray value of the refined saliency image, B(x, y) the image after threshold segmentation, and T the threshold; then

B(x, y) = 1 if F(x, y) > T, and 0 otherwise

where (x, y) are the position coordinates of a pixel in the saliency image.
A second object of the present application is to provide the use of the above method for detecting fabric defects in the field of textile technology.
The beneficial effects of the invention are:
In the salient-fabric-defect detection method, by introducing the teaching model and evaluating the propagation difficulty of superpixel sets through fuzzy connectivity, the order of saliency propagation is progressively controlled and optimized, overcoming the weaknesses of saliency propagation in existing defect-detection methods; the method rests on the premise that different pixels have different propagation difficulties, draws on established results of cognitive science, and can improve the stability and accuracy of saliency-based defect-detection algorithms.
Drawings
In order to more clearly illustrate the technical solutions in the embodiments of the present invention, the drawings needed to be used in the description of the embodiments will be briefly introduced below, and it is obvious that the drawings in the following description are only some embodiments of the present invention, and it is obvious for those skilled in the art to obtain other drawings based on these drawings without creative efforts.
Figure 1 is a schematic flow diagram of a significant fabric defect detection method provided by the present invention.
Fig. 2 is a schematic diagram of a learning and teaching mode.
Fig. 3 is a diagram of the functional form of the uniformity component μ_ψ.
Fig. 4 is a diagram of the functional form of W_o, associated with the feature component.
Fig. 5 is the image to be detected.
Fig. 6 is the refined saliency image.
Fig. 7 shows the detection result after threshold segmentation.
Fig. 8 is the detection result based on the optimal Gabor filter algorithm.
Detailed Description
In order to make the objects, technical solutions and advantages of the present invention more apparent, embodiments of the present invention will be described in detail with reference to the accompanying drawings.
The first embodiment is as follows:
the present embodiment provides a method for detecting a significant fabric defect, referring to fig. 1, the method comprising:
(1) constructing a rough saliency image;
specifically, after an image to be detected is acquired, a rough saliency image of the image to be detected needs to be constructed first, and the method includes:
(1.1) acquiring the defect fabric image; in this step, the Harris corner detection method can be used to preprocess the image and obtain an image containing a convex hull;
(1.2) clustering related pixels with the superpixel segmentation method of linear iterative clustering, and achieving region segmentation of the image to be detected through iterative updating to obtain a number of superpixel sets;
(1.3) constructing the saliency image: constructing a rough saliency image by fusing the background, the edges, and the convex hull of the image to be detected;
(2) refining a rough saliency image by using a teaching mode and performing saliency propagation;
specifically, after an image to be detected is constructed to obtain a rough saliency image, a teaching mode is adopted to refine and propagate the saliency image, and the method comprises the following steps:
(2.1) in the teaching-and-learning mode, ranking the superpixel sets by evaluating the difficulty of their saliency propagation;
(2.2) in the teaching-and-learning mode, the teacher assigns simple tasks for the learner to learn, and decides the next task load according to the learning feedback; the rough saliency image is thus refined effectively through the interaction of teacher and learner;
(2.3) iterating and optimizing continually until the set of unmarked superpixels is empty, at which point saliency propagation is complete;
(3) binarizing the refined saliency image to obtain the detection result;
the refined saliency image is processed by threshold segmentation, so that the salient defects in the fabric can be accurately located.
In the above technical solution, preprocessing the defect image with the Harris corner detection method in step (1.1) comprises:
(1.1.1) building a mathematical model: a local window is moved over the image to be detected, and whether the gray level changes strongly is judged in order to locate corners. The mathematical model is expressed as:

E(u, v) = Σ_(x,y) w(x, y) [ I(x + u, y + v) − I(x, y) ]²

where I(x, y) is the gray value of the image to be detected, I(x + u, y + v) is the gray value after a translation by [u, v], w(x, y) is the window function, and E(u, v) is the gray change produced by translating the window by [u, v];

a first-order Taylor expansion simplifies the above formula to:

E(u, v) ≈ Σ_(x,y) w(x, y) [ I_x u + I_y v ]²

expanding [ I_x u + I_y v ]² gives

E(u, v) ≈ [u, v] A [u, v]^T

where the matrix A is

A = Σ_(x,y) w(x, y) [ I_x², I_x I_y ; I_x I_y, I_y² ]
A corner response function R is defined, and each pixel is judged to be a corner or not by the magnitude of R:

R = det A − k (trace A)²
det A = λ₁ λ₂
trace A = λ₁ + λ₂

where λ₁ and λ₂ are the eigenvalues of the matrix A, det A is the product of the eigenvalues, and trace A their sum. R is determined by the eigenvalues of A: in a corner region |R| is large and exceeds a certain threshold, in a flat region |R| is small, and on an edge R is negative; the threshold is a parameter that can be tuned continually according to the detection results.
(1.1.2) once the corners are determined, the peripheral corner points are connected to form a convex hull enclosing the defect target.
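For illustration only (not the patented implementation), the corner response R = det A − k(trace A)² can be sketched with plain NumPy; the box window, the synthetic test image, and k = 0.04 are assumptions of this sketch:

```python
import numpy as np

def harris_response(img, k=0.04, r=1):
    """Corner response R = det(A) - k*trace(A)^2 per pixel.

    A is the structure tensor, summed over a (2r+1)x(2r+1) box window.
    """
    ix, iy = np.gradient(img.astype(float))  # central-difference gradients

    def window_sum(a):
        # sum over the window by shifting; wraps at borders, fine for interior pixels
        out = np.zeros_like(a)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                out += np.roll(np.roll(a, dy, axis=0), dx, axis=1)
        return out

    sxx = window_sum(ix * ix)
    syy = window_sum(iy * iy)
    sxy = window_sum(ix * iy)
    det_a = sxx * syy - sxy ** 2
    trace_a = sxx + syy
    return det_a - k * trace_a ** 2

# white square on black: corner -> R > 0, edge -> R < 0, flat -> R ~ 0
img = np.zeros((20, 20))
img[5:16, 5:16] = 1.0
R = harris_response(img)
```

The sign pattern of R (positive at corners, negative on edges, near zero on flat regions) is what the thresholding in (1.1.1) exploits.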
In the above technical solution, the linear iterative clustering algorithm of step (1.2) comprises:
(1.2.1) initializing seed points: seed points are distributed uniformly over the image to be detected according to the set number of superpixels; the seed points are the initial pixels of the regions.
If the image has Q pixels and is pre-divided into r superpixel sets of equal size, each superpixel set has size Q/r, and the spacing between seed points is approximately P = √(Q/r).
(1.2.2) reselecting each seed point within its Q′ × Q′ neighborhood: the gradient of every pixel in the neighborhood is computed, and the seed point is moved to the position of minimum gradient in the neighborhood.
(1.2.3) assigning a class label to every pixel in the neighborhood around each seed point.
(1.2.4) for each searched pixel, computing its distance D′ to the seed point.
The distance D′ is computed as follows:

d_c = √[ (l_j − l_i)² + (a_j − a_i)² + (b_j − b_i)² ]

d_s = √[ (x_j − x_i)² + (y_j − y_i)² ]

where l, a, b are the components of the Lab color space and x, y the position of the pixel; d_c is the color distance and d_s the spatial distance; N_s, the maximum spatial distance within a class, is defined as N_s = P. The maximum color distance N_c differs from cluster to cluster and from picture to picture, so a fixed constant h is taken, i.e. N_c = h. The final distance measure is then:

D′ = √[ (d_c / N_c)² + (d_s / N_s)² ]
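As a minimal sketch of the normalized distance measure above (the Lab values, coordinates, and the normalizers N_c = 10, N_s = 10 are hypothetical):

```python
import math

def slic_distance(lab_i, xy_i, lab_j, xy_j, Nc, Ns):
    """Normalized SLIC-style distance D' between cluster center i and pixel j.

    Nc: fixed color normalizer (the constant h); Ns: seed spacing P.
    """
    # color distance d_c in Lab space
    dc = math.sqrt(sum((a - b) ** 2 for a, b in zip(lab_i, lab_j)))
    # spatial distance d_s in the image plane
    ds = math.sqrt((xy_i[0] - xy_j[0]) ** 2 + (xy_i[1] - xy_j[1]) ** 2)
    return math.sqrt((dc / Nc) ** 2 + (ds / Ns) ** 2)

# identical color, spatial offset (3, 4): dc = 0, ds = 5, so D' = 5 / Ns
d_spatial = slic_distance((50, 10, 10), (0, 0), (50, 10, 10), (3, 4), Nc=10, Ns=10)
# identical position, color offset (3, 4, 0): dc = 5, so D' = 5 / Nc
d_color = slic_distance((50, 10, 10), (0, 0), (53, 14, 10), (0, 0), Nc=10, Ns=10)
```

Dividing by N_c and N_s makes the color and spatial terms commensurable, so one constant h trades off compactness against color fidelity.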
(1.2.5) iterative optimization: steps (1.2.1) to (1.2.4) are repeated, iterating continually until the error converges to 0.
(1.2.6) enhancing connectivity: the iteration of (1.2.5) may leave some superpixels too small, or cut a single region into several discontinuous superpixels, so connectivity must be enhanced to correct this.
(1.2.7) building an undirected graph G = <V, ε>, where V corresponds to the superpixel nodes and ε to the edges between similar nodes; if s_i and s_j are adjacent or share a boundary, they are connected, and the similarity is defined by a Gaussian function:

ω_ij = exp( −||s_i − s_j||² / (2σ²) )

where σ is the kernel width and s_i is the feature vector of the i-th superpixel. The adjacency matrix W associated with G is defined by: W_ij = ω_ij if i ≠ j, otherwise W_ij = 0; the diagonal matrix D then has diagonal entries D_ii = Σ_j W_ij.
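A small illustrative sketch of the graph construction (the feature vectors and σ = 0.5 are hypothetical, and full connectivity is assumed for brevity, whereas the method connects only adjacent or shared-boundary superpixels):

```python
import numpy as np

def build_graph(features, sigma=0.5):
    """Gaussian-similarity adjacency W (zero diagonal) and degree matrix D."""
    n = len(features)
    W = np.zeros((n, n))
    for i in range(n):
        for j in range(n):
            if i != j:
                diff = features[i] - features[j]
                W[i, j] = np.exp(-np.dot(diff, diff) / (2 * sigma ** 2))
    D = np.diag(W.sum(axis=1))  # D_ii = sum_j W_ij
    return W, D

feats = np.array([[0.0, 0.0], [0.0, 0.1], [1.0, 1.0]])
W, D = build_graph(feats)
```

Nearby feature vectors get similarity near 1 and distant ones near 0, which is what later lets saliency flow preferentially between similar superpixels.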
In the above technical solution, constructing the rough saliency image in step (1.3) comprises fusing the saliency image of the background superpixel set outside the convex hull, the saliency image of the edge superpixel sets, and the saliency image of the superpixel set inside the convex hull.
All superpixels outside the convex hull are taken as background, and the result of saliency propagation is expressed as an N-dimensional vector f* = [f₁*, f₂*, …, f_N*]^T, where f_i* (i = 1, …, N) is the saliency value corresponding to superpixel s_i. After f* is normalized, the value of the i-th superpixel in this saliency map is

S_N(i) = 1 − f̄_i*

Similarly, saliency propagation is performed from the superpixels of the four boundaries, and the resulting saliency image is denoted S_B. For the i-th superpixel, S_M = 1 if it lies inside the convex hull and S_M = 0 otherwise. Fusing the three parts yields the rough saliency image S_C:

S_C(i) = S_N(i) × S_B(i) × S_M(i)
In the above technical solution, in step (2), the teaching model is used to refine the rough saliency image of step (1.3); the superpixel set of the foreground region serves as seed points, and only a small part of the foreground region is selected as marked, to avoid mistakenly treating background pixels as foreground:

{ s_i | S_C(i) ≥ η · max_(1≤j≤N) S_C(j) }

where η is a constant. With the labels set, the teaching-and-learning mode begins, as shown in fig. 2.
If l seed points s₁, s₂, …, s_l in the image are marked, with corresponding saliency values f₁ = f₂ = … = f_l = 1, then there are z = N − l unmarked superpixel values.
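The seed-selection rule above can be sketched in a few lines (the saliency values and η = 0.8 are hypothetical):

```python
def select_seeds(sc, eta=0.8):
    """Indices i with S_C(i) >= eta * max_j S_C(j)."""
    thresh = eta * max(sc)
    return [i for i, v in enumerate(sc) if v >= thresh]

# threshold = 0.8 * 0.95 = 0.76, so only the two strongest superpixels are marked
seeds = select_seeds([0.9, 0.2, 0.95, 0.5], eta=0.8)
```

Keeping η high keeps the marked set small and confidently foreground, which is the stated goal of the selection.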
In the above technical solution, the core of the teaching-and-learning mode of step (2.1) is that the teacher finds simple unmarked pixels for the students to learn; that is, the teacher must evaluate the difficulty of the unmarked seeds. During the t-th propagation, a candidate pixel set C^(t) is built, whose elements are the unmarked superpixels adjacent to the marked set L^(t) in the image; the teacher can select the simplest superpixels from C^(t) to form the t-th course.
To measure the ease of propagation of s_i ∈ C^(t), the concept of fuzzy connectivity FC_i is introduced, by which superpixels are distinguished as simple or difficult. Denoting the difficulty score of an unmarked pixel by DS_i, we have:

DS_i = FC_i

Fuzzy connectivity FC_i most intuitively judges the difficulty of propagation across the boundary between the unmarked point set and the marked point set.
Fuzzy connectivity enables precise fusion of superpixel sets and measurement of the relations between image elements, so it can effectively evaluate the difficulty of propagation from unmarked pixels to the marked point set.
In the above technical solution, evaluating the difficulty index of a superpixel in step (2.1) comprises:
(2.1.1) fuzzy adjacency and fuzzy space: let the n-dimensional Euclidean space R^n be divided into hypercubes by n mutually orthogonal families of hyperplanes. The center of each hypercube carries integer coordinates; when n = 2 a hypercube is called a pixel, and the coordinates of these pixels can be represented by n-tuples of integers, i.e. as points of Z^n; Z^n is then the set of all pixels of R^n. On Z^n a fuzzy relation α is defined that describes the positional proximity of two pixels. Its membership function μ_α(c, d) is generally required to be non-increasing in the distance between c and d; a common two-dimensional definition is:

μ_α(c, d) = 1 if ||c − d|| ≤ 1, and 0 otherwise

and (Z^n, α) is called the fuzzy digital space.
(2.1.2) similarity over the fuzzy space:
Let X = (C, f) be a scene over (Z^n, α), where C is the pixel set of the image and f is a function whose domain is C and whose range is the set of integers [L, H], L and H being the minimum and maximum pixel gray values.
If the range of f is [0, 1], X is called a membership scene over (Z^n, α).
The fuzzy relation κ on C is defined as the similarity of spatial elements of the fuzzy space on X. In general the membership function μ_κ(c, d) of κ is related to c and d themselves and to their values f(c), f(d), and can be defined as:

μ_κ(c, d) = μ_α(c, d) · g( μ_ψ(c, d), μ_φ(c, d) )
where μ_ψ and μ_φ are, respectively, the uniformity component and the feature component of the object of interest in the image, and ψ and φ are fuzzy relations on C. On the functional forms of g, μ_ψ, and μ_φ:
the range of g is [0, 1], and μ_κ is monotonically non-decreasing as μ_ψ or μ_φ increases. Some examples of g satisfying this condition are:

g₁(a, b) = (a + b) / 2

g₂(a, b) = √(a · b)
It is assumed that the homogeneity between two pixels c and d of the scene X is represented by |f(c) − f(d)|. The uniformity component μ_ψ(c, d) can then be expressed as a function of |f(c) − f(d)|. The function should satisfy: its range is [0, 1], and it is monotonically non-increasing in |f(c) − f(d)|. In fuzzy space many membership functions meet these requirements; a typical choice is

μ_ψ(c, d) = e^( −|f(c) − f(d)|² / (2σ_ψ²) )

see fig. 3.
The feature component describes the similarity of the intensities of the two pixels c and d to the local feature intensity m_o: the closer the two intensities are to each other and to m_o, the larger the component value. When pixels c and d have high target membership (i.e. large W_o values) and low background membership (i.e. small W_b values), c and d have a high local fuzzy-affinity value based on the target feature.
The following scheme can be chosen to satisfy these requirements:

μ_φ(c, d) = w_o(c, d) if w_o(c, d) ≥ w_b(c, d), and 1 − w_b(c, d) otherwise

where

w_o(c, d) = min[ W_o(f(c)), W_o(f(d)) ]
w_b(c, d) = max[ W_b(f(c)), W_b(f(d)) ]

and W_o and W_b are, respectively, the target and background membership functions of pixel intensity; W_o can take the form shown in fig. 4, W_b can be defined similarly, and the subscripts o and b denote parameters of the target and of the background.
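A sketch of the two affinity components under the Gaussian membership forms described above (the means, standard deviations, and the case form of μ_φ are assumptions of this sketch, not claimed values):

```python
import math

def mu_psi(fc, fd, sigma_psi):
    """Uniformity component: Gaussian of the intensity difference."""
    return math.exp(-((fc - fd) ** 2) / (2 * sigma_psi ** 2))

def gaussian_membership(x, m, k):
    """Gaussian membership with mean m and standard deviation k (W_o or W_b)."""
    return math.exp(-((x - m) ** 2) / (2 * k ** 2))

def mu_phi(fc, fd, m_o, k_o, m_b, k_b):
    """Feature component built from target/background memberships."""
    w_o = min(gaussian_membership(fc, m_o, k_o), gaussian_membership(fd, m_o, k_o))
    w_b = max(gaussian_membership(fc, m_b, k_b), gaussian_membership(fd, m_b, k_b))
    return w_o if w_o >= w_b else 1.0 - w_b

# with target mean 0.8 and background mean 0.2:
phi_target = mu_phi(0.8, 0.8, 0.8, 0.1, 0.2, 0.1)      # both pixels look like target
phi_background = mu_phi(0.2, 0.2, 0.8, 0.1, 0.2, 0.1)  # both pixels look like background
```

Two target-like intensities give a feature component near 1, two background-like ones near 0, matching the qualitative description above.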
(2.1.3) fuzzy connectivity FC_i: if κ is the fuzzy similarity relation on the scene X, a path from c to d in X is defined as a sequence of spatial elements <c⁽¹⁾, c⁽²⁾, …, c⁽ᵐ⁾>, where c⁽ⁱ⁾ ∈ X (i = 1, …, m), c⁽¹⁾ = c, and c⁽ᵐ⁾ = d. There are many paths from c to d in X; every path ρ_cd has a weakest link, i.e. the minimum similarity between two adjacent elements on ρ_cd, which determines the strength of ρ_cd, written:

μ_ρ(ρ_cd) = min( μ_κ(c⁽¹⁾, c⁽²⁾), μ_κ(c⁽²⁾, c⁽³⁾), …, μ_κ(c⁽ᵐ⁻¹⁾, c⁽ᵐ⁾) )

The fuzzy connectivity between c and d is defined as the maximum strength over all paths. If P_cd is the set of all paths from c to d, the fuzzy connectivity is the fuzzy relation on X with membership function:

μ_Κ(c, d) = max_(ρ_cd ∈ P_cd) μ_ρ(ρ_cd)

The path at which μ_ρ(ρ_cd) attains this maximum is called the strongest path from c to d, and this value can be used as the fuzzy connectivity FC_i.
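The max-min definition above can be computed by simple relaxation; this is an illustrative sketch on a hypothetical 4-node affinity matrix, not the patent's implementation:

```python
def fuzzy_connectivity(affinity, seeds):
    """Max-min connectivity: for each node, the strongest path from any seed,
    where a path's strength is its weakest affinity link.

    affinity: symmetric n x n list of lists (0 = no edge); seeds get FC = 1.
    """
    n = len(affinity)
    fc = [0.0] * n
    for s in seeds:
        fc[s] = 1.0
    changed = True
    while changed:  # relax until no node improves; values only rise, from a finite set
        changed = False
        for i in range(n):
            for j in range(n):
                if affinity[i][j] > 0:
                    cand = min(fc[i], affinity[i][j])
                    if cand > fc[j] + 1e-12:
                        fc[j] = cand
                        changed = True
    return fc

# chain 0-1-2-3 plus a weak direct edge 0-3
aff = [
    [0.0, 0.9, 0.0, 0.4],
    [0.9, 0.0, 0.5, 0.0],
    [0.0, 0.5, 0.0, 0.8],
    [0.4, 0.0, 0.8, 0.0],
]
fc = fuzzy_connectivity(aff, seeds=[0])
```

Node 3 reaches the seed with strength 0.5 via the chain (weakest link 0.5), which beats the direct 0.4 edge, exactly the "strongest path, weakest link" rule.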
In the above technical solution, the heart of the teaching-and-learning mode of step (2.2) is teacher-student interaction: the student's learning state is fed back to the teacher, and the teacher adjusts the task or continues teaching accordingly. In the teaching-and-learning mode the superpixels are ordered by their evaluated difficulty scores so that DS₁ < DS₂ < … < DS_τ. If the teacher gives q superpixels to learn in the t-th round, this is written q^(t), with q ≤ τ, and the t-th learning set is U^(t) = {s₁, s₂, …, s_q}.
In the teaching-and-learning mode, the teacher decides the next learning task according to the student's feedback from round t−1. The saliency values learned in round t−1 indicate confidence: if they are close to 0 or 1, round t−1 was learned confidently and the student judges well, so the teacher assigns a heavier task in round t; if the saliency values of round t−1 are close to 0.5, the learning was not ideal and must be readjusted. The confidence on the interval [0, 1] is expressed as:

ConScore = (1 / q^(t−1)) Σ_(s_i ∈ U^(t−1)) | 2 f_i^(t−1) − 1 |

Then q^(t) can be computed as:

q^(t) = ⌈ |C^(t)| · ConScore ⌉
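As an illustrative sketch of the feedback rule (the exact ConScore form here, mean distance of the learned values from 0.5 scaled to [0, 1], is an assumption reconstructed from the description, and the example values are hypothetical):

```python
import math

def con_score(f_prev):
    """Confidence of the previous round: high when learned saliency
    values sit near 0 or 1, low when they sit near 0.5."""
    return sum(abs(2 * f - 1) for f in f_prev) / len(f_prev)

def next_task_size(candidates, f_prev):
    """q(t) = ceil(|C(t)| * ConScore)."""
    return math.ceil(len(candidates) * con_score(f_prev))

# confident values 0.9 and 0.1 plus one uncertain 0.5, with 10 candidates
q = next_task_size(candidates=list(range(10)), f_prev=[0.9, 0.1, 0.5])
```

A fully confident round (all values at 0 or 1) would assign the whole candidate set; a fully uncertain one (all at 0.5) would assign nothing and force readjustment.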
in the above technical solution, the significance is propagated in step (2.3), and after the learning course is specified, the student should change the significance value from L(t)Is propagated to C(t)The specific expression is as follows:
f(t+1) = M(t) W D−1 f(t)
wherein M(t) is a diagonal matrix, W is the adjacency matrix associated with G, D−1 is the inverse of the degree (diagonal) matrix, and f(t) is the saliency value at iteration t. If si ∈ L(t) ∪ C(t), the corresponding diagonal entry of M(t) is 1, otherwise it is 0. After the t-th propagation, the marked and unmarked superpixel sets are updated as L(t+1) = L(t) ∪ C(t) and S(t+1) = S(t) − C(t), and the iteration continues until S is the empty set.
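One propagation step f(t+1) = M(t) W D−1 f(t) can be sketched with numpy on a toy graph; the graph, its saliency values, and the variable names are illustrative, not taken from the patent.

```python
import numpy as np

# Toy graph of 4 superpixels. W is the adjacency matrix of G, D the degree
# matrix, and M selects the superpixels in L ∪ C (diagonal entry 1) while
# masking the rest (entry 0). All values are illustrative.
W = np.array([[0, 1, 1, 0],
              [1, 0, 1, 0],
              [1, 1, 0, 1],
              [0, 0, 1, 0]], dtype=float)
D_inv = np.diag(1.0 / W.sum(axis=1))   # inverse of the degree matrix
M = np.diag([1.0, 1.0, 1.0, 0.0])      # s4 is not yet in L ∪ C

f = np.array([1.0, 0.0, 0.5, 0.5])     # saliency values at iteration t
f_next = M @ W @ D_inv @ f             # f(t+1) = M W D^{-1} f(t)
```

Each superpixel receives the degree-normalized saliency of its neighbors, and the mask M zeroes out superpixels not yet admitted to the curriculum (here the last one).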
Fig. 5 shows the fabric image to be tested, and Fig. 6 shows the corresponding refined saliency image. It can be seen that the fabric defect detection method can accurately detect the positions and sizes of the defects on the fabric.
In the binarization process of the above technical solution (3), the refined saliency image is subjected to pixel classification by threshold segmentation; Fig. 7 shows the resulting binarized image.
Let F(x, y) denote the gray value of the refined saliency image; the result can be expressed as:

B(x, y) = 1, if F(x, y) > T; B(x, y) = 0, if F(x, y) ≤ T
where B (x, y) represents the image after threshold segmentation and T is the threshold.
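A minimal numpy sketch of the threshold segmentation above; the function name `binarize` and the 0/1 output convention are assumptions.

```python
import numpy as np

def binarize(F, T):
    """Threshold segmentation: B(x, y) = 1 where F(x, y) > T, else 0."""
    return (F > T).astype(np.uint8)

# Toy refined saliency image: values above the threshold mark defect pixels.
saliency = np.array([[0.1, 0.8],
                     [0.6, 0.2]])
mask = binarize(saliency, T=0.5)   # defect pixels -> 1, background -> 0
```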
Fig. 7 shows the image processed by the technical solution of the present invention, and Fig. 8 shows the detection result of an optimal-Gabor-filter algorithm.
Comparing Fig. 7 with Fig. 8 shows that the machine-teaching-based saliency fabric defect detection method provided by the invention accurately identifies both the positions and the precise shape contours of the defects, and is superior to the detection result of Fig. 8.
The invention provides an improved method for detecting salient fabric defects: by introducing a teaching model and evaluating the propagation difficulty of the superpixel set through fuzzy connectivity, the order of saliency propagation is controlled and optimized step by step, overcoming the defects and shortcomings of saliency propagation in existing defect detection methods. The method is built on the premise that pixels have different propagation difficulties, combined with practical results of cognitive science, and has reference value for improving the stability and accuracy of saliency-based defect detection algorithms.
Some steps in the embodiments of the present invention may be implemented by software, and the corresponding software program may be stored in a readable storage medium, such as an optical disc or a hard disk.
The above description is only for the purpose of illustrating the preferred embodiments of the present invention and is not to be construed as limiting the invention, and any modifications, equivalents, improvements and the like that fall within the spirit and principle of the present invention are intended to be included therein.

Claims (7)

1. A method of detecting a significant fabric defect, the method comprising:
S1: preprocessing an image to be detected to construct a rough saliency image;
S2: gradually refining the rough saliency image by using a teaching model to obtain a refined saliency image;
S3: segmenting the refined saliency image by threshold segmentation to obtain a defect detection result;
the S2: gradually refining the rough saliency image by using a teaching model to obtain a refined saliency image comprises: evaluating the propagation difficulty of each superpixel through fuzzy connectivity, and sorting the superpixels according to the propagation difficulty;
performing easy-to-difficult saliency propagation on the sorted superpixels by using the teaching model;
the evaluating the propagation difficulty of each superpixel through fuzzy connectivity comprises:
constructing a fuzzy space, and describing the proximity relation among the super pixels in the constructed fuzzy space;
evaluating, through membership functions, the uniformity component μψ(c, d) and the feature component μφ(c, d) of the object of interest in the image to be detected, wherein c and d represent two pixels;
and defining the fuzzy relation among the superpixels in the fuzzy space by the fuzzy connectivity, and ranking the propagation difficulty among the superpixels.
2. The method according to claim 1, wherein the S1: preprocessing an image to be detected to construct a rough saliency image comprises:
acquiring a defective fabric image by corner detection, wherein the defective fabric image is an image containing a convex hull;
and dividing the image to be detected into different superpixel sets by a superpixel segmentation method, wherein each superpixel set is a cluster of associated superpixel points.
3. The method of claim 2, wherein the uniformity component μψ(c, d) is:

μψ(c, d) = Wψ(|f(c) − f(d)|)

wherein Wψ(x) is the Gaussian membership function:

Wψ(x) = exp(−x² / (2 kψ²))

where kψ is the standard deviation, and f(c) and f(d) represent the pixel gray values of pixels c and d, respectively;
the feature component μφ(c, d) is:

μφ(c, d) = wo(c, d), if wo(c, d) ≥ wb(c, d); μφ(c, d) = 0, otherwise
wherein
wo(c,d)=min[Wo(f(c)),Wo(f(d))]
wb(c,d)=max[Wb(f(c)),Wb(f(d))]
wherein Wo and Wb are respectively the target membership function and the background membership function of the pixel intensity; the target membership function is:

Wo(x) = exp(−(x − mo)² / (2 ko²))
wherein mo and ko respectively represent the mean value and the standard deviation of the object of interest in the image to be detected;
the background membership function is:

Wb(x) = exp(−(x − mb)² / (2 kb²))
wherein mb and kb are respectively the mean value and the standard deviation of the background in the image to be detected.
4. The method of claim 3, wherein the performing easy-to-difficult saliency propagation on the sorted superpixels by using the teaching model comprises:
assigning the simpler, higher-ranked superpixels for learning first, adjusting the task amount according to the learning feedback, and judging the feedback result of the learning according to the confidence ConScore;
and completing the saliency propagation through continuous iterative optimization until the unmarked superpixel set is an empty set.
5. The method according to claim 4, wherein the corner detection used to acquire the defective fabric image is Harris corner detection, comprising:
establishing a mathematical model, moving a local window over the image to be detected, defining a corner response function R, and determining the positions of corners by judging the magnitude of R;
and connecting the corners to form a convex hull surrounding the defect target, wherein the defective fabric image is the image containing the convex hull.
6. The method of claim 5, wherein the method of superpixel segmentation is a linear iterative clustering algorithm comprising:
initializing a seed point;
reselecting seed points within a Q′×Q′ neighborhood of each seed point: calculating the gradient values of all pixel points in the Q′×Q′ neighborhood, and moving the seed point to the position with the minimum gradient in that neighborhood;
distributing a class label to each pixel point in the neighborhood around each seed point;
respectively calculating the distance between each pixel point and the seed point in the neighborhood around each seed point;
calculating the color distance and the spatial distance between the kth clustering center and the ith pixel point by a distance formula between spatial pixels, and calculating the normalized measurement distance;
and iterating the optimization until the error converges to 0.
7. The method according to claim 6, wherein the S3: segmenting the refined saliency image by threshold segmentation to obtain a defect detection result comprises:
performing pixel classification on the refined saliency image;
setting the gray value of the refined saliency image as F(x, y), with B(x, y) representing the image after threshold segmentation and T being the threshold; then
B(x, y) = 1, if F(x, y) > T; B(x, y) = 0, if F(x, y) ≤ T
wherein (x, y) is the position coordinate of a pixel point in the saliency image.
CN201910270956.6A 2019-04-04 2019-04-04 Method for detecting significant fabric defects based on machine teaching mode Active CN109961437B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910270956.6A CN109961437B (en) 2019-04-04 2019-04-04 Method for detecting significant fabric defects based on machine teaching mode

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN201910270956.6A CN109961437B (en) 2019-04-04 2019-04-04 Method for detecting significant fabric defects based on machine teaching mode

Publications (2)

Publication Number Publication Date
CN109961437A CN109961437A (en) 2019-07-02
CN109961437B true CN109961437B (en) 2021-06-25

Family

ID=67025713

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910270956.6A Active CN109961437B (en) 2019-04-04 2019-04-04 Method for detecting significant fabric defects based on machine teaching mode

Country Status (1)

Country Link
CN (1) CN109961437B (en)

Families Citing this family (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN110473190B (en) * 2019-08-09 2022-03-04 江南大学 Adaptive fabric defect detection method based on scale
CN111861996B (en) * 2020-06-23 2023-11-03 西安工程大学 Printed fabric defect detection method
CN113870297B (en) * 2021-12-02 2022-02-22 暨南大学 Image edge detection method and device and storage medium
CN117237298B (en) * 2023-09-15 2024-05-14 广州乾丰印花有限公司 Printed fabric defect inspection method, device and computing equipment

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011149609A1 (en) * 2010-05-28 2011-12-01 Exxonmobil Upstream Research Company Method for seismic hydrocarbon system analysis
CN103871053A (en) * 2014-02-25 2014-06-18 苏州大学 Vision conspicuousness-based cloth flaw detection method
CN105678788A (en) * 2016-02-19 2016-06-15 中原工学院 Fabric defect detection method based on HOG and low-rank decomposition
CN107169956A (en) * 2017-04-28 2017-09-15 西安工程大学 Yarn dyed fabric defect detection method based on convolutional neural networks
CN107767400A (en) * 2017-06-23 2018-03-06 北京理工大学 Remote sensing images sequence moving target detection method based on stratification significance analysis
CN109191430A (en) * 2018-07-27 2019-01-11 江苏理工学院 A kind of plain color cloth defect inspection method based on Laws texture in conjunction with single classification SVM

Family Cites Families (11)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20090149565A1 (en) * 2007-12-11 2009-06-11 Chunqing Liu Method for Making High Performance Mixed Matrix Membranes
CN102968782B (en) * 2012-09-12 2015-08-19 苏州大学 In a kind of coloured image, remarkable object takes method automatically
US10025988B2 (en) * 2015-05-22 2018-07-17 Tektronix, Inc. Anomalous pixel detection
US9679219B2 (en) * 2015-08-12 2017-06-13 International Business Machines Corporation Image feature classification
CN106096615A (en) * 2015-11-25 2016-11-09 北京邮电大学 A kind of salient region of image extracting method based on random walk
CN109215031A (en) * 2017-07-03 2019-01-15 中国科学院文献情报中心 The weighting guiding filtering depth of field rendering method extracted based on saliency
CN107833220B (en) * 2017-11-28 2021-06-11 河海大学常州校区 Fabric defect detection method based on deep convolutional neural network and visual saliency
CN108133473B (en) * 2017-12-21 2021-10-01 江南大学 Warp-knitted jacquard fabric defect detection method based on Gabor filtering and deep neural network
CN108550132B (en) * 2018-03-16 2021-06-18 安徽大学 Image collaborative salient target detection method
CN109035195B (en) * 2018-05-08 2021-11-30 武汉纺织大学 Fabric defect detection method
CN109410334A (en) * 2018-09-21 2019-03-01 桂林电子科技大学 A kind of three-dimensional grid model defect hole restorative procedure based on characteristic curve

Patent Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011149609A1 (en) * 2010-05-28 2011-12-01 Exxonmobil Upstream Research Company Method for seismic hydrocarbon system analysis
CN103871053A (en) * 2014-02-25 2014-06-18 苏州大学 Vision conspicuousness-based cloth flaw detection method
CN105678788A (en) * 2016-02-19 2016-06-15 中原工学院 Fabric defect detection method based on HOG and low-rank decomposition
CN107169956A (en) * 2017-04-28 2017-09-15 西安工程大学 Yarn dyed fabric defect detection method based on convolutional neural networks
CN107767400A (en) * 2017-06-23 2018-03-06 北京理工大学 Remote sensing images sequence moving target detection method based on stratification significance analysis
CN109191430A (en) * 2018-07-27 2019-01-11 江苏理工学院 A kind of plain color cloth defect inspection method based on Laws texture in conjunction with single classification SVM

Also Published As

Publication number Publication date
CN109961437A (en) 2019-07-02

Similar Documents

Publication Publication Date Title
CN109961437B (en) Method for detecting significant fabric defects based on machine teaching mode
CN113160192B (en) Visual sense-based snow pressing vehicle appearance defect detection method and device under complex background
CN102308306B (en) A constraint generator for use in image segregation
Shen et al. Multiple instance subspace learning via partial random projection tree for local reflection symmetry in natural images
CN109448015B (en) Image collaborative segmentation method based on saliency map fusion
Ji et al. A robust modified Gaussian mixture model with rough set for image segmentation
Abdelsamea et al. A SOM-based Chan–Vese model for unsupervised image segmentation
Cerrone et al. End-to-end learned random walker for seeded image segmentation
CN113177592B (en) Image segmentation method and device, computer equipment and storage medium
CN112613410B (en) Parasite egg identification method based on transfer learning
CN109658378B (en) Pore identification method and system based on soil CT image
CN117152484B (en) Small target cloth flaw detection method based on improved YOLOv5s
CN113808123B (en) Dynamic detection method for liquid medicine bag based on machine vision
CN109993728B (en) Automatic detection method and system for deviation of thermal transfer glue
Liu et al. Multiobjective fuzzy clustering with multiple spatial information for Noisy color image segmentation
Wang et al. Local defect detection and print quality assessment
CN107423771B (en) Two-time-phase remote sensing image change detection method
CN112509017A (en) Remote sensing image change detection method based on learnable difference algorithm
CN116977271A (en) Defect detection method, model training method, device and electronic equipment
CN113469270B (en) Semi-supervised intuitive clustering method based on decomposition multi-target differential evolution superpixel
Singh et al. A hybrid approach using color spatial variance and novel object position prior for salient object detection
Sarmadian et al. Optimizing the snake model using honey-bee mating algorithm for road extraction from very high-resolution satellite images
Shi et al. A Novel Image Segmentation Algorithm based on Continuous-Time Quantum Walk using Superpixels
Zheng et al. A functional pipeline framework for landmark identification on 3D surface extracted from volumetric data
Wu et al. Retentive Compensation and Personality Filtering for Few-Shot Remote Sensing Object Detection

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant