CN101419706B - Knitted fabric pilling grading method based on image analysis - Google Patents

Knitted fabric pilling grading method based on image analysis

Info

Publication number
CN101419706B
Authority
CN
China
Legal status: Expired - Fee Related
Application number
CN2008101539748A
Other languages
Chinese (zh)
Other versions
CN101419706A (en)
Inventor
肖志涛
吴骏
耿磊
Current Assignee
Tianjin Polytechnic University
Original Assignee
Tianjin Polytechnic University
Application filed by Tianjin Polytechnic University filed Critical Tianjin Polytechnic University
Priority to CN2008101539748A
Publication of CN101419706A
Application granted
Publication of CN101419706B


Abstract

The invention belongs to the technical fields of image processing and pattern recognition, and relates to an objective method for evaluating the pilling of knitted fabrics based on image analysis. The method comprises the following steps: pill particles are segmented from a sample image of the knitted fabric; indexes characterizing the degree of pilling are computed from the pill areas and counts; finally, the pilling grade of the knitted fabric is evaluated by a genetic-BP neural network. Compared with traditional visual inspection, the method overcomes subjective factors, evaluates the pilling grade more objectively and accurately, and is of significance for promoting digital textile testing technology.

Description

Knitted fabric pilling grading method based on image analysis
Technical field
The invention belongs to the fields of image processing and pattern recognition, and specifically relates to a method for objectively grading the pilling of knitted fabrics.
Background technology
Fabric is subjected to continual friction during wear and washing; its surface fuzzes and pills, which worsens the appearance of the fabric and affects its wearability. How to evaluate the pilling performance of fabric is an important topic in the field of textile inspection. Many test methods for fabric pilling exist; their common design principle is to simulate the pilling process that occurs during actual wear. Pilling has mainly been evaluated subjectively, by people counting the pills per unit area and comparing pill severity; this not only requires rich experience but is also highly subjective.
Some scholars at home and abroad have begun to assess pilling performance with image processing techniques. Abroad, work has been done in the United States and Australia; at home, Donghua University and Zhejiang Engineering Academy have also carried out preliminary studies. The required precision, however, has not been reached, mainly because the acquired images have not been described well.
This invention was supported by the Tianjin Applied Basic Research Program (No. 07JCYBJC13700).
Summary of the invention
The object of the invention is to address the deficiencies of the prior art by providing an objective grading method for knitted fabric pilling based on image analysis. The method overcomes subjective human factors and grades pilling more objectively, accurately and efficiently. The invention first segments the pill particles from a sample image of the knitted fabric, then computes indexes characterizing the degree of pilling from the pill areas and counts, and finally evaluates the pilling grade of the knitted fabric with a genetic-BP neural network.
The objective grading method for knitted fabric pilling based on image analysis of the invention comprises the following steps:
Step 1: acquire a sample image of the knitted fabric and pre-process it as follows:
(1) stretch the gray values of the sample image to the whole tonal range by histogram equalization;
(2) apply the Top-Hat transform to the equalized image, then subtract the transform result from the image before the transform, to remove the influence of yarn knots.
Step 2: binarize the image with the minimum-skewness method, compute the area of each connected region in the binary image, and delete regions whose area is below a threshold, thereby extracting the pill regions from the sample image;
Step 3: apply a distance transform to the pill image and analyze the pixel attributes of the distance image to determine the bottleneck positions of adhered regions, including connecting lines of boundary pixels and connecting lines of interior pixels;
Step 4: track the connecting lines of the adhered regions by chain code, check the validity of each connecting line, and thus determine the connecting lines and cut points of the adhered regions;
Step 5: segment the adhered regions with the cutting method corresponding to each of the three types of cut points: boundary cut points, adjacent-boundary cut points, and interior cut points;
Step 6: from the pill segmentation result, compute two or more pilling performance indexes;
Step 7: build a BP neural network model in which the input neurons are the above indexes and the output neurons characterize the different pilling grades;
Step 8: take the weights and thresholds of the BP neural network as genes and encode them with real coding;
Step 9: with the indexes of standard knitted-fabric pilling sample images as input, use a genetic algorithm to optimize and train the initial weights, thresholds and structure of the BP neural network;
Step 10: feed the indexes of a test image into the trained BP neural network to obtain the objective pilling grade.
In a preferred implementation of the objective grading method of the invention, 3 × 3 cross-shaped structuring elements are chosen for both the closing operation and the Top-Hat transform in step 1.
In step 2, let the gray-level range of the pre-processed image be 0~G, and let the gray-level probability densities of the target and the background be p1(t) and p2(t) respectively. Assume p1(t) and p2(t) both obey normal distributions, with means μ1(t) and μ2(t), variances σ1²(t) and σ2²(t), and skewness indexes K31(t) and K32(t). The gray levels of the image are divided into target and background by a threshold t. Let the normalized histogram of the pre-processed image be h(i) (i = 0~G), and let θ(t) be the ratio of target pixels to total pixels. Then:

θ(t) = Σ_{i=0}^{t} h(i)

μ1(t) = Σ_{i=0}^{t} i·h(i) / θ(t),  μ2(t) = Σ_{i=t+1}^{G} i·h(i) / (1 − θ(t))

σ1²(t) = Σ_{i=0}^{t} [i − μ1(t)]² h(i) / θ(t)

σ2²(t) = Σ_{i=t+1}^{G} [i − μ2(t)]² h(i) / (1 − θ(t))

K31(t) = Σ_{i=0}^{t} [i − μ1(t)]³ h(i) / [θ(t)·σ1³(t)]

K32(t) = Σ_{i=t+1}^{G} [i − μ2(t)]³ h(i) / [(1 − θ(t))·σ2³(t)]

Define the skewness index of the image: K3(t) = |K31(t)| + |K32(t)|;

Find thre = ArgMin K3(t), 0 < t ≤ G;

Binarize the pre-processed image: the pixel value f(i, j) = 1 if f(i, j) ≥ thre, and 0 if f(i, j) < thre;

Select an area threshold and apply area filtering to the image.
In step 3, in the binary image f, let the target region be A and the background region be its complement Ā; let p and q be pixels in f and d(p, q) the distance between p and q. The distance transform is DT(p) = min{ d(p, q) | q ∈ Ā }.
In step 5,
the cutting steps for a boundary cut point are as follows:
1. On the boundary ring of the 5 × 5 neighborhood centered on the boundary cut point (call it A), search counterclockwise for the two background points where the distance value jumps; call them B and C;
2. In the 3 × 3 neighborhood of B, find the background point nearest to A; call it D;
3. In the 3 × 3 neighborhood of C, find the background point nearest to A; call it E;
4. Draw a background-colored line connecting D and E;
The cutting steps for an adjacent-boundary cut point are as follows:
1. In the 3 × 3 neighborhood centered on the adjacent-boundary cut point (call it A), find the boundary point nearest to A; call it B, the first end point of the cutting line;
2. Find the point symmetric to B about A; call it C;
3. In the 3 × 3 neighborhood of C, find the boundary point nearest to it; call it D, the other end point of the cutting line;
4. Draw a background-colored line connecting the two end points B and D;
The cutting steps for an interior cut point are as follows:
1. Let the interior cut point be A; compute the average chain code from the chain-code values of the region connecting line centered on A, and take it as the direction of the connecting line;
2. Add 2 to the average chain code, rotating the direction 90° counterclockwise so that it points out of the region; search outward point by point along this direction for the point of the cutting line on the adjacent boundary, i.e. the point with distance value 2; call it B;
3. Subtract 2 from the average chain code (or add 6 and take modulo 8), rotating the direction 90° clockwise so that it points out of the region on the opposite side; search outward point by point along this direction for the point of the cutting line on the opposite adjacent boundary; call it C;
4. Starting from the adjacent boundary points B and C found in steps 2 and 3, search their 3 × 3 neighborhoods iteratively for the boundary points nearest to A, under the constraint that the number of search steps cannot exceed the distance value of the cut point; call them D and E;
5. Draw a background-colored line connecting D and E to obtain the cutting line.
In step 6, the performance indexes are three: the total pill area, the pill count, and the standard deviation of pill area.
Let WIH_ij be the connection weight between node i of the input layer and node j of the hidden layer, and WHO_ji the connection weight between node j of the hidden layer and node i of the output layer. Step 9 proceeds as follows:
1. Initialize the population P, including the crossover scale, crossover probability Pc and mutation probability Pm, and initialize every WIH_ij and WHO_ji;
2. Compute the evaluation function of each individual, sort them, and select network individuals with probability

P_s = f_i / Σ_{i=1}^{N} f_i

where f_i is the fitness of individual i, f(i) = 1/E(i), and E(i) = Σ_p Σ_k (V_k − T_k)², in which i is the chromosome number, k is the output-layer node index, p is the learning-sample index, T_k is the teacher signal and V_k is the network output;
3. Apply crossover with probability Pc to individuals G_i and G_{i+1} to produce new individuals G_i′ and G_{i+1}′; individuals not selected for crossover are copied directly;
4. Apply mutation with probability Pm to G_j to produce the new individual G_j′;
5. Insert the new individuals into population P and compute their evaluation functions;
6. If the fitness of a new individual exceeds the preset threshold, a satisfactory individual has been found and the procedure ends; otherwise go to step 3.
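The fitness-proportional selection in step 2 above can be sketched as follows. This is an illustrative roulette-wheel implementation, not code from the patent; the error values for the three chromosomes are hypothetical.

```python
import random

def fitness(errors):
    # f(i) = 1 / E(i), where E(i) is the summed squared output error
    return [1.0 / e for e in errors]

def roulette_select(fits, rng):
    # select one individual index with probability P_s = f_i / sum(f)
    total = sum(fits)
    r = rng.random() * total
    acc = 0.0
    for i, f in enumerate(fits):
        acc += f
        if r <= acc:
            return i
    return len(fits) - 1

rng = random.Random(0)
errs = [4.0, 2.0, 1.0]          # hypothetical E(i) for three chromosomes
fits = fitness(errs)            # [0.25, 0.5, 1.0]
picks = [roulette_select(fits, rng) for _ in range(1000)]
# the lowest-error chromosome (index 2) should be selected most often
print(picks.count(2) > picks.count(1) > picks.count(0))
```

Lower network error yields higher fitness and hence a higher selection probability, which is exactly the pressure the genetic optimization in step 9 relies on.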
In the grading method of the invention, the value of each gene of the initial population lies in (−1, 1); the initial population size is 30; the number of generations is 700; the optimization criterion is a fitness value greater than 20. The learning rate of the BP neural network is set to 0.01, and the maximum number of training iterations is 3000.
The pill segmentation method adopted by the invention tracks the boundary of a pill particle by chain code, judges the type of the connecting line, and determines the bottleneck positions and cut points, so that the adhered positions can be cut accurately. The segmentation result is close to visual observation, and the number, shape and size of the pills are all well preserved. The invention uses a BP neural network to evaluate the pilling performance of knitted fabrics; to overcome the shortcomings of the BP network's squared-error function, namely local minima and slow convergence, the parameters and structure of the BP network are optimized with a genetic algorithm. The optimized BP network grades the knitted-fabric pilling indexes objectively, with a high recognition rate.
Description of drawings
Fig. 1: overall flow chart of the objective grading method for knitted fabric pilling of the invention;
Fig. 2: standard photographs of knitted fabric pilling, grade 1 to grade 5, developed by the National Knitwear Quality Supervision and Inspection Center;
Fig. 3: gray-level image of the grade-1 standard pilling photograph after histogram equalization;
Fig. 4: Top-Hat transform result of Fig. 3;
Fig. 5: image of Fig. 4 after reducing the gray level of the knots;
Fig. 6: pill extraction results for the grade 1 to grade 5 standard photographs;
Fig. 7: pill segmentation results for the grade 1 to grade 5 standard photographs;
Fig. 8: flow chart of the genetic-BP network weight and threshold optimization.
Embodiment
The overall flow of the objective grading method for knitted fabric pilling of the invention is shown in Fig. 1: first the pill particles are segmented from the knitted-fabric sample image, then the indexes characterizing the degree of pilling are computed from the pill areas and counts, and finally the pilling grade is evaluated with the genetic-BP neural network. The invention is further described below in terms of pill segmentation, index calculation, and the construction and implementation of the genetic-BP neural network model.
1. Image pre-processing
As shown in Fig. 2, the tonal range of the knitted-fabric sample image is small, which makes the image hard to process; and the gray values of the yarn knots are very large, almost indistinguishable from the pills. To reduce their influence on pill extraction, this patent processes the sample image in turn by gray-level histogram equalization and the Top-Hat transform, introduced in turn below.
(1) Gray-level histogram stretching
Before pill extraction, the gray values of the sample image are stretched to the whole tonal range by histogram equalization; the result is shown in Fig. 3.
(2) Top-Hat transform
The gray-scale dilation of the input image f by structuring element b, written f ⊕ b, is defined as

(f ⊕ b)(s, t) = max{ f(s − x, t − y) + b(x, y) | (s − x), (t − y) ∈ D_f; (x, y) ∈ D_b }

The gray-scale erosion of f by b, written f Θ b, is defined as

(f Θ b)(s, t) = min{ f(s + x, t + y) − b(x, y) | (s + x), (t + y) ∈ D_f; (x, y) ∈ D_b }

where D_f and D_b are the domains of f and b, (s, t) is a pixel position in the image, and (x, y) is an element position in b.

Combining dilation and erosion gives the closing f • b and the opening f ∘ b, defined respectively as

f • b = (f ⊕ b) Θ b
f ∘ b = (f Θ b) ⊕ b

The Top-Hat operator is defined as

HAT(f) = f − (f ∘ b)

where the structuring element b is a 3 × 3 cross.
The peaks in Fig. 3 are obtained with the Top-Hat transform, and a threshold is then applied to mark them. The peak points obtained are shown in Fig. 4; as can be seen from the figure, the peaks in the image have been detected.
To reduce the gray values of the knots, the Top-Hat result is subtracted from the gray-stretched image, as shown in Fig. 5. As can be seen from the figure, the gray values of the bright yarn knots have become smaller, reducing their influence on subsequent processing.
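The pre-processing above can be sketched in pure Python on a tiny synthetic image: a gray-scale opening with the 3 × 3 cross, then the Top-Hat subtraction. The helper names and the toy image are ours, not the patent's.

```python
CROSS = [(-1, 0), (0, -1), (0, 0), (0, 1), (1, 0)]  # flat 3x3 cross element

def _apply(img, elem, agg):
    # sweep the structuring element over the image, clamping at the borders
    h, w = len(img), len(img[0])
    return [[agg(img[min(max(y + dy, 0), h - 1)][min(max(x + dx, 0), w - 1)]
                 for dy, dx in elem)
             for x in range(w)] for y in range(h)]

def erode(img):  return _apply(img, CROSS, min)   # gray-scale erosion
def dilate(img): return _apply(img, CROSS, max)   # gray-scale dilation

def top_hat(img):
    opened = dilate(erode(img))                   # opening with the cross
    return [[a - b for a, b in zip(r1, r2)] for r1, r2 in zip(img, opened)]

# synthetic 7x7 image: flat background with one bright single-pixel "knot"
img = [[10] * 7 for _ in range(7)]
img[3][3] = 200
peaks = top_hat(img)
print(peaks[3][3])            # 190: the isolated bright knot survives the Top-Hat
print(sum(map(sum, peaks)))   # 190: the flat background maps to 0 everywhere else
```

Structures narrower than the 3 × 3 cross are removed by the opening, so the Top-Hat residual isolates exactly the small bright peaks that are then subtracted to suppress the knots.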
2. Pill extraction
The pre-processed image is segmented into background (fabric) and target (pills). Because the pixel values at the pills are larger than those of the surrounding pixels, the pills can be separated from the background by determining an appropriate threshold: pixels above the threshold are classified as pill, pixels below it as background. This patent binarizes the image with the minimum-skewness method.
Let the gray-level range of the pre-processed image be 0~G, and let the gray-level probability densities of the target and the background be p1(t) and p2(t). Assume p1(t) and p2(t) both obey normal distributions, with means μ1(t) and μ2(t), variances σ1²(t) and σ2²(t), and skewness indexes K31(t) and K32(t). The gray levels are divided into target and background by a threshold t. Let the normalized histogram be h(i) (i = 0~G), and let θ(t) be the ratio of target pixels to total pixels. Then

θ(t) = Σ_{i=0}^{t} h(i)

μ1(t) = Σ_{i=0}^{t} i·h(i) / θ(t),  μ2(t) = Σ_{i=t+1}^{G} i·h(i) / (1 − θ(t))

σ1²(t) = Σ_{i=0}^{t} [i − μ1(t)]² h(i) / θ(t)

σ2²(t) = Σ_{i=t+1}^{G} [i − μ2(t)]² h(i) / (1 − θ(t))

K31(t) = Σ_{i=0}^{t} [i − μ1(t)]³ h(i) / [θ(t)·σ1³(t)]

K32(t) = Σ_{i=t+1}^{G} [i − μ2(t)]³ h(i) / [(1 − θ(t))·σ2³(t)]

Define the skewness index of the image:

K3(t) = |K31(t)| + |K32(t)|

When the threshold is at the optimum position, the gray distributions of the target and the background separated by it are closest to normal, i.e. their departure from normality is smallest, so the total skewness index of the image reaches its minimum. The skewness index can therefore serve as the discriminant function for threshold selection: the gray value thre that minimizes the discriminant function is the threshold with the least error, that is

thre = ArgMin K3(t),  0 < t ≤ G

Let f(i, j) be the pixel value of the gray image; then

f(i, j) = 1 if f(i, j) ≥ thre, and 0 if f(i, j) < thre

where 1 denotes the target sub-image and 0 the background sub-image.
The image obtained above cannot yet be regarded as containing only pills, because the original fabric image also contains many small hairs and protruding knots which, strictly speaking, are not pills; their biggest difference from pills is that pills have larger area. An area threshold (in pixels) can therefore be set, the area of each blob computed, and every blob whose area is below the threshold removed. This process is called area filtering. The threshold is chosen so that the number of remaining blobs (pills) roughly agrees with the pill count observed by the human eye.
The pill extraction results for the grade 1 to grade 5 standard photographs are shown in Fig. 6.
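The minimum-skewness threshold search can be sketched in pure Python on a hypothetical 1-D histogram (this is our illustrative code, not the patent's):

```python
def min_skew_threshold(hist):
    """Return the threshold t minimizing K3(t) = |K31(t)| + |K32(t)|."""
    G = len(hist) - 1
    total = sum(hist)
    h = [v / total for v in hist]            # normalized histogram h(i)
    best_t, best_k3 = None, float('inf')
    for t in range(G):                       # both classes must be non-degenerate
        theta = sum(h[:t + 1])
        if theta == 0 or theta == 1:
            continue
        mu1 = sum(i * h[i] for i in range(t + 1)) / theta
        mu2 = sum(i * h[i] for i in range(t + 1, G + 1)) / (1 - theta)
        v1 = sum((i - mu1) ** 2 * h[i] for i in range(t + 1)) / theta
        v2 = sum((i - mu2) ** 2 * h[i] for i in range(t + 1, G + 1)) / (1 - theta)
        if v1 == 0 or v2 == 0:
            continue
        k31 = sum((i - mu1) ** 3 * h[i] for i in range(t + 1)) / (theta * v1 ** 1.5)
        k32 = sum((i - mu2) ** 3 * h[i]
                  for i in range(t + 1, G + 1)) / ((1 - theta) * v2 ** 1.5)
        k3 = abs(k31) + abs(k32)
        if k3 < best_k3:
            best_t, best_k3 = t, k3
    return best_t

# two well-separated symmetric modes: background near level 2, target near level 7
hist = [1, 5, 9, 5, 1, 1, 5, 9, 5, 1]
thre = min_skew_threshold(hist)
print(thre)   # 4: splitting between the modes leaves both classes symmetric
```

At t = 4 both classes are symmetric about their means, so both skewness terms vanish and K3(t) reaches its minimum, which is the behavior the discriminant function is designed to reward.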
3. Separation of adhered pill particles
In the extracted pill image, pill regions may adhere to one another. This patent applies a distance transform to the pill image and, in the distance image, uses the pixel attributes of the adhered particles to determine the adhesion positions and their cut points, so that the adhered regions can be cut.
The distance transform is an operation on a binary image that converts it into a gray-level image in which the gray level of each pixel is the distance from that pixel to the nearest background pixel. In the binary image f, let the target region be A and the background region be its complement Ā; let p and q be pixels in f and d(p, q) the distance between p and q. The distance transform of pixel p is then defined as

DT(p) = min{ d(p, q) | q ∈ Ā }

To identify the attribute of a pixel in the distance image, the states of the pixels in its neighborhood must be examined; this segmentation method uses three detection rings. They are the outer boundaries of the 3 × 3, 5 × 5 and 9 × 9 neighborhoods centered on the pixel under test, with 8, 16 and 32 points respectively, called the inner ring, middle ring and outer ring. The middle-ring gradient is defined as the sum of the absolute values of the differences between the mean of the ring's upper and lower boundaries and the mean of its left and right boundaries. The positive-transition count of a point is the number of upward transitions through the neighborhood-center value along the closed ring around the center pixel, and represents the topological property of the point.
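A small sketch of the distance transform on a binary image, in pure Python with the city-block metric (the metric is our assumption; the patent does not fix it here):

```python
from collections import deque

def distance_transform(img):
    """City-block distance from each target pixel (1) to the nearest background (0)."""
    h, w = len(img), len(img[0])
    INF = h * w
    dist = [[0 if img[y][x] == 0 else INF for x in range(w)] for y in range(h)]
    q = deque((y, x) for y in range(h) for x in range(w) if img[y][x] == 0)
    while q:                                   # multi-source BFS from the background
        y, x = q.popleft()
        for dy, dx in ((-1, 0), (1, 0), (0, -1), (0, 1)):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w and dist[ny][nx] > dist[y][x] + 1:
                dist[ny][nx] = dist[y][x] + 1
                q.append((ny, nx))
    return dist

# a 5x5 blob of target pixels surrounded by background
img = [[0] * 7] + [[0] + [1] * 5 + [0] for _ in range(5)] + [[0] * 7]
dt = distance_transform(img)
print(dt[3][3])   # 3: the blob center is 3 steps from the nearest background
```

Bottlenecks of adhered blobs show up in such a distance image as local ridges of low distance value, which is what the detection rings above probe for.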
(1) Determining the bottleneck positions of an adhered region
A distinguishing feature of the split positions of an adhered region is that they all lie at bottleneck positions of the connected region. Bottleneck positions include connecting lines of boundary pixels and connecting lines of interior pixels.
Connecting lines of boundary pixels are recognized as follows:
1. The inner-ring positive-transition count must be 2; this is the primary condition.
2. The secondary condition depends on the inner-ring positive-transition count, in three cases:
(a) if the positive-transition count is 2, the point is a cut point; this is type 1. Such a point directly adjoins the two regions it connects.
(b) if the positive-transition count is 1, the number of boundary points on the inner ring should be greater than 4; this is type 2. The purpose is to keep too many background points out of the inner ring.
(c) if the positive-transition count is 0, the number of background points on the middle ring should be less than 7; this is type 3. The purpose is to ensure that the current pixel stays at a bottleneck position of the adhered region.
Connecting lines of interior pixels are recognized as follows:
1. The middle-ring gradient must be less than a defined threshold; this is the primary condition. The threshold has two cases:
(a) in general the middle-ring gradient should be less than 8;
(b) when the middle-ring positive-transition count is 2, the threshold can be relaxed by 1, i.e. the middle-ring gradient should be less than 9.
2. The secondary condition depends on the inner-ring positive-transition count, in two cases:
(a) if the positive-transition count is 2, the point is a cut point; this is type 4. Such a point directly adjoins the two regions it connects.
(b) if the positive-transition count is not 2, the count of equal-valued points on the inner ring should be less than 2 (i.e. only 0 or 1). This again splits into two cases:
if the count of equal-valued points on the inner ring is neither 2 nor 0, the point is a cut point; this is type 5.
if the count of equal-valued points on the inner ring is 2 and the current point is not a corner point, this is type 6. In this case the current point together with the intermediate pixels should form a single-pixel-wide short line within the 3 × 3 neighborhood of the inner ring; the point is in the middle section of a connecting line and therefore cannot be a corner point.
3. For pixels of type 5 and type 6 there must also be at least one interior point on the outer ring, i.e. an interior point at distance 5 from the current point, in order to cut off long thin burrs in the image.
4. To avoid cutting a region across its waist, the distance value of a cut point must be less than half the maximum distance value in the image.
(2) Determining the connecting lines and cut points of an adhered region
The connecting line between adhered regions is tracked out in full, to check whether a misjudgment has occurred and to determine the actual cutting position.
The connecting line of an adhered region is tracked as follows:
1. Take an entry from the bottleneck line-segment table and read its coordinates and distance value; the latter is the distance value of the connecting line to be tracked.
2. For a boundary line with distance value 1, the midpoint coordinate of the entry is used directly as the cutting position.
3. For iso-distance lines with distance value 2 or more, track up to 10 steps clockwise and 10 steps counterclockwise. There are two termination conditions: an interior point is encountered, i.e. a point whose distance value is greater than the tracked value, or the full 10 steps have been tracked. Tracking yields two chain-code sequences.
4. After reversing the second chain code, the two are merged into one chain code, which gives the required connecting line.
After the connecting line of an adhered region has been tracked, its validity must also be determined. There are two criteria:
1. the city-block distance between the end points of the connecting line should be less than or equal to the number of points on the connecting line; the physical meaning is that the connecting line should be as straight as possible, not running in an oblique direction;
2. the distance value of the connecting line should be less than or equal to max/3, where max is the maximum of the distance values of the pixels in the region.
For a valid connecting line, cut points can then be selected on it; the cut point is usually chosen at the midpoint of the connecting line between the two regions. But there are two special cases. First, a cut point on the boundary line is used directly as a cut point, with no need to find a connecting line. Second, when the connecting line is long, two cut points should be selected, one at each end of the connecting line, at a distance from the end point equal to the distance value of the pixel connecting line, so that the cut is smoother; this mainly occurs when the distance value of the connecting line is 2.
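The straightness criterion above (city-block end-point distance versus point count) can be sketched as follows. The 8-direction chain-code convention (0 = east, counterclockwise) is our assumption for illustration:

```python
# 8-direction chain-code steps: 0=E, 1=NE, 2=N, ..., 7=SE (a common convention)
STEPS = [(0, 1), (-1, 1), (-1, 0), (-1, -1), (0, -1), (1, -1), (1, 0), (1, 1)]

def trace(start, codes):
    """Return the pixel sequence obtained by following chain codes from start."""
    pts = [start]
    y, x = start
    for c in codes:
        dy, dx = STEPS[c]
        y, x = y + dy, x + dx
        pts.append((y, x))
    return pts

def is_straight(codes, start=(0, 0)):
    """Validity criterion 1: the city-block distance between the end points
    must not exceed the number of points on the connecting line."""
    pts = trace(start, codes)
    (y0, x0), (y1, x1) = pts[0], pts[-1]
    return abs(y1 - y0) + abs(x1 - x0) <= len(pts)

print(is_straight([0, 0, 0, 0]))   # horizontal line: distance 4 <= 5 points
print(is_straight([1, 1, 1, 1]))   # pure diagonal: distance 8 > 5 points
```

A diagonal run of n codes covers a city-block distance of 2n with only n + 1 points, so it fails the criterion, matching the requirement that the connecting line not run in an oblique direction.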
(3) Cutting of an adhered region
Cut points are divided into three types: boundary cut points, adjacent-boundary cut points and interior cut points; different types of cut points use different cutting methods.
A boundary cut point is cut as follows:
1. On the boundary ring of the 5 × 5 neighborhood centered on the boundary cut point (call it A), search counterclockwise for the two background points where the distance value jumps; call them B and C.
2. In the 3 × 3 neighborhood of B, find the background point nearest to A; call it D.
3. In the 3 × 3 neighborhood of C, find the background point nearest to A; call it E.
4. Draw a background-colored line connecting D and E.
An adjacent-boundary cut point is cut as follows:
1. In the 3 × 3 neighborhood centered on the adjacent-boundary cut point (call it A), find the boundary point nearest to A; call it B, the first end point of the cutting line.
2. Find the point symmetric to B about A; call it C.
3. In the 3 × 3 neighborhood of C, find the boundary point nearest to it; call it D, the other end point of the cutting line.
4. Draw a background-colored line connecting the two end points B and D.
An interior cut point is cut as follows:
(1) Let the interior cut point be A; compute the average chain code from the chain-code values of the region connecting line centered on A, and take it as the direction of the connecting line.
(2) Add 2 to the average chain code, rotating the direction 90° counterclockwise so that it points out of the region. Search outward point by point along this direction for the point of the cutting line on the adjacent boundary, i.e. the point with distance value 2; call it B.
(3) Subtract 2 from the average chain code (or add 6 and take modulo 8), rotating the direction 90° clockwise so that it points out of the region on the opposite side. Search outward point by point along this direction for the point of the cutting line on the opposite adjacent boundary; call it C.
(4) Starting from the adjacent boundary points B and C found in steps (2) and (3), search their 3 × 3 neighborhoods for the boundary points nearest to A. This is an iterative process, and the points finally found are the true end points of the cutting line, called D and E. In this way, although the average chain code represents the direction only roughly, the cutting-line end points found still land on the concave points of the boundary.
(5) Draw a background-colored line connecting D and E to obtain the cutting line, which separates the two adhered regions.
In addition, in steps (2) and (3) the number of search steps is limited: it cannot exceed the distance value of the cut point. That is, if the distance value of interior cut point A is 4, then at most 4 steps may be taken in each perpendicular direction when searching for the cutting-line end points.
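The modulo-8 chain-code arithmetic in steps (2) and (3) can be sketched as follows; the convention (8 codes counted counterclockwise, so +2 is a 90° counterclockwise turn) follows the description above, and the helper names are ours:

```python
def rotate_ccw(code):
    # +2 modulo 8: a 90-degree counterclockwise turn in an 8-direction chain code
    return (code + 2) % 8

def rotate_cw(code):
    # -2 modulo 8 (equivalently +6 modulo 8): a 90-degree clockwise turn
    return (code + 6) % 8

def average_chain_code(codes):
    # average direction of the tracked connecting line, rounded to a code;
    # averaging raw codes is only safe when the codes do not wrap around 0
    return round(sum(codes) / len(codes)) % 8

avg = average_chain_code([0, 1, 0, 1, 0])   # a roughly eastward connecting line
print(avg)                                  # 0: the line runs east
print(rotate_ccw(avg), rotate_cw(avg))      # 2 6: the two perpendicular searches
```

The two rotated codes give the opposite perpendicular directions along which the end points B and C of the cutting line are searched.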
The pill segmentation results for the grade 1 to grade 5 standard photographs are shown in Fig. 7.
4. knitted fabric pilling performance Index Calculation
Adopted the total area, number, the area standard of ball top to differ from the pilling degree that three indexs characterize fabric in this patent.
(1) Total pill area and pill count
The calculation process is as follows:
1. Perform connected-component analysis on the extracted binary pill image (target gray value 255, background 0) to obtain the total number of connected components, i.e. the pill count N;
2. Calculate the area of each pill, i.e. the number of pixels s_i contained in each connected component;
3. Sum the s_i to obtain the total pill area S.
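The three steps above amount to connected-component labeling of the binary image. A minimal pure-Python sketch follows (breadth-first flood fill; 8-connectivity is assumed here, and the function name is illustrative):

```python
from collections import deque

def pill_areas(binary):
    """Label 8-connected regions of value 255 in a binary image and
    return each region's pixel count (its area)."""
    h, w = len(binary), len(binary[0])
    seen = [[False] * w for _ in range(h)]
    areas = []
    for y in range(h):
        for x in range(w):
            if binary[y][x] == 255 and not seen[y][x]:
                area, q = 0, deque([(y, x)])
                seen[y][x] = True
                while q:
                    cy, cx = q.popleft()
                    area += 1
                    for dy in (-1, 0, 1):
                        for dx in (-1, 0, 1):
                            ny, nx = cy + dy, cx + dx
                            if (0 <= ny < h and 0 <= nx < w
                                    and binary[ny][nx] == 255
                                    and not seen[ny][nx]):
                                seen[ny][nx] = True
                                q.append((ny, nx))
                areas.append(area)
    return areas
```

The pill count is then N = len(pill_areas(img)) and the total area is S = sum(pill_areas(img)).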
(2) Standard deviation of pill area
The calculation formula is as follows:
Astd = √( Σ_{i=1}^{N} (s_i − s̄)² / N )
In the formula:
Astd is the standard deviation of pill area,
s_i is the area of the i-th pill,
s̄ is the average pill area,
N is the pill count.
These parameters characterize the number of pills, the size of the pill areas, and the variation of pill area. These physical attributes of the pills are the concentrated expression of fabric structure, yarn structure and fiber type in the physical process of pilling, so combining these indexes to evaluate the pilling property of a fabric is appropriate.
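Given the per-pill areas s_i, the standard-deviation index follows directly from the formula above; a short Python sketch (the function name is illustrative):

```python
def area_std(areas):
    """Astd = sqrt( sum_i (s_i - mean)^2 / N ): the standard deviation
    of the pill areas, one of the three pilling indexes."""
    n = len(areas)
    mean = sum(areas) / n
    return (sum((s - mean) ** 2 for s in areas) / n) ** 0.5
```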
5. Objective grading of knitted fabric pilling grade based on the genetic-BP neural network
In this patent, the genetic algorithm is used to optimize the parameters and structure of the BP neural network, which largely overcomes the shortcomings of the BP network and effectively improves its generalization performance. When optimizing the BP network structure, chiefly the number of hidden nodes is optimized. The desired network output is set to 0000 when a grade-one sample is input, 1000 for a grade-two sample, 0100 for a grade-three sample, 0010 for a grade-four sample, and 0001 for a grade-five sample.
The procedure for building the genetic-BP neural network model used to grade knitted fabric pilling performance is as follows:
(1) Encoding and initial population
This patent takes the weights and thresholds of the BP neural network as genes and applies floating-point encoding to the initial population, which facilitates the fusion of the genetic algorithm (GA) with the BP algorithm. The code string consists of four parts: the input-to-hidden-layer connection weights, the hidden-to-output-layer connection weights, the hidden-layer thresholds, and the output-layer thresholds; concatenated into one long string, these constitute one chromosome of the genetic algorithm. A set of such chromosome individuals constitutes the initial population, and the genetic operations are carried out on this chromosome set. Because the weights of a well-trained network are generally small, lying in (−1, 1), the value of each gene of the initial population is also best taken in (−1, 1). The population size should also be moderate; this patent sets it to 30.
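The four-part chromosome described above can be sketched as a flat list of floating-point genes drawn from (−1, 1). The layer sizes and function names below are illustrative assumptions:

```python
import random

def encode_chromosome(n_in, n_hidden, n_out):
    """One chromosome = [input->hidden weights, hidden->output weights,
    hidden thresholds, output thresholds], every gene drawn from (-1, 1)."""
    length = n_in * n_hidden + n_hidden * n_out + n_hidden + n_out
    return [random.uniform(-1, 1) for _ in range(length)]

def decode_chromosome(chrom, n_in, n_hidden, n_out):
    """Split the flat gene string back into the four network parts."""
    a = n_in * n_hidden
    b = a + n_hidden * n_out
    c = b + n_hidden
    return chrom[:a], chrom[a:b], chrom[b:c], chrom[c:]
```

For example, with 3 input, 5 hidden and 4 output nodes the chromosome has 3·5 + 5·4 + 5 + 4 = 44 genes.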
(2) Selection of the fitness function
This patent adopts the reciprocal of the sum of squared errors between the actual output and the desired output as the fitness function, which fully combines the two evaluation criteria and improves the optimization performance of the network. The formula is:
f(i) = 1/E(i)
where f(i) is the fitness value of individual i, and E(i) is the sum of squared errors between the actual output and the desired output.
(3) Genetic operations
The genetic algorithm optimizes the network structure and the connection weights simultaneously. The process of building the knitted fabric pilling grading model based on the genetic-BP neural network is as follows, where WIH_ij is the connection weight between node i of the input layer and node j of the hidden layer, and WHO_ji is the connection weight between node j of the hidden layer and node i of the output layer.
1. Initialize the population P, including the crossover scale, the crossover probability Pc and the mutation probability Pm, and initialize every WIH_ij and WHO_ji; real-number encoding is adopted, and the initial population size is 30;
2. Compute the evaluation function of each individual and sort the individuals; network individuals are selected with the probability:
P_s = f_i / Σ_{i=1}^{N} f_i
where f_i is the fitness value of individual i, measured by the sum of squared errors E, that is:
f(i) = 1/E(i),  E(i) = Σ_p Σ_k (V_k − T_k)²
where i = 1, 2, 3, …, 30 indexes the chromosomes; k = 1, …, 4 indexes the output-layer nodes; p = 1, 2, 3, …, 15 indexes the learning samples; T_k is the teacher signal and V_k is the network output.
3. With probability Pc, perform crossover on individuals G_i and G_{i+1} to produce new individuals G_i′ and G_{i+1}′; individuals not selected for crossover are copied directly;
4. With probability Pm, mutate G_j to produce the new individual G_j′;
5. Insert the new individuals into population P and compute their evaluation functions;
6. If the fitness value of a new individual exceeds the preset threshold, a satisfactory individual has been found and the procedure ends; otherwise return to step 3.
After the desired performance index is reached, decoding the optimal individual of the final population yields the optimized network parameters. The flow chart of optimizing the initial weights and thresholds of the BP neural network with the genetic algorithm is shown in Fig. 8.
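Steps 1 to 6 above can be sketched as a basic genetic loop with roulette-wheel selection by P_s = f_i / Σ f_i. The one-point crossover and uniform mutation operators and the default pc/pm values below are illustrative simplifications, not taken from the patent:

```python
import random

def ga_optimize(fitness, chrom_len, pop_size=30, pc=0.7, pm=0.05,
                target=20.0, max_gen=700):
    """Roulette selection, crossover with probability pc, mutation with
    probability pm; stops early when best fitness exceeds `target`."""
    pop = [[random.uniform(-1, 1) for _ in range(chrom_len)]
           for _ in range(pop_size)]
    for _ in range(max_gen):
        scores = [fitness(c) for c in pop]
        if max(scores) > target:
            break
        total = sum(scores)

        def pick():
            # roulette wheel: probability proportional to fitness
            r, acc = random.uniform(0, total), 0.0
            for c, s in zip(pop, scores):
                acc += s
                if acc >= r:
                    return c[:]
            return pop[-1][:]

        nxt = []
        while len(nxt) < pop_size:
            a, b = pick(), pick()
            if random.random() < pc:            # one-point crossover
                cut = random.randrange(1, chrom_len)
                a, b = a[:cut] + b[cut:], b[:cut] + a[cut:]
            for child in (a, b):
                for g in range(chrom_len):
                    if random.random() < pm:    # mutation: redraw the gene
                        child[g] = random.uniform(-1, 1)
                nxt.append(child)
        pop = nxt[:pop_size]
    return max(pop, key=fitness)
```

The fitness callable would be the reciprocal error 1/E(i) of a decoded BP network; any strictly positive fitness keeps the roulette wheel well defined.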
The technical effect of the invention is further described below with reference to the accompanying drawings.
Fig. 2 shows photographs of the grade-one to grade-five knitted fabric pilling standard samples developed by the National Knitted Product Quality Supervision and Inspection Center; the image size is 256 × 256. The tonal range of the gray-scale images is narrow, which makes them difficult to process. Therefore, before pill extraction, histogram specification is used to stretch the gray values over the whole tonal range.
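As a simplified stand-in for the histogram specification mentioned above, a plain min-max contrast stretch already illustrates widening a narrow tonal range onto [0, 255]; the function name is illustrative:

```python
def stretch_gray(image):
    """Map the image's observed gray range [lo, hi] linearly onto
    [0, 255]; a simplified substitute for histogram specification."""
    flat = [p for row in image for p in row]
    lo, hi = min(flat), max(flat)
    if hi == lo:  # constant image: nothing to stretch
        return [[0 for _ in row] for row in image]
    return [[round((p - lo) * 255 / (hi - lo)) for p in row]
            for row in image]
```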
Fig. 3 shows the grade-one knitted fabric pilling standard image after histogram stretching. As can be seen, the gray values of the yarn knots are very high, almost indistinguishable from the pills. To reduce the influence of the knots on pill extraction, their gray values must be reduced without affecting those of the pills; the knots can be regarded as peak points in the image. To detect the peaks, the Top-Hat transform is applied to obtain the peaks, and a threshold is then taken to obtain the peak markers.
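The white Top-Hat transform is the image minus its grayscale opening (erosion followed by dilation). A minimal pure-Python sketch with a 3 × 3 square window follows; the patent's claim 2 chooses a 3 × 3 cross-shaped element, so the square window here is a simplification, and the function names are illustrative:

```python
def filt3(img, op):
    """Apply `op` (min = erosion, max = dilation) over each pixel's
    3x3 neighborhood, clipped at the image border."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[j][i]
                    for j in range(max(0, y - 1), min(h, y + 2))
                    for i in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = op(vals)
    return out

def white_tophat(img):
    """Image minus its grayscale opening: keeps bright peaks narrower
    than the window, such as yarn knots."""
    opened = filt3(filt3(img, min), max)
    return [[img[y][x] - opened[y][x] for x in range(len(img[0]))]
            for y in range(len(img))]
```

Bright peaks narrower than the window survive in the top-hat image and can then be thresholded to obtain the peak markers; subtracting the peaks from the stretched image suppresses the knots.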
Fig. 4 shows the peak-point image obtained from Fig. 3 by the Top-Hat transform.
Fig. 5 shows Fig. 3 minus Fig. 4; the gray values of the bright yarn knots have clearly been reduced, lessening their influence on subsequent processing.
Fig. 6 shows the pill extraction results for the grade-one to grade-five knitted fabric pilling standard samples. The pills extracted by this method are close to what is seen by visual observation, and their number, shape and size are all well preserved, so good pill extraction is achieved.
Fig. 7 shows the pill segmentation results for the grade-one to grade-five knitted fabric pilling standard samples. Comparing the images of the different grades shows that the pills extracted by this method agree well with subjective human judgment, and that adhered particles are separated.
Fig. 8 is the flow chart of optimizing the initial weights and thresholds of the BP neural network with the genetic algorithm.

Claims (7)

1. An objective knitted fabric pilling grading method based on image analysis, comprising the following steps:
Step 1: acquire a knitted fabric sample image and preprocess it as follows:
(1) stretch the gray values of the sample image over the whole gray-level range by histogram equalization;
(2) apply the Top-Hat transform to the histogram-equalized image, then subtract the transform result from the pre-transform image to remove the influence of knots;
Step 2: binarize the image preprocessed in step 1 by the minimum skewness method, calculate the area of each connected region in the binary image, and delete the regions smaller than a threshold, thereby extracting the pill region image from the sample image;
Step 3: apply the distance transform to the pill region image and analyze the pixel properties of the resulting distance image to determine the bottleneck positions of the adhered regions, including the pixel connecting lines on the boundary and among the interior points;
Step 4: trace the connecting lines of the adhered regions by chain code, verify the validity of the connecting lines, and thereby determine the connecting lines and cutting points of the adhered regions;
Step 5: for the three different types of cutting points (boundary cutting points, adjacent-boundary cutting points and inner cutting points), apply the corresponding cutting methods to segment the pill particles in the adhered regions;
Step 6: calculate two or more pilling performance indexes of the pills from the pill particle segmentation results;
Step 7: build a BP neural network model in which the input neurons take the above performance indexes and the output neurons characterize the different pilling grades;
Step 8: take the weights and thresholds of the BP neural network as genes and apply real-number encoding;
Step 9: with the indexes of the standard knitted fabric pilling sample images as inputs to the BP neural network, use the genetic algorithm to optimize and train the initial weights, thresholds and structure of the BP neural network;
Step 10: with the indexes of a test image as inputs to the trained BP neural network, obtain the objective pilling grading result.
2. The objective knitted fabric pilling grading method based on image analysis according to claim 1, characterized in that a 3 × 3 cross-shaped structuring element is chosen for the Top-Hat transform in step 1.
3. The objective knitted fabric pilling grading method based on image analysis according to claim 1, characterized in that in step 2 the gray-level range of the preprocessed image is taken as 0 to G, the gray-level distribution probability densities of the target and the background are p_1(t) and p_2(t) respectively, p_1(t) and p_2(t) are assumed to follow normal distributions, the gray means of the target and the background are μ_1(t) and μ_2(t) respectively, the variances are σ_1²(t) and σ_2²(t) respectively, and the skewness indexes are K_31(t) and K_32(t); the gray levels of the image are divided by the threshold t into a target class and a background class; let h(i) (i = 0 to G) be the normalized histogram of the preprocessed image and θ(t) the ratio of target pixels to all image pixels; then:
θ(t) = Σ_{i=0}^{t} h(i)
μ_1(t) = Σ_{i=0}^{t} h(i)·i / θ(t),  μ_2(t) = Σ_{i=t+1}^{G} h(i)·i / (1 − θ(t))
σ_1²(t) = Σ_{i=0}^{t} [i − μ_1(t)]² h(i) / θ(t)
σ_2²(t) = Σ_{i=t+1}^{G} [i − μ_2(t)]² h(i) / (1 − θ(t))
K_31(t) = Σ_{i=0}^{t} [i − μ_1(t)]³ h(i) / ([σ_1(t)]³ θ(t))
K_32(t) = Σ_{i=t+1}^{G} [i − μ_2(t)]³ h(i) / ([σ_2(t)]³ (1 − θ(t)))
The skewness index of the image is defined as K_3(t) = |K_31(t)| + |K_32(t)|;
the threshold is obtained as thre = ArgMin[K_3(t)], 0 < t ≤ G;
the preprocessed image is binarized: f(i,j) = 1 if f(i,j) ≥ thre, and f(i,j) = 0 if f(i,j) < thre;
an area threshold is then selected and area filtering is performed on the image.
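A minimal Python sketch of the minimum-skewness threshold selection formulated above; the function name is illustrative, the histogram h(i) is assumed normalized, and degenerate splits (empty or zero-variance classes) are skipped:

```python
def min_skew_threshold(hist):
    """Return the threshold t minimizing K3(t) = |K31(t)| + |K32(t)|
    for a normalized histogram hist[0..G]."""
    G = len(hist) - 1
    best_t, best_k3 = None, float("inf")
    for t in range(G):
        theta = sum(hist[:t + 1])
        if theta <= 0 or theta >= 1:
            continue
        mu1 = sum(i * hist[i] for i in range(t + 1)) / theta
        mu2 = sum(i * hist[i] for i in range(t + 1, G + 1)) / (1 - theta)
        var1 = sum((i - mu1) ** 2 * hist[i] for i in range(t + 1)) / theta
        var2 = sum((i - mu2) ** 2 * hist[i]
                   for i in range(t + 1, G + 1)) / (1 - theta)
        if var1 == 0 or var2 == 0:
            continue
        k31 = sum((i - mu1) ** 3 * hist[i]
                  for i in range(t + 1)) / (var1 ** 1.5 * theta)
        k32 = sum((i - mu2) ** 3 * hist[i]
                  for i in range(t + 1, G + 1)) / (var2 ** 1.5 * (1 - theta))
        k3 = abs(k31) + abs(k32)
        if k3 < best_k3:
            best_k3, best_t = k3, t
    return best_t
```

For a histogram with two symmetric modes, the returned threshold falls in the valley between them, where both class skewness values vanish.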
4. The objective knitted fabric pilling grading method based on image analysis according to claim 1, characterized in that in step 5, for the three different types of cutting points (boundary cutting points, adjacent-boundary cutting points and inner cutting points), the corresponding cutting methods for segmenting the pill particles in the adhered regions are as follows:
The cutting steps for a boundary cutting point are:
1. let the boundary cutting point be point A; search counter-clockwise along the boundary ring of the 5 × 5 neighborhood centered at point A for the background points located where the distance values jump, denoted B and C;
2. in the 3 × 3 neighborhood of point B, find the background point nearest to point A, denoted D;
3. in the 3 × 3 neighborhood of point C, find the background point nearest to point A, denoted E;
4. draw a line in the background color connecting points D and E;
The cutting steps for an adjacent-boundary cutting point are:
1. let the adjacent-boundary cutting point be A; in the 3 × 3 neighborhood centered at this point, find the boundary point nearest to it, denoted B, which is the first endpoint of the cutting line;
2. find the point symmetric to point B about point A, denoted C;
3. in the 3 × 3 neighborhood of point C, find the boundary point nearest to it, denoted D, which is the other endpoint of the cutting line;
4. draw a line in the background color connecting the two endpoints B and D of the cutting line;
The cutting steps for an inner cutting point are:
1. let the inner cutting point be point A; compute the average chain code from the chain code values of the region connecting line centered at point A, and take it as the direction of the region connecting line;
2. add 2 to the average chain code so that the line rotates counter-clockwise by 90° and points outward from the region; search outward point by point along this direction for the point of the cutting line on the adjacent boundary line, i.e. the point whose distance value is 2, denoted B;
3. subtract 2 from the average chain code, or add 6 and take the result modulo 8, so that the line rotates clockwise by 90° and points outward toward the boundary on the opposite side of the region; search outward point by point along this direction for the point of the cutting line on the adjacent boundary line on the other side, denoted C;
4. starting from the points B and C found in steps 2 and 3 of this procedure, and subject to the iteration condition that the number of search steps may not exceed the distance value of the cutting point, find in the 3 × 3 neighborhoods of points B and C the boundary points nearest to point A, denoted D and E;
5. draw a line in the background color connecting points D and E to obtain the cutting line.
5. The objective knitted fabric pilling grading method based on image analysis according to claim 1, characterized in that in step 6 the performance indexes are the three indexes of total pill area, pill count, and standard deviation of pill area.
6. The objective knitted fabric pilling grading method based on image analysis according to claim 1, characterized in that, with WIH_ij denoting the connection weight between node i of the input layer and node j of the hidden layer, and WHO_ji the connection weight between node j of the hidden layer and node i of the output layer, step 9 proceeds as follows:
1. initialize the population P, including the crossover scale, the crossover probability Pc and the mutation probability Pm, and initialize every WIH_ij and WHO_ji;
2. compute the evaluation function of each individual and sort the individuals; select network individuals with the probability:
P_s = f_i / Σ_{i=1}^{N} f_i
where f_i is the fitness value of individual i, f_i = 1/E(i), E(i) = Σ_p Σ_k (V_k − T_k)²
and where i indexes the chromosomes, k the output-layer nodes, and p the learning samples; T_k is the teacher signal and V_k is the network output;
3. with probability Pc, perform crossover on individuals G_i and G_{i+1} to produce new individuals G_i′ and G_{i+1}′; individuals not selected for crossover are copied directly;
4. with probability Pm, mutate G_j to produce the new individual G_j′;
5. insert the new individuals into population P and compute their evaluation functions;
6. if the fitness value of a new individual exceeds the preset threshold, a satisfactory individual has been found and the procedure ends; otherwise return to step 3.
7. The objective knitted fabric pilling grading method based on image analysis according to claim 6, characterized in that the value of each gene of the initial population lies in (−1, 1), the initial population size is set to 30, the number of genetic generations is set to 700, and the genetic optimization criterion is a fitness value greater than 20; the learning rate of the BP neural network is set to 0.01, and the maximum number of network training iterations is 3000.
CN2008101539748A 2008-12-11 2008-12-11 Jersey wear flokkit and balling up grading method based on image analysis Expired - Fee Related CN101419706B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN2008101539748A CN101419706B (en) 2008-12-11 2008-12-11 Jersey wear flokkit and balling up grading method based on image analysis


Publications (2)

Publication Number Publication Date
CN101419706A CN101419706A (en) 2009-04-29
CN101419706B true CN101419706B (en) 2011-01-12

Family

ID=40630484

Family Applications (1)

Application Number Title Priority Date Filing Date
CN2008101539748A Expired - Fee Related CN101419706B (en) 2008-12-11 2008-12-11 Jersey wear flokkit and balling up grading method based on image analysis

Country Status (1)

Country Link
CN (1) CN101419706B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109166118A (en) * 2018-09-05 2019-01-08 深圳灵图慧视科技有限公司 Fabric surface attribute detection method, device and computer equipment

Families Citing this family (13)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN102509008A (en) * 2011-11-01 2012-06-20 浙江理工大学 Method for evaluating scratchiness of ramie fabrics objectively
CN103292701B (en) * 2013-06-24 2015-09-16 哈尔滨工业大学 The online dimension measurement method of accurate device based on machine vision
CN103472209B (en) * 2013-08-28 2015-07-08 东华大学 Measuring device and method for apparent fluffing and balling as well as abrasion loss of yarn
CN104657983B (en) * 2015-01-20 2017-08-08 浙江理工大学 A kind of fabric ball top Density Detection method filtered based on Gabor
CN105277560B (en) * 2015-10-23 2018-09-11 温州大学 A kind of fabric pilling grade evaluation analysis method and device
CN107945165B (en) * 2017-11-24 2019-10-11 常州大学 Textile flaw detection method based on peak value coverage values and areal calculation
CN107977961B (en) * 2017-11-24 2019-10-11 常州大学 Textile flaw detection method based on peak value coverage values and composite character
CN109840928B (en) * 2019-01-31 2023-10-17 北京达佳互联信息技术有限公司 Knitting image generation method and device, electronic equipment and storage medium
CN110706274A (en) * 2019-10-12 2020-01-17 国家羊绒产品质量监督检验中心 Fuzzing and pilling grading tester, testing system and testing method
CN115237083B (en) * 2022-09-23 2024-01-12 南通沐沐兴晨纺织品有限公司 Textile singeing process control method and system based on computer vision
CN116342583B (en) * 2023-05-15 2023-08-04 山东超越纺织有限公司 Anti-pilling performance detection method for spinning production and processing
CN116934749B (en) * 2023-09-15 2023-12-19 山东虹纬纺织有限公司 Textile flaw rapid detection method based on image characteristics
CN117273554B (en) * 2023-11-23 2024-04-19 江苏洁瑞雅纺织品有限公司 Textile production quality prediction method based on data identification

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5815198A (en) * 1996-05-31 1998-09-29 Vachtsevanos; George J. Method and apparatus for analyzing an image to detect and identify defects
US5936665A (en) * 1996-05-22 1999-08-10 Georgia Tech Research Corporation Automated apparatus for counting pillings in textile fabrics
CN1359000A (en) * 2002-01-14 2002-07-17 东华大学 System for estimating fabric pilling grade
US20040008870A1 (en) * 2002-06-24 2004-01-15 Arkady Cherkassky Electro-optical method and apparatus for evaluating protrusions of fibers from a fabric surface
CN1523352A (en) * 2003-09-12 2004-08-25 东华大学 Fabric planeness gradation objective evaluation method
CN101063660A (en) * 2007-01-30 2007-10-31 蹇木伟 Method for detecting textile defect and device thereof





Legal Events

Date Code Title Description
C06 Publication
PB01 Publication
C10 Entry into substantive examination
SE01 Entry into force of request for substantive examination
C14 Grant of patent or utility model
GR01 Patent grant
CF01 Termination of patent right due to non-payment of annual fee

Granted publication date: 20110112

Termination date: 20181211
