CN111681181A - Method for evaluating fabric surface pilling degree - Google Patents
- Publication number
- CN111681181A (application CN202010462300.7A)
- Authority
- CN
- China
- Prior art keywords
- fabric
- area
- image
- hair
- evaluating
- Prior art date
- Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
- Withdrawn
Classifications
- G06T7/11 — Region-based segmentation
- G06T5/40 — Image enhancement or restoration using histogram techniques
- G06T5/70 — Denoising; Smoothing
- G06T5/90 — Dynamic range modification of images or parts thereof
- G06T7/0004 — Industrial image inspection
- G06T7/13 — Edge detection
- G06T7/168 — Segmentation; Edge detection involving transform domain methods
- G06T7/62 — Analysis of geometric attributes of area, perimeter, diameter or volume
- G06T7/90 — Determination of colour characteristics
- G06T2207/20056 — Discrete and fast Fourier transform [DFT, FFT]
- G06T2207/20192 — Edge enhancement; Edge preservation
- G06T2207/30124 — Fabrics; Textile; Paper
- G06T2207/30168 — Image quality inspection

(All codes fall under G06T — Image data processing or generation, within G — Physics.)
Abstract
The invention provides a method for evaluating the degree of pilling on the surface of a fabric, comprising the following steps: S10, acquiring an input fabric surface image and preprocessing it; S20, coarsely segmenting the pilled balls in each channel of the preprocessed image by a spectrum filtering method in the Fourier transform domain; S30, enhancing the coarse segmentation result by histogram equalization; S40, extracting minimum-value regions of the enhanced fabric sample image with a fine segmentation algorithm based on maximally stable extremal region (MSER) detection, and eliminating small non-pill regions by morphological operations to obtain the fine segmentation result of the pills; S50, combining the pill detection results of the three RGB channels of the fabric sample image to obtain the final pill area; and S60, grading the pilling according to the pill area ratio and the standard deviation of the pill areas. The detection is accurate and reliable, and the pilling grade can be evaluated efficiently and automatically.
Description
Technical Field
The invention relates to the technical field of fabric inspection, and in particular to a method for evaluating the degree of pilling on the surface of a fabric.
Background
Evaluating the pilling performance of a fabric is an important task in textile inspection. The traditional method is to compare samples against standard photographs and assign a grade. This method is highly subjective: human factors introduce variation, so the degree of pilling cannot be traced accurately and quantitatively, and different testers may reach different conclusions for the same sample. With the development of digital image processing, computer-vision systems that grade pilling from grayscale fabric images have become the most common objective grading approach, owing to their low cost, simple operation, and portability. However, fabrics are diverse, the conditions for selecting pilled fabrics and acquiring pilling images are restrictive, the segmentation methods employed are imperfect, and environmental influences can prevent the pilling information from being identified accurately.
In summary, a method for evaluating the degree of pilling on the fabric surface that detects pills more accurately and reliably, achieves a better detection effect, and grades the pilling efficiently and automatically is a problem urgently awaiting a solution from those skilled in the art.
Disclosure of Invention
In view of the above problems and needs, the present invention provides a method for evaluating the degree of pilling on the surface of a fabric, which solves the above technical problems by adopting the following technical solutions.
To achieve this purpose, the invention provides the following technical solution: a method for evaluating the degree of pilling on the fabric surface, comprising the following specific steps:
S10, acquiring an input fabric surface image and preprocessing it to enhance the contrast between the pilled balls and the background, obtaining a preprocessed image;
S20, coarsely segmenting the pilled balls in each channel of the preprocessed image by a spectrum filtering method in the Fourier transform domain;
S30, enhancing the coarse segmentation result by histogram equalization;
S40, extracting minimum-value regions of the enhanced fabric sample image with a fine segmentation algorithm based on maximally stable extremal region (MSER) detection, and eliminating small non-pill regions by morphological operations to obtain the fine segmentation result of the pills;
S50, combining the pill detection results of the three RGB channels of the fabric sample image to obtain the final pill area;
and S60, grading the pilling according to the pill area ratio and the standard deviation of the pill areas.
Further, the preprocessing comprises denoising and edge enhancement of the fabric surface image, yielding the preprocessed image of the fabric surface.
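The patent does not name the specific denoising or edge-enhancement filters. The sketch below assumes a Gaussian blur for denoising and an unsharp mask for edge enhancement; the function name `preprocess` and the `sigma`/`amount` parameters are illustrative, not from the patent:

```python
import numpy as np

def preprocess(img, sigma=1.0, amount=0.8):
    """Assumed preprocessing for S10: Gaussian denoising followed by
    unsharp-mask edge enhancement."""
    img = img.astype(float)
    # build a small 1-D Gaussian kernel and apply it separably
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    pad = np.pad(img, radius, mode='edge')
    rows = np.apply_along_axis(lambda r: np.convolve(r, k, 'valid'), 1, pad)
    blur = np.apply_along_axis(lambda c: np.convolve(c, k, 'valid'), 0, rows)
    # unsharp mask: add back a fraction of the high-frequency residual
    return blur + amount * (img - blur)
```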
Further, the step S20 specifically includes:
S21, logarithmically stretching the preprocessed image according to the formula U(x, y) = a·log(1 + I(x, y)), where a is a fixed coefficient;
S22, computing the log spectrum according to the formula F_U(u, v) = log[1 + |F(u, v)|], where F(u, v) is the Fourier transform of U(x, y);
S23, filtering out the frequency signals representing the fabric texture using the relation between the spectrum amplitude and a preset threshold T;
S24, restoring, in the thresholded spectrogram, the frequency signals representing the pilled balls within a region of radius r around the center of the spectrum;
S25, applying the inverse Fourier transform to the restored spectrogram to obtain the segmentation result I₀(x, y), and contrast-stretching the result according to s(x, y) = 1 / (1 + (m / I₀(x, y))^e), where m is the image mean and e is a parameter controlling the slope of the curve.
Still further, the step of determining the radius r for restoring the pill frequency signals comprises: for each pixel (i, j) of F_U(u, v), computing the maximum of F_U(u, v) within an N×N window centered on that pixel; if the window maximum equals the value at the pixel, the pixel is a local maximum of F_U. All local maxima are then screened with a threshold of 0.8 times the maximum pixel value, keeping only those whose value exceeds the threshold. Finally, the distances from all remaining local maxima (excluding the center) to the center of F_U(u, v) are computed, and the minimum distance is taken as the radius r.
Further, the step S40 specifically includes:
S41, sorting all pixels of the enhanced image, and computing and updating the connected-component structure and connected-component areas with a union-find algorithm;
S42, computing the rate of change of the connected-component area for all connected regions Z_1, Z_2, …, Z_n according to the formula w(i) = |Z_{i+Δ} − Z_{i−Δ}| / Z_i, thereby obtaining the maximally stable minimum-value regions, where each region is a subregion of the next, i.e. Z_i ⊂ Z_{i+1}.
Further, after the small non-pill regions are removed by the morphological method, the detected regions are translated a certain distance along the x and y axes respectively, and spurious detections at the boundary are removed, correcting the regions to obtain the fine segmentation result of the pills.
Further, the pill detection results of the three RGB channels of the fabric sample image are combined with a logical OR operation to obtain the accurate pill area.
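The OR combination of the three channel results can be illustrated with toy per-channel masks (the mask values are invented for illustration): a pixel belongs to the final pill area if any of the R, G, or B detections marks it.

```python
import numpy as np

# Toy per-channel pill detection masks (illustrative values)
r_mask = np.array([[1, 0], [0, 0]], dtype=bool)
g_mask = np.array([[0, 1], [0, 0]], dtype=bool)
b_mask = np.array([[0, 0], [1, 0]], dtype=bool)

# Logical OR of the three channel detections gives the final pill area
final = r_mask | g_mask | b_mask
```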
Further, the step S60 specifically includes:
S61: counting the area S_i of each pill in the image from its number of pixels; with N pills, the total pill area is S = S_1 + S_2 + … + S_N, and the pill area ratio is Q = S/A, where A is the area of the fabric sample;
S62: the standard deviation of the pill areas reflects the number, size, variability, and spatial distribution of the pills; the pilling grade of the test sample is judged from this standard deviation, computed as σ = √((1/N) Σ_i (S_i − S̄)²), where S̄ is the mean pill area.
The invention detects the pills more accurately and reliably, achieves a better detection effect, and evaluates the pilling grade efficiently and automatically.
Preferred embodiments of the invention are described in detail below with reference to the accompanying drawings, so that the features and advantages of the invention can be readily understood.
Drawings
In order to more clearly illustrate the technical solutions of the embodiments of the present invention, the drawings of the embodiments of the present invention will be briefly described below. Wherein the drawings are only for purposes of illustrating some embodiments of the invention and are not to be construed as limiting the invention to all embodiments thereof.
FIG. 1 is a schematic diagram of the steps of the method for evaluating the pilling degree of the fabric surface according to the present invention.
FIG. 2 is a schematic diagram of the rough segmentation method according to the present invention.
Fig. 3 is a schematic flow chart of determining the radius r used to restore the pill frequency signals in this embodiment.
Fig. 4 shows the test pictures of the fabric in this embodiment, where a is the original image of the first test fabric, b is the original image of the second test fabric, c is the grayscale image of a, d is the grayscale image after logarithmic transformation, e is the coarse segmentation without contrast stretching, f is the coarse segmentation with contrast stretching, g is the fine segmentation without morphological operations and correction, and h is the fine segmentation with morphological operations and correction.
Detailed Description
In order to make the objects, technical solutions and advantages of the embodiments of the present invention clearer, the technical solutions of the embodiments of the present invention will be clearly and completely described below with reference to the accompanying drawings of specific embodiments of the present invention. Like reference symbols in the various drawings indicate like elements. It should be noted that the described embodiments are only some embodiments of the invention, and not all embodiments. All other embodiments, which can be derived by a person skilled in the art from the described embodiments of the invention without any inventive step, are within the scope of protection of the invention.
The invention provides a method for evaluating the degree of pilling on the fabric surface that detects pills accurately and reliably, achieves a better detection effect, and evaluates the pilling grade efficiently and automatically. As shown in Figs. 1 to 4, the method comprises the following specific steps: S10, acquiring an input fabric surface image and preprocessing it to enhance the contrast between the pilled balls and the background, obtaining a preprocessed image; the preprocessing comprises denoising and edge enhancement of the fabric surface image;
S20, coarsely segmenting the pilled balls in each channel of the preprocessed image by a spectrum filtering method in the Fourier transform domain;
The step S20 specifically includes:
S21, logarithmically stretching the preprocessed image according to the formula U(x, y) = a·log(1 + I(x, y)), where a is a fixed coefficient;
S22, computing the log spectrum according to the formula F_U(u, v) = log[1 + |F(u, v)|], where F(u, v) is the Fourier transform of U(x, y);
S23, filtering out the frequency signals representing the fabric texture using the relation between the spectrum amplitude and a preset threshold T;
S24, restoring, in the thresholded spectrogram, the frequency signals representing the pilled balls within a region of radius r around the center of the spectrum;
S25, applying the inverse Fourier transform to the restored spectrogram to obtain the segmentation result I₀(x, y), and contrast-stretching the result according to s(x, y) = 1 / (1 + (m / I₀(x, y))^e), where m is the image mean and e is a parameter controlling the slope of the curve;
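The spectrum-filtering pipeline of steps S21–S25 can be sketched in NumPy as follows. The patent gives only the formula shapes, so the default threshold rule (0.8 of the spectrum maximum), the disc-restoration handling, and the normalization applied before contrast stretching are assumptions:

```python
import numpy as np

def rough_segment(I, a=1.0, T=None, r=3, e=4.0):
    """Sketch of S21-S25; a, T, r, e follow the text, defaults are assumed."""
    # S21: logarithmic stretch U(x, y) = a * log(1 + I(x, y))
    U = a * np.log1p(I.astype(float))
    # Fourier transform, shifted so the DC term sits at the image centre
    F = np.fft.fftshift(np.fft.fft2(U))
    # S22: log spectrum F_U(u, v) = log(1 + |F(u, v)|)
    FU = np.log1p(np.abs(F))
    if T is None:
        T = 0.8 * FU.max()          # assumed default threshold
    # S23: suppress the strong periodic peaks that encode the woven texture
    keep = FU < T
    # S24: always restore the disc of radius r around the spectrum centre,
    # where the slowly varying pill signal lives
    cy, cx = FU.shape[0] // 2, FU.shape[1] // 2
    yy, xx = np.ogrid[:FU.shape[0], :FU.shape[1]]
    keep |= (yy - cy) ** 2 + (xx - cx) ** 2 <= r ** 2
    # S25: inverse transform, then contrast stretch s = 1 / (1 + (m / v)^e)
    I0 = np.real(np.fft.ifft2(np.fft.ifftshift(F * keep)))
    v = (I0 - I0.min()) / (I0.max() - I0.min() + 1e-12) + 1e-6  # to (0, 1]
    m = v.mean()
    return 1.0 / (1.0 + (m / v) ** e)
```

The output lies in (0, 1), with dark (pill-candidate) regions pushed toward 0 and bright background toward 1, ready for the histogram equalization of S30.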
S30, enhancing the coarse segmentation result by histogram equalization;
S40, extracting minimum-value regions of the enhanced fabric sample image with a fine segmentation algorithm based on maximally stable extremal region (MSER) detection, and eliminating small non-pill regions by morphological operations to obtain the fine segmentation result of the pills;
the step S40 specifically includes:
s41, sequencing all pixel points in the enhanced image, and calculating and updating a connected component structure and a connected component area by adopting a parallel-searching algorithm;
s42, according to the formula w (i) ═ Zi+△-Zi-△|/ZiFor all connected regions Z1,Z2… ZnCalculating the change rate of the area of the connected component to further obtain a maximum stable minimum value area, wherein the first area is a sub-area of the next area, namely Zi∈Zi+1;
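Steps S41–S42 describe an MSER-style search for maximally stable minimum-value regions. The toy sketch below tracks the area of the dark component containing a single seed pixel as the threshold rises and picks the threshold minimizing w(i) = |Z(i+Δ) − Z(i−Δ)| / Z(i); a full implementation would instead sort all pixels once and maintain every component with union-find, as S41 states. The function name and the flood-fill shortcut are illustrative:

```python
import numpy as np
from collections import deque

def stable_minimum_region(img, seed, delta=2):
    """Toy S41-S42 sketch: area of the <=t component containing `seed`
    for each threshold t, then the threshold with the smallest area
    change rate w(i) = |Z_{i+delta} - Z_{i-delta}| / Z_i."""
    levels = range(int(img.min()) + 1, int(img.max()))
    areas = []
    for t in levels:
        if img[seed] > t:           # seed not yet below threshold
            areas.append(0)
            continue
        # flood fill the <= t component containing the seed (4-connectivity)
        seen = {seed}
        q = deque([seed])
        while q:
            y, x = q.popleft()
            for ny, nx in ((y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)):
                if (0 <= ny < img.shape[0] and 0 <= nx < img.shape[1]
                        and (ny, nx) not in seen and img[ny, nx] <= t):
                    seen.add((ny, nx))
                    q.append((ny, nx))
        areas.append(len(seen))
    # rate of change of the nested areas Z_i subset Z_{i+1}
    best_i, best_w = None, np.inf
    for i in range(delta, len(areas) - delta):
        if areas[i] == 0:
            continue
        w = abs(areas[i + delta] - areas[i - delta]) / areas[i]
        if w < best_w:
            best_i, best_w = i, w
    return list(levels)[best_i], best_w
```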
S50, combining the pill detection results of the three RGB channels of the fabric sample image to obtain the final pill area;
S60, grading the pilling according to the pill area ratio and the standard deviation of the pill areas; here the pill detection results of the three RGB channels of the fabric sample image are combined with a logical OR operation to obtain the accurate pill area.
The step S60 specifically includes:
S61: counting the area S_i of each pill in the image from its number of pixels; with N pills, the total pill area is S = S_1 + S_2 + … + S_N, and the pill area ratio is Q = S/A, where A is the area of the fabric sample;
S62: the standard deviation of the pill areas reflects the number, size, variability, and spatial distribution of the pills; the pilling grade of the test sample is judged from this standard deviation, computed as σ = √((1/N) Σ_i (S_i − S̄)²), where S̄ is the mean pill area.
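The grading features of S61–S62 reduce to simple arithmetic once the pill areas are counted. A minimal sketch, taking the per-pill pixel counts S_i as input (the grading formula itself is not reproduced legibly in the source, so the population standard deviation is assumed; the function name is illustrative):

```python
import numpy as np

def pilling_features(areas, A):
    """S61-S62 sketch: `areas` are the pixel counts S_i of the N detected
    pills, A is the sample area in pixels.  Returns (Q, sigma)."""
    S_i = np.asarray(areas, dtype=float)
    N = len(S_i)
    S = S_i.sum()                 # S61: total pill area S = sum of S_i
    Q = S / A                     # S61: pill area ratio Q = S / A
    # S62: standard deviation of the pill areas (population form assumed)
    sigma = np.sqrt(((S_i - S_i.mean()) ** 2).sum() / N)
    return Q, sigma
```

The grade thresholds on Q and σ are not given in the text and would be calibrated against standard graded samples.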
In the method, as shown in the flow chart of Fig. 3, the radius r for restoring the pill frequency signals is determined as follows: for each pixel (i, j) of F_U(u, v), the maximum of F_U(u, v) within an N×N window centered on that pixel is computed; if the window maximum equals the value at the pixel, the pixel is a local maximum of F_U. All local maxima are screened with a threshold of 0.8 times the maximum pixel value, keeping only those whose value exceeds the threshold. Finally, the distances from all remaining local maxima (excluding the center) to the center of F_U(u, v) are computed, and the minimum distance is taken as the radius r.
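The radius rule above can be sketched directly: scan N×N windows for local maxima of the log spectrum, keep those above 0.8 of the global maximum, and return the smallest centre distance. The brute-force window scan and the function name are illustrative:

```python
import numpy as np

def restore_radius(FU, N=5):
    """Sketch of the radius-r rule from the text: minimum distance from a
    kept local maximum of F_U (excluding the centre) to the spectrum centre."""
    h, w = FU.shape
    cy, cx = h // 2, w // 2
    half = N // 2
    thresh = 0.8 * FU.max()       # 0.8 of the maximum pixel value
    best = None
    for y in range(half, h - half):
        for x in range(half, w - half):
            win = FU[y - half:y + half + 1, x - half:x + half + 1]
            # local maximum of the N x N window, above threshold, not centre
            if FU[y, x] == win.max() and FU[y, x] > thresh and (y, x) != (cy, cx):
                d = np.hypot(y - cy, x - cx)
                best = d if best is None else min(best, d)
    return best
```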
After the small non-pill regions are removed by the morphological method, the detected regions are translated a certain distance along the x and y axes respectively, and spurious detections at the boundary are removed, correcting the regions to obtain the fine segmentation result of the pills.
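A minimal sketch of this post-processing: a 3×3 morphological opening removes small specks, then the detections are translated by (dx, dy) to compensate for the pill-shadow offset, clearing regions pushed over the border. The shift amounts and the 3×3 structuring element are assumptions, since the text only says "a certain distance":

```python
import numpy as np

def _dilate(m):
    # 3x3 binary dilation via shifted ORs
    out = m.copy()
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            out |= np.roll(np.roll(m, dy, 0), dx, 1)
    return out

def _erode(m):
    # erosion is the complement of the dilation of the complement
    return ~_dilate(~m)

def clean_and_shift(mask, dx=1, dy=1):
    """Opening (erode then dilate) removes specks; the translation by
    (dx, dy) is the assumed shadow-offset correction."""
    opened = _dilate(_erode(mask.astype(bool)))
    shifted = np.zeros_like(opened)
    h, w = opened.shape
    shifted[max(dy, 0):h + min(dy, 0), max(dx, 0):w + min(dx, 0)] = \
        opened[max(-dy, 0):h + min(-dy, 0), max(-dx, 0):w + min(-dx, 0)]
    return shifted
```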
The overall variation of the illumination component of a pilled-fabric image is gentle and occupies the low-frequency part of the image, whereas the rapidly varying high-frequency part generally carries the pill and texture information; the low-frequency information therefore needs to be filtered to eliminate uneven illumination in the fabric image. In step S23, the frequency signals representing the fabric texture are filtered out using the relation between the spectrum amplitude and the preset threshold T.
Experiments with an embodiment of the grading system: the experiments were run on a Pentium 3.2 GHz PC in the MATLAB R2014a environment. The test pictures are two 500 × 500 images of pilled fabric surfaces, as shown in Fig. 4. The images are first preprocessed: applying the logarithmic transformation to the grayscale image c of picture a yields the grayscale image d, and comparing the images before and after the transformation shows that the texture is clearer after the logarithmic transformation, which facilitates the subsequent processing. Contrast stretching the segmentation result enhances the contrast with the background: image e is the coarse segmentation result without contrast stretching, and the pill regions are more distinct in image f, the coarse segmentation result with contrast stretching. After coarse segmentation, the pills must be finely segmented in a further step; however, because the pills closely resemble the fabric, pill and fabric-surface pixels share very similar values over a very wide threshold range, so a traditional single-threshold segmentation based on pixel values cannot achieve a good segmentation.
The method based on maximally stable extremal regions relies on the local characteristics of the pills and achieves a better segmentation without depending on a single threshold. Image g shows that the accurately segmented image still contains many fine interference regions that do not belong to the pill areas; these are removed by morphological operations. Moreover, because of the illumination, the detected region is actually the region of the pill's shadow, so the detected regions are translated a certain distance along the x and y axes respectively to correct them, yielding the final clear and accurate image h.
It should be noted that the described embodiments are only preferred ways of implementing the invention; all obvious modifications that fall within the scope of the invention are included in the present general inventive concept.
Claims (8)
1. A method for evaluating the degree of pilling on the fabric surface, characterized by comprising the following specific steps:
S10, acquiring an input fabric surface image and preprocessing it to enhance the contrast between the pilled balls and the background, obtaining a preprocessed image;
S20, coarsely segmenting the pilled balls in each channel of the preprocessed image by a spectrum filtering method in the Fourier transform domain;
S30, enhancing the coarse segmentation result by histogram equalization;
S40, extracting minimum-value regions of the enhanced fabric sample image with a fine segmentation algorithm based on maximally stable extremal region (MSER) detection, and eliminating small non-pill regions by morphological operations to obtain the fine segmentation result of the pills;
S50, combining the pill detection results of the three RGB channels of the fabric sample image to obtain the final pill area;
and S60, grading the pilling according to the pill area ratio and the standard deviation of the pill areas.
2. The method for evaluating the degree of pilling on the fabric surface according to claim 1, characterized in that the preprocessing comprises denoising and edge enhancement of the fabric surface image, yielding the preprocessed image of the fabric surface.
3. The method for evaluating the degree of pilling on the fabric surface according to claim 1, characterized in that the step S20 specifically comprises:
S21, logarithmically stretching the preprocessed image according to the formula U(x, y) = a·log(1 + I(x, y)), where a is a fixed coefficient;
S22, computing the log spectrum according to the formula F_U(u, v) = log[1 + |F(u, v)|], where F(u, v) is the Fourier transform of U(x, y);
S23, filtering out the frequency signals representing the fabric texture using the relation between the spectrum amplitude and a preset threshold T;
S24, restoring, in the thresholded spectrogram, the frequency signals representing the pilled balls within a region of radius r around the center of the spectrum.
4. The method for evaluating the degree of pilling on the fabric surface according to claim 3, characterized in that the step of determining the radius r for restoring the pill frequency signals comprises: for each pixel (i, j) of F_U(u, v), computing the maximum of F_U(u, v) within an N×N window centered on that pixel; if the window maximum equals the value at the pixel, the pixel is a local maximum of F_U; screening all local maxima with a threshold of 0.8 times the maximum pixel value, keeping only those whose value exceeds the threshold; and computing the distances from all remaining local maxima (excluding the center) to the center of F_U(u, v), the minimum distance being taken as the radius r.
5. The method for evaluating the degree of pilling on the fabric surface according to claim 1, characterized in that the step S40 specifically comprises:
S41, sorting all pixels of the enhanced image, and computing and updating the connected-component structure and connected-component areas with a union-find algorithm;
S42, computing the rate of change of the connected-component area for all connected regions Z_1, Z_2, …, Z_n according to the formula w(i) = |Z_{i+Δ} − Z_{i−Δ}| / Z_i, thereby obtaining the maximally stable minimum-value regions, where each region is a subregion of the next, i.e. Z_i ⊂ Z_{i+1}.
6. The method for evaluating the degree of pilling on the fabric surface according to claim 1, characterized in that after the small non-pill regions are removed by the morphological method, the detected regions are translated a certain distance along the x and y axes respectively, and spurious detections at the boundary are removed, correcting the regions to obtain the fine segmentation result of the pills.
7. The method for evaluating the degree of pilling on the fabric surface according to claim 1, characterized in that the pill detection results of the three RGB channels of the fabric sample image are combined with a logical OR operation to obtain the accurate pill area.
8. The method for evaluating the degree of pilling on the fabric surface according to claim 1, characterized in that the step S60 specifically comprises:
S61: counting the area S_i of each pill in the image from its number of pixels; with N pills, the total pill area is S = S_1 + S_2 + … + S_N, and the pill area ratio is Q = S/A, where A is the area of the fabric sample;
S62: the standard deviation of the pill areas reflects the number, size, variability, and spatial distribution of the pills; the pilling grade of the test sample is judged from this standard deviation, computed as σ = √((1/N) Σ_i (S_i − S̄)²), where S̄ is the mean pill area.
Priority Applications (1)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
CN202010462300.7A CN111681181A (en) | 2020-05-27 | 2020-05-27 | Method for evaluating fabric surface pilling degree |
Publications (1)
Publication Number | Publication Date |
---|---|
CN111681181A true CN111681181A (en) | 2020-09-18 |
Family
ID=72453579
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
CN202010462300.7A Withdrawn CN111681181A (en) | 2020-05-27 | 2020-05-27 | Method for evaluating fabric surface pilling degree |
Country Status (1)
Country | Link |
---|---|
CN (1) | CN111681181A (en) |
- 2020-05-27: application CN202010462300.7A filed; published as CN111681181A (status: withdrawn, not active)
Cited By (5)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
CN112927180A (en) * | 2020-10-15 | 2021-06-08 | 内蒙古鄂尔多斯资源股份有限公司 | Cashmere and wool optical microscope image identification method based on generation countermeasure network |
CN112927180B (en) * | 2020-10-15 | 2022-11-15 | 内蒙古鄂尔多斯资源股份有限公司 | Cashmere and wool optical microscope image identification method based on generation of confrontation network |
CN114820459A (en) * | 2022-03-31 | 2022-07-29 | 江苏本峰新材料科技有限公司 | Aluminum veneer polishing quality evaluation method and system based on computer assistance |
CN117079144A (en) * | 2023-10-17 | 2023-11-17 | 深圳市城市交通规划设计研究中心股份有限公司 | Linear crack extraction method for asphalt pavement detection image under non-uniform illumination |
CN117079144B (en) * | 2023-10-17 | 2023-12-26 | 深圳市城市交通规划设计研究中心股份有限公司 | Linear crack extraction method for asphalt pavement detection image under non-uniform illumination |
Legal Events
Date | Code | Title | Description
---|---|---|---
| PB01 | Publication | Application publication date: 20200918 |
| SE01 | Entry into force of request for substantive examination | |
| WW01 | Invention patent application withdrawn after publication | |