CN110728302A - Method for identifying color textile fabric tissue based on HSV (hue, saturation, value) and Lab color spaces - Google Patents


Info

Publication number
CN110728302A
CN110728302A (application CN201910853886.7A)
Authority
CN
China
Prior art keywords
lab
hsv
color
moment
image
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN201910853886.7A
Other languages
Chinese (zh)
Inventor
袁理 (Yuan Li)
龚雪 (Gong Xue)
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Wuhan Textile University
Original Assignee
Wuhan Textile University
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Wuhan Textile University filed Critical Wuhan Textile University
Priority to CN201910853886.7A
Publication of CN110728302A
Legal status: Pending

Classifications

    • G06F 18/2411 (Pattern recognition; classification techniques based on the proximity to a decision surface, e.g. support vector machines)
    • G06F 18/253 (Pattern recognition; fusion techniques of extracted features)
    • G06V 10/30 (Image or video recognition or understanding; image preprocessing; noise filtering)
    • G06V 10/50 (Extraction of image or video features by performing operations within image blocks; by using histograms, e.g. histogram of oriented gradients [HoG])
    • G06V 10/56 (Extraction of image or video features relating to colour)

Abstract

The invention belongs to the technical field of image processing and relates to a method for identifying the tissue of a color textile fabric based on the HSV and Lab color spaces; it is applicable to fields such as fabric tissue identification, flaw detection and image retrieval. The method first applies filtering and denoising pretreatment to the collected fabric images, then segments the tissue points of the fabric in the HSV and Lab color spaces, extracts channel components with the same properties and fuses them to obtain the local texture features and global color features of each tissue point, and finally fuses and identifies the texture and color features with a multi-kernel support vector machine. The method can effectively complete the tissue identification of colored textile fabrics and has wide applicability.

Description

Method for identifying color textile fabric tissue based on HSV (hue, saturation, value) and Lab color spaces
Technical Field
The invention relates to a fabric tissue identification method, in particular to a method for identifying the tissue of a color textile fabric based on the HSV and Lab color spaces.
Background
A colored woven fabric is woven from yarns blended from two or more fibers of different colors. The dyed fibers are distributed in a complex way and the colors are rich and layered, so color textile fabrics carry abundant color information and complex texture characteristics. Several color spaces used together can represent the color model and texture information more comprehensively and accurately, whereas a single color space has certain limitations. The RGB color space is defined according to the colors perceived by the human eye; it can represent most colors but is device-dependent. The Lab color space is a device-independent color system that describes human visual perception numerically and has a wider color gamut. The HSV color space, proposed to digitize color information more effectively, describes color more naturally than RGB does. Tissue identification of colored textile fabrics mainly comprises the extraction of characteristic parameters, the fusion of texture features with color information, and the tissue identification itself.
The extraction of characteristic parameters mainly covers texture features and color information. Texture feature extraction falls into four major categories: structural methods, model methods, statistical methods and signal processing methods. The structural method analyzes texture primitives and classifies well only on regular textures. The model method extracts texture features by estimating specific model parameters, but solving for the parameters is difficult, computationally heavy and inefficient. The statistical method mainly describes the gray-level attributes of pixels and their neighborhoods; it is very stable, but the statistical feature data are voluminous and redundant. The signal processing method mainly builds multi-resolution representations of the texture and carries a large computational load. Color feature extraction mainly uses color histograms and color moments. The color histogram reflects the color distribution of the image, but as a global color statistic it loses the local features between pixels. Color moments describe the color distribution with the first, second and third moments; they have few dimensions and are often used together with other features.
Fusion can take place at the feature level, the score level or the decision level. Feature-level fusion extracts effective feature information from the image and then fuses that information; if the feature vectors to be fused are high-dimensional and of different types, the fusion result is unreliable. Score-level fusion matches the extracted feature vectors against a corresponding database to obtain matching scores and then fuses the scores; different feature types can produce large differences between score ranges. Decision-level fusion combines the recognition results of the individual classifiers; it is stable, but achieves a good recognition effect only on single-sample features.
Pattern recognition is mainly divided into supervised learning and unsupervised learning. Supervised learning fits a function model to known, labeled data and then uses it to identify new data; it requires a large amount of labeled data for training. Unsupervised learning, by contrast, does not require the sample labels in advance.
Disclosure of Invention
A method for identifying the tissue of a colored textile fabric based on the HSV and Lab color spaces comprises the following steps:
step 1: collecting an image of a colored textile fabric sample;
step 2: performing noise reduction treatment on the acquired sample image through median filtering based on MATLAB, and filtering out speckle noise or salt and pepper noise in the image;
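The median filtering of step 2 can be sketched as follows; this is a minimal illustration (the patent performs the filtering in MATLAB, and the 3×3 window and the `median_filter3` helper name here are assumptions, not taken from the source):

```python
import numpy as np

def median_filter3(img):
    """3x3 median filter with edge replication, a stand-in for MATLAB's medfilt2."""
    padded = np.pad(img, 1, mode="edge")
    out = np.empty_like(img)
    h, w = img.shape
    for y in range(h):
        for x in range(w):
            # median of the 3x3 neighbourhood centred on (y, x)
            out[y, x] = np.median(padded[y:y + 3, x:x + 3])
    return out

# A flat patch corrupted by one salt impulse: the filter removes it.
patch = np.full((5, 5), 100, dtype=np.uint8)
patch[2, 2] = 255  # salt noise
clean = median_filter3(patch)
```

Median filtering suits this step because it removes impulse (salt-and-pepper) noise without blurring yarn edges, unlike linear smoothing.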
step 3: converting an RGB image of the fabric into a two-dimensional gray image, and then equalizing the histogram of the gray image to enhance yarn boundary information;
step 4: carrying out horizontal and vertical projection on the fabric image from step 3 by a gray projection method to obtain gray-level accumulation curves in the warp and weft directions, smoothing these curves, taking the valley points of the smoothed curves as yarn gaps, and finally extracting the valley points of both the warp-direction and weft-direction curves to complete the positioning of the fabric tissue points;
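The projection-and-valley step can be sketched as below. The moving-average smoothing and the strict local-minimum test are illustrative assumptions; the patent does not specify how the curves are smoothed:

```python
import numpy as np

def projection_valleys(gray, axis, win=3):
    """Accumulate gray levels along one axis, smooth the profile with a
    moving average, and return the interior local minima (candidate yarn gaps)."""
    profile = gray.sum(axis=axis).astype(float)
    smooth = np.convolve(profile, np.ones(win) / win, mode="same")
    return [i for i in range(1, len(smooth) - 1)
            if smooth[i] < smooth[i - 1] and smooth[i] < smooth[i + 1]]

# Synthetic "fabric": bright yarns with darker gaps centred on columns 3 and 7.
img = np.full((10, 11), 200, dtype=np.uint8)
img[:, [2, 4, 6, 8]] = 100   # gap shoulders
img[:, [3, 7]] = 20          # gap centres
cols = projection_valleys(img, axis=0)   # axis=0 sums over rows -> column profile
```

Running the same function with `axis=1` gives the row profile, i.e. the yarn gaps in the other direction.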
step 5: converting the color textile fabric image from step 2 from the RGB color space to the HSV color space and the Lab color space respectively, and segmenting the fabric tissue-point images from the whole fabric image in each color space with the tissue-point positioning method of step 4, wherein the selected tissue points cover at least one weave repeat (tissue cycle) and are numbered in sequence;
step 6: extracting texture features under 2 color spaces for each tissue point respectively; the method comprises the following substeps:
step 6.1: firstly extracting a pseudo gray image with the LBP operator from the V channel of the HSV color space, then selecting pixel pairs with direction θ = 0°, 45°, 90°, 135° and distance d = 1 for the gray co-occurrence matrix, calculating the probability of each pixel pair occurring together, and obtaining the co-occurrence matrix feature parameters in the 4 directions: energy ASM, inverse difference moment IDM, contrast CON and correlation COR, defined as follows:
$$ASM=\sum_{i=1}^{k}\sum_{j=1}^{k}P(i,j)^{2}$$
$$IDM=\sum_{i=1}^{k}\sum_{j=1}^{k}\frac{P(i,j)}{1+(i-j)^{2}}$$
$$CON=\sum_{i=1}^{k}\sum_{j=1}^{k}(i-j)^{2}P(i,j)$$
$$COR=\frac{\sum_{i=1}^{k}\sum_{j=1}^{k}ij\,P(i,j)-\mu_{x}\mu_{y}}{\sigma_{x}\sigma_{y}}$$
wherein
$$\mu_{x}=\sum_{i=1}^{k}i\sum_{j=1}^{k}P(i,j),\qquad \mu_{y}=\sum_{j=1}^{k}j\sum_{i=1}^{k}P(i,j)$$
$$\sigma_{x}^{2}=\sum_{i=1}^{k}(i-\mu_{x})^{2}\sum_{j=1}^{k}P(i,j),\qquad \sigma_{y}^{2}=\sum_{j=1}^{k}(j-\mu_{y})^{2}\sum_{i=1}^{k}P(i,j)$$
where P(i, j) is the normalized probability of the gray-level pair (i, j) co-occurring and k is the number of gray levels;
therefore, a 16-dimensional feature vector (4 parameters × 4 directions) can be extracted from the V channel, denoted W_HSV-V;
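The co-occurrence statistics of step 6.1 can be sketched for a single offset as follows (the `glcm` and `glcm_features` names and the two-level toy image are illustrative assumptions; the LBP pre-processing step is omitted here):

```python
import numpy as np

def glcm(gray, dx, dy, levels):
    """Normalized gray-level co-occurrence matrix for the offset (dx, dy)."""
    h, w = gray.shape
    P = np.zeros((levels, levels), dtype=float)
    for y in range(h):
        for x in range(w):
            y2, x2 = y + dy, x + dx
            if 0 <= y2 < h and 0 <= x2 < w:
                P[gray[y, x], gray[y2, x2]] += 1
    return P / P.sum()

def glcm_features(P):
    """Energy (ASM), inverse difference moment (IDM), contrast (CON), correlation (COR)."""
    k = P.shape[0]
    i, j = np.indices((k, k))
    asm = (P ** 2).sum()
    idm = (P / (1.0 + (i - j) ** 2)).sum()
    con = ((i - j) ** 2 * P).sum()
    mu_i, mu_j = (i * P).sum(), (j * P).sum()
    sd_i = np.sqrt(((i - mu_i) ** 2 * P).sum())
    sd_j = np.sqrt(((j - mu_j) ** 2 * P).sum())
    cor = ((i * j * P).sum() - mu_i * mu_j) / (sd_i * sd_j)
    return asm, idm, con, cor

# A binary checkerboard: horizontally adjacent pixels always differ,
# so contrast is maximal and correlation is -1.
board = np.indices((4, 4)).sum(axis=0) % 2
asm, idm, con, cor = glcm_features(glcm(board, dx=1, dy=0, levels=2))
```

Repeating this for the four offsets (0°, 45°, 90°, 135°) and stacking the four values per offset yields the 16-dimensional vector described in step 6.1.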
Step 6.2: extracting a pseudo gray image with the LBP operator from the L channel of the Lab color space, selecting pixel pairs with direction θ = 0°, 45°, 90°, 135° and distance d = 1 for the gray co-occurrence matrix, and calculating the co-occurrence probabilities to obtain the same four feature parameters (energy, inverse difference moment, contrast and correlation) in the 4 directions, thereby extracting a 16-dimensional feature vector on the L channel, denoted W_Lab-L; finally W_HSV-V and W_Lab-L are fused according to the following formula:
$$W=\tfrac{1}{2}\bigl(W_{HSV\text{-}V}+W_{Lab\text{-}L}\bigr)$$
step 7: extracting color features in the HSV and Lab color spaces respectively for each tissue point; the method comprises the following substeps:
step 7.1: extracting the color moments in the HSV color space, wherein the color moments describe the color distribution with the first moment u, the second moment σ and the third moment s, computed separately for each channel and defined as follows:
$$u=\frac{1}{N^{2}}\sum_{i=1}^{N}\sum_{j=1}^{N}P(i,j)$$
$$\sigma=\left(\frac{1}{N^{2}}\sum_{i=1}^{N}\sum_{j=1}^{N}\bigl(P(i,j)-u\bigr)^{2}\right)^{1/2}$$
$$s=\left(\frac{1}{N^{2}}\sum_{i=1}^{N}\sum_{j=1}^{N}\bigl(P(i,j)-u\bigr)^{3}\right)^{1/3}$$
where P(i, j) is the color value of the pixel at position (i, j) in the given channel and N is the side length of the color textile fabric tissue-point image;
the first, second and third moments of the H channel are denoted U_HSV-H, σ_HSV-H, S_HSV-H; those of the S channel U_HSV-S, σ_HSV-S, S_HSV-S; and those of the V channel U_HSV-V, σ_HSV-V, S_HSV-V;
Step 7.2: extracting the color moments in the Lab color space, denoting the first, second and third moments of the L channel U_lab-l, σ_lab-l, S_lab-l; those of the a channel U_lab-a, σ_lab-a, S_lab-a; and those of the b channel U_lab-b, σ_lab-b, S_lab-b;
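The per-channel color moments of step 7.1 can be sketched as follows (a minimal illustration; the signed cube root used for the third moment is an assumption made so that negative skewness stays real-valued):

```python
import numpy as np

def color_moments(channel):
    """First moment (mean), second moment (standard deviation) and third
    moment (cube root of the mean cubed deviation) of one color channel."""
    p = channel.astype(float).ravel()
    n = p.size
    u = p.sum() / n
    sigma = (((p - u) ** 2).sum() / n) ** 0.5
    third = ((p - u) ** 3).sum() / n
    s = np.sign(third) * abs(third) ** (1.0 / 3.0)  # signed cube root (assumption)
    return u, sigma, s

# Symmetric two-value channel: mean 2, spread 1, zero skewness.
chan = np.array([[1.0, 3.0], [1.0, 3.0]])
u, sigma, s = color_moments(chan)
```

Applying this to the H, S, V channels and to the L, a, b channels gives the moment triples recorded in steps 7.1 and 7.2.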
Step 7.3: fusing the color information in the HSV color space with the color information in the Lab color space as follows:
$$G_{u}=\tfrac{1}{2}\bigl(U_{HSV\text{-}V}+U_{lab\text{-}l}\bigr),\quad G_{\sigma}=\tfrac{1}{2}\bigl(\sigma_{HSV\text{-}V}+\sigma_{lab\text{-}l}\bigr),\quad G_{s}=\tfrac{1}{2}\bigl(S_{HSV\text{-}V}+S_{lab\text{-}l}\bigr)$$
$$C_{hsv\text{-}u}=\tfrac{1}{2}\bigl(U_{HSV\text{-}H}+U_{HSV\text{-}S}\bigr),\quad C_{hsv\text{-}\sigma}=\tfrac{1}{2}\bigl(\sigma_{HSV\text{-}H}+\sigma_{HSV\text{-}S}\bigr),\quad C_{hsv\text{-}s}=\tfrac{1}{2}\bigl(S_{HSV\text{-}H}+S_{HSV\text{-}S}\bigr)$$
$$C_{lab\text{-}u}=\tfrac{1}{2}\bigl(U_{lab\text{-}a}+U_{lab\text{-}b}\bigr),\quad C_{lab\text{-}\sigma}=\tfrac{1}{2}\bigl(\sigma_{lab\text{-}a}+\sigma_{lab\text{-}b}\bigr),\quad C_{lab\text{-}s}=\tfrac{1}{2}\bigl(S_{lab\text{-}a}+S_{lab\text{-}b}\bigr)$$
in the formula, G_u, G_σ and G_s are respectively the first, second and third moments of the fused luminance channel; C_hsv-u, C_hsv-σ and C_hsv-s are the first, second and third moments of the color channels in the HSV color space; and C_lab-u, C_lab-σ and C_lab-s are the first, second and third moments of the color channels in the Lab color space;
the 9 feature values are combined into a 9-dimensional feature vector;
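If the fusion of step 7.3 pairs each fused moment with the average of the corresponding channel moments (an assumption: the source renders the fusion formulas as images, so the exact arithmetic is not recoverable), the 9-dimensional color feature could be assembled like this:

```python
import numpy as np

# Hypothetical per-channel moment triples (u, sigma, s); the values are illustrative only.
hsv = {"H": (0.2, 0.1, 0.05), "S": (0.4, 0.2, 0.1), "V": (0.6, 0.3, 0.15)}
lab = {"l": (0.5, 0.25, 0.12), "a": (0.1, 0.05, 0.02), "b": (0.3, 0.15, 0.07)}

def avg(m1, m2):
    """Element-wise average of two moment triples."""
    return tuple((a + b) / 2 for a, b in zip(m1, m2))

g = avg(hsv["V"], lab["l"])        # fused luminance moments G_u, G_sigma, G_s
c_hsv = avg(hsv["H"], hsv["S"])    # HSV color-channel moments
c_lab = avg(lab["a"], lab["b"])    # Lab color-channel moments
feature = np.array(g + c_hsv + c_lab)  # 9-dimensional color feature vector
```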
step 8: the texture and color feature vectors of each tissue point are put into a multi-kernel support vector machine for learning; different features yield different kernel functions, a set of optimal feature-combination weights is then found, and the color and texture feature kernel functions are linearly weighted to obtain a new kernel function; the new kernel function and the finally fused decision function are defined as follows:
$$k(x_{i},x)=\sum_{m=1}^{M}d_{m}\,k_{m}(x_{i},x)$$
$$f(x)=\operatorname{sgn}\left(\sum_{i}a_{i}y_{i}\,k(x_{i},x)+b\right)$$
where d_m is the coefficient of the m-th kernel function, with $d_{m}\ge 0$ and $\sum_{m=1}^{M}d_{m}=1$;
k_m(x_i, x) is a kernel function, M is the number of kernel functions, y_i ∈ {-1, 1} is the label of sample x_i, a_i is the Lagrange multiplier, and b is the classification threshold.
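A toy sketch of the combined kernel and decision function of step 8, with two RBF kernels standing in for the texture and color kernels (the kernel type, the γ values, the equal weights and the support coefficients are all illustrative assumptions; in the patent the weights d_m and coefficients a_i come from training):

```python
import numpy as np

def rbf(x, z, gamma):
    """Gaussian RBF kernel."""
    return np.exp(-gamma * np.sum((x - z) ** 2))

def combined_kernel(x, z, kernels, weights):
    """k(x, z) = sum_m d_m * k_m(x, z), with d_m >= 0 and sum_m d_m = 1."""
    assert all(d >= 0 for d in weights) and abs(sum(weights) - 1.0) < 1e-9
    return sum(d * k(x, z) for d, k in zip(weights, kernels))

def decide(x, support, labels, alphas, b, kernels, weights):
    """f(x) = sign(sum_i a_i * y_i * k(x_i, x) + b)."""
    s = sum(a * y * combined_kernel(xi, x, kernels, weights)
            for a, y, xi in zip(alphas, labels, support))
    return 1 if s + b >= 0 else -1

# One "texture" kernel and one "color" kernel, equally weighted (assumption).
kernels = [lambda x, z: rbf(x, z, 0.5), lambda x, z: rbf(x, z, 2.0)]
weights = [0.5, 0.5]
support = [np.array([0.0, 0.0]), np.array([2.0, 2.0])]
labels = [-1, 1]
alphas = [1.0, 1.0]
pred = decide(np.array([2.1, 1.9]), support, labels, alphas, 0.0, kernels, weights)
```

A query point near the positive support vector is classified +1, and one near the negative support vector is classified -1.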
The method firstly applies filtering and denoising pretreatment to the collected fabric images, then segments the tissue points of the fabric in the HSV and Lab color spaces, extracts channel components with the same properties and fuses them to obtain the local texture features and global color features of each tissue point, and finally fuses and identifies the texture and color features with a multi-kernel support vector machine. The method can effectively complete the tissue identification of the color textile fabric and has wide applicability.
Drawings
FIG. 1 is a schematic flow diagram of the process of the present invention.
Detailed Description
The technical scheme of the invention is further specifically described by the following embodiments and the accompanying drawings.
Embodiment:
the invention comprises the following steps:
step 1: collecting an image of a colored textile fabric sample;
step 2: carrying out noise reduction treatment on the acquired sample image in MATLAB through median filtering;
step 3: converting an RGB image of the fabric into a two-dimensional gray image, and then equalizing the histogram of the gray image to enhance yarn boundary information;
step 4: carrying out horizontal and vertical projection on the fabric by a gray projection method to obtain gray-level accumulation curves in the warp and weft directions, smoothing these curves, and extracting the valley points of the two smoothed curves to complete the positioning of the fabric tissue points;
step 5: converting the color textile fabric image from the RGB color space to the HSV color space and the Lab color space respectively, completing the segmentation of the tissue points in each color space, and numbering the segmented tissue points in sequence;
step 6: extracting texture features under 2 color spaces for each tissue point respectively; the method comprises the following substeps:
step 6.1: firstly extracting a pseudo gray image with the LBP operator from the V channel of the HSV color space, then selecting pixel pairs with direction θ = 0°, 45°, 90°, 135° and distance d = 1 for the gray co-occurrence matrix, and obtaining the co-occurrence matrix feature parameters in the 4 directions: energy ASM, inverse difference moment IDM, contrast CON and correlation COR, defined as follows:
$$ASM=\sum_{i=1}^{k}\sum_{j=1}^{k}P(i,j)^{2}$$
$$IDM=\sum_{i=1}^{k}\sum_{j=1}^{k}\frac{P(i,j)}{1+(i-j)^{2}}$$
$$CON=\sum_{i=1}^{k}\sum_{j=1}^{k}(i-j)^{2}P(i,j)$$
$$COR=\frac{\sum_{i=1}^{k}\sum_{j=1}^{k}ij\,P(i,j)-\mu_{x}\mu_{y}}{\sigma_{x}\sigma_{y}}$$
wherein
$$\mu_{x}=\sum_{i=1}^{k}i\sum_{j=1}^{k}P(i,j),\qquad \mu_{y}=\sum_{j=1}^{k}j\sum_{i=1}^{k}P(i,j)$$
$$\sigma_{x}^{2}=\sum_{i=1}^{k}(i-\mu_{x})^{2}\sum_{j=1}^{k}P(i,j),\qquad \sigma_{y}^{2}=\sum_{j=1}^{k}(j-\mu_{y})^{2}\sum_{i=1}^{k}P(i,j)$$
where P(i, j) is the normalized probability of the gray-level pair (i, j) co-occurring and k is the number of gray levels;
therefore, a 16-dimensional feature vector (4 parameters × 4 directions) can be extracted from the V channel, denoted W_HSV-V;
Step 6.2: extracting a pseudo gray image with the LBP operator from the L channel of the Lab color space, selecting pixel pairs with direction θ = 0°, 45°, 90°, 135° and distance d = 1 for the gray co-occurrence matrix, and obtaining the same four feature parameters (energy, inverse difference moment, contrast and correlation) in the 4 directions, thereby extracting a 16-dimensional feature vector on the L channel, denoted W_Lab-L; finally W_HSV-V and W_Lab-L are fused according to the following formula:
$$W=\tfrac{1}{2}\bigl(W_{HSV\text{-}V}+W_{Lab\text{-}L}\bigr)$$
step 7: extracting color features in the HSV and Lab color spaces respectively for each tissue point; the method comprises the following substeps:
step 7.1: extracting the color moments in the HSV color space, wherein the color moments describe the color distribution with the first moment u, the second moment σ and the third moment s, computed separately for each channel and defined as follows:
$$u=\frac{1}{N^{2}}\sum_{i=1}^{N}\sum_{j=1}^{N}P(i,j)$$
$$\sigma=\left(\frac{1}{N^{2}}\sum_{i=1}^{N}\sum_{j=1}^{N}\bigl(P(i,j)-u\bigr)^{2}\right)^{1/2}$$
$$s=\left(\frac{1}{N^{2}}\sum_{i=1}^{N}\sum_{j=1}^{N}\bigl(P(i,j)-u\bigr)^{3}\right)^{1/3}$$
where P(i, j) is the color value of the pixel at position (i, j) in the given channel and N is the side length of the tissue-point image;
the first, second and third moments of the H channel are denoted U_HSV-H, σ_HSV-H, S_HSV-H; those of the S channel U_HSV-S, σ_HSV-S, S_HSV-S; and those of the V channel U_HSV-V, σ_HSV-V, S_HSV-V;
Step 7.2: extracting the color moments in the Lab color space, denoting the first, second and third moments of the L channel U_lab-l, σ_lab-l, S_lab-l; those of the a channel U_lab-a, σ_lab-a, S_lab-a; and those of the b channel U_lab-b, σ_lab-b, S_lab-b;
Step 7.3: fusing the color information in the HSV color space with the color information in the Lab color space as follows:
$$G_{u}=\tfrac{1}{2}\bigl(U_{HSV\text{-}V}+U_{lab\text{-}l}\bigr),\quad G_{\sigma}=\tfrac{1}{2}\bigl(\sigma_{HSV\text{-}V}+\sigma_{lab\text{-}l}\bigr),\quad G_{s}=\tfrac{1}{2}\bigl(S_{HSV\text{-}V}+S_{lab\text{-}l}\bigr)$$
$$C_{hsv\text{-}u}=\tfrac{1}{2}\bigl(U_{HSV\text{-}H}+U_{HSV\text{-}S}\bigr),\quad C_{hsv\text{-}\sigma}=\tfrac{1}{2}\bigl(\sigma_{HSV\text{-}H}+\sigma_{HSV\text{-}S}\bigr),\quad C_{hsv\text{-}s}=\tfrac{1}{2}\bigl(S_{HSV\text{-}H}+S_{HSV\text{-}S}\bigr)$$
$$C_{lab\text{-}u}=\tfrac{1}{2}\bigl(U_{lab\text{-}a}+U_{lab\text{-}b}\bigr),\quad C_{lab\text{-}\sigma}=\tfrac{1}{2}\bigl(\sigma_{lab\text{-}a}+\sigma_{lab\text{-}b}\bigr),\quad C_{lab\text{-}s}=\tfrac{1}{2}\bigl(S_{lab\text{-}a}+S_{lab\text{-}b}\bigr)$$
in the formula, G_u, G_σ and G_s are respectively the first, second and third moments of the fused luminance channel; C_hsv-u, C_hsv-σ and C_hsv-s are the first, second and third moments of the color channels in the HSV color space; and C_lab-u, C_lab-σ and C_lab-s are the first, second and third moments of the color channels in the Lab color space;
the 9 feature values are combined into a 9-dimensional feature vector;
step 8: the texture and color feature vectors of each tissue point are put into a multi-kernel support vector machine for learning; different features yield different kernel functions, a set of optimal feature-combination weights is then found, and the color and texture feature kernel functions are linearly weighted to obtain a new kernel function; the new kernel function and the finally fused decision function are defined as follows:
$$k(x_{i},x)=\sum_{m=1}^{M}d_{m}\,k_{m}(x_{i},x)$$
$$f(x)=\operatorname{sgn}\left(\sum_{i}a_{i}y_{i}\,k(x_{i},x)+b\right)$$
where d_m is the coefficient of the m-th kernel function, with $d_{m}\ge 0$ and $\sum_{m=1}^{M}d_{m}=1$;
k_m(x_i, x) is a kernel function, M is the number of kernel functions, a_i is the Lagrange multiplier, and b is the classification threshold;
the specific embodiments described herein are merely illustrative of the spirit of the invention. Various modifications or additions may be made to the described embodiments or alternatives may be employed by those skilled in the art without departing from the spirit or ambit of the invention as defined in the appended claims.

Claims (1)

1. A method for identifying the tissue of a colored textile fabric based on the HSV and Lab color spaces, comprising:
step 1: collecting an image of a colored textile fabric sample;
step 2: performing noise reduction treatment on the acquired sample image through median filtering based on MATLAB, and filtering out speckle noise or salt and pepper noise in the image;
step 3: converting an RGB image of the fabric into a two-dimensional gray image, and then equalizing the histogram of the gray image to enhance yarn boundary information;
step 4: carrying out horizontal and vertical projection on the fabric image from step 3 by a gray projection method to obtain gray-level accumulation curves in the warp and weft directions, smoothing these curves, taking the valley points of the smoothed curves as yarn gaps, and finally extracting the valley points of both the warp-direction and weft-direction curves to complete the positioning of the fabric tissue points;
step 5: converting the color textile fabric image from step 2 from the RGB color space to the HSV color space and the Lab color space respectively, and segmenting the fabric tissue-point images from the whole fabric image in each color space with the tissue-point positioning method of step 4, wherein the selected tissue points cover at least one weave repeat (tissue cycle) and are numbered in sequence;
step 6: extracting texture features under 2 color spaces for each tissue point respectively; the method comprises the following substeps:
step 6.1: firstly extracting a pseudo gray image with the LBP operator from the V channel of the HSV color space, then selecting pixel pairs with direction θ = 0°, 45°, 90°, 135° and distance d = 1 for the gray co-occurrence matrix, calculating the probability of each pixel pair occurring together, and obtaining the co-occurrence matrix feature parameters in the 4 directions: energy ASM, inverse difference moment IDM, contrast CON and correlation COR, defined as follows:
$$ASM=\sum_{i=1}^{k}\sum_{j=1}^{k}P(i,j)^{2}$$
$$IDM=\sum_{i=1}^{k}\sum_{j=1}^{k}\frac{P(i,j)}{1+(i-j)^{2}}$$
$$CON=\sum_{i=1}^{k}\sum_{j=1}^{k}(i-j)^{2}P(i,j)$$
$$COR=\frac{\sum_{i=1}^{k}\sum_{j=1}^{k}ij\,P(i,j)-\mu_{x}\mu_{y}}{\sigma_{x}\sigma_{y}}$$
wherein
$$\mu_{x}=\sum_{i=1}^{k}i\sum_{j=1}^{k}P(i,j),\qquad \mu_{y}=\sum_{j=1}^{k}j\sum_{i=1}^{k}P(i,j)$$
$$\sigma_{x}^{2}=\sum_{i=1}^{k}(i-\mu_{x})^{2}\sum_{j=1}^{k}P(i,j),\qquad \sigma_{y}^{2}=\sum_{j=1}^{k}(j-\mu_{y})^{2}\sum_{i=1}^{k}P(i,j)$$
where P(i, j) is the normalized probability of the gray-level pair (i, j) co-occurring and k is the number of gray levels;
therefore, a 16-dimensional feature vector (4 parameters × 4 directions) can be extracted from the V channel, denoted W_HSV-V;
Step 6.2: extracting a pseudo gray image with the LBP operator from the L channel of the Lab color space, selecting pixel pairs with direction θ = 0°, 45°, 90°, 135° and distance d = 1 for the gray co-occurrence matrix, and calculating the co-occurrence probabilities to obtain the same four feature parameters (energy, inverse difference moment, contrast and correlation) in the 4 directions, thereby extracting a 16-dimensional feature vector on the L channel, denoted W_Lab-L; finally W_HSV-V and W_Lab-L are fused according to the following formula:
$$W=\tfrac{1}{2}\bigl(W_{HSV\text{-}V}+W_{Lab\text{-}L}\bigr)$$
step 7: extracting color features in the HSV and Lab color spaces respectively for each tissue point; the method comprises the following substeps:
step 7.1: extracting the color moments in the HSV color space, wherein the color moments describe the color distribution with the first moment u, the second moment σ and the third moment s, computed separately for each channel and defined as follows:
$$u=\frac{1}{N^{2}}\sum_{i=1}^{N}\sum_{j=1}^{N}P(i,j)$$
$$\sigma=\left(\frac{1}{N^{2}}\sum_{i=1}^{N}\sum_{j=1}^{N}\bigl(P(i,j)-u\bigr)^{2}\right)^{1/2}$$
$$s=\left(\frac{1}{N^{2}}\sum_{i=1}^{N}\sum_{j=1}^{N}\bigl(P(i,j)-u\bigr)^{3}\right)^{1/3}$$
where P(i, j) is the color value of the pixel at position (i, j) in the given channel and N is the side length of the color textile fabric tissue-point image;
the first, second and third moments of the H channel are denoted U_HSV-H, σ_HSV-H, S_HSV-H; those of the S channel U_HSV-S, σ_HSV-S, S_HSV-S; and those of the V channel U_HSV-V, σ_HSV-V, S_HSV-V;
Step 7.2: extracting the color moments in the Lab color space, denoting the first, second and third moments of the L channel U_lab-l, σ_lab-l, S_lab-l; those of the a channel U_lab-a, σ_lab-a, S_lab-a; and those of the b channel U_lab-b, σ_lab-b, S_lab-b;
Step 7.3: fusing the color information in the HSV color space with the color information in the Lab color space as follows:
$$G_{u}=\tfrac{1}{2}\bigl(U_{HSV\text{-}V}+U_{lab\text{-}l}\bigr),\quad G_{\sigma}=\tfrac{1}{2}\bigl(\sigma_{HSV\text{-}V}+\sigma_{lab\text{-}l}\bigr),\quad G_{s}=\tfrac{1}{2}\bigl(S_{HSV\text{-}V}+S_{lab\text{-}l}\bigr)$$
$$C_{hsv\text{-}u}=\tfrac{1}{2}\bigl(U_{HSV\text{-}H}+U_{HSV\text{-}S}\bigr),\quad C_{hsv\text{-}\sigma}=\tfrac{1}{2}\bigl(\sigma_{HSV\text{-}H}+\sigma_{HSV\text{-}S}\bigr),\quad C_{hsv\text{-}s}=\tfrac{1}{2}\bigl(S_{HSV\text{-}H}+S_{HSV\text{-}S}\bigr)$$
$$C_{lab\text{-}u}=\tfrac{1}{2}\bigl(U_{lab\text{-}a}+U_{lab\text{-}b}\bigr),\quad C_{lab\text{-}\sigma}=\tfrac{1}{2}\bigl(\sigma_{lab\text{-}a}+\sigma_{lab\text{-}b}\bigr),\quad C_{lab\text{-}s}=\tfrac{1}{2}\bigl(S_{lab\text{-}a}+S_{lab\text{-}b}\bigr)$$
in the formula, G_u, G_σ and G_s are respectively the first, second and third moments of the fused luminance channel; C_hsv-u, C_hsv-σ and C_hsv-s are the first, second and third moments of the color channels in the HSV color space; and C_lab-u, C_lab-σ and C_lab-s are the first, second and third moments of the color channels in the Lab color space;
the 9 feature values are combined into a 9-dimensional feature vector;
step 8: the texture and color feature vectors of each tissue point are put into a multi-kernel support vector machine for learning; different features yield different kernel functions, a set of optimal feature-combination weights is then found, and the color and texture feature kernel functions are linearly weighted to obtain a new kernel function; the new kernel function and the finally fused decision function are defined as follows:
$$k(x_{i},x)=\sum_{m=1}^{M}d_{m}\,k_{m}(x_{i},x)$$
$$f(x)=\operatorname{sgn}\left(\sum_{i}a_{i}y_{i}\,k(x_{i},x)+b\right)$$
where d_m is the coefficient of the m-th kernel function, with $d_{m}\ge 0$ and $\sum_{m=1}^{M}d_{m}=1$;
k_m(x_i, x) is a kernel function, M is the number of kernel functions, y_i ∈ {-1, 1} is the label of sample x_i, a_i is the Lagrange multiplier, and b is the classification threshold.
CN201910853886.7A 2019-09-10 2019-09-10 Method for identifying color textile fabric tissue based on HSV (hue, saturation, value) and Lab color spaces Pending CN110728302A (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN201910853886.7A CN110728302A (en) 2019-09-10 2019-09-10 Method for identifying color textile fabric tissue based on HSV (hue, saturation, value) and Lab (Lab) color spaces


Publications (1)

Publication Number Publication Date
CN110728302A true CN110728302A (en) 2020-01-24

Family

ID=69218140

Family Applications (1)

Application Number Title Priority Date Filing Date
CN201910853886.7A Pending CN110728302A (en) 2019-09-10 2019-09-10 Method for identifying color textile fabric tissue based on HSV (hue, saturation, value) and Lab (Lab) color spaces

Country Status (1)

Country Link
CN (1) CN110728302A (en)

Cited By (7)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111428814A (en) * 2020-04-16 2020-07-17 安徽农业大学 Blended yarn color automatic identification matching method
CN112365485A (en) * 2020-11-19 2021-02-12 同济大学 Melanoma identification method based on Circular LBP and color space conversion algorithm
CN112417944A (en) * 2020-08-31 2021-02-26 深圳市银星智能科技股份有限公司 Robot control method and electronic equipment
CN113049530A (en) * 2021-03-17 2021-06-29 北京工商大学 Single-seed corn seed moisture content detection method based on near-infrared hyperspectrum
CN113284147A (en) * 2021-07-23 2021-08-20 常州市新创智能科技有限公司 Foreign matter detection method and system based on yellow foreign matter defects
CN113724339A (en) * 2021-05-10 2021-11-30 华南理工大学 Color separation method for few-sample ceramic tile based on color space characteristics
CN114842043A (en) * 2022-07-04 2022-08-02 南通中豪超纤制品有限公司 Fabric style identification method and system based on image processing

Citations (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US20110293188A1 (en) * 2010-06-01 2011-12-01 Wei Zhang Processing image data
CN103106645A (en) * 2013-03-15 2013-05-15 天津工业大学 Recognition method for woven fabric structure
CN106485288A (en) * 2016-12-21 2017-03-08 上海工程技术大学 A kind of automatic identifying method of yarn dyed fabric tissue


Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
Yang Fan et al., "Mastering Classic Image Processing Algorithms (MATLAB Edition)", Beihang University Press, 31 March 2018 *
Wang Min et al., "Block color feature extraction algorithm based on a mixed color space", Laser & Optoelectronics Progress *
Wang Hongjun, "Knowledge-based Fault Diagnosis and Prognosis Technology for Mechatronic Systems", China Fortune Press, 31 January 2014 *
Yuan Li et al., "Testing and evaluation of colorimetric indices of colored spun yarn combining global and local diversity features", Journal of Textile Research *

Cited By (9)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN111428814A (en) * 2020-04-16 2020-07-17 安徽农业大学 Blended yarn color automatic identification matching method
CN112417944A (en) * 2020-08-31 2021-02-26 深圳市银星智能科技股份有限公司 Robot control method and electronic equipment
CN112417944B (en) * 2020-08-31 2024-04-16 深圳银星智能集团股份有限公司 Robot control method and electronic equipment
CN112365485A (en) * 2020-11-19 2021-02-12 同济大学 Melanoma identification method based on Circular LBP and color space conversion algorithm
CN113049530A (en) * 2021-03-17 2021-06-29 北京工商大学 Single-seed corn seed moisture content detection method based on near-infrared hyperspectrum
CN113724339A (en) * 2021-05-10 2021-11-30 华南理工大学 Color separation method for few-sample ceramic tile based on color space characteristics
CN113724339B (en) * 2021-05-10 2023-08-18 华南理工大学 Color space feature-based color separation method for tiles with few samples
CN113284147A (en) * 2021-07-23 2021-08-20 常州市新创智能科技有限公司 Foreign matter detection method and system based on yellow foreign matter defects
CN114842043A (en) * 2022-07-04 2022-08-02 南通中豪超纤制品有限公司 Fabric style identification method and system based on image processing

Similar Documents

Publication Publication Date Title
CN110728302A (en) Method for identifying color textile fabric tissue based on HSV (hue, saturation, value) and Lab (Lab) color spaces
CN107578035B (en) Human body contour extraction method based on super-pixel-multi-color space
Gao et al. Automatic change detection in synthetic aperture radar images based on PCANet
CN107657279B (en) Remote sensing target detection method based on small amount of samples
CN108520226B (en) Pedestrian re-identification method based on body decomposition and significance detection
CN112818862B (en) Face tampering detection method and system based on multi-source clues and mixed attention
CN108319973B (en) Detection method for citrus fruits on tree
CN110717896B (en) Plate strip steel surface defect detection method based on significance tag information propagation model
CN111340824B (en) Image feature segmentation method based on data mining
JP2000003452A (en) Method for detecting face surface in digital picture, its detecting device, picture judging method, picture judging device and computer readable record medium
CN104899877A (en) Method for extracting image foreground based on super pixel and fast trimap image
CN110837768A (en) Rare animal protection oriented online detection and identification method
CN109740572A (en) A kind of human face in-vivo detection method based on partial color textural characteristics
CN111091134A (en) Method for identifying tissue structure of colored woven fabric based on multi-feature fusion
CN112163511A (en) Method for identifying authenticity of image
CN105718552A (en) Clothing freehand sketch based clothing image retrieval method
Wu et al. Strong shadow removal via patch-based shadow edge detection
CN105678735A (en) Target salience detection method for fog images
CN111080574A (en) Fabric defect detection method based on information entropy and visual attention mechanism
CN115527269B (en) Intelligent human body posture image recognition method and system
CN108509950A (en) Railway contact line pillar number plate based on probability characteristics Weighted Fusion detects method of identification
CN110298893A (en) A kind of pedestrian wears the generation method and device of color identification model clothes
CN105354547A (en) Pedestrian detection method in combination of texture and color features
CN110210561B (en) Neural network training method, target detection method and device, and storage medium
CN105528795B (en) A kind of infrared face dividing method using annular shortest path

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
WD01 Invention patent application deemed withdrawn after publication

Application publication date: 20200124
