CN111563536A - Bamboo strip color self-adaptive classification method based on machine learning - Google Patents

Bamboo strip color self-adaptive classification method based on machine learning

Info

Publication number
CN111563536A
CN111563536A
Authority
CN
China
Prior art keywords
color
bamboo
image
channel
steps
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010302809.5A
Other languages
Chinese (zh)
Other versions
CN111563536B (en)
Inventor
杨和
刘文哲
童同
高钦泉
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Fujian Imperial Vision Information Technology Co ltd
Original Assignee
Fujian Imperial Vision Information Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Fujian Imperial Vision Information Technology Co ltd
Priority to CN202010302809.5A
Publication of CN111563536A
Application granted
Publication of CN111563536B
Legal status: Active

Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/24 Classification techniques
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F18/00 Pattern recognition
    • G06F18/20 Analysing
    • G06F18/23 Clustering techniques
    • G06F18/232 Non-hierarchical techniques
    • G06F18/2321 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F18/23213 Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 Arrangements for image or video recognition or understanding
    • G06V10/40 Extraction of image or video features
    • G06V10/56 Extraction of image or video features relating to colour
    • Y GENERAL TAGGING OF NEW TECHNOLOGICAL DEVELOPMENTS; GENERAL TAGGING OF CROSS-SECTIONAL TECHNOLOGIES SPANNING OVER SEVERAL SECTIONS OF THE IPC; TECHNICAL SUBJECTS COVERED BY FORMER USPC CROSS-REFERENCE ART COLLECTIONS [XRACs] AND DIGESTS
    • Y02 TECHNOLOGIES OR APPLICATIONS FOR MITIGATION OR ADAPTATION AGAINST CLIMATE CHANGE
    • Y02P CLIMATE CHANGE MITIGATION TECHNOLOGIES IN THE PRODUCTION OR PROCESSING OF GOODS
    • Y02P90/00 Enabling technologies with a potential contribution to greenhouse gas [GHG] emissions mitigation
    • Y02P90/30 Computing systems specially adapted for manufacturing

Abstract

The invention relates to a bamboo strip color self-adaptive classification method based on machine learning that automatically adjusts the color classification standard as the color of incoming bamboo strips changes. The method comprises the steps of collecting bamboo strip images, segmenting bamboo strip regions, extracting bamboo strip color features, clustering the color features, and classifying the bamboo strip colors. Aimed at the characteristics of bamboo color, the invention proposes an improved color histogram feature that represents the color of bamboo strips more accurately. The method requires no prior manual classification of bamboo strip color data, and offers strong adaptability and high accuracy.

Description

Bamboo strip color self-adaptive classification method based on machine learning
Technical Field
The invention relates to the technical field of machine learning, in particular to a bamboo strip color self-adaptive classification method based on machine learning.
Background
With increasing environmental protection pressure in recent years, the bamboo product processing industry faces new opportunities and challenges, and its development has accelerated particularly in countries and regions with high bamboo yield. However, the domestic bamboo industry is generally limited by a low degree of automation and a shortage of skilled professionals, which seriously hinders its rapid development.
The processing of bamboo and bamboo semi-finished products produced in China mainly comprises cutting, splitting, rough planing, fine planing, pressing, and similar procedures. Bamboo is easily affected by the production environment, growth age, and carbonization process, so the surface color of bamboo varies noticeably in depth, which leads to uneven color after pressing. Therefore, to improve the production quality of bamboo products, color classification is required.
At present, most bamboo processing factories rely mainly on manual sorting of bamboo strips by the color depth of their surfaces, with the strips graded into categories such as extra dark, medium, light, and extra light according to the actual conditions of the factory. However, manual sorting suffers from high skill requirements, high operating cost, low working efficiency, and unstable sorting quality.
To address these industry pain points, studies of bamboo strip color classification have been carried out. Existing approaches extract the average color and texture features of bamboo chips and classify color with a Bayes classifier; extract color features in the HSV color space and classify bamboo strip colors with an SVM; or use L*a*b* color space features and a BP neural network for color classification of bamboo chips.
Although there are many related studies, they all require manually color-screening a certain number of bamboo strips in advance and then collecting bamboo strip data to train a model. Because the surface color of bamboo strips is influenced by climate and other environmental factors, these methods cannot handle the color classification of bamboo strips produced by different raw material suppliers in different seasons. In such cases, color classification of the bamboo strip surface requires manually re-screening the bamboo strip colors and then retraining the color classification model. This practice also easily introduces manual screening errors, reducing the stability and practicability of the color classification. Methods that classify bamboo strip colors by manually set grayscale thresholds consider only grayscale information and are easily affected by changes in the lighting environment.
Disclosure of Invention
In view of this, the present invention provides a bamboo color adaptive classification method based on machine learning, which can automatically learn bamboo color levels and has good robustness and accuracy.
The invention is realized by adopting the following scheme: a bamboo strip color self-adaptive classification method based on machine learning comprises the following steps:
step S1: collecting and extracting color information of a single bamboo strip, and preprocessing the color information of the single bamboo strip;
step S2: extracting color histogram features of the image obtained after preprocessing;
step S3: clustering color centers by using a k-means algorithm;
step S4: classifying colors of the new bamboo strips;
step S5: updating the color feature library NF_{1,2,...} and updating the color clustering centers.
Further, the step S1 specifically includes the following steps:
step S11: image acquisition: photographing an image of part of the bamboo strip surface with a color area-array camera to obtain picture I_1;
step S12: converting the color image into a grayscale image: converting the RGB channels of the color picture I_1 obtained by the camera into a grayscale channel with the formula Gray = 0.299R + 0.587G + 0.114B, where Gray denotes the grayscale value, to obtain grayscale image I_2;
step S13: image binarization: binarizing the grayscale image I_2 with the OTSU algorithm to obtain binarized image I_3;
step S14: extracting the bamboo chip mask: removing small particles from the binarized image I_3 with image morphological operations and extracting the largest particle region to obtain the bamboo chip mask region I_4;
step S15: extracting the bamboo chip color block: extracting the maximum inscribed rectangle of the bamboo chip mask region I_4 and cropping picture I_1 along this rectangle to obtain bamboo chip color block I_5;
step S16: judging whether the whole bamboo strip has been collected; if so, executing step S17, otherwise executing step S11;
step S17: splicing color blocks: splicing the image sequence I_5^{1,2,...} obtained in the above steps into one image I_6; this image I_6 is the preprocessed image.
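For concreteness, a minimal OpenCV sketch of steps S11-S17 follows; the function name, the 5x5 opening kernel, and the use of the mask's bounding rectangle in place of the maximum inscribed rectangle are simplifying assumptions, not the patented implementation:

    import cv2
    import numpy as np

    def preprocess_strip(frames):
        # Steps S11-S17 (sketch): extract a color patch from each camera
        # frame of one bamboo strip and stitch the patches into image I6.
        patches = []
        for i1 in frames:                                  # S11: picture I1 (BGR)
            i2 = cv2.cvtColor(i1, cv2.COLOR_BGR2GRAY)      # S12: grayscale I2
            _, i3 = cv2.threshold(i2, 0, 255,
                                  cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # S13: I3
            # S14: drop small particles, keep the largest region as mask I4
            opened = cv2.morphologyEx(i3, cv2.MORPH_OPEN,
                                      np.ones((5, 5), np.uint8))
            n, _, stats, _ = cv2.connectedComponentsWithStats(opened)
            if n < 2:
                continue
            k = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))
            x, y = stats[k, cv2.CC_STAT_LEFT], stats[k, cv2.CC_STAT_TOP]
            w, h = stats[k, cv2.CC_STAT_WIDTH], stats[k, cv2.CC_STAT_HEIGHT]
            # S15: crop I1 (bounding rectangle used here instead of the
            # maximum inscribed rectangle) to obtain patch I5
            patches.append(i1[y:y + h, x:x + w])
        # S17: stitch the patch sequence I5_{1,2,...} into one image I6
        width = min(p.shape[1] for p in patches)
        return np.vstack([p[:, :width] for p in patches])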
Further, the step S2 specifically includes the following steps:
step S21: converting the image I_6 obtained by splicing color blocks from the RGB color space into the HSV color space to obtain image I_7;
step S22: extracting the color features of I_7: for the hue channel H, whose range is 0-180, the histogram bin width is set to 5, giving 36 H-channel bins; for the saturation channel S, no separate statistics are taken; for the brightness channel V, whose range is 0-255, the histogram bin width is set to 4, giving 64 V-channel bins, and a saturation threshold S_thresh in the range 0-255 is set to split the V-channel bins in two, giving 128 V-channel bins; this yields 164 color bins in total, and counting the histogram of the image pixels of I_6 over these bin ranges gives the histogram color feature F, a 1x164 feature vector;
step S23: feature normalization: normalizing the feature vector F with the formula

NF_i = F_i / (W × H)

where i denotes the index into the feature vector and W and H denote the width and height of image I_6, finally obtaining the normalized feature NF.
Further, the bin histogram counting in step S22 proceeds as follows:
when counting the histogram over the bins, a linear voting method is adopted: let the pixel value be p, and let the two bins nearest to p be bin_i and bin_j, with corresponding centers C_i and C_j; then the votes cast into these two bins are

vote(bin_i) = |p - C_j| / |C_i - C_j|,  vote(bin_j) = |p - C_i| / |C_i - C_j|

This voting scheme avoids forcibly splitting continuous color features at bin boundaries, yields a more accurate color statistic, and improves the accuracy and robustness of the color classification.
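As an illustration, here is a sketch of steps S21-S23 with the linear voting above; the bin layout (36 H bins first, then the low-saturation V bins before the high-saturation ones), the default S_thresh value of 128, and the function names are assumptions, not specified by the patent:

    import cv2
    import numpy as np

    def soft_vote(hist, offset, value, bin_width, n_bins):
        # Linear voting: split one pixel's vote between the two bins
        # whose centers are nearest to the pixel value.
        pos = value / bin_width - 0.5          # position in bin-center units
        i = int(np.clip(np.floor(pos), 0, n_bins - 2))
        frac = float(np.clip(pos - i, 0.0, 1.0))  # distance past left center
        hist[offset + i] += 1.0 - frac
        hist[offset + i + 1] += frac

    def color_feature(i6, s_thresh=128):
        # Steps S21-S23: 164-bin H/V histogram, normalized by W*H.
        hsv = cv2.cvtColor(i6, cv2.COLOR_BGR2HSV)          # S21: image I7
        h, s, v = cv2.split(hsv)
        feat = np.zeros(164, dtype=np.float64)
        for hp, sp, vp in zip(h.ravel(), s.ravel(), v.ravel()):
            soft_vote(feat, 0, float(hp), 5, 36)           # 36 H-channel bins
            base = 36 if sp < s_thresh else 100            # V bins split by S
            soft_vote(feat, base, float(vp), 4, 64)        # 2 x 64 V bins
        return feat / (i6.shape[0] * i6.shape[1])          # S23: NF_i = F_i/(W*H)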
Further, the step S3 specifically includes the following steps:
step S31: preparing N bamboo strips and repeating steps S1 and S2 to obtain N feature sequences NF_{1,2,...,N};
step S32: setting the color class number k according to the factory's color classification requirement, with k ranging from 3 to 5; calculating the average luminance values of the N stitched images I_6 obtained in step S2; according to the distribution of these luminance values, taking k luminance values at equal intervals and selecting the color features of the pictures corresponding to these k luminance values as the initial centroids;
step S33: clustering k center points C_{1,2,...,k} with the k-means clustering algorithm;
Step S34: determining the corresponding relation between the k central points and the color depth: to the clustering center point C{1,2,...,k}And measuring, wherein the measurement formula is as follows:
Figure BDA0002454543510000051
n represents a clustering center subscript, i represents a color feature subscript, and only a V channel is selected for measurement; m isnThe size of (d) represents the color depth, m, corresponding to the cluster centernLarger, the lighter the color, mnSmaller, indicates darker color; will correspond to the obtained m{1,2,..,k}And sequencing to obtain the corresponding color depth relation.
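Here is a sketch of steps S31-S34 with scikit-learn's KMeans; the brightness-guided initialization follows step S32, while the V-bin weighting used to order the centers is one plausible reading of the m_n metric above:

    import numpy as np
    from sklearn.cluster import KMeans

    def cluster_colors(features, brightness, k=3):
        # features: N x 164 matrix of NF vectors; brightness: the N average
        # luminance values of the stitched images I6; k: 3-5 color grades.
        # S32: pick k images whose mean brightness is evenly spread over
        # the observed range and use their features as initial centroids.
        targets = np.linspace(brightness.min(), brightness.max(), k)
        init_idx = [int(np.argmin(np.abs(brightness - t))) for t in targets]
        km = KMeans(n_clusters=k, init=features[init_idx], n_init=1)
        km.fit(features)                       # S33: centers C_1..C_k
        centers = km.cluster_centers_
        # S34: order centers from dark to light by a weighted sum over the
        # V-channel bins (indices 36-163 in the layout assumed earlier).
        v_idx = np.arange(36, 164)
        m = (centers[:, v_idx] * (v_idx - 36)).sum(axis=1)
        return centers[np.argsort(m)]          # smaller m = darker color

Initializing the centroids from evenly spaced brightness levels rather than at random keeps the k-means convergence result stable across retrainings, which is the rationale the embodiment gives for step S32.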
Further, the step S4 specifically includes the following steps:
step S41: repeating steps S1 and S2 on the new bamboo strip data to obtain a new feature sequence NF_new;
step S42: calculating the similarity between the feature sequence NF_new and each of the k center points obtained in step S33 with the metric formula

S_n = (Σ_i NF_{new,i} · C_{n,i}) / (√(Σ_i NF_{new,i}²) · √(Σ_i C_{n,i}²))

where S_n denotes the similarity between NF_new and the n-th center point C_n; S_n lies in the range [0,1], and the closer to 1, the more similar; computing this against all center points gives the metric result sequence S_{1,2,...,k}, and the color classification result is

G = argmax_n S_n

where G is the color classification result.
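A minimal sketch of step S42 under the cosine-similarity reading of the S_n metric above (for non-negative histogram vectors this score always lies in [0,1]; the function name and calling convention are illustrative):

    import numpy as np

    def classify(nf_new, centers):
        # Score the new feature NF_new against every cluster center C_n
        # and return the grade index G of the most similar center.
        sims = centers @ nf_new / (
            np.linalg.norm(centers, axis=1) * np.linalg.norm(nf_new))
        return int(np.argmax(sims))            # G = argmax_n S_n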
Further, the step S5 specifically includes the following steps:
step S51: adding the feature sequence NF_new obtained in step S41 to the feature library NF_{1,2,...}; counting the number X of newly added feature sequences NF_new and the total number Y of the updated feature library NF_{1,2,...};
step S52: setting an iterative training threshold P in the range 500-1000 and a maximum feature library size Q in the range 5000-10000, with P < Q; if X equals P, performing step S53, otherwise performing step S51;
step S53: executing step S3 and updating the color cluster center points;
step S54: if Y > Q, calculating the metric values between the updated feature library NF_{1,2,...} and the cluster centers as in step S42, selecting the Y - Q features with the smallest metric values as outlier features and removing them from the feature library; otherwise, continuing to execute step S51.
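The following is a sketch of the step S5 update loop; the class name, the retrain callback standing in for step S3, and the reuse of the step S42 cosine scoring are illustrative assumptions:

    import numpy as np

    class FeatureLibrary:
        # Step S5 (sketch): grow the feature library, retrain the cluster
        # centers every P new strips, and prune the library to at most Q
        # features by dropping the Y - Q lowest-scoring (outlier) entries.
        def __init__(self, p=1000, q=10000):
            self.p, self.q = p, q              # P < Q, within the stated ranges
            self.feats, self.new_count = [], 0

        def add(self, nf_new, retrain):
            self.feats.append(nf_new)          # S51: update library, count X
            self.new_count += 1
            if self.new_count < self.p:        # S52: wait until X == P
                return None
            self.new_count = 0
            centers = retrain(np.asarray(self.feats))   # S53: re-cluster
            if len(self.feats) > self.q:       # S54: remove Y - Q outliers
                f = np.asarray(self.feats)
                sims = (f @ centers.T) / (
                    np.linalg.norm(f, axis=1, keepdims=True)
                    * np.linalg.norm(centers, axis=1))
                keep = np.argsort(sims.max(axis=1))[len(self.feats) - self.q:]
                self.feats = [self.feats[i] for i in sorted(keep)]
            return centers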
compared with the prior art, the invention has the following beneficial effects:
the invention does not need to manually pre-screen the colors of the bamboo strips for training, and meanwhile, the iterative color feature library is automatically updated in the using process to train a new model. The method has the advantages of strong adaptability, high accuracy and good implementability.
Drawings
Fig. 1 is an original image acquired by a camera according to an embodiment of the present invention.
FIG. 2 shows stitched color-patch images according to an embodiment of the present invention; fig. 2(a) is a stitched picture of light-colored bamboo strips, fig. 2(b) is a stitched picture of medium-colored bamboo strips, and fig. 2(c) is a stitched picture of dark-colored bamboo strips.
FIG. 3 is an overall flow chart of an embodiment of the present invention.
FIG. 4 is a color characterization diagram according to an embodiment of the present invention.
Detailed Description
The invention is further explained below with reference to the drawings and the embodiments.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of example embodiments according to the present application. As used herein, the singular forms "a", "an" and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
As shown in fig. 3, the present embodiment provides a bamboo color adaptive classification method based on machine learning, including the following steps:
step S1: collecting and extracting color information of a single bamboo strip, and preprocessing the color information of the single bamboo strip;
step S2: extracting color histogram features of the image obtained after preprocessing;
step S3: clustering color centers by using a k-means algorithm;
step S4: classifying colors of the new bamboo strips;
step S5: updating the color feature library NF_{1,2,...} and updating the color clustering centers.
In this embodiment, the step S1 specifically includes the following steps:
step S11: image acquisition: photographing an image of part of the bamboo strip surface with a color area-array camera to obtain picture I_1, as shown in fig. 1;
step S12: converting the color image into a grayscale image: converting the RGB channels of the color picture I_1 obtained by the camera into a grayscale channel with the formula Gray = 0.299R + 0.587G + 0.114B, where Gray denotes the grayscale value, to obtain grayscale image I_2;
step S13: image binarization: binarizing the grayscale image I_2 with the OTSU algorithm to obtain binarized image I_3;
step S14: extracting the bamboo chip mask: removing small particles from the binarized image I_3 with image morphological operations and extracting the largest particle region to obtain the bamboo chip mask region I_4;
step S15: extracting the bamboo chip color block: extracting the maximum inscribed rectangle of the bamboo chip mask region I_4 and cropping picture I_1 along this rectangle to obtain bamboo chip color block I_5;
step S16: judging whether the whole bamboo strip has been collected; if so, executing step S17, otherwise executing step S11;
step S17: splicing color blocks: splicing the image sequence I_5^{1,2,...} obtained in the above steps into one image I_6; this image I_6 is the preprocessed image.
In this embodiment, the step S2 specifically includes the following steps:
step S21: converting the image I_6 obtained by splicing color blocks from the RGB color space into the HSV color space to obtain image I_7;
step S22: extracting the color features of I_7: for the hue channel H, whose range is 0-180, the histogram bin width is set to 5, giving 36 H-channel bins; for the saturation channel S, no separate statistics are taken; for the brightness channel V, whose range is 0-255, the histogram bin width is set to 4, giving 64 V-channel bins, and a saturation threshold S_thresh in the range 0-255 is set to split the V-channel bins in two, giving 128 V-channel bins; this yields 164 color bins in total, and counting the histogram of the image pixels of I_6 over these bin ranges gives the histogram color feature F, a 1x164 feature vector;
step S23: feature normalization: normalizing the feature vector F with the formula

NF_i = F_i / (W × H)

where i denotes the index into the feature vector and W and H denote the width and height of image I_6, finally obtaining the normalized feature NF.
In this embodiment, the bin histogram counting in step S22 proceeds as follows:
when counting the histogram over the bins, it is generally unreasonable for a pixel value near the boundary of two bins to cast a vote of 1 into one bin and 0 into the other. This embodiment therefore adopts a linear voting method: let the pixel value be p, and let the two bins nearest to p be bin_i and bin_j, with corresponding centers C_i and C_j; then the votes cast into these two bins are

vote(bin_i) = |p - C_j| / |C_i - C_j|,  vote(bin_j) = |p - C_i| / |C_i - C_j|

This voting scheme avoids forcibly splitting continuous color features at bin boundaries, yields a more accurate color statistic, and improves the accuracy and robustness of the color classification.
In this embodiment, the step S3 specifically includes the following steps:
step S31: preparing N bamboo strips and repeating steps S1 and S2 to obtain N feature sequences NF_{1,2,...,N};
step S32: setting k to 3 and calculating the average luminance values of the N stitched images I_6 obtained in step S2; according to the distribution of these luminance values, taking k luminance values at equal intervals and selecting the color features of the pictures corresponding to these k luminance values as the initial centroids. For the k-means algorithm, randomly initializing the centroid positions affects the final convergence result; in this embodiment the centroids are initialized from the average brightness features, which keeps the convergence result of each training stable.
step S33: clustering k center points C_{1,2,...,k} with the k-means clustering algorithm;
step S34: determining the correspondence between the k center points and color depth: each cluster center C_{1,2,...,k} is measured with the formula

m_n = Σ_i i · C_{n,i}

where n denotes the cluster center index and i the color feature index, and only the V-channel bins are used in the sum; the size of m_n represents the color depth of the cluster center: the larger m_n, the lighter the color, and the smaller m_n, the darker the color; sorting the resulting m_{1,2,...,k} gives the color depth correspondence.
In this embodiment, the step S4 specifically includes the following steps:
step S41: repeating steps S1 and S2 on the new bamboo strip data to obtain a new feature sequence NF_new;
step S42: calculating the similarity between the feature sequence NF_new and each of the k center points obtained in step S33 with the metric formula

S_n = (Σ_i NF_{new,i} · C_{n,i}) / (√(Σ_i NF_{new,i}²) · √(Σ_i C_{n,i}²))

where S_n denotes the similarity between NF_new and the n-th center point C_n; S_n lies in the range [0,1], and the closer to 1, the more similar; computing this against all center points gives the metric result sequence S_{1,2,...,k}, and the color classification result is

G = argmax_n S_n

where G is the color classification result.
In this embodiment, the step S5 specifically includes the following steps:
step S51: adding the feature sequence NF_new obtained in step S41 to the feature library NF_{1,2,...}; counting the number X of newly added feature sequences NF_new and the total number Y of the updated feature library NF_{1,2,...};
step S52: setting the iterative training threshold P to 1000 (P ranges from 500 to 1000) and the maximum feature library size threshold Q to 10000 (Q ranges from 5000 to 10000), with P < Q; if X equals P, performing step S53, otherwise performing step S51;
step S53: executing step S3 and updating the color cluster center points;
step S54: if Y > Q, calculating the metric values between the updated feature library NF_{1,2,...} and the cluster centers as in step S42, selecting the Y - Q features with the smallest metric values as outlier features and removing them from the feature library; otherwise, continuing to execute step S51.
preferably, in this embodiment, as shown in fig. 2 and 4,
(1) N = 500 bamboo strips are prepared, and steps S1 and S2 are repeated to obtain 500 feature sequences NF_{1,2,...,500}.
(2) According to the factory's color classification requirement, k is set to 3 (or to 5). The average luminance values of the 500 stitched images I_6 obtained in step S2 are calculated. According to the distribution of these luminance values, k luminance values are taken at equal intervals, and the color features of the corresponding pictures are selected as the initial centroids. For the k-means algorithm, randomly initializing the centroid positions affects the final convergence result; the invention therefore initializes the centroids from the average brightness features, which keeps the convergence result of each training stable.
(3) k center points C_{1,2,...,k} are clustered following the steps of the conventional k-means clustering algorithm.
(4) The correspondence between the k center points and color depth is determined: each cluster center C_{1,2,...,k} is measured with the formula

m_n = Σ_i i · C_{n,i}

where n denotes the cluster center index and i the color feature index, with only the V-channel bins used in the sum. The size of m_n represents the color depth of the cluster center: the larger m_n, the lighter the color, and the smaller m_n, the darker the color. Sorting the resulting m_{1,2,...,k} gives the color depth correspondence.
(5) The colors of the new bamboo strips are classified;
step (5) specifically comprises the following steps:
(5-1) repeating steps S1 and S2 on the new bamboo strip data to obtain a feature sequence NF_new;
(5-2) calculating the similarity between the feature sequence NF_new and each of the k center points obtained in step (3) with the metric formula

S_n = (Σ_i NF_{new,i} · C_{n,i}) / (√(Σ_i NF_{new,i}²) · √(Σ_i C_{n,i}²))

where S_n denotes the similarity between NF_new and the n-th center point C_n; S_n lies in the range [0,1], and the closer to 1, the more similar. Computing this against all center points gives the metric result sequence S_{1,2,...,k}, and the color classification result is

G = argmax_n S_n

where G is the color classification result.
(6) The color feature library NF_{1,2,...} is updated and the color model iteratively trained;
step (6) specifically comprises the following steps:
(6-1) adding the feature sequence NF_new obtained in (5-1) to the feature library NF_{1,2,...}, and counting the number X of newly added feature sequences NF_new and the total number Y of the feature library NF_{1,2,...};
(6-2) setting the iterative training threshold P to 500 and the maximum feature library size threshold Q to 5000; performing step (6-3) if X equals P, otherwise performing step (6-1);
(6-3) executing steps (3) and (4), retraining the model and updating the color cluster center points;
(6-4) if Y > Q, calculating the metric values between the feature library NF_{1,2,...} and the cluster centers according to step (5-2), selecting the Y - Q features with the smallest metric values as outlier features and removing them from the feature library; otherwise, continuing to execute step (6-1);
(6-5) re-clustering using the cluster center points obtained in (6-3) as the initial points.
Preferably, this embodiment only requires a certain number of bamboo strips to be provided, and automatically learns the bamboo strip color distribution and classification model through adaptive analysis and judgment of the color changes.
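For completeness, a toy smoke test wiring the sketches above together on random stand-in data (real inputs would come from the camera pipeline; all names and values here are illustrative):

    import numpy as np

    rng = np.random.default_rng(0)
    feats = rng.random((500, 164))               # stand-ins for 500 NF vectors
    bright = rng.random(500) * 255               # stand-ins for mean luminances
    centers = cluster_colors(feats, bright, k=3) # steps (1)-(4)
    nf_new = rng.random(164)                     # feature of a new strip
    print("grade:", classify(nf_new, centers))   # step (5): grade index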
The above description is only a preferred embodiment of the present invention, and all equivalent changes and modifications made in accordance with the claims of the present invention should be covered by the present invention.

Claims (7)

1. A bamboo strip color self-adaptive classification method based on machine learning is characterized in that: the method comprises the following steps:
step S1: collecting and extracting color information of a single bamboo strip, and preprocessing the color information of the single bamboo strip;
step S2: extracting color histogram features of the image obtained after preprocessing;
step S3: clustering color centers by using a k-means algorithm;
step S4: classifying colors of the new bamboo strips;
step S5: updating the color feature library NF_{1,2,...} and updating the color clustering centers.
2. The machine learning-based bamboo strip color self-adaptive classification method according to claim 1, characterized in that step S1 specifically comprises the following steps:
step S11: image acquisition: photographing an image of part of the bamboo strip surface with a color area-array camera to obtain picture I_1;
step S12: converting the color image into a grayscale image: converting the RGB channels of the color picture I_1 obtained by the camera into a grayscale channel with the formula Gray = 0.299R + 0.587G + 0.114B, where Gray denotes the grayscale value, to obtain grayscale image I_2;
step S13: image binarization: binarizing the grayscale image I_2 with the OTSU algorithm to obtain binarized image I_3;
step S14: extracting the bamboo chip mask: removing small particles from the binarized image I_3 with image morphological operations and extracting the largest particle region to obtain the bamboo chip mask region I_4;
step S15: extracting the bamboo chip color block: extracting the maximum inscribed rectangle of the bamboo chip mask region I_4 and cropping picture I_1 along this rectangle to obtain bamboo chip color block I_5;
step S16: judging whether the whole bamboo strip has been collected; if so, executing step S17, otherwise executing step S11;
step S17: splicing color blocks: splicing the image sequence I_5^{1,2,...} obtained in the above steps into one image I_6; this image I_6 is the preprocessed image.
3. The machine learning-based bamboo strip color self-adaptive classification method according to claim 1, characterized in that step S2 specifically comprises the following steps:
step S21: converting the image I_6 obtained by splicing color blocks from the RGB color space into the HSV color space to obtain image I_7;
step S22: extracting the color features of I_7: for the hue channel H, whose range is 0-180, the histogram bin width is set to 5, giving 36 H-channel bins; for the saturation channel S, no separate statistics are taken; for the brightness channel V, whose range is 0-255, the histogram bin width is set to 4, giving 64 V-channel bins, and a saturation threshold S_thresh in the range 0-255 is set to split the V-channel bins in two, giving 128 V-channel bins; this yields 164 color bins in total, and counting the histogram of the image pixels of I_6 over these bin ranges gives the histogram color feature F, a 1x164 feature vector;
step S23: feature normalization: normalizing the feature vector F with the formula

NF_i = F_i / (W × H)

where i denotes the index into the feature vector and W and H denote the width and height of image I_6, finally obtaining the normalized feature NF.
4. The machine learning-based bamboo strip color self-adaptive classification method according to claim 3, characterized in that the bin histogram counting in step S22 proceeds as follows:
when counting the histogram over the bins, a linear voting method is adopted: let the pixel value be p, and let the two bins nearest to p be bin_i and bin_j, with corresponding centers C_i and C_j; then the votes cast into these two bins are

vote(bin_i) = |p - C_j| / |C_i - C_j|,  vote(bin_j) = |p - C_i| / |C_i - C_j|

This voting scheme avoids forcibly splitting continuous color features at bin boundaries, yields a more accurate color statistic, and improves the accuracy and robustness of the color classification.
5. The machine learning-based bamboo strip color self-adaptive classification method according to claim 1, characterized in that step S3 specifically comprises the following steps:
step S31: preparing N bamboo strips and repeating steps S1 and S2 to obtain N feature sequences NF_{1,2,...,N};
step S32: setting the color class number k according to the factory's color classification requirement, with k ranging from 3 to 5; calculating the average luminance values of the N stitched images I_6 obtained in step S2; according to the distribution of these luminance values, taking k luminance values at equal intervals and selecting the color features of the pictures corresponding to these k luminance values as the initial centroids;
step S33: clustering k center points C_{1,2,...,k} with the k-means clustering algorithm;
step S34: determining the correspondence between the k center points and color depth: each cluster center C_{1,2,...,k} is measured with the formula

m_n = Σ_i i · C_{n,i}

where n denotes the cluster center index and i the color feature index, and only the V-channel bins are used in the sum; the size of m_n represents the color depth of the cluster center: the larger m_n, the lighter the color, and the smaller m_n, the darker the color; sorting the resulting m_{1,2,...,k} gives the color depth correspondence.
6. The machine learning-based bamboo strip color self-adaptive classification method according to claim 5, characterized in that step S4 specifically comprises the following steps:
step S41: repeating steps S1 and S2 on the new bamboo strip data to obtain a new feature sequence NF_new;
step S42: calculating the similarity between the feature sequence NF_new and each of the k center points obtained in step S33 with the metric formula

S_n = (Σ_i NF_{new,i} · C_{n,i}) / (√(Σ_i NF_{new,i}²) · √(Σ_i C_{n,i}²))

where S_n denotes the similarity between NF_new and the n-th center point C_n; S_n lies in the range [0,1], and the closer to 1, the more similar; computing this against all center points gives the metric result sequence S_{1,2,...,k}, and the color classification result is

G = argmax_n S_n

where G is the color classification result.
7. The machine learning-based bamboo strip color self-adaptive classification method according to claim 6, characterized in that step S5 specifically comprises the following steps:
step S51: adding the feature sequence NF_new obtained in step S41 to the feature library NF_{1,2,...}; counting the number X of newly added feature sequences NF_new and the total number Y of the updated feature library NF_{1,2,...};
step S52: setting an iterative training threshold P in the range 500-1000 and a maximum feature library size Q in the range 5000-10000, with P < Q; if X equals P, performing step S53, otherwise performing step S51;
step S53: executing step S3 and updating the color cluster center points;
step S54: if Y > Q, calculating the metric values between the updated feature library NF_{1,2,...} and the cluster centers as in step S42, selecting the Y - Q features with the smallest metric values as outlier features and removing them from the feature library; otherwise, continuing to execute step S51.
CN202010302809.5A 2020-04-17 2020-04-17 Bamboo strip color self-adaptive classification method based on machine learning Active CN111563536B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010302809.5A CN111563536B (en) 2020-04-17 2020-04-17 Bamboo strip color self-adaptive classification method based on machine learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010302809.5A CN111563536B (en) 2020-04-17 2020-04-17 Bamboo strip color self-adaptive classification method based on machine learning

Publications (2)

Publication Number Publication Date
CN111563536A true CN111563536A (en) 2020-08-21
CN111563536B CN111563536B (en) 2023-04-14

Family

ID=72071606

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010302809.5A Active CN111563536B (en) 2020-04-17 2020-04-17 Bamboo strip color self-adaptive classification method based on machine learning

Country Status (1)

Country Link
CN (1) CN111563536B (en)

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112784854A (en) * 2020-12-30 2021-05-11 成都云盯科技有限公司 Method, device and equipment for segmenting and extracting clothing color based on mathematical statistics

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011210111A (en) * 2010-03-30 2011-10-20 Nippon Telegr & Teleph Corp <Ntt> Image feature quantity generation device, method and program
CN105344618A (en) * 2015-10-21 2016-02-24 国家林业局北京林业机械研究所 Rectangular bamboo chip waning defect and color sorting method
CN107516331A (en) * 2017-08-11 2017-12-26 广西师范大学 A kind of bamboo cane method for sorting colors and system
CN108460380A (en) * 2018-03-13 2018-08-28 广西师范大学 A kind of bamboo cane method for sorting colors and system based on domain color
CN109858521A (en) * 2018-12-29 2019-06-07 国际竹藤中心 A kind of bamboo category identification method based on artificial intelligence deep learning


Also Published As

Publication number Publication date
CN111563536B (en) 2023-04-14

Similar Documents

Publication Publication Date Title
CN111738064B (en) Haze concentration identification method for haze image
CN108181316B (en) Bamboo strip defect detection method based on machine vision
CN109409355B (en) Novel transformer nameplate identification method and device
CN107392968B (en) The image significance detection method of Fusion of Color comparison diagram and Color-spatial distribution figure
CN107729812B (en) Method suitable for recognizing vehicle color in monitoring scene
CN109145848B (en) Wheat ear counting method
CN101162503A (en) Method for extracting and recognizing human ear characteristic by improved Hausdorff distance
Niu et al. Image segmentation algorithm for disease detection of wheat leaves
CN108009567B (en) Automatic excrement character distinguishing method combining image color and HOG and SVM
CN112906550B (en) Static gesture recognition method based on watershed transformation
CN109815923B (en) Needle mushroom head sorting and identifying method based on LBP (local binary pattern) features and deep learning
CN111950654B (en) Magic cube color block color reduction method based on SVM classification
CN115375690A (en) Tongue picture putrefaction classification and identification method
CN111260645A (en) Method and system for detecting tampered image based on block classification deep learning
CN113724339B (en) Color space feature-based color separation method for tiles with few samples
CN113012156B (en) Intelligent solid wood board color classification method
CN111563536B (en) Bamboo strip color self-adaptive classification method based on machine learning
CN111046838A (en) Method and device for identifying wetland remote sensing information
CN113052234A (en) Jade classification method based on image features and deep learning technology
CN108765426A (en) automatic image segmentation method and device
CN113408573A (en) Method and device for automatically classifying and classifying tile color numbers based on machine learning
CN112802074A (en) Textile flaw detection method based on illumination correction and visual saliency characteristics
CN110929740A (en) LGBM model-based tongue quality and tongue coating separation method
CN103871084B (en) Indigo printing fabric pattern recognition method
CN112364844B (en) Data acquisition method and system based on computer vision technology

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant