CN111563536B - Bamboo strip color self-adaptive classification method based on machine learning

Info

Publication number: CN111563536B
Authority: CN (China)
Prior art keywords: color, image, bamboo, channel, characteristic
Legal status: Active (granted)
Application number: CN202010302809.5A
Other languages: Chinese (zh)
Other versions: CN111563536A (en)
Inventors: 杨和, 刘文哲, 童同, 高钦泉
Current Assignee: Fujian Imperial Vision Information Technology Co ltd
Original Assignee: Fujian Imperial Vision Information Technology Co ltd
Application filed by Fujian Imperial Vision Information Technology Co ltd
Priority to CN202010302809.5A
Publication of CN111563536A
Application granted
Publication of CN111563536B

Classifications

    • G06F18/24 Pattern recognition; classification techniques
    • G06F18/23213 Clustering techniques using statistics or function optimisation, e.g. k-means clustering with a fixed number of clusters
    • G06V10/56 Extraction of image or video features relating to colour
    • Y02P90/30 Computing systems specially adapted for manufacturing

Abstract

The invention relates to a machine-learning-based bamboo strip color adaptive classification method that automatically adjusts its color classification standard as the color of incoming bamboo strips changes. The method comprises collecting bamboo strip images, segmenting the bamboo strip regions, extracting bamboo strip color features, clustering those color features, and classifying bamboo strip colors. Targeting the characteristics of bamboo color, the invention proposes an improved color histogram feature that represents bamboo strip color more accurately. The method requires no prior knowledge from manually classified bamboo strip color data, and offers strong adaptability and high accuracy.

Description

Bamboo strip color self-adaptive classification method based on machine learning
Technical Field
The invention relates to the technical field of machine learning, in particular to a bamboo strip color self-adaptive classification method based on machine learning.
Background
With increasing environmental-protection pressure in recent years, the bamboo product processing industry faces new opportunities and challenges, and its development has accelerated in countries and regions with high bamboo yield. However, the domestic bamboo industry is generally limited by a low degree of automation and a shortage of skilled workers, which seriously hinders its rapid development.
The processing of moso bamboo and semi-finished bamboo products produced in China mainly comprises cutting, splitting, rough planing, finish planing, pressing, and similar procedures. Bamboo is easily affected by the production environment, the growth cycle, and the carbonization process, so the surface color of bamboo shows obvious variation in shade, which leads to uneven color after pressing. Therefore, to improve the production quality of bamboo products, color classification is required.
At present, most bamboo processing factories rely mainly on manual sorting of bamboo strips by the color depth of their surfaces, with classification standards such as extra dark, medium, light, and extra light set according to the actual conditions of each factory. However, manual sorting suffers from high demands on worker expertise, high operating cost, low efficiency, and unstable sorting quality.
To address the above industry pain points, research on the color classification of bamboo strips has been conducted. Existing approaches extract average color and texture features of bamboo chips and then classify color with a Bayes classifier; extract color features in the HSV color space and then classify bamboo strip color with an SVM; or use L*a*b* color space features and a BP neural network for color classification of bamboo chips.
Although there are many related studies, they all require manually color-screening a certain number of bamboo strips in advance and then collecting bamboo strip data to train a model. Because the surface color of bamboo strips is influenced by climate and other environmental factors, these methods cannot cover the color classification of bamboo strips produced by different raw-material suppliers in different seasons. In such cases, color classification of the bamboo strip surface requires manually re-screening the bamboo strip colors and then retraining the color classification model. This procedure also easily introduces manual screening errors, reducing the stability and practicality of color classification. Methods that classify bamboo strip color from manually set gray thresholds consider only gray information and are easily affected by changes in the lighting environment.
Disclosure of Invention
In view of this, the present invention provides a machine-learning-based bamboo strip color adaptive classification method that automatically learns bamboo color levels with good robustness and accuracy.
The invention is realized by adopting the following scheme: a bamboo strip color self-adaptive classification method based on machine learning comprises the following steps:
step S1: collecting and extracting the color information of a single bamboo strip, and preprocessing it;
step S2: extracting color histogram features from the preprocessed image;
step S3: clustering color centers using the k-means algorithm;
step S4: classifying the colors of new bamboo strips;
step S5: updating the color feature library NF_{1,2,...} and updating the color clustering centers.
Further, the step S1 specifically includes the following steps:
step S11: image acquisition: photographing the surface of part of a bamboo strip with a color area-array camera to obtain picture I1;
step S12: converting the color image into a gray image: the RGB channels of the camera picture I1 are converted into a gray channel by Gray = 0.299*R + 0.587*G + 0.114*B, where Gray denotes the gray value, obtaining the gray image I2;
step S13: image binarization: binarizing the gray image I2 with the OTSU algorithm to obtain the binarized image I3;
step S14: extracting the bamboo chip mask: removing small particles from the binarized image I3 by image morphological operations and extracting the largest particle region to obtain the bamboo chip mask region I4;
step S15: extracting the bamboo chip color block: computing the maximum inscribed rectangle of the bamboo chip mask region I4 and cropping picture I1 by it to obtain the bamboo chip color block I5;
step S16: judging whether the whole bamboo strip has been collected; if so, executing step S17, otherwise executing step S11;
step S17: splicing the color blocks: stitching the image sequence I5_{1,2,...} obtained in the above steps into one image I6; image I6 is the preprocessed image.
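For illustration, the following is a minimal Python/OpenCV sketch of the step S1 pipeline (steps S11-S17). It is not taken from the patent: the function names are invented, the morphology kernel size is an assumption, and the maximum inscribed rectangle of step S15 is approximated by the region's bounding rectangle.

    import cv2
    import numpy as np

    def preprocess_strip(i1_bgr):
        """Steps S11-S15: extract the bamboo chip color block I5 from one camera image I1."""
        i2 = cv2.cvtColor(i1_bgr, cv2.COLOR_BGR2GRAY)  # S12: gray conversion (0.299/0.587/0.114 weights)
        _, i3 = cv2.threshold(i2, 0, 255, cv2.THRESH_BINARY + cv2.THRESH_OTSU)  # S13: OTSU binarization
        kernel = cv2.getStructuringElement(cv2.MORPH_RECT, (5, 5))
        i3 = cv2.morphologyEx(i3, cv2.MORPH_OPEN, kernel)  # S14: remove small particles
        _, labels, stats, _ = cv2.connectedComponentsWithStats(i3)
        largest = 1 + int(np.argmax(stats[1:, cv2.CC_STAT_AREA]))  # largest non-background region
        x, y, w, h = stats[largest, :4]  # S15: crop I1 by the region's rectangle
        return i1_bgr[y:y + h, x:x + w]

    def stitch_patches(patches):
        """Step S17: stitch the color blocks I5 of one whole strip into the image I6."""
        width = min(p.shape[1] for p in patches)
        rows = [cv2.resize(p, (width, max(1, p.shape[0] * width // p.shape[1]))) for p in patches]
        return np.vstack(rows)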
Further, the step S2 specifically includes the following steps:
step S21: converting the stitched color-block image I6 from the RGB color space to the HSV color space to obtain image I7;
step S22: extracting the color features of I7: for the hue channel H, whose range is 0-180, the histogram bin width is set to 5, giving 36 H-channel bins; the saturation channel S is not counted as a separate feature; for the value channel V, whose range is 0-255, the histogram bin width is set to 4, giving 64 V-channel bins, and a saturation threshold S_thresh (in the range 0-255) splits these V-channel bins into two groups, giving 128 V-channel bins; in total there are 164 color bins, and counting the histogram of the image pixels of I7 over these bin ranges gives the histogram color feature F, a 1×164 feature vector;
step S23: feature normalization: normalizing the feature vector F with the following formula:
NF_i = F_i / (W * H)
where i denotes the index into the feature vector and W and H denote the width and height of image I6, respectively; this finally yields the normalized feature NF.
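As an illustrative sketch of step S2 (not from the patent: hard binning is used here, with the linear-voting refinement described next shown separately; OpenCV's HSV ranges of H in 0-180 and S, V in 0-255 are assumed, and the default saturation threshold of 30 follows the embodiment below):

    import cv2
    import numpy as np

    def color_feature(i6_bgr, s_thresh=30):
        """Steps S21-S23: the 164-bin normalized color feature NF of the stitched image I6."""
        i7 = cv2.cvtColor(i6_bgr, cv2.COLOR_BGR2HSV)  # S21: RGB -> HSV
        h, s, v = i7[:, :, 0], i7[:, :, 1], i7[:, :, 2]
        f_h = np.histogram(h, bins=36, range=(0, 180))[0]  # 36 hue bins of width 5
        # 64 value bins of width 4, duplicated into two groups by the saturation threshold
        f_v_low = np.histogram(v[s < s_thresh], bins=64, range=(0, 256))[0]
        f_v_high = np.histogram(v[s >= s_thresh], bins=64, range=(0, 256))[0]
        f = np.concatenate([f_h, f_v_low, f_v_high]).astype(np.float64)  # 36 + 128 = 164 bins
        return f / (i6_bgr.shape[0] * i6_bgr.shape[1])  # S23: normalize by W * H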
Further, the histogram counting over bins in step S22 is performed as follows:
when counting the bin histogram, a linear voting method is adopted: for a pixel value p whose two nearest bins are bin_i and bin_j, with corresponding centers C_i and C_j, the votes cast to these two bins are:
vote_i = (C_j − p) / (C_j − C_i),  vote_j = (p − C_i) / (C_j − C_i)
the voting mode can avoid forcibly dividing continuous color features, can obtain a more accurate color statistical chart, and can improve the accuracy and robustness of color classification.
Further, the step S3 specifically includes the following steps:
step S31: preparing N bamboo strips and repeating step S1 and step S2 to obtain N feature sequences NF_{1,2,...};
step S32: setting the number of color classes k according to the factory's color classification requirement, with k in the range 3-5; computing the average luminance value of each of the N stitched images I6 obtained in step S2; according to the distribution of these luminance values, taking k luminance values at equal spacing and selecting the color features of the pictures corresponding to the k luminance values as the initial centroids;
step S33: clustering k center points C_{1,2,...,k} with the k-means clustering algorithm;
step S34: determining the correspondence between the k center points and color depth: the cluster centers C_{1,2,...,k} are measured with the following formula:
m_n = Σ_i (i * C_n,i)
where n denotes the cluster center index and i the color feature index, and only the V channel is used in the measurement; the size of m_n represents the color depth of the cluster center: a larger m_n indicates a lighter color and a smaller m_n a darker color; sorting the resulting m_{1,2,...,k} gives the correspondence to color depth.
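A sketch of step S3, assuming scikit-learn's KMeans. The brightness-spaced seeding follows step S32 (taking the average luminance as the mean of the V channel), and the depth metric follows the weighted-sum reading of m_n given above; both readings are interpretations, since the patent states the formulas only as images.

    import numpy as np
    from sklearn.cluster import KMeans

    def cluster_colors(features, brightness, k=3):
        """features: (N, 164) NF vectors; brightness: (N,) mean V value per stitched image."""
        # S32: seed with the samples whose brightness is spaced evenly over the observed range
        targets = np.linspace(brightness.min(), brightness.max(), k)
        seeds = [int(np.argmin(np.abs(brightness - t))) for t in targets]
        km = KMeans(n_clusters=k, init=features[seeds], n_init=1).fit(features)  # S33
        centers = km.cluster_centers_
        # S34: m_n as a weighted sum over the 128 V-channel bins; larger m_n = lighter
        weights = np.tile(np.arange(64), 2)  # V bin index within each saturation group
        m = centers[:, 36:] @ weights
        order = np.argsort(m)  # cluster indices ordered from darkest to lightest
        return centers, order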
Further, the step S4 specifically includes the following steps:
step S41: repeating step S1 and step S2 on the new bamboo strip data to obtain a new feature sequence NF_new;
step S42: calculating the distance between the feature sequence NF_new and the k center points obtained in step S33, with the following measurement formula:
S_n = (NF_new · C_n) / (‖NF_new‖ ‖C_n‖)
where S_n denotes the similarity between NF_new and the n-th center point C_n; S_n ranges over [0, 1], and values closer to 1 indicate greater similarity; computing against all center points yields the measurement result sequence S_{1,2,...,k}, and the color classification result is computed as:
g = argmax_n S_n
g is the result of color classification.
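A sketch of the step S4 decision, with S_n taken as the cosine similarity used in the reconstruction above; the patent gives the similarity formula only as an image, so any metric mapping non-negative histograms into [0, 1] with the same argmax rule fits the description equally well.

    import numpy as np

    def classify(nf_new, centers):
        """Step S42: return g = argmax_n S_n over the k cluster centers, plus all S_n."""
        sims = centers @ nf_new / (
            np.linalg.norm(centers, axis=1) * np.linalg.norm(nf_new) + 1e-12)
        return int(np.argmax(sims)), sims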
Further, the step S5 specifically includes the following steps:
step S51: adding the feature sequence NF_new obtained in step S41 to the feature library NF_{1,2,...}, counting the number X of newly added feature sequences NF_new, and counting the total number Y of the updated feature library NF_{1,2,...};
step S52: setting an iterative training threshold P in the range 500-1000 and a feature library maximum size threshold Q in the range 5000-10000, with P < Q; if X equals P, executing step S53, otherwise executing step S51;
step S53: executing step S3 to update the color clustering centers;
step S54: if Y > Q, computing, as in step S42, the metric between each feature in the updated library NF_{1,2,...} and the clustering centers, selecting the Y − Q features with the smallest metric values as outlier features, and removing them from the feature library; otherwise, continuing with step S51.
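A sketch of the step S5 maintenance loop under the thresholds P and Q; the class name and the reuse of the step S4 similarity for outlier scoring are assumptions rather than the patent's specification.

    import numpy as np

    class FeatureLibrary:
        """Steps S51-S54: grow the library, retrain every P additions, cap the size at Q."""
        def __init__(self, p=1000, q=10000):
            self.p, self.q = p, q
            self.feats, self.new_count = [], 0

        def add(self, nf_new, retrain_fn, centers):
            self.feats.append(nf_new)  # S51
            self.new_count += 1
            if self.new_count == self.p:  # S52/S53: periodic retraining
                centers = retrain_fn(np.array(self.feats))
                self.new_count = 0
            if len(self.feats) > self.q:  # S54: remove the Y - Q least similar features
                f = np.array(self.feats)
                sims = (f @ centers.T / (
                    np.linalg.norm(f, axis=1, keepdims=True)
                    * np.linalg.norm(centers, axis=1) + 1e-12)).max(axis=1)
                keep = np.argsort(sims)[len(self.feats) - self.q:]
                self.feats = [self.feats[i] for i in sorted(keep)]
            return centers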
compared with the prior art, the invention has the following beneficial effects:
the invention does not need to manually pre-screen the colors of the bamboo strips for training, and simultaneously automatically updates the iterative color feature library in the using process to train a new model. The method has the advantages of strong adaptability, high accuracy and good implementability.
Drawings
Fig. 1 is an original image acquired by a camera according to an embodiment of the present invention.
FIG. 2 shows stitched color-block images according to an embodiment of the present invention, where FIG. 2(a) is the stitched picture of light-color bamboo strips, FIG. 2(b) that of medium-color bamboo strips, and FIG. 2(c) that of dark-color bamboo strips.
FIG. 3 is a general flow diagram of an embodiment of the present invention.
FIG. 4 is a color characterization diagram according to an embodiment of the present invention.
Detailed Description
The invention is further explained below with reference to the drawings and the embodiments.
It should be noted that the following detailed description is exemplary and is intended to provide further explanation of the disclosure. Unless defined otherwise, all technical and scientific terms used herein have the same meaning as commonly understood by one of ordinary skill in the art to which this application belongs.
It is noted that the terminology used herein is for the purpose of describing particular embodiments only and is not intended to be limiting of exemplary embodiments according to the present application. As used herein, the singular forms "a", "an", and "the" are intended to include the plural forms as well, and it should be understood that when the terms "comprises" and/or "comprising" are used in this specification, they specify the presence of stated features, steps, operations, devices, components, and/or combinations thereof, unless the context clearly indicates otherwise.
As shown in fig. 3, the present embodiment provides a bamboo color adaptive classification method based on machine learning, including the following steps:
step S1: collecting and extracting the color information of a single bamboo strip, and preprocessing it;
step S2: extracting color histogram features from the preprocessed image;
step S3: clustering color centers using the k-means algorithm;
step S4: classifying the colors of new bamboo strips;
step S5: updating the color feature library NF_{1,2,...} and updating the color clustering centers.
In this embodiment, the step S1 specifically includes the following steps:
step S11: image acquisition: photographing the surface of part of a bamboo strip with a color area-array camera to obtain picture I1, as shown in fig. 1;
step S12: converting the color image into a gray image: the RGB channels of the camera picture I1 are converted into a gray channel by Gray = 0.299*R + 0.587*G + 0.114*B, where Gray denotes the gray value, obtaining the gray image I2;
step S13: image binarization: binarizing the gray image I2 with the OTSU algorithm to obtain the binarized image I3;
step S14: extracting the bamboo chip mask: removing small particles from the binarized image I3 by image morphological operations and extracting the largest particle region as the bamboo chip mask region I4;
step S15: extracting the bamboo chip color block: computing the maximum inscribed rectangle of the bamboo chip mask region I4 and cropping picture I1 by it to obtain the bamboo chip color block I5;
step S16: judging whether the whole bamboo strip has been collected; if so, executing step S17, otherwise executing step S11;
step S17: splicing the color blocks: stitching the image sequence I5_{1,2,...} obtained in the above steps into one image I6; image I6 is the preprocessed image.
In this embodiment, the step S2 specifically includes the following steps:
step S21: converting the stitched color-block image I6 from the RGB color space to the HSV color space to obtain image I7;
step S22: extracting the color features of I7: for the hue channel H, whose range is 0-180, the histogram bin width is set to 5, giving 36 H-channel bins; the saturation channel S is not counted as a separate feature; for the value channel V, whose range is 0-255, the histogram bin width is set to 4, giving 64 V-channel bins, and a saturation threshold S_thresh = 30 (the threshold range is 0-255) splits these V-channel bins into two groups, giving 128 V-channel bins; in total there are 164 color bins, and counting the histogram of the image pixels of I7 over these bin ranges gives the histogram color feature F, a 1×164 feature vector;
step S23: feature normalization: normalizing the feature vector F with the following formula:
NF_i = F_i / (W * H)
where i denotes the index into the feature vector and W and H denote the width and height of image I6, respectively; this finally yields the normalized feature NF.
In this embodiment, the histogram counting over bins in step S22 is performed as follows:
when counting the bin histogram, if a pixel value falls near the edge between two bins, giving one bin a vote of 1 and the other a vote of 0 is generally unreasonable. This embodiment therefore adopts a linear voting method: for a pixel value p whose two nearest bins are bin_i and bin_j, with corresponding centers C_i and C_j, the votes cast to these two bins are:
vote_i = (C_j − p) / (C_j − C_i),  vote_j = (p − C_i) / (C_j − C_i)
the voting mode can avoid forcibly dividing continuous color features, can obtain a more accurate color statistical chart, and can improve the accuracy and robustness of color classification.
In this embodiment, the step S3 specifically includes the following steps:
step S31: preparing N bamboo strips and repeating step S1 and step S2 to obtain N feature sequences NF_{1,2,...};
step S32: setting k = 3 and computing the average luminance value of each of the N stitched images I6 obtained in step S2; according to the distribution of these luminance values, taking k luminance values at equal spacing and selecting the color features of the corresponding pictures as the initial centroids. For the k-means algorithm, randomly initializing the centroid positions affects the final convergence result; in this embodiment the centroids are instead initialized from the average luminance feature, ensuring that the convergence result obtained in each training run is stable.
Step S33: clustering k central points C according to a k-means clustering algorithm {1,2,...,k}
Step S34: determining the corresponding relation between the k central points and the color depth: to the cluster central point C {1,2,...,k} And (3) measuring, wherein the measurement formula is as follows:
m_n = Σ_i (i * C_n,i)
where n denotes the cluster center index and i the color feature index, and only the V channel is used in the measurement; the size of m_n represents the color depth of the cluster center: a larger m_n indicates a lighter color and a smaller m_n a darker color; sorting the resulting m_{1,2,...,k} gives the correspondence to color depth.
In this embodiment, the step S4 specifically includes the following steps:
step S41: repeating step S1 and step S2 on the new bamboo strip data to obtain a new feature sequence NF_new;
step S42: calculating the distance between the feature sequence NF_new and the k center points obtained in step S33, with the following measurement formula:
S_n = (NF_new · C_n) / (‖NF_new‖ ‖C_n‖)
where S_n denotes the similarity between NF_new and the n-th center point C_n; S_n ranges over [0, 1], and values closer to 1 indicate greater similarity; computing against all center points yields the measurement result sequence S_{1,2,...,k}, and the color classification result is computed as:
g = argmax_n S_n
g is the result of color classification.
In this embodiment, the step S5 specifically includes the following steps:
step S51: adding the feature sequence NF_new obtained in step S41 to the feature library NF_{1,2,...}, counting the number X of newly added feature sequences NF_new, and counting the total number Y of the updated feature library NF_{1,2,...};
step S52: setting the iterative training threshold P = 1000 (P ranges over 500-1000) and the feature library maximum size threshold Q = 10000 (Q ranges over 5000-10000), with P < Q; if X equals P, executing step S53, otherwise executing step S51;
step S53: executing step S3 to update the color clustering centers;
step S54: if Y > Q, computing, as in step S42, the metric between each feature in the updated library NF_{1,2,...} and the clustering centers, selecting the Y − Q features with the smallest metric values as outlier features, and removing them from the feature library; otherwise, continuing with step S51.
preferably, in the present embodiment, as shown in fig. 2 and 4,
(1) Prepare N = 500 bamboo strips and repeat step S1 and step S2 to obtain 500 feature sequences NF_{1,2,...}.
(2) Set k = 3 or k = 5 according to the factory's color classification requirement. Compute the average luminance value of each of the 500 stitched images I6 obtained in step S2. According to the distribution of these luminance values, take k luminance values at equal spacing and select the color features of the corresponding pictures as the initial centroids. For the k-means algorithm, randomly initializing the centroid positions affects the final convergence result; the centroid selection in the invention instead uses the average luminance feature as the basis for initialization, ensuring that the convergence result obtained in each training run is stable.
(3) Cluster k center points C_{1,2,...,k} following the steps of the conventional k-means clustering algorithm.
(4) Determine the correspondence between the k center points and color depth: the cluster centers C_{1,2,...,k} are measured with the following formula:
m_n = Σ_i (i * C_n,i)
where n denotes the cluster center index and i the color feature index; only the V channel is used in the measurement. The size of m_n represents the color depth of the cluster center: a larger m_n indicates a lighter color and a smaller m_n a darker color. Sorting the resulting m_{1,2,...,k} gives the correspondence to color depth.
(5) Classify the colors of the new bamboo strips.
Step (5) specifically comprises the following steps:
(5-1) repeating step S1 and step S2 on the new bamboo strip data to obtain the feature sequence NF_new;
(5-2) calculating the distance between the feature sequence NF_new and the k center points obtained in step (3), with the following measurement formula:
S_n = (NF_new · C_n) / (‖NF_new‖ ‖C_n‖)
where S_n denotes the similarity between NF_new and the n-th center point C_n; S_n ranges over [0, 1], and values closer to 1 indicate greater similarity. Computing against all center points yields the measurement result sequence S_{1,2,...,k}, and the color classification result is computed as:
g = argmax_n S_n
g is the result of color classification.
(6) Update the color feature library NF_{1,2,...} and iteratively train the color model.
Step (6) specifically comprises the following steps:
(6-1) adding the feature sequence NF_new obtained in step (5-1) to the feature library NF_{1,2,...}, counting the number X of newly added feature sequences NF_new, and counting the total number Y of the feature library NF_{1,2,...};
(6-2) setting the iterative training threshold P = 500 and the feature library maximum size threshold Q = 5000; if X equals P, executing step (6-3), otherwise executing step (6-1);
(6-3) executing steps (3) and (4) to retrain the model;
(6-4) if Y > Q, computing, as in step (5-2), the metric between each feature in the library NF_{1,2,...} and the clustering centers, selecting the Y − Q features with the smallest metric values as outlier features, and removing them from the feature library; otherwise, continuing with step (6-1);
(6-5) re-clustering using the cluster centers obtained in step (6-3) as the initial points.
Preferably, this embodiment only needs to be provided with a certain number of bamboo strips, and it automatically learns the bamboo strip color distribution and the classification model through adaptive analysis and judgment of the color changes.
The above description is only a preferred embodiment of the present invention, and all equivalent changes and modifications made in accordance with the claims of the present invention should be covered by the present invention.

Claims (2)

1. A bamboo strip color self-adaptive classification method based on machine learning is characterized in that: the method comprises the following steps:
step S1: collecting and extracting the color information of a single bamboo strip, and preprocessing it;
step S2: extracting color histogram features from the preprocessed image;
step S3: clustering color centers using the k-means algorithm;
step S4: classifying the colors of new bamboo strips;
step S5: updating the color feature library NF_{1,2,...} and updating the color clustering centers;
the step S1 specifically includes the steps of:
step S11: image acquisition: photographing the surface of part of a bamboo strip with a color area-array camera to obtain picture I1;
step S12: converting the color image into a gray image: the RGB channels of the camera picture I1 are converted into a gray channel by Gray = 0.299*R + 0.587*G + 0.114*B, where Gray denotes the gray value, obtaining the gray image I2;
step S13: image binarization: binarizing the gray image I2 with the OTSU algorithm to obtain the binarized image I3;
step S14: extracting the bamboo chip mask: removing small particles from the binarized image I3 by image morphological operations and extracting the largest particle region to obtain the bamboo chip mask region I4;
step S15: extracting the bamboo chip color block: computing the maximum inscribed rectangle of the bamboo chip mask region I4 and cropping picture I1 by it to obtain the bamboo chip color block I5;
step S16: judging whether the whole bamboo strip has been collected; if so, executing step S17, otherwise executing step S11;
step S17: splicing the color blocks: stitching the image sequence I5_{1,2,...} obtained in the above steps into one image I6; image I6 is the preprocessed image;
the step S2 specifically includes the following steps:
step S21: converting the stitched color-block image I6 from the RGB color space to the HSV color space to obtain image I7;
step S22: extracting the color features of I7: for the hue channel H, whose range is 0-180, the histogram bin width is set to 5, giving 36 H-channel bins; the saturation channel S is not counted as a separate feature; for the value channel V, whose range is 0-255, the histogram bin width is set to 4, giving 64 V-channel bins, and a saturation threshold S_thresh (in the range 0-255) splits these V-channel bins into two groups, giving 128 V-channel bins; in total there are 164 color bins, and counting the histogram of the image pixels of I7 over these bin ranges gives the histogram color feature F, a 1×164 feature vector;
step S23: feature normalization: normalizing the feature vector F with the following formula:
NF_i = F_i / (W * H)
where i denotes the index into the feature vector and W and H denote the width and height of image I6, respectively; this finally yields the normalized feature NF;
the step S3 specifically includes the following steps:
step S31: preparing N bamboo strips and repeating step S1 and step S2 to obtain N feature sequences NF_{1,2,...};
step S32: setting the number of color classes k according to the factory's color classification requirement, with k in the range 3-5; computing the average luminance value of each of the N stitched images I6 obtained in step S2; according to the distribution of these luminance values, taking k luminance values at equal spacing and selecting the color features of the pictures corresponding to the k luminance values as the initial centroids;
step S33: clustering k center points C_{1,2,...,k} with the k-means clustering algorithm;
step S34: determining the correspondence between the k center points and color depth: the cluster centers C_{1,2,...,k} are measured with the following formula:
m_n = Σ_i (i * C_n,i)
where n denotes the cluster center index and i the color feature index, and only the V channel is used in the measurement; the size of m_n represents the color depth of the cluster center: a larger m_n indicates a lighter color and a smaller m_n a darker color; sorting the resulting m_{1,2,...,k} gives the correspondence to color depth;
the step S4 specifically includes the following steps:
step S41: repeating step S1 and step S2 on the new bamboo strip data to obtain a new feature sequence NF_new;
step S42: calculating the distance between the feature sequence NF_new and the k center points obtained in step S33, with the following measurement formula:
S_n = (NF_new · C_n) / (‖NF_new‖ ‖C_n‖)
where S_n denotes the similarity between NF_new and the n-th center point C_n; S_n ranges over [0, 1], and values closer to 1 indicate greater similarity; computing against all center points yields the measurement result sequence S_{1,2,...,k}, and the color classification result is computed as:
g = argmax_n S_n
g is the result of color classification;
the step S5 specifically includes the following steps:
step S51: adding the feature sequence NF_new obtained in step S41 to the feature library NF_{1,2,...}, counting the number X of newly added feature sequences NF_new, and counting the total number Y of the updated feature library NF_{1,2,...};
step S52: setting an iterative training threshold P in the range 500-1000 and a feature library maximum size threshold Q in the range 5000-10000, with P < Q; if X equals P, executing step S53, otherwise executing step S51;
step S53: executing step S3 to update the color clustering centers;
step S54: if Y > Q, computing, as in step S42, the metric between each feature in the updated library NF_{1,2,...} and the clustering centers, selecting the Y − Q features with the smallest metric values as outlier features, and removing them from the feature library; otherwise, continuing with step S51.
2. The machine-learning-based bamboo strip color adaptive classification method according to claim 1, characterized in that the histogram counting over bins in step S22 is performed as follows:
when counting the bin histogram, a linear voting method is adopted: for a pixel value p whose two nearest bins are bin_i and bin_j, with corresponding centers C_i and C_j, the votes cast to these two bins are:
vote_i = (C_j − p) / (C_j − C_i),  vote_j = (p − C_i) / (C_j − C_i)
the voting mode can avoid forcibly segmenting continuous color features, a more accurate color statistical graph can be obtained, and the color extraction mode can improve the accuracy and robustness of color classification.
CN202010302809.5A 2020-04-17 2020-04-17 Bamboo strip color self-adaptive classification method based on machine learning Active CN111563536B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010302809.5A 2020-04-17 2020-04-17 Bamboo strip color self-adaptive classification method based on machine learning

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010302809.5A 2020-04-17 2020-04-17 Bamboo strip color self-adaptive classification method based on machine learning

Publications (2)

Publication Number Publication Date
CN111563536A CN111563536A (en) 2020-08-21
CN111563536B true CN111563536B (en) 2023-04-14

Family

ID=72071606

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010302809.5A Active CN111563536B (en) 2020-04-17 2020-04-17 Bamboo strip color self-adaptive classification method based on machine learning

Country Status (1)

Country Link
CN (1) CN111563536B (en)

Families Citing this family (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN112784854B (en) * 2020-12-30 2023-07-14 成都云盯科技有限公司 Clothing color segmentation extraction method, device and equipment based on mathematical statistics

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
JP2011210111A (en) * 2010-03-30 2011-10-20 Nippon Telegr & Teleph Corp <Ntt> Image feature quantity generation device, method and program
CN105344618A (en) * 2015-10-21 2016-02-24 国家林业局北京林业机械研究所 Rectangular bamboo chip waning defect and color sorting method
CN107516331A (en) * 2017-08-11 2017-12-26 广西师范大学 A kind of bamboo cane method for sorting colors and system
CN108460380A (en) * 2018-03-13 2018-08-28 广西师范大学 A kind of bamboo cane method for sorting colors and system based on domain color
CN109858521A (en) * 2018-12-29 2019-06-07 国际竹藤中心 A kind of bamboo category identification method based on artificial intelligence deep learning

Also Published As

Publication number Publication date
CN111563536A (en) 2020-08-21

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
GR01 Patent grant