CN111798526A - Method and system for rapidly extracting dominant colors of color images based on clustering space mapping


Info

Publication number: CN111798526A (application CN202010025357.0A); later granted and published as CN111798526B
Authority: CN (China)
Prior art keywords: color, clustering, sample, space, colors
Inventors: 刘尊洋, 王自荣, 孙晓泉, 叶庆, 徐英, 豆贤安
Original and current assignee: National University of Defense Technology
Application filed by: National University of Defense Technology
Other languages: Chinese (zh)
Legal status: Active (granted)

Classifications

    • G06T 7/90: Physics; Computing; Image data processing or generation, in general; Image analysis; Determination of colour characteristics
    • G06F 18/213: Electric digital data processing; Pattern recognition; Feature extraction, e.g. by transforming the feature space; Summarisation; Mappings, e.g. subspace methods
    • G06F 18/2321: Pattern recognition; Clustering techniques; Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06T 2207/10024: Indexing scheme for image analysis or image enhancement; Image acquisition modality; Colour image


Abstract

The invention provides a method and a system for rapidly extracting the dominant colors of a color image based on clustering space mapping, comprising the following steps: representing the color image in the CIELAB color space, quantizing the color space into several levels of colors, and counting the frequency with which each level of color appears in the color image; mapping the clustering samples from pixel space to color space to obtain a color sample space; clustering the colors of the mapped color sample space with a pedigree clustering algorithm and obtaining the pedigree cluster centers; and, taking the pedigree clustering result as the initial class centers, performing fast fuzzy C-means clustering on the clustering samples to determine the dominant colors of the background.

Description

Method and system for rapidly extracting dominant colors of color images based on clustering space mapping
Technical Field
The invention relates to data identification technology, and in particular to a method and a system for rapidly extracting the dominant colors of a color image based on clustering space mapping.
Background
Extraction of image dominant colors has important applications in fields such as camouflage design and pattern recognition. Adaptive camouflage requires that, shortly after entering a given background environment, equipment changes the camouflage colors and patterns on its own surface according to the characteristics of that background, achieving a high degree of fusion with the environment. Adaptive camouflage therefore places high demands on rapid camouflage-pattern design. When designing camouflage colors, the background dominant colors and the patch-distribution information provide the basis for determining the colors and the pattern. Currently common background dominant-color extraction methods fall mainly into two categories: color-histogram methods and cluster-analysis methods.
First, obtaining the dominant colors of a background image from a quantized color histogram and a threshold is one current approach. The scheme introduced in the paper "Camouflage color selection algorithm based on background representative color extraction" (Xu Ying, Photoelectric Engineering, 2007.1) falls into this category. After computing the color histogram, the method sets a threshold and selects the colors exceeding it as the dominant colors of the image. The approach is simple, but it has three problems. First, the effect is not stable, and a reasonable threshold may be hard to find. Second, when the color histogram is relatively continuous, the dominant colors can be difficult to determine. Finally, achieving good results usually requires manual participation in setting the threshold, which introduces errors from subjective factors and at the same time reduces efficiency.
Second, extracting the dominant colors of the background image with a clustering algorithm is the more widely applied scheme at present. Compared with determining the dominant colors directly by thresholding the color histogram, it introduces cluster-analysis methods from the field of pattern recognition and improves dominant-color extraction to some extent. The paper "Design method of imitation digital camouflage" (Jun, Applied Science Bulletin, 2012.4) introduces a background dominant-color extraction method based on K-means cluster analysis. Because the color space is built on subjective human color perception and the ambiguity of color attribution is hidden in the color-quantization process, cluster analysis can extract the background dominant colors of complex color spaces more accurately. To avoid the clustering converging to a local optimum, the scheme runs K-means several times and keeps the best dominant-color extraction result. However, the clustering samples are all the pixels, and the clustering algorithm must repeatedly and iteratively compute the distance from every sample to all cluster centers, which is very time-consuming when the sample size is large; moreover, the method does not address the selection of initial cluster centers, which is very important for both clustering effect and efficiency. It is therefore difficult to meet the timeliness requirement of rapid dominant-color extraction for adaptive camouflage.
To address this problem of the clustering algorithm, improved clustering algorithms have been proposed for extracting the dominant colors of background images, mainly improving the effect and efficiency of extraction by optimizing the selection of initial cluster centers. The papers "Imitation camouflage dominant color extraction method of tower-type FCM and CIEDE2000" (Liu Zun Yan, Infrared and Laser Engineering, 2010.2) and "Research on digital camouflage pattern generation based on a fractal Brownian model" (opportunity, war institute, 2016.1) introduce a layer-by-layer fuzzy C-means background dominant-color extraction method based on a pyramid structure. First, a pyramid image sequence is constructed, decomposing the background into images at different scales; then, from the (n-1)th layer down to the 0th layer, FCM clustering is performed on the current layer using the clustering result of the previous layer as the initial centers. The result obtained on the 0th layer is the final set of cluster centers. Because adjacent layers contain a large amount of redundancy, their cluster centers are very close, which greatly reduces convergence time. Although this scheme optimizes the clustering effect with fuzzy C-means and improves efficiency through layer-by-layer pyramid clustering, the clustering samples are still all the pixels, and the algorithm must still repeatedly compute the distance from each sample to all cluster centers; for large images its efficiency still cannot meet the timeliness requirement of rapid dominant-color extraction for adaptive camouflage.
Disclosure of Invention
The invention aims to provide a method and a system for quickly extracting the dominant color of a color image based on clustering space mapping.
The technical scheme for realizing the method of the invention is as follows: a method for rapidly extracting the dominant color of a color image based on clustering space mapping comprises the following steps:
step 1, representing the color image in the CIELAB color space, quantizing the color space to obtain several levels of quantized colors, and counting the frequency of each quantized color in the color image;
step 2, mapping the clustering samples from pixel space to color space to obtain the color sample space;
step 3, clustering the colors of the mapped color sample space with a pedigree clustering algorithm and obtaining the resulting class centers of the pedigree clustering;
and step 4, taking the pedigree clustering result as the initial class centers, performing fast FCM clustering on the clustering samples, and determining the dominant colors of the background.
Further, the specific process of step 3 is:
step 301, set each color sample in the color sample space as its own class: Γ_j = {y_j}, j ∈ I = {1, 2, ..., N}, where y_j is the jth sample to be classified, Γ_j is the jth cluster set, and N is the number of samples;
step 302, among the sets {Γ_j | j ∈ I}, find the pair of cluster sets Γ_i and Γ_k satisfying Δ(Γ_i, Γ_k) = min{Δ(Γ_p, Γ_q) | p, q ∈ I, p ≠ q}, where Δ(Γ_i, Γ_k) is the distance between Γ_i and Γ_k, and i, k ∈ I;
step 303, merge Γ_i into Γ_k, remove i from the index set I, and recompute the class-center color V_k of Γ_k:
V_k = Σ_{i∈Γ_k} TC_R(i)·Col(i) / Σ_{i∈Γ_k} TC_R(i)
where V_k is the class-center color of the kth class, N_k is the number of samples in the kth class, Col(i) is the color value of the ith sample, and TC_R(i) is the frequency of the ith sample's color in the image;
step 304, if the number of merged classes equals the expected number of classes, terminate the calculation; otherwise go to step 302.
Further, the specific process of step 4 is:
step S401, set the iteration-stop threshold ε and the maximum iteration count B, initialize the class centers, and set the iteration count b = 0;
step S402, compute the partition matrix Ũ^(b) according to equation (5); the fuzzy membership of the kth color sample to the ith class is
ũ_ik^(b) = 1 / Σ_{j=1}^{c} (d̃_ik / d̃_jk)^{2/(m-1)}    (5)
where ũ_ik^(b) is the fuzzy membership of the kth color sample to the ith class in the bth iteration, m is the clustering control parameter, d̃_ik is the color difference between the color value Col(k) of the kth sample and the ith class-center color ṽ_i, and c is the number of classes;
step S403, update the cluster-center matrix according to equation (6):
ṽ_i^(b+1) = Σ_{k=1}^{To} TC_R(k)·(ũ_ik^(b))^m·Col(k) / Σ_{k=1}^{To} TC_R(k)·(ũ_ik^(b))^m    (6)
where ṽ_i^(b+1) is the class center of the ith class obtained in the bth iteration, TC_R(k) is the frequency of appearance of the kth color, Col(k) is the color value of the kth sample, and To is the number of clustering samples;
step S404, if ‖Ṽ^(b+1) − Ṽ^(b)‖ < ε or b ≥ B, stop and output Ũ and Ṽ; otherwise set b = b + 1 and go to step S402.
Further, the specific process of step 1 is:
step 101, for each quantized cuboid region among all quantized cuboid regions of the CIELAB color space, the length, width and height correspond to the quantization intervals of the three coordinates, with coordinate ranges [L_m, L_M], [a_m, a_M] and [b_m, b_M];
step 102, if a pixel color (L, a, b) satisfies L_m ≤ L ≤ L_M, a_m ≤ a ≤ a_M and b_m ≤ b ≤ b_M, that quantized color is counted as appearing once.
Further, in step 1, the colors of the image along the three CIELAB coordinates L, a and b are each divided into n levels, giving n³ colors in total; or, according to the characteristics of the three coordinates, they are divided into unequal numbers of levels m, n and k, giving m × n × k colors in total.
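As an illustration of the equal-level quantization described above, the following Python sketch maps a CIELAB color to its quantized level index on each axis. The function name and the axis ranges (L in [0, 100], a and b in [-128, 127], as in the embodiment) are assumptions for illustration, not part of the patent:

```python
def quantize_lab(l, a, b, n=20):
    """Bucket a CIELAB colour into one of n**3 quantized colours.

    Each axis is split into n equal levels; the embodiment uses n = 20,
    giving 20**3 = 8000 quantized colours. Ranges are assumed to be
    L in [0, 100] and a, b in [-128, 127].
    """
    def level(value, lo, hi):
        # Clamp to the axis range, then bucket into one of n equal intervals.
        idx = int((value - lo) / (hi - lo) * n)
        return min(max(idx, 0), n - 1)

    return (level(l, 0.0, 100.0),
            level(a, -128.0, 127.0),
            level(b, -128.0, 127.0))
```

With n = 20 each L level spans 5 units, coarse enough to shrink the sample set drastically while staying finer than typical dominant-color differences.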
The technical scheme for realizing the system of the invention is as follows: a system for rapidly extracting the dominant colors of a color image based on clustering space mapping, comprising:
a first unit for representing the color image in the CIELAB color space, quantizing the color space to obtain several levels of quantized colors, and counting the frequency of each quantized color in the color image;
a second unit for mapping the clustering samples from pixel space to color space to obtain the color sample space;
a third unit for clustering the colors of the mapped color sample space with a pedigree clustering algorithm and obtaining the class centers of the pedigree clustering result; and
a fourth unit for performing fast clustering on the clustering samples, taking the pedigree clustering result as the initial class centers, and determining the dominant colors of the background.
Further, the third unit includes:
a subunit that sets each color sample in the color sample space as its own class: Γ_j = {y_j}, j ∈ I = {1, 2, ..., N}, where y_j is the jth sample to be classified, Γ_j is the jth cluster set, and N is the number of samples;
a subunit that computes the class-center color of each class according to
V_k = Σ_{i∈Γ_k} TC_R(i)·Col(i) / Σ_{i∈Γ_k} TC_R(i)
where V_k is the class-center color of the kth class, N_k is the number of samples in the kth class, Col(i) is the color value of the ith sample, and TC_R(i) is the frequency of the ith sample's color in the image;
a subunit that, among the sets {Γ_j | j ∈ I}, selects the pair of cluster sets Γ_i and Γ_k at minimum distance, merges Γ_i into Γ_k, removes i from the index set I, and recomputes the class-center color of Γ_k; and
a subunit that obtains the class centers and terminates the calculation when the number of merged classes equals the expected number of classes.
Further, the fourth unit includes:
a subunit that computes the partition matrix Ũ^(b), whose element ũ_ik^(b) is the fuzzy membership of the kth sample to the ith class:
ũ_ik^(b) = 1 / Σ_{j=1}^{c} (d̃_ik / d̃_jk)^{2/(m-1)}
where ũ_ik^(b) is the fuzzy membership of the kth sample to the ith class in the bth iteration, m is the clustering control parameter, and d̃_ik is the color difference between the color value Col(k) of the kth sample and the ith class-center color ṽ_i;
a subunit that updates the cluster-center matrix Ṽ^(b+1) according to
ṽ_i^(b+1) = Σ_{k=1}^{To} TC_R(k)·(ũ_ik^(b))^m·Col(k) / Σ_{k=1}^{To} TC_R(k)·(ũ_ik^(b))^m
where ṽ_i^(b+1) is the class center of the ith class obtained in the bth iteration, TC_R(k) is the frequency of appearance of the kth color, Col(k) is the color value of the kth sample, and To is the number of clustering samples; and
a subunit that stops and outputs Ũ and Ṽ if ‖Ṽ^(b+1) − Ṽ^(b)‖ < ε or b ≥ B.
further, in the first unit
The length, width and height of each quantized rectangular solid region in all quantized rectangular solid regions of the CIELAB color space respectively correspond to quantization intervals of three coordinates, and the coordinate range is [ Lm,LM]、[am,aM]And [ b)m,bM];
If the pixel color (L, a, b) satisfies Lm≤l≤LM,am≤a≤aMAnd b ism≤b≤bMThen the pixel color appears once.
Further, colors existing in the images in the three coordinates of L, a and b in the CIELAB color space are divided into n levels in the first unit, so that n is shared3Color species; or divided into m, n and k stages with different stages according to the characteristics of the three coordinates of L, a and b, wherein the total number of m × n × k colors
Compared with the prior art, the invention has the following advantages: (1) taking the quantized colors as the sample space greatly reduces the scale of the clustering samples without affecting the clustering effect; (2) the time and space complexity of the method are significantly reduced, and the number of clustering samples does not grow noticeably with image size, so the advantage is more pronounced for large images; in experiments, for a 1024 × 1024 image the efficiency was improved by a factor of about 30 compared with using pixels as the clustering objects.
The invention is further described below with reference to the accompanying drawings.
Drawings
FIG. 1 is a schematic flow chart of the method of the present invention.
FIG. 2 is a comparison between the CQFCM algorithm of the present invention and the ordinary FCM clustering algorithm, where (a) is the original image, (b) is the result of the ordinary FCM clustering algorithm, and (c) is the result of the CQFCM algorithm of the invention.
Detailed Description
The difference between color-image fast fuzzy C-means clustering (CQFCM) and the ordinary clustering algorithm (FCM) is that the clustering objects are mapped from pixels to quantized color values. Reflected in the clustering computation, the FCM objective function J_m, fuzzy membership u_ik and cluster center v_i are rewritten from expressions (1), (2) and (3) to expressions (4), (5) and (6), respectively. To distinguish them from FCM, the CQFCM algorithm uses J̃_m, ũ_ik and ṽ_i to denote the objective function, fuzzy memberships and cluster centers.
The standard FCM quantities are
J_m(U, P) = Σ_{i=1}^{c} Σ_{k=1}^{n} (u_ik)^m (d_ik)²    (1)
u_ik^(b) = 1 / Σ_{j=1}^{c} (d_ik / d_jk)^{2/(m-1)}    (2)
v_i^(b+1) = Σ_{k=1}^{n} (u_ik^(b))^m x_k / Σ_{k=1}^{n} (u_ik^(b))^m    (3)
In the above, m > 1 is a constant controlling the fuzziness of the clustering result, d_ik is the distance from sample x_k to the ith cluster center v_i, and u_ik is the membership function of the kth sample to the ith class, subject to the constraint that the memberships of one sample across all clusters sum to 1. To minimize J_m(U, P), the memberships u_ik and cluster centers v_i are updated according to equations (2) and (3). When updating u_ik, the two cases of whether some d_ik equals 0 are treated separately: define I_k = {i | 1 ≤ i ≤ c, d_ik = 0} and its complement Ī_k = {1, ..., c} \ I_k; if I_k is non-empty, equation (2) is replaced by setting u_ik = 0 for i ∈ Ī_k and distributing the unit membership among i ∈ I_k. The iteration number is denoted by b.
The corresponding CQFCM quantities are
J̃_m(U, P) = Σ_{i=1}^{c} Σ_{k=1}^{To} TC_R(k) (ũ_ik)^m (d̃_ik)²    (4)
ũ_ik^(b) = 1 / Σ_{j=1}^{c} (d̃_ik / d̃_jk)^{2/(m-1)}    (5)
ṽ_i^(b+1) = Σ_{k=1}^{To} TC_R(k) (ũ_ik^(b))^m Col(k) / Σ_{k=1}^{To} TC_R(k) (ũ_ik^(b))^m    (6)
where ũ_ik^(b) is the fuzzy membership of the kth sample to the ith class in the bth iteration, m is the clustering control parameter, Ṽ is the cluster-center vector, J̃_m is the objective function, and d̃_ik is the distance from the kth sample to the ith cluster center, i.e. the color difference between Col(k) and ṽ_i, computed with the CIE94 color-difference formula. TC_R(k) is the frequency of appearance of the kth color, and To is the number of clustering samples.
example one
The method for rapidly extracting the dominant colors of a color image based on clustering space mapping using CQFCM comprises the following steps:
step 1, represent the color image obtained by the image-acquisition device in the CIELAB color space and quantize the color space: the colors present in the image along the three coordinates L, a and b are each divided into n levels, giving n³ colors in total; step 1 may also divide the coordinates into unequal numbers of levels m, n and k according to the characteristics of the three coordinates, in which case there are m × n × k colors in total.
The present invention uses the CIELAB color space because color distances in CIELAB correspond more closely to the human eye's perception of color difference.
step 2, map the clustering samples from pixel space to color space by counting the frequency of appearance of each color;
step 3, delete redundant colors: traverse the color space and delete from the quantized colors those whose frequency is 0;
step 4, obtain the initial class centers: cluster the color sample space with a pedigree clustering algorithm, take the mean of each class as its class center, and use these class centers as the initial class centers of the CQFCM clustering algorithm;
step 5, cluster to determine the dominant colors: cluster the samples with the fast color-image FCM (CQFCM) clustering algorithm starting from the initial class centers;
step 6, determine the dominant colors of the background.
Further, regarding step 1: a grayscale image generally has 256 colors, i.e. the gray values of all pixels are the 256 integer values 0 to 255. A color image has far more color values; for example, the three coordinates of the Lab space adopted in this embodiment have value ranges (0 to 100, -128 to +127, -128 to +127). If integer color values were used directly as the clustering sample set, the total number of possible samples would be 100 × 256 × 256 = 6,553,600, and more still if decimals are considered; so directly mapping the clustering over pixels to a clustering over this color set would not reduce the number of samples. However, the human eye has a threshold for resolving color difference: when the color difference is below the threshold the eye cannot resolve it, so the number of color types the eye can distinguish is limited. The number of image colors (e.g. 6,553,600) far exceeds the number of colors the human eye can distinguish, and there are generally only 3 to 5 dominant colors, so moderately compressing the number of color types has little effect on dominant-color extraction. There are many ways to compress colors; experiments show that quantizing each color coordinate into an equal number of levels achieves a fairly ideal effect. That is, if the coordinates L, a and b are each divided into n levels, there are n³ colors in total. It is easy to see that a smaller n means fewer colors and a more efficient clustering algorithm, but also greater color distortion and hence a worse clustering effect, which is likewise unacceptable. The number of levels n must therefore be chosen considering both the efficiency of the algorithm and the degree of color distortion.
In the present embodiment n = 20, i.e. each of the three coordinates L, a and b has 20 levels, for a total of 20 × 20 × 20 = 8000 colors, which is sufficiently fine for the resolving power of the human eye.
The quantized colors obtained in step 1 form a color matrix Col. Assuming N levels of quantized colors are obtained, Col is an N × 3 matrix: N is the number of color samples, each row holds the color value of one color sample, and the 3 columns are the L, a and b values of the color in the Lab color space.
The clustering samples in step 2 are the quantized colors. Suppose 4 dominant colors are to be extracted; then all pixels of the image must be divided into 4 classes by the principle of color similarity. During clustering, the distance from each quantized color value to the four cluster-center colors is computed and the color is assigned to the class at minimum distance; traversing all pixels and comparing each pixel's color with the four cluster-center colors divides all pixels into four classes. The clustering samples are thus the quantized colors, and the number of quantized colors is the number of samples. Since the number of quantized colors is much smaller than the number of pixels, mapping the clustering samples from pixels to quantized colors reduces the number of samples in this embodiment, and the efficiency of the clustering algorithm, which is determined by the number of samples, is improved.
In step 2, mapping the clustering samples from pixel space to color space by counting the frequency of appearance of each color is specifically as follows:
step 201, for each of the n³ quantized cuboid regions of the three-dimensional Lab space, the length, width and height correspond to the quantization intervals of the three coordinates, with coordinate ranges [L_m, L_M], [a_m, a_M] and [b_m, b_M];
step 202, when the color of a pixel falls within such a range (inside the cuboid), i.e. when the pixel color (L, a, b) satisfies L_m ≤ L ≤ L_M, a_m ≤ a ≤ a_M and b_m ≤ b ≤ b_M, that color is counted as appearing once.
Traversing the whole image gives the frequencies of appearance of all colors; the Lab coordinate values of all colors together with their frequencies form the color sample space of the clustering samples.
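Steps 201 and 202 amount to a single pass over the image that histograms the quantized colors. A minimal sketch follows; the function name and the inline bucketing are illustrative assumptions (a real implementation would quantize exactly as in step 1), and note that colors which never occur are simply absent from the result, which also realizes step 3's deletion of redundant frequency-0 colors:

```python
from collections import Counter

def color_sample_space(pixels, n=20):
    """Map clustering samples from pixel space to quantized-colour space.

    pixels: iterable of CIELAB (L, a, b) tuples with assumed ranges
    L in [0, 100] and a, b in [-128, 127]. Returns a Counter mapping
    each quantized-colour index triple to its frequency TC_R.
    """
    def bucket(v, lo, hi):
        # One of n equal intervals along this axis, clamped to the range.
        return min(max(int((v - lo) / (hi - lo) * n), 0), n - 1)

    return Counter(
        (bucket(l, 0, 100), bucket(a, -128, 127), bucket(b, -128, 127))
        for (l, a, b) in pixels
    )
```

The resulting (colour, frequency) pairs are exactly the clustering samples used by the pedigree and CQFCM stages.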
The redundant clustering samples in step 3 are the colors whose counted frequency is 0, i.e. colors that do not appear in the image. Such redundant samples increase the number of samples without any effect on the clustering result, meaninglessly increasing the complexity of the clustering algorithm. The redundant colors are therefore deleted from the quantized colors to further reduce the number of samples and improve algorithm efficiency.
The principle of the pedigree clustering algorithm in step 4 is to regard each subset formed by a small number of samples as one class, then merge similar classes step by step, forming a nested sequence with progressively fewer classes. The specific process is:
step 401, set each color sample in the color sample space as its own class: Γ_j = {y_j}, j ∈ I = {1, 2, ..., N}, where y_j is the jth sample to be classified, Γ_j is the jth cluster set, and N is the number of samples;
step 402, among the sets {Γ_j | j ∈ I}, find the pair of cluster sets Γ_i and Γ_k satisfying Δ(Γ_i, Γ_k) = min{Δ(Γ_p, Γ_q) | p, q ∈ I, p ≠ q}, where Δ(Γ_i, Γ_k) is the distance between Γ_i and Γ_k, and i, k ∈ I;
step 403, merge the two classes at minimum distance into one class, i.e. merge Γ_i into Γ_k and recompute the class-center color of Γ_k:
V_k = Σ_{i∈Γ_k} TC_R(i)·Col(i) / Σ_{i∈Γ_k} TC_R(i)
where V_k is the class-center color of the kth class, N_k is the number of samples in the kth class, Col(i) is the color value of the ith sample, and TC_R(i) is the frequency of the ith sample's color in the image;
step 404, remove i from the index set I; if the cardinality of I equals c, i.e. the number of merged classes equals the expected number of classes, terminate the calculation; otherwise go to step 402.
The distance in step 402 is the magnitude of the color difference: a large distance indicates a large color difference and a small distance a small one, and the color difference is computed with the CIE94 formula. The paper "Study of color-difference formulas and their influence on image dominant-color extraction" (Zhang Yufa, Application of Photoelectric Technology, 2010.6) gives an explicit calculation method for the CIE94 formula. To preserve the full information of the clustering sample space so that no information is lost, the frequency with which each color appears in the image must also be counted.
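Steps 401 to 404 can be sketched as a small agglomerative loop over (colour, frequency) pairs. This is an illustrative sketch, not the patent's code: plain Euclidean Lab distance stands in for the CIE94 colour difference, and merged class centres are the frequency-weighted means V_k described above:

```python
def pedigree_cluster(samples, c):
    """Merge (colour, frequency) samples down to c classes.

    samples: list of ((L, a, b), TC_R) pairs. Each sample starts as its
    own class; the closest pair of class centres is merged repeatedly,
    the merged centre being the frequency-weighted mean, until c classes
    remain. Euclidean Lab distance is used here in place of CIE94.
    """
    # Each class is [centre colour (list), accumulated frequency].
    classes = [[list(col), freq] for col, freq in samples]

    def dist2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))

    while len(classes) > c:
        # Pair of classes with minimal centre-to-centre colour difference.
        i, k = min(
            ((i, k) for i in range(len(classes))
                    for k in range(i + 1, len(classes))),
            key=lambda ik: dist2(classes[ik[0]][0], classes[ik[1]][0]),
        )
        (ci, fi), (ck, fk) = classes[i], classes[k]
        # Merge class i into class k: frequency-weighted mean centre.
        classes[k] = [[(a * fi + b * fk) / (fi + fk) for a, b in zip(ci, ck)],
                      fi + fk]
        del classes[i]

    return [tuple(col) for col, _ in classes]
```

The c centres returned here serve as the initial class centres for the CQFCM iteration of step 5.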
In step 404, the number of expected classes is set to 3 to 5. For camouflage, the closer the camouflage colors are to the background colors, the better the camouflage effect; however, the background contains a great many colors, in the thousands, while a camouflage pattern generally uses only 3 to 5.
The specific process of step 5 is as follows:
step 501, set an iteration stop threshold ε and a maximum iteration number B, initialize the class centers, and set the iteration count b = 0;
step 502, calculate the partition matrix U^(b) according to equation (5), whose entries are the fuzzy membership degrees u_ik^(b) of the kth sample to the ith class:

u_ik^(b) = 1 / Σ_{j=1}^{c} (d_ik / d_jk)^{2/(m-1)}    (5)

where k denotes the kth sample, i denotes the ith class, b is the iteration number, u_ik^(b) is the fuzzy membership degree of the kth sample to the ith class in the bth iteration, m is the clustering control parameter, and d_ik is the distance from the kth sample to the ith cluster center, i.e. the color difference between the color value Col(k) of the kth sample and the ith cluster center color v_i^(b-1). The partition matrix represents the membership degree of every sample to all classes: its number of rows is the number of classes and its number of columns is the number of samples, and each iteration computes the membership of every sample to every class once. The partition matrix of the bth iteration is

U^(b) = [u_ik^(b)], i = 1, …, c; k = 1, …, To

where the element in row i and column k, u_ik^(b), is the fuzzy membership degree of the kth sample to the ith class in the bth iteration.
Step 503, update the cluster center matrix V^(b) according to equation (6):

v_i^(b) = Σ_{k=1}^{To} TC_R(k) (u_ik^(b))^m Col(k) / Σ_{k=1}^{To} TC_R(k) (u_ik^(b))^m    (6)

where v_i^(b) is the class center of the ith class obtained in the bth iteration, TC_R(k) is the frequency of appearance of the kth color, Col(k) is the color value of the kth sample, and To is the number of clustering samples;
step 504, if max_i ‖v_i^(b) − v_i^(b−1)‖ < ε or b ≥ B, stop and output U^(b) and V^(b); otherwise set b = b + 1 and go to step 502.
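The iteration of steps 501-504 can be sketched as follows. This is a minimal illustration under stated assumptions, not the patented implementation: the plain Euclidean Lab distance stands in for the CIE94 color difference, and the function name and signature are hypothetical:

```python
import numpy as np

def cqfcm(colors, freqs, centers, m=2.0, eps=1e-4, max_iter=100):
    """Frequency-weighted fuzzy C-means over quantized colors.

    colors:  (To, 3) quantized Lab colors Col(k).
    freqs:   (To,)   occurrence frequencies TC_R(k).
    centers: (c, 3)  initial class centers (e.g. from pedigree clustering).
    Euclidean Lab distance stands in for the CIE94 color difference.
    """
    colors = np.asarray(colors, dtype=float)
    freqs = np.asarray(freqs, dtype=float)
    V = np.asarray(centers, dtype=float).copy()
    for _ in range(max_iter):
        # d[i, k]: distance from sample k to center i (floor avoids div by 0)
        d = np.linalg.norm(V[:, None, :] - colors[None, :, :], axis=2)
        d = np.maximum(d, 1e-12)
        # partition matrix, eq. (5): u_ik = 1 / sum_j (d_ik/d_jk)^(2/(m-1))
        p = d ** (-2.0 / (m - 1.0))
        U = p / p.sum(axis=0, keepdims=True)
        # center update, eq. (6), weighted by the color frequency TC_R(k)
        w = freqs * U ** m                         # shape (c, To)
        V_new = (w @ colors) / w.sum(axis=1, keepdims=True)
        if np.abs(V_new - V).max() < eps:
            V = V_new
            break
        V = V_new
    return U, V
```

The only difference from textbook FCM is the factor TC_R(k) in the center update, which is what lets the algorithm cluster To quantized colors instead of m × n pixels.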
1. Algorithmic effect analysis
It can be proved that the CQFCM dominant-color extraction method of the present invention is equivalent to FCM in how it classifies samples. Assume that the initial class centers are the same and that the quantized color Col(k) equals the color Color(x, y) of the pixel with coordinates (x, y) in the image; for convenience of writing, Color(x, y) is denoted Im(z). It is now demonstrated that the quantized color and the pixel point have equal membership degrees to the same class center v_i, i.e.

u_ik^Col = u_iz^Im    (7)

Referring to formulas (2) and (5), the relationship shown in formula (8) can be obtained: since Col(k) = Im(z), the distances to every class center coincide, so

u_ik^Col(b) = 1 / Σ_{j=1}^{c} (d_ik / d_jk)^{2/(m-1)} = u_iz^Im(b)    (8)

where u_ik^Col is the membership degree of Col(k) to v_i in the CQFCM algorithm and u_iz^Im is the membership degree of Im(z) to v_i in the FCM algorithm.
Referring to formulas (3) and (6), the relationship shown in formula (9) can be obtained, namely

v_i^Col(b) = v_i^Im(b)    (9)

Thus, when the initialization satisfies v_i^Col(0) = v_i^Im(0), for any b we have u_ik^Col(b) = u_iz^Im(b) and v_i^Col(b) = v_i^Im(b); that is, CQFCM and FCM are equivalent in classification, which shows that relative to the FCM algorithm the CQFCM algorithm keeps the optimization performance of the clustering unchanged.
2. Algorithmic efficiency analysis
2.1 sample number
Before analyzing the time and space complexity of the CQFCM and FCM algorithms, the numbers of clustering samples of the two are compared. If the image size is m × n and the number of colors that actually appear is To, the number of FCM clustering samples is m × n and the number of CQFCM clustering samples is To. When the image size is 256 × 256, the total number of FCM clustering samples is 256 × 256 = 65536; with a quantization level of 20, the total number of quantized colors is 20³ = 8000, and since To cannot exceed the total number of quantized colors,

To ≤ 8000 < 65536 / 8 = 8192

i.e. the number of CQFCM clustering samples is always less than one eighth of that of the FCM algorithm.
Another benefit of the present algorithm is that, as the image size increases, the total number of clustering samples never exceeds the total number of quantized colors. For example, when the image size is 512 × 512, the number of samples clustered by the original FCM algorithm is 262144, while the number of CQFCM clustering samples To still does not exceed 8000.
2.2 time complexity
Considering the time complexity of the clustering part: each iteration needs to compute the partition matrix U and the cluster center vector V. The cost of computing U is the product of the number of classes and the number of samples, i.e. To × c operations; computing V also takes To × c operations, since each of the c class centers requires a sum over the To samples. The corresponding operation counts for FCM are m × n × c. The whole clustering process is completed over multiple iterations. Hence, when clustering a color image with the same number of classes, the efficiency difference between the CQFCM and FCM algorithms depends on the total number of samples; that is, the ratio of the time complexities of the two clustering algorithms is essentially the ratio of the sample counts:

To / (m × n)
2.3 spatial complexity
For storing the image data, both algorithms need m × n × 3 × sizeof(float) bytes. For storing U, the FCM algorithm needs m × n × c × sizeof(float) bytes while CQFCM needs To × c × sizeof(float) bytes; in addition, the CQFCM algorithm must store TC_R and Col, where TC_R occupies To × sizeof(int) bytes and Col occupies To × 3 × sizeof(float) bytes. Hence, apart from storing the image, the total storage space needed by the CQFCM clustering algorithm is less than To × (c + 4) × sizeof(float) bytes. Whenever To × (c + 4) < m × n × c, the space complexity of the CQFCM clustering algorithm is lower than that of the FCM clustering algorithm. In particular, when m × n = 128 × 128 and c = 4, To × (c + 4) ≤ 8000 × 8 = 64000 < 128² × 4 = 65536; that is, the CQFCM clustering algorithm has the lower space complexity, and the advantage becomes more pronounced as the image size and the number of clusters increase.
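The storage comparison above is a few lines of arithmetic; the following check, assuming a 4-byte float as the section does, confirms the 128 × 128, c = 4 case:

```python
# Partition-matrix storage comparison for an m*n = 128*128 image, c = 4 classes.
FLOAT = 4                      # sizeof(float) in bytes, as assumed in the text
m, n, c, To_max = 128, 128, 4, 8000   # To_max = 20**3 quantized colors

fcm_bytes = m * n * c * FLOAT           # FCM stores U as a (c x m*n) matrix
cqfcm_bytes = To_max * (c + 4) * FLOAT  # U plus TC_R and Col, upper bound
# 8000 * (4 + 4) = 64000 entries vs 128**2 * 4 = 65536 entries,
# so CQFCM needs less space even at the worst-case sample count.
```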
3. Simulation experiment
To verify the performance of the algorithm, three groups of simulation experiments were designed. Common conditions of the experiments: computer configuration ThinkPad T480s, dual-core Core i7 CPU, 8 GB memory; simulation environment Windows 10 operating system, Matlab R2018a.
Experiment one: comparison of algorithmic effects
The specific experimental conditions are as follows: the image size is 512 × 512 and the number of clusters is 5. The background dominant colors are extracted both with the pyramid FCM (PFCM) clustering algorithm proposed in the imitation-camouflage dominant-color extraction method based on pyramid FCM and CIEDE2000, and with the fast color-image FCM (CQFCM) clustering algorithm proposed by the present invention, using the CIE94 distance formula in the Lab color space. Under identical conditions the clustering effects of the two are essentially consistent, which agrees with the theoretical analysis.
Experiment two: comparison of algorithm efficiency
Experiment one concluded that the clustering effects of the PFCM and CQFCM algorithms are essentially consistent. This experiment compares their efficiency: images of different sizes are processed, each image is clustered 5 times, the average time of each algorithm is recorded, and the times taken by the two algorithms on images of different sizes are listed in Table 1.
TABLE 1 Efficiency comparison of the two algorithms on images of different sizes
(Table 1 is reproduced as an image in the original publication.)
Analysis of Table 1 shows that, because the PFCM algorithm must use pixel points as the clustering objects, its clustering time grows rapidly with image size, reaching 126.9 seconds for an image of 1024 × 1024 pixels; the CQFCM algorithm takes quantized colors as the clustering objects, so its time consumption grows little with image size and is about 4.2 s for clustering an image of 1024 × 1024 pixels. The clustering efficiency of the present algorithm is therefore greatly improved over the PFCM algorithm, and the advantage of the CQFCM algorithm grows markedly with image size.
Fig. 2 is a dominant-color chart; for example, a 5-color dominant-color chart is an image in which all colors of the original image are replaced by only 5 colors. The closer the dominant-color chart is to the background image, the better the dominant-color extraction effect.
Example two
A system for rapidly extracting the dominant colors of a color image based on clustering space mapping comprises:
a first unit for dividing each of the L, a and b coordinates of the CIELAB color space of the color image into n levels;
a second unit for mapping the clustering samples from the pixel space to the color space by counting the frequency with which each level of color appears in the color image, and deleting the colors with frequency 0;
a third unit for clustering the color sample space with a pedigree clustering algorithm, taking the mean value of each class as the class center of that class;
and a fourth unit for performing FCM clustering on the clustering samples with the pedigree clustering result as the initial class centers, and determining the dominant colors of the background.
In the first unit: a gray image generally has 256 colors, i.e. the gray values of all pixels in the image are the 256 integer gray values 0 to 255. A color image has very many color values. With the Lab space adopted in this embodiment, the three coordinates take values in the ranges (0 to 100, −128 to +127, −128 to +127), so if the integer color values were used directly as the clustering sample set, the total number of possible samples would be 100 × 256 × 256 = 6553600; directly mapping the clustering of pixels to clustering over this color set therefore would not achieve the purpose of reducing the number of samples. The human eye has a threshold for resolving color differences: when the color difference is below the threshold the eye cannot resolve it, so the number of color types the eye can distinguish is limited. The number of image colors (e.g. 6553600) far exceeds the number of colors distinguishable by the human eye, and the number of dominant colors is generally only 3-5, so appropriately compressing the number of color types has little effect on dominant-color extraction. There are many methods of color compression; experiments show that quantizing each color coordinate into equal levels gives a satisfactory result. That is, if each of the coordinates L, a and b is divided into n levels, there are n³ colors in total. It is easy to see that a smaller n means fewer colors and a more efficient clustering algorithm, but also greater color distortion and hence a poorer clustering effect, which is likewise unacceptable. The number of levels n must therefore be determined by weighing the efficiency of the algorithm against the degree of color distortion.
In the present embodiment n = 20, i.e. each of the three coordinates L, a and b has 20 levels and the total number of colors is 20 × 20 × 20 = 8000, which is sufficiently fine for the resolution of the human eye.
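The equal-level quantization described above can be sketched as follows, assuming L in [0, 100] and a, b in [−128, 127] as stated in the embodiment; the function name is hypothetical:

```python
import numpy as np

def quantize_lab(lab, n=20):
    """Map Lab colors to quantization-level indices (0 .. n-1 per axis).

    lab: (..., 3) array with L in [0, 100] and a, b in [-128, 127].
    Returns integer level indices; n = 20 gives 20**3 = 8000 possible colors.
    """
    lab = np.asarray(lab, dtype=float)
    lo = np.array([0.0, -128.0, -128.0])      # lower bound of each axis
    span = np.array([100.0, 256.0, 256.0])    # range length of each axis
    idx = np.floor((lab - lo) / span * n).astype(int)
    return np.clip(idx, 0, n - 1)             # top boundary falls in top level

# The extreme corner L=100, a=b=127 still maps into the top level:
assert (quantize_lab([100.0, 127.0, 127.0]) == [19, 19, 19]).all()
```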
In the second unit, the clustering samples are mapped from the pixel space to the color space by counting the frequency of occurrence of each color, specifically:
step 201, each of the n³ quantized cuboid regions of the Lab three-dimensional space has length, width and height corresponding to the quantization intervals of the three coordinates, with coordinate ranges [L_m, L_M], [a_m, a_M] and [b_m, b_M];
step 202, when the color of a pixel falls within such a range (inside the cuboid), i.e. when the pixel color (l, a, b) satisfies L_m ≤ l ≤ L_M, a_m ≤ a ≤ a_M and b_m ≤ b ≤ b_M, that quantized color is counted as appearing once.
Traversing the whole image gives the frequency of occurrence of all colors; the Lab coordinate values of all colors together with their frequencies of occurrence form the color sample space for clustering.
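A compact way to realize the traversal above is to count the distinct quantization indices directly: `np.unique` returns each occurring color once together with its frequency, so colors with frequency 0 never enter the sample space at all. The helper name is an assumption:

```python
import numpy as np

def color_sample_space(level_idx):
    """Build the clustering sample space: occurring quantized colors + counts.

    level_idx: (H, W, 3) integer quantization-level indices of an image.
    Returns (colors, counts) — the Col and TC_R of the text; colors that
    never occur (the redundant samples) are absent by construction.
    """
    flat = level_idx.reshape(-1, 3)
    colors, counts = np.unique(flat, axis=0, return_counts=True)
    return colors, counts

# A 2x2 image with two distinct quantized colors:
img_idx = np.array([[[0, 0, 0], [0, 0, 0]],
                    [[0, 0, 0], [5, 5, 5]]])
cols, freqs = color_sample_space(img_idx)
# -> 2 sample colors with frequencies [3, 1]
```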
In the second unit, a color with frequency 0 is a color that does not appear in the image, i.e. a redundant sample. Redundant samples increase the number of samples without affecting the clustering result, so they only increase the complexity of the clustering algorithm pointlessly. For this reason, the redundant colors can be deleted from the quantized colors to further reduce the number of samples and improve the efficiency of the algorithm.
In the third unit, each subset formed by a small number of samples is first regarded as one class, and then similar classes are merged step by step, forming a nested sequence with a progressively decreasing number of classes. The specific process is as follows:
step 301, take each color sample in the color sample space as one class, setting

Γ_j = {y_j},  I = {j | j = 1, 2, …, N}

where y_j is the jth sample to be classified, Γ_j is the jth cluster set, and N is the number of samples;
step 302, among the sets {Γ_j | j ∈ I}, find the pair of cluster sets Γ_i and Γ_k with the minimum distance Δ(Γ_i, Γ_k), where Δ(Γ_i, Γ_k) is the distance between Γ_i and Γ_k, i, k ∈ I;
step 303, merge the two classes with the minimum distance into one class, i.e. merge Γ_i into Γ_k, remove Γ_i, and recompute the class center of Γ_k:

V_k = Σ_{i=1}^{N_k} TC_R(i) Col(i) / Σ_{i=1}^{N_k} TC_R(i)

where V_k is the class center color of the kth class, N_k is the number of samples in the kth class, Col(i) is the color value of the ith sample, and TC_R(i) is the frequency with which the color of the ith sample appears in the image;
step 304, remove i from the index set I; if the cardinality of I equals c, i.e. the number of merged classes equals the number of expected classes, terminate the calculation; otherwise go to step 302.
The distance described in step 302 refers to the magnitude of the color difference: a large distance indicates a large color difference and a small distance a small one. The color difference is calculated with the CIE94 color-difference formula, for which a definite calculation method is given in "Study of color-difference formulas and their influence on image dominant-color extraction" (Zhang Yufa, Electro-Optic Technology Application, 2010.6).
In step 304, the number of expected classes is set to 3-5. For camouflage, the closer the camouflage colors are to the background colors, the better the camouflage effect; however, the background contains many colors, often thousands, whereas a camouflage pattern generally uses only 3-5 colors.
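The merge loop of steps 301-304 can be sketched as follows. Euclidean distance between frequency-weighted class centers stands in for the CIE94 color difference, and the function name is an assumption:

```python
import numpy as np

def pedigree_cluster(colors, freqs, c):
    """Merge the closest pair of classes until c classes remain.

    colors: (To, 3) quantized Lab colors Col; freqs: (To,) frequencies TC_R.
    Each sample starts as its own class; Euclidean distance between the
    frequency-weighted class centers stands in for the CIE94 difference.
    Returns the c class-center colors (the initial centers for FCM).
    """
    classes = [[i] for i in range(len(colors))]   # each sample is one class
    colors = np.asarray(colors, dtype=float)
    freqs = np.asarray(freqs, dtype=float)

    def center(members):
        w = freqs[members]
        return (w[:, None] * colors[members]).sum(axis=0) / w.sum()

    while len(classes) > c:
        centers = np.array([center(g) for g in classes])
        d = np.linalg.norm(centers[:, None] - centers[None, :], axis=2)
        np.fill_diagonal(d, np.inf)               # ignore self-distances
        i, k = np.unravel_index(np.argmin(d), d.shape)
        classes[k].extend(classes[i])             # merge class i into class k
        del classes[i]                            # and drop i from the index set
    return np.array([center(g) for g in classes])
```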
The specific process of determining the dominant colors of the background in the fourth unit is as follows:
step 401, set an iteration stop threshold ε and a maximum iteration number B, initialize the class centers, and set the iteration count b = 0;
step 402, calculate the partition matrix U^(b) according to the following formula, whose entries are the fuzzy membership degrees u_ik^(b) of the kth sample to the ith class:

u_ik^(b) = 1 / Σ_{j=1}^{c} (d_ik / d_jk)^{2/(m-1)}

where k denotes the kth sample, i denotes the ith class, b is the iteration number, u_ik^(b) is the fuzzy membership degree of the kth sample to the ith class in the bth iteration, m is the clustering control parameter, and d_ik is the distance from the kth sample to the ith cluster center, i.e. the color difference between the color value Col(k) of the kth sample and the ith cluster center color v_i^(b-1);
step 403, update the cluster center matrix V^(b) according to the following formula:

v_i^(b) = Σ_{k=1}^{To} TC_R(k) (u_ik^(b))^m Col(k) / Σ_{k=1}^{To} TC_R(k) (u_ik^(b))^m

where v_i^(b) is the class center of the ith class obtained in the bth iteration, TC_R(k) is the frequency of appearance of the kth color, Col(k) is the color value of the kth sample, and To is the number of clustering samples;
step 404, if max_i ‖v_i^(b) − v_i^(b−1)‖ < ε or b ≥ B, stop and output U^(b) and V^(b); otherwise set b = b + 1 and go to step 402.

Claims (10)

1. A method for rapidly extracting the dominant color of a color image based on clustering space mapping is characterized by comprising the following steps:
step 1, representing a color image by using a CIELAB color space, quantizing the color space to obtain a plurality of levels of quantized colors, and counting the frequency of each level of quantized colors in the color image;
step 2, mapping the clustering samples to a color space from a pixel space to obtain a color sample space;
step 3, clustering all levels of colors in the mapped color sample space by using a pedigree clustering algorithm and acquiring a result class center of pedigree clustering;
step 4, taking the result of the pedigree clustering as the initial class centers, performing fast FCM clustering on the clustering samples, and determining the dominant colors of the background.
2. The method according to claim 1, wherein the specific process of step 3 is as follows:
step 301, take each color sample in the color sample space as one class, setting

Γ_j = {y_j},  I = {j | j = 1, 2, …, N}

where y_j is the jth sample to be classified, Γ_j is the jth cluster set, and N is the number of samples;
step 302, among the sets {Γ_j | j ∈ I}, find the pair of cluster sets Γ_i and Γ_k with the minimum distance Δ(Γ_i, Γ_k), where Δ(Γ_i, Γ_k) is the distance between Γ_i and Γ_k, i, k ∈ I;
step 303, merge Γ_i into Γ_k, remove Γ_i, remove i from the index set I, and recompute the class center color V_k of Γ_k:

V_k = Σ_{i=1}^{N_k} TC_R(i) Col(i) / Σ_{i=1}^{N_k} TC_R(i)

where V_k is the class center color of the kth class, N_k is the number of samples in the kth class, Col(i) is the color value of the ith sample, and TC_R(i) is the frequency with which the color of the ith sample appears in the image;
step 304, if the number of merged classes equals the number of expected classes, terminate the calculation; otherwise go to step 302.
3. The method according to claim 1, wherein the specific process of step 4 is as follows:
step S401, set an iteration stop threshold ε and a maximum iteration number B, initialize the class centers, and set the iteration count b = 0;
step S402, calculate the partition matrix U^(b) according to formula (5), whose entries are the fuzzy membership degrees u_ik^(b) of the kth color sample to the ith class:

u_ik^(b) = 1 / Σ_{j=1}^{c} (d_ik / d_jk)^{2/(m-1)}    (5)

where u_ik^(b) is the fuzzy membership degree of the kth color sample to the ith class in the bth iteration, m is the clustering control parameter, d_ik is the color difference between the color value Col(k) of the kth sample and the ith class center color v_i^(b-1), and c is the number of classes;
step S404, update the cluster center matrix V^(b) according to formula (6):

v_i^(b) = Σ_{k=1}^{To} TC_R(k) (u_ik^(b))^m Col(k) / Σ_{k=1}^{To} TC_R(k) (u_ik^(b))^m    (6)

where v_i^(b) is the class center of the ith class calculated in the bth iteration, TC_R(k) is the frequency of appearance of the kth color, Col(k) is the color value of the kth sample, and To is the number of clustering samples;
step S405, if max_i ‖v_i^(b) − v_i^(b−1)‖ < ε or b ≥ B, stop and output U^(b) and V^(b); otherwise set b = b + 1 and go to step S402.
4. The method according to claim 2 or 3, wherein the specific process of step 1 is as follows:
step 101, each of the quantized cuboid regions of the CIELAB color space has length, width and height corresponding to the quantization intervals of the three coordinates, with coordinate ranges [L_m, L_M], [a_m, a_M] and [b_m, b_M];
step 102, if the pixel color (l, a, b) satisfies L_m ≤ l ≤ L_M, a_m ≤ a ≤ a_M and b_m ≤ b ≤ b_M, that pixel color is counted as appearing once.
5. The method according to claim 4, wherein in step 1 each of the three coordinates L, a and b of the CIELAB color space is divided into n levels, giving n³ colors in total; or, according to the characteristics of the three coordinates L, a and b, they are divided into unequal numbers of levels m, n and k, giving m × n × k colors in total.
6. A system for rapidly extracting the dominant colors of a color image based on clustering space mapping, characterized by comprising:
a first unit for representing the color image in the CIELAB color space, quantizing the color space to obtain several levels of quantized colors, and counting the frequency with which each level of quantized color appears in the color image;
a second unit for mapping the clustering samples from the pixel space to the color space to obtain the color sample space;
a third unit for clustering the colors of each level in the mapped color sample space with a pedigree clustering algorithm and obtaining the class centers of the pedigree clustering result;
and a fourth unit for performing the fast clustering algorithm on the clustering samples with the pedigree clustering result as the initial class centers, and determining the dominant colors of the background.
7. The system of claim 6, wherein the third unit comprises:
a subunit for taking each color sample in the color sample space as one class, in which

Γ_j = {y_j},  I = {j | j = 1, 2, …, N}

where y_j is the jth sample to be classified, Γ_j is the jth cluster set, and N is the number of samples;
a subunit for calculating the class center color of each class, which calculates the class center color V_k according to

V_k = Σ_{i=1}^{N_k} TC_R(i) Col(i) / Σ_{i=1}^{N_k} TC_R(i)

where V_k is the class center color of the kth class, N_k is the number of samples in the kth class, Col(i) is the color value of the ith sample, and TC_R(i) is the frequency with which the color of the ith sample appears in the image;
a subunit for selecting, among the sets {Γ_j | j ∈ I}, the pair of cluster sets Γ_i and Γ_k with the minimum distance, merging Γ_i into Γ_k, removing Γ_i, removing i from the index set I, and recomputing the class center color of Γ_k; and
a subunit for obtaining the class centers and terminating the calculation if the number of merged classes equals the number of expected classes.
8. The system of claim 6, wherein the fourth unit comprises:
a subunit for calculating the partition matrix U^(b), whose entries are the fuzzy membership degrees u_ik^(b) of the kth sample to the ith class, according to

u_ik^(b) = 1 / Σ_{j=1}^{c} (d_ik / d_jk)^{2/(m-1)}

where u_ik^(b) is the fuzzy membership degree of the kth sample to the ith class in the bth iteration, m is the clustering control parameter, and d_ik is the color difference between the color value Col(k) of the kth sample and the ith class center color v_i^(b-1);
a subunit for updating the cluster center matrix V^(b) according to

v_i^(b) = Σ_{k=1}^{To} TC_R(k) (u_ik^(b))^m Col(k) / Σ_{k=1}^{To} TC_R(k) (u_ik^(b))^m

where v_i^(b) is the class center of the ith class obtained in the bth iteration, TC_R(k) is the frequency of appearance of the kth color, Col(k) is the color value of the kth sample, and To is the number of clustering samples; and
a subunit for stopping and outputting U^(b) and V^(b) if max_i ‖v_i^(b) − v_i^(b−1)‖ < ε or b ≥ B.
9. The system according to claim 7 or 8, wherein in the first unit
each of the quantized cuboid regions of the CIELAB color space has length, width and height corresponding to the quantization intervals of the three coordinates, with coordinate ranges [L_m, L_M], [a_m, a_M] and [b_m, b_M];
and if the pixel color (l, a, b) satisfies L_m ≤ l ≤ L_M, a_m ≤ a ≤ a_M and b_m ≤ b ≤ b_M, that pixel color is counted as appearing once.
10. The system according to claim 9, wherein in the first unit each of the three coordinates L, a and b of the CIELAB color space is divided into n levels, giving n³ colors in total; or, according to the characteristics of the three coordinates L, a and b, they are divided into unequal numbers of levels m, n and k, giving m × n × k colors in total.
CN202010025357.0A 2020-01-10 2020-01-10 Method and system for rapidly extracting dominant colors of color images based on clustering space mapping Active CN111798526B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010025357.0A CN111798526B (en) 2020-01-10 2020-01-10 Method and system for rapidly extracting dominant colors of color images based on clustering space mapping

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
CN202010025357.0A CN111798526B (en) 2020-01-10 2020-01-10 Method and system for rapidly extracting dominant colors of color images based on clustering space mapping

Publications (2)

Publication Number Publication Date
CN111798526A true CN111798526A (en) 2020-10-20
CN111798526B CN111798526B (en) 2022-04-19

Family

ID=72805846

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010025357.0A Active CN111798526B (en) 2020-01-10 2020-01-10 Method and system for rapidly extracting dominant colors of color images based on clustering space mapping

Country Status (1)

Country Link
CN (1) CN111798526B (en)

Cited By (3)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113484250A (en) * 2021-02-10 2021-10-08 北京简耘科技有限公司 Method for manufacturing colorimetric card for evaluating color of potato peel and potato pulp and evaluation method
CN115464557A (en) * 2022-08-15 2022-12-13 深圳航天科技创新研究院 Method for adjusting mobile robot operation based on path and mobile robot
WO2023198054A1 (en) * 2022-04-14 2023-10-19 人工智能设计研究所有限公司 Color unit determination method and apparatus, electronic device and storage medium

Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065153A (en) * 2012-12-17 2013-04-24 西南科技大学 Video key frame extraction method based on color quantization and clusters
CN104281849A (en) * 2013-07-03 2015-01-14 广州盖特软件有限公司 Fabric image color feature extraction method
CN107392880A (en) * 2017-07-25 2017-11-24 北京华新创科信息技术有限公司 A kind of imitative pattern painting automatic generation method
CN107578451A (en) * 2017-09-20 2018-01-12 太原工业学院 A kind of adaptive key color extraction method towards natural image
CN110120080A (en) * 2019-04-12 2019-08-13 青岛九维华盾科技研究院有限公司 A method of quickly generating standard pattern-painting mass-tone

Patent Citations (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN103065153A (en) * 2012-12-17 2013-04-24 西南科技大学 Video key frame extraction method based on color quantization and clusters
CN104281849A (en) * 2013-07-03 2015-01-14 广州盖特软件有限公司 Fabric image color feature extraction method
CN107392880A (en) * 2017-07-25 2017-11-24 北京华新创科信息技术有限公司 A kind of imitative pattern painting automatic generation method
CN107578451A (en) * 2017-09-20 2018-01-12 太原工业学院 A kind of adaptive key color extraction method towards natural image
CN110120080A (en) * 2019-04-12 2019-08-13 青岛九维华盾科技研究院有限公司 A method of quickly generating standard pattern-painting mass-tone

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
Liu Zunyang et al., "Dominant-color extraction method for imitation camouflage based on pedigree-method-improved FCM", Infrared and Laser Engineering *
Liu Zunyang et al., "Evaluating the visible-light camouflage effect of targets by combining color and distribution information", Infrared and Laser Engineering *

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113484250A (en) * 2021-02-10 2021-10-08 北京简耘科技有限公司 Method for manufacturing colorimetric card for evaluating color of potato peel and potato pulp and evaluation method
CN113484250B (en) * 2021-02-10 2023-01-31 北京简耘科技有限公司 Method for manufacturing colorimetric card for evaluating color of potato skins and potato flesh and evaluation method
WO2023198054A1 (en) * 2022-04-14 2023-10-19 人工智能设计研究所有限公司 Color unit determination method and apparatus, electronic device and storage medium
CN115464557A (en) * 2022-08-15 2022-12-13 深圳航天科技创新研究院 Method for adjusting mobile robot operation based on path and mobile robot

Also Published As

Publication number Publication date
CN111798526B (en) 2022-04-19

Similar Documents

Publication Publication Date Title
CN111798526B (en) Method and system for rapidly extracting dominant colors of color images based on clustering space mapping
Celebi Improving the performance of k-means for color quantization
CN110348399B (en) Hyperspectral intelligent classification method based on prototype learning mechanism and multidimensional residual error network
CN108491430B (en) Unsupervised Hash retrieval method based on clustering characteristic directions
CN110796667B (en) Color image segmentation method based on improved wavelet clustering
WO2012111236A1 (en) Image identification device and program
CN108876797A (en) A kind of image segmentation system and method based on Spiking-SOM neural network clustering
CN109684922A (en) A kind of recognition methods based on the multi-model of convolutional neural networks to finished product dish
CN109376787B (en) Manifold learning network and computer vision image set classification method based on manifold learning network
CN109635140B (en) Image retrieval method based on deep learning and density peak clustering
CN109409438B (en) Remote sensing image classification method based on IFCM clustering and variational inference
CN106874862B (en) Crowd counting method based on sub-model technology and semi-supervised learning
CN108154158B (en) Building image segmentation method for augmented reality application
CN108427745A (en) The image search method of visual dictionary and adaptive soft distribution based on optimization
CN109543723A (en) A kind of image clustering method of robust
CN106650744A (en) Image object co-segmentation method guided by local shape migration
CN107832786A (en) A kind of recognition of face sorting technique based on dictionary learning
CN109711442B (en) Unsupervised layer-by-layer generation confrontation feature representation learning method
CN109886281A (en) One kind is transfinited learning machine color image recognition method based on quaternary number
CN112580502A (en) SICNN-based low-quality video face recognition method
CN112990264A (en) Multi-view clustering method based on consistent graph learning
Albkosh et al. Optimization of discrete wavelet transform features using artificial bee colony algorithm for texture image classification.
CN108921853B (en) Image segmentation method based on super-pixel and immune sparse spectral clustering
CN113033345B (en) V2V video face recognition method based on public feature subspace
CN111860359B (en) Point cloud classification method based on improved random forest algorithm

Legal Events

Date Code Title Description
PB01 Publication
PB01 Publication
SE01 Entry into force of request for substantive examination
SE01 Entry into force of request for substantive examination
GR01 Patent grant
GR01 Patent grant