CN112016621A - Training method of classification model, color classification method and electronic equipment - Google Patents

Training method of classification model, color classification method and electronic equipment

Info

Publication number
CN112016621A
CN112016621A (application CN202010888167.1A)
Authority
CN
China
Prior art keywords
cluster
type value
color
target color
training
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Granted
Application number
CN202010888167.1A
Other languages
Chinese (zh)
Other versions
CN112016621B (en)
Inventor
王洋
李湘
洪蕾
魏鹏
谭庆超
龚大东
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Shanghai First Financial Data Technology Co ltd
Original Assignee
Shanghai First Financial Data Technology Co ltd
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by Shanghai First Financial Data Technology Co., Ltd.
Priority to CN202010888167.1A
Publication of CN112016621A
Application granted
Publication of CN112016621B
Legal status: Active


Classifications

    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06F ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00 Pattern recognition
    • G06F 18/20 Analysing
    • G06F 18/23 Clustering techniques
    • G06F 18/24 Classification techniques
    • G06V IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V 10/00 Arrangements for image or video recognition or understanding
    • G06V 10/40 Extraction of image or video features
    • G06V 10/56 Extraction of image or video features relating to colour

Landscapes

  • Engineering & Computer Science (AREA)
  • Data Mining & Analysis (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Artificial Intelligence (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Biology (AREA)
  • Evolutionary Computation (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • General Engineering & Computer Science (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Multimedia (AREA)
  • Image Analysis (AREA)

Abstract

A training method for a classification model, a color classification method, and an electronic device are provided. The training method comprises the following steps: acquiring sample data, inputting the sample data into a preset color identification model, and determining a first type value of a target color in the sample data; inputting the first type value of the target color into a classification model to be trained, where the classification model to be trained comprises at least one class cluster; calculating the degree of association between the first type value of the target color and each class cluster; clustering the first type value of the target color based on the degree of association; and updating the class clusters based on the clustering result to complete the training of the classification model. The trained model can then classify colors such as lipstick shades.

Description

Training method of classification model, color classification method and electronic equipment
Technical Field
The present application relates to the field of artificial intelligence technologies, and in particular, to a training method for a classification model, a color classification method, and an electronic device.
Background
Lip color cosmetics are core products of the color cosmetics category: their online sales keep growing, and Chinese consumers' expectations of them keep rising. The inventors found that brands attach strong identities to the color numbers of lip cosmetics and give products of different shades a wide variety of names and definitions; for example, certain lipsticks of different brands are in fact the same color but carry different names, which makes it difficult to analyze current lipstick color trends. Studying these trends can help brands better grasp which shades are popular and obtain more accurate guidance for new product development, marketing, and consumer positioning.
Disclosure of Invention
The present application provides a training method for a classification model, a color classification method, an electronic device, and a computer-readable storage medium, so that the lipstick colors of different brands can be classified objectively and accurately for subsequent analysis.
In a first aspect, the present application provides a method for training a classification model, including:
acquiring sample data, inputting the sample data to a preset color identification model, and determining a first type value of a target color in the sample data;
inputting a first type value of a target color into a classification model to be trained for training, wherein the classification model to be trained comprises at least one class cluster;
calculating the association degree of the first type value of the target color and the class cluster;
clustering the first type value of the target color based on the degree of association;
and updating the class clusters based on the clustering result so as to complete the training of the classification model to be trained.
Further, calculating the association degree of the first type value of the target color and the class cluster, including:
acquiring a cluster center of the cluster;
calculating a first distance measure between the first type value of the target color and the cluster center of each class cluster, and determining the degree of association based on the first distance measure, wherein the first distance measure is negatively correlated with the degree of association.
Further, clustering the first type value of the target color based on the degree of association includes:
determining a minimum distance metric of the first distance metrics based on the first distance metrics;
clustering the first type value of the target color based on the minimum distance metric.
Further, updating the cluster based on the clustering result includes:
comparing the minimum distance metric to a preset threshold;
and if the minimum distance measure is larger than a preset threshold value, increasing the number of the cluster classes.
Further, updating the cluster based on the clustering result includes:
comparing the minimum distance metric to a preset threshold;
and if the minimum distance measure is less than or equal to the preset threshold, clustering the first type value of the target color to the class cluster corresponding to the minimum distance measure.
Further, the method further includes:
acquiring a cluster center of the class cluster and each first type value of the class cluster;
calculating a second distance measure between each first type value and the cluster center of the class cluster;
obtaining an average of the second distance measures based on each second distance measure;
and updating the cluster center of the class cluster based on the average of the second distance measures.
In a second aspect, the present application provides a training apparatus for classification models, including:
the acquisition module is used for acquiring sample data, inputting the sample data into a preset color identification model and determining a first type value of a target color in the sample data;
the input module is used for inputting the first type value of the target color into a classification model to be trained for training, and the classification model to be trained comprises at least one class cluster;
the calculation module is used for calculating the association degree of the first type value of the target color and the class cluster;
the clustering module is used for clustering the first type value of the target color based on the association degree;
and the updating module is used for updating the class clusters based on the clustering result so as to finish the training of the classification model to be trained.
In a third aspect, the present application provides a color classification method, including:
acquiring data to be classified, inputting the data to be classified into a preset color identification model, and determining a first type value of a target color in the data to be classified;
inputting a first type value of the target color into a trained classification model, wherein the trained classification model comprises at least one class cluster;
acquiring a cluster center of the cluster;
calculating a first distance measure of a first type value of the target color and a cluster center of each of the clusters, determining the degree of association based on the first distance measure, the first distance measure being inversely related to the degree of association.
Clustering a first type value of a target color based on the association degree;
wherein the trained classification model is obtained by the training method described in the first aspect.
In a fourth aspect, the present application provides an electronic device comprising,
a memory for storing computer program code, the computer program code comprising instructions which, when read from the memory by the electronic device, cause the electronic device to perform the method described above.
In a fifth aspect, the present application provides a computer readable storage medium having stored thereon a computer program which, when run on a computer, causes the computer to perform the method according to the first aspect.
In a sixth aspect, the present application provides a computer program for performing the method of the first aspect when the computer program is executed by a computer.
In a possible design, the program in the fourth aspect may be stored in whole or in part on a storage medium packaged with the processor, or in part or in whole on a memory not packaged with the processor.
Drawings
Fig. 1 is a schematic structural diagram of an electronic device according to an embodiment of the present application;
Fig. 2 is a flowchart of a method for training a classification model according to an embodiment of the present application;
Fig. 3 is a flowchart of another method for training a classification model according to an embodiment of the present application;
Fig. 4 is a schematic diagram of acquired sample data according to an embodiment of the present application;
Fig. 5 is a flowchart of another method for training a classification model according to an embodiment of the present application;
Fig. 6 is a flowchart of a color classification method according to an embodiment of the present application;
Fig. 7 is a schematic structural diagram of a color classification device according to an embodiment of the present application.
Detailed Description
The technical solutions in the embodiments of the present application are described below with reference to the drawings. In the description of the embodiments, "/" means "or" unless otherwise specified; for example, A/B may mean A or B. "And/or" merely describes an association between objects and covers three cases: for example, "A and/or B" may mean that A exists alone, that A and B exist simultaneously, or that B exists alone.
In the following, the terms "first", "second" are used for descriptive purposes only and are not to be understood as indicating or implying relative importance or implicitly indicating the number of technical features indicated. Thus, a feature defined as "first" or "second" may explicitly or implicitly include one or more of that feature. In the description of the embodiments of the present application, "a plurality" means two or more unless otherwise specified.
As noted above, lip color cosmetics are core products of the color cosmetics category, their online sales keep growing, and Chinese consumers' expectations of them keep rising. Lip color products are named in a bewildering variety of ways: brands attach strong identities to their color numbers and give products of different shades all kinds of names and definitions. For example, certain lipsticks of different brands are in fact the same color but carry different names, which makes current lipstick color trends difficult to analyze. Studying these trends can help brands better grasp which shades are popular and obtain more accurate guidance for new product development, marketing, and consumer positioning.
Fig. 1 is a schematic diagram of an electronic device according to an embodiment of the present application. As shown in fig. 1, the electronic device 1 of this embodiment includes a processor 11, a memory 12, and a computer program 13 that is stored in the memory 12 and executable on the processor 11. When executed by the processor 11, the computer program 13 implements the training method for a classification model and the color classification method of the embodiments described below.
The electronic device 1 includes, but is not limited to, the processor 11 and the memory 12. Those skilled in the art will appreciate that fig. 1 is merely an example and does not limit the electronic device 1, which may comprise more or fewer components than those shown, combine some components, or use different components; for example, the electronic device 1 may further comprise input/output devices, network access devices, buses, and the like.
The processor 11 may be a central processing unit (CPU), another general-purpose processor, a digital signal processor (DSP), an application-specific integrated circuit (ASIC), a field-programmable gate array (FPGA) or other programmable logic device, a discrete gate or transistor logic device, a discrete hardware component, or the like. A general-purpose processor may be a microprocessor or any conventional processor.
The memory 12 may be an internal storage unit of the electronic device 1, such as a hard disk or a memory of the electronic device 1. The memory 12 may also be an external storage device of the electronic device 1, such as a plug-in hard disk, a Smart Media Card (SMC), a Secure Digital (SD) Card, a Flash memory Card (Flash Card), and the like provided on the electronic device 1. Further, the memory 12 may also include both an internal storage unit and an external storage device of the electronic device 1. The memory 12 is used for storing computer programs and other programs and data required by the electronic device 1. The memory 12 may also be used to temporarily store data that has been output or is to be output.
Referring to fig. 2, fig. 2 is a flowchart of an embodiment of a training method for a classification model of the present application, which may be used in the electronic device, and specifically includes:
step 202, sample data is obtained, the sample data is input into a preset color identification model, and a first type value of a target color in the sample data is determined.
The sample data includes, but is not limited to, pictures, videos, etc. containing a target color, which is a color that needs to be classified, such as a lipstick number, etc.
Specifically, an image including a color of lipstick is input into an existing color recognition model, and a first type value of a target color is obtained, wherein the type value of the color includes, but is not limited to, RGB values, LAB values, and color values of other color spaces.
Step 204, inputting a first type value of a target color into a classification model to be trained for training, wherein the classification model to be trained comprises at least one class cluster;
and step 206, calculating the association degree of the first type value of the target color and the class cluster.
Step 208, clustering the first type value of the target color based on the relevance;
and step 210, updating the class cluster based on the clustering result to complete the training of the classification model to be trained.
With this method, a model capable of classifying colors is finally obtained. The trained model can classify target colors: for example, when a lipstick color is input, the model calculates the degree of association between the first type value of that color and each class cluster of the model and classifies the color accordingly, which resolves the inconsistent naming of lipstick colors in the prior art. On this basis, current lipstick color trends can be further analyzed.
Referring to fig. 3, fig. 3 is a flow chart of another embodiment of a method for training a classification model of the present application, the method comprising:
step 302, obtaining sample data, inputting the sample data into a preset color identification model, and determining a first type value of a target color in the sample data
As noted above, the sample data includes, but is not limited to, pictures and videos containing a target color that needs to be classified, such as a lipstick shade; an image may show multiple lipsticks or only a single lipstick color. Images containing lipstick can be acquired in many ways. In one example, they are collected by web crawling: official pictures of mainstream brands' lip color products are crawled from each brand's official website, its Tmall official flagship store, and the major beauty platforms. After the images are acquired, a product library of lip color cosmetics is established, containing subdivided categories such as lipstick, lip glaze, and lip gloss. The acquired images cover most brands on the market, including high-end, mass-market, and domestic lines, and new products can be added in stages so that the library keeps growing.
Referring to fig. 4, fig. 4 shows a picture containing lipstick colors obtained from an official lipstick announcement. The lipstick color appears mainly in three places: the lipstick paste 402, the lip color 404 produced by applying the paste 402, and the color patch 406 corresponding to the paste 402. Although the paste, the color on the lips, and the commercial color block all indicate the same lipstick shade, the colors actually displayed differ slightly and may not be uniform; for example, because lips have some color of their own, the paste color and the lip color superimpose after application, so the result looks slightly different from the paste itself. Therefore, to obtain the lipstick color's type value from a picture more accurately, the type values of the three colors can be combined by a weighted average, which improves the training precision of the color classification model.
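As an illustrative sketch, the weighted-average step could look like the following; the weights and the source names are hypothetical, since the application does not specify them:

```python
# Hypothetical weights for combining the three color readings from a product
# picture (paste, lip swatch, color block); the patent does not give values.
WEIGHTS = {"paste": 0.5, "lip": 0.25, "block": 0.25}

def weighted_color(readings):
    """Combine several RGB readings of the same shade into one type value.

    readings: dict mapping source name -> (R, G, B) tuple.
    Returns the weighted-average RGB tuple, rounded to integers.
    """
    total = sum(WEIGHTS[src] for src in readings)
    return tuple(
        round(sum(WEIGHTS[src] * readings[src][i] for src in readings) / total)
        for i in range(3)
    )

combined = weighted_color({
    "paste": (205, 12, 19),   # bullet color
    "lip":   (198, 40, 45),   # color on the lips (superimposed with lip tone)
    "block": (205, 13, 20),   # commercial color block
})
```

Missing readings are handled by renormalizing over whichever sources are present, so a picture showing only the paste still yields a type value.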
In one implementation, the first type value is obtained directly: an existing color recognition model recognizes the first type value of the target color from the image. The first type value may also be obtained indirectly: for example, a photo containing a lipstick color is input into an existing color recognition model to obtain another type value of the target color, which is then converted into the first type value. Illustratively, if the first type value of the target color is a Lab value, it may be recognized directly or obtained by conversion, such as converting the RGB value of the target color into its Lab value. At present there are roughly 200 to 300 lipstick shades, and each has a corresponding RGB value and Lab value.
RGB is composed of a red channel (R), a green channel (G), and a blue channel (B). The RGB color space superimposes the three basic colors R (red), G (green), and B (blue) to different degrees to produce a rich, wide range of colors, so it is commonly called the three-primary-color model. Nature contains an unlimited number of colors while the human eye can distinguish only a limited number; the RGB model can represent more than 16 million different colors (256 levels per channel, 256^3 = 16,777,216), close to the colors of nature seen by the human eye, so it is also called the natural color model. Red, green, and blue represent the three primary colors of the visible spectrum, and each is divided into 256 levels according to brightness; when the three primary color lights are superimposed in different proportions, various intermediate colors are produced.
The Lab color model is a color model established by the CIE (International Commission on Illumination). Any color in nature can be expressed in Lab space, which is larger than the RGB space. In addition, this model describes human visual perception numerically and is device-independent, so it makes up for the deficiency that the RGB and CMYK models must rely on device color characteristics. The Lab color space is a color-opponent space based on nonlinearly compressed CIE XYZ coordinates, with dimension L representing luminance and a and b representing the color-opponent dimensions. Unlike RGB and CMYK, Lab is designed to approximate human vision and aims at perceptual uniformity: its L component closely matches human luminance perception. It can therefore be used for accurate color balance by modifying the output levels of the a and b components, or for adjusting luminance contrast using the L component. Such transformations are difficult or impossible in RGB or CMYK, which model the output of physical devices rather than human visual perception.
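For reference, the RGB-to-Lab conversion mentioned above can be sketched as follows, assuming the common sRGB encoding with a D65 white point; the application does not fix a particular variant of the conversion:

```python
def rgb_to_lab(r, g, b):
    """Convert an 8-bit sRGB color to CIE Lab (D65 white point).

    Standard conversion chain: sRGB -> linear RGB -> XYZ -> Lab.
    """
    # 1. Undo the sRGB gamma to get linear RGB in [0, 1].
    def linear(c):
        c /= 255.0
        return c / 12.92 if c <= 0.04045 else ((c + 0.055) / 1.055) ** 2.4

    rl, gl, bl = linear(r), linear(g), linear(b)

    # 2. Linear RGB -> XYZ using the sRGB matrix (D65 primaries).
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl

    # 3. XYZ -> Lab relative to the D65 reference white.
    xn, yn, zn = 0.95047, 1.0, 1.08883

    def f(t):
        # Cube root above the CIE threshold, linear segment below it.
        return t ** (1 / 3) if t > (6 / 29) ** 3 else t / (3 * (6 / 29) ** 2) + 4 / 29

    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116 * fy - 16, 500 * (fx - fy), 200 * (fy - fz)
```

White (255, 255, 255) maps to approximately L = 100, a = 0, b = 0, and black (0, 0, 0) to L = 0, matching the Lab description above.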
Step 304, inputting the first type value of the target color into a classification model to be trained for training, wherein the classification model to be trained comprises at least one class cluster.
Specifically, each class cluster in the present application represents one target color. Taking lipstick as an example, there are roughly 200 to 300 existing shades, so after training is complete the trained model will contain a corresponding number of class clusters. During training, the number of first type values under each class cluster grows as sample data is input. It should be noted that each class cluster includes a cluster center; once the classification model is trained, the cluster center is the most representative point in the class cluster, i.e., the point that best describes the target color the cluster indicates.
In one embodiment, at the start of training, first type values of several target colors may be selected as cluster centers according to the amount of sample data; illustratively, the Lab values of several current best-selling lipsticks are selected as the initial cluster centers.
Step 306, obtaining cluster centers of the clusters, calculating a first distance metric between a first type value of the target color and the cluster center of each cluster, and determining a correlation degree based on the first distance metric, wherein the first distance metric and the correlation degree are in negative correlation.
The first distance measure is used to indicate the distance of the first type value of the target color from the cluster center of each cluster class, in other words, the first distance measure represents the closeness of the target color and the color indicated by the cluster center of each cluster class. For example, the closer the colors, the smaller the distance metric, and vice versa.
Taking the lipstick color as the target color as an example, the closer the first type value of a lipstick color in the sample data is to the cluster center of each cluster, the greater the association degree between the two.
It should be noted that the distance metric includes the Euclidean distance, but the calculation of the degree of association is not limited to it; other metrics, such as the cosine distance, may also be used.
Step 308, determining a minimum distance metric among the first distance metrics based on the first distance metrics, and clustering the first type values of the target colors based on the minimum distance metric.
As shown above, the first distance measure indicates the distance between the first type value of the target color and the cluster center of each class cluster. In this step, the smallest of these first distance measures is determined; it identifies the class cluster to which the first type value of the target color belongs.
Taking lipstick as the target color and the Lab value as the first type value, the first distance measure can be calculated by the following formula:

Delta E = sqrt((L1 - L2)^2 + (a1 - a2)^2 + (b1 - b2)^2)

where L represents luminance; a represents the green-red opponent axis, running from dark green (low values) through gray (middle values) to bright pink (high values); and b represents the blue-yellow opponent axis, running from bright blue (low values) through gray (middle values) to yellow (high values).
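As a minimal illustration (not part of the patent text), this distance measure, the CIE76 color difference, is simply the Euclidean distance between two Lab points:

```python
import math

def delta_e(lab1, lab2):
    """CIE76 color difference: Euclidean distance between two Lab points.

    Matches the Delta E formula above; refinements such as CIEDE2000
    exist but are not required by the text.
    """
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Two shades with Delta E below ~1 are indistinguishable to most observers.
delta_e((52.0, 58.0, 35.0), (52.0, 58.5, 35.0))  # -> 0.5
```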
Step 310, comparing the minimum distance metric with a preset threshold, and increasing the number of the class clusters if the minimum distance metric is greater than the preset threshold.
This step mainly determines whether a new class cluster needs to be added, which is judged by setting a threshold: if the minimum distance metric is greater than the preset threshold, the first type value does not belong to any existing class cluster, and a new class cluster must be added to accommodate it.
In one embodiment, the first type value, which does not belong to any class cluster, is set as the cluster center of the new class cluster.
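The assign-or-create logic of steps 306 through 310 can be sketched as follows; this is a hypothetical minimal version, since the application does not prescribe an implementation:

```python
import math

DELTA_E_THRESHOLD = 6.0  # the perceptibility threshold discussed later in the text

def assign(sample, centers):
    """One training step: cluster `sample` (a Lab tuple) or open a new cluster.

    `centers` is a mutable list of cluster-center Lab tuples. Returns the
    index of the cluster the sample was assigned to.
    """
    if centers:
        # First distance measure to every cluster center; keep the minimum.
        dists = [math.dist(sample, c) for c in centers]
        best = min(range(len(dists)), key=dists.__getitem__)
        if dists[best] <= DELTA_E_THRESHOLD:
            return best
    # Minimum distance exceeds the threshold (or no clusters exist yet):
    # the sample becomes the center of a new class cluster.
    centers.append(sample)
    return len(centers) - 1
```

A sample close to an existing center joins that cluster; a sample far from all centers seeds a new one, mirroring the branch in step 310.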
Referring to fig. 5, a flow chart of another embodiment of the training method of the classification model of the present application is specifically described.
Wherein, the steps 502-508 are the same as the steps 302-308 in fig. 3, and are not described herein again.
In step 510, the minimum distance metric is compared with a preset threshold, and if the minimum distance metric is less than or equal to the preset threshold, the first type value of the target color is clustered to a cluster corresponding to the minimum distance metric.
This step mainly determines the classification criterion for the target color, i.e., when to cluster the first type value into the class cluster corresponding to the minimum distance measure. One way of calculating a distance metric, Delta E, was shown above; Delta E computed in Lab space reflects the color difference perceived by the human eye, and ranges can be set for it. For a Delta E of 0 to 1, the perceived color difference is minimal; for 1 to 3, the eye can perceive a difference, but a slight one; for 3 to 6, the perceived difference is obvious; and the larger Delta E becomes beyond that, the stronger the perceived difference. Taking obvious visual difference as the reference, the color-difference interval can be set to Delta E less than or equal to 6.
That is, when Delta E is less than or equal to 6, the class cluster to which the target color belongs can be determined: the preset threshold is set to 6, and when the minimum distance metric between the first type value of the target color and a cluster center is less than or equal to 6, the first type value is clustered into the class cluster corresponding to that minimum distance metric.
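The perceptual bands just described can be captured in a small helper (illustrative only; the band labels are paraphrases of the text):

```python
def perceptibility(delta_e):
    """Label a Delta E value with the perceptual bands quoted in the text."""
    if delta_e <= 1:
        return "imperceptible"   # 0-1: minimal perceived difference
    if delta_e <= 3:
        return "slight"          # 1-3: perceptible but slight
    if delta_e <= 6:
        return "noticeable"      # 3-6: obvious difference
    return "strong"              # > 6: strong difference, outside the cluster
```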
In one embodiment, the RGB value of the target color (e.g., the color of a lipstick in an image) is (205, 12, 19) and the class cluster corresponding to the minimum distance measure has the RGB value (205, 13, 20). Delta E computed by the above formula is 0.33, a difference the naked eye barely perceives, so the target color is classified into the color group whose RGB value is (205, 13, 20). For the calculation, the RGB values can be converted into Lab values as shown above; the specific method is not limited here.
In a non-limiting embodiment, any of the training methods described above may further include the following steps: obtaining the cluster center of a class cluster and each first type value in that cluster; calculating a second distance measure between each first type value and the cluster center; obtaining the average of the second distance measures; and updating the cluster center of the class cluster based on that average.
These steps adjust the accuracy of the cluster center. Each time sample data is input, it is classified into some class cluster; because each newly added first type value lies at a different distance from the cluster center, the position of the cluster center must be adjusted after each classification, otherwise the accuracy of the trained model suffers. Specifically, for a single class cluster, a second distance measure between each first type value and the cluster center is calculated (in the same way as the first distance measure, not repeated here), the average of these second distance measures is computed, and the class cluster re-determines its cluster center based on that average.
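One plausible reading of this update, assumed here rather than stated by the application, is to re-center the cluster at the component-wise mean of its members, which is the point that minimizes the average of the second distance measures:

```python
def update_center(members):
    """Re-estimate a cluster center from its member type values.

    members: non-empty list of Lab (or RGB) tuples currently in the cluster.
    Returns the component-wise mean as the new cluster center.
    """
    n = len(members)
    return tuple(sum(m[i] for m in members) / n for i in range(3))
```

Calling this after each new sample joins a cluster keeps the center drifting toward the true shade as more sample data arrives.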
Referring to fig. 6, another aspect of the present application provides a color classification method, including:
step 602, acquiring data to be classified, inputting the data to be classified into a preset color identification model, and determining a first type value of a target color in the data to be classified;
step 604, inputting the first type value of the target color into a trained classification model, wherein the trained classification model comprises at least one class cluster;
step 606, obtaining the cluster center of the cluster;
step 608, calculating a first distance metric between the first type value of the target color and the cluster center of each of the class clusters, and determining an association degree based on the first distance metric, wherein the first distance metric is negatively correlated with the association degree.
Step 610, clustering the first type value of the target color based on the association degree.
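Steps 602-610 reduce to a nearest-center assignment: since the association degree is negatively correlated with the first distance metric, the most associated class cluster is simply the nearest one. A minimal sketch (function and variable names are illustrative, not from the patent):

```python
def classify(type_value, cluster_centers):
    """Assign a first type value (e.g. an RGB triple) to the class cluster
    whose center has the minimum first distance metric, i.e. the highest
    association degree."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    # Minimum first distance metric == maximum association degree
    return min(range(len(cluster_centers)),
               key=lambda i: dist(type_value, cluster_centers[i]))
```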
It should be noted that the model may be trained by the training method disclosed above.
In another aspect, the present application further provides a training apparatus for a classification model, comprising:
an obtaining module 702, configured to obtain sample data, input the sample data to a preset color identification model, and determine a first type value of a target color in the sample data;
an input module 704, configured to input a first type value of a target color into a classification model to be trained for training, where the classification model to be trained includes at least one class cluster;
a calculating module 706, configured to calculate a degree of association between the first type value of the target color and the class cluster;
a clustering module 708 for clustering the first type value of the target color based on the degree of association;
and the updating module 710 is configured to update the class cluster based on the clustering result to complete the training of the classification model to be trained.
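Putting the five modules together, the training flow can be sketched as a small online-clustering loop. This is an assumed reading of the method: the color identification model that extracts the first type value from the sample data is stubbed out (inputs are already RGB-like tuples), and `threshold` is a hypothetical preset value.

```python
def train(samples, threshold=10.0):
    """Online clustering of target-color type values, in the spirit of the
    training method above: a new class cluster is opened when the minimum
    first distance metric exceeds a preset threshold; otherwise the value
    joins the nearest cluster and that cluster's center is re-determined."""
    def dist(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q)) ** 0.5

    clusters = []  # each cluster: {"center": tuple, "members": [tuple, ...]}
    for value in samples:  # `value` stands in for the extracted first type value
        if not clusters:
            clusters.append({"center": value, "members": [value]})
            continue
        # First distance metric to each cluster center (association degree
        # is negatively correlated with it)
        d, best = min((dist(value, c["center"]), i) for i, c in enumerate(clusters))
        if d > threshold:
            # Minimum distance metric above the preset threshold: add a new cluster
            clusters.append({"center": value, "members": [value]})
        else:
            c = clusters[best]
            c["members"].append(value)
            # Update the cluster center (componentwise mean of members)
            dim = len(value)
            c["center"] = tuple(sum(m[i] for m in c["members"]) / len(c["members"])
                                for i in range(dim))
    return clusters
```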
It is to be understood that the electronic devices and the like described above include hardware structures and/or software modules for performing the respective functions in order to realize the functions described above. Those of skill in the art will readily appreciate that the various illustrative elements and algorithm steps described in connection with the embodiments disclosed herein may be implemented as hardware or combinations of hardware and computer software. Whether a function is performed as hardware or computer software drives hardware depends upon the particular application and design constraints imposed on the solution. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present embodiments.
In the embodiments of the present application, the electronic device and the like may be divided into functional modules according to the above method examples; for example, each functional module may be divided according to each function, or two or more functions may be integrated into one processing module. The integrated module may be implemented in the form of hardware or in the form of a software functional module. It should be noted that the division of modules in the embodiments of the present application is schematic and is merely a logical function division; there may be other division manners in actual implementation.
Through the above description of the embodiments, it is clear to those skilled in the art that, for convenience and simplicity of description, the foregoing division of the functional modules is merely used as an example, and in practical applications, the above function distribution may be completed by different functional modules according to needs, that is, the internal structure of the device may be divided into different functional modules to complete all or part of the above described functions. For the specific working processes of the system, the apparatus and the unit described above, reference may be made to the corresponding processes in the foregoing method embodiments, and details are not described here again.
Each functional unit in the embodiments of the present application may be integrated into one processing unit, or each unit may exist alone physically, or two or more units are integrated into one unit. The integrated unit can be realized in a form of hardware, and can also be realized in a form of a software functional unit.
The integrated unit, if implemented in the form of a software functional unit and sold or used as a stand-alone product, may be stored in a computer-readable storage medium. Based on such understanding, the part of the technical solutions of the embodiments of the present application that in essence contributes to the prior art, or all or part of those technical solutions, may be embodied in the form of a software product. The software product is stored in a storage medium and includes several instructions for causing a computer device (which may be a personal computer, a server, or a network device) or a processor to execute all or part of the steps of the methods described in the embodiments of the present application. The aforementioned storage medium includes: a flash memory, a removable hard disk, a read-only memory, a random access memory, a magnetic disk, an optical disc, and the like.
The above description is only an embodiment of the present application, but the scope of the present application is not limited thereto, and any changes or substitutions within the technical scope of the present disclosure should be covered by the scope of the present application. Therefore, the protection scope of the present application shall be subject to the protection scope of the claims.

Claims (10)

1. A training method of a classification model is characterized by comprising the following steps:
acquiring sample data, inputting the sample data to a preset color identification model, and determining a first type value of a target color in the sample data;
inputting the first type value of the target color into a classification model to be trained for training, wherein the classification model to be trained comprises at least one class cluster;
calculating the association degree of the first type value of the target color and the class cluster;
clustering the first type value of the target color based on the relevance;
and updating the class cluster based on the clustering result, to complete the training of the classification model to be trained.
2. The training method according to claim 1, wherein the calculating the association degree of the first type value of the target color with the class cluster comprises:
acquiring a cluster center of the cluster;
calculating a first distance metric between the first type value of the target color and the cluster center of each of the class clusters, and determining the association degree based on the first distance metric, the first distance metric being negatively correlated with the association degree.
3. The training method of claim 2, wherein clustering the first type value of the target color based on the relevance comprises:
determining a minimum distance metric of the first distance metrics based on the first distance metrics;
clustering the first type value of the target color based on the minimum distance metric.
4. The training method of claim 3, wherein the updating the class cluster based on the clustering result comprises:
comparing the minimum distance metric to a preset threshold;
and if the minimum distance metric is larger than a preset threshold value, increasing the number of the class clusters.
5. The training method of claim 3, wherein the updating the class cluster based on the clustering result comprises:
comparing the minimum distance metric to a preset threshold;
if the minimum distance metric is smaller than or equal to the preset threshold, clustering the first type value of the target color to a cluster corresponding to the minimum distance metric.
6. Training method according to any of claims 1-5, further comprising:
acquiring a cluster center of the cluster and each first type value of the cluster;
calculating a second distance metric between each of the first type values and a cluster center of the class cluster;
obtaining an average of the second distance metrics based on each of the second distance metrics;
updating the cluster center of the class cluster based on the average of the second distance metrics.
7. A method of classifying colors, comprising:
acquiring data to be classified, inputting the data to be classified into a preset color identification model, and determining a first type value of a target color in the data to be classified;
inputting a first type value of the target color into a preset classification model, wherein the preset classification model comprises at least one class cluster;
acquiring a cluster center of the cluster;
calculating a first distance metric between the first type value of the target color and the cluster center of each of the class clusters, and determining an association degree based on the first distance metric, the first distance metric being negatively correlated with the association degree;
clustering the first type value of the target color based on the relevance;
wherein the preset classification model is obtained by training based on the training method of any one of claims 1-6.
8. An electronic device, comprising: a memory for storing computer program code, the computer program code comprising instructions which, when read from the memory by the electronic device, cause the electronic device to perform the method of any of claims 1-6 or cause the electronic device to perform the method of claim 7.
9. A computer readable storage medium comprising computer instructions which, when run on an electronic device, cause the electronic device to perform the method of any of claims 1-6 or cause the electronic device to perform the method of claim 7.
10. A computer program product, characterized in that, when the computer program product is run on a computer, it causes the computer to perform the method of any one of claims 1-6 or causes the computer to perform the method of claim 7.
CN202010888167.1A 2020-08-28 2020-08-28 Training method of classification model, color classification method and electronic equipment Active CN112016621B (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
CN202010888167.1A CN112016621B (en) 2020-08-28 2020-08-28 Training method of classification model, color classification method and electronic equipment

Publications (2)

Publication Number Publication Date
CN112016621A true CN112016621A (en) 2020-12-01
CN112016621B CN112016621B (en) 2023-11-24

Family

ID=73502315

Family Applications (1)

Application Number Title Priority Date Filing Date
CN202010888167.1A Active CN112016621B (en) 2020-08-28 2020-08-28 Training method of classification model, color classification method and electronic equipment

Country Status (1)

Country Link
CN (1) CN112016621B (en)

Cited By (2)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN113705681A (en) * 2021-08-28 2021-11-26 北京工业大学 Lipstick number identification method based on machine learning
US20220180565A1 (en) * 2020-12-09 2022-06-09 Chanel Parfums Beaute Method for identifying a lip-makeup product appearing in an image

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107064955A (en) * 2017-04-19 2017-08-18 北京汽车集团有限公司 barrier clustering method and device
CN107451597A (en) * 2016-06-01 2017-12-08 腾讯科技(深圳)有限公司 A kind of sample class label method and device for correcting
CN111368762A (en) * 2020-03-09 2020-07-03 金陵科技学院 Robot gesture recognition method based on improved K-means clustering algorithm
CN111400528A (en) * 2020-03-16 2020-07-10 南方科技大学 Image compression method, device, server and storage medium

Also Published As

Publication number Publication date
CN112016621B (en) 2023-11-24

Similar Documents

Publication Publication Date Title
JP5116765B2 (en) Method for creating paint color database, search method using the database, and system, program and recording medium thereof
Gevers et al. Color in computer vision: fundamentals and applications
Cheng et al. Effective learning-based illuminant estimation using simple features
Rizzi et al. From retinex to automatic color equalization: issues in developing a new algorithm for unsupervised color equalization
Mojsilovic A computational model for color naming and describing color composition of images
Gijsenij et al. Perceptual analysis of distance measures for color constancy algorithms
CN108701217A (en) A kind of face complexion recognition methods, device and intelligent terminal
CN109903256A (en) Model training method, chromatic aberration calibrating method, device, medium and electronic equipment
Vazquez-Corral et al. Color constancy by category correlation
US11576478B2 (en) Method for simulating the rendering of a make-up product on a body area
CN103699532B (en) Image color retrieval method and system
CN104636759B (en) A kind of method and picture filter information recommendation system for obtaining picture and recommending filter information
AU2015201623A1 (en) Choosing optimal images with preference distributions
Falomir et al. A model for colour naming and comparing based on conceptual neighbourhood. An application for comparing art compositions
Banić et al. Color cat: Remembering colors for illumination estimation
CN112016621A (en) Training method of classification model, color classification method and electronic equipment
CN115547270A (en) Chromatic aberration adjusting method, device and equipment based on spectral analysis and storage medium
Flachot et al. Deep neural models for color classification and color constancy
Justiawan et al. Comparative analysis of color matching system for teeth recognition using color moment
Drew et al. The zeta-image, illuminant estimation, and specularity manipulation
CN114512085A (en) Visual color calibration method of TFT (thin film transistor) display screen
CN113642358B (en) Skin color detection method, device, terminal and storage medium
Kaya et al. Parametric and nonparametric correlation ranking based supervised feature selection methods for skin segmentation
CN114820514A (en) Image processing method and electronic equipment
CN108205677A (en) Method for checking object, device, computer program, storage medium and electronic equipment

Legal Events

Date Code Title Description
PB01 Publication
SE01 Entry into force of request for substantive examination
CB02 Change of applicant information

Address after: 200041 floor 11, No. 651, Nanjing West Road, Jing'an District, Shanghai

Applicant after: Shanghai Yingfan Digital Technology Co.,Ltd.

Address before: 200041 26th floor, East Building, China Merchants Plaza, 333 Chengdu North Road, Jing'an District, Shanghai

Applicant before: Shanghai First Financial Data Technology Co.,Ltd.

GR01 Patent grant