CN113705681A - Lipstick number identification method based on machine learning - Google Patents

Lipstick number identification method based on machine learning

Info

Publication number
CN113705681A
CN113705681A (application CN202111000354.2A)
Authority
CN
China
Prior art keywords
lipstick
hsv
value
training
clustering
Prior art date
Legal status (The legal status is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the status listed.)
Pending
Application number
CN202111000354.2A
Other languages
Chinese (zh)
Inventor
张文东
郑鑫
Current Assignee (The listed assignees may be inaccurate. Google has not performed a legal analysis and makes no representation or warranty as to the accuracy of the list.)
Beijing University of Technology
Original Assignee
Beijing University of Technology
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
2021-08-28
Publication date
2021-11-26
Application filed by Beijing University of Technology
Priority to CN202111000354.2A
Publication of CN113705681A
Legal status: Pending (current)

Classifications

    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/21: Design or setup of recognition systems or techniques; Extraction of features in feature space; Blind source separation
    • G06F 18/214: Generating training patterns; Bootstrap methods, e.g. bagging or boosting
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06F: ELECTRIC DIGITAL DATA PROCESSING
    • G06F 18/00: Pattern recognition
    • G06F 18/20: Analysing
    • G06F 18/23: Clustering techniques
    • G06F 18/232: Non-hierarchical techniques
    • G06F 18/2321: Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions
    • G06F 18/23213: Non-hierarchical techniques using statistics or function optimisation, e.g. modelling of probability density functions with fixed number of clusters, e.g. K-means clustering
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06N: COMPUTING ARRANGEMENTS BASED ON SPECIFIC COMPUTATIONAL MODELS
    • G06N 3/00: Computing arrangements based on biological models
    • G06N 3/02: Neural networks
    • G06N 3/04: Architecture, e.g. interconnection topology
    • G06N 3/045: Combinations of networks
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 5/00: Image enhancement or restoration
    • G06T 5/70: Denoising; Smoothing
    • G: PHYSICS
    • G06: COMPUTING; CALCULATING OR COUNTING
    • G06T: IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T 7/00: Image analysis
    • G06T 7/90: Determination of colour characteristics

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Data Mining & Analysis (AREA)
  • General Physics & Mathematics (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Evolutionary Computation (AREA)
  • General Engineering & Computer Science (AREA)
  • Artificial Intelligence (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Bioinformatics & Computational Biology (AREA)
  • Evolutionary Biology (AREA)
  • Bioinformatics & Cheminformatics (AREA)
  • Biomedical Technology (AREA)
  • Health & Medical Sciences (AREA)
  • Biophysics (AREA)
  • Computational Linguistics (AREA)
  • General Health & Medical Sciences (AREA)
  • Molecular Biology (AREA)
  • Computing Systems (AREA)
  • Mathematical Physics (AREA)
  • Software Systems (AREA)
  • Probability & Statistics with Applications (AREA)
  • Image Analysis (AREA)

Abstract

The invention provides a lipstick color number identification method based on machine learning, addressing the unstable results obtained in the prior art when lipstick color is judged by the human eye or identified with traditional algorithms. The method obtains lipstick color number training samples; trains a recognition model on the chrominance information of the training samples, the model comprising two parallel clustering branches into which the RGB values and the HSV values are fed and trained independently until the clustering stabilizes, at which point the number of clusters and the cluster centers are determined; and then feeds the RGB and HSV values of a sample to be recognized into the trained model to obtain two results, which are passed to a decision tree to produce the final identification. Because the lipstick color is judged from RGB and HSV separately, the method is more accurate than identification from RGB or HSV alone, and image normalization removes the interference that different lighting environments introduce into the result, further improving the accuracy of lipstick color number identification.

Description

Lipstick number identification method based on machine learning
Technical Field
The invention belongs to the technical field of machine learning applications, and in particular relates to a lipstick color number identification method based on machine learning.
Background
At present, when choosing a lipstick, people usually judge its color by eye and then decide whether to select it on the basis of that judgment.
Judging lipstick color by eye is easily influenced by many factors, such as the observer's physical condition and the lighting environment in which the lipstick is placed. As a result, the same lipstick may be judged to be different colors in different settings; that is, the judgment of lipstick color is unstable.
Furthermore, lipsticks come in a great many varieties and each brand defines its colors by its own independent standard, so color numbers are strongly brand-specific: products of different shades carry all kinds of names and definitions, lipsticks from different brands are rarely placed side by side for comparison, and the naked eye alone cannot reliably determine a lipstick's exact shade or its color number. For example, lipsticks from different brands may be essentially the same color yet carry different names, and it is hard to remember the differences between similar shades across brands, which makes choosing a lipstick color troublesome. It also makes it difficult to analyse current trends in lipstick popularity.
Existing approaches use only the RGB value as the parameter for identifying a lipstick color number. The present method uses the two parameters RGB and HSV simultaneously, performs identification over both channels, and evaluates the two results jointly, thereby reducing errors.
Disclosure of Invention
To identify the lipstick color numbers of different brands quickly and accurately, the invention provides a lipstick number identification method based on machine learning. The specific technical scheme is as follows:
step 1, obtaining lipstick color number training samples, preprocessing the training samples, and obtaining chrominance information of the training samples, wherein the chrominance information comprises RGB (red, green, blue) values and HSV (hue, saturation, value) values;
step 2, training a recognition model with the chrominance information of the training samples, wherein the recognition model comprises two parallel clustering branches, and the RGB values and the HSV values are respectively input into the two branches and trained independently until the clustering is stable, so that two independent sets of clusters are obtained and the number and centers of the clusters are determined;
and step 3, respectively inputting the RGB values and HSV values of the sample to be recognized into the trained recognition model to obtain two output results, and inputting the two results into a decision tree for decision making to obtain the final recognition result.
Further, the preprocessing comprises:
normalizing the pictures of the training samples;
denoising the pictures of the training samples and averaging the pixel values;
and obtaining the RGB values and HSV values of the denoised training samples.
Further, the training samples are the chrominance information of lipsticks obtained from the official websites of the major lipstick brands, and the samples to be identified are lipstick pictures uploaded by users.
Further, the clustering branches perform clustering with the K-Medians clustering method from machine learning.
Compared with the prior art, the beneficial effects of this disclosure are:
The method identifies the lipstick color number in a picture to be detected automatically; a user only needs to provide a facial image to have the lipstick color number identified. In addition, whereas the traditional lipstick number identification method detects against the RGB standard alone, the present method makes its decision over both the RGB and the HSV channel, which improves identification accuracy, and training the two channels in parallel also improves recognition speed.
Drawings
Fig. 1 is a flowchart of a lipstick number identification method based on machine learning according to the present invention.
Detailed Description
To make the purpose, technical solutions and advantages of the present application clearer, the technical solutions are described below completely and in detail with reference to the embodiments and the accompanying drawings. The described embodiments are obviously only a part of the embodiments of the present application, not all of them.
Step 1: acquire the standard RGB and HSV values of lipsticks from the official websites of the major lipstick brands.
Step 2: process the acquired lipstick data.
Normalize the lipstick pictures collected in step 1, denoise them, and average the pixel values; then obtain the RGB value and the HSV value of each denoised training sample.
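For illustration only, the preprocessing of step 2 might look like the following minimal Python sketch using OpenCV and NumPy. The function name preprocess_sample and the concrete operations (min-max normalization, a Gaussian blur, averaging over the whole picture) are assumptions made for this sketch, since the description does not name specific filters.

```python
# Minimal preprocessing sketch (illustrative, not the patented implementation):
# normalize the picture, denoise it, and average the pixels to obtain one
# RGB vector and one HSV vector per training sample.
import cv2
import numpy as np

def preprocess_sample(image_bgr: np.ndarray):
    # Min-max normalization to reduce the influence of the lighting environment.
    normalized = cv2.normalize(image_bgr, None, 0, 255, cv2.NORM_MINMAX)
    # Simple denoising; the description does not specify a particular filter.
    denoised = cv2.GaussianBlur(normalized, (5, 5), 0)
    # Average over all pixels so each sample reduces to one chrominance vector.
    mean_bgr = denoised.reshape(-1, 3).mean(axis=0)
    rgb = mean_bgr[::-1]                         # OpenCV stores BGR; reverse to RGB
    hsv_image = cv2.cvtColor(denoised, cv2.COLOR_BGR2HSV)
    hsv = hsv_image.reshape(-1, 3).mean(axis=0)  # hue is circular, so this average is a simplification
    return rgb, hsv
```

In practice the averaging would presumably be restricted to the lip or lipstick region rather than the whole picture, but the description leaves the region selection open.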
Step 3: train the recognition model with the training samples collected in the preceding steps.
The recognition model is trained on the chrominance information of the training samples. It comprises two parallel clustering branches; the RGB values and the HSV values are respectively input into the two branches and trained independently until the clustering is stable, so that two independent sets of clusters are obtained and the number and centers of the clusters are determined. The classification model to be trained comprises at least one class cluster.
in the embodiment, the clustering branches adopt K-Medians in machine learning, and the training processes of the two branches are the same, specifically as follows:
clustering the training samples by using any clustering method to obtain the cluster center of the initial cluster;
calculating the Euclidean distance between the chromaticity information of the training sample and the cluster center of each initial class cluster, wherein the smaller the Euclidean distance is, the higher the similarity between the training sample and the class cluster is;
comparing the Euclidean distance with a preset threshold value, and if the Euclidean distance is smaller than or equal to the preset threshold value, clustering the training sample to a cluster with the minimum Euclidean distance; if the Euclidean distance is larger than a preset threshold value, the similarity between the training sample and the existing cluster center is too low, and the clustering cannot be performed, adding a cluster, wherein the cluster center value of the cluster is equal to the value of the training sample; wherein, the preset threshold belongs to artificial setting and is related to the identification precision; the smaller the preset threshold value is, the higher the identification precision is; the larger the preset threshold value is, the lower the recognition accuracy is.
And when the clustering result is stable, finishing training, storing the number of the trained class clusters and the class cluster center value in a file, finishing the identification model, and directly calling class cluster information from the file for use when the identification model is used.
The clustering method used in the method can be K-Medians in machine learning;
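A rough single-branch training sketch is given below (it would be run once on the RGB vectors and once on the HSV vectors). Only the threshold rule and the K-Medians-style median update come from the description above; the function name train_branch, seeding with the first sample, and the iteration cap are assumptions for illustration.

```python
# Illustrative single-branch training: assign each sample to its nearest cluster
# if it lies within the preset threshold, otherwise open a new cluster, and
# update centers with per-dimension medians (K-Medians style) until stable.
import numpy as np

def train_branch(samples: np.ndarray, threshold: float, max_iter: int = 50) -> np.ndarray:
    centers = [samples[0].astype(float)]              # seed with the first sample (assumption)
    for _ in range(max_iter):
        assignments = []
        for x in samples:
            dists = [np.linalg.norm(x - c) for c in centers]
            k = int(np.argmin(dists))
            if dists[k] <= threshold:
                assignments.append(k)                 # similar enough: join the closest cluster
            else:
                centers.append(x.astype(float))       # too far from every center: open a new cluster
                assignments.append(len(centers) - 1)
        assignments = np.array(assignments)
        new_centers = []
        for k in range(len(centers)):
            members = samples[assignments == k]
            # keep the old center if a cluster happens to receive no members this pass
            new_centers.append(np.median(members, axis=0) if len(members) else centers[k])
        if all(np.allclose(a, b) for a, b in zip(new_centers, centers)):
            break                                     # clustering is stable
        centers = new_centers
    return np.vstack(centers)                         # cluster centers, to be saved to file

# Example (illustrative threshold values only):
#   rgb_centers = train_branch(rgb_samples, threshold=20.0)
#   hsv_centers = train_branch(hsv_samples, threshold=20.0)
```

A smaller threshold produces more, tighter clusters and finer discrimination between shades; a larger threshold produces fewer, coarser clusters, matching the precision trade-off described above.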
Step 4: obtain the RGB value and the HSV value of the picture to be recognized and input them into the corresponding recognition branches to obtain an output lipstick color number.
A picture to be identified is acquired first; in one possible implementation, a female facial picture whose popularity exceeds a preset threshold, or a picture uploaded by a user, is obtained from social media and used as the picture to be identified.
The RGB value and HSV value of the sample to be recognized are respectively input into the trained recognition model to obtain two output results, and the two results are input into a decision tree to make a decision and obtain the final recognition result.
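The recognition in step 4 can be pictured with the following sketch: each channel finds its nearest cluster, and the two predictions are combined. The description does not spell out the internal structure of the decision tree, so the simple combination rule used here (accept when the channels agree, otherwise prefer the channel whose sample lies closer to its cluster center) and the per-cluster color number labels are assumptions for illustration.

```python
# Illustrative dual-channel recognition: look up the nearest cluster in the RGB
# branch and in the HSV branch, then combine the two predictions. rgb_labels and
# hsv_labels map each cluster index to a lipstick color number (assumed bookkeeping).
import numpy as np

def nearest_cluster(value, centers):
    dists = np.linalg.norm(centers - value, axis=1)
    k = int(np.argmin(dists))
    return k, float(dists[k])

def identify(rgb, hsv, rgb_centers, rgb_labels, hsv_centers, hsv_labels):
    k_rgb, d_rgb = nearest_cluster(rgb, rgb_centers)
    k_hsv, d_hsv = nearest_cluster(hsv, hsv_centers)
    if rgb_labels[k_rgb] == hsv_labels[k_hsv]:
        return rgb_labels[k_rgb]                      # the two channels agree
    # Fallback standing in for the decision tree: trust the channel whose sample
    # sits closer to its cluster center (note the two spaces have different scales).
    return rgb_labels[k_rgb] if d_rgb <= d_hsv else hsv_labels[k_hsv]
```

Comparing raw distances across two different color spaces is only a crude proxy; a trained decision tree, as proposed above, would combine the two channels more carefully.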
While embodiments in accordance with the invention have been described above, they are not intended to be exhaustive or to limit the invention to the precise forms disclosed; obviously, many modifications and variations are possible in light of the above teaching. The embodiments were chosen and described in order to best explain the principles of the invention and its practical application, thereby enabling others skilled in the art to make the best use of the invention in various embodiments and with such modifications as are suited to the particular use contemplated.
Finally, it should be noted that the above embodiments are only specific embodiments of the present application, used to illustrate rather than limit its technical solutions, and the scope of protection of the present application is not restricted to them. Although the present application has been described in detail with reference to the foregoing embodiments, those skilled in the art will understand that the technical solutions described in the foregoing embodiments may still be modified, or some of their technical features replaced by equivalents, without departing from the spirit and scope of the embodiments of the present application; such modifications and replacements are intended to be covered by the present application. The protection scope of the present application is therefore defined by the claims.

Claims (4)

1. A lipstick number identification method based on machine learning, characterized by comprising the following steps:
step 1, obtaining lipstick color number training samples, preprocessing the training samples, and obtaining chrominance information of the training samples, wherein the chrominance information comprises RGB (red, green, blue) value information and HSV (hue, saturation, value) information;
step 2, training a recognition model with the chrominance information of the training samples, wherein the recognition model comprises two parallel clustering branches, and the RGB values and the HSV values are respectively input into the two parallel clustering branches and trained independently until the clustering is stable, so that two independent sets of clusters are obtained and the number and centers of the clusters are determined;
and step 3, respectively inputting the RGB value and HSV value information of the sample to be recognized into the trained recognition model to obtain two output results, and inputting the two results into a decision tree for decision making to obtain the final recognition result.
2. The machine learning-based lipstick number identification method according to claim 1, characterized in that the preprocessing comprises:
normalizing the pictures of the training samples;
denoising the pictures of the training samples and averaging the pixel values;
and obtaining the RGB values and HSV values of the denoised training samples.
3. The machine learning-based lipstick number identification method according to claim 1, characterized in that the training samples are the chrominance information of lipsticks obtained from the official websites of the major lipstick brands, and the sample to be identified is a lipstick picture uploaded by a user.
4. The machine learning-based lipstick number identification method according to claim 1, characterized in that the clustering branches perform clustering with the K-Medians clustering method from machine learning.
CN202111000354.2A, filed 2021-08-28 (priority date 2021-08-28): Lipstick number identification method based on machine learning, published as CN113705681A (pending).

Priority Applications (1)

Application CN202111000354.2A, priority date 2021-08-28, filing date 2021-08-28: Lipstick number identification method based on machine learning (published as CN113705681A).

Applications Claiming Priority (1)

Application CN202111000354.2A, priority date 2021-08-28, filing date 2021-08-28: Lipstick number identification method based on machine learning (published as CN113705681A).

Publications (1)

CN113705681A, published 2021-11-26.

Family

ID=78656400

Family Applications (1)

Application CN202111000354.2A (pending), priority date 2021-08-28, filing date 2021-08-28: Lipstick number identification method based on machine learning (published as CN113705681A).

Country Status (1)

CN: CN113705681A

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN107799116A * 2016-08-31 2018-03-13 科大讯飞股份有限公司 Multi-turn parallel interactive semantic understanding method and apparatus
US20180260974A1 (en) * 2017-03-09 2018-09-13 Hewlett Packard Enterprise Development Lp Color recognition through learned color clusters
CN110322522A * 2019-07-11 2019-10-11 山东领能电子科技有限公司 Vehicle color identification method based on cropping of a target recognition region
CN112016621A (en) * 2020-08-28 2020-12-01 上海第一财经数据科技有限公司 Training method of classification model, color classification method and electronic equipment

Cited By (1)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN114818883A (en) * 2022-04-07 2022-07-29 中国民用航空飞行学院 CART decision tree fire image identification method based on optimal combination of color features

Legal Events

PB01: Publication
SE01: Entry into force of request for substantive examination