WO2023245404A1 - Method for classifying a color of a makeup product and tool for assistance in the development of colors of makeup products - Google Patents

Method for classifying a color of a makeup product and tool for assistance in the development of colors of makeup products

Info

Publication number
WO2023245404A1
Authority
WO
WIPO (PCT)
Prior art keywords
color
colors
makeup
lightness
subfamily
Prior art date
Application number
PCT/CN2022/100042
Other languages
English (en)
Inventor
Haiting GU
Alexander M. JASPERS
Theo M. PHAN VAN SONG
Yue Qiao
Original Assignee
L'oreal
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by L'oreal filed Critical L'oreal
Priority to PCT/CN2022/100042 priority Critical patent/WO2023245404A1/fr
Priority to FR2208340A priority patent/FR3138962A1/fr
Publication of WO2023245404A1 publication Critical patent/WO2023245404A1/fr

Classifications

    • G - PHYSICS
    • G01 - MEASURING; TESTING
    • G01J - MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
    • G01J3/00 - Spectrometry; Spectrophotometry; Monochromators; Measuring colours
    • G01J3/46 - Measurement of colour; Colour measuring devices, e.g. colorimeters
    • G01J3/462 - Computing operations in or between colour spaces; Colour management systems
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T11/00 - 2D [Two Dimensional] image generation
    • G06T11/001 - Texturing; Colouring; Generation of texture or colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06V - IMAGE OR VIDEO RECOGNITION OR UNDERSTANDING
    • G06V10/00 - Arrangements for image or video recognition or understanding
    • G06V10/40 - Extraction of image or video features
    • G06V10/56 - Extraction of image or video features relating to colour
    • G - PHYSICS
    • G06 - COMPUTING; CALCULATING OR COUNTING
    • G06T - IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2200/00 - Indexing scheme for image data processing or generation, in general
    • G06T2200/24 - Indexing scheme for image data processing or generation, in general involving graphical user interfaces [GUIs]

Definitions

  • Embodiments of the invention are related to a method for classifying a color of a cosmetic makeup product, which is automatically implemented by a computer, and a tool for assistance in the development of colors of cosmetic makeup products, which is implemented by a computer and controlled by a user.
  • By cosmetic product is meant any product as defined in Regulation (EC) No. 1223/2009 of the European Parliament and of the Council of November 30, 2009, concerning cosmetic products.
  • A cosmetic make-up product, or “makeup product”, is more particularly intended to cover a body surface in order to modify the perceived color and/or texture.
  • The studied markets can vary from low-cost products to luxury products, and the products can vary according to the market location, for example the Asian or Chinese market differing from the European market.
  • The visualization is typically made by color experts with physical samples of the colors under human eye perception, for instance with thumbnail color samples of the makeup products, because of the strong reaction in the subjective perception of makeup colors that can be produced by a slight variation of the absolute color. Indeed, in the example of a red color for a lipstick, a first red color can subjectively appear cold and lifeless while a second red color, very close in absolute terms to the first red color, can subjectively appear warm and rich.
  • the absolute difference between two colors can be for instance the distance separating these colors in a “standard observer” model, such as the “CIEXYZ” color space, or “CIELAB” color space.
  • CIE: International Commission on Illumination.
  • These standard observer models are defined by the International Commission on Illumination (abbreviated CIE), and the colors they define are not relative to any particular device such as a computer monitor or a printer, but instead relate to the CIE standard observer, which is an averaging of the results of color matching experiments under laboratory conditions.
  • the RGB “Red Green Blue” color space is defined by coordinates of the red, green, and blue additive primaries, and is typically used in electronics for sensing and displaying colors.
  • The “CIELAB” color space, also referred to as L*a*b*, expresses colors as three coordinates: L* for lightness, and a* and b* for, respectively, greenish to reddish chromaticity, and blueish to yellowish chromaticity (the chromaticity conveying both hue and chroma measures).
  • In the a*b* plane, a unique hue can be identified by a unique angle in the trigonometric circle.
  • the CIELAB color space is designed to be more perceptually uniform than, for instance, the RGB color space.
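  • As an illustration of the notion of absolute difference mentioned above, the following minimal Python sketch computes the CIE 1976 color difference ΔE*ab, i.e. the Euclidean distance between two points in the CIELAB space (the function name and the example coordinates are illustrative only, not taken from the disclosure):

```python
import math

def delta_e_ab(lab1, lab2):
    """CIE 1976 color difference: Euclidean distance between two L*a*b* points."""
    return math.sqrt(sum((c1 - c2) ** 2 for c1, c2 in zip(lab1, lab2)))

# Two reds that are close in absolute terms (small delta E) may still be
# perceived very differently as makeup colors.
print(delta_e_ab((45.0, 60.0, 35.0), (47.0, 58.0, 38.0)))
```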
  • However, the subjective perception of makeup product colors is not well translated in the CIELAB color space, such that a slight variation of the absolute color in the CIELAB space can still produce a strong variation in the subjective perception of the makeup color.
  • Indeed, the human eye perception of colors goes beyond the L*a*b* coordinates, and attaches importance to subjective perceptions such as the “background effect” of a color, where for example looking at a single color square surrounded by a colored background results in different perceptions of the color square depending on the background’s color.
  • the perception of a lipstick color may vary depending on the skin tone of the wearer.
  • According to an aspect, a computer-implemented method for automatically classifying a color of a makeup product comprises:
  • providing input coordinates data in the L*a*b* space of the makeup product color;
  • assigning a color family label to the makeup product color according to an identification of a color family volume, amongst a set of color family volumes in the L*a*b* space, containing the input coordinates data;
  • the color family volumes being designed according to a subjective perception of makeup colors, and the makeup product color being classified according to the assigned labels.
  • The said subjective perception of makeup colors used for the design of the volumes, and the hereinafter defined boundary values, lines and surfaces, are advantageously defined by human color experts, for instance according to the aforementioned sensitive subjective perception, specific to makeup colors.
  • The human color experts may provide a visual database of discrete color points, and the design of the volumes and of the boundary values, lines and surfaces may be performed to obtain a continuity of color points by a data-driven computation configured to fit with the visual database.
  • The data-driven computation configured to fit with the visual database can be implemented by a machine-learning trained model.
  • The computer-implemented method according to this aspect makes it possible to classify colors based on labels which are representative of the subjective perception of makeup colors, and defined by conditions established by the subjective perception specific to makeup colors.
  • The resulting classification, for instance applied to each color of a group of analyzed makeup products, can thus provide useful information to a color expert performing development of makeup product colors, despite the limitations of the computer representations of colors (color spaces) and of the on-screen color display.
  • The set of color family volumes in the L*a*b* space is generated from a database including a finite number of points in the L*a*b* space, each point being labelled with a respective family label according to the subjective perception of makeup colors, and from a mathematical calculation comprising a triangulation generating envelopes enclosing all the points of each respective family and an interpolation spreading the envelopes until the respective facing surfaces of neighboring envelopes match each other, the envelopes defining the enclosure of the respective color family volumes.
  • The color family volumes are configured to delimit brown, pink, orange, purple, and red colors in the L*a*b* space according to the subjective perception of makeup colors.
  • The method additionally comprises:
  • assigning a lightness subfamily label (amongst light, medium, dark, for example) to the makeup product color according to an identification of the position of the input coordinates data in comparison with at least one lightness boundary value on the lightness coordinate axis L* of the L*a*b* space;
  • the at least one lightness boundary value being designed according to a subjective perception of makeup colors, and the makeup product color being classified according to the assigned labels.
  • The at least one lightness boundary value on the lightness coordinate axis L* of the L*a*b* space is decreased by a step for input coordinates having a chroma value greater than a threshold set according to the subjective perception of makeup colors.
  • The method additionally comprises:
  • assigning a chroma subfamily label (amongst high, intermediate, low, for example) to the makeup product color according to the position of the input coordinates data in comparison with at least one chroma boundary line in an a*b* plane of the L*a*b* space;
  • the at least one chroma boundary line being designed according to a subjective perception of makeup colors, and the makeup product color being classified according to the assigned labels.
  • The at least one chroma boundary line in the a*b* plane of the L*a*b* space varies depending on the hue of the input coordinates data, according to the subjective perception of makeup colors.
  • The chroma subfamily includes a high label assigned if the chroma of the input coordinates is greater than a first chroma boundary line, an intermediate label assigned if the chroma of the input coordinates is between the first chroma boundary line and a second chroma boundary line, and a low label assigned if the chroma of the input coordinates is lower than the second chroma boundary line.
  • The method additionally comprises:
  • assigning a hue tone subfamily label (amongst cool, neutral, warm, for example) to the makeup product color according to the position of the input coordinates data in comparison with at least one hue tone boundary surface inside the respective color family volume in the L*a*b* space;
  • the at least one hue tone boundary surface being designed according to a subjective perception of makeup colors, and the makeup product color being classified according to the assigned labels.
  • The at least one hue tone boundary surface in each color family volume of the L*a*b* space is defined according to the subjective perception of makeup colors.
  • This hierarchy (color family, then hue tone, lightness and chroma subfamilies) makes it possible to exhibit a convenient classification of labels, for instance for analysis of colors of makeup products. That being said, other hierarchies can be used too; and regarding the processing for assigning the labels, the respective processing stages can be performed all at the same time, or possibly the lightness label should be processed after the chroma label, because the lightness boundary values may depend on the chroma label.
  • According to another aspect, a computer-implemented tool intended to be controlled by a user, for assistance in the development of colors of makeup products, comprises:
  • a mapping mode adapted for the user to select a group of makeup product colors from a bank of makeup product colors, configured to classify each color of the selected group with the classification method as defined hereinabove, and to display a map (ID card) of the selected colors arranged according to the respectively assigned labels.
  • the tool additionally comprises:
  • an application mode adapted for the user to select a set of at least one makeup product color from the displayed map (ID card) and to select at least one skin tone photograph model, configured to display images of simulation of applications of the selected set of makeup product colors respectively on the at least one skin tone photograph model.
  • the tool additionally comprises:
  • a color creation mode adapted for the user to select at least one skin tone photograph model and to set parameters for generating a custom color, configured to display an image of simulation of an application of the custom color on the at least one skin tone photograph model.
  • the color creation mode is additionally adapted for the user to select a set of at least one makeup product color from the displayed map (ID card) , and is configured to simultaneously display comparison images of simulation of applications of the custom color and of the selected set of makeup product colors, respectively on the at least one skin tone photograph model.
  • a computer program product comprises instructions which, when the program is executed by a computer, cause the computer to carry out the method as defined hereinabove, or cause the computer to carry out the tool as defined hereinabove.
  • a computer-readable storage medium comprises instructions which, when executed by a computer, cause the computer to carry out the method as defined hereinabove, or cause the computer to carry out the tool as defined hereinabove.
  • Aspects and embodiments provide in particular a digital system that enables instant analysis of color cosmetics shades and shade range creation, based on an integrated process of 1) data visualization of instrumental measurement data according to labels adapted for the classification of makeup product colors; 2) digital application of measured shades on images depicting models of different skin tones, taken for instance using a pseudo-spectral imaging system; 3) functionality to accurately conceive and apply shades digitally on images taken, for example, using the aforementioned pseudo-spectral camera system.
  • Figures 1 to 7 illustrate embodiments of a method for classifying a color of a makeup product, according to the invention;
  • Figures 8 to 14 illustrate embodiments of a tool for assistance in the development of colors of makeup products, according to the invention.
  • Figure 1 illustrates a method 100 for classifying a color of a makeup product 102, which is designed to be automatically implemented by a computer.
  • the makeup product 102 is preferentially a lipstick product.
  • the method is described here for providing classification labels for a single color 104, of one single makeup product 102.
  • the makeup product 102 is selected amongst a database of makeup product colors at a step 102.
  • The classification is mostly intended to be applied to several colors of several makeup products, in order to distinguish these colors according to subjective human perception of makeup colors. In such a case the method is performed for each color of the set of several makeup products.
  • The method comprises, at a step 104, providing input coordinates data in the L*a*b* space of the makeup product color 102.
  • The L*a*b* space is the conventional “CIELAB” color space, expressing colors as three coordinates: L* for lightness, and a* and b* for, respectively, greenish to reddish chromaticity, and blueish to yellowish chromaticity (the chromaticity conveying both hue and chroma measures).
  • The step 104 may include a conventional conversion of the color’s coordinates from any other color space to coordinates in the L*a*b* color space.
  • A first assignation step 106 assigns a color family label (which can be for instance the brown, pink, orange, purple, or red color family) to the makeup product color 102-104.
  • The color family label may be assigned according to an identification of a color family volume, amongst a set of color family volumes in the L*a*b* space, containing the input coordinates data.
  • The color family volumes are specifically designed according to a subjective perception of makeup colors. For example, and as described hereinafter in relation with figure 3, the color family volumes are decided by fitting with a visual database, advantageously according to a data-driven computation.
  • A second assignation step 108 assigns a lightness subfamily label (which can be for instance the light, medium, or dark lightness subfamily), a chroma subfamily label (which can be for instance the high, intermediate, or low chroma subfamily), and a hue tone subfamily label (which can be for instance the cool, neutral, or warm hue tone subfamily) to the makeup product color 102-104.
  • The lightness subfamily label may be assigned according to an identification of the position of the input coordinates data 104 in comparison with at least one lightness boundary value on the lightness coordinate axis L* of the L*a*b* space.
  • The at least one lightness boundary value is specifically designed according to a subjective perception of makeup colors.
  • The chroma subfamily label may be assigned according to the position of the input coordinates data 104 in comparison with at least one chroma boundary line in an a*b* plane of the L*a*b* space.
  • The at least one chroma boundary line is specifically designed according to a subjective perception of makeup colors.
  • The hue tone subfamily label may be assigned according to the position of the input coordinates data 104 in comparison with at least one hue tone boundary surface inside the respective color family volume 106 in the L*a*b* space.
  • The at least one hue tone boundary surface is specifically designed according to a subjective perception of makeup colors.
  • At the final step 110, the makeup product color is classified according to the assigned labels, for instance and advantageously in an identification card map, hierarchically arranged as described hereinafter in relation with figure 7.
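  • The following Python sketch shows one possible way to chain the assignation steps 106 to 110 described above; the four assignation callables are hypothetical placeholders for the boundary tests detailed in relation with figures 3 to 6, and the label codes are examples only:

```python
from dataclasses import dataclass

@dataclass
class MakeupColorLabels:
    family: str     # e.g. "RD"
    lightness: str  # e.g. "MDM"
    chroma: str     # e.g. "HGH"
    hue_tone: str   # e.g. "WRM"

def classify_color(lab, assign_family, assign_lightness, assign_chroma, assign_hue_tone):
    """Chain the assignation steps on one L*a*b* color (hypothetical signatures)."""
    family = assign_family(lab)               # step 106: color family volume (figure 3)
    chroma = assign_chroma(lab)               # chroma boundary lines (figure 5)
    # The lightness label is assigned after the chroma label because the
    # lightness boundary values may be lowered for high-chroma colors.
    lightness = assign_lightness(lab, chroma)  # lightness boundary values (figure 4)
    hue_tone = assign_hue_tone(lab, family)    # hue tone boundary surfaces (figure 6)
    return MakeupColorLabels(family, lightness, chroma, hue_tone)
```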
  • Figure 2 illustrates an example discrete representation of the L*a*b* color space, also called CIELAB color space.
  • The CIELAB color space expresses colors as three coordinates: L* for lightness, and a* and b* for, respectively, greenish to reddish hues, and blueish to yellowish hues.
  • The L*a*b* space makes it possible to easily derive the hue, the chroma and the value of the color points in the L*a*b* coordinate system.
  • the value of a color point is defined by its lightness coordinate L*.
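  • As a short illustration, chroma and hue can be derived from the a* and b* coordinates as follows (a minimal sketch; the function name and example values are illustrative):

```python
import math

def lab_to_lch(L, a, b):
    """Derive value (L*), chroma C* and hue angle h (in degrees) from L*a*b*."""
    chroma = math.hypot(a, b)                   # distance from the neutral axis in the a*b* plane
    hue = math.degrees(math.atan2(b, a)) % 360  # angle in the trigonometric circle of the a*b* plane
    return L, chroma, hue

print(lab_to_lch(45.0, 60.0, 35.0))  # a saturated reddish color (illustrative values)
```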
  • Figure 3 illustrates the generation of the set of color family volumes in the L*a*b* space, which are used for the identification of the color family label of the input coordinates data at the first assignation step 106 of the method 100.
  • A database including a finite number of points in the L*a*b* space for each color family is provided, wherein each point is labelled with a respective color family label BRN, ORG, PNK, RD, PRP, as depicted by the scatter plot 302.
  • The labels are “manually” assigned to each point in the database, by human color experts and according to the subjective perception of makeup colors. These labels may correspond to the evaluation of a brown color family BRN, an orange color family ORG, a pink color family PNK, a red color family RD, and a purple color family PRP.
  • This “manual” assignation is performed one time for configuring the computer-implemented classification method according to the specific subjective perception of makeup colors. This one-time manual assignation may be performed according to the classical technique of classification of makeup colors described hereinafter in relation with figure 8.
  • A mathematical calculation is performed in order to extrapolate a continuous volume in the L*a*b* space for each discrete color family scatter plot.
  • the mathematical calculation generates envelopes enclosing all the points of each respective family, for example thanks to a conventional triangulation technique, such as the Delaunay triangulation, and an alpha-shape generation.
  • the mathematical calculation advantageously executes an interpolation spreading the envelopes, as shown by plot 306.
  • the spreading is configured in order to fill the gaps between colors family volumes, until the respective facing surfaces of neighboring envelopes matches with each other. This can for example be performed by choosing the nearest neighbor according to the distance to triangulation surface of each color family.
  • An input point, positioned in the L*a*b*color space according to its coordinates, is thus assigned with the respective label BRN, ORG, PNK, RD, PRP of the envelope which encloses the input point in the L*a*b*color space.
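  • The following Python sketch illustrates, under simplifying assumptions, the assignation of a color family label from labelled scatter points: convex-hull Delaunay triangulations stand in for the alpha-shape envelopes, and the nearest-neighbor fallback plays the role of the interpolation spreading the envelopes; data and function names are illustrative:

```python
import numpy as np
from scipy.spatial import Delaunay, cKDTree

def build_family_classifier(points_by_family):
    """points_by_family: dict mapping a family label (e.g. "BRN") to an (N, 3)
    array of labelled L*a*b* points provided by the color experts."""
    hulls = {lbl: Delaunay(pts) for lbl, pts in points_by_family.items()}
    all_pts = np.vstack(list(points_by_family.values()))
    all_lbls = np.concatenate([[lbl] * len(pts) for lbl, pts in points_by_family.items()])
    tree = cKDTree(all_pts)

    def assign_family(lab):
        for lbl, hull in hulls.items():
            if hull.find_simplex(np.asarray(lab)) >= 0:  # inside this family's envelope
                return lbl
        _, idx = tree.query(lab)  # in a gap between envelopes: take the nearest labelled point
        return all_lbls[idx]

    return assign_family
```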
  • Figure 4 illustrates the identification of the position of the input coordinates data 104 in comparison with at least one lightness boundary value BndVal1, BndVal2 used for assigning the lightness subfamily label at step 108 of the method 100.
  • The lightness subfamily labels include a “light” lightness label LGT, a “medium” lightness label MDM, and a “dark” lightness label DRK, and are identified in comparison with a first lightness boundary value BndVal1 and a second lightness boundary value BndVal2, lower than the first lightness boundary value BndVal1.
  • The light label LGT is assigned if the lightness input coordinate L* is greater than the first lightness boundary value BndVal1;
  • the medium label MDM is assigned if the lightness input coordinate L* is between the first lightness boundary value BndVal1 and the second lightness boundary value BndVal2;
  • the dark label DRK is assigned if the lightness input coordinate L* is lower than the second lightness boundary value BndVal2.
  • Both lightness boundary values BndVal1, BndVal2 are specifically designed according to the subjective perception of makeup colors, and in particular, the level on the L* axis of the lightness boundary values may be set according to the chroma of the input coordinates data.
  • The lightness boundary values BndVal1, BndVal2 are advantageously decreased by a step C*Stp for input coordinates having a chroma value C* greater than a threshold chosen according to the subjective perception of makeup colors.
  • In other words, the lightness boundary values BndVal1, BndVal2 slightly decrease for the most chromatic colors, in particular for the “high” chroma subfamily label assigned at step 108 of the method 100 described in relation with figure 5. This advantageously permits compensating for an effect, called the Helmholtz-Kohlrausch effect, where the subjective perception of brightness increases with chroma.
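  • A minimal Python sketch of this lightness assignation is given below; the numeric boundary values, chroma threshold and step are purely illustrative assumptions, the disclosure only stating that the boundaries are lowered by a step for high-chroma colors:

```python
import math

def assign_lightness(lab, bnd_val1=65.0, bnd_val2=40.0, chroma_threshold=60.0, step=5.0):
    """Assign "LGT", "MDM" or "DRK" from the L* coordinate (illustrative boundary values)."""
    L, a, b = lab
    if math.hypot(a, b) > chroma_threshold:  # Helmholtz-Kohlrausch compensation
        bnd_val1 -= step
        bnd_val2 -= step
    if L > bnd_val1:
        return "LGT"
    if L > bnd_val2:
        return "MDM"
    return "DRK"
```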
  • Figure 5 illustrates the identification of the position of the input coordinates data 104 in comparison with at least one chroma boundary line BndCrcl1, BndCrcl2 in the a*b* plane including the input coordinates point, used for assigning the chroma subfamily label at step 108 of the method 100.
  • The chroma subfamily labels include a “high” chroma label HGH, an “intermediate” chroma label INTR, and a “low” chroma label LW, and are identified in comparison with a first chroma boundary line BndCrcl1 and a second chroma boundary line BndCrcl2, lower than the first chroma boundary line BndCrcl1.
  • The first chroma boundary line BndCrcl1 and the second chroma boundary line BndCrcl2 are circle-like, i.e. these lines have a circular appearance.
  • The high label HGH is assigned if the chroma of the input coordinates C* is greater than the first chroma boundary line BndCrcl1, the intermediate label INTR is assigned if the chroma of the input coordinates C* is between the first chroma boundary line BndCrcl1 and the second chroma boundary line BndCrcl2, and the low label LW is assigned if the chroma of the input coordinates C* is smaller than the second chroma boundary line BndCrcl2.
  • Both chroma boundary lines BndCrcl1, BndCrcl2 are specifically designed according to the subjective perception of makeup colors, and in particular, these lines are defined only for hues in the a*b* plane that are susceptible to an application of a makeup product.
  • The hues that are susceptible to an application are approximately located on the half-plane of positive values of a*, i.e. from yellowish orange hues yORG to blueish purple hues bPRP.
  • The chroma boundary lines BndCrcl1, BndCrcl2 advantageously depend on the hue of the input coordinates data, in order to take into account the subjective perception of the chroma in accordance with the hue of the respective color. Indeed, for example, the orange hues yORG appear “weaker” in terms of chroma than the purple hues bPRP.
  • Both chroma boundary lines BndCrcl1, BndCrcl2 may thus have the appearance of a portion of a spiral, having a slightly larger radius on the side of positive values of b* (yellowish orange hues yORG) and a slightly narrower radius on the side of negative values of b* (blueish purple hues bPRP), in comparison with the spiral radius around the zero value of b* (reddish hues at positive values of a*).
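  • The hue-dependent chroma boundaries can be sketched in Python as below; the linear radius interpolation is a crude stand-in for the spiral-like boundary lines of figure 5, and all numeric values are illustrative assumptions:

```python
import math

def hue_dependent_radius(hue_deg, r_yorg, r_red, r_bprp):
    """Interpolate a boundary radius between yellowish orange (+90 deg), red (0 deg)
    and blueish purple (-90 deg) hues of the makeup half-plane (positive a*)."""
    h = max(-90.0, min(90.0, ((hue_deg + 180.0) % 360.0) - 180.0))
    t = abs(h) / 90.0
    return (1 - t) * r_red + t * (r_yorg if h >= 0 else r_bprp)

def assign_chroma(lab, r1=(55.0, 45.0, 35.0), r2=(35.0, 25.0, 20.0)):
    """Assign "HGH", "INTR" or "LW"; r1 / r2 are (yORG, red, bPRP) radii of the first
    and second boundary lines (illustrative values, larger toward yORG)."""
    _, a, b = lab
    chroma = math.hypot(a, b)
    hue = math.degrees(math.atan2(b, a))
    if chroma > hue_dependent_radius(hue, *r1):
        return "HGH"
    if chroma > hue_dependent_radius(hue, *r2):
        return "INTR"
    return "LW"
```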
  • Figure 6 illustrates the identification of the position of the input coordinates data 104 in comparison with at least one hue tone boundary surface BndSrfc1, BndSrfc2 in the L*a*b* space, more particularly in the color family volume (figure 3) including the input coordinates point, used for assigning the hue tone subfamily label at step 108 of the method 100.
  • The hue tone subfamily labels include a “warm” hue tone label WRM, a “neutral” hue tone label NTR, and a “cool” hue tone label CL, and are identified in comparison with a first hue tone boundary surface BndSrfc1 and a second hue tone boundary surface BndSrfc2, delimiting the space inside each respective color family volume BRN, ORG, RD, PNK, PRP.
  • The warm label WRM is assigned if the input point is located on one side of the first hue tone boundary surface BndSrfc1, the neutral label NTR is assigned if the input point is located between the other side of the first hue tone boundary surface BndSrfc1 and one side of the second hue tone boundary surface BndSrfc2, and the cool label CL is assigned if the input point is located on the other side of the second hue tone boundary surface BndSrfc2.
  • The hue tone boundary surfaces BndSrfc1, BndSrfc2 are specifically designed according to the subjective perception of makeup colors, and in particular, these surfaces’ locations and the neutral range are decided by visual results of makeup color experts.
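  • A minimal sketch of this sided test is given below, with each hue tone boundary surface approximated by a plane given as a point and a normal oriented toward the warm side (a simplifying assumption; the actual surfaces are expert-defined inside each family volume):

```python
import numpy as np

def assign_hue_tone(lab, surf1, surf2):
    """Assign "WRM", "NTR" or "CL" from the side of two planar boundary surfaces,
    each given as (point_on_surface, warm_side_normal) in L*a*b* coordinates."""
    p = np.asarray(lab, dtype=float)
    side1 = np.dot(p - surf1[0], surf1[1])  # signed side of the first boundary surface
    side2 = np.dot(p - surf2[0], surf2[1])  # signed side of the second boundary surface
    if side1 > 0:
        return "WRM"
    if side2 > 0:
        return "NTR"
    return "CL"
```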
  • Figure 7 illustrates an example result at the final step 110 of the classification method 100 described hereinabove in relation with figures 1 to 6.
  • The makeup product color is classified according to the assigned labels, advantageously in the illustrated identification card map, which depicts a hierarchical arrangement of colors according to the color family labels firstly, to the hue tone subfamily labels secondly, and then equally to the lightness subfamily label and the chroma subfamily label.
  • Alternatively, the table may be organized without the sets of subcolumns for the chroma subfamily labels, these being replaced by a sorting of the colors in each lightness subline in rising order of their chroma values, for instance from left to right in the respective sublines.
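  • The hierarchical arrangement of the identification card map can be sketched as a nested grouping, here in the variant without chroma subcolumns where colors are sorted by rising chroma within each subline (a minimal sketch; names and data structure are illustrative):

```python
from collections import defaultdict

def build_id_card_map(classified_colors):
    """classified_colors: iterable of (shade_name, labels, chroma_value), where
    `labels` carries the family, lightness and hue tone labels.
    Returns family main line -> lightness subline -> hue tone column -> shade names,
    sorted by rising chroma from left to right."""
    table = defaultdict(lambda: defaultdict(lambda: defaultdict(list)))
    for name, labels, chroma_value in classified_colors:
        table[labels.family][labels.lightness][labels.hue_tone].append((chroma_value, name))
    for main_line in table.values():
        for subline in main_line.values():
            for cell in subline.values():
                cell.sort()  # rising order of chroma values
    return table
```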
  • the identification card map can be generated many times, for different selections of groups of colors of makeup products, in a very fast manner.
  • The method thus saves a lot of time for color experts, in comparison with the classical technique for this kind of classification, performed manually by visual inspection of thumbnail color samples as shown by figure 8.
  • The classification is classically made by color experts with physical samples of the colors under human eye perception, for instance with thumbnail color samples of the makeup products, which are individually characterized according to the subjective perception and are spread out over a lab bench or a white board.
  • While the chroma value may be seen as the distance of the color point from the origin in the a*b* plane, the subjective perception of chroma varies depending on the hue: a high chroma point in a given hue (for instance a purplish hue) may be perceived, in terms of chroma, like a lower chroma point in another hue (for instance an orangish hue).
  • The classification according to the subjective perception of makeup colors would thus remain to be performed visually, and is not practicable because of the limited color display performance of the computer monitor.
  • The tool, which is implemented by a computer controlled by the user, may be embodied in practice as a computer program product comprising instructions which, when the program is executed by a computer, cause the computer to carry out the tool according to the present disclosure; or may be embodied in practice as a computer-readable storage medium comprising instructions which, when executed by a computer, cause the computer to carry out the tool according to the present disclosure.
  • The application mode is adapted for the user to select a set of one or more makeup product colors from the identification card map displayed by the mapping mode, and to select one skin tone photograph model or several skin tone photograph models.
  • The application mode is configured to display images of simulation of applications of the selected set of makeup product colors respectively on the at least one skin tone photograph model.
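  • A very naive Python sketch of such a simulation is given below: it simply blends the selected shade, in L*a*b*, into a masked lip region of a skin tone photograph; it does not attempt to reproduce the pseudo-spectral rendering mentioned in the present disclosure, and the mask, opacity and color handling are illustrative assumptions only:

```python
import numpy as np
from skimage import color

def apply_shade(image_rgb, lips_mask, shade_lab, opacity=0.8):
    """image_rgb: float RGB image in [0, 1]; lips_mask: boolean mask of the lip area;
    shade_lab: the selected makeup product color as (L*, a*, b*)."""
    lab = color.rgb2lab(image_rgb)
    blended = lab.copy()
    blended[lips_mask] = (1 - opacity) * lab[lips_mask] + opacity * np.asarray(shade_lab, dtype=float)
    return color.lab2rgb(blended)
```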
  • Figure 11 shows possible options in the application mode.
  • In the depicted example, the user has selected nine or more shades from the mapping mode, and has selected one skin tone photograph model.
  • Two view options may be provided for displaying the images of simulations, for instance a full-face view (see figure 12) or a close-up view as depicted in figure 11.
  • The shade selection can be modified at any time, by the same process as in the mapping mode, for instance using checkable boxes for each image of simulation. Also, the modified shade selection can automatically be reflected in the mapping mode.

Landscapes

  • Physics & Mathematics (AREA)
  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • General Physics & Mathematics (AREA)
  • Spectroscopy & Molecular Physics (AREA)
  • Mathematical Physics (AREA)
  • Multimedia (AREA)
  • Spectrometry And Color Measurement (AREA)

Abstract

The computer-implemented tool, intended to be controlled by a user, for assistance in the development of colors of makeup products, comprises: a mapping mode adapted for the user to select a group of makeup product colors from a bank of makeup product colors, configured to classify each color of the selected group with the method for classifying a color of a makeup product according to a labelling designed according to a subjective perception of makeup colors. The mapping mode is configured to display a map of the selected colors arranged in a table according to the respectively assigned labels, the table being organized with a main line for each color family label and a main column for each hue tone subfamily label, each main line comprising a subline for each lightness subfamily label, and each main column comprising a subcolumn for each chroma subfamily label.
PCT/CN2022/100042 2022-06-21 2022-06-21 Procédé de classification d'une couleur d'un produit de maquillage et outil pour l'assistance au développement de couleurs de produits de maquillage WO2023245404A1 (fr)

Priority Applications (2)

Application Number Priority Date Filing Date Title
PCT/CN2022/100042 WO2023245404A1 (fr) 2022-06-21 2022-06-21 Procédé de classification d'une couleur d'un produit de maquillage et outil pour l'assistance au développement de couleurs de produits de maquillage
FR2208340A FR3138962A1 (fr) 2022-06-21 2022-08-17 Méthode de classification d’une couleur d’un produit de maquillage et outil d’aide au développement de couleurs de produits de maquillage

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/CN2022/100042 WO2023245404A1 (fr) 2022-06-21 2022-06-21 Procédé de classification d'une couleur d'un produit de maquillage et outil pour l'assistance au développement de couleurs de produits de maquillage

Publications (1)

Publication Number Publication Date
WO2023245404A1 true WO2023245404A1 (fr) 2023-12-28

Family

ID=89379011

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/CN2022/100042 WO2023245404A1 (fr) 2022-06-21 2022-06-21 Procédé de classification d'une couleur d'un produit de maquillage et outil pour l'assistance au développement de couleurs de produits de maquillage

Country Status (2)

Country Link
FR (1) FR3138962A1 (fr)
WO (1) WO2023245404A1 (fr)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US5311293A (en) * 1983-07-18 1994-05-10 Chromatics Color Sciences International, Inc. Method and instrument for selecting personal compatible colors
US20080080766A1 (en) * 2006-10-02 2008-04-03 Gregory Payonk Apparatus and Method for Analyzing Skin Using L*a*b* Colorspace
CN106659286A (zh) * 2014-07-23 2017-05-10 博姿有限公司 选择化妆品颜色的方法
US20170140252A1 (en) * 2005-10-03 2017-05-18 Susan Lynn Stucki Computerized, personal-scent analysis sytem
CN108885134A (zh) * 2016-02-08 2018-11-23 平等化妆品公司 用于配制和注出视觉定制化妆品的设备和方法
US20190090614A1 (en) * 2016-03-23 2019-03-28 L'oreal Method for determining the color of a cosmetic product adapted for a wearer's skin

Family Cites Families (5)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
US6067504A (en) * 1983-07-18 2000-05-23 Chromatics Color Sciences International, Inc. Method for correctly identifying hair color
US8498456B2 (en) 2009-07-13 2013-07-30 Stylecaster, Inc. Method and system for applying cosmetic and/or accessorial enhancements to digital images
WO2012065037A1 (fr) * 2010-11-12 2012-05-18 Colormodules Inc. Procédé et système de correspondance de couleurs et de recommandation de couleur
US11315173B2 (en) 2016-09-15 2022-04-26 GlamST LLC Applying virtual makeup products
US20180374140A1 (en) * 2017-06-22 2018-12-27 Susan L Stucki Computerized, personal beauty product analysis system


Also Published As

Publication number Publication date
FR3138962A1 (fr) 2024-02-23

Similar Documents

Publication Publication Date Title
Zhou et al. A survey of colormaps in visualization
US8000524B2 (en) Color naming, color categorization and describing color composition of images
JP5968070B2 (ja) 色処理装置および色調整方法
Mittelstädt et al. Colorcat: Guided design of colormaps for combined analysis tasks
US20100284610A1 (en) Skin color evaluation method, skin color evaluation apparatus, skin color evaluation program, and recording medium with the program recorded thereon
KR101913612B1 (ko) 이미지 내의 복합 토큰들을 식별하기 위한 시스템 및 방법
CN110231148B (zh) 一种面向颜色分辨的展陈光源显色性评价方法及系统
US5150199A (en) Method for correlating color measuring scales
KR20140077322A (ko) 화장품 추천 방법 및 이를 이용하는 장치
CN105991899A (zh) 颜色转换信息生成设备和方法
WO2023245404A1 (fr) Procédé de classification d'une couleur d'un produit de maquillage et outil pour l'assistance au développement de couleurs de produits de maquillage
JP7436453B2 (ja) 塗色検索装置
Sanz et al. Customising a qualitative colour description for adaptability and usability
JP2005091005A (ja) 色評価装置
KR102289628B1 (ko) 퍼스널 컬러 진단 시스템
JP5941041B2 (ja) 任意の色の等価明度を示す値および鮮やかさ感を示す値の正規化方法、トーン種別判別方法、マンセル値算出方法、画像形成方法、インターフェース画面表示装置
JP2022538094A (ja) メイクアップパレットまたは染毛配色の少なくとも一方を推奨するためのコンピューティングデバイス、方法、および装置
JP2020536244A (ja) ヘアカラーのクロスメーカー提案を決定するためのプロセス
Connolly The relationship between colour metrics and the appearance of three‐dimensional coloured objects
KR101366163B1 (ko) 색지각 조건을 대응한 색재현 성능 최적화 하이브리드 방법 및 시스템
Luzuriaga et al. Color machine vision system: an alternative for color measurement
KR102596914B1 (ko) 외부환경을 고려하는 측색 방법, 장치, 및 컴퓨터-판독가능 매체
Fan et al. A comparative study of color between abstract paintings, oil paintings and Chinese ink paintings
Baniya Study of various metrics evaluating color quality of light sources
Safibullaevna et al. Processing Color Images, Brightness and Color Conversion

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 22947184

Country of ref document: EP

Kind code of ref document: A1