WO2014102567A1 - Method for determining make-up removal efficiency - Google Patents

Method for determining make-up removal efficiency Download PDF

Info

Publication number
WO2014102567A1
WO2014102567A1 (application PCT/IB2012/057740)
Authority
WO
WIPO (PCT)
Prior art keywords
image
make
removal
average
evaluation
Prior art date
Application number
PCT/IB2012/057740
Other languages
French (fr)
Inventor
Gaurav Agarwal
Original Assignee
L'oréal
Priority date (The priority date is an assumption and is not a legal conclusion. Google has not performed a legal analysis and makes no representation as to the accuracy of the date listed.)
Filing date
Publication date
Application filed by L'oréal filed Critical L'oréal
Priority to PCT/IB2012/057740 priority Critical patent/WO2014102567A1/en
Publication of WO2014102567A1 publication Critical patent/WO2014102567A1/en

Classifications

    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45D HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00 Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D44/005 Other cosmetic or toiletry articles, e.g. for hairdressers' rooms for selecting or displaying personal cosmetic colours or hairstyle
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T7/00 Image analysis
    • G06T7/0002 Inspection of images, e.g. flaw detection
    • G06T7/0012 Biomedical image inspection
    • G06T7/0014 Biomedical image inspection using an image reference approach
    • G06T7/0016 Biomedical image inspection using an image reference approach involving temporal comparison
    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45D HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D44/00 Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
    • A45D2044/007 Devices for determining the condition of hair or skin or for selecting the appropriate cosmetic or hair treatment
    • A HUMAN NECESSITIES
    • A45 HAND OR TRAVELLING ARTICLES
    • A45D HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
    • A45D2200/00 Details not otherwise provided for in A45D
    • A45D2200/10 Details of applicators
    • A45D2200/1063 Removing cosmetic substances, e.g. make-up
    • G PHYSICS
    • G06 COMPUTING; CALCULATING OR COUNTING
    • G06T IMAGE DATA PROCESSING OR GENERATION, IN GENERAL
    • G06T2207/00 Indexing scheme for image analysis or image enhancement
    • G06T2207/30 Subject of image; Context of image processing
    • G06T2207/30004 Biomedical image processing
    • G06T2207/30088 Skin; Dermal

Landscapes

  • Engineering & Computer Science (AREA)
  • Health & Medical Sciences (AREA)
  • General Health & Medical Sciences (AREA)
  • Medical Informatics (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Radiology & Medical Imaging (AREA)
  • Quality & Reliability (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Physics & Mathematics (AREA)
  • General Physics & Mathematics (AREA)
  • Theoretical Computer Science (AREA)
  • Measurement Of The Respiration, Hearing Ability, Form, And Blood Characteristics Of Living Organisms (AREA)

Abstract

The invention relates to a method for evaluating make-up removal efficiency comprising the steps of: - Capturing a first image of a given made-up keratinous portion before make-up removal; - Capturing a second image of said keratinous portion after make-up removal; - Dividing each image into a plurality of evaluation areas; - Determining a pixel value for each evaluation area; - Determining a cleaning efficiency (E) by calculating an average of pixel values differences between similar evaluation areas of the first and the second captured images; - Determining a uniformity index (U) by calculating an average of pixel values differences between neighboring evaluation areas within the second captured image. The method of the invention provides reliable and complete information about the make-up removal means.

Description

"METHOD FOR DETERMINING MAKE-UP REMOVAL EFFICIENCY"
Field of the invention
The present invention relates to a method for assessing the efficiency of make-up removal.
Background of the invention
Make-up can be removed by manual cleansing, by electronic devices such as cleansing brushes, or by impregnated or dry wipes or pads.
In the context of the present invention, make-up is understood to mean facial make-up applied over skin, hair dye, or nail polish. The present invention addresses the evaluation of the efficiency of removal of such make-up.
One known scientific approach to quantify the amount of make-up removed by any of the above means is colorimetry. For instance, US 2011/0088711 or FR 2 909 871 disclose the use of colorimetry to check the efficiency of make-up removal compositions. Notably, a test protocol is disclosed to measure colorimetry differences before make-up application and after make-up removal.
The colorimetry test protocol requires that several areas be marked on the skin (typically four areas of 4×4 cm²) and that each area be subjected to three L*a*b* measurements using a colorimeter (for instance a CR300 colorimeter). The made-up skin colorimetry is compared with the bare skin colorimetry to provide a ΔEmax for each measured area, and the cleaned-up skin colorimetry is compared with the made-up skin colorimetry to provide a ΔE for each measured area. The efficiency of make-up removal is then provided as a percentage corresponding to the average of the values (ΔE/ΔEmax) × 100. The colorimetry measurement protocol is widely used to quantify the make-up removal efficiency of a composition or device such as a wipe or brush. However, this colorimetry protocol does not provide very accurate results.
Moreover, uniformity of make-up removal cannot be quantified using the above-mentioned test protocol.
Summary of the invention
There is a need for a new approach to evaluating the performance of make-up removal means.
In one aspect, the invention relates to an image processing method wherein images of human skin or artificial bio skin are compared before and after make-up removal. More specifically, the invention relates to a method for evaluating make-up removal efficiency comprising the steps of:
- Capturing an image of a keratinous portion after make-up removal;
- Dividing said image into a plurality of evaluation areas;
- Determining a pixel value for each evaluation area;
- Determining a uniformity index by calculating an average of pixel values differences between neighboring evaluation areas.
The invention also relates to a method for evaluating make-up removal efficiency comprising the steps of:
- Capturing a first image of a given made-up keratinous portion before make-up removal;
- Capturing a second image of said given keratinous portion after makeup removal;
- Dividing each image into a plurality of evaluation areas;
- Determining a pixel value for each evaluation area;
- Determining a cleaning efficiency by calculating an average of pixel values differences between similar evaluation areas of the first and the second captured images.
According to an embodiment, the method further comprises the steps of:
- Capturing a reference image of the given keratinous portion before make-up application;
- Dividing the reference image into a plurality of evaluation areas;
- Calculating an average of pixel values differences between similar evaluation areas of the reference and the first captured images;
- Determining a percentage of cleaning efficiency as a percentage of the average of differences between the first and the second captured images over the average of differences between the reference and the first captured images.
According to an embodiment, the pixel value for each evaluation area is between 0 and 255, where value 0 represents a black area and value 255 represents a white area.
According to an embodiment, the images are captured in vitro on dead or reconstructed keratinous materials, and the evaluation areas are defined as image pixels of the captured images.
According to another embodiment, the images are captured in vivo on living human keratinous materials, and the evaluation areas are defined as quadrants on the captured images.
According to the invention, the make-up can be applied on skin, or the make-up can be dye applied on hair, or the make-up can be polish applied on nails.
Brief description of the figures
Further features and advantages of the invention will become apparent upon reading the following description and with reference to the appended figures, which show:
- Figure 1 illustrates the image of a skin portion after applying makeup;
- Figure 2 illustrates the image of said skin portion after removing makeup;
- Figure 3 illustrates the image of said skin portion before applying makeup;
- Figure 4 illustrates division of a captured image into quadrants;
- Figure 5 illustrates a quadrant divided image of a skin portion after applying makeup;
- Figure 6 illustrates a quadrant divided image of said skin portion after removing makeup;
- Figure 7 illustrates a quadrant divided image of said skin portion for make-up removal uniformity determination.
Detailed description of the invention
The method for evaluating make-up removal according to the invention relies on image processing. Figures 1 and 2 show images of a skin portion after applying makeup and after removing makeup respectively. Figure 1 is referred to as a first image of a given made-up skin portion before make-up removal and figure 2 is referred to as a second image of said given skin portion after make-up removal.
Each image is a digital image and can be divided into a two-dimensional array of evaluation areas. Each area can be a unique pixel or a group of pixels depending on the image resolution of the image capturing device. In figures 1 and 2, 16 evaluation areas have been defined, but it is understood that more or fewer than 16 areas can be defined while implementing the present invention.
Each area, in each image, is assigned a pixel number (i = 1, ..., n), with n = 16 (n = a × b) in the illustrated example, where a and b are the numbers of rows and columns. Each area is also assigned a pixel value, ai in Figure 1 and bi in Figure 2 respectively, over a gray scale representing the intensity of light over the given area. In a preferred embodiment, the pixel value is between 0 and 255, where value 0 represents a black area and value 255 represents a white area.
The make-up removal efficiency of a device or substrate can be quantified by two parameters: a Cleaning efficiency (E) defined by the average of absolute value of the difference between corresponding pixel values in the first image after applying makeup and the second image after removing the makeup; and a Uniformity Index (U) defined by the average of absolute value of difference between neighboring pixels in the image after removing the makeup. Higher value of E represents a better performance of makeup removal mean; and lower value of U represents more uniform removal of makeup.
The Cleaning efficiency (E) may be calculated as follows:
E = ( Σ i=1..n |ai − bi| ) / n (1)
with ai and bi the pixel values of each area (pixel) in the first and second images and n the number of areas (pixels) considered in each image.
The Uniformity Index (U) may be calculated as follows:
U = ( Σ |bi − bi+1| ) / ((a − 1) × b) (2)
where the sum runs over neighboring pixel pairs, with bi the pixel values of each area (pixel) in the second image and a and b the numbers of rows and columns (n = a × b) of areas (pixels) considered in the image.
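For illustration, a minimal sketch of how equations (1) and (2) could be computed on two aligned 8-bit grayscale images follows, assuming NumPy arrays as input; the function names are illustrative and not part of the patent:

```python
import numpy as np

def cleaning_efficiency(first, second):
    # Equation (1): mean absolute difference between corresponding pixel
    # values ai (made-up image) and bi (image after make-up removal).
    a = np.asarray(first, dtype=float)
    b = np.asarray(second, dtype=float)
    return np.mean(np.abs(a - b))

def uniformity_index(second):
    # Equation (2): mean absolute difference between horizontally
    # neighboring pixel values in the image after make-up removal.
    b = np.asarray(second, dtype=float)
    return np.mean(np.abs(np.diff(b, axis=1)))
```

A higher cleaning_efficiency then indicates more pigment removed, and a lower uniformity_index indicates more even removal, consistent with the definitions above.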
In an embodiment, the cleaning efficiency can be further expressed as a percentage of cleaning with respect to bare skin before make-up is applied.
To that purpose, a reference image can be captured, illustrated in figure 3. The reference image represents the same skin portion as figures 1 and 2 and was taken before make-up was applied, preferably once the skin portion was cleaned with soap. Similarly to the first and second images (figures 1 and 2), evaluation areas are defined in the reference image (figure 3) and are assigned a pixel number (i = 1, ..., n), with n = 16 in the illustrated example, and a pixel value ci.
An average of pixel values differences between similar evaluation areas of the reference and the first images can thus be calculated as:
Emax = ( Σ i=1..n |ci − ai| ) / n (3)
Cleaning efficiency E' can then be defined as a percentage of the first calculated average over the second calculated average:
E' = E/Emax × 100 (4)
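Continuing the sketch above (again an illustration, with `reference` standing for the bare-skin image of figure 3):

```python
def cleaning_efficiency_percentage(reference, first, second):
    # Equation (3): Emax is the mean absolute difference between the
    # bare-skin reference image (ci) and the made-up image (ai).
    c = np.asarray(reference, dtype=float)
    a = np.asarray(first, dtype=float)
    e_max = np.mean(np.abs(c - a))
    # Equation (4): E' = E / Emax * 100.
    return cleaning_efficiency(first, second) / e_max * 100
```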
The images may be captured in vitro on reconstructed skin, in which case the evaluation areas are defined as image pixels of the captured images as described above. The reconstructed skin can be bioskin® or any other artificial skin for evaluation purposes.
When the method of the invention is to be applied, one needs to make sure that the images of the skin portion before and after removing the make-up are captured at exactly the same position, so that the correspondence between pixels can be established when calculating the Cleaning efficiency (E) and the Uniformity Index (U) as defined above. In other words, the position of corresponding pixels should not change between the first and second images, or between the first and reference images. This can be quite easily achieved when performing an in vitro test.
The images may also be captured in vivo on living human skin, in which case evaluation areas are defined as quadrants on the captured image. Indeed, when performing an in vivo test using instruments such as the Chromasphere, it is difficult to capture two images maintaining the exact same position of corresponding pixels.
Therefore, each image can be divided into a number of windows (4, 9, 16, ..., k²) as illustrated in figure 4 to define quadrants that can be substantially equivalent from one image to the other.
Figures 5 and 6 illustrate the calculation of the two parameters of Cleaning efficiency (E) and Uniformity Index (U) when using the quadrant decomposition of the captured images.
Each image is a digital image and is divided into a two-dimensional array of evaluation areas. Each evaluation area is a quadrant defined on the captured image and each quadrant includes a group of pixels. In figures 5 and 6, four evaluation areas have been defined, each having 16 pixels, but it is understood that more or fewer than four areas, each including more or fewer than 16 pixels, can be defined while implementing the present invention.
In each image, each pixel is assigned a pixel number (i = 1, ..., n) and each quadrant is assigned a quadrant number (j = 1, ..., k). Each pixel in each quadrant is also assigned a pixel value pi, qi, ri, si in the first image and Pi, Qi, Ri, Si in the second image, as defined above with respect to ai and bi.
The Cleaning efficiency (E) may then be calculated as follows:
E = ( Σ N=1..k |AN − BN| ) / (k × n) (5)
with AN and BN the sums of the pixel values (pi and Pi respectively) over a given quadrant of the first and second images, n the number of pixels in each quadrant, and k the number of quadrants considered in each image.
When considering the quadrant decomposition of captured images, the Uniformity Index (U) of the make-up removal can be defined as the average of the differences between the sums of pixel values of neighboring quadrants in the image of the skin captured after removing the make-up, as illustrated in figure 7.
The Uniformity Index (U) may then be calculated as follows:
U = ( Σ N=1..k−1 |BN − BN+1| ) / (k − 1) (6)
with BN the pixel value of each quadrant (defined as the sum of the values of each pixel in the given quadrant) in the second image and k the number of areas (quadrants) considered in the image.
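A sketch of the quadrant-based calculations is given below. Because the normalisation of equation (5) is reconstructed from a damaged original, the division by k × n is an assumption, as is the `k_side` parameter (number of quadrants per image side):

```python
def quadrant_sums(image, k_side):
    # Split the image into k_side x k_side quadrants (row-major order)
    # and return the sum of pixel values in each quadrant.
    img = np.asarray(image, dtype=float)
    return np.array([block.sum()
                     for band in np.array_split(img, k_side, axis=0)
                     for block in np.array_split(band, k_side, axis=1)])

def quadrant_cleaning_efficiency(first, second, k_side=2):
    # Equation (5) as reconstructed: average absolute difference between
    # quadrant sums AN and BN, normalised by the pixels per quadrant.
    a_sums = quadrant_sums(first, k_side)
    b_sums = quadrant_sums(second, k_side)
    n = np.asarray(first).size // (k_side * k_side)  # pixels per quadrant
    return np.mean(np.abs(a_sums - b_sums)) / n

def quadrant_uniformity_index(second, k_side=2):
    # Equation (6): average absolute difference between neighboring
    # quadrant sums BN, taking the quadrants in row-major order.
    b_sums = quadrant_sums(second, k_side)
    return np.mean(np.abs(np.diff(b_sums)))
```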
In the examples illustrated above, the uniformity index was calculated in the horizontal direction only (average of the absolute value of the difference between horizontally neighboring pixels). In another embodiment of this method, the uniformity index can also be calculated in the vertical direction (average of the absolute value of the difference between vertically neighboring pixels). In yet another embodiment of this invention, an overall uniformity index can be defined as the average of the uniformity indices in the horizontal and vertical directions.
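That overall index could be sketched as the plain average of the horizontal and vertical pixel-level indices (again illustrative):

```python
def overall_uniformity_index(second):
    # Average of the uniformity indices computed between horizontal
    # (axis=1) and vertical (axis=0) neighboring pixels.
    b = np.asarray(second, dtype=float)
    u_h = np.mean(np.abs(np.diff(b, axis=1)))  # horizontal neighbors
    u_v = np.mean(np.abs(np.diff(b, axis=0)))  # vertical neighbors
    return (u_h + u_v) / 2
```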
An in vivo example of implementation of the invention is given below, where images of a human cheek were captured and a corresponding area of 200 x 200 pixels was selected on each image for the evaluation:
- an image is taken of a portion of the left cheek of the human before applying make-up;
- an image is taken of a portion of the left cheek of the human after applying make-up;
- an image is taken of a portion of the left cheek of the human after removing make-up with make-up removal wipes;
- an image is taken of a portion of the right cheek of the human before applying make-up;
- an image is taken of a portion of the right cheek of the human after applying make-up;
- an image is taken of a portion of the right cheek of the human after removing make-up with cleansing brushes.
The images were captured using a Chromasphere instrument, which captures digital images of skin under standardized light. Standard foundation and blusher were applied to both the left and right cheeks of the consumer. On the left side, the make-up was removed using a make-up remover wipe comprising a nonwoven substrate and a formulation for removing the make-up. On the right cheek, the make-up was removed with an electronic oscillating cleansing brush and the same formulation as used for the wipes.
The following results were obtained using the quadrant method described in this invention:
[Results table not reproduced: the original document shows a table of cleaning efficiency and horizontal/vertical uniformity index values for the wipe and the cleansing brush.]
The method of the invention allows drawing the following conclusions:
- In the above example, the cleansing efficiency of the cleansing brush is better than that of the cleansing wipes;
- Cleansing brushes remove the make-up more uniformly in the horizontal direction (lower value of uniformity index);
- Wipes remove the make-up more uniformly in the vertical direction (lower value of uniformity index);
- Overall, cleansing brushes remove make-up more uniformly than wipes.
The method of the invention considers each micro evaluation area (pixel) and is therefore more reliable when compared to classical colorimetry methods; it also provides additional information about the uniform performance of a make-up removal means, which information is not available when using colorimetry.
The method of the invention is also easy to operate and not very time consuming. When implementing the method of the invention, one can define either the two parameters consisting of the Cleaning efficiency (E) and the Uniformity Index (U), or only one of the two parameters. While the invention was described in conjunction with the detailed description above and the illustrated figures, the foregoing description is only intended to illustrate and not to limit the scope of the invention, which is defined by the scope of the appended claims. Other aspects, advantages and modifications are within the claims. Notably, the description was mainly made with respect to skin make-up removal efficiency, but the same approach can be used for any other keratinous portion to which color is applied, such as hair dye or nail polish removal.

Claims

1. A method for evaluating make-up removal efficiency comprising the steps of:
- Capturing an image of a keratinous portion after make-up removal;
- Dividing said image into a plurality of evaluation areas;
- Determining a pixel value for each evaluation area;
- Determining a uniformity index (U) by calculating an average of pixel values differences between neighboring evaluation areas.
2. A method for evaluating make-up removal efficiency comprising the steps of:
- Capturing a first image of a given made-up keratinous portion before make-up removal;
- Capturing a second image of said given keratinous portion after make-up removal;
- Dividing each image into a plurality of evaluation areas;
- Determining a pixel value for each evaluation area;
- Determining a cleaning efficiency (E) by calculating an average of pixel values differences between similar evaluation areas of the first and the second captured images.
3. The method of claim 2, further comprising the steps of:
- Capturing a reference image of the given keratinous portion before make-up application;
- Dividing the reference image into a plurality of evaluation areas;
- Calculating an average (Emax) of pixel values differences between similar evaluation areas of the reference and the first captured images;
- Determining a percentage of cleaning efficiency (E') as a percentage of the average of differences between the first and the second captured images over the average of differences between the reference and the first captured images.
4. The method of any one of preceding claims, wherein the pixel value for each evaluation area is between 0 and 255, where value 0 represents a black area and value 255 represents a white area.
5. The method of any one of claims 1 to 4, wherein the images are captured in vitro on dead or reconstructed keratinous materials, and wherein the evaluation areas are defined as image pixels of the captured images.
6. The method of any one of claims 1 or 2, wherein the images are captured in vivo on living human keratinous materials, and wherein the evaluation areas are defined as quadrants on the captured images.
7. The method of any one of claims 1 to 6, wherein the make-up is applied on skin.
8. The method of any one of claims 1 to 6, wherein the make-up is dye applied on hair.
9. The method of any one of claims 1 to 6, wherein the make-up is polish applied on nails.
PCT/IB2012/057740 2012-12-27 2012-12-27 Method for determining make-up removal efficiency WO2014102567A1 (en)

Priority Applications (1)

Application Number Priority Date Filing Date Title
PCT/IB2012/057740 WO2014102567A1 (en) 2012-12-27 2012-12-27 Method for determining make-up removal efficiency

Applications Claiming Priority (1)

Application Number Priority Date Filing Date Title
PCT/IB2012/057740 WO2014102567A1 (en) 2012-12-27 2012-12-27 Method for determining make-up removal efficiency

Publications (1)

Publication Number Publication Date
WO2014102567A1 true WO2014102567A1 (en) 2014-07-03

Family

ID=47844406

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/IB2012/057740 WO2014102567A1 (en) 2012-12-27 2012-12-27 Method for determining make-up removal efficiency

Country Status (1)

Country Link
WO (1) WO2014102567A1 (en)

Cited By (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109855927A * 2017-11-20 2019-06-07 株式会社爱茉莉太平洋 Pore imitation, method for evaluating substance having skin cleaning ability, and method for screening substance having skin cleaning ability
CN111024694A (en) * 2019-12-10 2020-04-17 上海发那科机器人有限公司 System and method for detecting wiping effect of ostrich hair wiping equipment on vehicle body
CN111091029A (en) * 2018-10-24 2020-05-01 宁波方太厨具有限公司 Method for evaluating uniformity of baked food in oven
CN113554623A (en) * 2021-07-23 2021-10-26 江苏医像信息技术有限公司 Intelligent quantitative analysis method and system for human face skin

Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19846530A1 (en) * 1998-10-09 2000-04-13 Henkel Kgaa Controlling the distribution of structures on a surface or of particles in the space comprises producing a two-dimensional image of the distribution, breaking down into
FR2909871A1 (en) 2006-12-13 2008-06-20 Oreal Use of 2-ethylhexyl salicylate as a make-up remover
JP2011080915A (en) * 2009-10-08 2011-04-21 Shiseido Co Ltd Method, device and program for evaluating application unevenness of skin care preparation
US20110088711A1 (en) 2009-06-24 2011-04-21 L'oreal Wipe with an emulsion containing a thickening polymer and a hydrophobic modified inulin

Patent Citations (4)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
DE19846530A1 (en) * 1998-10-09 2000-04-13 Henkel Kgaa Controlling the distribution of structures on a surface or of particles in the space comprises producing a two-dimensional image of the distribution, breaking down into
FR2909871A1 (en) 2006-12-13 2008-06-20 Oreal Use of 2-ethylhexyl salicylate as a make-up remover
US20110088711A1 (en) 2009-06-24 2011-04-21 L'oreal Wipe with an emulsion containing a thickening polymer and a hydrophobic modified inulin
JP2011080915A (en) * 2009-10-08 2011-04-21 Shiseido Co Ltd Method, device and program for evaluating application unevenness of skin care preparation

Non-Patent Citations (2)

* Cited by examiner, † Cited by third party
Title
LEI JI ET AL: "An Agreement Coefficient for Image Comparison", PHOTOGRAMMETRIC ENGINEERING & REMOTE SENSING, vol. 72, no. 7, 1 July 2006 (2006-07-01), pages 823 - 833, XP055076533 *
MARAGOS P: "Morphological correlation and mean absolute error criteria", 23 May 1989 (1989-05-23), pages 1568 - 1571, XP010082648 *

Cited By (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
CN109855927A * 2017-11-20 2019-06-07 株式会社爱茉莉太平洋 Pore imitation, method for evaluating substance having skin cleaning ability, and method for screening substance having skin cleaning ability
CN109855927B (en) * 2017-11-20 2024-02-02 株式会社爱茉莉太平洋 Pore imitation, method for evaluating substance having skin cleaning ability, and method for screening substance having skin cleaning ability
CN111091029A (en) * 2018-10-24 2020-05-01 宁波方太厨具有限公司 Method for evaluating uniformity of baked food in oven
CN111091029B (en) * 2018-10-24 2022-05-17 宁波方太厨具有限公司 Method for evaluating uniformity of baked food in oven
CN111024694A (en) * 2019-12-10 2020-04-17 上海发那科机器人有限公司 System and method for detecting wiping effect of ostrich hair wiping equipment on vehicle body
CN113554623A (en) * 2021-07-23 2021-10-26 江苏医像信息技术有限公司 Intelligent quantitative analysis method and system for human face skin

Similar Documents

Publication Publication Date Title
JP6086573B2 (en) Method and apparatus for characterizing pigmented spots and its application in methods for evaluating the coloring or depigmenting effect of cosmetic, skin or pharmaceutical products
JP5885344B2 (en) Method for characterizing skin or skin tone
WO2014102567A1 (en) Method for determining make-up removal efficiency
JP2007252891A (en) Estimation method of evaluation value by visual recognition of beauty of skin
JP3351958B2 (en) Skin evaluation method
JP3426052B2 (en) Skin evaluation device
US20110142305A1 (en) Targeted image transformation of skin attribute
JP2009082338A (en) Skin discrimination method using entropy
Azemin et al. GLCM texture analysis on different color space for pterygium grading
JP5635762B2 (en) Method for calculating nipple shape or collagen-like structure
EP2400458A3 (en) Method and device for segmenting biological cells in an image
JP2014064896A5 (en)
JP2944309B2 (en) Skin surface morphological feature detection method
KR102239575B1 (en) Apparatus and Method for skin condition diagnosis
JP2011212307A (en) Cosmetic-applied skin evaluation device, cosmetic-applied skin evaluation method, and cosmetic-applied skin evaluation program
JP2007252892A (en) Estimation method of evaluation value by visual recognition of three-dimensional shape of skin surface
JP7213911B2 (en) Cosmetic evaluation method and cosmetic design method
JP6036782B2 (en) Method and apparatus for evaluating gloss of cosmetics
JP2015064823A (en) Cosmetic evaluation method, and facial expression wrinkle quantitation method
JP6967767B2 (en) Makeup breakage evaluation method and makeup breakage evaluation model
KR20120050060A (en) Quantitative estimating method to skin adhesion of cosmetics goods
KR20160097760A (en) Method For Evaluating Application Uniformity of Makeup Cosmetics
JP5343882B2 (en) A method for distinguishing the degree of delamination of stratum corneum cells
JP2017063904A (en) Skin evaluation apparatus, skin evaluation method, and skin evaluation program
JP6200767B2 (en) Skin internal structure estimation method and skin color unevenness evaluation method

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 12830887

Country of ref document: EP

Kind code of ref document: A1

NENP Non-entry into the national phase

Ref country code: DE

122 Ep: pct application non-entry in european phase

Ref document number: 12830887

Country of ref document: EP

Kind code of ref document: A1