WO2014196532A1 - Transparency Evaluation Apparatus, Transparency Evaluation Method, and Transparency Evaluation Program
- Publication number
- WO2014196532A1 (PCT/JP2014/064746; application JP2014064746W)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- evaluation
- index
- skin
- component
- transparency
- Prior art date
Images
Classifications
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D44/00—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/103—Detecting, measuring or recording devices for testing the shape, pattern, colour, size or movement of the body or parts thereof, for diagnostic purposes
- A61B5/1032—Determining colour for diagnostic purposes
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/27—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands using photo-electric detection ; circuits for computing concentration
-
- A—HUMAN NECESSITIES
- A45—HAND OR TRAVELLING ARTICLES
- A45D—HAIRDRESSING OR SHAVING EQUIPMENT; EQUIPMENT FOR COSMETICS OR COSMETIC TREATMENTS, e.g. FOR MANICURING OR PEDICURING
- A45D44/00—Other cosmetic or toiletry articles, e.g. for hairdressers' rooms
- A45D2044/007—Devices for determining the condition of hair or skin or for selecting the appropriate cosmetic or hair treatment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B2576/00—Medical imaging apparatus involving image processing or analysis
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/743—Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01J—MEASUREMENT OF INTENSITY, VELOCITY, SPECTRAL CONTENT, POLARISATION, PHASE OR PULSE CHARACTERISTICS OF INFRARED, VISIBLE OR ULTRAVIOLET LIGHT; COLORIMETRY; RADIATION PYROMETRY
- G01J3/00—Spectrometry; Spectrophotometry; Monochromators; Measuring colours
- G01J3/46—Measurement of colour; Colour measuring devices, e.g. colorimeters
- G01J2003/467—Colour computing
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/25—Colour; Spectral properties, i.e. comparison of effect of material on the light at two or more different wavelengths or wavelength bands
- G01N21/251—Colorimeters; Construction thereof
-
- G—PHYSICS
- G01—MEASURING; TESTING
- G01N—INVESTIGATING OR ANALYSING MATERIALS BY DETERMINING THEIR CHEMICAL OR PHYSICAL PROPERTIES
- G01N21/00—Investigating or analysing materials by the use of optical means, i.e. using sub-millimetre waves, infrared, visible or ultraviolet light
- G01N21/17—Systems in which incident light is modified in accordance with the properties of the material investigated
- G01N21/55—Specular reflectivity
- G01N21/57—Measuring gloss
-
- G—PHYSICS
- G16—INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR SPECIFIC APPLICATION FIELDS
- G16H—HEALTHCARE INFORMATICS, i.e. INFORMATION AND COMMUNICATION TECHNOLOGY [ICT] SPECIALLY ADAPTED FOR THE HANDLING OR PROCESSING OF MEDICAL OR HEALTHCARE DATA
- G16H30/00—ICT specially adapted for the handling or processing of medical images
- G16H30/40—ICT specially adapted for the handling or processing of medical images for processing medical images, e.g. editing
Definitions
- The present invention relates to a transparency evaluation apparatus, a transparency evaluation method, and a transparency evaluation program, and in particular to a transparency evaluation apparatus, a transparency evaluation method, and a transparency evaluation program for evaluating the transparency of bare skin or makeup skin based on a photographed image of that skin.
- As physical measurement methods of the skin related to transparency, methods of measuring skin conditions such as moisture content, oil content, and texture shape, and methods of measuring optical properties of the skin such as specular reflection and internal scattering have been proposed, and the transparency of the skin is evaluated based on these physical quantities.
- However, these methods measure only local physical quantities of the skin. Such quantities do not directly represent transparency, which is a sensation perceived when viewing the skin as a whole, so it has been difficult to evaluate transparency accurately from them.
- Moreover, Patent Document 1 does not evaluate transparency from values obtained by physical measurement, and can hardly be called an objective evaluation method.
- Furthermore, the evaluation method of Patent Document 1 and the like evaluates the transparency of bare skin; when transparency is evaluated in the same way for skin to which makeup has been applied (makeup skin), a gap arises between the evaluation results and sensory evaluation.
- The present invention has been made to solve these conventional problems, and an object thereof is to provide a transparency evaluation apparatus, a transparency evaluation method, and a transparency evaluation program that can objectively evaluate the transparency of skin in accordance with the sensation experienced when the skin is viewed as a whole.
- Another object of the present invention is to provide a transparency evaluation apparatus, a transparency evaluation method, and a transparency evaluation program that can accurately evaluate the transparency of makeup skin.
- In order to achieve the above object, the transparency evaluation apparatus according to the present invention includes: an image input unit that inputs a photographed image of a subject's skin; and a skin index calculation unit that calculates, as a first skin evaluation index, at least one of a representative value of the brightness component in the photographed image, a representative value of the color component in the photographed image, and the generation amount of negative factors (portions where the value of the brightness component or the color component changes locally), and that obtains at least one of the intensity distribution of the brightness component and the intensity distribution of the color component in the photographed image and calculates, as a second skin evaluation index, at least one of the smoothness of change of the brightness component and the smoothness of change of the color component based on the obtained intensity distribution.
- Preferably, in order to obtain the intensity distribution of the brightness component or the color component in the photographed image, the skin index calculation unit partitions the photographed image according to the value of the brightness component or the color component with a plurality of contour lines set in stages at regular intensity intervals.
- The skin index calculation unit can obtain the intervals between adjacent contour lines and calculate the second skin evaluation index based on the uniformity of those intervals.
- the skin index calculation unit may calculate the second skin evaluation index based on the number of contour lines that divide the captured image.
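As an illustration of how such a contour-based second skin evaluation index might be computed (a minimal sketch, not the patented implementation; the function name, the 1-D profile input, and the L* step of 2 are assumptions for illustration), an L* profile sampled across the evaluation area can be staged into bands at a fixed intensity interval, with band boundaries acting as the contour lines:

```python
import numpy as np

def gradation_smoothness(l_profile, step=2.0):
    """Count contour crossings along a 1-D L* profile and measure how
    evenly they are spaced (lower 'uniformity' = smoother gradation)."""
    bands = np.floor(np.asarray(l_profile, dtype=float) / step)  # stage into L* bands
    crossings = np.flatnonzero(np.diff(bands) != 0)              # contour-line positions
    n_contours = int(crossings.size)
    if n_contours < 2:
        return n_contours, 0.0
    gaps = np.diff(crossings)
    uniformity = float(gaps.std() / gaps.mean())  # 0.0 when contours are evenly spaced
    return n_contours, uniformity
```

A smoothly graded profile yields evenly spaced crossings (uniformity near 0), while abrupt brightness changes bunch the contour lines together.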
- Preferably, the skin index calculation unit sets the evaluation area for calculating the second skin evaluation index so as to extend linearly from the cheek portion of the subject's face to the contour portion of the face.
- The skin index calculation unit can calculate the average value of the brightness component in the captured image as the representative value of the brightness component, and the average value of the color component as the representative value of the color component. It can also calculate the number, total area, or area ratio of negative factors detected from the captured image as the generation amount of negative factors.
- The skin index calculation unit may also detect, as color unevenness, portions where the value of the brightness component or the color component changes locally over an area larger than the negative factors, and calculate the detected amount of color unevenness as a third skin evaluation index; the comprehensive index calculation unit can then calculate the comprehensive index by combining a plurality of evaluation indexes further including the third skin evaluation index. The skin index calculation unit can calculate the total area, area ratio, or number of color unevenness regions detected from the captured image as the generation amount of color unevenness.
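A rough numeric sketch of such a "generation amount" (an illustrative simplification under an assumed threshold, not the patent's detection algorithm, and with a hypothetical function name): dark negative-factor pixels can be thresholded against the regional mean brightness and reported as an area ratio:

```python
import numpy as np

def negative_factor_area_ratio(l_img, drop=10.0):
    """Illustrative 'generation amount': area ratio of pixels whose L* value
    falls more than `drop` below the mean brightness of the evaluation region.
    (A real detector would also separate small negative factors such as spots
    and pores from the larger color-unevenness regions, e.g. by size.)"""
    baseline = l_img.mean()            # representative brightness of the region
    mask = l_img < (baseline - drop)   # locally dark pixels
    return mask.sum() / mask.size      # area ratio in the evaluation region
```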
- In the transparency evaluation method according to the present invention, a photographed image of a subject's skin is input; at least one of a representative value of the brightness component, a representative value of the color component, and the generation amount of negative factors (portions where the value of the brightness component or the color component changes locally) in the photographed image is calculated as a first skin evaluation index; and at least one of the intensity distribution of the brightness component and the intensity distribution of the color component in the photographed image is obtained.
- The transparency evaluation program causes a computer to execute: a step of acquiring a photographed image of a subject's skin; a step of calculating, as a first skin evaluation index, at least one of a representative value of the brightness component, a representative value of the color component, and the generation amount of negative factors (portions where the value of the brightness component or the color component changes locally) in the photographed image, obtaining at least one of the intensity distribution of the brightness component and the intensity distribution of the color component, and calculating, as a second skin evaluation index, at least one of the smoothness of change of the brightness component and the smoothness of change of the color component based on the obtained intensity distribution; a step of calculating a comprehensive index for skin transparency by combining a plurality of evaluation indexes including the calculated first and second skin evaluation indexes; and a step of evaluating the transparency of the subject's skin based on the calculated comprehensive index.
- The transparency evaluation device for makeup skin includes: an image input unit that inputs a photographed image of the face of a subject to whom makeup has been applied; a skin index calculation unit that calculates, as a first skin evaluation index, at least one of a representative value of the brightness component, a representative value of the color component, and the generation amount of negative factors (portions where the value of the brightness component or the color component changes locally) in the photographed image; a makeup index calculation unit that calculates the amount of the red component caused by blood color in the photographed image as a first makeup evaluation index; a comprehensive index calculation unit that calculates a comprehensive index for transparency by combining a plurality of evaluation indexes including the first skin evaluation index and the first makeup evaluation index; and a transparency evaluation unit that evaluates the transparency of the made-up face of the subject based on the calculated comprehensive index.
- Preferably, the makeup index calculation unit calculates, as the amount of the red component, the average value of the red component in the captured image, or the area or area ratio of the portion where the red component is detected. It is also preferable that the makeup index calculation unit sets a predetermined evaluation region in at least one of the between-the-brows, cheek, and chin portions of the subject's face and calculates the amount of the red component within that region.
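A minimal sketch of this first makeup evaluation index (illustrative only; the function name, region mask, and red threshold are assumptions): in an L*a*b* image the a* channel carries the red-green axis, so the mean a* value and the area ratio of strongly red pixels inside the evaluation region can stand in for the amount of the red component:

```python
import numpy as np

def red_component_amount(a_img, region_mask, red_threshold=15.0):
    """Mean a* (red-green) value and area ratio of strongly red pixels
    inside a facial evaluation region (e.g. between-the-brows, cheek, chin)."""
    region = a_img[region_mask]
    mean_a = float(region.mean())
    red_ratio = float((region > red_threshold).mean())
    return mean_a, red_ratio
```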
- The makeup index calculation unit may further detect spot portions where the value of the brightness component or the color component changes locally in the photographed image and calculate the hue difference between each spot portion and its surroundings as a second makeup evaluation index; the comprehensive index calculation unit can then calculate the comprehensive index by combining a plurality of evaluation indexes further including the second makeup evaluation index.
- The makeup index calculation unit may detect, as concavo-convex portions, low-luminance portions indicating shadows on the subject's face based on the value of the brightness component in the photographed image, and calculate the amount of these concavo-convex portions as a third makeup evaluation index; the comprehensive index calculation unit can then calculate the comprehensive index by combining a plurality of evaluation indexes further including the third makeup evaluation index. The makeup index calculation unit calculates the area or area ratio of the concavo-convex portions as their amount, and preferably sets a predetermined region in at least one of the eye area and the area extending from the nose to the mouth of the subject's face and calculates the amount of concavo-convex portions within that region.
- The makeup index calculation unit may also extract the makeup-derived brightness component or color component from the photographed image based on the brightness or color component values that differ between skin and makeup, detect portions where the extracted makeup-derived component is non-uniform, and calculate this makeup non-uniformity as a fourth makeup evaluation index; the comprehensive index calculation unit can then calculate the comprehensive index by combining a plurality of evaluation indexes further including the fourth makeup evaluation index.
- the makeup index calculation unit sets a predetermined area on the cheek portion of the subject and calculates the makeup non-uniformity in the predetermined area.
- The makeup index calculation unit may further detect glossy portions indicating the shininess of the subject's face based on the intensity of the brightness component in the photographed image and calculate the amount of these glossy portions as a fifth makeup evaluation index; the comprehensive index calculation unit can then calculate the comprehensive index by combining a plurality of evaluation indexes further including the fifth makeup evaluation index. Preferably, the makeup index calculation unit sets a predetermined region in at least one of the cheekbone and nose-bridge portions of the subject's face and calculates the amount of glossy portions within that region.
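The third and fifth makeup evaluation indices described above are both luminance-threshold detections, in opposite directions. The following is a hedged sketch of that idea (the function name, the fixed thresholds, and the use of the regional mean as baseline are illustrative assumptions, not the claimed method):

```python
import numpy as np

def shadow_and_shine_amounts(l_img, shadow_drop=15.0, shine_rise=15.0):
    """Area ratios of low-luminance (shadow / concavo-convex) and
    high-luminance (shine / gloss) pixels relative to the regional mean,
    as illustrative third and fifth makeup evaluation indices."""
    baseline = l_img.mean()
    shadow_ratio = float((l_img < baseline - shadow_drop).mean())
    shine_ratio = float((l_img > baseline + shine_rise).mean())
    return shadow_ratio, shine_ratio
```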
- The skin index calculation unit can also obtain at least one of the intensity distribution of the brightness component and the intensity distribution of the color component in the photographed image and calculate, as a second skin evaluation index, at least one of the smoothness of change of the brightness component and the smoothness of change of the color component based on the obtained intensity distribution; the comprehensive index calculation unit can then calculate the comprehensive index by combining a plurality of evaluation indexes further including the second skin evaluation index.
- The skin index calculation unit may likewise detect, as color unevenness, portions where the value of the brightness component or the color component changes locally over an area larger than the negative factors, and calculate the detected amount of color unevenness as a third skin evaluation index; the comprehensive index calculation unit can then calculate the comprehensive index by combining a plurality of evaluation indexes further including the third skin evaluation index.
- In the transparency evaluation method for makeup skin, a photographed image of the face of a subject to whom makeup has been applied is input; at least one of a representative value of the brightness component, a representative value of the color component, and the generation amount of negative factors (portions where the value of the brightness component or the color component changes locally) in the photographed image is calculated as a first skin evaluation index; the amount of the red component caused by blood color in the photographed image is calculated as a first makeup evaluation index; a comprehensive index for transparency is calculated by combining a plurality of evaluation indexes including the calculated first skin evaluation index and first makeup evaluation index; and the transparency of the made-up face of the subject is evaluated based on the calculated comprehensive index.
- The transparency evaluation program for makeup skin causes a computer to execute: a step of acquiring a photographed image of the face of a subject to whom makeup has been applied; a step of calculating, as a first skin evaluation index, at least one of a representative value of the brightness component, a representative value of the color component, and the generation amount of negative factors (portions where the value of the brightness component or the color component changes locally) in the photographed image; a step of calculating the amount of the red component caused by blood color as a first makeup evaluation index; a step of calculating a comprehensive index for transparency by combining a plurality of evaluation indexes including the calculated first skin evaluation index and first makeup evaluation index; and a step of evaluating the transparency of the made-up face of the subject based on the calculated comprehensive index.
- According to the present invention, transparency is evaluated on the basis of a comprehensive index that combines a first evaluation index, which captures at least one of the overall brightness component value, the overall color component value, and the generation amount of negative factors, with a second evaluation index, which captures at least one of the smoothness of change of the brightness component and the smoothness of change of the color component. This makes it possible to evaluate the transparency of the skin objectively, in accordance with the sensation experienced when the skin is viewed as a whole.
- Furthermore, since the first skin evaluation index is calculated and the amount of the red component in the photographed image is calculated as the first makeup evaluation index, the transparency of the face of a subject to whom makeup has been applied can be evaluated with high accuracy.
- FIG. 1 is a block diagram showing the structure of the transparency evaluation apparatus according to Embodiment 1 of the present invention, and a further figure shows the evaluation region set on the subject's face.
- L* component contour line distribution images are shown: (a) is an image of a subject with high transparency, and (b) is an image of a subject with low transparency.
- The change of the L* component in the evaluation region set in the L* component contour line distribution image is shown: (a) shows the change for a subject with high transparency, and (b) shows the change for a subject with low transparency.
- FIG. 3 is a diagram showing the correlation between the comprehensive index used in Embodiment 1 and the sensory evaluation value.
- FIG. 6 is a block diagram illustrating the configuration of the index calculation unit of the transparency evaluation apparatus according to Embodiment 2.
- FIG. 10 is a block diagram illustrating a configuration of an index calculation unit of a transparency evaluation apparatus according to a modification of the second embodiment.
- FIG. 10 is a block diagram illustrating a configuration of an index calculation unit of a transparency evaluation apparatus according to Embodiment 3.
- FIG. 10 is a diagram showing the correlation between the comprehensive index used in Embodiment 3 and the sensory evaluation value, together with a block diagram showing the structure of the index calculation unit.
- FIG. 10 shows L * component contour line distribution images in Embodiment 5, where (a) shows an image of a subject with high transparency and (b) shows an image of a subject with low transparency.
- The change of the L* component in the evaluation region set in the L* component contour line distribution image in Embodiment 5 is shown: (a) shows the change for a subject with high transparency, and (b) shows the change for a subject with low transparency. A further figure shows the evaluation region.
- An a* component image of makeup skin according to Embodiment 5 is shown, where (a) is an image of a subject with high transparency and (b) is an image of a subject with low transparency.
- Images in which the low-luminance portions are extracted from the L* component image are shown: (a) is an image of a subject with high transparency, and (b) is an image of a subject with low transparency.
- A block diagram shows the structure of the makeup index calculation unit, and a further figure shows the evaluation region set on the subject's face.
- FIG. 7 shows binarized images in which non-uniform portions of makeup are detected in Embodiment 6: (a) is an image of a subject with few non-uniform portions of makeup, and (b) is an image of a subject with many non-uniform portions of makeup.
- FIG. 25 is a block diagram illustrating the configuration of the makeup index calculation unit according to a modification of Embodiment 6.
- FIG. 1 shows the configuration of a transparency evaluation apparatus that performs the transparency evaluation method according to Embodiment 1 of the present invention.
- the transparency evaluation apparatus evaluates the transparency of the subject's face F using a photographed image obtained by photographing the subject's face F with the camera C, and includes an image input unit (not shown) connected to the camera C.
- a pre-processing unit 1, a color space conversion unit 2, an index calculation unit 3, a comprehensive index calculation unit 4, a transparency evaluation unit 5, and a display unit 6 are sequentially connected to the input unit.
- a control unit 7 is connected to the color space conversion unit 2, the index calculation unit 3, the comprehensive index calculation unit 4, and the transparency evaluation unit 5, and an operation unit 8 is connected to the control unit 7.
- the preprocessing unit 1 performs preprocessing such as light amount correction and noise removal on the captured image input from the camera C via the image input unit.
- the captured image input from the camera C has an RGB color space.
- the camera C only needs to be able to photograph the subject's face F, and a digital camera, a CCD camera, or the like can be used.
- a photographed image photographed with a mobile phone such as a smartphone can be used.
- the color space conversion unit 2 converts the color space of the photographed image input from the preprocessing unit 1 to generate a color space conversion image.
- As the color space conversion image, for example, an image converted into the L*a*b* color space, the LCH color space, the YCC color space, or the like can be used.
- When converting to the L*a*b* color space, the color space conversion unit 2 can use a D65 light source as the calculation light source. The color space conversion unit 2 then divides the generated color space conversion image into a brightness component (luminance component) and color components to generate a brightness component image and a color component image. Specifically, for a color space conversion image in the L*a*b* color space, the brightness component is the L* component, and the color components are the a* component (complementary color component corresponding to red-green), the b* component (complementary color component corresponding to yellow-blue), the C* component (saturation component), the hue component, and the like.
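The standard sRGB to L*a*b* conversion under the D65 illuminant mentioned above can be written out directly. This is a self-contained numpy sketch of the well-known formulas (production code would typically use a color library instead):

```python
import numpy as np

# D65 reference white (CIE 1931 2-degree observer)
D65 = np.array([0.95047, 1.00000, 1.08883])

# Linear sRGB -> XYZ matrix for D65
M = np.array([[0.4124564, 0.3575761, 0.1804375],
              [0.2126729, 0.7151522, 0.0721750],
              [0.0193339, 0.1191920, 0.9503041]])

def srgb_to_lab(rgb):
    """Convert sRGB values in [0, 1] (shape (..., 3)) to CIE L*a*b* (D65)."""
    rgb = np.asarray(rgb, dtype=float)
    # 1. Undo the sRGB gamma to get linear light
    lin = np.where(rgb <= 0.04045, rgb / 12.92, ((rgb + 0.055) / 1.055) ** 2.4)
    # 2. Linear RGB -> XYZ, normalized by the D65 white point
    xyz = lin @ M.T / D65
    # 3. XYZ -> L*a*b* via the cube-root compression
    eps = (6 / 29) ** 3
    f = np.where(xyz > eps, np.cbrt(xyz), xyz / (3 * (6 / 29) ** 2) + 4 / 29)
    L = 116 * f[..., 1] - 16
    a = 500 * (f[..., 0] - f[..., 1])
    b = 200 * (f[..., 1] - f[..., 2])
    return np.stack([L, a, b], axis=-1)
```

Pure white maps to L* of about 100 with a* and b* near zero, and pure black to L* of 0, which matches the role of L* as the brightness component described above.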
- the index calculation unit 3 includes a brightness / color calculation unit 9 and a gradation characteristic calculation unit 10 connected to the color space conversion unit 2.
- the brightness / color calculation unit 9 sets an evaluation region R1 on the face F of the subject with respect to the brightness component image and the color component image generated by the color space conversion unit 2.
- the evaluation region R1 can be set, for example, on the entire face F or the cheek portion.
- the brightness / color calculation unit 9 calculates the overall brightness component value in the evaluation region R1 set in the brightness component image, that is, the representative value of the brightness component in the evaluation region R1. Further, the brightness / color calculation unit 9 calculates an overall color component value in the evaluation region R1 set in the color component image, that is, a representative value of the color component in the evaluation region R1.
- the overall brightness component value and color component value in the evaluation region R1 can be calculated, for example, from the average value of the brightness component and the average value of the color component in the evaluation region R1, respectively.
- The gradation characteristic calculation unit 10 sets an evaluation region R2 on the subject's face F in the brightness component image generated by the color space conversion unit 2.
- the evaluation region R2 can be set, for example, in a range from the cheek portion of the subject's face F to the contour portion of the face F.
- The gradation characteristic calculation unit 10 obtains the intensity distribution of the brightness component in the evaluation region R2 set in the brightness component image and, based on the obtained intensity distribution, calculates a gradation characteristic representing the smoothness of the change (gradation) of the brightness component over the evaluation region R2.
- The brightness / color calculation unit 9 outputs the overall brightness component value and the overall color component value in the evaluation region R1 to the comprehensive index calculation unit 4 as the first skin evaluation index, and the gradation characteristic calculation unit 10 outputs the smoothness of the change in the brightness component over the evaluation region R2 to the comprehensive index calculation unit 4 as the second skin evaluation index.
- The comprehensive index calculation unit 4 combines the first skin evaluation index input from the brightness / color calculation unit 9 and the second skin evaluation index input from the gradation characteristic calculation unit 10 with each other to calculate a comprehensive index for the transparency of the face F.
- the transparency evaluation unit 5 evaluates the transparency of the face F of the subject based on the total index calculated by the total index calculation unit 4.
- the display unit 6 includes a display device such as an LCD, for example, and displays the evaluation result of the transparency evaluated by the transparency evaluation unit 5.
- the operation unit 8 is for an operator to input information, and can be formed from a keyboard, a mouse, a trackball, a touch panel, and the like.
- the control unit 7 controls each unit in the transparency evaluation device based on various command signals input from the operation unit 8 by the operator.
- The color space conversion unit 2, the index calculation unit 3, the comprehensive index calculation unit 4, the transparency evaluation unit 5, and the control unit 7 can be composed of a CPU and an operation program that causes the CPU to perform various processes, or they may be constituted by digital circuits. Further, a memory can be connected to the CPU via a signal line such as a bus; for example, the color space conversion image generated by the color space conversion unit 2, the images generated by the index calculation unit 3, and the transparency evaluation results calculated by the transparency evaluation unit 5 can each be stored in the memory, and the stored images and evaluation results can be displayed on the display unit 6 under the control of the control unit 7.
- a database storing the relationship between the sensory evaluation value calculated by performing sensory evaluation of the transparency of the bare skin in advance and the comprehensive index can be connected to the transparency evaluation unit 5.
- The transparency evaluation unit 5 can evaluate the transparency of the bare skin by comparing the comprehensive index input from the comprehensive index calculation unit 4 against the relationship between the sensory evaluation value and the comprehensive index read from the database.
- a photographed image obtained by photographing the subject's face F with the camera C is input from the camera C to the preprocessing unit 1 of the transparency evaluation apparatus via an image input unit (not shown) as shown in FIG.
- the captured image is subjected to preprocessing such as light source correction and noise removal, and then output from the preprocessing unit 1 to the color space conversion unit 2.
- The color space of the captured image is converted by the color space conversion unit 2, for example into the L*a*b* color space, to generate a color space conversion image.
- the color space conversion unit 2 extracts a brightness component and a color component from the color space conversion image, and generates a brightness component image and a color component image, respectively.
- an L * component image can be generated as a brightness component image and a C * component image can be generated as a color component image.
- The generated L* component image and C* component image are output from the color space conversion unit 2 to the brightness / color calculation unit 9, and the L* component image is also output from the color space conversion unit 2 to the gradation characteristic calculation unit 10.
- The brightness / color calculation unit 9 sets an evaluation region R1 on the cheek portion of the subject's face F, as shown in FIG.
- The evaluation region R1 can be set in the L* component image and the C* component image, respectively, via the control unit 7 when the operator operates the operation unit 8.
- The brightness / color calculation unit 9 obtains the average intensity of the L* component in the evaluation region R1 set in the L* component image, and the average intensity of the C* component in the evaluation region R1 set in the C* component image. In this way, the overall L* component value and the overall C* component value in the evaluation region R1 set on the subject's face F can each be calculated.
- It is known that the skin of young people is white, bright, and low in saturation, but that as the skin ages it becomes generally yellowish and darker, and the overall transparency of the skin decreases.
- Accordingly, the overall L* component value and the overall C* component value of the evaluation region R1 obtained by the brightness / color calculation unit 9 can serve as indices representing the change in transparency with aging. Specifically, the higher the overall L* component value (the brighter the skin), the higher the perceived transparency of the subject's face F, and the lower the overall C* component value, the higher the perceived transparency. Therefore, the average value of the L* component and the average value of the C* component in the evaluation region R1 are each output from the brightness / color calculation unit 9 to the comprehensive index calculation unit 4 as the first skin evaluation index for evaluating transparency.
- The average value of the L* component and the average value of the C* component in the evaluation region R1 used in the first skin evaluation index are physical quantities close to the sensation of viewing the subject's entire face F, so the first skin evaluation index provides an objective index close to sensory evaluation for the evaluation of transparency.
- The gradation characteristic calculation unit 10 obtains the intensity distribution of the brightness component for the predetermined evaluation region R2 of the L* component image input from the color space conversion unit 2 and, based on the obtained intensity distribution, calculates the gradation characteristic representing the smoothness of the change of the brightness component over the evaluation region R2.
- Specifically, the gradation characteristic calculation unit 10 sets a plurality of contour lines M in stages at regular intensity intervals on the subject's face F in the L* component image, and generates an L* component contour line distribution image G by dividing the subject's face F according to the value of the L* component with the plurality of contour lines M.
- FIG. 3A is an L* component contour line distribution image G of a subject with high transparency, and FIG. 3B is an L* component contour line distribution image G of a subject with low transparency.
- In the L* component contour line distribution image G, regions surrounded by two mutually adjacent contour lines are represented as having the same intensity.
- The plurality of contour lines M that divide the subject's face F are preferably set at an intensity interval of about 1/10 of the intensity range of the L* component image, or in increments of 3 to 5 gradation levels.
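One way to realize such a contour-line distribution image, sketched here under the assumption of an 8-bit L* map and a hypothetical fixed band width of 4 levels, is to quantize the L* image into intensity bands; the band boundaries then play the role of the contour lines M:

```python
import numpy as np

def contour_bands(L_img, step=4):
    """Quantize an L* image into intensity bands of width `step`.

    Pixels in the same band lie between the same two adjacent contour
    lines M, i.e. they are represented as having the same intensity in
    the distribution image G.
    """
    return (np.asarray(L_img) // step).astype(int)

# A smooth horizontal ramp: the bands change gradually and evenly,
# as they would on a smoothly shaded, highly transparent cheek.
ramp = np.tile(np.arange(16), (4, 1))      # 4x16 image, values 0..15
G = contour_bands(ramp, step=4)
# Number of distinct bands crossed along one row of the region:
n_bands = len(np.unique(G[0]))
```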
- Subsequently, the gradation characteristic calculation unit 10 sets, in the L* component contour line distribution image G, an evaluation region R2 that linearly connects the cheek portion of the subject's face F to the contour portion of the face F.
- It is preferable to set the evaluation region R2 so that it passes through the portion of the cheek having the highest L* component value on the subject's face F.
- the evaluation region R2 can be set as a region that linearly connects the position of the cheek portion where the value of the L * component is highest to the contour portion of the face F in the horizontal direction.
- The evaluation region R2 can also be set as a region that linearly connects the position of the highest L* component value in the cheek to the contour portion of the face F so as to intersect the plurality of contour lines substantially perpendicularly. The evaluation region R2 can also be set to a predetermined region of the L* component contour line distribution image G via the control unit 7 when the operator operates the operation unit 8.
- FIG. 4A shows the change in the L* component in the evaluation region R2 set in the L* component contour line distribution image G (FIG. 3A) of the subject with high transparency, and FIG. 4B shows the change in the L* component in the evaluation region R2 set in the L* component contour line distribution image G (FIG. 3B) of the subject with low transparency.
- Comparing the two, it can be seen that the change in the L* component in the evaluation region R2 of the subject with high transparency falls smoothly, drawing a fixed curve that follows the curvature of the face F, in contrast to the change in the L* component of the subject with low transparency. Therefore, the gradation characteristic representing the smoothness of the change in the L* component can be used as the second skin evaluation index for evaluating transparency.
- The gradation characteristic of the L* component can be calculated based on, for example, the intervals between the mutually adjacent contour lines M in the evaluation region R2 and the uniformity of those intervals. That is, as shown in FIGS. 4A and 4B, the intervals S1 between the plurality of contour lines M in the evaluation region R2 of the subject with high transparency show less variation than the intervals S2 in the evaluation region R2 of the subject with low transparency, and this is one factor that improves the smoothness of the change in the L* component.
- Accordingly, the gradation characteristic of the L* component can be quantified by calculating, for example, the standard deviation of each of the intervals S1 and S2 to obtain their variation.
- The intervals S1 and S2 between the plurality of contour lines M can be obtained, for example, from the number of pixels in the L* component contour line distribution image G, and the uniformity of the intervals can be calculated by obtaining the variation in the obtained pixel counts.
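A sketch of this spacing-uniformity measure, under the simplifying assumption that the evaluation region R2 is reduced to a 1-D profile of L* values and that the contour levels are evenly spaced (function and parameter names are hypothetical):

```python
import numpy as np

def contour_spacing_uniformity(profile, levels):
    """Return (spacings, std) for contour crossings along a 1-D L* profile.

    `spacings` are the pixel distances between successive contour-line
    crossings; their standard deviation quantifies non-uniformity and
    decreases as the intervals become more uniform.
    """
    profile = np.asarray(profile, dtype=float)
    crossings = []
    for lev in levels:
        # Indices where the falling profile crosses this contour level.
        idx = np.where((profile[:-1] >= lev) & (profile[1:] < lev))[0]
        crossings.extend(idx.tolist())
    spacings = np.diff(sorted(crossings))
    return spacings, float(np.std(spacings))

# Smoothly falling profile (high transparency): evenly spaced crossings.
smooth = np.linspace(80, 40, 41)
# Irregularly falling profile (low transparency): bunched crossings.
steppy = np.concatenate([np.linspace(80, 55, 6), np.linspace(55, 40, 35)])
levels = [75, 65, 55, 45]

_, k_smooth = contour_spacing_uniformity(smooth, levels)
_, k_steppy = contour_spacing_uniformity(steppy, levels)
# k_smooth < k_steppy: the smooth profile has the more uniform spacing.
```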
- The gradation characteristic of the L* component can also be calculated based on the number of contour lines M that divide the evaluation region R2. As shown in FIGS. 3A and 3B, the subject with high transparency has a larger number of contour lines M dividing the evaluation region R2 than the subject with low transparency, and this is another factor that improves the smoothness of the change in the L* component. Therefore, the gradation characteristic of the L* component can be quantified by obtaining the number of contour lines M that divide the evaluation region R2. The gradation characteristic of the L* component thus obtained is output from the gradation characteristic calculation unit 10 to the comprehensive index calculation unit 4 as the second skin evaluation index.
- The second skin evaluation index evaluates transparency based on the change in brightness across the subject's face F, in contrast to the first skin evaluation index, which evaluates transparency based on the overall brightness of the subject's face F; transparency can therefore be evaluated from a viewpoint different from that of the first skin evaluation index. Further, the gradation characteristic of the L* component used in the second skin evaluation index is, like the first skin evaluation index, a physical quantity close to the sensation of viewing the subject's entire face F, so the second skin evaluation index also provides an objective index close to sensory evaluation for the evaluation of transparency.
- The overall L* component value and the overall C* component value in the evaluation region R1 obtained by the brightness / color calculation unit 9 are input to the comprehensive index calculation unit 4 as the first skin evaluation index, and the gradation characteristic of the L* component in the evaluation region R2 obtained by the gradation characteristic calculation unit 10 is input as the second skin evaluation index.
- The comprehensive index calculation unit 4 combines the input first and second skin evaluation indices, that is, the overall L* component value of the evaluation region R1, the overall C* component value of the evaluation region R1, and the gradation characteristic of the L* component of the evaluation region R2, by linearly summing them using, for example, a multiple regression equation obtained by performing sensory evaluation in advance, thereby calculating a comprehensive index for determining the evaluation of transparency.
- the calculated total index value is output from the total index calculation unit 4 to the transparency evaluation unit 5.
- the transparency evaluation unit 5 evaluates the transparency of the subject's face F based on the total index calculated by the total index calculation unit 4, and the evaluation result is displayed on the display unit 6.
- By using the comprehensive index that combines the first skin evaluation index and the second skin evaluation index, the transparency can be evaluated objectively in line with the sensation of viewing the subject's face F as a whole, so an evaluation result can be obtained that sufficiently matches the evaluation of transparency by sensory evaluation.
- The second skin evaluation index, which evaluates transparency based on the change in the brightness component, is a new evaluation index that did not exist before, and by evaluating transparency with a comprehensive index that includes the second skin evaluation index, the evaluation result can be brought closer to the evaluation of transparency by sensory evaluation.
- Here, the uniformity of the intervals between the plurality of contour lines M is obtained by measuring the intervals in terms of the number of pixels in the image and calculating the variation in the pixel counts representing the respective intervals; its numerical value decreases as the intervals between the contour lines become more uniform.
- FIG. 5 shows a graph in which the average value of the L * component in the evaluation region R1 is plotted against the sensory evaluation value
- FIG. 6 shows a graph in which the average value of the C* component in the evaluation region R1 is plotted against the sensory evaluation value.
- FIG. 7 shows a graph in which the uniformity of the intervals between the plurality of contour lines that define the evaluation region R2 is plotted against the sensory evaluation value.
- The sensory evaluation value is a value obtained by rating the transparency on a 30-step scale by sensory evaluation; the closer the value is to 30, the lower the evaluated transparency.
- In the graph of FIG. 5, the correlation coefficient R² was 0.42.
- In the graph of FIG. 6, the correlation coefficient R² was 0.0032.
- In the graph of FIG. 7, the correlation coefficient R² was 0.32.
- The multiple regression equation for the sensory evaluation value S (comprehensive index value), S = 177.1 - 410.9 × L + 83.5 × C - 33.7 × K, was obtained.
- Comprehensive index values were obtained by substituting the average value L of the L* component, the average value C of the C* component, and the uniformity K of the contour line intervals into the multiple regression equation.
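Computing the comprehensive index from the example multiple regression equation above (S = 177.1 - 410.9 × L + 83.5 × C - 33.7 × K) is a single linear combination; the coefficients are reproduced verbatim from this example and are specific to the sensory-evaluation panel used for the fit, so in practice they would be re-estimated:

```python
def comprehensive_index(L, C, K):
    """Comprehensive index S from the example multiple regression equation.

    L: average L* component of region R1
    C: average C* component of region R1
    K: uniformity of the contour-line intervals in region R2
    Coefficients are taken from the example in the text; they depend on
    the sensory-evaluation data used for the fit.
    """
    return 177.1 - 410.9 * L + 83.5 * C - 33.7 * K

# Per the fitted signs, raising the average L* (a brighter face) lowers
# S, i.e. pushes the prediction toward the transparent end of the scale.
s_dim = comprehensive_index(0.6, 0.2, 1.0)
s_bright = comprehensive_index(0.7, 0.2, 1.0)
```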
- a graph in which this comprehensive index is plotted against the sensory evaluation value is shown in FIG.
- The correlation coefficient R² was 0.62.
- The comprehensive index shows a sufficiently high correlation with the sensory evaluation value, and it can be seen that transparency can be evaluated with high accuracy by adding the new index of contour line interval uniformity to the average value of the L* component and the average value of the C* component.
- In the above, the brightness / color calculation unit 9 obtains the overall L* component value and the overall C* component value for the evaluation region R1 set on the subject's face F and uses them as the first skin evaluation index, but the index is not limited to these; the average value of the a* component, the average value of the b* component, and the average value of the Hue component may be obtained, and at least one of these average values can also be used as the overall color component value, that is, as the first skin evaluation index.
- Likewise, the gradation characteristic calculation unit 10 obtains the smoothness of the change in the L* component over the evaluation region R2 and uses it as the second skin evaluation index, but the index is not limited to this; it suffices to obtain the intensity distribution of the brightness component and the intensity distribution of the color component in the evaluation region R2 and, based on the obtained intensity distributions, calculate at least one of the smoothness of the change in the brightness component and the smoothness of the change in the color component over the evaluation region R2 as the second skin evaluation index.
- For example, the smoothness of the change of the C* component over the evaluation region R2 may be calculated as the second skin evaluation index, or the smoothness of the change of the L* component and the smoothness of the change of the C* component over the evaluation region R2 may each be calculated as the second skin evaluation index.
- the smoothness of the change of the a * component, the smoothness of the change of the b * component, and the smoothness of the change of the Hue component over the evaluation region R2 can be calculated as the second skin evaluation index.
- The control unit 7 can select the indices calculated by the brightness / color calculation unit 9 and the gradation characteristic calculation unit 10 according to the age of the subject so that the desired evaluation accuracy is maintained.
- The transparency evaluation apparatus can also be configured to incorporate the camera C.
- The transparency evaluation apparatus can be provided in a device equipped with a camera, such as a digital camera, a mobile phone (such as a smartphone), or a tablet.
- the transparency evaluation apparatus can also input a photographed image to the preprocessing unit 1 via a network.
- The transparency evaluation apparatus may also be connected via a network to a computer that stores captured images, evaluate skin transparency based on a captured image input from the computer, and store the evaluation result in a server or the like.
- the user can browse the evaluation result of transparency by accessing the server, or can obtain the evaluation result of transparency from the server via the network.
- FIG. 9 shows a configuration of a transparency evaluation apparatus that performs the transparency evaluation method according to the second embodiment.
- This transparency evaluation apparatus is the same as the transparency evaluation apparatus according to the first embodiment shown in FIG. 1, except that a minus factor calculation unit 21 is arranged in the index calculation unit 3 instead of the brightness / color calculation unit 9 and is connected to the color space conversion unit 2 and the comprehensive index calculation unit 4, respectively.
- The minus factor calculation unit 21 sets an evaluation region R3 on the subject's face F in the brightness component image or the color component image generated by the color space conversion unit 2, and detects, in the set evaluation region R3, minus factors such as spots and pores where the value of the brightness component or the color component changes locally.
- the evaluation region R3 can be set on the entire face F or the cheek portion of the subject. Subsequently, the minus factor calculation unit 21 obtains the amount of occurrence of the minus factor in the evaluation region R3 based on the detection result.
- Minus factors can be identified by generating a DoG image (Difference of Gaussian image) and extracting spots and pores.
- In general, a spot is 2 mm to 10 mm in size and has a frequency of 0.05 cycle/mm to 0.25 cycle/mm, and a pore is 0.5 mm to 2 mm in size and has a frequency of 0.25 cycle/mm to 1.0 cycle/mm. Therefore, the minus factor calculation unit 21 performs DoG image processing so that components in the frequency bands of spots and pores are extracted.
- Further, the shape of each extracted component is calculated from a binarized image subjected to threshold processing, and a component that is round, with a circularity (4π × area / perimeter²) of 0.4 to 1.0 and a size of 0.5 mm to 10 mm, is detected as a minus factor. That is, minus factors are detected by performing DoG image processing based on frequency band, shape, and size.
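The circularity gate can be illustrated independently of the DoG step. The sketch below evaluates 4π × area / perimeter² for an ideal circle and for an elongated ellipse (using Ramanujan's perimeter approximation), showing why round spots and pores pass the 0.4 to 1.0 gate while streak-like structures such as wrinkles do not; the shapes are hypothetical, analytic stand-ins for labelled components in the binarized image:

```python
import math

def circularity(area, perimeter):
    """4*pi*area / perimeter**2: 1.0 for a perfect circle, decreasing
    as the shape elongates."""
    return 4 * math.pi * area / perimeter ** 2

# Ideal circle of radius 2 mm (a spot- or pore-like round blob).
r = 2.0
circ_round = circularity(math.pi * r ** 2, 2 * math.pi * r)

# 8:1 ellipse (a wrinkle-like streak), semi-axes a = 8 mm, b = 1 mm.
a, b = 8.0, 1.0
h = ((a - b) / (a + b)) ** 2
# Ramanujan's approximation for the ellipse perimeter.
perim = math.pi * (a + b) * (1 + 3 * h / (10 + math.sqrt(4 - 3 * h)))
circ_streak = circularity(math.pi * a * b, perim)

# circ_round is 1.0; circ_streak falls below the 0.4 threshold, so the
# streak would be rejected as a minus-factor candidate.
```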
- a negative factor can be detected by generating a Dog image using the B channel in the RGB color space.
- A minus factor can also be detected without generating a DoG image, for example by extracting components having an intensity equal to or lower than a predetermined threshold from the L* component image and performing principal component analysis or independent component analysis on the extracted components.
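A minimal 1-D sketch of the DoG band-pass idea (synthetic profile, with a hypothetical pixel pitch of 0.1 mm/px; a real implementation would run 2-D Gaussians over the L* image): the difference of a narrow and a wide Gaussian responds strongly to a spot-scale dip while rejecting broad shading.

```python
import numpy as np

def gauss_kernel(sigma, radius):
    """Normalized 1-D Gaussian kernel of total width 2*radius + 1."""
    x = np.arange(-radius, radius + 1, dtype=float)
    k = np.exp(-x ** 2 / (2 * sigma ** 2))
    return k / k.sum()

# Synthetic L* deviation profile: flat skin with a 2 mm-wide dark spot
# (about 20 px at the assumed 0.1 mm/px) centred at pixel 100.
x = np.arange(200, dtype=float)
center = 100
profile = -20.0 * np.exp(-(x - center) ** 2 / (2 * 5.0 ** 2))

# DoG kernel: passes spot-scale structure, rejects broad shading.
dog = gauss_kernel(4.0, 30) - gauss_kernel(12.0, 30)
response = np.convolve(profile, dog, mode="same")

# The band-pass response is most negative at the spot centre, which is
# where thresholding would mark a minus-factor candidate.
spot_px = int(np.argmin(response))
```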
- the negative factor calculation unit 21 calculates the amount of negative factors generated, for example, the number of negative factors, the total area, the area ratio, and the like based on the detection result of the negative factors in the evaluation region R3.
- the total area of the minus factor can be calculated from the number of pixels constituting the minus factor in the image. This minus factor occurs with aging, and the smaller the amount of generation, the higher the transparency of the subject's face F.
- Since the darkness of a spot relative to the surrounding skin also greatly affects transparency, the darkness of the spot can be calculated as the amount of occurrence of the minus factor.
- The darkness of a spot can be obtained, for example, by obtaining the average value of the color component over the entire evaluation region R3 and the average value of the color component of the spot portion detected in the evaluation region R3, and calculating the difference (color difference) between the two average values.
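This color-difference measure of spot darkness can be sketched as follows (the image and mask are hypothetical; a real implementation would use the detected spot mask on the measured color component image):

```python
import numpy as np

# Hypothetical C* component image of region R3 with a darker spot patch.
C_img = np.full((10, 10), 18.0)
spot_mask = np.zeros((10, 10), dtype=bool)
spot_mask[4:6, 4:6] = True
C_img[spot_mask] = 24.0   # the spot differs from the surrounding skin

# Darkness of the spot = difference (color difference) between the
# spot's mean color component and the mean over the whole region R3.
darkness = C_img[spot_mask].mean() - C_img.mean()
```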
- the calculated negative factor generation amount in the evaluation region R3 is output from the negative factor calculation unit 21 to the comprehensive index calculation unit 4 as the first skin evaluation index.
- The generation amount of the minus factor in the evaluation region R3 used in the first skin evaluation index is a physical quantity close to the sensation of viewing the subject's entire face F, so the first skin evaluation index provides an objective index close to sensory evaluation for the evaluation of transparency.
- The comprehensive index calculation unit 4 receives the number of minus factors, their total area, and their area ratio from the minus factor calculation unit 21 as the first skin evaluation index, and the smoothness of the change of the L* component from the gradation characteristic calculation unit 10 as the second skin evaluation index.
- the comprehensive index calculation unit 4 combines the number of input negative factors, the total area of the negative factors, the area ratio of the negative factors, and the smoothness of the change of the L * component using a multiple regression equation or the like. Then, a comprehensive index for determining the evaluation of the transparency of the face F of the subject is calculated. Then, the transparency evaluation unit 5 evaluates the transparency of the face F of the subject based on the total index calculated by the total index calculation unit 4, and the evaluation result is displayed on the display unit 6.
- By using the comprehensive index that combines the first skin evaluation index and the second skin evaluation index, the transparency can be evaluated objectively in line with the sensation of viewing the subject's face F as a whole, so an evaluation result can be obtained that sufficiently matches the evaluation of transparency by sensory evaluation.
- Alternatively, the transparency evaluation apparatus can be configured by newly providing a minus factor calculation unit 22 in the index calculation unit 3 and connecting this minus factor calculation unit 22 to the color space conversion unit 2 and the comprehensive index calculation unit 4, respectively.
- the negative factor calculation unit 22 calculates the number of negative factors, the total area, and the area ratio as the first skin evaluation index, and outputs the first skin evaluation index to the general index calculation unit 4.
- In this case, the comprehensive index calculation unit 4 receives the overall L* component value and the overall C* component value in the evaluation region R1 from the brightness / color calculation unit 9, the smoothness of the change of the L* component in the evaluation region R2 from the gradation characteristic calculation unit 10, and the number of minus factors, their total area, and their area ratio from the minus factor calculation unit 22.
- The comprehensive index calculation unit 4 combines the overall L* component value of the evaluation region R1, the overall C* component value of the evaluation region R1, the smoothness of the change in the L* component of the evaluation region R2, and the minus factor generation amounts with each other, thereby calculating a comprehensive index.
- the transparency evaluation unit 5 evaluates the transparency of the face F of the subject based on the total index calculated by the total index calculation unit 4, and the evaluation result is displayed on the display unit 6. In this way, by evaluating the transparency based on more indexes having different properties, it is possible to evaluate the transparency with high accuracy.
- Embodiment 3 In the transparency evaluation apparatuses according to the first and second embodiments, the transparency evaluation apparatus can be configured by newly providing, in the index calculation unit 3, a color unevenness calculation unit that detects color unevenness of the subject's face F and calculates the amount of its occurrence.
- For example, a color unevenness calculation unit 31 is newly provided in the index calculation unit 3, and the color unevenness calculation unit 31 can be connected to the color space conversion unit 2 and the comprehensive index calculation unit 4, respectively.
- The color unevenness calculation unit 31 sets an evaluation region R4 on the subject's face F in the brightness component image or the color component image generated by the color space conversion unit 2, and detects, in the set evaluation region R4, portions where the value of the brightness component or the color component changes locally over an area larger than a minus factor as color unevenness.
- the evaluation region R4 can be set on the entire face F or the cheek portion of the subject. Subsequently, the color unevenness calculation unit 31 obtains the amount of color unevenness generated in the evaluation region R4 based on the detection result.
- Color unevenness can be extracted by generating a DoG image in the same manner as for minus factors.
- In general, color unevenness has a size of about 10 mm or more and a correspondingly low spatial frequency of about 0.05 cycle/mm or less. Therefore, a DoG image is generated so that portions in this frequency band are extracted from the L* component image input from the color space conversion unit 2.
- Further, the shape of each component is calculated from a binarized image subjected to threshold processing, and a portion that is round, with a circularity (4π × area / perimeter²) of 0.4 to 1.0, preferably 0.6 to 1.0, and a size of about 10 mm or more, is detected as color unevenness. That is, color unevenness is detected by performing DoG image processing based on frequency band, shape, and size.
- the color unevenness calculation unit 31 calculates the amount of color unevenness generated, for example, the total area, the area ratio, and the number of color unevenness based on the detection result of the color unevenness in the evaluation region R4.
- The total area of the color unevenness can be calculated from the number of pixels constituting the color unevenness in the image. Color unevenness occurs with aging, and the smaller its amount of occurrence, the higher the transparency of the subject's face F. Therefore, the color unevenness calculation unit 31 outputs the amount of color unevenness occurrence in the evaluation region R4 to the comprehensive index calculation unit 4 as a third skin evaluation index.
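The occurrence-amount metrics (number, total area, area ratio) can be sketched from the binary detection mask alone; the flood-fill labelling below is a hypothetical stand-in for whatever connected-component routine the device uses:

```python
import numpy as np

def occurrence_amount(mask):
    """Count blobs, total area (px), and area ratio in a boolean mask.

    Blobs are 4-connected regions, labelled with a simple flood fill.
    """
    mask = np.asarray(mask, dtype=bool)
    seen = np.zeros_like(mask)
    count = 0
    for i, j in zip(*np.nonzero(mask)):
        if seen[i, j]:
            continue
        count += 1                      # found a new, unvisited blob
        stack = [(i, j)]
        seen[i, j] = True
        while stack:
            y, x = stack.pop()
            for dy, dx in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                ny, nx = y + dy, x + dx
                if (0 <= ny < mask.shape[0] and 0 <= nx < mask.shape[1]
                        and mask[ny, nx] and not seen[ny, nx]):
                    seen[ny, nx] = True
                    stack.append((ny, nx))
    total_area = int(mask.sum())
    return count, total_area, total_area / mask.size

# Two separate unevenness blobs in a 10x10 evaluation region R4.
m = np.zeros((10, 10), dtype=bool)
m[1:3, 1:4] = True     # 6-pixel blob
m[6:9, 6:8] = True     # 6-pixel blob
n, area, ratio = occurrence_amount(m)
```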
- The amount of color unevenness occurrence in the evaluation region R4 used in the third skin evaluation index is a physical quantity close to the sensation of viewing the subject's entire face F, so the third skin evaluation index provides an objective index close to sensory evaluation for the evaluation of transparency.
- The comprehensive index calculation unit 4 receives the number of minus factors, their total area, and their area ratio from the minus factor calculation unit 21 as the first skin evaluation index, the smoothness of the change of the L* component from the gradation characteristic calculation unit 10 as the second skin evaluation index, and the total area and area ratio of the color unevenness from the color unevenness calculation unit 31 as the third skin evaluation index.
- The comprehensive index calculation unit 4 combines the input number of minus factors, total area of minus factors, area ratio of minus factors, smoothness of the change of the L* component, total area of color unevenness, and area ratio of color unevenness with each other using a multiple regression equation or the like, thereby calculating a comprehensive index; with this comprehensive index, an evaluation result can be obtained that sufficiently matches the evaluation of transparency by sensory evaluation.
- FIG. 12 shows a graph in which the total area of spots in the evaluation region R3 is plotted against the sensory evaluation value
- FIG. 13 shows a graph in which the total area of pores in the evaluation region R3 is plotted against the sensory evaluation value
- FIG. 14 shows a graph in which the total area of color unevenness in the evaluation region R4 is plotted against the sensory evaluation value.
- The comprehensive index shows a sufficiently high correlation with the sensory evaluation value, and it can be seen that transparency can be evaluated with high accuracy by combining the total area F of the spots, the total area P of the pores, the total area I of the color unevenness, and the uniformity K of the contour line intervals with each other.
- Embodiment 4 As shown in FIG. 16, a transparency evaluation apparatus can also be constituted by newly providing a color unevenness calculation unit 41 in the index calculation unit 3 of the transparency evaluation apparatus according to the third embodiment shown in FIG. 10, and connecting the color unevenness calculation unit 41 to the color space conversion unit 2 and the comprehensive index calculation unit 4. That is, the index calculation unit 3 includes all of the brightness / color calculation unit, the gradation characteristic calculation unit, the minus factor calculation unit, and the color unevenness calculation unit described in the first to third embodiments. In the same manner as described above, the color unevenness calculation unit 41 calculates the amount of color unevenness in the evaluation region R4 as the third skin evaluation index, and outputs the third skin evaluation index to the comprehensive index calculation unit 4.
- the overall index calculation unit 4 receives the overall L * component value and the overall C * component value in the evaluation region R1 from the brightness / color calculation unit 9, receives the smoothness of the change of the L * component in the evaluation region R2 from the gradation characteristic calculation unit 10, receives the number, total area, and area ratio of the negative factors from the negative factor calculation unit 22, and receives the total area and area ratio of the color unevenness from the color unevenness calculation unit 41.
- the overall index calculation unit 4 combines the overall L * component value of the evaluation region R1, the overall C * component value of the evaluation region R1, the smoothness of the change in the L * component of the evaluation region R2, the number of negative factors in the evaluation region R3, the total area of negative factors in the evaluation region R3, the area ratio of negative factors in the evaluation region R3, the total area of color unevenness in the evaluation region R4, and the area ratio of color unevenness in the evaluation region R4 with each other using a multiple regression equation or the like, thereby calculating a comprehensive index for determining the evaluation of the transparency of the subject's face F.
- the transparency evaluation unit 5 evaluates the transparency of the subject's face F based on the total index calculated by the total index calculation unit 4, and the evaluation result is displayed on the display unit 6. In this way, by evaluating the transparency based on more indexes having different properties, it is possible to evaluate the transparency with high accuracy.
- in the comparative example, the total area of color unevenness in the evaluation region R4 and the other indexes were calculated, and these values were combined with each other to obtain a comprehensive index for transparency. That is, this comprehensive index is obtained from the comprehensive index of the fourth embodiment by excluding the uniformity of the intervals between the plurality of contour lines that define the evaluation region R2.
- the comprehensive index was: comprehensive index = 189.7 − 532.1 × L + 72.0 × C + 185.4 × F − 109.7 × P + 99.6 × I
- L is the average value of the L * component
- C is the average value of the C * component
- F is the total area of the stain
- P is the total area of the pores
- I is the total area of color unevenness
- K is the uniformity of the contour line intervals.
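As a sketch, the regression above can be evaluated directly in code. This is a hypothetical helper: the text does not specify how each index L, C, F, P, I is scaled before entering the equation, so the inputs here are assumed to be pre-normalized.

```python
def comprehensive_index(L, C, F, P, I):
    """Comprehensive transparency index using the printed regression
    coefficients. L: mean L*, C: mean C*, F: stain total area,
    P: pore total area, I: color-unevenness total area
    (all assumed pre-normalized)."""
    return 189.7 - 532.1 * L + 72.0 * C + 185.4 * F - 109.7 * P + 99.6 * I
```

A higher index corresponds to a higher sensory evaluation of transparency; the large negative coefficient on L reflects the normalization assumed here rather than a property of brightness itself.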
- FIG. 17 shows a graph in which the comprehensive index of Embodiment 4 is plotted against the sensory evaluation value
- FIG. 18 shows a graph in which the comprehensive index of the comparative example is plotted against the sensory evaluation value.
- for the comprehensive index of Embodiment 4, the correlation coefficient R² was 0.98.
- for the comprehensive index of the comparative example, the correlation coefficient R² was 0.89.
- the comprehensive index of Embodiment 4 has a sufficiently improved correlation with the sensory evaluation value compared to the comprehensive index of the comparative example, and it can be seen that transparency can be evaluated with high accuracy by using the uniformity of the contour line intervals as an index for evaluating transparency.
- in the first to fourth embodiments, the transparency of the subject's face is evaluated, but the transparency of other parts of the subject's skin, for example the arm, can also be evaluated.
- the evaluation of the transparency of the bare skin as in the first to fourth embodiments can be performed by causing a computer comprising input means, a CPU, a memory, and the like to function by means of a transparency evaluation program recorded on a recording medium. That is, the transparency evaluation program recorded on the recording medium causes the computer to function so that the image input unit acquires a captured image obtained by capturing the subject's bare skin, and the CPU executes, based on the acquired captured image, the preprocessing unit 1, the color space conversion unit 2, the index calculation unit 3, the comprehensive index calculation unit 4, and the transparency evaluation unit 5 to evaluate the transparency of the subject's skin.
- Embodiment 5 In the first to fourth embodiments, the transparency of the subject's bare skin was evaluated, but the transparency of the face F (makeup skin) of the subject to whom makeup was applied can also be evaluated.
- FIG. 19 shows the structure of the transparency evaluation apparatus according to Embodiment 5.
- This transparency evaluation apparatus is the same as the transparency evaluation apparatus according to Embodiment 1 shown in FIG. 1 except that an index calculation unit 51 is arranged instead of the index calculation unit 3.
- the transparency evaluation apparatus evaluates the transparency using a photographed image obtained by photographing the face F of a subject who has applied makeup with the camera C, and includes an image input unit 1a connected to the camera C.
- the preprocessing unit 1, the color space conversion unit 2, the index calculation unit 51, the comprehensive index calculation unit 4, the transparency evaluation unit 5, and the display unit 6 are sequentially connected to the image input unit 1a.
- a control unit 7 is connected to the color space conversion unit 2, the index calculation unit 51, the comprehensive index calculation unit 4, and the transparency evaluation unit 5, and an operation unit 8 is connected to the control unit 7.
- the image input unit 1a inputs a photographed image from the camera C that photographed the face F (makeup skin) of the subject to whom makeup was applied.
- the preprocessing unit 1 performs preprocessing such as light amount correction and noise removal on the captured image input from the image input unit 1a.
- the color space conversion unit 2 converts the color space of the photographed image input from the preprocessing unit 1 to generate a color space conversion image.
- the index calculation unit 51 includes a skin index calculation unit 52 and a makeup index calculation unit 53 that are respectively connected to the color space conversion unit 2.
- the skin index calculation unit 52 and the makeup index calculation unit 53 receive the color space conversion image from the color space conversion unit 2 and, based on the color space conversion image, calculate a skin evaluation index and a makeup evaluation index, respectively, for evaluating the transparency of the makeup skin.
- the skin evaluation index is an evaluation index common to the indexes of Embodiments 1 to 4 for evaluating the transparency of the bare skin, and the transparency is evaluated according to the sense when the makeup skin is viewed as a whole.
- the makeup evaluation index is an evaluation index peculiar to the evaluation of the transparency of the makeup skin, and is a new index indicating a change in transparency caused by makeup. For example, the thick feeling of makeup that is felt when the makeup is applied thickly causes a decrease in transparency, and a makeup evaluation index can be determined based on the feeling of thick coating.
- the skin index calculation unit 52 and the makeup index calculation unit 53 output the calculated skin evaluation index and the makeup evaluation index to the overall index calculation unit 4, respectively.
- the total index calculation unit 4 calculates a total index for transparency by combining the skin evaluation index and the makeup evaluation index calculated by the skin index calculation unit 52 and the makeup index calculation unit 53, respectively.
- the transparency evaluation unit 5 evaluates the transparency of the face F of the subject to whom makeup is applied based on the overall index calculated by the overall index calculation unit 4.
- the display unit 6 includes a display device such as an LCD, for example, and displays the evaluation result of the transparency evaluated by the transparency evaluation unit 5.
- the operation unit 8 is for an operator to input information, and can be formed from a keyboard, a mouse, a trackball, a touch panel, and the like.
- the control unit 7 controls each unit in the transparency evaluation device based on various command signals input from the operation unit 8 by the operator.
- the skin index calculation unit 52 of the index calculation unit 51 includes a brightness / color calculation unit 9, a gradation characteristic calculation unit 10, a negative factor calculation unit 21, and a color unevenness calculation unit 31, each connected to the color space conversion unit 2 and the comprehensive index calculation unit 4.
- the brightness / color calculation unit 9 sets the evaluation region R1 on the face F of the subject to whom makeup is applied, and calculates a representative value of the brightness component or the color component in the evaluation region R1.
- the evaluation region R1 can be set, for example, on the entire face F or the cheek portion.
- the representative value of the brightness component and the representative value of the color component can be calculated, for example, from the average value of the brightness component and the average value of the color component in the evaluation region R1, respectively.
- the minus factor calculation unit 21 sets an evaluation region R3 on the face F of the subject to whom makeup is applied, detects negative factors such as stains and pores in which the value of the brightness component or the color component changes locally in the evaluation region R3, and determines the amount of negative factors generated.
- the evaluation region R3 can be set on the entire face F or the cheek portion of the subject. Examples of the amount of negative factors generated include the number of negative factors, the total area, and the area ratio.
- the color unevenness calculation unit 31 sets an evaluation region R4 on the face F of the subject to whom makeup is applied and, among the portions in which the value of the brightness component or the color component changes locally in the evaluation region R4, detects a portion larger than a negative factor as color unevenness to determine the amount of color unevenness generated.
- the evaluation region R4 can be set on the entire face F or cheek portion of the subject. Examples of the amount of color unevenness include the total area, the area ratio, and the number of color unevenness.
- the gradation characteristic calculation unit 10 sets an evaluation region R2 on the face F of the subject to whom makeup is applied, and calculates a gradation characteristic representing the smoothness of the change (gradation) of the brightness component in the evaluation region R2.
- the evaluation region R2 can be set, for example, in a range from the cheek portion of the subject's face F to the contour portion of the face F.
- the makeup index calculation unit 53 of the index calculation unit 51 includes a red component amount calculation unit 54, a stain hue difference calculation unit 55, a concavo-convex amount calculation unit 56, and a medium gloss amount calculation unit 57, each connected to the color space conversion unit 2 and the comprehensive index calculation unit 4.
- the red component amount calculation unit 54 sets an evaluation region R5 on the face F of the subject to whom makeup is applied to the color component image generated by the color space conversion unit 2.
- the evaluation region R5 is preferably set to at least one of a portion between the eyebrows, a cheek portion, and a chin portion.
- the red component amount calculation unit 54 calculates the amount of the red component in the evaluation region R5 set in the color component image.
- the red component represents redness due to blood color, which appears markedly in ruddy parts of the subject's face F such as the glabella portion, the cheek portions, and the chin portion, and can be obtained, for example, by extracting the a * component of the color component image.
- the red component is more preferably a component having a positive value among the a * components of the color component image in order to more reliably extract redness due to blood color.
- the red component is more preferably a component having a value of 13 or more and 30 or less of the a * component in order to suppress erroneous detection of lips having redness.
- the amount of the red component can be calculated, for example, from the average value of the red component (for example, the a * component) in the evaluation region R5, the area of the portion where the value of the red component falls within a predetermined range (for example, the value of the a * component is 13 or more and 30 or less), or the area ratio of the portion where the value of the red component falls within the predetermined range in the evaluation region R5.
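The three candidate measures of the red-component amount can be sketched as follows, using the a* range of 13 to 30 given in the text; the function name and the array interface are illustrative, not from the source.

```python
import numpy as np

def red_component_metrics(a_star, lo=13.0, hi=30.0):
    """Return (mean a*, area in pixels, area ratio) of the red component
    within an evaluation-region a* array, counting pixels in [lo, hi]."""
    a = np.asarray(a_star, dtype=float)
    mask = (a >= lo) & (a <= hi)          # pixels showing blood-color redness
    return float(a.mean()), int(mask.sum()), float(mask.mean())
```

The [13, 30] window suppresses stronger reds such as the lips while keeping the diffuse redness of the glabella, cheeks, and chin.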
- the red component amount calculation unit 54 outputs the amount of the red component in the evaluation region R5 to the comprehensive index calculation unit 4 as a makeup evaluation index.
- the stain hue difference calculation unit 55 sets an evaluation region R6 on the face F of the subject to whom makeup is applied to the brightness component image or the color component image generated by the color space conversion unit 2, and the set evaluation In the region R6, a spot portion where the value of the brightness component or the value of the color component changes locally is detected.
- the evaluation region R6 is preferably set to the entire face F or the cheek portion of the subject.
- the spot hue difference calculation unit 55 calculates the hue difference between the spot portion and the surrounding area in the color component image.
- the stain hue difference calculation unit 55 outputs the hue difference between the stain portion in the evaluation region R6 and its surroundings to the comprehensive index calculation unit 4 as a makeup evaluation index.
- the unevenness amount calculation unit 56 sets an evaluation region R7 on the face F of the subject to whom makeup is applied to the brightness component image generated by the color space conversion unit 2, and the brightness in the set evaluation region R7. Based on the component values, a low-luminance part indicating a shadow generated on the face of the subject is detected as an uneven part.
- the unevenness amount calculation unit 56 obtains a standard L * value for the subject's face F, for example the average value of L * across the face F, generates a difference image ΔL * by subtracting the average L * value from the L * value of each pixel of the brightness component image generated by the color space conversion unit 2, and detects a low luminance portion where ΔL * ≤ −3 in the difference image as an uneven portion.
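A minimal numpy sketch of this difference-image step, assuming ΔL* is each pixel's L* minus the face-wide mean and shadows are ΔL* ≤ −3 (the threshold is from the text; the interface is illustrative):

```python
import numpy as np

def detect_uneven(l_star, threshold=-3.0):
    """Flag pixels whose L* falls at least |threshold| below the mean L*
    of the region, treating them as shadow (uneven) portions."""
    l = np.asarray(l_star, dtype=float)
    delta = l - l.mean()          # difference image dL*
    return delta <= threshold     # True where a shadow is detected
```

The area or area ratio of the returned mask then gives the amount of the uneven portion in the evaluation region R7.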
- the evaluation region R7 is preferably set to at least one of, for example, an eye portion and a portion extending from the nose to the mouth. Subsequently, the concavo-convex amount calculation unit 56 calculates the amount of the concavo-convex portion detected in the evaluation region R7.
- the amount of the uneven portion can be calculated from the area (number of pixels) or the area ratio of the uneven portion in the evaluation region R7.
- the unevenness amount calculation unit 56 outputs the amount of the unevenness portion in the evaluation region R7 to the comprehensive index calculation unit 4 as a makeup evaluation index.
- the medium gloss amount calculation unit 57 sets an evaluation region R8 on the face F of the subject to whom makeup is applied with respect to the brightness component image generated by the color space conversion unit 2, and detects, based on the intensity of the brightness component in the set evaluation region R8, a medium gloss portion having a medium light reflectance.
- the medium gloss portion refers to a portion whose light reflectance lies between that of a shiny portion (a portion with high light reflectance) and a matte portion (a portion with low light reflectance).
- the middle glossy part is preferably a part where the intensity of the brightness component shows a value of 60 or more and 70 or less.
- the evaluation region R8 is preferably set to at least one of the cheekbone portion and the nasal muscle portion, for example.
- the middle gloss amount calculation unit 57 calculates the amount of the middle gloss portion detected in the evaluation region R8.
- the amount of the middle gloss portion can be calculated from the area (number of pixels) or the area ratio of the middle gloss portion in the evaluation region R8.
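Using the brightness band of 60 to 70 given above, the area and area ratio of the medium gloss portion can be sketched as follows (numpy; the names are illustrative):

```python
import numpy as np

def medium_gloss_amount(l_star, lo=60.0, hi=70.0):
    """Area (pixel count) and area ratio of the medium-gloss portion,
    taken as pixels whose brightness lies in [lo, hi]."""
    l = np.asarray(l_star, dtype=float)
    mask = (l >= lo) & (l <= hi)
    return int(mask.sum()), float(mask.mean())
```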
- the medium gloss amount calculation unit 57 outputs the amount of the medium gloss portion in the evaluation region R8 to the comprehensive index calculation unit 4 as a makeup evaluation index.
- as shown in FIG. 19, a photographed image obtained by photographing the face F of the subject to whom makeup has been applied is input from the camera C to the preprocessing unit 1 via the image input unit 1a.
- the photographed image is preprocessed by the preprocessing unit 1, and then a color space conversion image is generated by the color space conversion unit 2.
- the color space conversion unit 2 extracts a brightness component and a color component from the color space conversion image, and generates a brightness component image and a color component image, respectively.
- an L * component image can be generated as a brightness component image
- a C * component image, an a * component image, and a Hue component image can be generated as color component images.
- the color space conversion unit 2 outputs the generated L * component image and C * component image to the brightness / color calculation unit 9, the negative factor calculation unit 21, the color unevenness calculation unit 31, and the gradation characteristic calculation unit 10 of the skin index calculation unit 52, respectively.
- the brightness / color calculation unit 9 sets the evaluation region R1 on, for example, the cheek portion of the face F of the subject to whom makeup is applied, in the L * component image and the C * component image, as in the setting of the evaluation region R1 described above. Subsequently, the brightness / color calculation unit 9 obtains the average value of the intensity of the L * component in the evaluation region R1 set in the L * component image, and the average value of the intensity of the C * component in the evaluation region R1 set in the C * component image. Thereby, an overall L * component value and an overall C * component value are obtained for the evaluation region R1.
- the skin of young people is white, bright, and low in saturation, while it is known that with aging the skin generally becomes yellowish and dark and the transparency of the skin decreases overall. This change in brightness and saturation gives the same impression on the makeup skin.
- it is considered that the overall L * component value and the overall C * component value of the evaluation region R1 obtained by the brightness / color calculation unit 9 serve as indices representing changes in the transparency of the makeup skin. Specifically, the higher the overall L * component value (the brighter the skin), the higher the perceived transparency of the face F of the subject to whom makeup was applied, and the lower the overall C * component value (the lower the saturation), the higher the perceived transparency of the face F of the subject to whom makeup was applied.
- the average value of the L * component and the average value of the C * component in the evaluation region R1 are output from the brightness / color calculation unit 9 to the comprehensive index calculation unit 4 as skin evaluation indexes for evaluating the transparency of the makeup skin.
- the average value of the L * component and the average value of the C * component in the evaluation region R1 used in the skin evaluation indexes are physical quantities close to the sense when the entire face F of the subject is viewed, and give an objective index close to sensory evaluation of transparency.
- the negative factor calculation unit 21 sets an evaluation region R3 on the cheek portion of the face F of the subject to whom makeup has been applied in the L * component image, and detects, in the set evaluation region R3, negative factors such as spots and pores in which the value of the L * component changes locally.
- the negative factors can be specified by generating a DoG (Difference of Gaussians) image and extracting spots and pores, similarly to the negative factor calculation unit 21 of the second embodiment.
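One way to realize the DoG step is to band-pass the L* image with two Gaussian blurs and keep locally dark responses as candidate spots and pores. The sigmas and threshold below are illustrative choices, not values from the text (numpy only):

```python
import numpy as np

def _gaussian_blur(img, sigma):
    # Separable Gaussian convolution with a kernel truncated at 3 sigma.
    r = int(3 * sigma)
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    k /= k.sum()
    pad = np.pad(img, r, mode="reflect")
    tmp = np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 0, pad)
    return np.apply_along_axis(lambda m: np.convolve(m, k, mode="valid"), 1, tmp)

def dog_negative_factors(l_star, sigma_small=2.0, sigma_large=6.0, thresh=1.0):
    """Difference-of-Gaussians band-pass of the L* image; locally dark
    blobs (DoG response below -thresh) are kept as candidate spots/pores."""
    l = np.asarray(l_star, dtype=float)
    dog = _gaussian_blur(l, sigma_small) - _gaussian_blur(l, sigma_large)
    return dog < -thresh
```

Counting the connected dark regions of the returned mask, or summing its pixels, then yields the number, total area, and area ratio used as the generation amount.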
- the minus factor calculation unit 21 calculates the generation amount of the minus factor, for example, the number of minus factors, the total area, the area ratio, and the like based on the detection result of the minus factor in the evaluation region R3.
- This minus factor is generated with aging, and the smaller the amount of generation, the higher the transparency of the face F of the subject to whom makeup is applied.
- the density of the stain relative to the surrounding makeup skin can also be calculated as an amount of occurrence of the negative factor.
- the density of the stain can be obtained, for example, by obtaining the average value of the color components of the entire evaluation region R3 and the average value of the color components of the stain portion detected in the evaluation region R3, and calculating the difference (color difference) between the two average values.
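The color-difference computation just described can be sketched as follows (numpy; a single color component is assumed, and the mask/array interface is illustrative):

```python
import numpy as np

def stain_density(component, stain_mask):
    """Density of a stain: absolute difference between the mean of a
    color component over the whole evaluation region and its mean over
    the detected stain pixels."""
    c = np.asarray(component, dtype=float)
    m = np.asarray(stain_mask, dtype=bool)
    return abs(float(c[m].mean()) - float(c.mean()))
```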
- the calculated negative factor occurrence amount in the evaluation region R3 is output from the negative factor calculation unit 21 to the comprehensive index calculation unit 4 as a skin evaluation index.
- the amount of occurrence of negative factors in the evaluation region R3 used in this skin evaluation index is a physical quantity close to the sense when viewing the entire face F of the subject, and the skin evaluation index gives an objective index close to sensory evaluation of transparency.
- the color unevenness calculation unit 31 sets an evaluation region R4 on the face F of the subject to whom makeup is applied in the L * component image, and detects, among the portions in which the value of the L * component changes locally in the set evaluation region R4, a portion larger than a negative factor as color unevenness.
- the color unevenness can be detected and extracted in the same manner as in the color unevenness calculation unit 31 of the third embodiment.
- the evaluation region R4 can be set on the entire face F or the cheek portion of the subject.
- the color unevenness calculation unit 31 calculates the amount of color unevenness, for example, the total area and the area ratio of the color unevenness, based on the detection result of the color unevenness in the evaluation region R4. This color unevenness occurs with aging, and the smaller the amount of the color unevenness, the higher the transparency of the face F of the subject to whom makeup is applied. Therefore, the color unevenness calculation unit 31 outputs the color unevenness generation amount in the evaluation region R4 to the comprehensive index calculation unit 4 as a skin evaluation index.
- the amount of color unevenness in the evaluation region R4 used in this skin evaluation index is a physical quantity close to the sensation when viewing the entire face F of the subject, and the skin evaluation index gives an objective index close to sensory evaluation of transparency.
- the gradation characteristic calculation unit 10 obtains the intensity distribution of the brightness component for the predetermined evaluation region R2 of the L * component image and, based on the obtained intensity distribution, calculates a gradation characteristic representing the smoothness of the change in the brightness component over the evaluation region R2. Specifically, as shown in FIGS. 22 (a) and 22 (b), the gradation characteristic calculation unit 10 sets a plurality of contour lines M stepwise at constant intensity intervals on the face F of the subject in the L * component image, and generates an L * component contour line distribution image G in which the subject's face F is divided according to the value of the L * component by the plurality of contour lines M.
- FIG. 22A is an L * component contour distribution image G of a subject with a high makeup skin transparency
- FIG. 22B is an L * component contour distribution image G of a subject with low makeup skin transparency.
- the plurality of contour lines M that divide the subject's face F are preferably set at an intensity interval of about 1/10 of the intensity range of the L * component image, or in increments of 3 to 5.
- the tone characteristic calculation unit 10 sets, in the L * component contour line distribution image G, the evaluation region R2 so as to linearly connect the cheek portion of the subject's face F to the contour portion of the face F.
- it is preferable to set the evaluation region R2 so as to pass through the portion having the highest value of the L * component in the cheek portion of the subject's face F.
- the evaluation region R2 can be set as a region that linearly connects the position of the cheek portion where the value of the L * component is highest to the contour portion of the face F in the horizontal direction.
- for example, the evaluation region R2 is set to a region that linearly connects the position where the L * component value is highest in the cheek portion to the contour portion of the face F so as to intersect the plurality of contour lines substantially perpendicularly.
- FIG. 23 (a) shows the change in the L * component in the evaluation region R2 set in the L * component contour line distribution image G (FIG. 22 (a)) of the subject with high makeup skin transparency, and FIG. 23 (b) shows the change in the L * component in the evaluation region R2 set in the L * component contour line distribution image G (FIG. 22 (b)) of the subject with low makeup skin transparency.
- the gradation characteristic of the L * component can be calculated, in the same manner as in the gradation characteristic calculation unit 10 of the first embodiment, based on the uniformity of the intervals between the plurality of mutually adjacent contour lines M in the evaluation region R2. Further, the gradation characteristic of the L * component can also be calculated based on the number of contour lines M that divide the evaluation region R2.
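A 1-D sketch of the interval-uniformity idea: find where the L* profile along the evaluation region R2 crosses each stepwise contour level, and measure the spread of the crossing intervals. The step size and the choice of standard deviation as the uniformity measure are illustrative; a smaller spread corresponds to smoother gradation.

```python
import numpy as np

def contour_interval_uniformity(profile, step=4.0):
    """Return the crossing positions of stepwise contour levels along a
    monotonically decreasing L* profile, and the standard deviation of
    the intervals between crossings (0 = perfectly uniform gradation)."""
    p = np.asarray(profile, dtype=float)
    levels = np.arange(p.min() + step, p.max(), step)
    crossings = []
    for lv in levels:
        below = p < lv
        if below.any() and not below[0]:
            crossings.append(int(np.argmax(below)))  # first drop below lv
    gaps = np.diff(sorted(crossings))
    return crossings, (float(np.std(gaps)) if len(gaps) else 0.0)
```

For a linearly decreasing profile the crossings are evenly spaced and the uniformity measure is 0; bunched contour lines (abrupt brightness changes) raise it.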
- the tone characteristic of the L * component thus obtained is output from the gradation characteristic calculation unit 10 to the comprehensive index calculation unit 4 as a skin evaluation index.
- this skin evaluation index evaluates the transparency based on the change in brightness across the face F of the subject to whom makeup is applied, in contrast to the above skin evaluation indexes that evaluate transparency based on the overall brightness of the face F, and therefore allows transparency to be evaluated from a viewpoint different from those indexes.
- the tone characteristic of the L * component used in the skin evaluation index is a physical quantity that is close to the sensation when viewing the entire face F of the subject, similar to the above-described skin evaluation index. It gives an objective index close to sensory evaluation.
- the color space conversion unit 2 converts the generated L * component image, C * component image, a * component image, and Hue component image into a red component amount calculation unit 54, a stain hue difference calculation unit 55 in the makeup index calculation unit 53, Output to the unevenness amount calculation unit 56 and the medium gloss amount calculation unit 57, respectively.
- the red component amount calculation unit 54 sets an evaluation region R5 on the glabella portion, the cheek portions, and the chin portion of the face F of the subject to whom makeup is applied, with respect to the a * component image input from the color space conversion unit 2.
- FIG. 25 (a) shows an a * component image of makeup skin that was evaluated as highly transparent by sensory evaluation when a foundation was applied to the subject's face F, and FIG. 25 (b) shows an a * component image of makeup skin that was evaluated as having low transparency.
- in FIG. 25 (a), many red components D are distributed in the glabella portion, the cheek portions, and the chin portion where the evaluation region R5 is set.
- FIG. 25 (b) shows that the red component D is hardly distributed in the glabella portion, the cheek portions, and the chin portion where the evaluation region R5 is set.
- redness due to the blood color is remarkably generated in the portion between the eyebrows, the cheek portion and the chin portion.
- as in FIG. 25 (a), when redness due to blood color is recognized even after the foundation is applied, an unnatural change due to makeup is less likely to be felt; it is considered that the skin color is evened by the foundation and a high feeling of transparency is obtained.
- as in FIG. 25 (b), when redness due to blood color is not recognized after the foundation is applied, the unnatural change caused by makeup becomes large. This unnatural change causes a feeling of thick foundation coating and lowers the feeling of transparency.
- the red component amount calculation unit 54 calculates the amount of the red component in the evaluation region R5 set in the a * component image.
- the amount of the red component serves as an index indicating a decrease in transparency caused by makeup, and the lower the value, the lower the transparency of makeup skin.
- the amount of the red component can be calculated, for example, from the average value of the a * component, the area of the portion where the value of the a * component is 13 or more and 30 or less, or the area ratio of the portion where the value of the a * component is 13 or more and 30 or less.
- the lip portion M also shows a red component, but by limiting the value of the a * component to 13 or more and 30 or less, the red component can be extracted while the lip portion M is excluded.
- the red component amount calculation unit 54 outputs the amount of the red component in the evaluation region R5 to the comprehensive index calculation unit 4 as a makeup evaluation index for evaluating the transparency of the makeup skin.
- the amount of the red component used in this makeup evaluation index, unlike the above-described skin evaluation indexes, is not used when evaluating the transparency of the bare skin; it is an evaluation index peculiar to makeup skin that indicates a decrease in transparency due to makeup, and allows transparency to be evaluated from a viewpoint different from the skin evaluation indexes.
- The spot hue difference calculation unit 55 sets an evaluation region R6 on the cheek portion of the face F of the subject to whom makeup has been applied, with respect to the L* component image input from the color space conversion unit 2, and detects, within the set evaluation region R6, spot portions where the value of the L* component changes locally.
- The spot portions can be detected from a DoG (Difference of Gaussians) image or the like, in the same manner as in the negative factor calculation unit 21.
- When makeup is applied to the face F of the subject, the difference in hue between a spot portion and its surroundings may increase.
- When a foundation is applied to the face F of the subject, the skin color is evened out by the foundation, but where the foundation color overlaps the color of a spot portion, an unnatural color such as gray may be produced locally.
- This unnatural color of the spot portion differs greatly from the surrounding color evened out by the foundation, and this hue difference gives an impression of thickly applied makeup and the like, so that the transparency is felt to be low.
- the spot hue difference calculation unit 55 calculates the hue difference between the detected spot portion and the surrounding area using, for example, a Hue component image.
- This hue difference is an index indicating a decrease in transparency caused by makeup, and the greater the difference, the lower the transparency of makeup skin.
- the spot hue difference calculation unit 55 outputs the hue difference between the spot portion in the evaluation region R6 and the surrounding area to the comprehensive index calculation unit 4 as a makeup evaluation index for evaluating the transparency of the makeup skin.
- The hue difference between the spot portion and its surroundings used in this makeup evaluation index is not ordinarily used when evaluating the transparency of bare skin, unlike the skin evaluation indexes described above; it is an evaluation index peculiar to makeup skin that indicates the decrease in transparency caused by makeup, and allows transparency to be evaluated from a viewpoint different from that of the skin evaluation indexes.
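A minimal sketch of the hue-difference computation, assuming the spot portion is already given as a boolean mask over a Hue component array (the one-pixel ring dilation is an illustrative stand-in for "the surrounding area"; nothing here is prescribed by the patent):

```python
import numpy as np

def dilate(mask, iterations=1):
    """4-neighbour binary dilation implemented with array shifts."""
    m = mask.copy()
    for _ in range(iterations):
        m = (m | np.roll(m, 1, 0) | np.roll(m, -1, 0)
               | np.roll(m, 1, 1) | np.roll(m, -1, 1))
    return m

def spot_hue_difference(hue, spot_mask):
    """Absolute difference between the mean hue inside the spot portion
    and the mean hue of a thin surrounding ring."""
    ring = dilate(spot_mask) & ~spot_mask
    return abs(float(hue[spot_mask].mean()) - float(hue[ring].mean()))
```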
- The unevenness amount calculation unit 56 sets an evaluation region R7 on the eye portion of the face F of the subject to whom makeup has been applied and on the portion extending from the nose to the mouth, with respect to the L* component image input from the color space conversion unit 2, and uneven portions are detected in the set evaluation region R7.
- FIGS. 26(a) and 26(b) show the results of extracting, from the eye portion of makeup skin in which a foundation was applied to the face F of a subject, the low-luminance portion L in which the value of the L* component is equal to or less than a predetermined threshold.
- FIG. 26A shows makeup skin evaluated as having high transparency by sensory evaluation
- FIG. 26B shows makeup skin evaluated as having low transparency by sensory evaluation.
- In FIG. 26(a), the low-luminance portion L is small in the eye portion where the evaluation region R7 is set, whereas in FIG. 26(b) it can be seen that there are many low-luminance portions L.
- The unevenness amount calculation unit 56 calculates the amount of the uneven portions detected in the evaluation region R7.
- the amount of the uneven portions becomes a new evaluation index indicating a decrease in transparency caused by makeup, and the greater the value, the lower the transparency of the makeup skin.
- the amount of the uneven portion can be calculated from the area or area ratio of the uneven portion in the evaluation region R7.
- the unevenness amount calculation unit 56 outputs the amount of the unevenness portion in the evaluation region R7 to the comprehensive index calculation unit 4 as a makeup evaluation index for evaluating the transparency of the makeup skin.
- The phenomenon in which uneven portions become conspicuous after makeup appears prominently when a powdery cosmetic is used, and the makeup evaluation index calculated by the unevenness amount calculation unit 56 is therefore preferably used when a powdery cosmetic is used.
- The amount of uneven portions used in this makeup evaluation index is not ordinarily used when evaluating the transparency of bare skin, unlike the skin evaluation indexes described above; it is an evaluation index peculiar to makeup skin that indicates the decrease in transparency caused by makeup, and allows transparency to be evaluated from a viewpoint different from that of the skin evaluation indexes.
- The medium gloss amount calculation unit 57 sets an evaluation region R8 on the cheekbone portion and the nose ridge portion of the face F of the subject to whom makeup has been applied, with respect to the L* component image input from the color space conversion unit 2, and detects, based on the intensity of the L* component, a medium gloss portion having a medium light reflectance in the set evaluation region R8.
- the medium gloss portion can be obtained, for example, by detecting a portion where the intensity of the L * component shows a value of 60 or more and 70 or less.
- A medium gloss portion, that is, a so-called gloss, is generated in the cheekbone and nose ridge portions of the skin, and this gloss may decrease when makeup is applied.
- When the foundation is applied to the face F of the subject, the skin color is evened out by the foundation, while the gloss decreases or increases.
- The gloss decreases when a low-gloss foundation is applied and increases when a liquid foundation is applied. Due to this decrease or increase in gloss, the medium gloss portion is reduced unnaturally, giving an impression of thickly applied makeup and the like, so that the transparency is felt to be low.
- the middle gloss amount calculation unit 57 calculates the amount of the middle gloss portion detected in the evaluation region R8.
- the amount of the medium gloss portion is an index indicating a decrease in transparency due to makeup, and the lower the value, the lower the transparency of the makeup skin.
- the amount of the middle gloss portion can be calculated from the area or area ratio of the middle gloss portion in the evaluation region R8.
- the medium gloss amount calculation unit 57 outputs the amount of the medium gloss part in the evaluation region R8 to the comprehensive index calculation unit 4 as a makeup evaluation index for evaluating the transparency of the makeup skin.
- The phenomenon in which the medium gloss portion decreases after makeup appears prominently when a liquid cosmetic is used, and the makeup evaluation index calculated by the medium gloss amount calculation unit 57 is therefore preferably used when a liquid cosmetic is used.
- The amount of the medium gloss portion used in this makeup evaluation index is not ordinarily used when evaluating the transparency of bare skin, unlike the skin evaluation indexes described above; it is an evaluation index peculiar to makeup skin that indicates the decrease in transparency caused by makeup, and allows transparency to be evaluated from a viewpoint different from that of the skin evaluation indexes.
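The medium gloss index follows the same band-thresholding pattern as the red-component index, here over the L* component with the 60–70 band quoted in the text (a sketch; the function name and array handling are illustrative):

```python
import numpy as np

def medium_gloss_amount(l_star, lo=60.0, hi=70.0):
    """Pixel area and area ratio of the medium gloss portion in region R8."""
    l = np.asarray(l_star, dtype=float)
    mask = (l >= lo) & (l <= hi)    # medium light reflectance band
    return int(mask.sum()), float(mask.mean())
```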
- The five skin evaluation indexes calculated by the brightness/color calculation unit 9, the negative factor calculation unit 21, the color unevenness calculation unit 31 and the gradation characteristic calculation unit 10 are input to the comprehensive index calculation unit 4.
- The four makeup evaluation indexes calculated by the red component amount calculation unit 54, the spot hue difference calculation unit 55, the unevenness amount calculation unit 56 and the medium gloss amount calculation unit 57 are also input to the comprehensive index calculation unit 4.
- The comprehensive index calculation unit 4 combines, with weightings set in advance, the five input skin evaluation indexes, namely the overall L* component value of the evaluation region R1, the overall C* component value of the evaluation region R1, the generation amount of negative factors in the evaluation region R3, the generation amount of color unevenness in the evaluation region R4 and the gradation characteristic of the L* component of the evaluation region R2, with the four input makeup evaluation indexes, namely the amount of the red component in the evaluation region R5, the hue difference between the spot portion and its surroundings in the evaluation region R6, the amount of uneven portions in the evaluation region R7 and the amount of the medium gloss portion in the evaluation region R8, to calculate the comprehensive index.
- Since the skin evaluation indexes represent the transparency when the face F of the subject to whom makeup has been applied is viewed as a whole, whereas the makeup evaluation indexes represent the decrease in transparency caused by makeup, the comprehensive index can be expressed, for example, by subtracting the four makeup evaluation indexes from the five skin evaluation indexes.
- The calculated comprehensive index value is output from the comprehensive index calculation unit 4 to the transparency evaluation unit 5.
- the transparency evaluation unit 5 evaluates the transparency of the face F of the subject to whom makeup is applied, based on the comprehensive index calculated by the comprehensive index calculation unit 4, and the evaluation result is displayed on the display unit 6.
- Since the comprehensive index combining the skin evaluation indexes and the makeup evaluation indexes objectively evaluates transparency in accordance with the sensation obtained when the face F of the subject to whom makeup has been applied is viewed as a whole, an evaluation result that sufficiently matches the evaluation of transparency by sensory evaluation can be obtained.
- The skin evaluation index calculated by the gradation characteristic calculation unit 10 based on the change in the brightness component is a new index that did not exist before, and by evaluating the transparency of makeup skin based on a comprehensive index including this skin evaluation index, the evaluation can be brought closer to the evaluation of transparency by sensory evaluation.
- The makeup evaluation indexes calculated by the makeup index calculation unit 53 are evaluation indexes peculiar to makeup skin that indicate changes in transparency caused by makeup, and by combining them with the skin evaluation indexes, the transparency of makeup skin can be evaluated with high accuracy.
- FIG. 27 shows a configuration of makeup index calculating unit 61 of the transparency evaluation apparatus according to Embodiment 6.
- This makeup index calculation unit 61 corresponds to the makeup index calculation unit 53 of the transparency evaluation apparatus according to Embodiment 5 in which a makeup non-uniformity calculation unit 62 is arranged in place of the unevenness amount calculation unit 56 and is connected to the color space conversion unit 2 and the comprehensive index calculation unit 4.
- The makeup non-uniformity calculation unit 62 sets an evaluation region R9 on the face F of the subject to whom makeup has been applied in the photographed image. As shown in FIG. 28, the evaluation region R9 is preferably set on the cheek portion extending from below the eye to above the nasolabial fold (the crease extending from the side of the nose to the corner of the mouth).
- Based on the difference between the value of the brightness component derived from the skin and that derived from the makeup, or between the value of the color component derived from the skin and that derived from the makeup, the makeup non-uniformity calculation unit 62 extracts the brightness component or color component derived from the makeup from the evaluation region R9 of the photographed image.
- Any photographed image may be used as long as the brightness component or color component derived from the makeup can be extracted from it.
- In the wavelength region of the G channel, for example 480 nm to 600 nm, the difference between the intensity of the color component derived from the skin and the intensity of the color component derived from the makeup appears strongly, so that by using the G channel image, the color component derived from the makeup can be easily extracted.
- The wavelength region of the G channel is preferably set to around 550 nm, which increases the difference between the intensity of the color component derived from the skin and that derived from the makeup, so that the color component derived from the makeup can be extracted with high accuracy.
- The makeup non-uniformity calculation unit 62 extracts, from the brightness component or color component derived from the makeup, non-uniform portions in which the value of the brightness component or color component does not change uniformly, and calculates the non-uniformity of the makeup in the evaluation region R9.
- the non-uniform part of makeup is caused by, for example, a small lump produced by applying a foundation to the face F of the subject.
- Lumps of foundation, such as foundation solidified in particulate form on the skin surface, foundation that has entered pores, and foundation adhering to the edges of skin grooves, give the impression of appearing white against the surrounding color evened out by the foundation. These non-uniform makeup portions cause an unnatural powdery appearance and an impression of thick coating due to makeup, so that the transparency is felt to be low.
- The makeup non-uniformity calculation unit 62 generates a smoothed image, from which the non-uniform makeup portions have been removed, by smoothing the G channel image, and takes the difference between the G channel image and the smoothed image pixel by pixel to generate a difference image.
- The smoothing is preferably performed so that spatial-frequency components of 0.1 cycle/mm or less are removed so that more of the non-uniform makeup portions are removed; the smoothing can be performed using, for example, a Gaussian filter.
- The difference image is preferably subjected to binarization processing to clarify the non-uniform makeup portions and generate a binarized image. In the binarization processing, for example, a threshold of ±30 can be set for the intensity of the difference image.
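The smoothing/difference/binarization pipeline just described can be sketched as follows. A separable Gaussian blur with edge padding stands in for the patent's smoothing step; the sigma value is an illustrative assumption, while the ±30 threshold comes from the text:

```python
import numpy as np

def gaussian_kernel(sigma):
    radius = int(3 * sigma)
    x = np.arange(-radius, radius + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

def smooth1d(a, k):
    r = len(k) // 2
    padded = np.pad(a, r, mode="edge")   # edge padding avoids border artifacts
    return np.convolve(padded, k, mode="valid")

def smooth(img, sigma=5.0):
    k = gaussian_kernel(sigma)
    tmp = np.apply_along_axis(smooth1d, 1, img, k)   # blur rows
    return np.apply_along_axis(smooth1d, 0, tmp, k)  # then columns

def makeup_nonuniformity(g_channel, thresh=30.0, sigma=5.0):
    """Area ratio of non-uniform makeup portions P in region R9.

    Difference image = G channel minus its smoothed version; pixels whose
    absolute difference reaches the threshold (cf. the +/-30 mentioned
    in the text) are binarized as non-uniform.
    """
    g = np.asarray(g_channel, dtype=float)
    diff = g - smooth(g, sigma)
    binary = np.abs(diff) >= thresh
    return float(binary.mean())
```

The total area or the number of connected non-uniform portions could be reported instead of the area ratio, matching the alternatives listed for the evaluation region R9.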
- FIGS. 29(a) and 29(b) show binarized images obtained by generating difference images for the cheek portions of makeup skin in which foundations of different materials were applied to the face F of the same subject, extracting the non-uniform makeup portions P, and applying binarization processing to the difference images to emphasize the non-uniform portions P.
- FIG. 29 (a) shows a case where a foundation that hardly causes lumps is applied
- FIG. 29 (b) shows a case where a foundation that tends to cause lumps is applied.
- the non-uniform portion P of the makeup is displayed as a black dot.
- the makeup non-uniformity calculation unit 62 calculates the makeup non-uniformity from the binarized image from which the makeup non-uniformity portion P is extracted.
- the nonuniformity of makeup can be calculated from, for example, the total area, area ratio, and number of nonuniform portions in the evaluation region R9. Then, the makeup non-uniformity calculation unit 62 outputs the makeup non-uniformity in the evaluation region R9 to the comprehensive index calculation unit 4 as a makeup evaluation index.
- The non-uniformity of makeup used for this makeup evaluation index is an evaluation index peculiar to makeup skin that indicates the decrease in transparency caused by makeup, and allows transparency to be evaluated from a viewpoint different from that of the skin evaluation indexes.
- Since the comprehensive index combining the skin evaluation indexes and the makeup evaluation indexes objectively evaluates transparency in accordance with the sensation obtained when the face F of the subject to whom makeup has been applied is viewed as a whole, an evaluation result that sufficiently matches the evaluation of transparency by sensory evaluation can be obtained. Further, the uneven portions detected by the unevenness amount calculation unit 56 of Embodiment 5 are caused by the wrinkle portions around the eyes, but such wrinkle portions are not clearly present in all people; even if the same makeup is applied, uneven portions may be detected for some subjects and not for others, and the transparency evaluation results may therefore vary.
- In contrast, the non-uniform makeup portions P are caused by the state of the pores and skin grooves present in the skin of a wide range of subjects, by the makeup material, and the like. For this reason, the transparency of makeup skin can be evaluated more reliably for a wide range of subjects based on the makeup non-uniformity.
- In Embodiment 6, the makeup non-uniformity calculation unit 62 is arranged in place of the unevenness amount calculation unit 56 of Embodiment 5; however, the makeup index calculation unit 61 may instead be configured by newly adding the makeup non-uniformity calculation unit 62 to the makeup index calculation unit 53 of Embodiment 5.
- The evaluation of the transparency of makeup skin as in Embodiments 5 and 6 above can also be performed by causing a computer comprising input means, a CPU, a memory and the like to function by means of a transparency evaluation program recorded on a recording medium.
- That is, the transparency evaluation program recorded on the recording medium causes the computer to function so that the image input unit acquires a photographed image of the face of the subject to whom makeup has been applied, and, based on the acquired photographed image, the CPU executes the preprocessing unit 1, the color space conversion unit 2, the index calculation unit 51, the comprehensive index calculation unit 4 and the transparency evaluation unit 5 to evaluate the transparency of the face of the subject to whom makeup has been applied.
- In Embodiment 5 described above, the skin index calculation unit 52 calculated five skin evaluation indexes comprising the representative value of the brightness component, the representative value of the color component, the generation amount of negative factors, the generation amount of color unevenness and the gradation characteristic, while the makeup index calculation unit 53 calculated four makeup evaluation indexes comprising the amount of the red component, the hue difference between the spot portion and its surroundings, the amount of uneven portions and the amount of the medium gloss portion; however, the invention is not limited to this. Likewise, in Embodiment 6 described above, the skin index calculation unit 52 calculated the same five skin evaluation indexes, while the makeup index calculation unit 61 calculated four makeup evaluation indexes comprising the amount of the red component, the hue difference between the spot portion and its surroundings, the non-uniformity of makeup and the amount of the medium gloss portion; the invention is not limited to this either.
- It suffices that the skin index calculation unit 52 calculates at least one of the representative value of the brightness component, the representative value of the color component and the generation amount of negative factors as a skin evaluation index, and that the makeup index calculation unit calculates the amount of the red component as a makeup evaluation index; by calculating the comprehensive index combining the obtained skin evaluation index and makeup evaluation index, the transparency of makeup skin can be accurately evaluated.
- FIG. 31(a) shows, for the faces F of 13 subjects to whom a liquid foundation was applied, the comprehensive index calculated by combining five skin evaluation indexes and three makeup evaluation indexes, plotted against the sensory evaluation value obtained from sensory evaluation of transparency.
- The comprehensive index was calculated by combining the five skin evaluation indexes (representative value L of the brightness component, representative value C of the color component, generation amount F of negative factors, generation amount I of color unevenness and gradation characteristic K) with the three makeup evaluation indexes excluding the amount of uneven portions (amount D of the red component, hue difference H between the spot portion and its surroundings, and amount B of the medium gloss portion).
- Specifically, comprehensive index = A + 10.5 × L − 2.6 × C − 28.1 × F − 0.9 × I − 1.0 × K − (19.8 × D − 4.3 × H + 1.5 × B), where A is a constant.
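The comprehensive index here is a plain linear combination, subtracting the makeup term from the skin term. As a sketch (the constant A is unspecified in the text and defaults to 0 here; the function name is illustrative):

```python
def overall_index_liquid(L, C, F, I, K, D, H, B, A=0.0):
    """Comprehensive index for liquid-foundation makeup skin.

    Skin evaluation indexes: L (brightness), C (color), F (negative
    factors), I (color unevenness), K (gradation characteristic).
    Makeup evaluation indexes: D (red component), H (spot hue
    difference), B (medium gloss amount). Coefficients as quoted in
    the text; A is an unspecified constant.
    """
    skin_term = 10.5 * L - 2.6 * C - 28.1 * F - 0.9 * I - 1.0 * K
    makeup_term = 19.8 * D - 4.3 * H + 1.5 * B
    return A + skin_term - makeup_term
```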
- FIG. 31(b) shows, for the faces F of nine subjects to whom the liquid foundation was applied, the comprehensive index calculated by combining the five skin evaluation indexes with one another, plotted against the sensory evaluation value obtained from sensory evaluation of transparency.
- This comprehensive index was calculated by combining the five skin evaluation indexes comprising the representative value of the brightness component, the representative value of the color component, the generation amount of negative factors, the generation amount of color unevenness and the gradation characteristic.
- The sensory evaluation value is a value obtained by evaluating transparency on a 30-step scale by sensory evaluation, with values closer to 30 indicating less transparency.
- FIG. 32(a) shows, for the faces F of 15 subjects to whom a powdery foundation was applied, the comprehensive index calculated by combining five skin evaluation indexes and three makeup evaluation indexes, plotted against the sensory evaluation value obtained from sensory evaluation of transparency.
- The comprehensive index was calculated by combining the five skin evaluation indexes (representative value L of the brightness component, representative value C of the color component, generation amount F of negative factors, generation amount I of color unevenness and gradation characteristic K) with the three makeup evaluation indexes excluding the amount of the medium gloss portion (amount D of the red component, hue difference H between the spot portion and its surroundings, and amount W of uneven portions).
- Specifically, comprehensive index = A + 3.79 × L − 1.67 × C + 0.25 × F − 0.18 × I − 1.9 × K − (4.9 × D − 4.1 × H − 0.8 × W), where A is a constant.
- FIG. 32(b) shows, for the faces F of the 11 subjects to whom the powdery foundation was applied, the comprehensive index calculated by combining the five skin evaluation indexes with one another, plotted against the sensory evaluation value obtained from sensory evaluation of transparency.
- This comprehensive index was calculated by combining the five skin evaluation indexes comprising the representative value of the brightness component, the representative value of the color component, the generation amount of negative factors, the generation amount of color unevenness and the gradation characteristic.
- The sensory evaluation value is a value obtained by evaluating transparency on a 30-step scale by sensory evaluation, with values closer to 30 indicating less transparency.
- For FIG. 32(a), the correlation between the comprehensive index and the sensory evaluation values gave a correlation coefficient R² of 0.70.
- For FIG. 32(b), the correlation between the comprehensive index and the sensory evaluation values gave a correlation coefficient R² of 0.33. It can therefore be seen that the transparency of makeup skin cannot be accurately evaluated from the skin evaluation indexes alone, whereas by evaluating the transparency of makeup skin using the skin evaluation indexes in combination with the makeup evaluation indexes, an evaluation result that sufficiently matches the sensory evaluation can be obtained and the transparency of makeup skin can be accurately evaluated.
- An example is shown below in which the transparency of makeup skin was evaluated using the transparency evaluation apparatus according to Embodiment 6 shown in FIG. 27, and the result was compared with the result of evaluating the transparency of the same makeup skin using the transparency evaluation apparatus according to Embodiment 5.
- FIGS. 33(a) and 33(b) show, for the faces F of subjects with clear wrinkle portions around the eyes to which a foundation was applied, the comprehensive indexes calculated by combining the skin evaluation indexes and the makeup evaluation indexes using the transparency evaluation apparatuses of Embodiments 5 and 6, plotted against the sensory evaluation values obtained from sensory evaluation of transparency.
- FIGS. 34(a) and 34(b) show, for the faces F of subjects whose wrinkle portions are unclear to which the same foundation as in FIGS. 33(a) and 33(b) was applied, the comprehensive indexes calculated by combining the skin evaluation indexes and the makeup evaluation indexes using the transparency evaluation apparatuses of Embodiments 5 and 6, plotted against the sensory evaluation values obtained from sensory evaluation of transparency.
- FIGS. 33(a) and 34(a) show the results obtained using the transparency evaluation apparatus of Embodiment 5, which combines the five skin evaluation indexes (representative value L of the brightness component, representative value C of the color component, generation amount of negative factors, generation amount of color unevenness and gradation characteristic) with makeup evaluation indexes including the amount of uneven portions.
- FIGS. 33(b) and 34(b) show the results obtained using the transparency evaluation apparatus of Embodiment 6, which combines the same five skin evaluation indexes with makeup evaluation indexes including the non-uniformity of makeup.
- For the correlations between the comprehensive indexes and the sensory evaluation values shown in FIGS. 33(a) and 33(b), the correlation coefficient R² of FIG. 33(a) was 0.83 and that of FIG. 33(b) was 0.85.
- Meanwhile, the correlation coefficient R² of FIG. 34(a) was 0.12, and that of FIG. 34(b) was 0.80.
- In the above examples, the evaluation of transparency was performed on makeup skin to which a foundation was applied; however, transparency can be evaluated in the same manner for skin to which cosmetics containing skin-correcting agents such as masking agents (such as TiO2) and pigments, for example BB cream, CC cream, makeup base and sunscreen, have been applied.
- SYMBOLS 1 Pre-processing part, 1a Image input part, 2 Color space conversion part, 3,51 Index calculation part, 4 Comprehensive index calculation part, 5 Transparency evaluation part, 6 Display part, 7 Control part, 8 Operation part, 9 Brightness Color calculation unit, 10 gradation characteristic calculation unit, 21, 22 negative factor calculation unit, 31, 41 color unevenness calculation unit, 52 skin index calculation unit, 53, 61 makeup index calculation unit, 54 red component amount calculation unit, 55 Spot hue difference calculation unit, 56 Concavity and convexity calculation unit, 57 Medium gloss amount calculation unit, 62 Makeup nonuniformity calculation unit, F face, C camera, R1, R2, R5 to R9 evaluation area, M contour line, G L * component Contour line distribution image, S1, S2 Contour line interval, D red component, L low luminance part, P non-uniform part.
Description
However, these measurement methods measure local physical quantities of the skin. The physical quantities obtained by such measurement therefore do not directly represent transparency, which is the sensation obtained when the skin is viewed as a whole, and it has been difficult to evaluate transparency with high accuracy on the basis of these physical quantities.
In addition, the above evaluation methods, such as that of Patent Document 1, evaluate the transparency of bare skin; when the transparency of skin to which makeup has been applied (makeup skin) is evaluated in the same way, the evaluation results deviate from those of sensory evaluation. For example, when a foundation is applied to a subject's face, the foundation evens out the skin color as a whole, while unnatural changes relative to bare skin may occur, such as a loss of blood color in parts of the face such as the cheeks. Such unnatural changes caused by makeup give an impression of thickly applied makeup and become a factor that lowers the transparency of the makeup skin. Because these unnatural changes peculiar to makeup skin are not reflected by transparency evaluation methods specialized for bare skin, such as that of Patent Document 1, it has been difficult to evaluate the transparency of makeup skin with high accuracy.
Another object of this invention is to provide a transparency evaluation apparatus, a transparency evaluation method and a transparency evaluation program capable of evaluating the transparency of makeup skin with high accuracy.
The skin index calculation unit may also calculate the second skin evaluation index based on the number of contour lines partitioning the photographed image.
The skin index calculation unit can also calculate the number, total area or area ratio of negative factors detected from the photographed image as the generation amount of negative factors.
The skin index calculation unit can also calculate the total area, area ratio or number of color unevenness portions detected from the photographed image as the generation amount of color unevenness.
The makeup index calculation unit preferably sets a predetermined evaluation region on at least one of the portion between the eyebrows, the cheek portion and the chin portion of the subject's face, and calculates the amount of the red component in the predetermined evaluation region.
The makeup index calculation unit preferably sets a predetermined region on at least one of the eye portion of the subject's face and the portion extending from the nose to the mouth, and calculates the amount of uneven portions in the predetermined region.
The makeup index calculation unit preferably sets a predetermined region on the cheek portion of the subject and calculates the non-uniformity of makeup in the predetermined region.
The makeup index calculation unit preferably sets a predetermined region on at least one of the cheekbone portion and the nose ridge portion of the subject's face and calculates the amount of the gloss portion in the predetermined region.
According to this invention, since the first skin evaluation index is calculated and the amount of the red component in the photographed image is calculated as the first makeup evaluation index to evaluate the transparency of the face of the subject to whom makeup has been applied, the transparency of makeup skin can be evaluated with high accuracy.
Embodiment 1
FIG. 1 shows the configuration of a transparency evaluation apparatus that performs the transparency evaluation method according to Embodiment 1 of this invention. The transparency evaluation apparatus evaluates the transparency of a subject's face F using a photographed image of the face F taken with a camera C; it comprises an image input unit (not shown) connected to the camera C, and a preprocessing unit 1, a color space conversion unit 2, an index calculation unit 3, a comprehensive index calculation unit 4, a transparency evaluation unit 5 and a display unit 6 are connected in sequence to this image input unit. A control unit 7 is connected to the color space conversion unit 2, the index calculation unit 3, the comprehensive index calculation unit 4 and the transparency evaluation unit 5, and an operation unit 8 is connected to the control unit 7.
The color space conversion unit 2 converts the color space of the photographed image input from the preprocessing unit 1 to generate a color space converted image. As the color space converted image, for example, an image converted into the L*a*b* color space, the LCH color space or the YCC color space can be used. When converting into the L*a*b* color space, a D65 light source can be used as the computation light source. The color space conversion unit 2 then separates the generated color space converted image into a brightness component (luminance component) and color components to generate a brightness component image and color component images. Specifically, for a color space converted image in the L*a*b* color space, the brightness component is the L* component, and the color components are the a* component (complementary color component corresponding to red and green), the b* component (complementary color component corresponding to yellow and blue), the C* component (chroma component), the Hue component (hue component) and the like.
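The L*a*b* conversion described here is a standard colorimetric transform. A self-contained sketch for a single sRGB pixel under a D65 white point follows; the patent does not prescribe this exact pipeline, and the sRGB input assumption is illustrative (only the D65 computation light source is stated above):

```python
def srgb_to_lab(r, g, b):
    """Convert one 8-bit sRGB pixel to (L*, a*, b*) under D65."""
    def linearize(u):
        u = u / 255.0
        return u / 12.92 if u <= 0.04045 else ((u + 0.055) / 1.055) ** 2.4

    rl, gl, bl = linearize(float(r)), linearize(float(g)), linearize(float(b))
    # linear RGB -> CIE XYZ (sRGB primaries, D65 white)
    x = 0.4124 * rl + 0.3576 * gl + 0.1805 * bl
    y = 0.2126 * rl + 0.7152 * gl + 0.0722 * bl
    z = 0.0193 * rl + 0.1192 * gl + 0.9505 * bl
    # XYZ -> L*a*b* with the D65 reference white
    xn, yn, zn = 0.95047, 1.0, 1.08883

    def f(t):
        if t > (6.0 / 29.0) ** 3:
            return t ** (1.0 / 3.0)
        return t / (3 * (6.0 / 29.0) ** 2) + 4.0 / 29.0

    fx, fy, fz = f(x / xn), f(y / yn), f(z / zn)
    return 116.0 * fy - 16.0, 500.0 * (fx - fy), 200.0 * (fy - fz)
```

Applying this per pixel yields the L* component image and the a*/b* images, from which the C* component (chroma, sqrt(a*² + b*²)) and the Hue component can in turn be derived.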
The brightness/color calculation unit 9 sets an evaluation region R1 on the subject's face F in the brightness component image and the color component images generated by the color space conversion unit 2. The evaluation region R1 can be set, for example, on the entire face F or on the cheek portion. The brightness/color calculation unit 9 calculates the overall value of the brightness component in the evaluation region R1 set in the brightness component image, that is, a representative value of the brightness component in the evaluation region R1. Likewise, it calculates the overall value of the color component in the evaluation region R1 set in the color component image, that is, a representative value of the color component in the evaluation region R1. The overall brightness component value and color component value in the evaluation region R1 can be calculated, for example, from the average value of the brightness component and the average value of the color component in the evaluation region R1, respectively.
The brightness/color calculation unit 9 outputs the overall brightness component value and the overall color component value in the evaluation region R1 to the comprehensive index calculation unit 4 as first skin evaluation indexes, and the gradation characteristic calculation unit 10 outputs the smoothness of the change of the brightness component over the evaluation region R2 to the comprehensive index calculation unit 4 as a second skin evaluation index.
The transparency evaluation unit 5 evaluates the transparency of the subject's face F based on the comprehensive index calculated by the comprehensive index calculation unit 4.
The operation unit 8 is used by the operator to input information, and can be formed from a keyboard, a mouse, a trackball, a touch panel or the like.
The control unit 7 controls each unit in the transparency evaluation apparatus based on various command signals input by the operator from the operation unit 8.
A database storing the relationship between the comprehensive index and sensory evaluation values calculated by carrying out sensory evaluation of the transparency of bare skin in advance can also be connected to the transparency evaluation unit 5. In that case, the transparency evaluation unit 5 can evaluate the transparency of bare skin by comparing the comprehensive index input from the comprehensive index calculation unit 4 against the relationship between the sensory evaluation values and the comprehensive index read from the database.
First, as shown in FIG. 1, a photographed image obtained by photographing the subject's face F with the camera C is input from the camera C to the preprocessing unit 1 of the transparency evaluation apparatus via the image input unit (not shown). After preprocessing such as light source correction and noise removal, the photographed image is output from the preprocessing unit 1 to the color space conversion unit 2, and its color space is converted by the color space conversion unit 2 into, for example, the L*a*b* color space to generate a color space converted image. The color space conversion unit 2 then extracts the brightness component and the color components from the color space converted image and generates a brightness component image and color component images; for example, an L* component image can be generated as the brightness component image and a C* component image as the color component image. The generated L* component image and C* component image are output from the color space conversion unit 2 to the brightness/color calculation unit 9, and the L* component image is also output from the color space conversion unit 2 to the gradation characteristic calculation unit 10.
Next, the brightness/color calculation unit 9 obtains the average intensity of the L* component in the evaluation region R1 set in the L* component image, and the average intensity of the C* component in the evaluation region R1 set in the C* component image. In this way, the overall L* component value and the overall C* component value can be obtained for the evaluation region R1 set on the subject's face F.
Specifically, as shown in FIGS. 3(a) and 3(b), the gradation characteristic calculation unit 10 sets a plurality of contour lines M in steps of a constant intensity interval on the subject's face F in the L* component image, and generates an L* component contour line distribution image G in which the subject's face F is partitioned according to the value of the L* component by the plurality of contour lines M. Here, FIG. 3(a) is the L* component contour line distribution image G of a subject with high transparency, and FIG. 3(b) is the L* component contour line distribution image G of a subject with low transparency. In the L* component contour line distribution image G, a region enclosed by two adjacent contour lines is represented as having a single intensity. By partitioning the subject's face F with the plurality of contour lines in this way, the distribution of the L* component over the subject's face F can be represented by the positions of the plurality of contour lines.
The plurality of contour lines M partitioning the subject's face F are preferably set so that the intensity interval is about 1/10 of the intensity range of the L* component image, or in steps of 3 to 5 digits.
The evaluation region R2 can also be set to a predetermined region of the L* component contour line distribution image G via the control unit 7 by the operator operating the operation unit 8.
The gradation characteristic of the L* component obtained in this way is output from the gradation characteristic calculation unit 10 to the comprehensive index calculation unit 4 as a second skin evaluation index.
The transparency evaluation unit 5 evaluates the transparency of the subject's face F based on the comprehensive index calculated by the comprehensive index calculation unit 4, and the evaluation result is displayed on the display unit 6.
In this example, transparency was evaluated for eight subjects in their forties by the transparency evaluation method of Embodiment 1, and a sensory evaluation of transparency was also performed. In the evaluation method of Embodiment 1, for each subject, the mean L* component value in the evaluation region R1, the mean C* component value in the evaluation region R1, and the uniformity of the intervals between the plurality of contour lines partitioning the evaluation region R2 were calculated, and these values were combined to obtain the total index for transparency. The uniformity of the intervals between the contour lines M was obtained by measuring the intervals in terms of image pixel counts and calculating the dispersion of those pixel counts; the more uniform the intervals, the smaller the value.
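The uniformity index just described can be sketched as the standard deviation of the contour intervals measured in pixels along a line across the region, so that evenly spaced contours give a small value. Measuring intervals along a single 1-D line is an illustrative simplification.

```python
import statistics

# Interval uniformity K: dispersion of the pixel gaps between successive
# contour positions; smaller means more uniform gradation.
def interval_uniformity(contour_positions):
    gaps = [b - a for a, b in zip(contour_positions, contour_positions[1:])]
    return statistics.pstdev(gaps)

uniform = interval_uniformity([0, 10, 20, 30])  # equal gaps of 10 px
uneven  = interval_uniformity([0, 5, 20, 30])   # gaps 5, 15, 10 px
```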
The correlation between the mean L* component value and the sensory evaluation value was obtained based on FIG. 5, giving a squared correlation coefficient R² of 0.42. Similarly, the correlation between the mean C* component value and the sensory evaluation value obtained based on FIG. 6 gave R² = 0.0032, and the correlation between the contour interval uniformity and the sensory evaluation value obtained based on FIG. 7 gave R² = 0.32. A multiple regression analysis of the mean L* value L, the mean C* value C, and the contour interval uniformity K over FIGS. 5 to 7 then yielded the regression equation: sensory evaluation value S (total index value) = 177.1 - 410.9 x L + 83.5 x C - 33.7 x K.
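The regression step above can be sketched by ordinary least squares: stack an intercept column with the three index columns and solve for the coefficients, yielding an equation of the same form as S = 177.1 - 410.9 x L + 83.5 x C - 33.7 x K. Synthetic noiseless data stands in for the eight subjects' measurements, which are not reproduced in the text.

```python
import numpy as np

rng = np.random.default_rng(0)
X = rng.random((8, 3))                        # columns: L, C, K for 8 subjects
true_coef = np.array([177.1, -410.9, 83.5, -33.7])
A = np.column_stack([np.ones(8), X])          # prepend intercept column
S = A @ true_coef                             # synthetic sensory scores
coef, *_ = np.linalg.lstsq(A, S, rcond=None)  # recover intercept and weights
```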
The transparency evaluation device can also receive the captured image at the preprocessing unit 1 over a network. For example, the device may be connected via a network to a computer storing captured images, evaluate the skin transparency based on an image received from that computer, and store the evaluation result on a server. A user can then view the transparency evaluation result by accessing the server, or obtain it from the server over the network.
FIG. 9 shows the configuration of a transparency evaluation device that performs the transparency evaluation method according to Embodiment 2. This device replaces the brightness/color calculation unit 9 of the device of Embodiment 1 shown in FIG. 1 with a minus factor calculation unit 21, arranged in the index calculation unit 3 and connected to the color space conversion unit 2 and the total index calculation unit 4, respectively.
The minus factors can also be detected without generating a DoG (Difference of Gaussians) image, for example by extracting components whose intensity is at or below a predetermined threshold from the L* component image and applying principal component analysis, independent component analysis, or the like to the extracted components.
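For reference, the DoG filtering mentioned above subtracts a wide Gaussian blur from a narrow one, leaving only features of an intermediate spatial scale such as spots and pores. The 1-D signal and the kernel widths below are illustrative assumptions for brevity.

```python
import math

def gaussian_kernel(sigma, radius):
    k = [math.exp(-(i * i) / (2 * sigma * sigma)) for i in range(-radius, radius + 1)]
    s = sum(k)
    return [v / s for v in k]

def blur(signal, sigma, radius=6):
    """Gaussian blur with edge clamping."""
    k = gaussian_kernel(sigma, radius)
    out = []
    for i in range(len(signal)):
        acc = 0.0
        for j, w in enumerate(k):
            idx = min(max(i + j - radius, 0), len(signal) - 1)
            acc += w * signal[idx]
        out.append(acc)
    return out

# A narrow feature on a flat background: the DoG response peaks at it.
signal = [0.0] * 20 + [1.0] * 3 + [0.0] * 20
dog = [a - b for a, b in zip(blur(signal, 1.0), blur(signal, 3.0))]
```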
The amount of minus factors in the evaluation region R3 thus calculated is output from the minus factor calculation unit 21 to the total index calculation unit 4 as the first skin evaluation index.
The transparency evaluation unit 5 then evaluates the transparency of the subject's face F based on the total index calculated by the total index calculation unit 4, and the evaluation result is displayed on the display unit 6.
In the same manner as above, the minus factor calculation unit 22 calculates the number, total area, and area ratio of the minus factors as first skin evaluation indices and outputs them to the total index calculation unit 4.
The transparency evaluation unit 5 then evaluates the transparency of the subject's face F based on the total index calculated by the total index calculation unit 4, and the evaluation result is displayed on the display unit 6. Evaluating transparency based on a larger number of indices with mutually different properties in this way allows transparency to be evaluated with high accuracy.
In the transparency evaluation devices according to Embodiments 1 and 2, a color unevenness calculation unit that detects color unevenness on the subject's face F and calculates its amount can additionally be provided in the index calculation unit 3.
The color unevenness calculation unit 31 sets an evaluation region R4 on the subject's face F in the brightness component image or the color component image generated by the color space conversion unit 2, and detects, as color unevenness, portions within the evaluation region R4 where the brightness component value or the color component value changes locally and which are larger than the minus factors. The evaluation region R4 can be set over the entire face F, on the cheek, or the like. The color unevenness calculation unit 31 then obtains the amount of color unevenness in the evaluation region R4 from the detection result.
The transparency evaluation unit 5 then evaluates the transparency of the subject's face F based on the total index calculated by the total index calculation unit 4, and the evaluation result is displayed on the display unit 6.
In this example, transparency was evaluated for eight subjects in their forties by the transparency evaluation method of Embodiment 3, together with a sensory evaluation of transparency. In the evaluation method of Embodiment 3, for each subject, the total area of spots in the evaluation region R3, the total area of pores in R3, the uniformity of the intervals between the contour lines partitioning R2, and the total area of color unevenness in R4 were calculated, and these values were combined to obtain the total index for transparency. The total areas of spots, pores, and color unevenness were calculated from the numbers of pixels constituting them.
The correlation between the total spot area and the sensory evaluation value obtained based on FIG. 12 gave R² = 0.57; the correlation between the total pore area and the sensory evaluation value obtained based on FIG. 13 gave R² = 0.58; and the correlation between the total color unevenness area and the sensory evaluation value obtained based on FIG. 14 gave R² = 0.49. As noted above, R² between the contour interval uniformity obtained based on FIG. 7 and the sensory evaluation value is 0.32.
A multiple regression analysis over FIGS. 12 to 14 and FIG. 7 of the total spot area F, the total pore area P, the total color unevenness area I, and the contour interval uniformity K then yielded the regression equation: sensory evaluation value S (total index value) = -79.67 - 112.26 x F + 153.72 x P + 134.45 x I + 41.98 x K.
As shown in FIG. 16, the transparency evaluation device according to Embodiment 3 shown in FIG. 10 can also be configured by newly providing a color unevenness calculation unit 41 in the index calculation unit 3 and connecting it to the color space conversion unit 2 and the total index calculation unit 4, respectively. That is, the index calculation unit 3 includes all of the brightness/color calculation unit, the gradation characteristic calculation unit, the minus factor calculation unit, and the color unevenness calculation unit shown in Embodiments 1 to 3.
In the same manner as above, the color unevenness calculation unit 41 calculates the amount of color unevenness in the evaluation region R4 as a third skin evaluation index and outputs it to the total index calculation unit 4.
In this example, transparency was evaluated for eight subjects in their forties by the transparency evaluation method of Embodiment 4, together with a sensory evaluation of transparency. In the evaluation method of Embodiment 4, for each subject, the mean L* value in R1, the mean C* value in R1, the uniformity of the contour line intervals partitioning R2, the total spot area in R3, the total pore area in R3, and the total color unevenness area in R4 were calculated, and these values were combined to obtain the total index for transparency.
The evaluation of bare skin transparency as in Embodiments 1 to 4 above can be executed by causing a computer comprising input means, a CPU, memory, and the like to function according to a transparency evaluation program recorded on a recording medium. That is, when the recorded transparency evaluation program causes the computer to function, the image input unit acquires a captured image of the subject's bare skin, and based on the acquired image the CPU executes the preprocessing unit 1, the color space conversion unit 2, the index calculation unit 3, the total index calculation unit 4, and the transparency evaluation unit 5 to evaluate the transparency of the subject's bare skin.
Although Embodiments 1 to 4 evaluated the transparency of the subject's bare skin, the transparency of the subject's face F with makeup applied (makeup skin) can also be evaluated.
FIG. 19 shows the configuration of the transparency evaluation device according to Embodiment 5. This device replaces the index calculation unit 3 of the device of Embodiment 1 shown in FIG. 1 with an index calculation unit 51. Specifically, the device evaluates transparency using a captured image of the subject's made-up face F taken with the camera C, and comprises an image input unit 1a connected to the camera C, to which the preprocessing unit 1, the color space conversion unit 2, the index calculation unit 51, the total index calculation unit 4, the transparency evaluation unit 5, and the display unit 6 are connected in sequence. The control unit 7 is connected to the color space conversion unit 2, the index calculation unit 51, the total index calculation unit 4, and the transparency evaluation unit 5, and the operation unit 8 is connected to the control unit 7.
The preprocessing unit 1 applies preprocessing such as light amount correction and noise removal to the captured image input from the image input unit 1a.
The color space conversion unit 2 converts the color space of the captured image input from the preprocessing unit 1 to generate a color space converted image.
The index calculation unit 51 has a skin index calculation unit 52 and a makeup index calculation unit 53, each connected to the color space conversion unit 2. The skin index calculation unit 52 and the makeup index calculation unit 53 receive the color space converted image from the color space conversion unit 2 and, based on it, calculate a skin evaluation index and a makeup evaluation index, respectively, for evaluating the transparency of makeup skin.
The skin index calculation unit 52 and the makeup index calculation unit 53 output the calculated skin evaluation index and makeup evaluation index to the total index calculation unit 4.
The transparency evaluation unit 5 evaluates the transparency of the subject's made-up face F based on the total index calculated by the total index calculation unit 4.
The operation unit 8 allows the operator to input information and can be formed of a keyboard, a mouse, a trackball, a touch panel, or the like.
The control unit 7 controls each unit in the transparency evaluation device based on the various command signals input by the operator from the operation unit 8.
As shown in FIG. 20, the skin index calculation unit 52 has a brightness/color calculation unit 9, a minus factor calculation unit 21, a color unevenness calculation unit 31, and a gradation characteristic calculation unit 10, each connected to the color space conversion unit 2 and the total index calculation unit 4.
As shown in FIG. 21, the makeup index calculation unit 53 has a red component amount calculation unit 54, a spot hue difference calculation unit 55, an unevenness amount calculation unit 56, and a mid-gloss amount calculation unit 57, each connected to the color space conversion unit 2 and the total index calculation unit 4.
Here, the red component represents the blood-derived redness that appears prominently in parts of the subject's face F such as the glabella, cheek, and chin portions, and can be obtained, for example, by extracting the a* component of the color component image. To extract blood-derived redness more reliably, the red component is more preferably limited to components with positive a* values, and to suppress false detection of inherently reddish features such as the lips, it is still more preferably limited to components with a* values of 13 to 30.
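The thresholding just described can be sketched as a simple mask over the a* component image; the 13 to 30 band is the range the text recommends for suppressing false positives such as the lips.

```python
# Mark pixels whose a* value falls in the blood-redness band [lo, hi].
def red_component_mask(a_image, lo=13, hi=30):
    return [[lo <= v <= hi for v in row] for row in a_image]

mask = red_component_mask([[5.0, 15.0], [35.0, 28.0]])
```

Counting the True pixels inside the evaluation region R5 (or taking their mean or area ratio) gives the red component amount used as the makeup evaluation index.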
The red component amount calculation unit 54 outputs the amount of the red component in the evaluation region R5 to the total index calculation unit 4 as a makeup evaluation index.
The spot hue difference calculation unit 55 outputs the hue difference between a spot portion and its surroundings in the evaluation region R6 to the total index calculation unit 4 as a makeup evaluation index.
It is also preferable to further extract, from the detected low-luminance portions, those indicating wrinkles or pores based on size and shape to detect unevenness portions. The evaluation region R7 is preferably set to at least one of, for example, the eye area and the region extending from the nose to the mouth. The unevenness amount calculation unit 56 then calculates the amount of unevenness portions detected in the evaluation region R7, which can be obtained from the area (pixel count) or area ratio of the unevenness portions in R7.
The unevenness amount calculation unit 56 outputs the amount of unevenness portions in the evaluation region R7 to the total index calculation unit 4 as a makeup evaluation index.
The mid-gloss amount calculation unit 57 then calculates the amount of mid-gloss portions detected in the evaluation region R8, which can be obtained from the area (pixel count) or area ratio of the mid-gloss portions in R8.
The mid-gloss amount calculation unit 57 outputs the amount of mid-gloss portions in the evaluation region R8 to the total index calculation unit 4 as a makeup evaluation index.
First, as in Embodiments 1 to 4, a captured image of the subject's made-up face F taken with the camera C is input, as shown in FIG. 19, from the camera C to the preprocessing unit 1 via the image input unit 1a of the transparency evaluation device. After preprocessing in the preprocessing unit 1, a color space converted image is generated in the color space conversion unit 2. The color space conversion unit 2 extracts the brightness component and the color components from the color space converted image to generate a brightness component image and color component images, respectively. For example, an L* component image can be generated as the brightness component image, and C* component, a* component, and Hue component images as the color component images.
The amount of minus factors in the evaluation region R3 thus calculated is output from the minus factor calculation unit 21 to the total index calculation unit 4 as a skin evaluation index.
Specifically, as shown in FIGS. 22(a) and 22(b), the gradation characteristic calculation unit 10 sets a plurality of contour lines M on the subject's face F in the L* component image, stepped at a constant intensity interval, and generates an L* component contour distribution image G partitioning the face F according to the L* component values. Here, FIG. 22(a) is the L* component contour distribution image G of a subject whose makeup skin has high transparency, and FIG. 22(b) is that of a subject whose makeup skin has low transparency. Partitioning the made-up face F by the plurality of contour lines in this way allows the L* component distribution on the made-up face F to be represented by the positions of the contour lines.
The plurality of contour lines M partitioning the subject's face F are preferably set at an intensity interval of about 1/10 of the intensity range of the L* component image, or in steps of 3 to 5 digits.
The gradation characteristic of the L* component can also be calculated based on the number of contour lines M partitioning the evaluation region R2.
The gradation characteristic of the L* component thus obtained is output from the gradation characteristic calculation unit 10 to the total index calculation unit 4 as a skin evaluation index.
Here, for makeup skin with foundation applied to the subject's face F, FIG. 25(a) shows the a* component image of makeup skin rated high in transparency by sensory evaluation, and FIG. 25(b) shows that of makeup skin rated low in transparency. In FIG. 25(a), the red component D (shown in black) is abundantly distributed in the glabella, cheek, and chin portions where the evaluation region R5 is set, whereas in FIG. 25(b) the red component D is scarcely distributed in those portions.
The amount of the red component used in this makeup evaluation index is not one of the indices commonly used for evaluating bare skin transparency, like the skin evaluation indices above; it is an index specific to makeup skin that indicates the loss of transparency caused by makeup, and it therefore evaluates transparency from a different perspective than the skin evaluation indices.
The hue difference between a spot portion and its surroundings used in this makeup evaluation index is likewise not an index commonly used for evaluating bare skin transparency but one specific to makeup skin that indicates the loss of transparency caused by makeup, evaluating transparency from a different perspective than the skin evaluation indices.
FIGS. 26(a) and 26(b) show the result of extracting low-luminance portions L, whose L* values fall at or below a predetermined threshold, from the eye area of makeup skin with foundation applied to the subject's face F. FIG. 26(a) is makeup skin rated high in transparency by sensory evaluation, and FIG. 26(b) is makeup skin rated low. In FIG. 26(a), there are few low-luminance portions L in the eye area where the evaluation region R7 is set, whereas in FIG. 26(b) there are many.
The phenomenon of unevenness portions becoming conspicuous after makeup appears markedly with powder cosmetics, so the makeup evaluation index calculated by the unevenness amount calculation unit 56 is preferably used when powder cosmetics are worn.
The amount of unevenness portions used in this makeup evaluation index is likewise not an index commonly used for evaluating bare skin transparency but one specific to makeup skin that indicates the loss of transparency caused by makeup, evaluating transparency from a different perspective than the skin evaluation indices.
The phenomenon of mid-gloss decreasing after makeup appears markedly with liquid cosmetics, so the makeup evaluation index calculated by the mid-gloss amount calculation unit 57 is preferably used when liquid cosmetics are worn.
The amount of mid-gloss portions used in this makeup evaluation index is likewise not an index commonly used for evaluating bare skin transparency but one specific to makeup skin that indicates the loss of transparency caused by makeup, evaluating transparency from a different perspective than the skin evaluation indices.
The transparency evaluation unit 5 evaluates the transparency of the subject's made-up face F based on the total index calculated by the total index calculation unit 4, and the evaluation result is displayed on the display unit 6.
Furthermore, the makeup evaluation indices calculated by the makeup index calculation unit 53 are indices specific to makeup skin that indicate the change in transparency caused by makeup, and combining them with the skin evaluation indices allows the transparency of makeup skin to be evaluated with good accuracy.
FIG. 27 shows the configuration of the makeup index calculation unit 61 of the transparency evaluation device according to Embodiment 6. This makeup index calculation unit 61 replaces the unevenness amount calculation unit 56 of the makeup index calculation unit 53 of Embodiment 5 with a makeup non-uniformity calculation unit 62, connected to the color space conversion unit 2 and the total index calculation unit 4, respectively.
The makeup non-uniformity calculation unit 62 sets an evaluation region R9 on the made-up face F in the captured image. As shown in FIG. 28, the evaluation region R9 is preferably set on the cheek from below the eye to above the nasolabial fold (the crease extending from the side of the nose to the corner of the mouth).
Any captured image from which makeup-derived brightness or color components can be extracted may be used here, for example the captured image input from the camera C or the L* component image generated by the color space conversion unit 2. In particular, it is preferable to extract the makeup-derived color component using the G channel of the RGB channels of the image input from the camera C. In the G channel's wavelength region, for example 480 nm to 600 nm, the difference in intensity between the skin-derived color component and the makeup-derived color component appears large, so using this G channel image allows the makeup-derived color component to be extracted easily. The G channel wavelength region is preferably set near 550 nm, which further enlarges this intensity difference and allows the makeup-derived color component to be extracted with high accuracy.
A non-uniform makeup portion arises, for example, from small lumps produced when foundation is applied to the subject's face F. Specifically, lumps formed when foundation cakes into particles on the skin surface during application, lumps lodged in pores, and lumps adhering to the edges of skin furrows stand out as white against the surrounding foundation-evened color. Such non-uniform makeup portions cause an unnatural powdery or thickly coated appearance, making the skin feel low in transparency.
The makeup non-uniformity calculation unit 62 then outputs the makeup non-uniformity in the evaluation region R9 to the total index calculation unit 4 as a makeup evaluation index. The makeup non-uniformity used in this index is an evaluation index specific to makeup skin, indicating the loss of transparency caused by makeup, and evaluates transparency from a different perspective than the skin evaluation indices.
The unevenness portions detected by the unevenness amount calculation unit 56 of Embodiment 5 arise from wrinkles around the eyes and the like, but such wrinkles are not clearly present in everyone, so even with the same makeup, unevenness portions may be detected in some people and not in others, risking variability in the transparency evaluation results. In contrast, non-uniform makeup portions P arise from the pores and skin furrows present in a wide range of subjects and from the makeup material itself. The makeup non-uniformity therefore allows the transparency of makeup skin to be evaluated still more accurately across a wide range of subjects.
Specifically, the skin index calculation unit 52 calculates at least one of the representative brightness component value, the representative color component value, and the amount of minus factors as a skin evaluation index, while the makeup index calculation unit calculates the amount of the red component as a makeup evaluation index; combining the obtained skin evaluation index and makeup evaluation index into a total index allows the transparency of makeup skin to be evaluated with good accuracy.
FIG. 31(a) plots, for the faces F of 13 subjects wearing liquid foundation, the total index calculated by combining five skin evaluation indices and three makeup evaluation indices against the sensory evaluation value obtained from a sensory evaluation of transparency.
Here, the total evaluation value was calculated by combining five skin evaluation indices (representative brightness value L, representative color value C, minus factor amount F, color unevenness amount I, and gradation characteristic K) with three makeup evaluation indices excluding the amount of unevenness portions (red component amount D, hue difference H between spot portions and their surroundings, and mid-gloss amount B). Specifically, total index = A + 10.5 x L - 2.6 x C - 28.1 x F - 0.9 x I - 1.0 x K - (19.8 x D - 4.3 x H + 1.5 x B), where A is a constant.
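The combined index above can be written out directly. The weights are those reported for the liquid-foundation trial; the constant A is unspecified in the text, so it defaults to 0 here only so the sketch runs.

```python
# Total index for makeup skin: skin indices minus the makeup penalty term,
# using the weights reported for the liquid-foundation experiment.
def total_index(L, C, F, I, K, D, H, B, A=0.0):
    skin   = 10.5 * L - 2.6 * C - 28.1 * F - 0.9 * I - 1.0 * K
    makeup = 19.8 * D - 4.3 * H + 1.5 * B
    return A + skin - makeup
```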
In FIGS. 31(a) and 31(b), the sensory evaluation value rates transparency on a 30-point scale, with values closer to 30 indicating less transparency.
This shows that makeup skin transparency cannot be evaluated accurately from the skin evaluation indices alone; combining them with makeup evaluation indices, which indicate the change in transparency caused by makeup, yields evaluation results that agree well with the sensory evaluation and thus evaluates makeup skin transparency accurately.
Here, the total evaluation value was calculated by combining the five skin evaluation indices (representative brightness value L, representative color value C, minus factor amount F, color unevenness amount I, and gradation characteristic K) with three makeup evaluation indices excluding the mid-gloss amount (red component amount D, hue difference H between spot portions and their surroundings, and unevenness amount W). Specifically, total index = A + 3.79 x L - 1.67 x C + 0.25 x F - 0.18 x I - 1.9 x K - (4.9 x D - 4.1 x H - 0.8 x W), where A is a constant.
In FIGS. 32(a) and 32(b), the sensory evaluation value rates transparency on a 30-point scale, with values closer to 30 indicating less transparency.
This again shows that makeup skin transparency cannot be evaluated accurately from the skin evaluation indices alone; combining them with makeup evaluation indices, which indicate the change in transparency caused by makeup, yields evaluation results that agree well with the sensory evaluation and thus evaluates makeup skin transparency accurately.
FIGS. 34(a) and 34(b) plot, for a subject's face F with indistinct eye-area wrinkles to which the same foundations as in FIGS. 33(a) and 33(b) were applied, the total index calculated by combining skin evaluation indices and makeup evaluation indices using the transparency evaluation devices of Embodiments 5 and 6, against the sensory evaluation value obtained from a sensory evaluation of transparency.
In FIGS. 33(a), 33(b), 34(a), and 34(b), the sensory evaluation value rates transparency on a 5-point scale, with values closer to 5 indicating higher transparency.
Claims (26)
- An image input unit that inputs a captured image obtained by photographing a subject's skin;
a skin index calculation unit that calculates, as a first skin evaluation index, at least one of a representative value of a brightness component in the captured image, a representative value of a color component in the captured image, and an amount of minus factors at which a brightness component value or a color component value changes locally in the captured image, and that obtains at least one of an intensity distribution of the brightness component and an intensity distribution of the color component in the captured image and calculates, as a second skin evaluation index, at least one of a smoothness of change of the brightness component and a smoothness of change of the color component based on the obtained intensity distribution;
a total index calculation unit that combines a plurality of evaluation indices including the first skin evaluation index and the second skin evaluation index calculated by the skin index calculation unit to calculate a total index for skin transparency; and
a transparency evaluation unit that evaluates the transparency of the subject's skin based on the total index calculated by the total index calculation unit,
these units constituting a transparency evaluation device. - The transparency evaluation device according to claim 1, wherein, to obtain the intensity distribution of the brightness component and the intensity distribution of the color component in the captured image, the skin index calculation unit partitions the captured image according to brightness component values or color component values by a plurality of contour lines set stepwise at a constant intensity interval.
- The transparency evaluation device according to claim 2, wherein the skin index calculation unit obtains the intervals between mutually adjacent ones of the plurality of contour lines and calculates the second skin evaluation index based on the uniformity of the obtained intervals.
- The transparency evaluation device according to claim 2, wherein the skin index calculation unit calculates the second skin evaluation index based on the number of the contour lines partitioning the captured image.
- The transparency evaluation device according to any one of claims 1 to 4, wherein the skin index calculation unit sets the evaluation region for calculating the second skin evaluation index so as to extend linearly from the cheek portion of the subject's face to the contour portion of the face.
- The transparency evaluation device according to any one of claims 1 to 5, wherein the skin index calculation unit calculates a mean of the brightness component in the captured image as the representative value of the brightness component and a mean of the color component in the captured image as the representative value of the color component.
- The transparency evaluation device according to any one of claims 1 to 6, wherein the skin index calculation unit calculates the number, total area, or area ratio of the minus factors detected from the captured image as the amount of the minus factors.
- The skin index calculation unit detects, as color unevenness, portions in the captured image where the brightness component value or the color component value changes locally and which are larger than the minus factors, and calculates an amount of the detected color unevenness as a third skin evaluation index,
and the total index calculation unit combines the plurality of evaluation indices further including the third skin evaluation index to calculate the total index, in the transparency evaluation device according to any one of claims 1 to 7. - The transparency evaluation device according to claim 8, wherein the skin index calculation unit calculates the total area, area ratio, or number of the color unevenness detected from the captured image as the amount of the color unevenness.
- Inputting a captured image obtained by photographing a subject's skin;
calculating, as a first skin evaluation index, at least one of a representative value of a brightness component in the captured image, a representative value of a color component in the captured image, and an amount of minus factors at which a brightness component value or a color component value changes locally in the captured image, and obtaining at least one of an intensity distribution of the brightness component and an intensity distribution of the color component in the captured image and calculating, as a second skin evaluation index, at least one of a smoothness of change of the brightness component and a smoothness of change of the color component based on the obtained intensity distribution;
combining a plurality of evaluation indices including the calculated first skin evaluation index and second skin evaluation index to calculate a total index for skin transparency; and
evaluating the transparency of the subject's skin based on the calculated total index, these steps constituting a transparency evaluation method. - A step of acquiring a captured image obtained by photographing a subject's skin;
a step of calculating, as a first skin evaluation index, at least one of a representative value of a brightness component in the captured image, a representative value of a color component in the captured image, and an amount of minus factors at which a brightness component value or a color component value changes locally in the captured image, and of obtaining at least one of an intensity distribution of the brightness component and an intensity distribution of the color component in the captured image and calculating, as a second skin evaluation index, at least one of a smoothness of change of the brightness component and a smoothness of change of the color component based on the obtained intensity distribution;
a step of combining a plurality of evaluation indices including the calculated first skin evaluation index and second skin evaluation index to calculate a total index for skin transparency; and
a step of evaluating the transparency of the subject's skin based on the calculated total index, the steps forming a transparency evaluation program for causing a computer to execute them. - An image input unit that inputs a captured image obtained by photographing a subject's face with makeup applied;
a skin index calculation unit that calculates, as a first skin evaluation index, at least one of a representative value of a brightness component in the captured image, a representative value of a color component in the captured image, and an amount of minus factors that locally change a brightness component value or a color component value in the captured image;
a makeup index calculation unit that calculates, as a first makeup evaluation index, an amount of a red component attributable to blood color in the captured image;
a total index calculation unit that combines a plurality of evaluation indices including the first skin evaluation index and the first makeup evaluation index respectively calculated by the skin index calculation unit and the makeup index calculation unit to calculate a total index for transparency; and
a transparency evaluation unit that evaluates the transparency of the subject's made-up face based on the total index calculated by the total index calculation unit,
these units constituting a transparency evaluation device. - The transparency evaluation device according to claim 12, wherein the makeup index calculation unit calculates, as the amount of the red component, a mean of the red component in the captured image, an area of portions in the captured image where the red component is detected, or an area ratio of portions in the captured image where the red component is detected.
- The transparency evaluation device according to claim 12 or 13, wherein the makeup index calculation unit sets a predetermined evaluation region on at least one of the glabella portion, cheek portion, and chin portion of the subject's face and calculates the amount of the red component in the predetermined evaluation region.
- The makeup index calculation unit detects a spot portion where a brightness component value or a color component value changes locally in the captured image and calculates a hue difference between the spot portion and its surroundings as a second makeup evaluation index,
and the total index calculation unit combines a plurality of evaluation indices further including the second makeup evaluation index to calculate the total index, in the transparency evaluation device according to any one of claims 12 to 14. - The makeup index calculation unit detects, as unevenness portions, low-luminance portions indicating shadows produced on the subject's face based on brightness component values in the captured image, and calculates an amount of the unevenness portions as a third makeup evaluation index,
and the total index calculation unit combines a plurality of evaluation indices further including the third makeup evaluation index to calculate the total index, in the transparency evaluation device according to any one of claims 12 to 15. - The transparency evaluation device according to claim 16, wherein the makeup index calculation unit calculates an area or area ratio of the unevenness portions in the captured image as the amount of the unevenness portions.
- The transparency evaluation device according to claim 16 or 17, wherein the makeup index calculation unit sets a predetermined region on at least one of the eye area of the subject's face and a region extending from the nose to the mouth, and calculates the amount of the unevenness portions in the predetermined region.
- The makeup index calculation unit extracts a makeup-derived brightness component or a makeup-derived color component from the captured image based on the mutually different skin-derived and makeup-derived brightness component values or color component values, extracts portions where the makeup-derived brightness component value or the makeup-derived color component value changes non-uniformly, and calculates a makeup non-uniformity as a fourth makeup evaluation index,
and the total index calculation unit combines a plurality of evaluation indices further including the fourth makeup evaluation index to calculate the total index, in the transparency evaluation device according to any one of claims 12 to 15. - The transparency evaluation device according to claim 19, wherein the makeup index calculation unit sets a predetermined region on the cheek portion of the subject and calculates the makeup non-uniformity in the predetermined region.
- The makeup index calculation unit detects, based on the intensity of the brightness component in the captured image, mid-gloss portions indicating shine on the subject's face, and calculates an amount of the mid-gloss portions as a fifth makeup evaluation index,
and the total index calculation unit combines a plurality of evaluation indices further including the fifth makeup evaluation index to calculate the total index, in the transparency evaluation device according to any one of claims 12 to 20. - The transparency evaluation device according to claim 21, wherein the makeup index calculation unit sets a predetermined region on at least one of the cheekbone portion and nose bridge portion of the subject's face and calculates the amount of the gloss portions in the predetermined region.
- The skin index calculation unit obtains at least one of an intensity distribution of the brightness component and an intensity distribution of the color component in the captured image and calculates, as a second skin evaluation index, at least one of a smoothness of change of the brightness component and a smoothness of change of the color component based on the obtained intensity distribution,
and the total index calculation unit combines the plurality of evaluation indices further including the second skin evaluation index to calculate the total index, in the transparency evaluation device according to any one of claims 12 to 22. - The skin index calculation unit detects, as color unevenness, portions in the captured image where the brightness component value or the color component value changes locally and which are larger than the minus factors, and calculates an amount of the detected color unevenness as a third skin evaluation index,
and the total index calculation unit combines the plurality of evaluation indices further including the third skin evaluation index to calculate the total index, in the transparency evaluation device according to any one of claims 12 to 23. - Inputting a captured image obtained by photographing a subject's face with makeup applied;
calculating, as a first skin evaluation index, at least one of a representative value of a brightness component in the captured image, a representative value of a color component in the captured image, and an amount of minus factors that locally change a brightness component value or a color component value in the captured image;
calculating, as a first makeup evaluation index, an amount of a red component attributable to blood color in the captured image;
combining a plurality of evaluation indices including the calculated first skin evaluation index and first makeup evaluation index to calculate a total index for transparency; and
evaluating the transparency of the subject's made-up face based on the calculated total index, these steps constituting a transparency evaluation method. - A step of acquiring a captured image obtained by photographing a subject's face with makeup applied;
a step of calculating, as a first skin evaluation index, at least one of a representative value of a brightness component in the captured image, a representative value of a color component in the captured image, and an amount of minus factors that locally change a brightness component value or a color component value in the captured image;
a step of calculating, as a first makeup evaluation index, an amount of a red component attributable to blood color in the captured image;
a step of combining a plurality of evaluation indices including the calculated first skin evaluation index and first makeup evaluation index to calculate a total index for transparency; and
a step of evaluating the transparency of the subject's made-up face based on the calculated total index, the steps forming a transparency evaluation program for causing a computer to execute them.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
KR1020157034229A KR101713086B1 (ko) | 2013-06-07 | 2014-06-03 | 투명감 평가 장치, 투명감 평가 방법 및 투명감 평가 프로그램 |
JP2015521456A JP6026655B2 (ja) | 2013-06-07 | 2014-06-03 | 透明感評価装置、透明感評価装置の作動方法、透明感評価方法および透明感評価プログラム |
CN201480031976.3A CN105263399B (zh) | 2013-06-07 | 2014-06-03 | 透明感评价装置、透明感评价方法 |
US14/955,157 US9750326B2 (en) | 2013-06-07 | 2015-12-01 | Transparency evaluation device, transparency evaluation method and transparency evaluation program |
Applications Claiming Priority (6)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013-120650 | 2013-06-07 | ||
JP2013120650 | 2013-06-07 | ||
JP2013-199882 | 2013-09-26 | ||
JP2013199882 | 2013-09-26 | ||
JP2014035658 | 2014-02-26 | ||
JP2014-035658 | 2014-02-26 |
Related Child Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
US14/955,157 Continuation US9750326B2 (en) | 2013-06-07 | 2015-12-01 | Transparency evaluation device, transparency evaluation method and transparency evaluation program |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014196532A1 true WO2014196532A1 (ja) | 2014-12-11 |
Family
ID=52008171
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/064746 WO2014196532A1 (ja) | 2013-06-07 | 2014-06-03 | 透明感評価装置、透明感評価方法および透明感評価プログラム |
Country Status (5)
Country | Link |
---|---|
US (1) | US9750326B2 (ja) |
JP (1) | JP6026655B2 (ja) |
KR (1) | KR101713086B1 (ja) |
CN (1) | CN105263399B (ja) |
WO (1) | WO2014196532A1 (ja) |
Cited By (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017043191A1 (ja) * | 2015-09-10 | 2017-03-16 | 富士フイルム株式会社 | 肌の光沢評価装置、光沢評価方法および光沢評価プログラム |
JP2019217253A (ja) * | 2018-06-14 | 2019-12-26 | 株式会社コーセー | 皮膚透明感を評価する方法 |
WO2023120525A1 (ja) * | 2021-12-21 | 2023-06-29 | 花王株式会社 | 印象変更支援方法 |
Families Citing this family (14)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2014196532A1 (ja) * | 2013-06-07 | 2014-12-11 | 富士フイルム株式会社 | 透明感評価装置、透明感評価方法および透明感評価プログラム |
KR102322442B1 (ko) * | 2015-03-31 | 2021-11-09 | (주)아모레퍼시픽 | 개인맞춤형 화장료 제안방법 |
JP6650819B2 (ja) * | 2016-04-15 | 2020-02-19 | 株式会社 資生堂 | 色ムラ部位の評価方法、色ムラ部位評価装置及び色ムラ部位評価プログラム |
JP6872742B2 (ja) * | 2016-06-30 | 2021-05-19 | 学校法人明治大学 | 顔画像処理システム、顔画像処理方法及び顔画像処理プログラム |
JP6055160B1 (ja) * | 2016-07-08 | 2016-12-27 | 株式会社オプティム | 化粧品情報提供システム、化粧品情報提供装置、化粧品情報提供方法、及びプログラム |
JP6876941B2 (ja) * | 2016-10-14 | 2021-05-26 | パナソニックIpマネジメント株式会社 | バーチャルメイクアップ装置、バーチャルメイクアップ方法及びバーチャルメイクアッププログラム |
RU2742966C2 (ru) | 2016-10-18 | 2021-02-12 | Конинклейке Филипс Н.В. | Вспомогательное устройство и устройство визуализации |
JP7020626B2 (ja) * | 2017-02-01 | 2022-02-16 | エルジー ハウスホールド アンド ヘルスケア リミテッド | メイクアップ評価システム及びその動作方法 |
US10565741B2 (en) * | 2017-02-06 | 2020-02-18 | L'oreal | System and method for light field correction of colored surfaces in an image |
CN109671046B (zh) * | 2017-10-12 | 2021-08-17 | 精诚工坊电子集成技术(北京)有限公司 | 利用皮肤图像分析皮肤水分的方法及装置 |
US10732100B2 (en) | 2018-06-29 | 2020-08-04 | L'oreal | Systems and methods for predicting sun protection factor of sunscreen formulations in vitro |
US20210315512A1 (en) * | 2018-07-16 | 2021-10-14 | Skin Rejuvenation Technologies (Pty) Ltd | Method and system for cosmetic recommendations |
EP3628187A1 (en) * | 2018-09-26 | 2020-04-01 | Chanel Parfums Beauté | Method for simulating the rendering of a make-up product on a body area |
EP3664035B1 (en) | 2018-12-03 | 2021-03-03 | Chanel Parfums Beauté | Method for simulating the realistic rendering of a makeup product |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2004215991A (ja) * | 2003-01-16 | 2004-08-05 | Shiseido Co Ltd | 皮膚の透明感評価方法 |
JP2009213729A (ja) * | 2008-03-12 | 2009-09-24 | Nippon Kagaku Yakin Co Ltd | 透明感評価装置及び透明感評価方法 |
JP2010273737A (ja) * | 2009-05-26 | 2010-12-09 | Mandom Corp | 皮膚の表面状態の評価方法 |
Family Cites Families (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
FR2861177A1 (fr) * | 2003-10-16 | 2005-04-22 | Oreal | Ensemble comportant un systeme d'analyse du niveau de clarte de la peau et des gammes de produits cosmetiques |
FR2897768B1 (fr) * | 2006-02-24 | 2008-05-09 | Oreal | Procede d'evaluation de l'eclat du teint. |
JP2010022547A (ja) | 2008-07-18 | 2010-02-04 | Pola Chem Ind Inc | 皮膚透明感の鑑別法 |
TWI471117B (zh) * | 2011-04-29 | 2015-02-01 | Nat Applied Res Laboratoires | 可用於行動裝置之人臉膚質評估演算介面裝置 |
FR2975804B1 (fr) * | 2011-05-27 | 2022-06-17 | Lvmh Rech | Procede de caracterisation du teint de la peau ou des phaneres |
WO2014196532A1 (ja) * | 2013-06-07 | 2014-12-11 | 富士フイルム株式会社 | 透明感評価装置、透明感評価方法および透明感評価プログラム |
2014
- 2014-06-03: WO PCT/JP2014/064746 patent WO2014196532A1 (active, Application Filing)
- 2014-06-03: JP JP2015521456A patent JP6026655B2 (active)
- 2014-06-03: KR KR1020157034229A patent KR101713086B1 (active, IP Right Grant)
- 2014-06-03: CN CN201480031976.3A patent CN105263399B (active)

2015
- 2015-12-01: US US14/955,157 patent US9750326B2 (active)
Cited By (6)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2017043191A1 (ja) * | 2015-09-10 | 2017-03-16 | 富士フイルム株式会社 | 肌の光沢評価装置、光沢評価方法および光沢評価プログラム |
JP2017051474A (ja) * | 2015-09-10 | 2017-03-16 | 富士フイルム株式会社 | 肌の光沢評価装置、光沢評価方法および光沢評価プログラム |
US10638968B2 (en) | 2015-09-10 | 2020-05-05 | Fujifilm Corporation | Skin gloss evaluation device, skin gloss evaluation method, and skin gloss evaluation program |
JP2019217253A (ja) * | 2018-06-14 | 2019-12-26 | 株式会社コーセー | 皮膚透明感を評価する方法 |
JP7307544B2 (ja) | 2018-06-14 | 2023-07-12 | 株式会社コーセー | 皮膚透明感を評価する方法 |
WO2023120525A1 (ja) * | 2021-12-21 | 2023-06-29 | 花王株式会社 | 印象変更支援方法 |
Also Published As
Publication number | Publication date |
---|---|
KR101713086B1 (ko) | 2017-03-07 |
US20160106198A1 (en) | 2016-04-21 |
JPWO2014196532A1 (ja) | 2017-02-23 |
US9750326B2 (en) | 2017-09-05 |
CN105263399A (zh) | 2016-01-20 |
JP6026655B2 (ja) | 2016-11-16 |
KR20160008220A (ko) | 2016-01-21 |
CN105263399B (zh) | 2017-06-23 |
Legal Events

Code | Title | Details
---|---|---
WWE | WIPO information: entry into national phase | Ref document number: 201480031976.3; Country: CN
121 | EP: the EPO has been informed by WIPO that EP was designated in this application | Ref document number: 14807207; Country: EP; Kind code: A1
ENP | Entry into the national phase | Ref document number: 2015521456; Country: JP; Kind code: A
ENP | Entry into the national phase | Ref document number: 20157034229; Country: KR; Kind code: A
NENP | Non-entry into the national phase | Country: DE
122 | EP: PCT application non-entry in European phase | Ref document number: 14807207; Country: EP; Kind code: A1