WO2014208067A1 - Skin sensory evaluation apparatus and skin evaluation method - Google Patents
- Publication number
- WO2014208067A1 (PCT/JP2014/003328)
- Authority
- WO
- WIPO (PCT)
- Prior art keywords
- skin
- sensory evaluation
- evaluation value
- image
- unit
- Prior art date
Classifications
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
- A61B5/442—Evaluating skin mechanical properties, e.g. elasticity, hardness, texture, wrinkle assessment
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/0059—Measuring for diagnostic purposes; Identification of persons using light, e.g. diagnosis by transillumination, diascopy, fluorescence
- A61B5/0077—Devices for viewing the surface of the body, e.g. camera, magnifying lens
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
- A61B5/443—Evaluating skin constituents, e.g. elastin, melanin, water
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/44—Detecting, measuring or recording for evaluating the integumentary system, e.g. skin, hair or nails
- A61B5/441—Skin evaluation, e.g. for skin disorder diagnosis
- A61B5/444—Evaluating skin marks, e.g. mole, nevi, tumour, scar
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
-
- A—HUMAN NECESSITIES
- A61—MEDICAL OR VETERINARY SCIENCE; HYGIENE
- A61B—DIAGNOSIS; SURGERY; IDENTIFICATION
- A61B5/00—Measuring for diagnostic purposes; Identification of persons
- A61B5/74—Details of notification to user or communication with user or patient ; user input means
- A61B5/742—Details of notification to user or communication with user or patient ; user input means using visual displays
- A61B5/743—Displaying an image simultaneously with additional graphical information, e.g. symbols, charts, function plots
Definitions
- the present application relates to an evaluation apparatus and an evaluation method for determining a sensory evaluation value such as skin transparency.
- Patent Document 1 discloses a method of evaluating local transparency of skin by bringing a probe into contact with or in proximity to the skin.
- However, the transparency obtained by the method disclosed in Patent Document 1 is local skin transparency, and is not obtained from a wide area such as the entire cheek. For this reason, it may not coincide with the transparency of the skin perceived when the entire face is viewed.
- One non-limiting exemplary embodiment of the present application provides a sensory evaluation apparatus and a skin evaluation method capable of determining a sensory evaluation value of skin.
- An evaluation method according to one aspect is a method for determining a sensory evaluation value of skin from an image, comprising: (a) obtaining an image including the skin of a subject; (b) extracting a skin region from the image; (c) calculating at least two feature indexes indicating features of the image in the skin region; and (d) determining a sensory evaluation value of the skin based on the calculated at least two feature indexes.
- According to this method, the sensory evaluation value of the skin can be determined with high accuracy by using two or more feature indexes of the skin image.
- FIG. 1(a) schematically shows the configuration of one embodiment of the skin sensory evaluation apparatus.
- FIG. 1(b) shows the configuration of the control device of the sensory evaluation apparatus.
- FIG. 1(c) shows the configuration of the cloud server.
- FIGS. 2(a) and 2(b) are a front view and a side view showing the arrangement of the imaging device, the light source, and the display device in the sensory evaluation apparatus.
- FIG. 4(a) is a functional block diagram of the sensory evaluation apparatus, and FIG. 4(b) is a functional block diagram of the cloud server.
- FIG. 5 is a flowchart showing the procedure for evaluating skin with the sensory evaluation apparatus.
- FIGS. 6(a) and 6(b) show examples of screens displayed on the display device before sensory evaluation.
- the outline of one aspect of the skin sensory evaluation apparatus and the skin evaluation method of the present invention is as follows.
- An evaluation method for determining a sensory evaluation value of skin from an image includes: (a) obtaining an image including the skin of a subject; (b) extracting a skin region from the image; (c) calculating at least two feature indexes indicating features of the image in the skin region; and (d) determining a sensory evaluation value of the skin based on the calculated at least two feature indexes.
- In one embodiment, the skin region is divided into a plurality of unit blocks; in the steps (c) and (d), the at least two feature indexes are calculated and the sensory evaluation value of the skin is determined for each unit block; and (e) the sensory evaluation value of the skin obtained for each unit block may be displayed on a display device in association with the position of the unit block.
- In the step (b), the skin region may be extracted by detecting the face of the subject in the image and excluding the regions of facial parts from the image based on the detected face position.
- In the step (e), the sensory evaluation value of the skin obtained for each unit block may be displayed on the display device in association with the position of the unit block, in a color tone or gradation corresponding to the sensory evaluation value.
- The feature index may be one selected from the group consisting of skin spots, wrinkles, texture, and pores, nasolabial folds, the average, variance, and hue of pixel values in the unit block, the reflectance of the skin surface, moisture, oil content, and color unevenness.
- the sensory evaluation value may be one selected from the group consisting of the transparency of the skin, the skin age of the subject, and the skin impression of the subject.
- In the step (d), the sensory evaluation value of the skin may be determined based on a correlation, obtained in advance, between the at least two feature indexes measured from a plurality of subjects and sensory evaluation values determined by evaluating the skin of those subjects.
- the correlation may be obtained by multiple regression analysis.
- Information on beauty devices or cosmetics related to the calculated feature indexes or the determined sensory evaluation value may further be displayed on the display device.
- A sensory evaluation apparatus according to one aspect includes an imaging unit that acquires an image including the skin of a subject; a skin region extraction unit that extracts a skin region from the image; a feature index calculation unit that calculates at least two feature indexes indicating features of the image in the skin region; and a sensory evaluation value determination unit that determines a sensory evaluation value of the skin based on the calculated at least two feature indexes.
- The sensory evaluation apparatus may further include a display unit, wherein the skin region extraction unit divides the skin region into a plurality of unit blocks, the feature index calculation unit calculates the at least two feature indexes for each unit block, the sensory evaluation value determination unit determines the sensory evaluation value of the skin for each unit block, and the display unit displays the sensory evaluation value of the skin obtained for each unit block in association with the position of the unit block.
- The skin region extraction unit may detect the subject's face in the image and extract the skin region by excluding the regions of facial parts from the image based on the detected face position.
- the display unit may display the sensory evaluation value of the skin obtained for each unit block in association with the position of the unit block with a color tone or gradation according to the sensory evaluation value.
- The feature index may be one selected from the group consisting of skin spots, wrinkles, texture, and pores, nasolabial folds, the average, variance, and hue of pixel values in the unit block, the reflectance of the skin surface, moisture, oil content, and color unevenness.
- the sensory evaluation value may be one selected from the group consisting of the transparency of the skin, the skin age of the subject, and the skin impression of the subject.
- The sensory evaluation value determination unit may determine the sensory evaluation value of the skin based on a correlation, obtained in advance, between the at least two feature indexes measured from a plurality of subjects and sensory evaluation values determined by evaluating the skin of the plurality of subjects.
- the correlation may be obtained by multiple regression analysis.
- the display unit may further display information relating to a beauty device or cosmetic related to the calculated feature index or the determined sensory evaluation value.
- A sensory evaluation apparatus according to another aspect includes an imaging device, a control device including a storage element and an arithmetic element, a display device, and a program recorded on the storage element and executable by the arithmetic element. The program (a) obtains an image including the skin of a subject, (b) extracts a skin region from the image, (c) calculates at least two feature indexes indicating features of the image in the skin region, (d) determines a sensory evaluation value of the skin based on the calculated at least two feature indexes, and (e) displays the determined sensory evaluation value of the skin on the display device.
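The steps (a) through (e) above can be sketched in code. The following Python fragment is purely illustrative: the function name, the toy feature tuples, and the regression coefficients are hypothetical, and the linear form of step (d) follows the multiple regression analysis described later in this specification.

```python
def evaluate_skin(blocks_features, coeffs):
    """Steps (c)-(d): given per-unit-block feature-index tuples and
    regression coefficients (b0, b1, b2, ...) obtained in advance,
    return one sensory evaluation value per unit block."""
    b0, bs = coeffs[0], coeffs[1:]
    return [b0 + sum(b * f for b, f in zip(bs, feats))
            for feats in blocks_features]

# Toy usage: two unit blocks with two feature indexes each
# (e.g. a stain amount and a wrinkle amount; all values fabricated).
scores = evaluate_skin([(0.2, 0.1), (0.5, 0.4)], (5.0, -2.0, -3.0))
# Each score would then be shown at its block position (step (e)).
```

The per-block loop mirrors the unit-block processing described in the embodiment; only the coefficients and feature values change in practice.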
- FIG. 1 (a) schematically shows the configuration of an embodiment of the sensory evaluation apparatus for skin according to the present invention.
- The sensory evaluation apparatus 101 acquires an image of the face of the subject 24 and calculates, from the image, two or more feature indexes indicating features of the image in the skin region of the face. It then determines the sensory evaluation value of the skin from the calculated two or more feature indexes.
- the feature index indicating the feature of the image is an index of the feature appearing in the image in the skin region included in the photographed image.
- The feature index is, for example, a value obtained by indexing skin spots, wrinkles, texture, or pores appearing in the image, or a value obtained by indexing the pixel values of the image.
- Examples include a value indexing the amount of skin spots or wrinkles appearing in the image; a value indexing pores by a combination of one or more of their number, area, and concentration; a value indexing the fineness of the skin texture; a value indexing nasolabial folds by a combination of one or more of their length, thickness, and depth; and the average, variance, or hue of the pixel values in a predetermined unit block of the skin region.
- the sensory evaluation value of skin refers to a subjective evaluation value that cannot be obtained directly from an image.
- the sensory evaluation value of the skin is determined subjectively by the evaluator looking at the skin of the subject.
- The sensory evaluation apparatus according to the present embodiment determines the sensory evaluation value from two or more feature indexes of the subject under evaluation, based on the correlation, obtained in advance, between sensory evaluation values determined by an evaluator evaluating the skin of a plurality of subjects and the corresponding two or more feature indexes.
- the evaluator is preferably an expert in the beauty field who evaluates skin transparency, skin age, and skin impression, but may be a general user who does not specialize in these evaluations, and is not limited to an expert.
- the skin sensory evaluation device 101 shown in FIG. 1A includes an imaging device 10, a light source 12, a display device 14, and a control device 18.
- FIG. 1B shows the configuration of the control device 18.
- The control device 18 includes a memory 18A and an arithmetic unit 18B.
- the face of the subject 24 is photographed by the imaging device 10 of the sensory evaluation device 101, and a face image is acquired.
- the light source 12 that emits polarized light is used in order to more accurately calculate the feature index of the image.
- FIGS. 2A and 2B are a front view and a side view showing the arrangement of the imaging device 10, the light source 12, and the display device 14 in the sensory evaluation device 101.
- the light source 12 is provided on the display device 14, and the light source 12 includes a first light source 12A and a second light source 12B.
- The first light source 12A emits, for example, white linearly polarized light La having a polarization axis in the vertical direction (the y direction in FIG. 2(a)), and the second light source 12B emits white linearly polarized light Lb having a polarization axis in the horizontal direction (the x direction in FIG. 2(a)).
- the lens optical system of the imaging device 10 is provided with a polarizing plate having a polarization axis in the vertical direction (y direction in FIG. 2A), for example.
- a double-headed arrow shown in FIG. 2A indicates the direction of the polarization axis.
- When obtaining the sensory evaluation value of the skin using the sensory evaluation apparatus 101, the subject positions his or her head 24h at, for example, a distance D from the display device 14 and faces the display device 14.
- the optical system of the imaging device 10 is set so that the entire face can be photographed with an appropriate size and resolution.
- An example of the size of the display device 14 is a width W of about 30 cm and a height H of about 50 cm.
- the distance D is about 30 cm to 70 cm.
- the control device 18 receives image data from the imaging device 10 and calculates feature indexes of two or more images.
- the sensory evaluation value of the skin is determined based on the correlation between the sensory evaluation value of the skin obtained in advance and two or more feature indexes.
- the display device 14 displays the captured image. Further, the image feature index and skin sensory evaluation value obtained by the control device 18 are displayed.
- a user interface such as a touch panel 16 may be provided on the screen of the display device 14. That is, the display device 14 may be a touch screen display or the like.
- the imaging device 10 is a general video camera or digital still camera.
- the optical system of the imaging apparatus 10 is provided with a polarizing plate having a polarization axis parallel to a predetermined direction, for example, the vertical direction.
- the control device 18 may be a personal computer or the like, or may be configured by a dedicated circuit or the like.
- In FIG. 1(a), the control device 18 and the display device 14 are shown separately. However, the control device 18 and the display device 14 may be housed in a single housing, such as a tablet-type portable information terminal.
- The control device 18 may further include a communication unit 18C and be connected, via a communication network 22, to a cloud server 20 of a service provider that provides services related to skin sensory evaluation.
- The communication unit 18C transmits the captured image, the image feature indexes, the skin sensory evaluation value, and the like to the cloud server 20.
- the cloud server 20 can be connected to a plurality of sensory evaluation devices 101.
- FIG. 1C shows the configuration of the cloud server 20.
- the cloud server 20 includes a database 20A, a memory 20B, a calculation unit 20C, and a communication unit 20D.
- the cloud server 20 receives captured images, image feature indices, skin sensory evaluation values, and the like from the plurality of sensory evaluation apparatuses 101 through the communication unit 20D.
- the cloud server 20 obtains a correlation between the sensory evaluation value of the skin and two or more feature indexes based on the image feature indexes received from the plurality of sensory evaluation apparatuses 101. Further, the obtained correlation is transmitted to each sensory evaluation apparatus 101 by the communication unit 20D.
- FIG. 3 schematically shows a cross section of human skin.
- The skin image includes various kinds of information such as skin spots, wrinkles, pores, and nasolabial folds.
- a feature index is calculated by selectively extracting the information from the skin image.
- The skin 300 includes an epidermis 300A extending from the surface 300S of the skin 300 to a depth of about 0.06 mm to about 0.2 mm, and a dermis 300B located further inside.
- Skin spots, wrinkles, pores, and nasolabial folds differ in shape and in the depth at which they exist in the skin 300. Therefore, a feature index can be calculated by obtaining images from different depths of the skin and identifying the shapes.
- Image information from different skin depths can be obtained by using polarized light and color components.
- For example, when the skin is photographed using linearly polarized light parallel to a predetermined direction as the light source, the linearly polarized light reflected at the skin surface 300S maintains its polarization direction.
- On the other hand, linearly polarized light reflected inside the epidermis 300A is emitted from the epidermis 300A with its polarization direction disturbed by scattering. For this reason, if a light source that emits linearly polarized light is used and polarized light parallel to the light source is detected (the parallel polarization condition), an image with a large amount of skin-surface information and little internal information is obtained.
- Conversely, if a light source that emits linearly polarized light is used and polarized light orthogonal to the light source is detected (the orthogonal polarization condition), an image with a large amount of information on the inside of the skin and little surface information is obtained. That is, by using polarized light as the light source, images that selectively contain information on the inside of the skin or on its surface can be obtained.
- In addition, the longer its wavelength, the deeper light from the light source penetrates into the skin before being reflected. Accordingly, in an image of the skin, the blue (B) component contains more information on the skin surface, while the red (R) and infrared components contain more information on the inside of the epidermis 300A.
- Features such as spots and wrinkles may also absorb light preferentially in a specific wavelength region; such a feature index can then be calculated using the light component in that wavelength region.
- Table 1 shows an example of conditions for calculating the feature index.
- A stain exists inside the epidermis 300A. It is known that the darker the stain, the smaller the difference between the blue and red components of the light obtained from the stained portion. Therefore, by photographing under the orthogonal polarization condition, which captures more internal information, obtaining the difference between the blue and red pixel values at each pixel of the image, and performing threshold processing, stained portions can be selectively extracted from the photographed image.
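As a rough sketch of this stain extraction, the following Python (NumPy) fragment flags pixels whose blue-red difference is small. The threshold value and the use of the absolute difference are illustrative assumptions; the specification gives no concrete numbers.

```python
import numpy as np

def stain_mask(img_cross_pol, threshold=40):
    """img_cross_pol: (H, W, 3) uint8 RGB image shot under the
    orthogonal polarization condition. Darker stains show a smaller
    blue-red difference, so flag pixels whose difference falls below
    a threshold (threshold=40 is an illustrative assumption)."""
    r = img_cross_pol[..., 0].astype(np.int16)
    b = img_cross_pol[..., 2].astype(np.int16)
    return np.abs(b - r) < threshold      # boolean stain mask

def stain_amount(mask):
    """Per-block stain amount: the number of flagged pixels."""
    return int(mask.sum())
```

In the embodiment this mask would be evaluated per 10×10-pixel unit block, with the pixel count serving as the stain amount.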
- Wrinkles and nasolabial folds exist near the surface 300S of the skin 300. By photographing under the parallel polarization condition, which captures more surface information, and obtaining the difference between the blue and red pixel values at each pixel of the image, the effect of specular reflection at the skin surface is suppressed and an image containing a large amount of wrinkle and nasolabial fold information can be obtained.
- Further, by processing the image using a line detection filter, an image containing a large amount of wrinkle and nasolabial fold information can be obtained.
- threshold processing based on the length of the detected portion may be further performed.
- Although the difference between the blue and red pixel values is obtained here, the blue pixel value alone or pixel values of other colors may be used instead; the present invention is not limited to obtaining the difference between the blue and red pixel values.
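A minimal sketch of such line detection follows. The 3×3 kernels below are textbook line-detection masks, chosen here as stand-ins because the specification does not specify the filter; the threshold is likewise an illustrative assumption.

```python
import numpy as np

LINE_KERNELS = [
    np.array([[-1, -1, -1], [2, 2, 2], [-1, -1, -1]]),  # horizontal lines
    np.array([[-1, 2, -1], [-1, 2, -1], [-1, 2, -1]]),  # vertical lines
]

def line_response(gray):
    """Maximum response of the 3x3 line-detection kernels over a 2-D
    float array (e.g. the blue-red difference image)."""
    h, w = gray.shape
    out = np.full((h - 2, w - 2), -np.inf)
    for k in LINE_KERNELS:
        resp = sum(k[i, j] * gray[i:i + h - 2, j:j + w - 2]
                   for i in range(3) for j in range(3))
        out = np.maximum(out, resp)
    return out

def wrinkle_mask(diff_image, threshold=100.0):
    """Flag line-like (wrinkle / nasolabial fold) pixels by thresholding
    the line-filter response; threshold is an illustrative assumption."""
    return line_response(np.asarray(diff_image, dtype=float)) > threshold
```

Diagonal kernels, or a subsequent length-based threshold as the text suggests, could be added in the same way.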
- Pores exist near the surface 300S of the skin 300. However, their appearance is strongly influenced by the illumination of the environment in which the sensory evaluation apparatus 101 is used; photographing is therefore performed under the orthogonal polarization condition, which suppresses surface reflections.
- Pores are relatively easy to identify on the image, and using a point detection filter makes them still easier to extract. By extracting the blue component at each pixel of the image and processing the image with the point detection filter, an image containing a large amount of pore information can be obtained.
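A sketch of pore extraction under these conditions is shown below. The 8-connected point-detection (Laplacian-style) kernel and the threshold are illustrative assumptions; the specification only names "a point detection filter".

```python
import numpy as np

POINT_KERNEL = np.array([[-1, -1, -1],
                         [-1,  8, -1],
                         [-1, -1, -1]])   # classic point-detection mask (assumed)

def pore_mask(img_cross_pol, threshold=400.0):
    """Extract pore candidates: take the blue component of an image shot
    under the orthogonal polarization condition, apply the point
    detection kernel, and threshold the absolute response (pores appear
    as small dark points, giving a strong negative response)."""
    b = img_cross_pol[..., 2].astype(float)
    h, w = b.shape
    resp = sum(POINT_KERNEL[i, j] * b[i:i + h - 2, j:j + w - 2]
               for i in range(3) for j in range(3))
    return np.abs(resp) > threshold
```

The per-block pore amount would again be the count of flagged pixels, mirroring the stain and wrinkle indexes.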
- the conditions shown in Table 1 are examples for calculating the feature index, and other conditions may be adopted to acquire the face image.
- The feature indexes shown in Table 1 can be distinguished and calculated using the color component and filter conditions alone, without using the polarization condition. Therefore, although the sensory evaluation apparatus 101 of this embodiment includes the light source 12 that emits polarized light, the sensory evaluation apparatus need not include such a light source.
- FIG. 4A shows functional blocks of the sensory evaluation apparatus 101.
- The sensory evaluation apparatus 101 includes an imaging unit 32, a skin region extraction unit 34, a feature index calculation unit 36, a sensory evaluation value determination unit 42, a display data generation unit 44, and a display unit 45. It further includes a control unit 46 that controls each functional block and an input unit 48 that gives instructions to the control unit 46.
- When communicating with the cloud server 102, the sensory evaluation apparatus 101 further includes a transmission unit 38 and a reception unit 40.
- The imaging unit 32, the display unit 45, the control unit 46, and the input unit 48 are realized by the imaging device 10, the display device 14, the control device 18, and the touch panel 16 shown in FIG. 1(a), respectively.
- The transmission unit 38 and the reception unit 40 are realized by the communication unit 18C of the control device 18.
- the functions of the skin region extraction unit 34, the feature index calculation unit 36, and the sensory evaluation value determination unit 42 are realized by software. Specifically, the functions of these functional blocks are realized by the arithmetic unit 18B executing the program stored in the memory 18A. In accordance with this program, the calculation unit 18B controls the imaging device 10, the light source 12, and the display device 14.
- When the sensory evaluation apparatus 101 starts operating based on an instruction from the subject or from the operator of the sensory evaluation apparatus 101, the imaging unit 32 first photographs the subject's face and acquires an image including the face.
- the skin region extraction unit 34 extracts a skin region from the obtained image.
- the feature index calculation unit 36 calculates at least two feature indexes of the obtained image in the skin region. The feature index is calculated for each unit block after the skin region is divided into a plurality of unit blocks.
- the sensory evaluation value determination unit 42 determines the sensory evaluation value of the skin based on the calculated at least two feature indexes. The sensory evaluation value is also determined for each unit block.
- the display data generating unit 44 generates display data for displaying the sensory evaluation value of the skin obtained for each unit block on the display unit in association with the position of the unit block.
- the display unit 45 displays the generated display data.
- The transmission unit 38 transmits, to the cloud server 102, the image data of the skin region extracted by the skin region extraction unit 34 and the image feature indexes.
- The reception unit 40 receives, from the cloud server 102, information indicating the correlation between the sensory evaluation value of the skin and two or more feature indexes, specifically the coefficients of a regression equation determined by multiple regression analysis. This information is held in the sensory evaluation value determination unit 42 and used when determining the sensory evaluation value.
- The coefficients of the regression equation held in the sensory evaluation value determination unit 42 may be stored in the sensory evaluation apparatus 101 in advance, or may be obtained from the cloud server 102 when the sensory evaluation apparatus 101 determines the sensory evaluation value.
- FIG. 4B shows functional blocks of the cloud server 102.
- the cloud server 102 includes a transmission unit 52, a reception unit 54, a sensory evaluation value input unit 56, a database 58, a regression analysis calculation unit 60, and a regression equation holding unit 62.
- These functional blocks are realized by software. Specifically, the functions of these functional blocks are realized by the arithmetic unit 20C executing the program stored in the memory 20B of the cloud server 102 illustrated in FIG. 1C.
- the cloud server 102 receives skin region image data and image feature indices from a plurality of sensory evaluation apparatuses 101 connected via the communication network 22.
- the evaluator looks at the image of the skin region collected in the cloud server 102, determines the sensory evaluation value of the image, and inputs the sensory evaluation value to the sensory evaluation value input unit 56.
- the database 58 stores the feature indices of two or more types of images corresponding to the skin region images and the determined sensory evaluation values in association with each other.
- the regression analysis calculation unit 60 obtains a correlation between the feature index and the sensory evaluation value from a combination of two or more kinds of feature indexes stored in the database 58 and the determined sensory evaluation value. Specifically, the coefficient of the regression equation for determining the sensory evaluation value from two or more feature indexes is determined by multiple regression analysis. The determined regression equation coefficient is held in the regression equation holding unit 62 and transmitted to the sensory evaluation device 101 at a predetermined timing.
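This multiple regression step can be sketched with ordinary least squares. The toy feature rows and sensory scores below are fabricated for illustration; only the technique (fitting regression coefficients that map two or more feature indexes to evaluator-assigned scores) reflects the text.

```python
import numpy as np

def fit_regression(feature_rows, sensory_scores):
    """Fit coefficients (b0, b1, ..., bk) of a regression equation
    predicting the sensory evaluation value from k feature indexes,
    as the regression analysis calculation unit 60 would."""
    X = np.column_stack([np.ones(len(feature_rows)),
                         np.asarray(feature_rows, dtype=float)])
    coeffs, *_ = np.linalg.lstsq(X, np.asarray(sensory_scores, dtype=float),
                                 rcond=None)
    return coeffs

def predict(coeffs, features):
    """Apply the fitted regression equation to one block's features."""
    return float(coeffs[0] + np.dot(coeffs[1:], features))

# Fabricated data: rows of (stain amount, wrinkle amount) with scores
# generated from y = 1 + 2*x1 - 0.5*x2, so the fit recovers it exactly.
rows = [(0, 0), (1, 0), (0, 2), (1, 1), (2, 2)]
scores = [1.0, 3.0, 0.0, 2.5, 4.0]
coeffs = fit_regression(rows, scores)
```

The fitted coefficients correspond to what the regression equation holding unit 62 stores and transmits to each sensory evaluation apparatus.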
- FIG. 5 is a flowchart showing the procedure of the skin evaluation method. The skin evaluation method using the sensory evaluation apparatus 101 will be described in more detail with reference to FIG. 4(a) and FIG. 5.
- an image of the subject's face is acquired (step S1).
- Photographing is performed using polarized light, under both the parallel and orthogonal polarization conditions.
- First, the first light source 12A is turned on and the first shooting is performed; the first light source 12A is then turned off, the second light source 12B is turned on, and the second shooting is performed.
- Since the polarizing plate of the imaging device 10 has a vertical polarization axis, the first shooting is performed under the parallel polarization condition and the second shooting under the orthogonal polarization condition.
- the light intensity and lighting time of the first light source 12A and the second light source 12B and the exposure conditions of the imaging device 10 may be the same or different.
- a first image and a second image are generated by the first shooting and the second shooting, respectively.
- For the calculation of each feature index, whichever of the first image and the second image was photographed under the condition suited to that feature index is used. Both images may also be used for calculating a feature index. Hereinafter, the first image and the second image are collectively referred to as the photographed face images.
- FIG. 6A shows an example of an initial screen before photographing displayed on the display device 14 of the sensory evaluation device 101.
- the display area of the display device 14 includes a main screen 70 having a large area and a sub-screen 72 having a small area that is positioned below the main screen 70.
- the captured image is displayed in real time on the main screen 70 and functions as a digital mirror. For example, information such as a clock and a weather forecast may be displayed on the sub screen 72.
- FIG. 6B shows a screen immediately before the start of shooting.
- The touch panel 16 is provided on the surface of the display device 14; when the subject touches the touch panel 16 with a finger or the like, an operation menu and a function-switching menu are displayed on the upper portion 80 and the left portion 78 of the sub-screen 72.
- a guide 76 for guiding the position of the face of the subject may be shown on the main screen 70.
- a mark or a number 74 indicating the timing of shooting may be shown.
- the photographed face image is displayed in real time on the right side 72R of the sub-screen 72, and a calendar is displayed on the left side 72L.
- the calendar may include a mark indicating that the sensory evaluation device 101 has been used for photographing in the past. After the display shown in FIG. 6B, shooting is performed as described above.
- First, the skin region extraction unit 34 specifies the position of the face 312 on the photographed image 302, as shown in FIG. 7.
- A known face-detection image processing technique can be used to specify the position of the face 312 on the image 302. After the position of the face 312 is specified, the regions of facial parts such as the eyes, the mouth, and the eyebrows are excluded from the region of the face 312, and the skin region 310 is specified on the image (step S3).
- the skin region extraction unit 34 divides the extracted skin region 310 to generate a unit block 310u (S4).
- the unit block 310u has a size suitable for determining a feature index to be calculated.
- the unit block 310u has a size of 10 pixels ⁇ 10 pixels.
- the unit blocks 310u may be set so as to partially overlap each other. In FIG. 7, the unit block 310u is shown large for ease of understanding.
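The division into unit blocks might look like the following sketch. The function name is hypothetical, and the stride parameter (which yields partially overlapping blocks when smaller than the block size, as the text allows) is an added convenience.

```python
import numpy as np

def unit_blocks(skin_region, block=10, stride=10):
    """Divide a (H, W, ...) skin-region array into block x block unit
    blocks (10 x 10 pixels in the embodiment). stride < block yields
    partially overlapping blocks. Returns (top, left, view) triples so
    each feature index can later be mapped back to its position."""
    h, w = skin_region.shape[:2]
    return [(top, left, skin_region[top:top + block, left:left + block])
            for top in range(0, h - block + 1, stride)
            for left in range(0, w - block + 1, stride)]
```

Each returned view is then handed to the feature index calculation of steps S5a through S5c, and the (top, left) coordinates support the positional display of step (e).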
- a feature index is calculated for each unit block (steps S5a, S5b, S5c).
- a stain amount and a wrinkle amount are calculated as feature indexes for each unit block.
- the amount of stain and the amount of wrinkle are, for example, the number of pixels in a region calculated as a stain or wrinkle.
- the luminance level for each unit block is also calculated. As shown in Table 1, an image photographed with orthogonal polarization is suitable for calculating the stain amount, so the second image is used; the first image is used for calculating the wrinkle amount.
- when calculating the stain amount, the difference between the blue pixel value and the red pixel value is obtained for all the pixels in the unit block 310u, and the stain amount is calculated by threshold processing.
- when calculating the wrinkle amount, the difference between the blue pixel value and the red pixel value is obtained for all the pixels in the unit block 310u, line-detection filter processing is performed, and the detected edge portion is counted as the wrinkle amount. The luminance level of the unit block 310u is calculated using the first image, the second image, or both.
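A minimal sketch of the per-block feature calculation described above, assuming RGB pixel arrays. The blue-minus-red difference, threshold processing for the stain amount, and line-detection filtering for the wrinkle amount follow the text; the threshold values, the specific 3 × 3 line-detection kernel, and the luminance weights are illustrative assumptions that the patent does not fix.

```python
import numpy as np

def stain_amount(block_rgb, threshold=20):
    """Stain amount: count of pixels whose blue-red difference exceeds
    a threshold. block_rgb: (N, N, 3) array in R, G, B order (assumed)."""
    diff = block_rgb[..., 2].astype(float) - block_rgb[..., 0].astype(float)
    return int(np.count_nonzero(diff > threshold))

def wrinkle_amount(block_rgb, threshold=40):
    """Wrinkle amount: edge pixels found by a simple horizontal
    line-detection kernel applied to the blue-red difference."""
    diff = block_rgb[..., 2].astype(float) - block_rgb[..., 0].astype(float)
    kernel = np.array([[-1, -1, -1],
                       [ 2,  2,  2],
                       [-1, -1, -1]], dtype=float)
    h, w = diff.shape
    resp = np.zeros_like(diff)
    for y in range(1, h - 1):          # valid region only, no padding
        for x in range(1, w - 1):
            resp[y, x] = np.sum(diff[y - 1:y + 2, x - 1:x + 2] * kernel)
    return int(np.count_nonzero(np.abs(resp) > threshold))

def luminance_level(block_rgb):
    """Mean luminance of the block (Rec. 601 weights, an assumption)."""
    r, g, b = (block_rgb[..., i].astype(float) for i in range(3))
    return float(np.mean(0.299 * r + 0.587 * g + 0.114 * b))
```

In this sketch, as in the text, the stain amount and wrinkle amount are pixel counts, so larger values mean more detected stain or wrinkle area within the 10 × 10 block.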
- the display data generation unit 44 generates image data of an image to be displayed by associating the transparency as the obtained sensory evaluation value with the coordinates in the skin region 310 of the unit block 310u from which the transparency was obtained.
- the transparency of all the unit blocks 310u in the skin region 310 is determined by repeating steps S4 to S7 (step S8).
- the feature index may be calculated and the transparency determined sequentially for each unit block 310u.
- alternatively, the feature indexes of all unit blocks 310u in the skin region may be calculated first, and then the transparency of all unit blocks 310u obtained.
- the display unit 45 displays a transparency map with the obtained unit block 310u as a unit.
- the map may be displayed in a color tone or gradation according to the transparency value.
- FIGS. 8A to 8D show examples of screens displayed on the display device 14 after the measurement is completed.
- the screen shown in FIG. 8A is displayed.
- an image photographed by the imaging device 10 is displayed in real time.
- the calculated feature index is shown by, for example, a radar chart 82.
- the average feature index of the entire measured skin area is displayed.
- on the left side 72L, an image of the face taken at the time of measurement is shown.
- a menu for designating the position of the face is displayed on the upper portion 80 of the sub screen 72, and a menu for switching contents and functions to be displayed is displayed on the left side portion 78.
- the subject can display a specific skin region of the face or change the displayed content by touching the upper part 80 and the left part 78 of the sub screen 72, for example.
- FIG. 8B is an example in which the location designated by the subject and the designated feature index are displayed on the right side 72R of the sub-screen 72. A part of the photographed image at the position designated by the subject is enlarged and displayed. Regions detected as stains or wrinkles are shown in red.
- FIG. 8C shows an example in which the determined transparency is displayed on the main screen as an average value for each skin region.
- the sensory evaluation value may be indicated by a numerical value.
- FIG. 8D shows a mapping display of transparency in units of the unit block 310u.
- the unit block 86 with a low transparency is shown in a predetermined color so as to be superimposed on the captured image.
- since the transparency determined as the sensory evaluation value is displayed superimposed on the photographed face image, it can be easily recognized, for example, which parts of the entire face have high transparency.
- the sensory evaluation device 101 may display additional information on the display device 14 based on the result of the sensory evaluation value.
- FIG. 9A shows an example in which, when transparency is shown as an average value for each skin region on the main screen as in FIG. 8A, the feature index responsible for a decrease in the sensory evaluation value is displayed.
- FIG. 9A includes a display 87 indicating that the stain amount is lowering the transparency evaluation.
- FIG. 9B shows an example in which, when transparency is displayed by mapping in units of the unit block 310u as shown in FIG. 8B, a region having a low sensory evaluation value is highlighted.
- a region with low transparency is highlighted with a marker 88.
- advice for improving the sensory evaluation value, and information on cosmetics and beauty equipment intended to improve the sensory evaluation value, may be further displayed on the sub-screen 72.
- such information related to the sensory evaluation value may be stored in advance in the memory 18A of the control device 18 of the sensory evaluation device 101, or the control device 18 may receive such information from the cloud server 102.
- FIGS. 10A to 10D show examples of information displayed on the sub screen 72 of the display device 14.
- a calendar may be displayed on the sub screen.
- the calendar shows information measured in the past.
- when the subject touches the corresponding information on the sub-screen, the past measurement result can be displayed on the main screen 70.
- a radar chart of feature indexes may be further displayed on the sub screen 72.
- information on cosmetics and beauty equipment for improving the sensory evaluation value may be displayed in detail.
- FIG. 11 shows the correlation between the subjectively evaluated transparency and the transparency determined using the regression equation.
- teacher data is data (skin area image) used to determine the coefficient of the regression equation.
- the test data is data in which the sense of transparency is determined by the sensory evaluation device 101 using the determined coefficient.
- the sense of transparency was also determined by the sensory evaluation device 101 for the teacher data.
- the evaluator also subjectively evaluated the transparency of the test data.
- the correlation coefficient R² between the transparency based on the subjective evaluation obtained from the teacher data and the transparency determined by the sensory evaluation device 101 is about 0.89. This means that, in the images used as the teacher data, there is a high correlation between the transparency based on the subjective evaluation and the transparency determined from the stain amount, the wrinkle amount, and the luminance level.
- the correlation coefficient R² between the transparency based on the subjective evaluation obtained from the test data and the transparency determined by the sensory evaluation device 101 is about 0.79. This means that the regression equation obtained from the teacher data can be suitably used to estimate, with high correlation, the transparency from the stain amount, the wrinkle amount, and the luminance level even in images other than the teacher data.
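The teacher/test comparison described above can be reproduced on any dataset by fitting the regression equation on teacher data and scoring R² on both sets. Only the use of a linear multiple-regression model over stain amount, wrinkle amount, and luminance level comes from the text; the data below is synthetic and purely illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n):
    """Synthetic data: columns = stain amount, wrinkle amount, luminance
    level. The underlying linear relation and noise are assumptions."""
    X = rng.uniform(0, 100, size=(n, 3))
    y = 90 - 0.3 * X[:, 0] - 0.2 * X[:, 1] + 0.1 * X[:, 2] + rng.normal(0, 2, n)
    return X, y

X_teach, y_teach = make_data(200)  # "teacher data"
X_test, y_test = make_data(50)     # "test data"

# Multiple linear regression via least squares: y ~ b0 + b . x
A = np.column_stack([np.ones(len(X_teach)), X_teach])
coef, *_ = np.linalg.lstsq(A, y_teach, rcond=None)

def r_squared(X, y):
    pred = coef[0] + X @ coef[1:]
    ss_res = np.sum((y - pred) ** 2)
    ss_tot = np.sum((y - np.mean(y)) ** 2)
    return 1 - ss_res / ss_tot

print(r_squared(X_teach, y_teach))  # teacher-data fit
print(r_squared(X_test, y_test))    # held-out test data
```

As in the patent's figures, R² on the held-out test data is somewhat lower than on the teacher data but remains high when the linear model generalizes.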
- FIG. 12 shows the correlation between the amount of stain in an image in which the evaluator has subjectively evaluated transparency and the transparency by subjective evaluation.
- the correlation coefficient R² between the stain amount and the transparency by subjective evaluation is about 0.41, which is not very high.
- the inventor of the present application similarly obtained the correlation coefficient R² between a single feature index and the transparency by subjective evaluation for the wrinkle amount, the nasolabial folds, and the pore amount, and found that all of them were about 0.4. From this, it can be seen that it is difficult to estimate skin transparency with high correlation using only one feature index, such as stains or wrinkles, obtained directly from the image of the skin region.
- by using two or more feature indexes, it is possible to estimate skin transparency with high accuracy.
- in the present embodiment, transparency has been described as the sensory evaluation value.
- skin age and skin impression can also be determined as subjective indicators related to the skin. In this case, these sensory evaluation values can be similarly determined with high accuracy from the feature index of the skin image.
- the sensory evaluation value of the skin can be determined with high accuracy by using two or more feature indices of the skin image.
- the sensory evaluation value of the skin can be determined by non-contact measurement, and the sensory evaluation value can be determined more easily. For example, since it can be determined in units of a certain size for each part of the skin of the face, a sensory evaluation value having a high correlation with the sensory evaluation value when the face is actually evaluated can be automatically obtained.
- FIG. 13 is a flowchart showing a procedure for determining a regression equation in the cloud server 102.
- the cloud server 102 may determine a coefficient of a regression equation that is initially set in the sensory evaluation device 101. In addition, the cloud server 102 sequentially receives the image of the skin region and the image feature index from the plurality of sensory evaluation devices 101 connected to the cloud server 102, and updates the regression equation each time the data reaches a predetermined amount. The updated regression equation may be transmitted to a plurality of connected sensory evaluation apparatuses 101.
- the sensory evaluation device 101 executes steps S1 to S4 and steps S5a, S5b, and S5c as described above, and calculates the stain amount, wrinkle amount, and luminance level, which are the feature indexes, for each unit block.
- the cloud server 102 receives these data from the sensory evaluation device 101. In addition, the image data of the skin region for which these feature indexes are obtained is also received.
- the sensory evaluation value input unit 56 receives the transparency that the evaluator subjectively determines by looking at the image of the skin region (step S26). Since evaluation of transparency by the evaluator is difficult to perform for each unit block, it is performed for each skin region of the face. In this case, the same transparency is associated with the unit blocks in the same skin region.
- the evaluation of transparency need not be performed by the evaluator directly accessing the cloud server 102.
- the cloud server 102 may transmit skin region image data necessary for evaluation to an information terminal that can be used by an evaluator.
- the evaluator may evaluate the transparency by looking at the image displayed on the information terminal, associate the evaluation result with the image, and transmit the evaluation result to the cloud server 102.
- the input transparency is recorded in the database 58 in association with the stain amount, wrinkle amount, and luminance level data for each unit block (step S27). This procedure is performed for all the unit blocks in the same skin region (step S28). Further, the transparency of all images for which it has not yet been input is evaluated by the same procedure and recorded in the database 58 (step S29).
- in order to determine the correlation between the stain amount, wrinkle amount, and luminance level and the transparency, the regression analysis calculation unit 60 performs a multiple regression analysis of these values and obtains the regression equation (step S30).
- the regression equation holding unit 62 holds the determined regression equation. Specifically, the regression equation coefficients are stored (step S31).
- each sensory evaluation device 101 can thereby update its regression equation as needed and determine the transparency with the updated regression equation.
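The server-side flow of steps S26 to S31 can be sketched as an accumulate-and-refit loop: received records go into the database, and the regression equation is refitted each time a predetermined amount of data has accumulated. The class and method names and the refit interval below are illustrative assumptions.

```python
import numpy as np

class RegressionServer:
    """Sketch of the cloud-server flow (steps S26-S31): accumulate
    (stain, wrinkle, luminance, transparency) records and refit the
    regression equation once enough new data has arrived."""

    def __init__(self, refit_every=50):
        self.records = []   # plays the role of database 58
        self.coef = None    # plays the role of regression-equation holding unit 62
        self.refit_every = refit_every

    def add_record(self, features, transparency):
        """Store one record; refit when a predetermined amount accumulates."""
        self.records.append((*features, transparency))
        if len(self.records) % self.refit_every == 0:
            self._refit()
        return self.coef    # updated coefficients would be sent to devices

    def _refit(self):
        data = np.asarray(self.records, dtype=float)
        X = np.column_stack([np.ones(len(data)), data[:, :3]])
        self.coef, *_ = np.linalg.lstsq(X, data[:, 3], rcond=None)

# Illustrative usage with an assumed exact linear relation.
server = RegressionServer(refit_every=50)
rng = np.random.default_rng(0)
for _ in range(50):
    s, w, l = rng.uniform(0, 100, 3)
    server.add_record((s, w, l), 50 - 0.5 * s - 0.3 * w + 0.2 * l)
print(server.coef)  # approximately [50, -0.5, -0.3, 0.2]
```

Distributing `server.coef` to connected devices corresponds to transmitting the updated regression equation described in the text.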
- the sensory evaluation device of the present invention can be realized by a portable information device, such as a smartphone, a tablet-type information device, or a laptop computer, that includes a display unit and a camera positioned on the display unit side.
- by storing in the portable information device a program for executing the evaluation method that determines the sensory evaluation value of the skin according to the procedure shown in FIG. 4, and executing the program with the calculation unit of the portable information device, the present invention can be implemented as described above.
- the sensory evaluation device and the evaluation method for determining the sensory evaluation value of the present embodiment may also be used for sensory evaluation of parts other than the face, for example the hands.
- the evaluation target is not limited to the face; imaging may be performed down to the chest, and sensory evaluation values of the neck and chest may be measured.
- the sensory evaluation apparatus and the evaluation method for determining the sensory evaluation value disclosed in the present application are suitably used for sensory evaluation of the skin.
Abstract
Description
12 light source
12A first light source
12B second light source
14 display device
16 touch panel
18 control device
18A memory
18B calculation unit
18C communication unit
20 cloud server
20A database
20B memory
20C calculation unit
20D communication unit
22 communication network
24 subject
24h head
32 imaging unit
34 skin region extraction unit
36 feature index calculation unit
38 transmission unit
40 reception unit
42 sensory evaluation value determination unit
44 display data generation unit
45 display unit
46 control unit
48 input unit
52 transmission unit
54 reception unit
56 sensory evaluation value input unit
58 database
60 regression analysis calculation unit
62 regression equation holding unit
70 main screen
72 sub-screen
72L left side
72R right side
101 sensory evaluation device
102 cloud server
Claims (19)
- An evaluation method for determining a sensory evaluation value of skin from an image, the method comprising:
(a) acquiring an image including the skin of a subject;
(b) extracting a skin region from the image;
(c) calculating, in the skin region, at least two feature indexes indicating features in the image; and
(d) determining a sensory evaluation value of the skin based on the calculated at least two feature indexes.
- The evaluation method according to claim 1, wherein, in step (b), the skin region is divided into a plurality of unit blocks,
in steps (c) and (d), the at least two feature indexes are calculated and the sensory evaluation value of the skin is determined for each unit block, and
(e) the sensory evaluation value of the skin obtained for each unit block is displayed on a display device in association with the position of the unit block.
- The evaluation method according to claim 2, wherein step (b) detects the face of the subject in the image, and
extracts the skin region by excluding regions of face parts on the image based on the position of the detected face.
- The evaluation method according to claim 3, wherein, in step (e), the sensory evaluation value of the skin obtained for each unit block is displayed on the display device in association with the position of the unit block, in a color tone or gradation corresponding to the sensory evaluation value.
- The evaluation method according to any one of claims 1 to 4, wherein the feature index is one selected from the group consisting of stains, wrinkles, texture and pores of the skin, nasolabial folds of the face, the average, variance and hue of pixel values in the unit block, and the reflectance, moisture, oil content and color unevenness of the surface of the skin.
- The evaluation method according to claim 5, wherein the sensory evaluation value is one selected from the group consisting of the transparency of the skin, the skin age of the subject and the skin impression of the subject.
- The evaluation method according to claim 6, wherein step (d) determines the sensory evaluation value of the skin based on a correlation between the at least two feature indexes measured in advance from a plurality of subjects and sensory evaluation values determined by evaluating the skin of the plurality of subjects.
- The evaluation method according to claim 7, wherein the correlation is obtained by multiple regression analysis.
- The evaluation method according to any one of claims 1 to 8, wherein information on beauty equipment or cosmetics related to the calculated feature index or the determined sensory evaluation value is further displayed on the display device.
- A sensory evaluation device comprising:
an imaging unit that acquires an image including the skin of a subject;
a skin region extraction unit that extracts a skin region from the image;
a feature index calculation unit that calculates, in the skin region, at least two feature indexes indicating features in the image; and
a sensory evaluation value determination unit that determines a sensory evaluation value of the skin based on the calculated at least two feature indexes.
- The sensory evaluation device according to claim 10, further comprising a display unit, wherein
the skin region extraction unit divides the skin region into a plurality of unit blocks,
the feature index calculation unit calculates the at least two feature indexes for each unit block,
the sensory evaluation value determination unit determines the sensory evaluation value of the skin for each unit block, and
the display unit displays the sensory evaluation value of the skin obtained for each unit block in association with the position of the unit block.
- The sensory evaluation device according to claim 11, wherein the skin region extraction unit detects the face of the subject in the image, and
extracts the skin region by excluding regions of face parts on the image based on the position of the detected face.
- The sensory evaluation device according to claim 12, wherein the display unit displays the sensory evaluation value of the skin obtained for each unit block in association with the position of the unit block, in a color tone or gradation corresponding to the sensory evaluation value.
- The sensory evaluation device according to any one of claims 10 to 13, wherein the feature index is one selected from the group consisting of stains, wrinkles, texture and pores of the skin, nasolabial folds of the face, the average, variance and hue of pixel values in the unit block, and the reflectance, moisture, oil content and color unevenness of the surface of the skin.
- The sensory evaluation device according to claim 14, wherein the sensory evaluation value is one selected from the group consisting of the transparency of the skin, the skin age of the subject and the skin impression of the subject.
- The sensory evaluation device according to claim 15, wherein the sensory evaluation value determination unit determines the sensory evaluation value of the skin based on a correlation between the at least two feature indexes measured in advance from a plurality of subjects and sensory evaluation values determined by evaluating the skin of the plurality of subjects.
- The sensory evaluation device according to claim 16, wherein the correlation is obtained by multiple regression analysis.
- The sensory evaluation device according to any one of claims 10 to 17, wherein the display unit further displays information on beauty equipment or cosmetics related to the calculated feature index or the determined sensory evaluation value.
- A skin sensory evaluation device comprising:
an imaging device;
a control device including a storage element and a calculation element;
a display device; and
a program recorded in the storage element and configured to be executable by the calculation element,
wherein the program causes:
(a) acquiring an image including the skin of a subject;
(b) extracting a skin region from the image;
(c) calculating, in the skin region, at least two feature indexes indicating features in the image;
(d) determining a sensory evaluation value of the skin based on the calculated at least two feature indexes; and
(e) displaying the determined sensory evaluation value of the skin on the display device.
Priority Applications (4)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2015523859A JP6345176B2 (ja) | 2013-06-28 | 2014-06-20 | 肌の官能評価装置および肌の評価方法 |
EP14817805.6A EP3000386B1 (en) | 2013-06-28 | 2014-06-20 | Skin function evaluation device and skin evaluation method |
US14/900,863 US10667744B2 (en) | 2013-06-28 | 2014-06-20 | Skin function evaluation device and skin evaluation method |
CN201480036117.3A CN105338887B (zh) | 2013-06-28 | 2014-06-20 | 肌肤的感官评价装置以及肌肤的评价方法 |
Applications Claiming Priority (2)
Application Number | Priority Date | Filing Date | Title |
---|---|---|---|
JP2013136185 | 2013-06-28 | ||
JP2013-136185 | 2013-06-28 |
Publications (1)
Publication Number | Publication Date |
---|---|
WO2014208067A1 true WO2014208067A1 (ja) | 2014-12-31 |
Family
ID=52141426
Family Applications (1)
Application Number | Title | Priority Date | Filing Date |
---|---|---|---|
PCT/JP2014/003328 WO2014208067A1 (ja) | 2013-06-28 | 2014-06-20 | 肌の官能評価装置および肌の評価方法 |
Country Status (5)
Country | Link |
---|---|
US (1) | US10667744B2 (ja) |
EP (1) | EP3000386B1 (ja) |
JP (1) | JP6345176B2 (ja) |
CN (1) | CN105338887B (ja) |
WO (1) | WO2014208067A1 (ja) |
Cited By (13)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015125552A1 (ja) * | 2014-02-21 | 2015-08-27 | 富士フイルム株式会社 | うるおい感評価装置、うるおい感評価方法およびうるおい感評価プログラム |
JP2016075561A (ja) * | 2014-10-06 | 2016-05-12 | パナソニックIpマネジメント株式会社 | 光沢判定装置および光沢判定方法 |
JP2017012384A (ja) * | 2015-06-30 | 2017-01-19 | 花王株式会社 | シワ状態分析装置及びシワ状態分析方法 |
JPWO2016121518A1 (ja) * | 2015-01-29 | 2017-11-09 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP2018060532A (ja) * | 2016-10-06 | 2018-04-12 | 御木本製薬株式会社 | 皮膚評価方法 |
JP2019025190A (ja) * | 2017-08-02 | 2019-02-21 | 花王株式会社 | 肌の評価方法 |
KR20190051256A (ko) * | 2017-11-06 | 2019-05-15 | 주식회사 베이바이오텍 | 이미지 분석 결과 및 학습된 피부 인식기에 기초하여 피부의 상태를 평가하는 방법 및 프로그램 |
JP2019217253A (ja) * | 2018-06-14 | 2019-12-26 | 株式会社コーセー | 皮膚透明感を評価する方法 |
KR20200020767A (ko) * | 2020-02-19 | 2020-02-26 | 주식회사 베이바이오텍 | 피부 상태 평가 방법 및 프로그램 |
US11069057B2 (en) | 2016-05-25 | 2021-07-20 | Panasonic Intellectual Property Management Co., Ltd. | Skin diagnostic device and skin diagnostic method |
JP2022003542A (ja) * | 2017-07-28 | 2022-01-11 | ポーラ化成工業株式会社 | 肌状態のケアに関する情報出力システム |
JP2022529677A (ja) * | 2019-04-23 | 2022-06-23 | ザ プロクター アンド ギャンブル カンパニー | 美容的皮膚特性を視覚化するための装置及び方法 |
US11605243B2 (en) | 2019-04-23 | 2023-03-14 | The Procter & Gamble Company | Apparatus and method for determining cosmetic skin attributes |
Families Citing this family (15)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
EP2972155B1 (en) | 2013-03-15 | 2023-08-23 | Shiseido Company, Limited | Systems and methods for specifying and formulating customized topical agents |
KR102429838B1 (ko) * | 2016-03-11 | 2022-08-05 | (주)아모레퍼시픽 | 피부결 블랍을 기초로한 피부결 평가 장치 및 그 방법 |
JP6872742B2 (ja) * | 2016-06-30 | 2021-05-19 | 学校法人明治大学 | 顔画像処理システム、顔画像処理方法及び顔画像処理プログラム |
WO2018043826A1 (ko) * | 2016-08-31 | 2018-03-08 | 주식회사 톤28 | 사용자 맞춤형 화장품 제조장치 및 방법 |
CN111066060B (zh) | 2017-07-13 | 2024-08-02 | 资生堂株式会社 | 虚拟面部化妆去除和模拟、快速面部检测和地标跟踪 |
US10803677B2 (en) * | 2018-04-30 | 2020-10-13 | Mathew Powers | Method and system of automated facial morphing for eyebrow hair and face color detection |
WO2020014695A1 (en) | 2018-07-13 | 2020-01-16 | Shiseido Americas Corporation | System and method for adjusting custom topical agents |
CA3106578A1 (en) * | 2018-07-16 | 2020-01-23 | Swift Medical Inc. | Apparatus for visualization of tissue |
AU2019307872A1 (en) * | 2018-07-16 | 2021-01-28 | Skin Rejuvenation Technologies (Pty) Ltd | A method and system for cosmetic recommendations |
CN109685046B (zh) * | 2019-02-28 | 2023-09-29 | 福建工程学院 | 一种基于图像灰度的皮肤光透明程度分析方法及其装置 |
WO2020189754A1 (ja) * | 2019-03-20 | 2020-09-24 | 学校法人慶應義塾 | 推定方法、推定モデルの生成方法、プログラム、及び推定装置 |
JP7539917B2 (ja) | 2019-04-09 | 2024-08-26 | 株式会社 資生堂 | 画像取り込みが改善された局所用剤を作成するためのシステムおよび方法 |
WO2022010310A1 (en) * | 2020-07-09 | 2022-01-13 | Samsung Electronics Co., Ltd. | Electronic device for acquiring image by using light-emitting module having polarizing filter and method for controlling same |
CN117241722A (zh) * | 2021-04-26 | 2023-12-15 | 莱雅公司 | 用于检测人类受试者的皮肤状况的计算设备、方法和装置 |
CN115829910B (zh) * | 2022-07-07 | 2023-06-27 | 广州莱德璞检测技术有限公司 | 一种肌肤的感官评价装置以及肌肤的评价方法 |
Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003024306A (ja) * | 2001-07-13 | 2003-01-28 | Shiseido Co Ltd | 皮膚のはり評価システム |
JP2007252891A (ja) * | 2006-02-23 | 2007-10-04 | Masahiro Nakagawa | 肌の美しさの目視評価値の推定方法 |
JP4105554B2 (ja) | 2003-01-16 | 2008-06-25 | 株式会社資生堂 | 皮膚の透明感評価方法 |
Family Cites Families (22)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
US6088392A (en) * | 1997-05-30 | 2000-07-11 | Lucent Technologies Inc. | Bit rate coder for differential quantization |
US20030065552A1 (en) * | 2001-10-01 | 2003-04-03 | Gilles Rubinstenn | Interactive beauty analysis |
US20030065523A1 (en) * | 2001-10-01 | 2003-04-03 | Francis Pruche | Early detection of beauty treatment progress |
US7324668B2 (en) | 2001-10-01 | 2008-01-29 | L'oreal S.A. | Feature extraction in beauty analysis |
JP4761924B2 (ja) | 2004-10-22 | 2011-08-31 | 株式会社 資生堂 | 肌状態診断システムおよび美容のためのカウンセリングシステム |
CN101083940B (zh) | 2004-10-22 | 2010-06-16 | 株式会社资生堂 | 皮肤状况诊断系统和美容咨询系统 |
EA015357B1 (ru) * | 2005-03-23 | 2011-06-30 | Мэри Кэй Инк. | Осветляющая кожу композиция |
US20070190175A1 (en) * | 2005-12-08 | 2007-08-16 | Tasker Products Ip Holding Corp | Skin care composition for dermatological disorders |
US20090245603A1 (en) * | 2007-01-05 | 2009-10-01 | Djuro Koruga | System and method for analysis of light-matter interaction based on spectral convolution |
JP2010515489A (ja) * | 2007-01-05 | 2010-05-13 | マイスキン インコーポレイテッド | 皮膚を撮像するためのシステム、装置、及び方法 |
US20080304736A1 (en) * | 2007-02-20 | 2008-12-11 | Masahiro Nakagawa | Method of estimating a visual evaluation value of skin beauty |
JP5290585B2 (ja) | 2008-01-17 | 2013-09-18 | 株式会社 資生堂 | 肌色評価方法、肌色評価装置、肌色評価プログラム、及び該プログラムが記録された記録媒体 |
WO2010082942A1 (en) * | 2008-02-01 | 2010-07-22 | Canfield Scientific, Incorporated | Automatic mask design and registration and feature detection for computer-aided skin analysis |
KR101076307B1 (ko) * | 2009-02-16 | 2011-10-26 | 고려대학교 산학협력단 | 피부 나이 추론 방법 |
US8150501B2 (en) * | 2009-03-26 | 2012-04-03 | Johnson & Johnson Consumer Companies, Inc. | Method for measuring skin erythema |
FR2944899B1 (fr) * | 2009-04-23 | 2014-04-25 | Lvmh Rech | Procede et appareil de caracterisation des taches pigmentaires et procede d'appreciation de l'effet de traitement d'une tache pigmentaire par un produit cosmetique |
FR2952519B1 (fr) | 2009-11-13 | 2012-07-20 | O Ts Team Creatif | Procede et dispositif pour l'analyse de la tendance d'une peau a etre plus ou moins seche. |
JP5638234B2 (ja) * | 2009-12-22 | 2014-12-10 | ショットモリテックス株式会社 | 透明感測定装置および透明感測定方法 |
JP2012053813A (ja) | 2010-09-03 | 2012-03-15 | Dainippon Printing Co Ltd | 人物属性推定装置、人物属性推定方法、及びプログラム |
US20120288168A1 (en) * | 2011-05-09 | 2012-11-15 | Telibrahma Convergent Communications Pvt. Ltd. | System and a method for enhancing appeareance of a face |
JP5657494B2 (ja) * | 2011-09-22 | 2015-01-21 | 富士フイルム株式会社 | シワ検出方法、シワ検出装置およびシワ検出プログラム、並びに、シワ評価方法、シワ評価装置およびシワ評価プログラム |
CN104067311B (zh) * | 2011-12-04 | 2017-05-24 | 数码装饰有限公司 | 数字化妆 |
-
2014
- 2014-06-20 WO PCT/JP2014/003328 patent/WO2014208067A1/ja active Application Filing
- 2014-06-20 JP JP2015523859A patent/JP6345176B2/ja active Active
- 2014-06-20 EP EP14817805.6A patent/EP3000386B1/en active Active
- 2014-06-20 CN CN201480036117.3A patent/CN105338887B/zh active Active
- 2014-06-20 US US14/900,863 patent/US10667744B2/en active Active
Patent Citations (3)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
JP2003024306A (ja) * | 2001-07-13 | 2003-01-28 | Shiseido Co Ltd | 皮膚のはり評価システム |
JP4105554B2 (ja) | 2003-01-16 | 2008-06-25 | 株式会社資生堂 | 皮膚の透明感評価方法 |
JP2007252891A (ja) * | 2006-02-23 | 2007-10-04 | Masahiro Nakagawa | 肌の美しさの目視評価値の推定方法 |
Cited By (19)
Publication number | Priority date | Publication date | Assignee | Title |
---|---|---|---|---|
WO2015125552A1 (ja) * | 2014-02-21 | 2015-08-27 | 富士フイルム株式会社 | うるおい感評価装置、うるおい感評価方法およびうるおい感評価プログラム |
JP2016075561A (ja) * | 2014-10-06 | 2016-05-12 | パナソニックIpマネジメント株式会社 | 光沢判定装置および光沢判定方法 |
JPWO2016121518A1 (ja) * | 2015-01-29 | 2017-11-09 | ソニー株式会社 | 情報処理装置、情報処理方法、およびプログラム |
JP2017012384A (ja) * | 2015-06-30 | 2017-01-19 | 花王株式会社 | シワ状態分析装置及びシワ状態分析方法 |
US11069057B2 (en) | 2016-05-25 | 2021-07-20 | Panasonic Intellectual Property Management Co., Ltd. | Skin diagnostic device and skin diagnostic method |
JP2018060532A (ja) * | 2016-10-06 | 2018-04-12 | 御木本製薬株式会社 | 皮膚評価方法 |
JP7083478B2 (ja) | 2016-10-06 | 2022-06-13 | 御木本製薬株式会社 | 皮膚評価方法 |
JP2022003542A (ja) * | 2017-07-28 | 2022-01-11 | ポーラ化成工業株式会社 | 肌状態のケアに関する情報出力システム |
JP7264959B2 (ja) | 2017-07-28 | 2023-04-25 | ポーラ化成工業株式会社 | 肌状態のケアに関する情報出力システム |
JP2019025190A (ja) * | 2017-08-02 | 2019-02-21 | 花王株式会社 | 肌の評価方法 |
KR102151710B1 (ko) * | 2017-11-06 | 2020-09-04 | 주식회사 베이바이오텍 | 이미지 분석 결과 및 학습된 피부 인식기에 기초하여 피부의 상태를 평가하는 방법 및 프로그램 |
KR20190051256A (ko) * | 2017-11-06 | 2019-05-15 | 주식회사 베이바이오텍 | 이미지 분석 결과 및 학습된 피부 인식기에 기초하여 피부의 상태를 평가하는 방법 및 프로그램 |
JP2019217253A (ja) * | 2018-06-14 | 2019-12-26 | 株式会社コーセー | 皮膚透明感を評価する方法 |
JP7307544B2 (ja) | 2018-06-14 | 2023-07-12 | 株式会社コーセー | 皮膚透明感を評価する方法 |
JP2022529677A (ja) * | 2019-04-23 | 2022-06-23 | ザ プロクター アンド ギャンブル カンパニー | 美容的皮膚特性を視覚化するための装置及び方法 |
JP7235895B2 (ja) | 2019-04-23 | 2023-03-08 | ザ プロクター アンド ギャンブル カンパニー | 美容的皮膚特性を視覚化するための装置及び方法 |
US11605243B2 (en) | 2019-04-23 | 2023-03-14 | The Procter & Gamble Company | Apparatus and method for determining cosmetic skin attributes |
KR20200020767A (ko) * | 2020-02-19 | 2020-02-26 | 주식회사 베이바이오텍 | 피부 상태 평가 방법 및 프로그램 |
KR102189865B1 (ko) * | 2020-02-19 | 2020-12-11 | 주식회사 베이바이오텍 | 피부 상태 평가 방법 및 프로그램 |
Also Published As
Publication number | Publication date |
---|---|
EP3000386B1 (en) | 2021-01-20 |
JPWO2014208067A1 (ja) | 2017-02-23 |
EP3000386A4 (en) | 2016-07-06 |
US20160135730A1 (en) | 2016-05-19 |
US10667744B2 (en) | 2020-06-02 |
JP6345176B2 (ja) | 2018-06-20 |
CN105338887B (zh) | 2019-10-25 |
CN105338887A (zh) | 2016-02-17 |
EP3000386A1 (en) | 2016-03-30 |
Similar Documents
Publication | Publication Date | Title |
---|---|---|
JP6345176B2 (ja) | 肌の官能評価装置および肌の評価方法 | |
JP6343612B2 (ja) | 肌分析方法、肌分析装置および肌分析装置の制御方法 | |
US10617301B2 (en) | Information processing device and information processing method | |
US8155413B2 (en) | Method and system for analyzing skin conditions using digital images | |
WO2016121518A1 (ja) | 情報処理装置、情報処理方法、およびプログラム | |
US20090060304A1 (en) | Dermatology information | |
US20150373264A1 (en) | Digital mirror apparatus | |
KR20160144971A (ko) | 정보 처리 장치, 정보 처리 방법 및 프로그램 | |
JP2016112024A (ja) | 情報処理装置の制御方法および画像処理方法 | |
US10803988B2 (en) | Color analysis and control using a transparent display screen on a mobile device with non-transparent, bendable display screen or multiple display screen with 3D sensor for telemedicine diagnosis and treatment | |
KR101640456B1 (ko) | 디스플레이 패널의 각 픽셀들의 개구부를 통해 촬영하는 촬영 장치 및 방법 | |
US10055844B2 (en) | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method | |
US11730372B2 (en) | Accessory device and imaging device | |
US20080166268A1 (en) | Test sheet, apparatus, method and program for diagnosing an object | |
JP2013212177A (ja) | 画像解析方法、画像解析装置、及び画像解析プログラム | |
US9852503B2 (en) | Diagnostic apparatus for lesion, image processing method in the same apparatus, and medium storing program associated with the same method | |
JP6152938B2 (ja) | 電子ミラー装置 | |
JP2014157428A (ja) | 画像表示装置、画像表示方法、及び、プログラム | |
JP6527765B2 (ja) | シワ状態分析装置及びシワ状態分析方法 | |
JP2017192767A (ja) | 画像解析方法、画像解析装置、及び画像解析プログラム | |
JP2017023474A (ja) | 皮膚毛穴解析装置,皮膚毛穴解析方法および該装置に用いるプログラム | |
KR101957773B1 (ko) | 영상을 이용한 피부 상태 평가 방법 및 영상을 이용한 피부 상태 평가 장치 | |
US20180040124A1 (en) | Disease diagnostic apparatus, image processing method in the same apparatus, and medium storing program associated with the same method | |
CN115662324A (zh) | 柔性显示屏的显示补偿方法、装置及显示装置 | |
JP2005034424A (ja) | 皮膚表面状態評価方法並びにその装置 |
Legal Events
Date | Code | Title | Description |
---|---|---|---|
WWE | Wipo information: entry into national phase |
Ref document number: 201480036117.3 Country of ref document: CN |
|
121 | Ep: the epo has been informed by wipo that ep was designated in this application |
Ref document number: 14817805 Country of ref document: EP Kind code of ref document: A1 |
|
ENP | Entry into the national phase |
Ref document number: 2015523859 Country of ref document: JP Kind code of ref document: A |
|
WWE | Wipo information: entry into national phase |
Ref document number: 14900863 Country of ref document: US |
|
WWE | Wipo information: entry into national phase |
Ref document number: 2014817805 Country of ref document: EP |
|
NENP | Non-entry into the national phase |
Ref country code: DE |