WO2023217626A1 - Detection and visualization of cutaneous signs using a heat map - Google Patents


Info

Publication number
WO2023217626A1
WO2023217626A1 (application PCT/EP2023/061788, EP2023061788W)
Authority
WO
WIPO (PCT)
Prior art keywords
interest
cutaneous signs
image
heat map
zone
Application number
PCT/EP2023/061788
Other languages
French (fr)
Inventor
Panagiotis-alexandros BOKARIS
Julien Despois
Matthieu PERROT
Frédéric FLAMENT
Benjamin ASKENAZI
Original Assignee
L'oreal
Application filed by L'Oréal
Publication of WO2023217626A1

Classifications

    • G06V10/82 Image or video recognition or understanding using neural networks
    • G06N3/045 Neural network architectures: combinations of networks
    • G06N3/0464 Convolutional networks [CNN, ConvNet]
    • G06N3/0475 Generative networks
    • G06T7/0012 Biomedical image inspection
    • G06V40/161 Human faces: detection; localisation; normalisation
    • A45D44/005 Selecting or displaying personal cosmetic colours or hairstyle
    • G06T2207/10024 Colour image (image acquisition modality)
    • G06T2207/20084 Artificial neural networks [ANN]
    • G06T2207/30088 Skin; dermal (biomedical image processing)

Definitions

  • the subject of the invention is a cosmetic and non-therapeutic method for detecting and quantifying cutaneous signs in a zone of interest of a user. It is also directed to a system for implementing this method.
  • a cosmetic product is a product as defined in Regulation (EC) No 1223/2009 of the European Parliament and of the Council dated 30 November 2009 relating to cosmetic products.
  • the document US8218862 discloses a method for generating a cutaneous mask delimiting a region of interest (ROI) in an image of skin, comprising a skin detection step.
  • These atlases represent a portion of the knowledge on which the tools for evaluating cutaneous signs are based. Their value for evaluating cosmetic products and for modelling the kinetics of the development of ageing, based on hundreds of thousands of photos, is well established.
  • In order to automate the diagnostics, the data from the atlases were used to feed the algorithms by associating each photo with a score; the algorithms were then trained to score these various observation areas.
  • the patent application WO2020169214 filed by the applicant discloses predicting cutaneous ageing by taking into account the atlases and also the surroundings or the habits of a user, notably sleep time, pollution and/or living space. By combining the data originating from selfies with information such as the pace of life, the sleep or the surroundings of users, this prediction offers an "ageing curve" based on these data.
  • the invention relates to a cosmetic and non-therapeutic method for detecting and quantifying cutaneous signs in a zone of interest of a user, notably the contour of the eyes, the cheeks, the contour of the mouth, the forehead, said method comprising (i) acquisition of an input image of a zone of interest of a user, including a region of interest encompassing cutaneous signs, (ii) analysis of the zone of interest in order to supply a score of the cutaneous signs, by comparison with reference scores pre-recorded in a reference database, (iii) visualization of the region of interest in the form of a heat map for detecting the cutaneous signs, the heat map being obtained from an algorithm focussed on the cutaneous signs, (iv) merging of the heat map with the input image of the region of interest in order to supply an output image of the region of interest and to optimize the visualization of the cutaneous signs.
  • the "reference database" can comprise:
  • Atlases displayed on screen or printed on a medium or in a form stored on a computer storage medium such as the atlas described in the patent application WO2011/141769, each atlas representing various gradations of at least one feature of bodily typology,
  • the data can be processed by artificial intelligence (AI) algorithms which can include fuzzy logic, neural networks, genetic programming and decision tree programming.
  • All engines can be trained on the basis of inputs such as information on the product, advice from experts, a user profile or data based on sensory perception.
  • an AI engine can implement an iterative learning process.
  • Training can be based on a wide variety of learning rules or training algorithms, informed by scientific publications such as:
  • the invention also relates to a cosmetic and non-therapeutic method for temporal monitoring of a zone of interest of a user, which comprises detecting and quantifying cutaneous signs in a zone of interest as described above; temporal monitoring of the output images resulting from a comparison of the output images generated at the different temporal instants, and temporal monitoring of the scores resulting from a comparison of the scores generated at the different temporal instants.
  • the invention also relates to a cosmetic and non-therapeutic method for detecting and quantifying cutaneous signs in a zone of interest of a user, comprising communication means suitable for communicating with a user computing device and configured to import, from a library of images in the user computing device, at least one input image comprising a representation of the zone of interest of the user; processing means configured to effect (ii) analysis of the zone of interest in order to supply a score of the cutaneous signs, by comparison with reference scores pre-recorded in a reference database, (iii) visualization of the region of interest in the form of a heat map for detecting the cutaneous signs, the heat map being obtained from an algorithm focussed on the cutaneous signs, (iv) merging of the heat map with the input image of the region of interest in order to supply an output image of the region of interest and to optimize the visualization of the cutaneous signs.
  • the invention also relates to a computer program comprising instructions which, when the program is executed by a computer, cause the latter to implement the method described above.
  • the invention relates lastly to a computer-readable medium comprising instructions which, when they are executed by a computer, cause the latter to implement the method described above.
  • the cosmetic and non-therapeutic method for detecting and quantifying cutaneous signs in a zone of interest of a user exhibits one or more of the following features, taken alone or in combination:
  • It comprises (v) extraction of the cutaneous signs of the region of interest by converting the heat map into a colour space image, (vi) production of an output image of the region of interest based on the colour space image and the input image.
  • the learning base is inclusive, covering all ages, ethnic origins and phototypes, and all the bases are scored using a universal scale in order to obtain comparable measurements.
  • the detection of cutaneous signs, the assessment of the score of the cutaneous signs, the provision of the heat map and the creation of the final image are effected via an image processing or computer vision algorithm.
  • the machine learning model is a pretrained convolutional neural network.
  • the machine learning model is supplied with data from an atlas associating pre-recorded images to pre-recorded severity scores.
  • the cutaneous signs are chosen from openness of the pores, skin texture, a line, a fine line, crow's feet, sagging, colour of shadows under the eyes, depth of shadows under the eyes, skin grain, a facial expression, morphology, degree of hydration, degree of sheen or colour of a spot.
  • Detection of the cutaneous signs, assessment of the score of the cutaneous signs, provision of the heat map and creation of the output image are effected via a machine learning model.
  • the machine learning model is a pretrained convolutional neural network.
  • the machine learning model is supplied with data from an atlas associating pre-recorded images to pre-recorded severity scores.
  • the machine learning model has been trained to score different cutaneous signs.
  • It comprises a filtration of at least one channel.
  • the cosmetic and non-therapeutic method for temporal monitoring of a zone of interest of a user exhibits one or more of the following features, taken alone or in combination:
  • the heat map is obtained from a generative adversarial network (GAN).
  • production of the output image comprises acquisition of data supplied by the user in response to a questionnaire, and processing of these data by the machine learning model.
  • the method further comprises recommendation of at least one cosmetic product or cosmetic routine on the basis of the output image and the score.
  • FIG. 1 is a block diagram of a method for detecting and quantifying shadows under the eyes in accordance with the present invention.
  • FIG. 2 is a block diagram of a method for detecting and quantifying crow’s foot lines in accordance with the present invention.
  • FIG. 3 is a block diagram of a prediction of ageing, associated with a cosmetic product recommendation.
  • FIG. 4 is a schematic drawing illustrating the various steps of a consumer experience based on implementation of the method according to the invention in an application for a mobile phone or tablet.
  • FIG. 5 illustrates a screen of the application of FIG. 4, showing a questionnaire.
  • FIG. 6 illustrates a screen of the application of FIG. 4 during the processing of the input data.
  • FIG. 7 illustrates a screen of the application of FIG. 4, showing a presentation, to the user, of the results of the assessment of the cutaneous signs.
  • FIG. 8 illustrates a screen of the application of FIG. 4, showing the user details of a result from FIG. 7.
  • FIG. 9 illustrates a screen of the application of FIG. 4, showing an analysis report for the user.
  • FIG. 10 illustrates a screen of the application of FIG. 4, showing suggestions of products for the user.
  • FIG. 11 illustrates a screen of the application of FIG. 4, showing a QR code assigned to the user.
  • the method according to the invention comprises a step in which a processing unit extracts an image of the region of interest 111 from a first input image of a zone of interest 112.
  • the zone of interest 112 is formed by the face; the region of interest 111 is formed by the lower contour of the eye.
  • the cutaneous signs present in the region of interest 111 are lines, fine lines and shadows.
  • the image of the zone of interest 112 can be captured using an automated and controlled skin image capture system, such as the VISIA complexion analysis system for analysis of the facial skin, available from Canfield Scientific, Inc.
  • the image is captured with a standard light that can be expressed in the form of an RGB (red, green, blue) colour image.
  • the image can also be captured using a different lighting modality, or using a multispectral imaging system, provided that the regions of interest 111 can be distinguished according to a skin index measurement (for example concentrations of melanin and/or haemoglobin) derived from the captured image.
  • the skin detection procedure uses, for example, the measurement of the individual typology angle (ITA) as a skin indicator.
  • ITA is calculated using the L* and b* channels of the transformed skin image CIE L*a*b* (hereinafter called L*a*b).
  • the ITA is defined for each image pixel (i, j) as arctan((L*[i,j] − 50)/b*[i,j]) and is linked to the concentration of melanin in the skin.
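The per-pixel formula above can be sketched as follows. The function assumes the L* and b* channels have already been extracted by some RGB-to-CIE-L*a*b* conversion, and expressing the result in degrees is an assumption (it is the usual convention for ITA, not something the text states):

```python
import numpy as np

def individual_typology_angle(L, b):
    """Individual Typology Angle per pixel, from the L* and b* channels
    of a CIE L*a*b* skin image, following ITA = arctan((L* - 50)/b*).
    Degrees output is an assumed convention."""
    L = np.asarray(L, dtype=float)
    b = np.asarray(b, dtype=float)
    # arctan2 keeps the angle well defined where b* is zero
    return np.degrees(np.arctan2(L - 50.0, b))
```

For instance, a pixel with L* = 60 and b* = 10 yields an ITA of 45 degrees.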
  • the method comprises a step 1 in which the processing unit generates a heat map 12 on the basis of the image of the region of interest 111.
  • the heat map 12 is converted into a colour space image 13 with green channel 131, blue channel 132 and yellow channel 133 in order to visualize a line of shadow 130.
  • the blue and green channels ideally display the spots and the hyperpigmented pores, since these features have a greater absorption in the blue and green spectra.
  • the heat map is a graphical representation of statistical data which assigns, to the intensity of a variable magnitude, a range of tones or a colour chart on a two-dimensional matrix.
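As an illustration of such a representation, a minimal sketch that maps a two-dimensional intensity matrix onto a blue-green-yellow colour chart; the palette and the linear interpolation between its entries are illustrative assumptions, not the patented rendering:

```python
import numpy as np

def to_heat_map(intensity, palette=None):
    """Render a 2-D intensity matrix as an RGB heat map: assign a range
    of tones to the magnitude of a variable on a 2-D grid."""
    intensity = np.asarray(intensity, dtype=float)
    if palette is None:
        # low -> blue, mid -> green, high -> yellow (assumed colour chart)
        palette = np.array([[0.0, 0.0, 1.0],
                            [0.0, 1.0, 0.0],
                            [1.0, 1.0, 0.0]])
    # normalise intensities to [0, 1], guarding against a flat map
    t = (intensity - intensity.min()) / (np.ptp(intensity) or 1.0)
    # index of the lower palette entry and fraction towards the next one
    idx = np.minimum((t * (len(palette) - 1)).astype(int), len(palette) - 2)
    frac = t * (len(palette) - 1) - idx
    return (1 - frac[..., None]) * palette[idx] + frac[..., None] * palette[idx + 1]
```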
  • In a step 3, the colour space image 13 and the input image of the region of interest 111 are combined in order to produce an output image 14 showing the line of shadow 130 of the colour space image on the image of the region of interest 111.
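The text does not specify the combining operator of this step; a simple alpha-compositing sketch, in which the blending weight `alpha` and the non-zero-pixel mask are illustrative assumptions, could look like:

```python
import numpy as np

def merge_heatmap(input_image, colour_space_image, alpha=0.6):
    """Blend the colour-space rendering of the heat map onto the input
    image of the region of interest.  Both arrays are float RGB in
    [0, 1] with the same shape; `alpha` is an assumed blending weight."""
    # pixels of the colour space image that actually carry a detection
    mask = colour_space_image.sum(axis=-1, keepdims=True) > 0
    blended = (1 - alpha) * input_image + alpha * colour_space_image
    # keep the original image untouched where nothing was detected
    return np.where(mask, blended, input_image)
```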
  • the block diagram shown in FIG. 2 comprises a step in which a first image of the face 21 is acquired via an image acquisition device.
  • the method comprises a step in which a processing unit extracts an image of the region of interest 111 from the first acquired image of a zone of interest, namely a face.
  • the region of interest 111 is formed by the corner of an eye.
  • the cutaneous signs present in the region of interest 111 are mainly lines and fine crow’s foot lines.
  • the method then comprises a step in which the processing unit generates a heat map 22 on the basis of the image of the region of interest 111.
  • the heat map 22 is converted into a colour space image 23 with green channel 131 and blue channel 132 in order to visualize the crow’s foot lines 230.
  • In a further step, the colour space image 23 and the input image are combined in order to produce an output image 24 showing the crow’s foot lines 230 of the colour space image 23 on the image of the region of interest 111.
  • the method comprises a step in which the processing unit can extract physiognomic data of the cutaneous signs from the output image and/or attribute a score.
  • a temporal indication of the time of acquisition of the image is optionally associated with these physiognomic data.
  • the physiognomic data and their temporal indication can be stored in a memory.
  • the method optionally comprises temporal monitoring of the treatment of the line of shadow on the basis of input images acquired at various times, for example every day.
  • the temporal monitoring can comprise a step in which the image acquisition device acquires a new image.
  • the temporal monitoring comprises a step in which the processing unit compares the physiognomic data extracted from the last acquired image with the physiognomic data extracted from the penultimate acquired image.
  • This comparison allows the processing unit to calculate a progression gradient of the cutaneous sign. This progression gradient can be calculated based on a difference between the scores attributed to the two images.
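As an illustration of such a progression gradient, the sketch below takes the difference between the scores of the last two acquired images and normalises it by the elapsed days; the normalisation is an assumption, since the text only states that the gradient can be calculated from a score difference:

```python
from datetime import date

def progression_gradient(scores):
    """Score difference between the last two acquired images, per day.
    `scores` is a list of (acquisition_date, score) pairs, oldest first.
    Illustrative sketch: the exact formula is not given in the text."""
    (d0, s0), (d1, s1) = scores[-2], scores[-1]
    days = max((d1 - d0).days, 1)   # guard against same-day acquisitions
    return (s1 - s0) / days
```

A negative gradient would indicate an improvement of the cutaneous sign over the monitoring period.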
  • the score can be calculated based on "reference data of a database", which are listed for example in the international application WO2020169214A1 filed by the applicant.
  • the block diagram shown in FIG. 3 illustrates, in a general fashion, the steps of the method according to the invention.
  • In a step 69, an image 50 is acquired, for example using a multifunction phone (usually a smartphone), a touchscreen tablet or a computer.
  • In a step 70, it is possible to perform a holistic analysis of the various cutaneous signs present on the image 50, such as lines, texture, spots, pores, redness, sagging, pigmentation, hydration, glow, facial expressions or the morphology of the face.
  • the images of the regions of interest are processed as per the method according to the invention, so as to optimize the visualization of the cutaneous signs chosen in a step 71.
  • the cutaneous signs chosen are the crow’s foot lines and the shadows visualized on the output images of the region of the crow’s feet 51 and the region of the shadows 52.
  • the zones in which the algorithms are concentrated in order to supply the score are visualized in the form of a heat map.
  • the heat map is then merged with the initial image using image processing techniques in order to designate the zones of interest for the user.
  • results of the various features are combined in order to predict ageing and to offer a decision concerning selection of a suitable product, at step 72, or of a cosmetic routine.
  • the method according to the invention can lead to an exhaustive evaluation of the whole face, covering all skin manifestations such as lines, texture, sagging, pigmentation, pores, and also irregularities in colour and the visual properties.
  • Holistic attributes such as glow or apparent age, as they could be perceived by others, are readily available.
  • this new diagnostic is a means of providing consumers with a simulation not only of immediate or long-term efficacy but also of the impact of the exposome 53 and of lifestyles on the appearance of the contour of the eye.
  • the method according to the invention aids decision-making in the beauty sector, for example for preparing for the location of cosmetic injections and for providing an accessible simulation of treatment.
  • the method according to the invention is key in defining an overall look and for achieving an overall appearance.
  • FIG. 4 is a schematic drawing illustrating a non-limiting example of implementation of a recommendation and division of a routine according to various aspects of the present disclosure.
  • a fixed or mobile computing device has captured at least one image 402 of the user.
  • the image 402 can be presented on a display of the computing device.
  • the image 402 is processed in accordance with the method according to the invention.
  • the image 402 can undergo normalization, the image normalization engine 212 using a face detection algorithm for detecting the part of the image 402 that represents the face.
  • the image normalization engine can use the face detection algorithm to find a bounding box which includes the face.
  • the image normalization engine can modify the image in order to centre the bounding box.
  • the image normalization engine can zoom the image 402 in order to make the bounding box as large as possible.
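The normalization steps above (detect, centre, zoom) might be sketched as a crop around the detected bounding box; the margin and the crop strategy are illustrative assumptions, and the bounding box is assumed to come from any face-detection algorithm:

```python
import numpy as np

def normalise_face(image, bbox):
    """Centre and zoom the detected face so that it fills the frame.
    `bbox` = (x, y, w, h) is assumed to come from a face detector;
    the 10% margin is an illustrative choice, not the patented value."""
    x, y, w, h = bbox
    m = int(0.1 * max(w, h))                 # small margin around the box
    top, left = max(y - m, 0), max(x - m, 0)
    bottom = min(y + h + m, image.shape[0])  # clamp to image borders
    right = min(x + w + m, image.shape[1])
    return image[top:bottom, left:right]
```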
  • a questionnaire is shown on the screen of the user’s device in order to collect data that will be processed by the artificial intelligence algorithm, with a view to attributing scores to the different cutaneous signs.
  • the scores attributed to the cutaneous signs are presented to the user.
  • the scores are shown in the form of horizontal bars spaced apart from one another.
  • Other tools for graphical representation could be used, for example curves or histograms, but the bar diagrams are particularly suitable for this graphical representation of a statistical series of discrete quantitative variables.
  • a visualization of the clinical signs is provided to the user, by application of filters on an input image, according to the cutaneous sign visualized.
  • At step 65, recommendations of cosmetic products or routines are presented on the device screen, according to the priorities established by the skin analysis at step 63.
  • FIGS. 5 to 11 show details of the application presented in FIG. 4.
  • FIG. 5 shows a questionnaire page of the application; this page is to be completed by the user.
  • This questionnaire allows the user to make a self-assessment of the state of their skin.
  • the responses are processed by the method according to the invention in order to determine scores.
  • FIG. 6 shows a loading page presented on the screen during the analysis of the questionnaire and the assessment of the image. This loading page is normally displayed for 30 to 45 seconds.
  • the skin problems are displayed with an intuitive colour code going from "requires attention" on the left to "skin excellent" on the right. They are generally noted on a scale from 0% (on the left) to 100% (on the right).
  • FIG. 8 shows a page displaying a skin problem visualized by interposition of a filter.
  • By using the magnifier tool, the user can enlarge a specific zone of the face and zoom in and out.
  • Each cutaneous sign is associated with a filter or a light.
  • For example, for lines, texture and firmness, a standard polarizing filter for white light is interposed. For spots or redness, a pair of polarizing filters is interposed. For UV damage and pores, a UV light is added.
  • FIG. 10 shows a page displaying the recommended products, for example classed according to brand.
  • Product 1 is offered by a first brand.
  • Products 2, 3 and 4 are offered by a second brand.
  • Products 5 and 6 are offered by a third brand.
  • FIG. 11 shows a page displaying a QR code, for example for the case where the user wishes to retrieve the results of the assessment later on.

Landscapes

  • Engineering & Computer Science (AREA)
  • Theoretical Computer Science (AREA)
  • Physics & Mathematics (AREA)
  • Health & Medical Sciences (AREA)
  • General Physics & Mathematics (AREA)
  • General Health & Medical Sciences (AREA)
  • Evolutionary Computation (AREA)
  • Artificial Intelligence (AREA)
  • Computing Systems (AREA)
  • Software Systems (AREA)
  • Biophysics (AREA)
  • Data Mining & Analysis (AREA)
  • Mathematical Physics (AREA)
  • General Engineering & Computer Science (AREA)
  • Molecular Biology (AREA)
  • Computational Linguistics (AREA)
  • Biomedical Technology (AREA)
  • Life Sciences & Earth Sciences (AREA)
  • Computer Vision & Pattern Recognition (AREA)
  • Medical Informatics (AREA)
  • Multimedia (AREA)
  • Quality & Reliability (AREA)
  • Radiology & Medical Imaging (AREA)
  • Nuclear Medicine, Radiotherapy & Molecular Imaging (AREA)
  • Human Computer Interaction (AREA)
  • Oral & Maxillofacial Surgery (AREA)
  • Databases & Information Systems (AREA)
  • Measuring And Recording Apparatus For Diagnosis (AREA)

Abstract

The invention relates to a cosmetic and non-therapeutic method for detecting and quantifying cutaneous signs in a zone of interest of a user, notably the contour of the eyes, the cheeks, the contour of the mouth, the forehead, said method comprising (i) acquisition of an input image of a zone of interest (112), (ii) analysis of the zone of interest in order to supply a score of the cutaneous signs, (iii) visualization of the region of interest (111) in the form of a heat map (12, 22), (iv) merging of the heat map (12, 22) with the input image of the region of interest (111).

Description

Detection and visualization of cutaneous signs using a heat map
The subject of the invention is a cosmetic and non-therapeutic method for detecting and quantifying cutaneous signs in a zone of interest of a user. It is also directed to a system for implementing this method.
More generally, a cosmetic product is a product as defined in Regulation (EC) No 1223/2009 of the European Parliament and of the Council dated 30 November 2009 relating to cosmetic products.
Prior art
Computer-assisted analysis of the skin has become widespread over the course of the last decade with the availability of controlled lighting systems and of digital image capture and processing capabilities.
Because of the importance attributed to the appearance of the face, computer-assisted analysis studies have concentrated on the skin of the face. There are commercially available systems for imaging the skin of the face which can capture digital images in a controlled manner. These systems are often coupled to computer analysis systems for displaying and quantifying the visible features of the skin in standard white light images, such as hyperpigmented spots, lines and texture, as well as the non-visible features in fluorescence or hyperspectral absorption images such as UV spots.
The document US8218862 discloses a method for generating a cutaneous mask delimiting a region of interest (ROI) in an image of skin comprising:
detecting the skin in the image of skin in order to generate a skin map, skin detection comprising:
performing a raw skin segmentation operation on the image of skin;
converting the raw skin segmented image of skin into an image with a colour space having at least three channels; and
filtering two of the at least three channels;
generating a melanin index image on the basis of the filtered channels; and
performing a thresholding operation on the melanin index image in order to separate the cutaneous and non-cutaneous areas in the cutaneous image;
providing an initial contour based on the map of the skin; and
optimizing the initial contour in order to generate a contour of the cutaneous mask, the skin map and the cutaneous mask each comprising at least one natural limit of the skin.
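A minimal sketch of the channel-filtering and thresholding steps quoted above, with a mean filter standing in for the per-channel filtering and a log-ratio of two filtered channels standing in for the melanin index; both the index formula and the threshold value are illustrative assumptions, not the formulas of US8218862:

```python
import numpy as np

def box_filter(channel, k=3):
    """Simple mean filter standing in for the per-channel filtering step."""
    pad = k // 2
    padded = np.pad(channel, pad, mode="edge")
    out = np.zeros(channel.shape, dtype=float)
    h, w = channel.shape
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + h, dx:dx + w]
    return out / (k * k)

def cutaneous_mask(colour_image, threshold=0.2):
    """Filter two of the three colour channels, build a melanin-index
    image from them, and threshold it into cutaneous / non-cutaneous
    areas.  Index formula and threshold are assumed for illustration."""
    r = box_filter(colour_image[..., 0])
    g = box_filter(colour_image[..., 1])
    melanin_index = np.log1p(r) - np.log1p(g)   # assumed index formula
    return melanin_index > threshold            # True where skin-like
```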
The six "Atlas des signes du vieillissement cutané [Atlases of the signs of cutaneous ageing]" published by MED’COM since 2007 are also known. These works systematically study and establish a characterization and a classification of skin linked to age and geography.
By identifying several features of the face, they make it possible for professionals to qualify and to quantify the signs of age as a function of their development. They define these criteria - openness of the pores, depth of the lines on the forehead and crow's feet - and assign them a severity score, which starts from zero and can go up to 9.
These atlases represent a portion of the knowledge on which the tools for evaluating cutaneous signs are based. The performance thereof for evaluating cosmetic products or for modelling the kinetics of the development of ageing based on hundreds of thousands of photos is known.
In order to automate the diagnostics, the data from the atlases were used to feed the algorithms, by associating a photo with a score. Then, they were trained to score these various observation areas.
By virtue of thousands of selfies studied across the globe, algorithms based on artificial intelligence have learnt to recognize the various signs of the ageing of the face in any type of photograph.
By combining the data originating from selfies with information on the pace of life, the sleep or the surroundings of users, these diagnostics now make it possible to offer a user an estimated "ageing curve", which can be compared to an average of people of the same age.
The patent application WO2020169214 filed by the applicant discloses predicting cutaneous ageing by taking into account the atlases and also the surroundings or the habits of a user, notably sleep time, pollution and/or living space. By combining the data originating from selfies with information such as the pace of life, the sleep or the surroundings of users, this prediction offers an "ageing curve" based on these data.
There is a need to improve the performance of the above methods further in order to better:
Provide complete and accurate diagnostics of the face combining diverse features.
Personalize cosmetic care/a product based on the diagnostic, either by creating a new product or by selecting it from a given range.
Personalize the user experience by displaying the results of the diagnostic and/or by simulating their development for knowledge sharing.
Definition of the invention
The invention relates to a cosmetic and non-therapeutic method for detecting and quantifying cutaneous signs in a zone of interest of a user, notably the contour of the eyes, the cheeks, the contour of the mouth, the forehead, said method comprising (i) acquisition of an input image of a zone of interest of a user, including a region of interest encompassing cutaneous signs, (ii) analysis of the zone of interest in order to supply a score of the cutaneous signs, by comparison with reference scores pre-recorded in a reference database, (iii) visualization of the region of interest in the form of a heat map for detecting the cutaneous signs, the heat map being obtained from an algorithm focussed on the cutaneous signs, (iv) merging of the heat map with the input image of the region of interest in order to supply an output image of the region of interest and to optimize the visualization of the cutaneous signs.
By virtue of this image processing, the method of the invention yields evaluations of cutaneous signs that surpass those of the prior art in both accuracy and inclusivity. The system is based on the knowledge from decades of research and on data covering all the various skin types stored in the reference database.
The "reference database" can comprise:
Data stored with one or more sequences of images comprising at least two images,
One or more atlases displayed on screen or printed on a medium or in a form stored on a computer storage medium, such as the atlas described in the patent application WO2011/141769, each atlas representing various gradations of at least one feature of bodily typology,
Synthetic images,
Internet publications.
The data can be processed by artificial intelligence (AI) algorithms which can include fuzzy logic, neural networks, genetic programming and decision tree programming.
All engines can be trained on the basis of inputs such as information on the product, advice from experts, a user profile or data based on sensory perception. Using an input, an AI engine can implement an iterative learning process.
Training can be based on a wide variety of learning rules or training algorithms and scientific publications, such as the publications:
Frederic Flament et al., "Effect of the sun on visible clinical signs of aging in Caucasian skin", Clin Cosmet Investig Dermatol., 2013 Sep 27;6:221-32.
Frederic Flament et al., "Assessing changes in some facial signs of fatigue in Chinese women, induced by a single working day", International Journal of Cosmetic Science, 29 November 2018.
The invention also relates to a cosmetic and non-therapeutic method for temporal monitoring of a zone of interest of a user, which comprises detecting and quantifying cutaneous signs in a zone of interest as described above at different temporal instants; temporal monitoring of the output images, resulting from a comparison of the output images generated at the different temporal instants; and temporal monitoring of the scores, resulting from a comparison of the scores generated at the different temporal instants.
The invention also relates to a system for detecting and quantifying cutaneous signs in a zone of interest of a user, comprising communication means suitable for communicating with a user computing device and configured to import, from a library of images in the user computing device, at least one input image comprising a representation of the zone of interest of the user; processing means configured to effect (ii) analysis of the zone of interest in order to supply a score of the cutaneous signs, by comparison with reference scores pre-recorded in a reference database, (iii) visualization of the region of interest in the form of a heat map for detecting the cutaneous signs, the heat map being obtained from an algorithm focussed on the cutaneous signs, (iv) merging of the heat map with the input image of the region of interest in order to supply an output image of the region of interest and to optimize the visualization of the cutaneous signs.
The invention also relates to a computer program comprising instructions which, when the program is executed by a computer, cause the latter to implement the method described above.
The invention relates lastly to a computer-readable medium comprising instructions which, when they are executed by a computer, cause the latter to implement the method described above.
Preferred embodiments
Preferably, the cosmetic and non-therapeutic method for detecting and quantifying cutaneous signs in a zone of interest of a user according to the invention exhibits one or more of the following features, taken alone or in combination:
It comprises (v) extraction of the cutaneous signs of the region of interest by converting the heat map into a colour space image, (vi) production of an output image of the region of interest based on the colour space image and the input image.
The learning base is inclusive, covering all ages, ethnic origins and phototypes, and all the bases are scored using a universal scale in order to obtain comparable measurements.
The detection of cutaneous signs, the assessment of the score of the cutaneous signs, the provision of the heat map and the creation of the final image are effected via an image processing or computer vision algorithm.
The cutaneous signs are chosen from openness of the pores, skin texture, a line, a fine line, crow's feet, sagging, colour of shadows under the eyes, depth of shadows under the eyes, skin grain, a facial expression, morphology, degree of hydration, degree of sheen or colour of a spot.
Detection of the cutaneous signs, assessment of the score of the cutaneous signs, provision of the heat map and creation of the output image are effected via a machine learning model.
The machine learning model is a pretrained convolutional neural network.
The machine learning model is supplied with data from an atlas associating pre-recorded images to pre-recorded severity scores.
The machine learning model has been trained to score different cutaneous signs.
It comprises a filtration of at least one channel.
It comprises the application of a contour to the heat map on the basis of a directional masking line and a reference point.
It further comprises recommendation of at least one cosmetic product on the basis of the severity score.
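The scoring-by-comparison idea in the list above (supplying a score by comparison with reference scores pre-recorded in a reference database) can be sketched minimally. The stand-in below replaces the trained convolutional neural network with a nearest-neighbour lookup over hypothetical atlas feature vectors; the feature representation and the atlas structure are assumptions for illustration, not the patent's actual model:

```python
import numpy as np

def score_against_atlas(feature_vec, atlas):
    """Return the pre-recorded severity score of the nearest atlas entry.

    `atlas` is a hypothetical list of (reference_feature_vector, score)
    pairs standing in for the reference database; a real embodiment would
    use the pretrained convolutional neural network instead.
    """
    distances = [np.linalg.norm(feature_vec - ref) for ref, _ in atlas]
    return atlas[int(np.argmin(distances))][1]
```

With an atlas of two reference entries scored 1 and 5, a query feature vector close to the second entry receives the pre-recorded score 5.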
Preferably, the cosmetic and non-therapeutic method for temporal monitoring of a zone of interest of a user according to the invention exhibits one or more of the following features, taken alone or in combination:
The heat map is obtained from a generative adversarial network (GAN).
The output image comprises acquisition of data supplied by the user in response to a questionnaire, and processing of these data by the machine learning model.
The method further comprises recommendation of at least one cosmetic product or cosmetic routine on the basis of the output image and the score.
Brief description of the drawings
The invention will be better understood from reading the following detailed description of a non-limiting exemplary implementation thereof, and from examining the schematic and partial appended drawings, in which:
[Fig 1] is a block diagram of a method for detecting and quantifying shadows under the eyes in accordance with the present invention.
[Fig 2] is a block diagram of a method for detecting and quantifying crow’s foot lines in accordance with the present invention.
[Fig 3] is a block diagram of a prediction of ageing, associated with a cosmetic product recommendation.
[Fig 4] is a schematic drawing illustrating the various steps of a consumer experience based on implementation of the method according to the invention in an application for a mobile phone or tablet.
[Fig 5] illustrates a screen for the application in [Fig 4], showing a questionnaire.
[Fig 6] illustrates a screen for the application in [Fig 4] during the processing of the input data.
[Fig 7] illustrates a screen for the application in [Fig 4], showing a presentation, to the user, of the results of the assessment of the cutaneous signs.
[Fig 8] illustrates a screen for the application in [Fig 4], showing the user details of a result from [Fig 7].
[Fig 9] illustrates a screen for the application in [Fig 4], showing an analysis report for the user.
[Fig 10] illustrates a screen for the application in [Fig 4], showing suggestions of products for the user.
[Fig 11] illustrates a screen for the application in [Fig 4], showing a QR code assigned to the user.
As is shown in [Fig 1], the method according to the invention comprises a step in which a processing unit extracts an image of the region of interest 111 from a first input image of a zone of interest 112. In the example shown, the zone of interest 112 is formed by the face; the region of interest 111 is formed by the lower contour of the eye. The cutaneous signs present in the region of interest 111 are lines, fine lines and shadows.
The image of the zone of interest 112 can be captured using an automated and controlled skin image capture system, such as the VISIA complexion analysis system for analysis of the facial skin, available from Canfield Scientific, Inc.
In the exemplary embodiment in [Fig 1], the image is captured with a standard light that can be expressed in the form of an RGB (red, green, blue) colour image. However, the image can also be captured using a different lighting modality, or using a multispectral imaging system, provided that the regions of interest 111 can be distinguished according to a skin index measurement (for example concentrations of melanin and/or haemoglobin) derived from the captured image.
Several skin detection algorithms have been developed for various purposes, including face detection. For example, such algorithms are described in the article RL Hsu, et al., "Face detection in color images", IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 24, no. 5, pp. 696-707, May 2002. If such skin detection algorithms supply a suitable level of granularity, they can be used for detecting regions of interest in accordance with the present invention.
The skin detection procedure uses, for example, the measurement of the individual typology angle (ITA), which is used as a skin indicator. The ITA is calculated using the L* and b* channels of the skin image transformed into the CIE L*a*b* colour space (hereinafter called L*a*b*). (For a detailed description of this metric, see GN Stamatas, et al., "Non-Invasive Measurements of Skin Pigmentation In Situ", Pigment Cell Research, vol. 17, pp. 618-626, 2004.) The ITA is defined for each image pixel (i,j) as arctan((L*[i,j] − 50)/b*[i,j]) and is linked to the concentration of melanin in the skin. The hypothesis is that the ITA values for the skin pixels will be grouped around one value, whilst the ITA values for the non-skin pixels are considerably far from the ITA value of the skin pixels.
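The ITA computation just described can be sketched directly in code. The tolerance threshold around the dominant ITA value below is an illustrative assumption, not a value from the text:

```python
import numpy as np

def ita_map(L, b):
    """Per-pixel Individual Typology Angle (degrees) from CIE L*a*b* channels.

    ITA = arctan((L* - 50) / b*), expressed in degrees, as defined in the
    skin-detection step; it is linked to melanin concentration.
    """
    # Avoid division by zero on achromatic pixels where b* == 0.
    b_safe = np.where(b == 0, 1e-6, b).astype(float)
    return np.degrees(np.arctan((L.astype(float) - 50.0) / b_safe))

def skin_mask(L, b, tolerance=15.0):
    """Keep pixels whose ITA lies near the image's dominant (median) ITA.

    This encodes the stated hypothesis: skin-pixel ITA values cluster
    around one value, while non-skin pixels fall far from it.
    """
    ita = ita_map(L, b)
    centre = np.median(ita)
    return np.abs(ita - centre) <= tolerance
```

For a pixel with L* = 70 and b* = 20, the ITA is arctan(20/20) = 45 degrees.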
In the example shown, the method comprises a step 1 in which the processing unit generates a heat map 12 on the basis of the image of the region of interest 111. In a step 2, the heat map 12 is converted into a colour space image 13 with green channel 131, blue channel 132 and yellow channel 133 in order to visualize a line of shadow 130. The blue and green channels ideally display the spots and the hyperpigmented pores, since these features have a greater absorption in the blue and green spectra.
As can be seen from [Fig 1], the heat map is a graphical representation of statistical data which assigns, to the intensity of a variable magnitude, a range of tones or a colour chart on a two-dimensional matrix.
In a step 3, the colour space image 13 and the input image of the region of interest 111 are combined in order to produce an output image 14 showing the line of shadow 130 of the colour space image on the image of the region of interest 111.
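Steps 1 to 3 (heat map, colour space conversion, merging) can be sketched with plain NumPy. The two-channel colour mapping and the blending weight below are illustrative assumptions standing in for whatever conversion and compositing a given embodiment uses:

```python
import numpy as np

def heatmap_to_colour(heat):
    """Convert a heat map (H, W) to an RGB colour space image (step 2).

    A hypothetical two-channel mapping: low-intensity areas go to blue,
    high-intensity areas (strong cutaneous signs) to green, echoing the
    green channel 131 and blue channel 132 of the text.
    """
    span = float(heat.max() - heat.min())
    h = (heat - heat.min()) / max(span, 1e-9)  # normalize to [0, 1]
    rgb = np.zeros(heat.shape + (3,))
    rgb[..., 1] = h          # green channel: sign intensity
    rgb[..., 2] = 1.0 - h    # blue channel: background
    return rgb

def merge(input_rgb, heat, alpha=0.4):
    """Alpha-blend the colour space image onto the input image (step 3)."""
    overlay = heatmap_to_colour(heat)
    return (1.0 - alpha) * input_rgb + alpha * overlay
```

The output image then shows the detected signs as a coloured overlay directly on the acquired image of the region of interest.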
The block diagram shown in [Fig 2] comprises a step in which a first image of the face 21 is acquired via an image acquisition device.
Thereafter, the method comprises a step in which a processing unit extracts an image of the region of interest 111 from the first acquired image of a zone of interest, namely a face. In the example shown, the region of interest 111 is formed by the corner of an eye. The cutaneous signs present in the region of interest 111 are mainly lines and fine crow’s foot lines.
The method then comprises a step in which the processing unit generates a heat map 22 on the basis of the image of the region of interest 111.
In a step 22, the heat map 22 is converted into a colour space image 23 with green channel 131 and blue channel 132 in order to visualize the crow’s foot lines 230.
In a step 23, the colour space image 23 and the input image are combined in order to produce an output image 24 showing the crow’s foot lines 230 of the colour space image 23 on the image of the region of interest 111.
Thereafter, the method comprises a step in which the processing unit can extract physiognomic data of the cutaneous signs from the output image and/or attribute a score. A temporal indication of the time of acquisition of the image is optionally associated with these physiognomic data. The physiognomic data and their temporal indication can be stored in a memory.
The method optionally comprises temporal monitoring of the treatment of the line of shadow on the basis of input images acquired at various times, for example every day.
Thus, the temporal monitoring can comprise a step in which the image acquisition device acquires a new image.
Thereafter, the temporal monitoring comprises a step in which the processing unit compares the physiognomic data extracted from the last acquired image with the physiognomic data extracted from the penultimate acquired image.
This comparison allows the processing unit to calculate a progression gradient of the cutaneous sign. This progression gradient can be calculated based on a difference between the scores attributed to the two images.
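The progression gradient described above reduces to a score difference divided by the elapsed time between acquisitions. A minimal sketch, in which "score units per day" is an assumed unit:

```python
from datetime import date

def progression_gradient(scores):
    """Progression gradient of a cutaneous sign between the last two
    acquisitions, in score units per day (hypothetical unit).

    `scores` is a chronological list of (acquisition_date, severity_score)
    pairs: the physiognomic data with their temporal indication, as stored
    in memory by the processing unit.
    """
    (t0, s0), (t1, s1) = scores[-2], scores[-1]
    days = max((t1 - t0).days, 1)  # guard against same-day acquisitions
    return (s1 - s0) / days
```

A negative gradient indicates that the sign is receding between the penultimate and the last acquired image; a positive one that it is worsening.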
The score can be calculated based on “reference data of a database”, which are listed for example in the international application WO2020169214A1 filed by the applicant.
The block diagram shown in [Fig 3] illustrates, in a general fashion, the steps of the method according to the invention.
An image 50 is acquired, for example, using a multifunction phone (usually a smartphone), a touchscreen tablet or a computer in a step 69.
Firstly, in a step 70, it is possible to perform a holistic analysis of the various cutaneous signs present on the image 50, such as lines, texture, spots, pores, redness, sagging, pigmentation, hydration, glow, facial expressions, morphology of the face.
The images of the regions of interest are processed as per the method according to the invention, so as to optimize the visualization of the cutaneous signs chosen in a step 71. In the example shown, the cutaneous signs chosen are the crow’s foot lines and the shadows visualized on the output images of the region of the crow’s feet 51 and the region of the shadows 52.
Artificial intelligence algorithms (classifiers, CNN) are used to supply a score for each cutaneous sign on the basis of a grading scale provided by an expert on the basis of the data of an exposome 53.
In order to visualize the information, the zones on which the algorithms concentrate in order to supply the score are visualized in the form of a heat map. The latter is then merged with the initial image using image processing techniques in order to designate the zones of interest for the user.
Simulations of the development of these signs (before and after effects) are carried out with the aid of generative adversarial networks.
The results of the various features are combined in order to predict ageing and to offer a decision concerning selection of a suitable product, at step 72, or of a cosmetic routine.
The method according to the invention can lead to an exhaustive evaluation of the whole face, covering all skin manifestations such as lines, texture, sagging, pigmentation, pores, and also irregularities in colour and the visual properties. Holistic attributes, such as glow or apparent age, as they could be perceived by others, are readily available.
Coupled with the knowledge that has been published, this new diagnostic is a means of providing consumers with a simulation not only of immediate or long-term efficacy but also of the impact of the exposome 53 and of lifestyles on the appearance of the contour of the eye.
The method according to the invention aids decision-making in the beauty sector, for example for preparing for the location of cosmetic injections and for providing an accessible simulation of treatment.
Combined with perception of the hair, such as its volume or frizziness, the method according to the invention is key in defining an overall look and for achieving an overall appearance.
[Fig 4] is a schematic drawing illustrating a non-limiting example of implementation of a recommendation and division of a routine according to various aspects of the present disclosure.
At step 61, a fixed or mobile computing device has captured at least one image 402 of the user. The image 402 can be presented on a display of the computing device. The image 402 is processed in accordance with the method according to the invention.
The image 402 can undergo normalization: the image normalization engine 212 uses a face detection algorithm to detect the part of the image 402 that represents the face and to find a bounding box which includes the face. In a second step of normalization, the image normalization engine can shift the image in order to centre the bounding box. In a third step of normalization, the image normalization engine can zoom the image 402 in order to make the bounding box as large as possible. By performing these steps of normalization, the image normalization engine reduces the differences in pose and size between several images and can thus improve the training and the precision of the machine learning model. In some embodiments, different normalization steps may be taken.
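The three normalization steps can be sketched as follows, assuming a bounding box already supplied by any face detector; the nearest-neighbour resize here stands in for whatever interpolation the normalization engine actually uses:

```python
import numpy as np

def normalize_face(image, bbox, out_size=224):
    """Crop, centre and zoom the detected face (steps 1-3 of normalization).

    `bbox` is (top, left, height, width), a hypothetical detector output.
    A square crop centred on the box is resized to a fixed size so that
    pose and scale become comparable across images.
    """
    top, left, h, w = bbox
    side = max(h, w)                       # square crop preserves aspect
    cy, cx = top + h // 2, left + w // 2   # step 2: centre the bounding box
    t = max(cy - side // 2, 0)
    l = max(cx - side // 2, 0)
    crop = image[t:t + side, l:l + side]
    # Step 3 ("zoom"): nearest-neighbour resize to the model input size.
    ys = np.arange(out_size) * crop.shape[0] // out_size
    xs = np.arange(out_size) * crop.shape[1] // out_size
    return crop[np.ix_(ys, xs)]
```

Applying the same routine to every imported image reduces pose and scale variation before the images are fed to the scoring model.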
At step 62, a questionnaire is shown on the screen of the user’s device in order to collect data that will be processed by the artificial intelligence algorithm, with a view to attributing scores to the different cutaneous signs.
At step 63, the scores attributed to the cutaneous signs are presented to the user. In the example, the scores are shown in the form of horizontal bars spaced apart from one another. Other tools for graphical representation could be used, for example curves or histograms, but the bar diagrams are particularly suitable for this graphical representation of a statistical series of discrete quantitative variables.
At step 64, a visualization of the clinical signs is provided to the user, by application of filters on an input image, according to the cutaneous sign visualized.
At step 65, recommendations of cosmetic products or routines are presented on the device screen, according to the priorities established by the skin analysis at step 63.
Figures 5 to 11 show details of the application presented in [Fig 4].
[Fig 5] shows a questionnaire page of the application; this page is to be completed by the user. This questionnaire allows the user to make a self-assessment of the state of their skin. The responses are processed by the method according to the invention in order to determine scores.
[Fig 6] shows a loading page presented on the screen during the analysis of the questionnaire and the assessment of the image. This loading page is normally displayed for 30 to 45 seconds.
[Fig 7] shows a page displaying the results. In the example shown, the skin problems are displayed with an intuitive colour code going from "requires attention" on the left to "excellent skin" on the right. They are generally rated on a scale from 0% (on the left) to 100% (on the right).
[Fig 8] shows a page displaying a skin problem visualized by interposition of a filter. By using the magnifier tool, the user can enlarge a specific zone of the face or can even zoom in and out.
Each cutaneous sign is associated with a filter or a light. For example, for lines, texture and firmness, a standard polarizing filter for white light is interposed. For spots or redness, a pair of polarizing filters is interposed. For UV damage and pores, a UV light is added.
[Fig 9] shows a page displaying a final report, which stresses the priorities and the skin problems of the user.
[Fig 10] shows a page displaying the recommended products, for example grouped by brand. Product 1 is offered by a first brand. Products 2, 3 and 4 are offered by a second brand. Products 5 and 6 are offered by a third brand.
[Fig 11] shows a page displaying a QR code, for example for a case where the user wishes to retrieve the results of the assessment later on.
Of course, the invention is not limited to the exemplary embodiments that have just been described.

Claims (18)

  1. Cosmetic and non-therapeutic method for detecting and quantifying cutaneous signs in a zone of interest of a user, notably the contour of the eyes, the cheeks, the contour of the mouth, the forehead, said method comprising (i) acquisition of an input image of a zone of interest (112) of a user including a region of interest (111) encompassing cutaneous signs (130, 230), (ii) analysis of the zone of interest in order to supply a score of the cutaneous signs, by comparison with reference scores pre-recorded in a reference database, (iii) visualization of the region of interest (111) in the form of a heat map (12, 22) in order to detect the cutaneous signs, the heat map being obtained from an algorithm focussed on the cutaneous signs, (iv) merging of the heat map (12, 22) with the input image of the region of interest (111) in order to supply an output image (14, 24) of the region of interest (111) and to optimize the visualization of the cutaneous signs (130, 230).
  2. Method according to Claim 1, comprising (v) extraction of the cutaneous signs (130, 230) of the region of interest (111) by converting the heat map (12, 22) into a colour space image (13, 23), (vi) production of an output image (14, 24) of the region of interest (111) based on the colour space image (13, 23) and the input image.
  3. Method according to either of the preceding claims, in which the reference database is inclusive, covering all ages, ethnic origins and phototypes, and all the bases are scored using a universal scale in order to obtain comparable measurements.
  4. Method according to any one of the preceding claims, in which the detection of cutaneous signs, the assessment of the score of the cutaneous signs, the provision of the heat map and the creation of the final image are effected via an image processing or computer vision algorithm.
  5. Method according to any one of the preceding claims, in which a detection of the cutaneous signs, an assessment of the score of the cutaneous signs, provision of the heat map and creation of the output image are effected via a machine learning model.
  6. Method according to the preceding claim, in which the machine learning model is a pretrained convolutional neural network.
  7. Method according to Claim 5 or 6, in which the machine learning model is supplied with data from an atlas associating pre-recorded images with pre-recorded severity scores.
  8. Method according to any one of the preceding claims, in which the cutaneous signs (130, 230) are chosen from openness of the pores, skin texture, a line, a fine line, crow's feet, sagging, colour of shadows under the eyes, depth of shadows under the eyes, skin grain, a facial expression, morphology, degree of hydration, degree of sheen, or colour of a spot.
  9. Method according to any one of Claims 5 to 7, in which the machine learning model has been trained to score different cutaneous signs (130, 230).
  10. Method according to any one of the preceding claims, comprising the application of a contour to the heat map (12, 22) on the basis of a directional masking line and a reference point.
  11. Method according to any one of the preceding claims, additionally comprising a recommendation of at least one cosmetic product on the basis of the severity score.
  12. Cosmetic and non-therapeutic method for temporal monitoring of a zone of interest (112) of a user, which comprises detecting and quantifying cutaneous signs (130, 230) in a zone of interest (112) according to any one of the preceding claims at different temporal instants; temporal monitoring of the output images resulting from a comparison of the output images (14, 24) generated at the different temporal instants, and temporal monitoring of the scores resulting from a comparison of the scores generated at the different temporal instants.
  13. Method according to any one of the preceding claims, in which the heat map is obtained from a generative adversarial network (GAN).
  14. Method according to any one of the preceding claims, comprising acquisition of data supplied by the user in response to a questionnaire, and processing of these data by the machine learning model.
  15. Method according to one of the preceding claims, further comprising recommendation of at least one cosmetic product or cosmetic routine on the basis of the output image (24) and the score.
  16. System for detecting and quantifying cutaneous signs (130, 230) in a zone of interest (112) of a user, comprising communication means suitable for communicating with a user computing device and configured to import, from a library of images in the user computing device, at least one input image comprising a representation of the zone of interest (112) of the user; processing means configured to effect (ii) analysis of the zone of interest in order to supply a score of the cutaneous signs, by comparison with reference scores pre-recorded in a reference database, (iii) visualization of the region of interest (111) in the form of a heat map (12, 22) for detecting the cutaneous signs, the heat map being obtained from an algorithm focussed on the cutaneous signs, (iv) merging of the heat map (12, 22) with the input image of the region of interest (111) in order to supply an output image (14, 24) of the region of interest (111) and to optimize the visualization of the cutaneous signs (130, 230).
  17. Computer program comprising instructions which, when the program is executed by a computer, cause the latter to implement the method according to one of Claims 1 to 11.
  18. Computer-readable medium comprising instructions which, when they are executed by a computer, cause the latter to implement the method according to one of Claims 1 to 11.
PCT/EP2023/061788 2022-05-10 2023-05-04 Detection and visualization of cutaneous signs using a heat map WO2023217626A1 (en)

Applications Claiming Priority (2)

Application Number Priority Date Filing Date Title
FR2204423A FR3135556A1 (en) 2022-05-10 2022-05-10 Detection and visualization of skin signs using a heat map
FRFR2204423 2022-05-10

Publications (1)

Publication Number Publication Date
WO2023217626A1 true WO2023217626A1 (en) 2023-11-16

Family

ID=83900243

Family Applications (1)

Application Number Title Priority Date Filing Date
PCT/EP2023/061788 WO2023217626A1 (en) 2022-05-10 2023-05-04 Detection and visualization of cutaneous signs using a heat map

Country Status (2)

Country Link
FR (1) FR3135556A1 (en)
WO (1) WO2023217626A1 (en)

Citations (6)

* Cited by examiner, † Cited by third party
Publication number Priority date Publication date Assignee Title
WO2011141769A1 (en) 2010-05-10 2011-11-17 L'oreal Method for evaluating a body typology characteristic
US8218862B2 (en) 2008-02-01 2012-07-10 Canfield Scientific, Incorporated Automatic mask design and registration and feature detection for computer-aided skin analysis
US20170270593A1 (en) * 2016-03-21 2017-09-21 The Procter & Gamble Company Systems and Methods For Providing Customized Product Recommendations
WO2020169214A1 (en) 2019-02-21 2020-08-27 L'oreal Machine-implemented beauty assistant for predicting face aging
US20200342213A1 (en) * 2019-04-23 2020-10-29 The Procter & Gamble Company Apparatus and method for determining cosmetic skin attributes
US20210407153A1 (en) * 2020-06-30 2021-12-30 L'oreal High-resolution controllable face aging with spatially-aware conditional gans

Non-Patent Citations (4)

* Cited by examiner, † Cited by third party
Title
FREDERIC FLAMENT ET AL.: "Assessing changes in some facial signs of fatigue in Chinese women, induced by a single working day", INTERNATIONAL JOURNAL OF COSMETIC SCIENCE, 29 November 2018 (2018-11-29)
FREDERIC FLAMENT ET AL.: "Effect of the sun on visible clinical signs of aging in Caucasian skin", CLIN COSMET INVESTIG DERMATOL., vol. 6, 27 September 2013 (2013-09-27), pages 221 - 32
GN STAMATAS ET AL.: "Non-Invasive Measurements of Skin Pigmentation In Situ", PIGMENT CELL RESEARCH, vol. 17, 2004, pages 618 - 626, XP002585053
RL HSU ET AL.: "Face detection in color images", IEEE TRANSACTIONS ON PATTERN ANALYSIS AND MACHINE INTELLIGENCE, vol. 24, no. 5, May 2002 (2002-05-01), pages 696 - 707, XP055071114, DOI: 10.1109/34.1000242

Also Published As

Publication number Publication date
FR3135556A1 (en) 2023-11-17

Legal Events

Date Code Title Description
121 Ep: the epo has been informed by wipo that ep was designated in this application

Ref document number: 23724784

Country of ref document: EP

Kind code of ref document: A1